No markers. No constraints. On mobile.
Motion
Analyzed
Our state-of-the-art AI captures and analyzes your motion in real time.



Latest updates
- 06/23 PoseCam v3.0 released
- 03/23 PoseCam v1.4 released
- 11/22 PoseAI named one of Future40
- 10/22 Awarded Innovate UK grant
- 09/22 PoseCam v1.3.1 released
How it Works
1. Get PoseCam on mobile
or license our framework on iOS, Android or Windows
Our AI processes the pose on-device at 50+ fps
We have a showcase app that supports all iOS devices
We currently license our engine to developers for iOS, Android and Windows


2. Link to your game or engine
Stream to Unreal Engine via LiveLink, into Unity with our free scripts, or use our simple API to integrate into your application. Visit us on the Unreal Marketplace or check out our API examples directly on GitHub.
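For applications outside Unreal or Unity, integration amounts to receiving pose packets streamed from the phone. The sketch below is a hypothetical minimal receiver: the port number, UDP transport, JSON layout, and `joints` field name are illustrative assumptions, not PoseAI's documented protocol.

```python
import json
import socket

# Hypothetical minimal pose receiver. The phone is assumed to stream
# JSON pose packets over UDP; packet layout and port are illustrative
# assumptions, not PoseAI's actual wire format.

def parse_packet(data: bytes) -> dict:
    """Decode one pose packet into a {joint_name: rotation} mapping."""
    packet = json.loads(data.decode("utf-8"))
    return packet.get("joints", {})

def listen(port: int = 8080, max_packets: int = 100) -> None:
    """Receive pose packets and hand the joint data to your app."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", port))
    for _ in range(max_packets):
        data, _addr = sock.recvfrom(4096)
        joints = parse_packet(data)
        # Drive your character rig from the received joint rotations here.
        print(f"received {len(joints)} joints")
```

Calling `listen()` on the machine running your game would then print one line per received packet; the real integration point is where the decoded joints are applied to your rig.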

3. Play, dance or interact!
Once you've linked your game or app, users will be able to use their camera as a controller

Tailored for performance
Unique Features
Our models have been designed from the ground up to run in real time on mobile devices. This means a fast and smooth experience for all users.

Lightweight
Our neural networks are optimized to reduce compute and memory for fast inference

Low-latency
We only send pose data (not images) from the phone to the game, using little bandwidth
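To see why sending pose data instead of images uses so little bandwidth, the back-of-the-envelope sketch below compares the two. All numbers (joint count, rotation encoding, resolution, frame rate) are illustrative assumptions, not measured PoseAI figures.

```python
# Rough bandwidth comparison: streamed pose data vs. raw video frames.
# All sizes below are illustrative assumptions, not PoseAI measurements.

JOINTS = 60            # assumed body + hand joints
FLOATS_PER_JOINT = 4   # e.g. one quaternion rotation per joint
BYTES_PER_FLOAT = 4    # 32-bit floats
FPS = 60

pose_bytes_per_frame = JOINTS * FLOATS_PER_JOINT * BYTES_PER_FLOAT
pose_kbps = pose_bytes_per_frame * FPS * 8 / 1000

# A single uncompressed 720p RGB frame, for comparison.
frame_bytes = 1280 * 720 * 3
video_mbps = frame_bytes * FPS * 8 / 1_000_000

print(f"pose stream: ~{pose_kbps:.0f} kbit/s")
print(f"raw 720p video: ~{video_mbps:.0f} Mbit/s")
```

Even with these generous assumptions, the pose stream is thousands of times smaller than raw video, which is why it fits comfortably on a local network link.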

Accurate
Our technology achieves state-of-the-art pose accuracy on multiple benchmarks

Flexible
We have modes that cater for every scenario (room, desktop, event, free motion)

Perfect for:
- Social apps
- vTubing
Our Technology
Orientation Keypoints
United States Patent No. 11,164,336; additional patents pending.
Backed by novel machine learning research, orientation keypoints are an elegant solution to human pose estimation in 6D.

It's realistic
Our AI captures motion completely, body and hands, using only a single camera.
It's fast
Our software runs in real time on a phone.
- iPhone 12 @ 60+ fps
- iPhone 8 or X @ 50+ fps
It's state-of-the-art
Our approach is backed by patented innovations. Read our recent peer-reviewed research.