
PoseCam
for Windows
&
PoseAI Studio

For creators and developers. Unlimited motion capture.

Now in Early Access! 

Send feedback to studio@pose-ai.com

Only $14.99* for
annual subscription

Try 1 week free!


*pricing varies per country
[Screenshot: PoseAI Studio]

PoseAI Studio

  • Visualize animations in real time with our tool built in Unreal Engine
     

  • Record to FBX for game engines or for further refinement in Blender, Autodesk Maya, etc.
     

  • Stream from webcam or videos

PoseCam for Windows

  • Stream animations in real time from a webcam
     

  • Full body, hands + facial blendshapes
     

  • Lightweight app

[Screenshot: PoseCam for Windows]


  • Add the metahuman to the demo project
    MetaHuman rigs are more complicated than the UE Mannequin, but here we walk you through the steps to animate a MetaHuman with PoseAI. We assume you start with our demo project for UE5, and have familiarized yourself with the animation blueprint and third person character controller blueprints from the demo. Import your MetaHuman as per Epic's instructions on adding a MetaHuman to a project. Make sure to enable all features required by MetaHuman on import.
  • Create an animation blueprint for the correct skeleton
    1. Open the MetaHuman blueprint (something like BP_AvatarName).
    2. Select the Body component (components are usually on the left side).
    3. Find the skeletal mesh in the Details panel (something like "m_med_nrw_body") and click the search button to locate it in the Content Browser.
    4. In the Content Browser, select the skeleton, right-click, and choose Create Asset -> Animation Blueprint. Call it something like ABP_AvatarName.
  • Setup the Metahuman character blueprint
    1. Go back to your MetaHuman blueprint and select the Body component again. In the Details panel, set Animation Mode to "Animation Blueprint" and Anim Class to "ABP_AvatarName" (or whatever name you created).
    2. Under Components, add the PoseAIMovementComponent to the MetaHuman.
    3. Drag the PoseAIMovementComponent into the event graph, drag off a pin to create an Add Source Next Open Port node, and connect it to Event Begin Play. Also create a Close Source node and connect it to Event End Play (you may need to add Event End Play to the graph).
    4. Create a handshake structure as input for the Add Source node. You can copy this from our demo, as the settings should largely be the same. Importantly, Rig should be set to "MetaHuman".
    5. Save the blueprint.
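The exact fields of the handshake structure should be copied from our demo project. Purely as a rough illustration, here is what such a configuration might look like if expressed as JSON. Apart from the "MetaHuman" rig value required above, the field names below are hypothetical assumptions, not the plugin's actual schema:

```python
import json

# Hypothetical handshake settings. Only the "rig" value is taken from the
# step above; "mode" and "mirror" are illustrative assumptions -- copy the
# real structure from the demo project.
handshake = {
    "rig": "MetaHuman",   # must match the skeleton you are animating
    "mode": "Room",       # camera mode, e.g. Room or Desktop
    "mirror": False,      # whether the stream is mirrored
}

packet = json.dumps(handshake)
print(packet)
```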
  • Setup the new animation blueprint
    Edit the ABP_AvatarName blueprint you created. Copy the structure of our example ABP in both the AnimGraph and Event Graph. The important element is the Live Link Pose node feeding the output pose in the AnimGraph, which needs the correct LiveLinkSubjectName. You could set this manually every time you begin playing the scene (at which point the dropdown is populated by the engine), or you can use our template to automate that for you. In the Event Graph, make sure that instead of casting to BP_ThirdPersonCharacter, as in our example, you cast to BP_AvatarName.
  • Setup the Daz-Unreal bridge and enable Fix Bone Rotations on import
    We assume rigs are imported into Unreal using the Daz to Unreal bridge. Please follow the tutorials from Daz on how to install the bridge. Important: make sure that in Project Settings in Unreal you ENABLE Fix Bone Rotations on Import.
  • Export from Daz and import to Unreal using the bridge
    Make sure you use the bridge, or the bone rotations will not be fixed and will not correspond to our LiveLink preset.
  • Set the third person character to use the Daz mesh
    In our demo project, open the third person character and switch the mesh to Daz.
  • Setup an animation blueprint for the Daz skeleton
    Unreal does not easily share animation blueprints between different skeletons. Select your Daz skeleton in the Content Browser and create an Animation Blueprint for it. Use our example UE Mannequin animation blueprint as a template and replicate the design for Daz. The most important node is the Live Link Pose node in the AnimGraph: make sure to set the retarget asset to PoseAILiveLinkRetargetRotations. The LiveLink subject name must be set to the name given to your phone upon connection; the rest of our template uses our helper nodes to assign this at runtime. If it is "None", your character will not animate.
  • Setup the third person blueprint
    Open the third person character blueprint in our demo project. Select the Mesh component and switch it to your Daz avatar. With the mesh still selected, find the animation settings, set Animation Mode to Use Animation Blueprint, and set Anim Class to the animation blueprint you created in the previous step. Find the Handshake node(s) in the event graph and change the "Rig" field to "DazUE" (capitalized exactly); this configures the AI to the correct bone structure and initial rotations. Important: some of our blueprints (such as UE5 at the time of writing) automatically switch the animation blueprint in the event graph between Desktop and Room mode at Event Begin Play. Delete or modify these nodes, or they will switch the animation blueprint to a different skeleton and your avatar will not animate.
  • Note: if arms are animating strangely...
    If the arms are backward, then you did not successfully fix bone rotations on import. Please review the earlier steps and consult the Daz forums. We do not provide support for this process.
  • Note: UE4 vs UE5 and Daz
    Currently, animating the Daz rig with our plugin seems to work better in UE4 than in UE5. In particular, the Daz rig in UE5 crashes occasionally, while UE4 is stable. We hope to identify the source of this issue and resolve it soon.
  • Install the PoseAI UE4 LiveLink plugin
    Install the PoseAILiveLink plugin from the Unreal Marketplace, or copy the folder into the [Your Project Name]/Plugins/ folder. Make sure the plugin is enabled from within the editor (Edit -> Plugins); this may require restarting the project. Go to Project Settings -> Plugins -> UDP Messaging and make sure Enable UDP Transport is checked. If this setting is not available, please enable Epic's UDP Messaging plugin from Edit -> Plugins, as LiveLink is unlikely to work without it.
  • NOTE: MacOS Unreal
    On macOS, the Marketplace build of the Pose AI plugin may not work when installed as usual to the Engine/Plugins/Marketplace/ folder, and when enabled it can prevent the engine from opening a project. A simple workaround is to copy the PoseAILiveLink plugin folder instead to [YourProjectName]/Plugins/. Once this is done, uninstall the plugin from the engine and it should work as intended. Alternatively, rebuilding from source may also work. We are trying to resolve this with Epic, as we understand it may be a known issue within the engine.
  • Add the PoseAIMovementComponent to the player blueprint
    Open the third person player controller blueprint and add the PoseAIMovementComponent to the character. This enables various motion capture events and provides helpful nodes to manage LiveLink sources for your character.
  • Add a source at Event Begin Play, close at Event End Play
    A LiveLink source represents a motion capture input stream. Each Pose Camera app is a unique source, and each source must be set up on a unique port. Drag the character's PoseAIMovementComponent into the blueprint, then drag off the node to create an AddSourceNextOpenPort node and connect it to Event Begin Play. This automatically adds a source on the next free port at the beginning of play. You can use an AddSource node instead if you want to set the port manually, but in that case, if you use multiple nodes (i.e. multiple characters), make sure you give each a unique port. Create a CloseSource node by dragging off the PoseAIMovementComponent and connect it to Event End Play. This closes and frees up the port, which is particularly important when using the editor: LiveLink sources persist outside of gameplay, and if you don't close them you will keep adding sources on new ports and need to change your phone settings to point at the right one.
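The "next open port" idea can be sketched outside Unreal. This Python snippet is not part of the plugin, and the port range is an arbitrary example; it only shows the logic the AddSourceNextOpenPort node automates for you, probing ports in order and taking the first one that binds:

```python
import socket

def next_open_udp_port(start: int = 8080, end: int = 8090) -> int:
    """Return the first UDP port in [start, end] that can be bound locally.

    Illustrative only: each Pose Camera source needs its own free port,
    so we try each port until one binds. The range is a made-up example,
    not the plugin's actual default.
    """
    for port in range(start, end + 1):
        with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
            try:
                sock.bind(("0.0.0.0", port))
                return port  # socket closes on exit, freeing the port
            except OSError:
                continue  # port in use, try the next one
    raise RuntimeError("no free UDP port in range")

print(next_open_udp_port())
```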
  • Modify the animation blueprint to include LiveLink
    Your animation blueprint needs to pull in the animation data from Pose Camera. Open the animation blueprint for your character. In the Event Graph, get the player object and cast it to your blueprint class so you can access the PoseAIMovementComponent. From the component, use the GetSubjectName method to store the LiveLinkSubjectName as a new variable in your blueprint. In the AnimGraph, add a Live Link Pose node and connect it to the blueprint's output pose. Set the LiveLink source on the node to your subject name variable. Select the Live Link Pose node and, in Details, set the retarget asset to "PoseAILiveLinkRetargetRotations". This preserves your mesh asset's skeletal dimensions while using the streamed joint rotations and root transform to animate. You can blend with conventional animations to suit your gameplay; see our advanced demo for an example where we blend with a state machine depending on gameplay mode and fade back and forth between mocap and conventional animation depending on the player's visibility on camera.
  • Connect Pose Camera on your phone
    Open the PoseCamera app on your phone. Go to settings (gear icon) and input the IP address of the computer running the Unreal Engine and the port number selected when configuring LiveLink (the first panel of LiveLink will display the first local host IP address detected, but if you have multiple adapters this may not be correct). Make sure your phone and the computer are on the same local network. Close settings on your phone. The app should show the IP address you entered at the bottom of the main screen. Tap to connect. When successful, the phone app should display "Unreal LiveLink" in place of the IP address. Unsuccessful? Try our troubleshooting page for tips on resolving common connection issues.
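To see why the displayed address may be wrong, you can list every IPv4 address your machine holds. This Python sketch is illustrative only (it is not part of the plugin, and it falls back to localhost if the hostname cannot be resolved); pick the address that is on the same network as your phone:

```python
import socket

def local_ipv4_addresses() -> list[str]:
    """List the IPv4 addresses assigned to this machine.

    A PC with several adapters (Ethernet, Wi-Fi, VPN, virtual switches)
    has several addresses, and the first one detected is not necessarily
    the one the phone can reach.
    """
    try:
        infos = socket.getaddrinfo(socket.gethostname(), None, socket.AF_INET)
        return sorted({info[4][0] for info in infos})
    except OSError:
        return ["127.0.0.1"]  # hostname not resolvable; localhost only

print(local_ipv4_addresses())
```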
  • Begin the stream
    Once connected, the phone app should offer "Tap to Stream". Tap, and the phone should switch to camera preview mode. Within Unreal, you should see your phone's name in the bottom LiveLink panel; it will show green when it is streaming detected poses.
  • Aim your camera at the subject
    Place your phone appropriately to capture your target space with an unobstructed view. For example, in Desktop mode you might place your phone underneath your monitor or in Room mode on your AV console. You can toggle between front/back camera. You can even aim your phone camera at media playing on another device. Make sure the camera can see most of the target. For Room or Portrait ensure the full body, head to feet, is visible. For Desktop mode ensure the camera can see the head to pelvis and ideally both elbows. The segmented human icon overlaying camera preview turns blue when a subject is visible and shows which limbs are currently poseable.
  • (Optional) Use mocap events for gameplay
    In your character blueprint, select the PoseAIMovementComponent. You should have many usable events, such as OnJump, OnCrouch, OnFootstep which you can use for gameplay. You can also drag special StepCounter objects from the component, such as "Footsteps", which tracks the number of step events and generates an input speed which can be fed into the conventional character controller. Please see our advanced demo for an example.
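As a rough sketch of how a step counter could turn footstep events into an input speed for a conventional character controller (the stride length and window size here are made-up illustration values, not the component's actual parameters):

```python
from collections import deque

class FootstepSpeed:
    """Toy version of a step counter feeding a character controller.

    Counts footstep events inside a sliding time window and converts the
    step rate into an input speed. Stride and window values are
    illustrative assumptions, not the plugin's.
    """
    def __init__(self, stride_m: float = 0.7, window_s: float = 1.0):
        self.stride_m = stride_m
        self.window_s = window_s
        self.events = deque()  # timestamps of recent footstep events

    def on_footstep(self, t: float) -> None:
        self.events.append(t)

    def speed(self, now: float) -> float:
        # Drop events older than the window, then convert steps/sec to m/s.
        while self.events and now - self.events[0] > self.window_s:
            self.events.popleft()
        return len(self.events) * self.stride_m / self.window_s

counter = FootstepSpeed()
for t in (0.0, 0.4, 0.8):      # three steps in under a second
    counter.on_footstep(t)
print(counter.speed(now=0.9))  # 3 steps * 0.7 m stride ~ 2.1 m/s
```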
  • Create an Animation Blueprint
    Create or modify an animation blueprint for your character's skeleton (e.g. the UE4_Mannequin).
  • Add a Live Link Pose node
    In the AnimGraph of the animation blueprint, add a Live Link Pose node and connect it to the blueprint's output pose. Set the LiveLink source on the node (e.g. "My iPhone").*
    *If you do not see a source, you have not connected LiveLink successfully. Please review our tips for connecting Pose Camera or read the official Unreal documentation for LiveLink.
  • Set retargeting for PoseAI
    Select the Live Link Pose node and, in Details, set the retarget asset to "PoseAILiveLinkRetargetRotations". This preserves your mesh asset's skeletal dimensions while using the streamed joint rotations and root transform to animate.
  • (Optional) Customize retargeting
    For better results, you can create a custom instance of "PoseAILiveLinkRetargetRotations" for each character, setting different values for the scaleTranslation variable, and assign it to the character's individual animation blueprint. This variable adjusts the root motion and pelvis height to accommodate different-sized skeletons and may help prevent the mesh from penetrating the ground.
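Conceptually, a scaleTranslation-style adjustment applies a per-axis multiplier to the streamed root translation, something like the sketch below. The values are illustrative only; the actual variable lives on the retarget asset in Unreal:

```python
def scale_root_translation(translation, scale):
    """Scale a streamed root translation per axis.

    Illustrative sketch: a character half the size of the capture subject
    should move half as far, and its pelvis should sit at half the height,
    so the feet stay on the ground.
    """
    return tuple(t * s for t, s in zip(translation, scale))

# Example: streamed root motion applied to a half-scale character.
print(scale_root_translation((100.0, 0.0, 90.0), (0.5, 0.5, 0.5)))
```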
  • Compile
    Compile and, if you are currently streaming, the preview skeleton should follow the PoseCamera movements.
  • Set your character to use the animation blueprint
    Open your character blueprint and select the Mesh component. In the Details panel set Animation/Animation Mode to Use Animation Blueprint. Set Animation/Anim Class to the blueprint you created or modified in the first step. Your character should now be driven by Pose Camera at runtime.
  • (Optional) Add the LiveLink Skeletal Animation component to Character
    In the components panel click on Add Component and add the LiveLink Skeletal Animation component to your character. This will also update the character in the editor with the animation stream. Check the character viewport while streaming to see your character animate.
  • (Optional) Create a blend for Desktop camera mode
    If you are using Desktop camera mode, Pose Camera will only stream the upper body. You can use blend pose to create the appropriate animation for the lower body, for example idle standing or a sitting animation. If the stream is in mirror mode, you will likely want to rotate the lower body by 180 degrees as well. Here is an example of an AnimGraph which can switch between animation modes based on boolean values.
  • Setup plugin and character
    Follow the steps outlined in this documentation to set up the plugin and your character using UE4 or Mixamo skeletons (please see the note below regarding MetaHuman rigs).
  • Add the LiveLink component to your character
    If you did not already do the optional step in the character setup guide, add the LiveLink Skeletal Animation component to your character by clicking on +Add Component in the components panel. This will allow you to record animations while in the editor (otherwise animations will only record while in Play mode).
  • Add your character to the world
    Drag your character blueprint into the viewport to add it to the level.
  • Record with Take Recorder
    Open Window/Cinematics/Take Recorder. Select +Source -> From Actor -> YourCharacter (from the previous step). Click on the red circle at the top of Take Recorder to begin recording (there will be a countdown). When finished click the square stop button.
  • Open your animation and inspect
    By default, each take is saved in subdirectories under Content/Cinematics. Find the folder for your take, open the Animation subfolder, and you should find an animation sequence capturing your streamed animation.
  • (Optional) Export your animation to FBX
    Unreal allows you to export animation sequences to FBX for editing with other software. From the menu, select Asset -> Export to FBX -> Animation Data.
  • [Note] Recording MetaHuman rig animations
    While the plugin successfully animates MetaHuman rigs at runtime and in the editor, using the Unreal Engine's Take Recorder to record MetaHuman animations via our LiveLink plugin can currently be problematic, with artifacts and warping of some transforms. Other users have reported similar issues with MetaHuman and Take Recorder on the Unreal forums. This may be addressed by the MetaHuman team at some point (MetaHuman is still in beta). Modifying translation retargeting settings on the skeletal rig may improve the results, but in our tests we still had warping on some body parts.
  • Try from a C++ project
    We have only successfully packaged LiveLink with C++ projects. For blueprint-only projects, you may need to add a blank C++ class to your project and compile it to switch the project to C++ mode; you will need Visual Studio installed.
  • Configure project settings
    Different settings may work for you, but we use: Edit -> Project Settings, then Project -> Packaging: Build = Always, Full Rebuild = True; Plugins -> UDP Messaging: Enabled By Default = True, Enable Transport = True.
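For reference, these editor settings correspond to entries like the following in the project's Config/DefaultEngine.ini. The section and key names below reflect our best understanding of the generated ini; verify them against your own project after changing the settings in the editor:

```ini
[/Script/UnrealEd.ProjectPackagingSettings]
Build=Always
FullRebuild=True

[/Script/UdpMessaging.UdpMessagingSettings]
EnabledByDefault=True
EnableTransport=True
```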
  • Try to build cleanly
    We usually delete the Intermediate and Saved folders just to make sure, and, if necessary, close Unreal and delete any binaries (plugin and project).
  • Use our nodes OR apply a LiveLink preset in game.
    Our demo project includes helper nodes, like "Add Source", that set up the source on Event Begin Play and close it on Event End Play. This should be enough to get LiveLink working in a packaged game. Alternatively, if you aren't using our nodes and are instead using the Unreal Engine LiveLink UI directly, make sure you create a LiveLink preset with your source settings and apply it in your game in order to create the LiveLink source.
  • And if that is not working...
    Try building and launching your project from Visual Studio using the DebugGame solution configuration. Make sure to first cook your content for Windows from the UE File menu. We find that in some cases running from Visual Studio clears out whatever in UAT was interfering with packaging.