---
title: Mocap Interoperability
description: Experiments on how to capture, stream, and replay mocap and animation data between game engines and virtual worlds.
image: https://i.imgur.com/bpBQwQJ.jpg
robots: index, follow
lang: en
dir: ltr
breaks: true
disqus: xrdevlog
---
# Mocap Interoperability
###### tags: `devlog`
![](https://i.gyazo.com/117e96a3327a2a71261f42965b35415d.gif)
:star: Join our weekly meetup for avatar interop! https://github.com/m3-org/avatar-interop :star:
**VRChat Webaverse mixamo animation interop**
{%youtube h-C5SRH3pB4 %}
Watch on Youtube: https://www.youtube.com/watch?v=h-C5SRH3pB4
---
## HEAT
Thread: https://twitter.com/HEATHEATHEAT__/status/1480597123628290050
Building a Motion Data Marketplace will add a new layer of self-expression to the metaverse, allowing the creation & sale of user-generated emotes that work cross-platform.
When uploading your motion capture data to our (v1) marketplace as a .glb file, you will have the option to pick from a variety of skeletons and avatars (PFPs that HEAT owns, such as Meebits or CloneX).
In time, you'll be able to upload your own avatars, allowing you to create branded or more personalized NFTs!
Ultimately we see dancers partnering with 3D artists, musicians, etc., to create pieces that can stand alone on their own as art while also being functional in-world.
![](https://i.imgur.com/ljSfF2h.gif)
---
## Dance Interop
Need to bake animations into a single .glb file.
The assets folder from JanusWeb:
![](https://i.imgur.com/NhBOhkQ.png)
- https://github.com/jbaicoianu/janusweb/blob/master/scripts/janusghost.js#L139
- https://github.com/jbaicoianu/janusweb/blob/5ff6926a3e12fc08a787f028ee1b03fb913007cb/scripts/janusplayer.js
Export a GLTF/GLB with several animations
https://www.polygonalmind.com/blog-posts/export-a-gltfglb-with-several-animations
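A minimal Blender Python sketch of that bake step, assuming a single armature object named "Armature" (the name and output path are placeholders): stash every action as an NLA strip so the glTF exporter writes them all into one .glb.
```python
import bpy

# Placeholder name: point this at your rig
arm = bpy.data.objects["Armature"]
arm.animation_data_create()

# Stash each action as its own NLA strip so the glTF exporter
# writes every one out as a named animation
for action in bpy.data.actions:
    track = arm.animation_data.nla_tracks.new()
    track.name = action.name
    track.strips.new(action.name, int(action.frame_range[0]), action)

# One .glb containing mesh, skeleton, and all animations
bpy.ops.export_scene.gltf(filepath="dances.glb", export_format='GLB')
```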
---
## Blender
![](https://i.imgur.com/lIyDJQi.png)
- https://www.blendernation.com/2021/08/17/retargeting-any-motion-capture-data-in-blender-addon-and-tutorial/
- https://github.com/cgvirus/Simple-Retarget-Tool-Blender
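Under the hood, constraint-based retargeting like these addons perform boils down to copying rotations bone by bone through a name map. A minimal bpy sketch (the object names and the two-entry map are placeholders):
```python
import bpy

# Placeholder names: point these at your mocap and avatar armatures
src = bpy.data.objects["MocapArmature"]
dst = bpy.data.objects["AvatarArmature"]

# Source bone -> target bone name map (excerpt)
BONE_MAP = {"mixamorig:Hips": "hips", "mixamorig:Spine": "spine"}

# Add a Copy Rotation constraint to each target bone so it
# follows the matching source bone
for src_name, dst_name in BONE_MAP.items():
    con = dst.pose.bones[dst_name].constraints.new('COPY_ROTATION')
    con.target = src
    con.subtarget = src_name
```
Real addons layer rest-pose alignment and root motion handling on top of this, but the bone map is the core of it.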
---
## Ossos
https://twitter.com/SketchpunkLabs/status/1499243173788401670?s=20&t=tKha4JsOqTShoV4ppxBCrA
---
## Unreal Engine
https://www.youtube.com/c/NOITOMMOCAP/videos
https://www.youtube.com/watch?v=W-_fTUMA-xc
https://www.unrealengine.com/en-US/blog/new-training-videos-explore-slay-virtual-art-department-pipeline
https://unrealengine.com/marketplace/en-US/product/mixamo-animation-retargeting
> UE5 has a brand new IK Retargeting system that supports both runtime and offline retargeting. Allows you to copy animation between any two skeletons regardless of bone count/naming/orientation, proportions or ref pose.
- https://twitter.com/kiaran_ritchie/status/1497271131450929155
- https://www.youtube.com/watch?v=T4bdHCNcHYs
- https://twitter.com/games_inu/status/1497259321108807681
---
## Neos VR
![](https://i.imgur.com/2JNqyAm.png)
![](https://i.imgur.com/8vXWA5l.jpg)
---
## Aframe
Been testing animation lately in this repo:
https://github.com/binzume/aframe-vrm
It apparently includes BVH samples:
![](https://i.imgur.com/kbAvbIn.png)
---
## Playcanvas
Inverse kinematics demo:
https://twitter.com/bob_thomps/status/1499692634436161539
https://blog.playcanvas.com/introducing-the-anim-state-graph/
---
## FrankMocap
A Strong and Easy-to-use Single View 3D Hand+Body Pose Estimator
https://github.com/facebookresearch/frankmocap
Blender importer for MOCAP data made with pose estimation tools
https://carlosedubarreto.gumroad.com/l/mocap_import
bvh -> fbx support is planned for soon after the ICCV Workshop 2021.
## Shadermotion / Udonmotion
![](https://i.imgur.com/TdNGymr.png)
https://vrchat.com/home/launch?worldId=wrld_394bda87-7fa7-4d92-841a-03ae9f09a51b
Shadermotion can be uploaded with an avatar to allow for mocap streaming / recording in any world.
https://lox9973.com/ShaderMotion/player-gltf.html
![](https://i.imgur.com/YkHbCVy.png)
Flowchart made by [Constapatience](https://twitter.com/Constapatience)
![](https://i.imgur.com/CRFqBk4.jpg)
![](https://i.imgur.com/p2Y131l.jpg)
https://gitlab.com/lox9973/UdonMotion
### NPCs
Shadermotion can be used to record and replay NPCs in VRChat worlds.
![](https://i.imgur.com/Yex6FcX.png)
![](https://i.imgur.com/ALtjk4c.png)
{%youtube r8YpXP0RlZc %}
Watch video: https://www.youtube.com/watch?v=r8YpXP0RlZc
https://gitlab.com/lox9973/ShaderMotion/-/tree/master/
https://gitlab.com/lox9973/ShaderMotion/-/wikis/home
All MIT licensed
https://gitlab.com/lox9973/UdonMotion/-/tree/master/
https://gitlab.com/lox9973/VMDMotion
Converts MMD motion data to ShaderMotion video in Unity.
### Udonmotion
UdonMotion enables in-world recording without requiring a special avatar.
![](https://i.imgur.com/6Cy8NfI.png)
![](https://i.imgur.com/QmG6hsK.png)
![](https://i.imgur.com/q8bOfTE.png)
Sign language
![](https://gyazo.com/48c37c19a4b32afc958735fbadf06d9a.gif)
### File Formats
- bvh
- fbx
- glb
iFire's ShaderMotion -> bvh notes: https://hackmd.io/EmOAt375S8KFqaNdPDJCXQ
A simple Unity Editor utility that converts bvh (animation) files to fbx using the Blender Python API and imports them into your project.
https://gist.github.com/CheapDevotion/85b8d70aa74afa7f4c135c1c971530ef
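For reference, the Blender half of that pipeline is only a few lines of bpy; this sketch (placeholder file paths) runs headless with `blender -b -P convert.py`:
```python
import bpy

# Start from an empty scene so only the BVH rig ends up in the FBX
bpy.ops.wm.read_factory_settings(use_empty=True)

# Import the BVH; Blender builds an armature with the motion on it
bpy.ops.import_anim.bvh(filepath="clip.bvh")

# Export FBX with the animation baked onto the armature
bpy.ops.export_scene.fbx(filepath="clip.fbx", bake_anim=True)
```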
https://discourse.threejs.org/t/is-there-a-way-to-animate-a-dae-glb-or-obj-model-on-the-fly-using-a-bvh-animation/16082
b3d_mocap_import: an addon for Blender to import mocap data from tools like EasyMocap, FrankMocap, and VIBE
https://github.com/carlosedubarreto/b3d_mocap_import
---
## Rokoko Video
- https://www.rokoko.com/products/video Free video -> Mocap
- https://github.com/Rokoko/rokoko-studio-live-blender Blender plugin
---
## OSCMotion
OSCMotion is a WIP by lox9973 that beams motion data from Unity to VRChat via OSC.
https://discord.com/channels/433492168825634816/947332578241810462/981072733993590804
https://discord.com/channels/@me/732048139858608139/981071398413938728
{%youtube y9gpozT5-rI %}
Watch on youtube: https://www.youtube.com/watch?v=y9gpozT5-rI
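For a feel of the approach, here's a minimal python-osc sketch that beams a single virtual tracker into VRChat, assuming VRChat's default OSC input port (9000) and its OSC tracker endpoints:
```python
from pythonosc.udp_client import SimpleUDPClient

# VRChat listens for OSC on localhost:9000 by default
client = SimpleUDPClient("127.0.0.1", 9000)

# Drive one virtual tracker (indices 1-8): position is in meters,
# rotation is Euler angles in degrees (assumed endpoint layout)
client.send_message("/tracking/trackers/1/position", [0.0, 1.0, 0.0])
client.send_message("/tracking/trackers/1/rotation", [0.0, 90.0, 0.0])
```
A full setup would stream these messages per frame from Unity's animation data rather than sending one static pose.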
---
## Metacosm
A portable dolly with an avatar that can spawn splines with a seat.
![](https://i.imgur.com/nAOhPBt.jpg)
![](https://gyazo.com/cbe0dd22551d7ae7933e3e0c49a8321b.gif)
https://twitter.com/MetacosmStudios/status/1529809578237501440
https://www.youtube.com/watch?v=4d0xbEwAOqQ
---
## Glycon3d
https://www.glycon3d.com/
![](https://i.imgur.com/jlp2sxg.png)
> Glycon lets you make your own motion capture using only your VR equipment!
>
> You just act out your scene in our virtual environment, or even bring in your own scene, props, and audio. Glycon records your sessions and builds mocap files you can use for your games, movies, educational material, or any other form of visual narrative. It's the simplest and fastest way to create professional quality motion capture, on any platform!
>
> Glycon compatible Headsets:
> Oculus Rift, Oculus Quest, Oculus Quest 2, HTC Vive, Valve Index, any Windows Mixed Reality headset, and any SteamVR compatible VR setup.
>
> Glycon creates FBX and BVH files that are compatible with:
> Blender, Cinema4D, iClone, LightWave, Maya, Max, Unreal, and Unity. Pretty much anything that can read FBX or BVH files.
A recent update adds audio recording directly from the headset microphone and increases recording accuracy, so the audio will sync perfectly with the mocap data exported at the same time. How easy is it to use? Check out this video and see! https://www.youtube.com/watch?v=RQQGpo1g2oM
{%youtube RQQGpo1g2oM %}
Doesn't run in background, but has really good mocap / hand tracking with Oculus Quest 2. Updated frequently.
{%youtube isJ22ViP7lQ %}
Watch on Youtube: https://www.youtube.com/watch?v=isJ22ViP7lQ
:::info
Some say ShaderMotion is better than Glycon.
:::
---
## Godot Mixamo Retargeting
https://github.com/pemguin005/Godot4-MixamoLibraries
---
## Mocap Fusion
https://www.mocapfusion.com/
![](https://i.imgur.com/o3ARAFo.png)
![](https://i.imgur.com/LEdPCYM.png)
---
## OMI glTF Subgroup
https://github.com/omigroup/gltf-extensions/discussions/43
https://github.com/V-Sekai/three-vrm-1-sandbox-mixamo
How Webaverse does animation interop:
- the sword defines the animations to use: https://github.com/webaverse/sword/blob/f6777403e2f883f985234ff5a8d1dd28a06f2e01/.metaversefile#L16
- the animation blend comes from the avatar system
- Webaverse then retargets it onto the VRM
![](https://i.imgur.com/ZsV9bN7.png)
Ideas discussed for animation interop:
- a .blend file as a shared animation library
- a CLI tool for the creator workflow
- a Blender 3.0 .blend file with bone names to match, or a bone name mapping table (see the bone-map sketch at the end of this section)
https://github.com/Miraikomachi/MiraikomachiVRM/blob/master/Miraikomachi.vrm
https://pixiv.github.io/three-vrm/examples/models/three-vrm-girl.vrm
https://github.com/Rokoko/rokoko-studio-live-blender
https://github.com/V-Sekai/V-Sekai-Blender-tools/tree/5793e1ba3c0cf1bedf75bed7ca49be31f5b6ad5b/addons/blender-skeletal-motion-animate
Keep the animations outside the avatar file and call them by name at runtime:
![](https://i.imgur.com/Cx4qeDl.jpg)
![](https://i.imgur.com/73tC6w6.jpg)
Proposed NLA-based workflow:
- author the animations and call them with an animation mixer; include as many as you want
- delete the extra skeletons and apply the clips to the skinned mesh / skeleton
- work in a glTF studio on your computer, following the glTF specs, to create an avatar with the animations you want
- then walk out into the metaverse and import it anywhere with a glTF loader
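As a rough illustration of the bone name mapping table idea, here's a Python excerpt mapping Mixamo bone names to VRM humanoid bones (names follow the Mixamo rig and the VRM humanoid spec; `retarget_track_name` is a hypothetical helper for renaming animation tracks):
```python
# Mixamo -> VRM humanoid bone name map (excerpt; a full table
# covers all ~50 humanoid bones)
MIXAMO_TO_VRM = {
    "mixamorig:Hips": "hips",
    "mixamorig:Spine": "spine",
    "mixamorig:Spine1": "chest",
    "mixamorig:Neck": "neck",
    "mixamorig:Head": "head",
    "mixamorig:LeftArm": "leftUpperArm",
    "mixamorig:LeftForeArm": "leftLowerArm",
    "mixamorig:LeftHand": "leftHand",
    "mixamorig:LeftUpLeg": "leftUpperLeg",
    "mixamorig:LeftLeg": "leftLowerLeg",
    "mixamorig:LeftFoot": "leftFoot",
}

def retarget_track_name(track_name: str) -> str:
    """Rename a Mixamo animation track to its VRM bone, e.g.
    'mixamorig:Hips.rotation' -> 'hips.rotation'."""
    bone, _, channel = track_name.partition(".")
    return f"{MIXAMO_TO_VRM.get(bone, bone)}.{channel}"
```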
### Needs testing
https://www.move.ai/
https://www.poseai.co.uk/
https://store.steampowered.com/app/1759710/MocapForAll/
https://github.com/vrm-c/UniVRM/issues/1522
https://github.com/godotengine/godot/pull/56902
400k animation files for research
https://github.com/BandaiNamcoResearchInc/Bandai-Namco-Research-Motiondataset/blob/master/README_Japanese.md
PBNS: Physically Based Neural Simulation for Unsupervised Outfit Pose Space Deformation.
https://github.com/hbertiche/PBNS
![](https://github.com/hbertiche/PBNS/raw/master/gifs/seqs.gif)
![](https://github.com/hbertiche/PBNS/raw/master/gifs/resizer0.gif)
Due to the simple formulation of PBNS and its lack of dependency on data, it can easily enhance any custom 3D avatar with realistic outfits in a matter of minutes!
![](https://github.com/hbertiche/PBNS/raw/master/gifs/avatar1.gif)
New UE5 IK retargeting tool: https://twitter.com/games_inu/status/1470120001092952072
Hand gesture puppeteering v2.0, a free iClone Reallusion plugin: https://www.youtube.com/watch?v=VOEnVM2TlMw
## Pixcap
AI motion capture with video
> Our main target is game artists and devs that do not necessarily have a lot of animation experience.
https://80.lv/articles/pixcap-developing-a-cloud-based-3d-animation-platform
{%youtube k_nVx_s180A %}
https://www.youtube.com/watch?v=k_nVx_s180A
## Omniverse Machinima Pose Tracker
![](https://i.imgur.com/zJvb9pA.png)
> Using our pose tracking AI model - a fully convolutional neural network model. Pose Tracker allows the user to animate a character using video as a motion source. Connect your character to the Pose tracker with the automatic Retargeting tool. You can load a pre-recorded movie file as the source and generate animation from that to save to anim clips - Or you can stream a live camera to animate your character in real time.
- https://docs.omniverse.nvidia.com/app_machinima/app_machinima/release-notes.html
- https://docs.omniverse.nvidia.com/prod_extensions/prod_extensions/ext_pose_tracker.html
- https://www.nvidia.com/en-us/on-demand/session/omniverse2020-om1454/
- Animation retargeting overview: https://medium.com/@nvidiaomniverse/what-is-animation-retargeting-4aadab383032
- https://www.nvidia.com/en-us/on-demand/session/gtcspring22-s41482/
Omniverse Machinima Pose Tracker Tutorial
https://www.youtube.com/watch?v=ewEVCBCSIKA
Export support? I'm still looking
https://docs.omniverse.nvidia.com/app_machinima/prod_content/mount-content.html?highlight=export