<center><img src="https://i.imgur.com/AxQOFqW.png" alt="alt text" width="768" height="384"></center>
</br>
# DIY Markerless Mocap
## Overview
Motion capture involves recording the movement of a person or object and translating it into a digital format. The equipment and software traditionally required are expensive, putting the technique out of reach for many people. With an iPhone (or Windows PC) and Blender, however, you can now achieve motion capture cheaply.
In this tutorial, we'll walk through capturing motion data with the ThreeD Pose Tracker app, importing it into Blender, and retargeting it onto a 3D model. The results won't be perfect, but this workflow is a great alternative for anyone looking to achieve motion capture on a budget.
*Requirements: an iPhone or Windows PC to run TDPT, a Windows or Mac PC with Blender installed, and a VRM file. You can use an avatar format other than VRM, but we'll be focusing on a VRM workflow in this tutorial.*
## Process
1. Download [ThreeD Pose Tracker](https://digital-standard.com/tdpt_lp/en/) (TDPT). It's available for both [Windows](https://digital-standard.com/tdpt/) (purchase [here](https://booth.pm/ja/items/3698596)) and [iOS](https://digital-standard.com/tdptios/) (purchase [here](https://apps.apple.com/us/app/tdpt/id1561086509)). To complete this tutorial, you'll need to purchase the premium version for a few US dollars. I'll be using screenshots from the iOS app, but the same functionality is available in the Windows app. If the Windows app gives you issues, restart your computer and open TDPT before any other application; that usually fixes it for me.
<center><img src="https://i.imgur.com/vD8gkRg.png" alt="alt text" width="321" height="694.5"></center>
</br>
2. Record your video, or (not recommended) use a third-party website to scrape and download a video from your social media platform of choice. Below is the ridiculous input video I'll be using for this tutorial.
<center><img src="https://i.imgur.com/Wp6daSu.gif" alt="alt text" width="321" height="600"></center>
</br>
3. Open TDPT and select the `hamburger menu` icon at the bottom left of the UI. Press `Select video` and choose your video file. Note that the video needs to be in your iPhone's Files app. If it's in your `Photo Library`, select it, press the `Share` icon, and choose `Save to Files`.
<center><img src="https://i.imgur.com/C8n10hn.png" alt="alt text" width="321" height="694.5"></center>
</br>
4. The video will start playing, and the TDPT avatar will start moving to match the pose of the person in the video.
<center><img src="https://i.imgur.com/b19RCdz.gif" alt="alt text" width="321" height="700"></center>
</br>
5. Once you're happy with the result, press the `Star` button in the UI to access the premium features. Scroll down to the `Exporting motion data` panel and select `BVH`. This starts recording the animation. To stop recording, tap the screen and press the `Stop` button.
<center><img src="https://i.imgur.com/nqk2gBA.gif" alt="alt text" width="321" height="700"></center>
</br>
6. The BVH file will be saved to your iPhone's files, by default in `Files -> On My iPhone -> TDPT -> motion`. Send this file to your PC.
7. Now it's time to install two free Blender add-ons: the [Rokoko Blender Plugin](https://www.rokoko.com/integrations/blender) and the [VRM Add-on For Blender](https://vrm-addon-for-blender.info/en/). Once you've downloaded both files, open Blender, go to `Edit -> Preferences -> Add-ons -> Install`, select the downloaded files, and enable them.
<center><img src="https://i.imgur.com/xIOXoe5.png" alt="alt text" width="800" height="100"></center>
</br>
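If you prefer to script this step (for example when setting up more than one machine), the add-ons can also be installed from Blender's Python console. This is a minimal sketch; the zip paths and module names below are assumptions, so check the Add-ons list in Preferences for the exact names on your install.
```python
# Optional: install and enable the two add-ons from Blender's Python console
# instead of clicking through Edit -> Preferences -> Add-ons -> Install.
import bpy

# Assumed download locations -- adjust to where you saved the zips.
addon_zips = [
    r"C:\Downloads\rokoko-studio-live-blender-master.zip",
    r"C:\Downloads\VRM_Addon_for_Blender-release.zip",
]

for zip_path in addon_zips:
    bpy.ops.preferences.addon_install(filepath=zip_path)

# Module names are assumptions; check the Add-ons list for the exact names.
for module in ("rokoko-studio-live-blender-master", "io_scene_vrm"):
    bpy.ops.preferences.addon_enable(module=module)

bpy.ops.wm.save_userpref()  # keep the add-ons enabled across restarts
```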
8. Select `File -> Import -> Motion Capture (.bvh)` and choose your `.bvh` file. Your motion capture data is now in Blender; press `Space` to play it back. Note that you may need to speed the animation up: in the animation timeline, select all keyframes with `A`, press `S` to scale them, and type a factor below 1 (scaling the keyframes closer together plays the clip faster).

</br>
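The same import and speed-up can be done from Blender's Python console. Here's a minimal sketch, assuming a hypothetical file path and a 2x speed-up:
```python
# Optional scripted version of step 8: import the BVH and speed the action up.
import bpy

# Assumed path -- point this at the BVH you copied from TDPT.
bpy.ops.import_anim.bvh(filepath=r"C:\mocap\tdpt_capture.bvh")

# The importer leaves the new armature active; grab its action and compress
# the keyframes. A factor of 0.5 plays the clip twice as fast.
armature = bpy.context.active_object
action = armature.animation_data.action
speed_factor = 0.5

for fcurve in action.fcurves:
    for key in fcurve.keyframe_points:
        key.co.x *= speed_factor
        key.handle_left.x *= speed_factor
        key.handle_right.x *= speed_factor

# Shrink the scene's frame range to match the shortened clip.
bpy.context.scene.frame_end = int(action.frame_range[1])
```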
9. Import your VRM file via `File -> Import -> VRM (.vrm)` and click `Import Anyway` when prompted. You can use other file types, but make sure the model is in T-pose before importing.

</br>
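This step can also be scripted, assuming the VRM Add-on For Blender is enabled. A minimal sketch; the operator name may differ between add-on versions, and the file path is an assumption:
```python
# Optional scripted version of step 9: import the VRM avatar.
import bpy

# Assumed path -- point this at your avatar file.
bpy.ops.import_scene.vrm(filepath=r"C:\avatars\my_avatar.vrm")

# List the armatures in the scene so you can confirm the avatar's
# armature name for the retargeting step.
for obj in bpy.data.objects:
    if obj.type == 'ARMATURE':
        print("Armature:", obj.name)
```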
10. In the sidebar of the 3D viewport (press `N` if it's hidden), open the Rokoko plugin tab. Set your animation armature as the `Source` and your VRM armature as the `Target`, then click `Build Bone List`.
<center><img src="https://i.imgur.com/XpeSr5Z.png" alt="alt text" width="300" height="700"></center>
</br>
11. If your bones aren't mapped automatically, use the screenshots below as a reference for the mapping. If you're using a file other than a VRM, you'll have to map the bones manually according to your avatar's armature; a small helper script for listing bone names follows the screenshots.
<center><img src="https://i.imgur.com/NaJhVSX.png" alt="alt text" width="300" height="550"></center>
<center><img src="https://i.imgur.com/G204LLf.png" alt="alt text" width="300" height="550"></center>
<center><img src="https://i.imgur.com/Q8u4YPE.png" alt="alt text" width="300" height="550"></center>
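When mapping by hand, it helps to print both armatures' bone names to the system console first. A minimal sketch; the object names are assumptions, so swap in the names shown in your Outliner:
```python
# Helper for step 11: print the bone names of both armatures so you can
# fill in the Rokoko bone list by hand for a non-VRM avatar.
import bpy

SOURCE_NAME = "tdpt_capture"  # assumed name of the imported BVH armature
TARGET_NAME = "Armature"      # assumed name of your avatar's armature

for name in (SOURCE_NAME, TARGET_NAME):
    armature = bpy.data.objects.get(name)
    if armature is None or armature.type != 'ARMATURE':
        print(f"Could not find an armature object named '{name}'")
        continue
    print(f"--- Bones in {name} ---")
    for bone in armature.data.bones:
        print(bone.name)
```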
12. Select `Retarget Animation`. This may take a few minutes depending on your machine and the length of the animation.
13. Ta-da! You have now animated your character. Delete the original BVH armature and export to the format of your choice. If exporting as a `.glb`, be sure to enable `Optimize Animation Size`. A scripted version of this step is sketched after the screenshots below.
<center><img src="https://i.imgur.com/992VDTD.png" alt="alt text" width="600" height="600"></center>
</br>
<center><img src="https://i.imgur.com/UZDGGZH.png" alt="alt text" width="300" height="500"></center>
</br>
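For reference, here's a minimal scripted version of the cleanup and export. The object name, output path, and exporter flag are assumptions; the `Optimize Animation Size` checkbox maps to a glTF exporter option whose name can vary between Blender versions.
```python
# Optional scripted version of step 13: remove the original BVH armature
# and export the retargeted avatar as a .glb.
import bpy

# Assumed name of the imported BVH armature -- check your Outliner.
bvh_armature = bpy.data.objects.get("tdpt_capture")
if bvh_armature is not None:
    bpy.data.objects.remove(bvh_armature, do_unlink=True)

bpy.ops.export_scene.gltf(
    filepath=r"C:\exports\mocap_avatar.glb",  # assumed output path
    export_format='GLB',
    export_animations=True,
    export_optimize_animation_size=True,      # "Optimize Animation Size"
)
```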
## Conclusion
Motion capture with just an iPhone and Blender is now more accessible than ever. With the ThreeD Pose Tracker app and a couple of free Blender add-ons, you can capture motion data and retarget it onto a 3D model, and the result can be used in game development, film, animation, and more. The process isn't as accurate as traditional motion capture, but it's a cheap and efficient alternative for anyone who wants to experiment. With some practice and creativity, the possibilities are endless.
Below are some examples of what our animations look like across a variety of platforms:
<center><img src="https://i.imgur.com/16R1yJO.png" alt="alt text" width="600" height="600"></center>
</br>
<center>Unreal Engine 5</center>
</br>
<center><img src="https://i.imgur.com/YB0M5cr.png" alt="alt text" width="600" height="600"></center>
</br>
<center>Hyperfy</center>
</br>
<center><img src="https://i.imgur.com/1opa6vP.png" alt="alt text" width="600" height="600"></center>
</br>
<center>Spatial</center>
</br>
<center><img src="https://i.imgur.com/wsfip4C.jpg" alt="alt text" width="300" height="450"></center>
</br>
<center>Augmented Reality</center>
</br>
:::info
*MetaMike is a virtual world experience developer with a passion for contributing to the education and enablement of the open metaverse.*
Follow MetaMike
[Website](https://itsmetamike.xyz/) | [Twitter](https://twitter.com/itsmetamike) | [LinkedIn](https://www.linkedin.com/in/itsmetamike/)
*If you'd like to support me, my work shown above is available to mint or purchase [here](https://www.itsmetamike.xyz/gallery).*
:::