devlog
VRChat Webaverse mixamo animation interop
Learn More:
Watch on Youtube: https://www.youtube.com/watch?v=h-C5SRH3pB4
Thread: https://twitter.com/HEATHEATHEAT__/status/1480597123628290050
Building a Motion Data Marketplace will add a new layer of self-expression to the metaverse, enabling the creation and sale of user-generated emotes that work cross-platform.
When uploading your motion capture data (as a .glb file) to our v1 marketplace, you will have the option to pick from a variety of skeletons and avatars (PFPs that HEAT owns, such as a Meebit or a CloneX).
In time, you'll be able to upload your own avatars, allowing you to create branded or more personalized NFTs!
Ultimately we see dancers partnering with 3D artists, musicians, etc., to create pieces that can stand on their own as art while also being functional in-world.
Need to bake animations into a single .glb file
Assets folder from JanusWeb
Export a GLTF/GLB with several animations
https://www.polygonalmind.com/blog-posts/export-a-gltfglb-with-several-animations
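The post above covers the Blender side; as a rough sketch of what "several animations in one file" means at the glTF JSON level, here is a minimal Python illustration (the function and its index-rebasing are my own, not the post's workflow; bufferView/buffer rebasing and node remapping are omitted):

```python
import json

def merge_gltf_animations(base, extra):
    """Naive sketch: append the animation clips from `extra` into `base`.

    Assumes both files share an identical node hierarchy, so channel
    target node indices line up. Sampler input/output accessor indices
    are rebased by the accessor offset; real merges must also rebase
    bufferViews and buffers, which is omitted here.
    """
    merged = json.loads(json.dumps(base))  # deep copy, leave inputs untouched
    merged.setdefault("animations", [])
    offset = len(merged.get("accessors", []))
    for anim in extra.get("animations", []):
        clip = json.loads(json.dumps(anim))
        for sampler in clip.get("samplers", []):
            sampler["input"] += offset   # keyframe times accessor
            sampler["output"] += offset  # keyframe values accessor
        merged["animations"].append(clip)
    merged.setdefault("accessors", []).extend(extra.get("accessors", []))
    return merged
```

In practice a tool like gltf-transform or Blender's exporter does this (plus the buffer bookkeeping); the sketch only shows why clip merging is mostly an index-rebasing problem.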
https://twitter.com/SketchpunkLabs/status/1499243173788401670?s=20&t=tKha4JsOqTShoV4ppxBCrA
https://www.youtube.com/c/NOITOMMOCAP/videos
https://www.youtube.com/watch?v=W-_fTUMA-xc
https://www.unrealengine.com/en-US/blog/new-training-videos-explore-slay-virtual-art-department-pipeline
https://unrealengine.com/marketplace/en-US/product/mixamo-animation-retargeting
UE5 has a brand new IK Retargeting system that supports both runtime and offline retargeting. It lets you copy animation between any two skeletons regardless of bone count, naming, orientation, proportions, or reference pose.
Been testing animation lately in this repo
https://github.com/binzume/aframe-vrm
Apparently it includes BVH samples and inverse kinematics.
https://twitter.com/bob_thomps/status/1499692634436161539
https://blog.playcanvas.com/introducing-the-anim-state-graph/
A Strong and Easy-to-use Single View 3D Hand+Body Pose Estimator
https://github.com/facebookresearch/frankmocap
Blender importer for MOCAP data made with pose estimation tools
https://carlosedubarreto.gumroad.com/l/mocap_import
bvh -> fbx support soon after ICCV Workshop 2021
https://vrchat.com/home/launch?worldId=wrld_394bda87-7fa7-4d92-841a-03ae9f09a51b
Shadermotion can be uploaded with an avatar to allow for mocap streaming / recording in any world.
https://lox9973.com/ShaderMotion/player-gltf.html
https://gitlab.com/lox9973/UdonMotion
Shadermotion can be used to record and replay NPCs in VRChat worlds.
Learn More:
Watch video: https://www.youtube.com/watch?v=r8YpXP0RlZc
https://gitlab.com/lox9973/ShaderMotion/-/tree/master/
https://gitlab.com/lox9973/ShaderMotion/-/wikis/home
All MIT licensed
https://gitlab.com/lox9973/UdonMotion/-/tree/master/
https://gitlab.com/lox9973/VMDMotion
convert MMD to ShaderMotion video in Unity
UdonMotion is for in-world recording without requiring a special avatar.
Sign language
bvh
fbx
glb
ifire shadermotion -> bvh https://hackmd.io/EmOAt375S8KFqaNdPDJCXQ
A simple Unity Editor utility that converts bvh (animation) files to fbx using the Blender Python API and imports them into your project.
https://gist.github.com/CheapDevotion/85b8d70aa74afa7f4c135c1c971530ef
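For reference, BVH is plain text: a HIERARCHY block of nested ROOT/JOINT declarations followed by a MOTION block of per-frame channel values. Converters like the utility above rebuild the skeleton from that tree before baking keyframes. A minimal sketch (my own illustration, not the utility's actual code) that pulls the joint names out of the hierarchy:

```python
def bvh_joint_names(bvh_text):
    """Collect joint names from a BVH HIERARCHY section.

    Stops at the MOTION keyword, which begins the per-frame data.
    'End Site' markers carry no name and are skipped naturally.
    """
    names = []
    for line in bvh_text.splitlines():
        parts = line.strip().split()
        if len(parts) >= 2 and parts[0] in ("ROOT", "JOINT"):
            names.append(parts[1])
        elif parts[:1] == ["MOTION"]:
            break
    return names
```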
b3d_mocap_import: addon for blender to import mocap data from tools like easymocap, frankmocap and Vibe
https://github.com/carlosedubarreto/b3d_mocap_import
OSCMotion WIP by lox9973
beaming motion data from unity to vrchat via OSC
https://discord.com/channels/433492168825634816/947332578241810462/981072733993590804
https://discord.com/channels/@me/732048139858608139/981071398413938728
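OSC itself is a simple binary format over UDP (VRChat listens on port 9000 by default). A minimal sketch of encoding a single-float OSC message in Python; the address shown is only an illustration, not necessarily what OSCMotion actually sends:

```python
import struct

def osc_pad(b):
    # OSC strings are null-terminated and padded to a 4-byte boundary
    b += b"\x00"
    while len(b) % 4:
        b += b"\x00"
    return b

def osc_float_message(address, value):
    """Encode an OSC message with one float32 argument.

    Layout: padded address string, padded type-tag string (",f"),
    then the argument as a big-endian float.
    """
    return osc_pad(address.encode()) + osc_pad(b",f") + struct.pack(">f", value)
```

Sending the packet is just `socket.sendto(msg, ("127.0.0.1", 9000))`; libraries like python-osc wrap this same encoding.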
Learn More:
Watch on youtube: https://www.youtube.com/watch?v=y9gpozT5-rI
Portable dolly with avatar that can spawn splines with a seat
https://twitter.com/MetacosmStudios/status/1529809578237501440
https://www.youtube.com/watch?v=4d0xbEwAOqQ
Glycon lets you make your own motion capture using only your VR equipment!
You just act out your scene in our virtual environment, or even bring in your own scene, props, and audio. Glycon records your sessions and builds mocap files you can use for your games, movies, educational material, or any other form of visual narrative. It's the simplest and fastest way to create professional-quality motion capture, on any platform!
Glycon-compatible headsets: Oculus Rift, Oculus Quest, Oculus Quest 2, HTC Vive, Valve Index, any Windows Mixed Reality headset, and any SteamVR-compatible VR setup.
Glycon creates FBX and BVH files that are compatible with: Blender, Cinema4D, iClone, LightWave, Maya, Max, Unreal, and Unity. Pretty much anything that can read FBX or BVH files.
Adds audio recording directly from the headset microphone. Also increases accuracy for recordings, so the audio will sync perfectly with the mocap data exported at the same time! How easy is it to use? Check out this video and see! https://www.youtube.com/watch?v=RQQGpo1g2oM
Learn More:
Doesn't run in background, but has really good mocap / hand tracking with Oculus Quest 2. Updated frequently.
Learn More:
Watch on Youtube: https://www.youtube.com/watch?v=isJ22ViP7lQ
Some say ShaderMotion is better than Glycon.
https://github.com/pemguin005/Godot4-MixamoLibraries
https://github.com/omigroup/gltf-extensions/discussions/43
https://github.com/V-Sekai/three-vrm-1-sandbox-mixamo
How Webaverse does animation interop:
the sword defines the animations to use: https://github.com/webaverse/sword/blob/f6777403e2f883f985234ff5a8d1dd28a06f2e01/.metaversefile#L16
the animation blend comes from the avatar system
and webaverse retargets it onto the VRM
.blend file for animation interop / shared library
CLI tool for creator workflow
blender 3.0 .blend file with bone names to match
or bone name mapping table
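A bone name mapping table could look like the sketch below (hand-written and partial; a real table covers the whole humanoid rig). Note that Mixamo exports often prefix bones with "mixamorig:", though the exact naming varies between exports, while the targets follow the VRM humanoid bone names:

```python
# Partial Mixamo -> VRM humanoid bone mapping (illustrative subset)
MIXAMO_TO_VRM = {
    "mixamorig:Hips": "hips",
    "mixamorig:Spine": "spine",
    "mixamorig:Head": "head",
    "mixamorig:LeftArm": "leftUpperArm",
    "mixamorig:RightArm": "rightUpperArm",
}

def retarget_track_name(track):
    """Rename an animation track like 'mixamorig:Hips.position' so it
    targets the corresponding VRM humanoid bone; unknown bones pass
    through unchanged."""
    bone, dot, prop = track.partition(".")
    mapped = MIXAMO_TO_VRM.get(bone, bone)
    return mapped + dot + prop
```

This is essentially what runtime retargeters do per track before feeding clips to an animation mixer.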
https://github.com/Miraikomachi/MiraikomachiVRM/blob/master/Miraikomachi.vrm
https://pixiv.github.io/three-vrm/examples/models/three-vrm-girl.vrm
https://github.com/Rokoko/rokoko-studio-live-blender
take the animations outside the avatar file, then call them by name
NLA (non-linear animation)
create the animations
call them with the animation mixer
include as many as you want
delete the extra skeletons
process to apply them to a skinned mesh / skeleton
glTF studio on the computer
per the glTF specs, create an avatar on the system with the animations we want
walk out into the metaverse, import anywhere with a glTF loader
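The "import anywhere with a glTF loader" step leans on the GLB container being trivially parseable. As a minimal sketch (my own illustration) of the glTF 2.0 binary layout, a 12-byte header followed by length-prefixed chunks, here is how a loader pulls out the JSON scene description:

```python
import json
import struct

def read_glb_json(data):
    """Parse the JSON chunk out of a binary glTF (.glb) container.

    Header: magic 'glTF', uint32 version, uint32 total length (all
    little-endian). The first chunk must be JSON (type 0x4E4F534A);
    binary buffer chunks that may follow are ignored here.
    """
    magic, version, _length = struct.unpack_from("<4sII", data, 0)
    assert magic == b"glTF" and version == 2
    chunk_len, chunk_type = struct.unpack_from("<II", data, 12)
    assert chunk_type == 0x4E4F534A  # 'JSON'
    return json.loads(data[20:20 + chunk_len])
```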
https://www.move.ai/
https://www.poseai.co.uk/
https://store.steampowered.com/app/1759710/MocapForAll/
https://github.com/vrm-c/UniVRM/issues/1522
https://github.com/godotengine/godot/pull/56902
400k animation files for research
https://github.com/BandaiNamcoResearchInc/Bandai-Namco-Research-Motiondataset/blob/master/README_Japanese.md
PBNS: Physically Based Neural Simulation for Unsupervised Outfit Pose Space Deformation.
https://github.com/hbertiche/PBNS
Due to the simple formulation of PBNS and its lack of dependency on data, it can be used to easily enhance any custom 3D avatar with realistic outfits in a matter of minutes!
New UE5 IK retargeting tool https://twitter.com/games_inu/status/1470120001092952072
Hand gesture puppeteering v2.0 iclone reallusion free plugin https://www.youtube.com/watch?v=VOEnVM2TlMw
AI motion capture with video
Our main target is game artists and devs who do not necessarily have a lot of animation experience.
https://80.lv/articles/pixcap-developing-a-cloud-based-3d-animation-platform
Learn More:
https://www.youtube.com/watch?v=k_nVx_s180A
Using our pose tracking AI model, a fully convolutional neural network, Pose Tracker allows the user to animate a character using video as a motion source. Connect your character to the Pose Tracker with the automatic Retargeting tool. You can load a pre-recorded movie file as the source and generate animation from it to save as anim clips, or you can stream a live camera to animate your character in real time.
https://www.youtube.com/watch?v=ewEVCBCSIKA
Export support? I'm still looking
https://docs.omniverse.nvidia.com/app_machinima/prod_content/mount-content.html?highlight=export