
Hand Tracking

Device support for the popular Leap Motion peripheral has existed in Janus for years. In this log we look at the past and future of using our hands in mixed reality.

Originally published February 2019

tags: devlog hands janus input ui

One of the first things people do in a VR demo is look at their hands. Leap Motion support was added to JanusVR on November 21, 2014, and to JanusWeb on June 24, 2016. Leap Motion hand movements are also transmitted through the networked presence server by both clients, so multiple people can see each other's hand gestures.
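As a rough illustration of that flow (not the actual JanusWeb implementation), the sketch below uses the leapjs library to sample hand data each frame and relay a compact summary to a presence server over a WebSocket. The endpoint URL and the `hand_update` message shape are made up for the example.

```typescript
// Sketch only: read Leap Motion frames with leapjs in the browser and relay
// a compact hand summary to a (hypothetical) presence server over WebSocket.

// leapjs ships no bundled TypeScript types, so treat the module as `any`.
const Leap: any = require('leapjs');

// Hypothetical endpoint; JanusWeb's real presence protocol differs.
const presence = new WebSocket('wss://presence.example.com/hands');

Leap.loop((frame: any) => {
  if (presence.readyState !== WebSocket.OPEN || frame.hands.length === 0) return;

  // Palm position and per-finger tip positions for each tracked hand.
  const hands = frame.hands.map((hand: any) => ({
    side: hand.type,                                   // 'left' or 'right'
    palm: hand.palmPosition,                           // [x, y, z] in mm, Leap space
    fingers: hand.fingers.map((f: any) => f.tipPosition),
  }));

  presence.send(JSON.stringify({ type: 'hand_update', hands }));
});
```

On the receiving side, each remote client would apply those positions to a hand avatar in the scene, the same way head and controller poses are replicated.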

Looking at hands in a light-and-shadow WebVR demo

Hands are an ideal standard input device for prototyping UI/UX in VR: they are always powered, always familiar, and always attached to the user, which already makes them better than most of the controllers people use today.

Hands were used to search records and play them on machines

Reaching into 6 JanusWeb sites at once

Notes

Useful UI zones in VR

A UI/UX designed for Leap Motion in VR should transition smoothly to AR, especially with Project North Star becoming more available to developers.

Prototype UI seen through Project North Star

Pull out various tabs with pinch and drag
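The pinch-and-drag interaction shown above can be approximated with leapjs's built-in pinch strength value. The sketch below is illustrative rather than the code behind the prototype: the `Tab` type, the tab list, and the 50 mm grab radius are assumptions standing in for whatever UI objects the scene actually exposes.

```typescript
// Sketch only: pinch-to-drag driven by leapjs's pinchStrength (0..1).

// leapjs ships no bundled TypeScript types, so treat the module as `any`.
const Leap: any = require('leapjs');

// Hypothetical UI tabs positioned in Leap coordinate space (millimeters).
interface Tab { name: string; position: [number, number, number]; }
const tabs: Tab[] = [
  { name: 'inventory', position: [0, 200, 0] },
  { name: 'settings',  position: [100, 200, 0] },
];

const PINCH_START = 0.8;  // start dragging above this pinch strength
const PINCH_END   = 0.5;  // release below this (hysteresis avoids jitter)
const GRAB_RADIUS = 50;   // mm, how close the pinch must be to grab a tab

let dragged: Tab | null = null;

function distance(a: number[], b: number[]): number {
  return Math.hypot(a[0] - b[0], a[1] - b[1], a[2] - b[2]);
}

Leap.loop((frame: any) => {
  const hand = frame.hands[0];
  if (!hand) { dragged = null; return; }

  const tip = hand.indexFinger.tipPosition; // [x, y, z] in mm

  if (!dragged && hand.pinchStrength > PINCH_START) {
    // Grab the nearest tab within reach, if any.
    dragged = tabs.find(t => distance(t.position, tip) < GRAB_RADIUS) ?? null;
  } else if (dragged && hand.pinchStrength < PINCH_END) {
    dragged = null; // pinch released
  }

  if (dragged) {
    dragged.position = [tip[0], tip[1], tip[2]]; // tab follows the pinch point
  }
});
```

Using two thresholds instead of one keeps a tab from flickering between grabbed and released as the pinch strength hovers around a single cutoff.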