Quest Hand Tracking
Roles:
Technologies: Unity (C#), Oculus hand tracking API, UniRx
Description: An assortment of experiments with Oculus’ hand tracking API, testing gestures, intuitive touch UI, and physics-based grab systems
Screenshot: using gestures to raycast (along a Bézier curve) to teleportation spots, with the teleport itself executed as a smooth transition
This is a collection of technical prototypes serving as proofs of concept for "hand-tracking-first" development
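To make the teleport arc concrete, here is a minimal sketch of sampling a quadratic Bézier curve from the hand toward a target spot; the class name, arc height, and sample count are illustrative assumptions, not taken from the project.

```csharp
using UnityEngine;

// Hypothetical sketch of the teleport arc: a quadratic Bezier
// B(t) = (1-t)^2*p0 + 2(1-t)t*p1 + t^2*p2, sampled into points
// that can feed a LineRenderer and a ground hit test.
public static class TeleportArc
{
    public static Vector3 Evaluate(Vector3 p0, Vector3 p1, Vector3 p2, float t)
    {
        float u = 1f - t;
        return u * u * p0 + 2f * u * t * p1 + t * t * p2;
    }

    // Bends the curve upward by 'arcHeight' at the midpoint (illustrative value).
    public static Vector3[] Sample(Vector3 start, Vector3 end, float arcHeight, int count)
    {
        Vector3 control = (start + end) * 0.5f + Vector3.up * arcHeight;
        var points = new Vector3[count];
        for (int i = 0; i < count; i++)
            points[i] = Evaluate(start, control, end, i / (count - 1f));
        return points;
    }
}
```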
Neat Stuff
Created multiple types of UI, each interactable with the user's chosen dominant hand and finger
Experimented with hand-physics interactions, switching held objects between kinematic and non-kinematic states (see the second sketch after this list)
UniRx treats events as first-class citizens, which makes it easy for web developers to program quickly
Made pinching and double pinching the primary form of input for UI (see the pinch-input sketch after this list)
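As a rough illustration of how pinch and double-pinch input can surface as first-class UniRx event streams, here is a minimal sketch assuming the Oculus Integration's OVRHand component; the 0.4-second double-pinch window and the component name are assumptions, not the project's actual values.

```csharp
using UniRx;
using UnityEngine;

// Hedged sketch: turning OVRHand pinch state into UniRx event streams.
// 'hand' is the OVRHand on the chosen dominant hand; the double-pinch
// window (0.4 s) is an illustrative value.
public class PinchInput : MonoBehaviour
{
    [SerializeField] OVRHand hand;

    void Start()
    {
        // Fires on the frame the index-finger pinch begins.
        var pinchDown = Observable.EveryUpdate()
            .Select(_ => hand.IsTracked && hand.GetFingerIsPinching(OVRHand.HandFinger.Index))
            .DistinctUntilChanged()
            .Where(isPinching => isPinching);

        pinchDown.Subscribe(_ => Debug.Log("pinch")).AddTo(this);

        // Two pinch-downs within 0.4 s count as a double pinch.
        pinchDown.Timestamp()
            .Buffer(2, 1)
            .Where(pair => (pair[1].Timestamp - pair[0].Timestamp).TotalSeconds < 0.4)
            .Subscribe(_ => Debug.Log("double pinch"))
            .AddTo(this);
    }
}
```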
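And a minimal sketch of the kinematic hand-off mentioned above: the object is driven kinematically while held and handed back to the physics engine on release. The component and method names (GrabSwitchable, OnGrab, OnRelease) are hypothetical.

```csharp
using UnityEngine;

// Hypothetical sketch of switching a grabbed object between kinematic
// (hand-driven) and non-kinematic (physics-driven) states.
[RequireComponent(typeof(Rigidbody))]
public class GrabSwitchable : MonoBehaviour
{
    Rigidbody rb;

    void Awake() { rb = GetComponent<Rigidbody>(); }

    // Called when the hand closes on the object.
    public void OnGrab(Transform handAnchor)
    {
        rb.isKinematic = true;                 // the hand now drives the pose
        transform.SetParent(handAnchor, worldPositionStays: true);
    }

    // Called when the hand opens; restores physics and throw momentum.
    public void OnRelease(Vector3 releaseVelocity)
    {
        transform.SetParent(null, worldPositionStays: true);
        rb.isKinematic = false;
        rb.velocity = releaseVelocity;
    }
}
```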
Room For Improvement
As pointing-based interactions grow more complex, I still want to tighten pointing accuracy
Occlusion is still a problem when trying to expand the set of gesture types
Scaling environments for different playtests was manual and should be automated
Physics grabbing is still tough and usually requires set grab points for each object (sketched below)
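For context on that last point, here is a minimal sketch of what per-object grab points can look like; the component and its snapping query are illustrative, not the project's actual code.

```csharp
using UnityEngine;

// Hypothetical illustration of set grab points: each grabbable carries
// hand-authored anchor transforms, and the grab system snaps the hand
// to the nearest one rather than solving a grip from live finger poses.
public class GrabPoints : MonoBehaviour
{
    [SerializeField] Transform[] grabPoints;   // authored per object in the editor

    public Transform NearestTo(Vector3 handPosition)
    {
        Transform best = null;
        float bestSqr = float.MaxValue;
        foreach (var point in grabPoints)
        {
            float sqr = (point.position - handPosition).sqrMagnitude;
            if (sqr < bestSqr) { bestSqr = sqr; best = point; }
        }
        return best;
    }
}
```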