Hand Tracking
Instructions for how hand tracking works.
Hand tracking is currently available on Meta Quest headsets that support OpenXR. For optimum performance and accuracy, we recommend the Meta Quest 3.
If hand tracking is not enabled on the headset, enable the hand tracking feature in the headset's settings.
Following this, you will need to create a developer account and enable developer mode to use hand tracking with the Quest headset connected to the PC.
Go to the Meta Quest Developer Dashboard.
Fill in the appropriate information.
If prompted, log in with your Meta Developer account.
Verify your account with one of the two following options:
Add a payment method to your account (PayPal excluded).
Confirm your mobile number to set up SMS two-factor verification, and check that your country and phone carrier are supported for SMS.
Once you belong to a developer organization, follow these steps to put your device in developer mode:
Download and open the Meta Horizon mobile app, and navigate to Menu > Devices.
Select your device. Then, select Headset Settings > Developer Mode and switch on Developer Mode.
Open the Meta Quest Link application, select Settings, then click on the Beta tab and enable Developer runtime features.
Once enabled, hand tracking will automatically activate when users put down their controllers. Grabbing the controllers again will automatically deactivate hand tracking.
When selecting the View Adjust Mode in the wrist panel, a user can scale, rotate, and move through a scene using hand gestures.
To move around the scene, select View Adjust Mode and use the grasp gesture to move.
Use the grasp gesture with both hands to scale and rotate the scene. Grasp with both hands and move them apart to scale up the scene, or bring the hands closer together to scale down the scene.
If the hands are moved in a rotating motion (for example, when one hand is moved forward and the other is kept stationary or moved backwards), the scene will rotate accordingly.
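As an illustration of how a two-hand grasp can drive scene scaling and rotation, the sketch below derives a scale factor from the change in distance between the hands and a yaw rotation from the change in heading of the hand-to-hand vector. This is a minimal sketch under assumed conventions; the function, its argument names, and the yaw-only rotation are illustrative, not the product's implementation.

```python
import math

def two_hand_scene_update(prev_left, prev_right, curr_left, curr_right):
    """Illustrative sketch (not the product's API): derive a scale factor and a
    yaw rotation from the change in the vector between the two grasping hands.

    Each argument is an (x, y, z) hand position in metres.
    """
    # Vector from the left hand to the right hand, before and after the motion.
    prev_vec = [r - l for l, r in zip(prev_left, prev_right)]
    curr_vec = [r - l for l, r in zip(curr_left, curr_right)]

    prev_dist = math.sqrt(sum(c * c for c in prev_vec))
    curr_dist = math.sqrt(sum(c * c for c in curr_vec))

    # Moving the hands apart scales the scene up; bringing them together scales it down.
    scale_factor = curr_dist / prev_dist if prev_dist > 1e-6 else 1.0

    # Moving one hand forward while the other stays put changes the heading of the
    # hand-to-hand vector; use that change as a yaw rotation of the scene.
    prev_yaw = math.atan2(prev_vec[2], prev_vec[0])
    curr_yaw = math.atan2(curr_vec[2], curr_vec[0])
    yaw_delta = curr_yaw - prev_yaw

    return scale_factor, yaw_delta

# Example: hands moved from 0.3 m apart to 0.6 m apart -> scale_factor == 2.0
scale, yaw = two_hand_scene_update((0, 1, 0), (0.3, 1, 0), (-0.15, 1, 0), (0.45, 1, 0))
```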
Hand tracking will be automatically activated when controllers are not tracking.
To control the robot, select Robot Control Mode and use the grasp gesture as shown in the image to take hold of and manipulate the robot. Extending and clasping the index finger will open and close the gripper.
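The index-finger behaviour can be pictured as thresholding a curl value. The sketch below is a minimal illustration assuming a normalized curl value in [0, 1] from hand tracking; the function name, thresholds, and the dead band are assumptions, not the product's implementation.

```python
def gripper_command_from_index_curl(curl, open_threshold=0.3, close_threshold=0.7):
    """Illustrative sketch (not the product's API): map an index-finger curl value
    in [0, 1] (0 = fully extended, 1 = fully clasped) to a gripper command.

    A dead band between the two thresholds avoids chattering between open and
    close when the finger hovers near the middle of its range.
    """
    if curl <= open_threshold:
        return "open"
    if curl >= close_threshold:
        return "close"
    return "hold"  # keep the current gripper state inside the dead band
```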
From the settings menu, turn on Speech Recognition.
Use the command "Speech Recognition Start" to enable speech recognition.
Switch to "Robot Control Mode" either using the control panel or with the "Robot Control Mode" command.
Bring the correct hand close to the Dexterous Hand.
Use the "Engage Manual Control" command to take control of the hand.
While using hand tracking, users can interact with the UI using the tips of their index fingers.
For better accuracy and general-purpose support, hand tracking and the Dexterous Hand are mapped through a per-finger target consisting of 6 parameters (3 scaling parameters and 3 offset parameters), a total of 30 parameters for the whole hand, used to position the targets on the Dexterous Hand.
Offset (b) and scaling (a) are defined for each finger and each axis (6 parameters per finger x 5 fingers = 30 parameters in total).
For each finger and each axis: y = a*x + b, where x is the target point from hand tracking and y is the virtual target for the digital-twin hand gripper.
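A minimal sketch of this per-finger affine mapping is shown below, assuming the tracked targets arrive as 3D points keyed by finger name; the parameter values are placeholders rather than calibrated values, and all names are illustrative.

```python
import numpy as np

FINGERS = ["thumb", "index", "middle", "ring", "pinky"]

# Placeholder calibration: a scale 'a' and an offset 'b' per axis for each finger,
# i.e. 5 fingers x 3 axes x (1 scale + 1 offset) = 30 parameters in total.
scale = {finger: np.ones(3) for finger in FINGERS}    # 'a' per axis (placeholders)
offset = {finger: np.zeros(3) for finger in FINGERS}  # 'b' per axis (placeholders)

def map_finger_targets(tracked_targets):
    """Apply y = a * x + b per finger and per axis.

    tracked_targets: dict of finger name -> (x, y, z) target point from hand tracking.
    Returns the virtual targets used to position the digital-twin hand gripper.
    """
    return {
        finger: scale[finger] * np.asarray(point, dtype=float) + offset[finger]
        for finger, point in tracked_targets.items()
    }
```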
If the robot's gripper is equipped with the Dexterous Hand by Inspire Robots, control of it must be taken using the speech recognition feature. See the speech recognition steps above for more information.