Hand Tracking
An overview of how hand tracking works and how to enable it.
Supported Headsets
Hand tracking is currently available on Meta Quest headsets that support OpenXR. We recommend the Meta Quest 3 for optimum performance and accuracy.
How to activate
If hand tracking is not enabled on the headset, visit the Meta Quest documentation and enable the hand tracking feature.
Following this, you will need to create a developer account and enable developer mode to use hand tracking with the Quest headset connected to the PC.
Creating an Organization
Go to the Create New Organization page of the Meta Quest Developer Dashboard.
Fill in the appropriate information.
Verify Your Account
Go to the verification tab of the Meta Quest Developer Dashboard.
If prompted, log in with your Meta Developer account.
Verify your account with one of the two following options:
Confirm your mobile number to set up SMS two-factor verification. Visit the FB Help Center to make sure your country and phone carrier are supported for SMS.
Add a payment method to your account (PayPal is not accepted).
Enable Developer Mode
Once you belong to a developer organization, follow these steps to put your device in developer mode:
Download and open the Meta Horizon mobile app, and navigate to Menu > Devices.
Select your device. Then, select Headset Settings > Developer Mode and switch on Developer Mode.
Meta Quest Link Configuration
Open the Meta Quest Link application, select Settings, then click on the Beta tab and enable Developer runtime features.
Once enabled, hand tracking will automatically activate when users put down their controllers. Grabbing the controllers again will automatically deactivate hand tracking.
Gesture recognition
When View Adjust Mode is selected in the wrist panel, a user can scale, rotate, and move through a scene using hand gestures.
Moving
To move around the scene, select View Adjust Mode and use the grasp gesture to move.
Scaling and Rotation
Using the grasp gesture with both hands enables scaling and rotating the scene. Grasp with both hands and move them apart to scale the scene up, or bring the hands closer together to scale it down.
If the hands are moved in a rotating motion (for example, one hand moves forward while the other stays stationary or moves backwards), the scene rotates accordingly.
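The two-hand gesture above can be sketched as follows. This is a minimal illustration, not the application's actual implementation: it assumes hand positions are available as (x, y, z) world-space tuples, derives the scale factor from the change in hand-to-hand distance, and reads the rotation as the change in yaw of the left-to-right hand vector on the horizontal plane.

```python
import math

def two_hand_scale_and_rotation(prev_left, prev_right, curr_left, curr_right):
    """Derive a scale factor and a yaw rotation (radians) from two-hand
    grasp positions, each given as an (x, y, z) tuple in world space."""
    def yaw(a, b):
        # Angle of the left-to-right hand vector on the horizontal (x, z) plane.
        return math.atan2(b[2] - a[2], b[0] - a[0])

    # Hands moving apart scales the scene up; moving together scales it down.
    scale = math.dist(curr_left, curr_right) / math.dist(prev_left, prev_right)

    # One hand moving forward while the other stays (or moves back) rotates
    # the hand-to-hand vector, which we read as a scene rotation.
    rotation = yaw(curr_left, curr_right) - yaw(prev_left, prev_right)
    return scale, rotation
```

For example, hands starting 1 m apart and ending 2 m apart with no change of direction yield a scale factor of 2.0 and zero rotation.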
Robot Control
To control the robot, select Robot Control Mode and use the grasp gesture to take hold of and manipulate the robot. Extending and clasping the index finger opens and closes the gripper.
If the robot's gripper is equipped with the "Dexterous Hand" by Inspire Robots, then it must be controlled using the Speech Recognition feature. See the Dexterous Hand setup page for more information.
UI
While using hand tracking, users can interact with the UI using the tip of their index fingers.
Technical Information
This information is specific to the Dexterous Hand when using Hand Tracking and explains how we map the operator's hand to the robotic hand.
For better accuracy and general-purpose support, hand tracking is mapped to the Dexterous Hand by a per-finger target defined by 6 parameters: 3 scaling and 3 offset parameters, one pair per axis. Across the whole hand this totals 30 parameters, used to position the targets on the Dexterous Hand.
An offset (b) and a scaling factor (a) are applied for each finger and each axis (6 parameters per finger x 5 fingers = 30 parameters in total).
For each finger, on each axis: y = a*x + b, where x is the target point from hand tracking and y is the virtual target for the digital-twin hand gripper.
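The per-axis affine mapping above can be sketched as follows. This is an illustrative outline, not the shipped implementation; the parameter values are placeholders, since the real scaling and offset values come from calibration.

```python
# Sketch of the per-finger affine mapping: y_i = a_i * x_i + b_i per axis.
FINGERS = ["thumb", "index", "middle", "ring", "little"]

# 3 scaling (a) and 3 offset (b) parameters per finger: 5 x 6 = 30 in total.
# Identity values used here as placeholders for calibrated parameters.
params = {
    finger: {"a": (1.0, 1.0, 1.0), "b": (0.0, 0.0, 0.0)}
    for finger in FINGERS
}

def map_fingertip(finger, tracked_point):
    """Map a hand-tracking target point x = (x, y, z) to the virtual
    target y for the digital-twin hand gripper."""
    a = params[finger]["a"]
    b = params[finger]["b"]
    return tuple(a[i] * tracked_point[i] + b[i] for i in range(3))
```

With identity parameters the tracked point passes through unchanged; calibrated scaling compensates for the size difference between the operator's hand and the robotic hand, while the offset aligns their origins.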