User Guide
v8.3
Hand Tracking

Instructions for enabling and using hand tracking.


Supported Headsets

Hand tracking is currently available on Meta Quest headsets that support OpenXR. We recommend the Meta Quest 3 for optimum performance and accuracy.

How to activate

If hand tracking is not enabled on the headset, visit the Meta Quest documentation and enable the hand tracking feature.

In addition, you will need to create a developer account and enable developer mode to use hand tracking with the Quest headset connected to the PC.

Creating an Organization

  1. Go to the Create New Organization page of the Meta Quest Developer Dashboard.

  2. Fill in the appropriate information.

Verify Your Account

  1. Go to the verification tab of the Meta Quest Developer Dashboard. If prompted, log in with your Meta Developer account.

  2. Verify your account with one of the two following options:

    • Add a payment method to your account (PayPal excluded).

    • Confirm your mobile number to set up SMS two-factor verification. Visit the FB Help Center to ensure that your country and phone carrier are supported for SMS.

Enable Developer Mode

Once you belong to a developer organization, follow these steps to put your device in developer mode:

  1. Download and open the Meta Horizon mobile app, and navigate to Menu > Devices.

  2. Select your device. Then, select Headset Settings > Developer Mode and switch on Developer Mode.

Meta Quest Link Configuration

Open the Meta Quest Link application, select Settings, then click on the Beta tab and enable Developer runtime features.

Once enabled, hand tracking activates automatically when users put down their controllers; grabbing the controllers again automatically deactivates it.

Gesture recognition

When View Adjust Mode is selected in the wrist panel, a user can scale, rotate, and move through a scene using hand gestures.

Moving

To move around the scene, select View Adjust Mode and use the grasp gesture.

Scaling and Rotation

Using the grasp gesture with both hands enables scaling and rotating the scene. Grasp with both hands and move them apart to scale the scene up, or bring the hands closer together to scale it down.

If the hands are moved in rotation (for example, one hand moves forward while the other stays stationary or moves backwards), the scene rotates accordingly.

Robot Control

To control the robot, select Robot Control Mode and use the grasp gesture to take hold of and manipulate the robot. Extending and clasping the index finger opens and closes the gripper.

If the robot's gripper is the "Dextrous Hand" by Inspire Robots, it must instead be controlled using the Speech Recognition feature. See the Dextrous Hand setup page for more information.

UI

While using hand tracking, users can interact with the UI using the tips of their index fingers.

Technical Information

This information is specific to the Dexterous Hand when using Hand Tracking and explains how we map the operator's hand to the robotic hand.

For better accuracy and general-purpose support, hand tracking and the Dexterous Hand are mapped by a target for each finger consisting of 6 data points (3 scaling and 3 offset parameters), totalling 30 parameters for the whole hand to position targets on the Dexterous Hand.

  • Offset (b) and scaling (a) for each finger per axis (6 parameters per finger × 5 fingers = 30 parameters in total).

  • For each finger, each axis: y = a*x + b, where x is the target point from hand tracking and y is the virtual target for the digital-twin hand gripper.
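The per-finger affine mapping above can be sketched as follows. This is a minimal illustration, not the actual AMAS implementation; the finger names and the identity-valued calibration parameters are placeholders (real values come from calibration):

```python
# Sketch of the per-finger affine mapping: for each finger and each axis,
# y = a*x + b, where x is the hand-tracked target point and y is the virtual
# target for the digital-twin hand. 3 scaling + 3 offset parameters per
# finger, times 5 fingers, gives the 30 parameters in total.

FINGERS = ["thumb", "index", "middle", "ring", "pinky"]  # placeholder names

# Placeholder calibration values (identity map); real values are calibrated.
SCALE = {f: (1.0, 1.0, 1.0) for f in FINGERS}    # a, one per axis
OFFSET = {f: (0.0, 0.0, 0.0) for f in FINGERS}   # b, one per axis

def map_finger_target(finger, x):
    """Map one tracked fingertip position x = (x1, x2, x3) to its target."""
    a, b = SCALE[finger], OFFSET[finger]
    return tuple(ai * xi + bi for ai, xi, bi in zip(a, x, b))

def map_hand(tracked):
    """Apply the affine map to all five fingers of a tracked hand."""
    return {f: map_finger_target(f, tracked[f]) for f in FINGERS}
```

With identity scaling and zero offsets the targets pass through unchanged; non-trivial values rescale and shift each axis independently per finger.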