Visual Haptic

How 6-DoF force and torque sensor data is represented visually to support intuitive user perception in AMAS VR

Background

An important capability required to remotely operate a robot (robotic teleoperation) is transmitting the haptic information (physical force, torque, pressure, and vibration) measured on the robot to the remote operator. To let the operator perceive this haptic information as intuitively as possible, user interfaces for robotic teleoperation generally rely on complex, mechanically actuated hardware that aims to recreate physical haptic sensation at the user's hand and/or arm.

However, such solutions have the following limitations:

- Costly operator user interface hardware that does not scale

- Physical haptic feedback is generally sensitive to latency

- Physical haptic sensation gives imprecise judgement of magnitude

- Non-obvious force and torque scaling

By contrast, graphical representation of force and torque is well established in the scientific community. Vector-based representations are routinely used in publications to communicate the terms of algorithms and mathematics. Vector-based force-torque representation has the advantages of:

- No mechanical user interface hardware required

- Visual cues that are not sensitive to latency

- Precise representation of force

- Clear scaling of the measurement

However, such representation requires in-depth understanding of its format and conventions, and generally demands careful thought to interpret fully, leading to the following limitations:

- Requires training and education for the operator

- Slow to interpret, so unsuitable for real-time, immediate feedback

- Increased cognitive workload in understanding complex vectors

With the aid of computer graphics and extended reality (VR/AR/MR) visualisation technologies, we present a novel, intuitive, uniform method to represent force and torque using graphic rendering alone, while avoiding the fatigue of interpreting mathematical vectors.

Description of the rendering scheme

This method utilises the displacement between a reference model and an indicative live model. Specifically, the indicative live model overlaps, or coincides with, the reference model when no external force or torque is applied to the robotic system. When an external force and/or torque is applied at the end effector, the indicative live model translates and/or rotates (displaces) out of the reference model, depending on the value of the external force and torque applied along each coordinate axis.

In AMAS, the force and torque are rendered as shown below:

Rendering principle

Math Formulation

Assign the scaled external force to the translation of the indicative live model. The translational vector $\mathbf{d}$ can be computed as

$$\mathbf{d} = b\,\mathbf{f}$$

where $\mathbf{d}$ is the translational distance of the indicative live model in each axis, $\mathbf{f}$ is the external force computed from sensor measurements, and $b$ is a scaling factor, which can be tuned depending on user preference and the nominal force encountered in general operations.

Assign the scaled external torque about each axis to the rotation of the indicative live model about that axis:

$$(\theta_x, \theta_y, \theta_z) = b\,\mathbf{t}$$

where $\theta_x$, $\theta_y$, $\theta_z$ are the rotation angles around the x, y, z axes respectively, and $\mathbf{t}$ represents the torque values about the corresponding axes. $b$ is a scaling factor, which can be tuned depending on user preference and the nominal torque encountered in general operations.
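To make the mapping concrete, the sketch below shows the per-axis linear scaling in code. This is a minimal illustration, not AMAS source code: the class names, the `wrench_to_pose_offset` function, and the default scaling factors `b_force` and `b_torque` are assumptions chosen for the example.

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class Wrench:
    force: Tuple[float, float, float]   # external force (N) per axis, gravity-compensated
    torque: Tuple[float, float, float]  # external torque (N*m) per axis

@dataclass
class PoseOffset:
    translation: Tuple[float, float, float]  # displacement d of the live model (m)
    rotation: Tuple[float, float, float]     # rotation angles theta about x, y, z (rad)

def wrench_to_pose_offset(w: Wrench,
                          b_force: float = 0.001,
                          b_torque: float = 0.05) -> PoseOffset:
    """Map a measured wrench to the displacement of the indicative live model.

    d_i = b_force * f_i and theta_i = b_torque * t_i, so the live model
    coincides with the reference model when no external wrench is applied.
    The two scaling factors are illustrative defaults, tuned per preference.
    """
    translation = tuple(b_force * f for f in w.force)
    rotation = tuple(b_torque * t for t in w.torque)
    return PoseOffset(translation, rotation)

# Example: 20 N along x and 2 N*m about z visibly displace the live model.
offset = wrench_to_pose_offset(Wrench(force=(20.0, 0.0, 0.0), torque=(0.0, 0.0, 2.0)))
print(offset.translation, offset.rotation)
```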

Gravity Compensation

The measurements of external force and torque at the end effector are obtained by compensating for the gravity of the body parts of the robotic system. Depending on the location of the sensor, this compensation is based on the inverse dynamics and/or kinematics of the robotic system.
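As an illustration only, the sketch below shows one common way to subtract the payload's weight from a raw sensor reading when the sensor orientation is known from the robot's kinematics. The function name, parameters, and sign convention are assumptions for this example, not the AMAS implementation.

```python
import numpy as np

G = 9.81  # gravitational acceleration (m/s^2)

def compensate(raw_force: np.ndarray,
               raw_torque: np.ndarray,
               R_world_to_sensor: np.ndarray,
               mass: float,
               com_in_sensor: np.ndarray) -> tuple:
    """Remove the end effector's weight from a force-torque reading.

    raw_force / raw_torque: sensor measurement, expressed in the sensor frame
    R_world_to_sensor: 3x3 rotation taking world-frame vectors into the sensor frame
    mass: mass of everything mounted after the sensor (kg)
    com_in_sensor: centre of mass of that payload, in the sensor frame (m)

    Sign conventions vary between sensors; verify against your mounting.
    """
    # Gravity vector expressed in the sensor frame for the current orientation.
    gravity_force = R_world_to_sensor @ np.array([0.0, 0.0, -mass * G])
    # The weight also produces a torque about the sensor origin: r x F.
    gravity_torque = np.cross(com_in_sensor, gravity_force)
    external_force = raw_force - gravity_force
    external_torque = raw_torque - gravity_torque
    return external_force, external_torque
```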



In the images below, the end effector is attached to the force and torque sensor, located at the end of the arm. After the Force Torque Calibration process, the gravity of the end effector is removed from the sensor measurement, regardless of orientation.

Force Torque Calibration (video): https://youtu.be/fjuvYy8s4Qw

After calibration, no torque or force is observed when rotated 0 degrees

After calibration, no torque or force is observed when rotated 90 degrees