Customized Hardware Integrations

Extending the Application to Your Hardware

The AMAS VR App is built to evolve with your hardware. This chapter covers integrating third-party robot arms and sensors into the AMAS ecosystem, expanding what is achievable in virtual reality robotics.

The Integration Process

Integrating third-party hardware with the AMAS VR App follows a structured process that brings hardware and software together. This section provides a roadmap for that integration, taking you from the physical robotic system to its immersive virtual counterpart.

Understanding Integration Requirements

Before starting an integration, it is important to understand the prerequisites. These include evaluating the compatibility of the chosen robot arm, ensuring the necessary communication protocols are available, and addressing any hardware adaptations that might be required.
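
As a quick sanity check on the communication side, the snippet below is a minimal sketch, assuming the hardware exposes a ROS 1 driver, that lists the topics currently advertised on the ROS master and flags any expected topics that are missing. The topic names used here are placeholders for illustration; substitute the names actually published by your driver.

```python
#!/usr/bin/env python
# Minimal sketch: check that the topics expected from a third-party
# driver are being advertised on the ROS master.
# The topic names below are placeholders, not required names.
import rospy

EXPECTED_TOPICS = [
    "/camera/color/image_raw",        # color data (assumed name)
    "/camera/depth/image_rect_raw",   # depth data (assumed name)
    "/camera/color/camera_info",      # intrinsics (assumed name)
]

def main():
    rospy.init_node("integration_precheck", anonymous=True)
    advertised = {name for name, _type in rospy.get_published_topics()}
    for topic in EXPECTED_TOPICS:
        status = "OK" if topic in advertised else "MISSING"
        print("[{}] {}".format(status, topic))

if __name__ == "__main__":
    main()
```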

Configuring Robot Arm and Sensor Parameters

Once compatibility is confirmed, the next step is configuring the robot arm's or sensor's parameters. This ensures that the virtual representation of the robot arm or sensor mirrors its real-world counterpart, enabling precise motion replication and interaction. The step includes, but is not limited to, the following (a minimal sketch of the expected sensor topics follows the list):

  • Robot arm:

    • ROS driver package (ROS Melodic)

    • MoveIt! package

    • URDF package

    • Gripper description (if any)

    • Custom gripper URDF

  • 3D sensor:

    • ROS driver package (ROS Noetic), publishing at least

      • depth data

      • color data

      • intrinsics for the above (optics calibration)

      • extrinsics for the above (3D and color sensor geometric calibration)

    • 3D sensor accuracy-range relationship
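
To make the sensor-side requirements concrete, the following is a minimal sketch, not an official template, of a ROS Noetic node advertising the kinds of topics listed above: depth, color, per-stream intrinsics via CameraInfo, and the depth-to-color extrinsics as a static transform. All topic and frame names are illustrative placeholders; off-the-shelf drivers such as those for RealSense or ZED cameras typically provide equivalent topics out of the box.

```python
#!/usr/bin/env python3
# Minimal sketch of the topics a 3D-sensor driver is expected to provide:
# depth, color, intrinsics (CameraInfo) and extrinsics (a static transform
# between the two optical frames). Topic and frame names are placeholders;
# a real driver also fills in encoding, resolution, pixel data and the
# calibrated camera matrices.
import rospy
import tf2_ros
from sensor_msgs.msg import Image, CameraInfo
from geometry_msgs.msg import TransformStamped
from std_msgs.msg import Header

def main():
    rospy.init_node("example_sensor_driver")

    pubs = {
        "depth": rospy.Publisher("depth/image_rect_raw", Image, queue_size=1),
        "color": rospy.Publisher("color/image_raw", Image, queue_size=1),
        "depth_info": rospy.Publisher("depth/camera_info", CameraInfo, queue_size=1),
        "color_info": rospy.Publisher("color/camera_info", CameraInfo, queue_size=1),
    }

    # Depth-to-color extrinsics as a latched static transform
    # (identity rotation here, purely for illustration).
    broadcaster = tf2_ros.StaticTransformBroadcaster()
    extrinsics = TransformStamped()
    extrinsics.header.stamp = rospy.Time.now()
    extrinsics.header.frame_id = "depth_optical_frame"
    extrinsics.child_frame_id = "color_optical_frame"
    extrinsics.transform.rotation.w = 1.0
    broadcaster.sendTransform(extrinsics)

    rate = rospy.Rate(30)  # a typical color/depth frame rate
    while not rospy.is_shutdown():
        stamp = rospy.Time.now()
        for stream, frame in (("depth", "depth_optical_frame"),
                              ("color", "color_optical_frame")):
            header = Header(stamp=stamp, frame_id=frame)
            pubs[stream].publish(Image(header=header))
            pubs[stream + "_info"].publish(CameraInfo(header=header))
        rate.sleep()

if __name__ == "__main__":
    main()
```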

Leveraging Enhanced Interaction Possibilities

The integration of third-party robot arms opens new avenues for interaction. Whether you are manipulating intricate objects in a virtual lab or simulating complex assembly processes, the synchronized motion between the physical robot arm and its virtual counterpart gives you precise control and accuracy.

Integrating third-party 3D sensors opens possibilities for using domain-specific devices (nuclear, underwater, extremely short range, extremely long range, vibration resistant, etc.), devices already validated for your application, or simply the latest state-of-the-art sensors.
