User Manual
v7.3

Customized Hardware Integrations

Extending the Realm of the Application to your Hardware
Applications evolve constantly, and the AMAS VR App is built to evolve with them. This chapter delves into integrating third-party robot arms and sensors into the AMAS ecosystem, expanding what is achievable in virtual reality robotics.

The Integration Process

Integrating third-party hardware with the AMAS VR App involves a deliberate process that marries hardware and software seamlessly. This section provides a roadmap for users embarking on this integration journey, ensuring a smooth transition from physical robotic systems to immersive virtual environments.

Understanding Integration Requirements

Before embarking on the integration journey, it's crucial to understand the prerequisites for successful integration. This includes evaluating the compatibility of the chosen robot arm, ensuring necessary communication protocols, and addressing any hardware adaptations that might be required.
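As a rough illustration of such a prerequisite check, the sketch below encodes the items above as a checklist function. The package names, the helper itself, and the "ROS Melodic only" rule are assumptions for illustration, not part of the AMAS API:

```python
# Hypothetical pre-integration checklist. Package names and the expected
# ROS distro are illustrative assumptions, not the actual AMAS requirements.

REQUIRED_ARM_PACKAGES = {"ros_driver", "moveit_config", "urdf_description"}

def check_arm_compatibility(available_packages, ros_distro):
    """Return a list of unmet prerequisites for a third-party robot arm."""
    missing = sorted(REQUIRED_ARM_PACKAGES - set(available_packages))
    problems = ["missing package: " + p for p in missing]
    if ros_distro != "melodic":
        problems.append("unsupported ROS distro: %s (expected melodic)" % ros_distro)
    return problems

# Example: a vendor stack that ships a driver and URDF but no MoveIt! config
print(check_arm_compatibility({"ros_driver", "urdf_description"}, "melodic"))
# -> ['missing package: moveit_config']
```

An empty return value would indicate that all listed prerequisites are met and configuration can begin.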

Configuring Robot Arm and Sensor Parameters

Once compatibility is confirmed, the next step is to configure the robot arm's or sensor's parameters. This ensures that the virtual representation of the sensor or robot arm mirrors its real-world counterpart, enabling precise motion replication and interaction. This step comprises, but is not limited to, the following:
Third Party Robot or Component Requirements
  • ROS Driver Package (ROS Melodic)
  • MoveIt! Package
  • URDF Package
  • Gripper Description (if any)
  • Custom Gripper URDF

Third Party Sensor Requirements
  • ROS Driver Package (ROS Noetic)
    • publishing at least
      • depth data
      • color data
      • intrinsics for the above (optics calibration)
      • extrinsics for the above (3D and color sensor geometric calibration)
  • 3D sensor accuracy-range relationship
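To make the sensor requirements above concrete, the sketch below models the minimal parameter set a sensor driver must provide: the depth and color topics, pinhole intrinsics, and a depth-to-color extrinsic transform. The class names, topic names, and validation rules are illustrative assumptions, not the AMAS configuration format:

```python
# Illustrative sketch of a sensor parameter set; all names here are
# hypothetical, not the actual AMAS configuration schema.
from dataclasses import dataclass
from typing import List

@dataclass
class Intrinsics:
    # Pinhole camera model: focal lengths and principal point, in pixels.
    fx: float
    fy: float
    cx: float
    cy: float

@dataclass
class SensorConfig:
    depth_topic: str
    color_topic: str
    intrinsics: Intrinsics
    # Depth-to-color extrinsic transform: 4x4 matrix, row-major (16 values).
    extrinsics: List[float]

def validate(cfg: SensorConfig) -> List[str]:
    """Return a list of configuration problems (empty means valid)."""
    problems = []
    if len(cfg.extrinsics) != 16:
        problems.append("extrinsics must be a 4x4 row-major matrix (16 values)")
    if cfg.intrinsics.fx <= 0 or cfg.intrinsics.fy <= 0:
        problems.append("focal lengths must be positive")
    return problems

# Example: depth and color sensors co-located (identity extrinsics)
IDENTITY_4X4 = [1.0, 0.0, 0.0, 0.0,
                0.0, 1.0, 0.0, 0.0,
                0.0, 0.0, 1.0, 0.0,
                0.0, 0.0, 0.0, 1.0]
cfg = SensorConfig("/camera/depth/image_raw", "/camera/color/image_raw",
                   Intrinsics(fx=615.0, fy=615.0, cx=320.0, cy=240.0),
                   IDENTITY_4X4)
print(validate(cfg))  # -> []
```

A driver that publishes these values (for example via ROS `camera_info` messages) gives the virtual sensor everything it needs to mirror the physical one.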

Leveraging Enhanced Interaction Possibilities

The integration of third-party robot arms opens new avenues for interaction. Whether it's manipulating intricate objects in a virtual lab or simulating complex assembly processes, the synchronized motion between the physical robot arm and its virtual counterpart empowers users with unparalleled control and accuracy.
The integration of third-party 3D sensors makes it possible to use domain-specific devices (nuclear, underwater, extreme short range, extreme long range, vibration resistant, ...), devices already validated for your application, or simply the latest state-of-the-art sensors.