


Limitations

Typical limitations for all sensors



Shades

A 3D sensor captures the scene from its own perspective, and objects in the scene occlude whatever lies behind them. When the scene is viewed from a perspective different from the 3D sensor's, some areas may therefore appear shaded (they hold no data, because the sensor never saw them).

More Shade Examples

Viewed from above, a scene captured by a structured-light scanner: both the projector and the 3D sensor cast their own shades, because the 3D sensor must see the projected pattern to generate data. The shade of every sub-sensor contributes to the shade of the whole system. When fusing with colour data we might even get a third shade, but the colour sub-sensor is typically mounted close to one of the other sub-sensors and casts a similar shade.
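To make the shade effect concrete, here is a minimal NumPy sketch (illustrative only; the function name, the viewer intrinsics `K_view` and the extrinsic `T_sensor_to_view` are hypothetical, not part of AMAS). It rasterises the sensor's point cloud into a virtual viewer camera; viewer pixels that receive no samples are shades, i.e. regions the 3D sensor never measured:

```python
import numpy as np

def shade_mask(points_sensor, K_view, T_sensor_to_view, width, height):
    """Rasterise a 3D sensor's point cloud into a virtual viewer camera.

    Returns a (height, width) boolean mask that is True wherever the
    viewer pixel received no samples -- a 'shade': a region the 3D
    sensor never captured, so it holds no data from this viewpoint.
    """
    n = len(points_sensor)
    # Transform sensor-frame points into the viewer camera frame.
    p = (T_sensor_to_view @ np.c_[points_sensor, np.ones(n)].T).T[:, :3]
    front = p[:, 2] > 0  # keep only points in front of the viewer

    # Pinhole projection into the viewer image plane.
    uvw = (K_view @ p[front].T).T
    u = np.round(uvw[:, 0] / uvw[:, 2]).astype(int)
    v = np.round(uvw[:, 1] / uvw[:, 2]).astype(int)
    inside = (u >= 0) & (u < width) & (v >= 0) & (v < height)

    covered = np.zeros((height, width), dtype=bool)
    covered[v[inside], u[inside]] = True
    return ~covered  # True = shade (no data lands on this pixel)
```

A real renderer would splat each point over several pixels, but single-pixel rasterisation is enough to show where data are simply absent from the new viewpoint.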

Transparency

Most 3D sensors have trouble sensing transparent objects (e.g. glass or plastic bottles).

Occlusions

For 3D devices with multiple sub-sensors, what each sub-sensor sees of the scene may differ. Depending on the 3D sensing type and on the device's internal handling, this can lead to missing texture, missing 3D data, or even wrong texture/3D data. Detecting such problems, whether in the sensor itself or in downstream software, is not always possible.

More Occlusion Examples

Occlusions between the colour and 3D sub-sensors result in wrong texture when the data are fused.

If detected, occluded points may be removed (shown in pink). Note that this removal further contributes to missing data, in much the same way as shades do.
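As an illustration of this detection-and-removal step, here is a hedged NumPy sketch (the function and parameter names are hypothetical, not part of the Cortex API). Each depth point is projected into the colour sub-sensor's image and compared against a z-buffer of the nearest surfaces; points hidden behind a nearer surface are marked pink instead of receiving a wrong texture:

```python
import numpy as np

def texture_points(points_depth, K_colour, T_depth_to_colour,
                   colour_img, z_eps=0.01):
    """Texture depth-sensor points with a colour image, marking points
    the colour camera cannot actually see instead of mis-texturing them.

    points_depth      -- (N, 3) points in the depth sub-sensor frame
    K_colour          -- 3x3 pinhole intrinsics of the colour sub-sensor
    T_depth_to_colour -- 4x4 extrinsics, depth frame -> colour frame
    colour_img        -- (H, W, 3) uint8 colour image
    z_eps             -- depth tolerance (metres) for the z-buffer test
    """
    h, w = colour_img.shape[:2]
    n = len(points_depth)

    # Transform points into the colour camera frame.
    p = (T_depth_to_colour @ np.c_[points_depth, np.ones(n)].T).T[:, :3]
    z = p[:, 2]

    # Project points with positive depth into the colour image plane.
    u = np.full(n, -1)
    v = np.full(n, -1)
    front = z > 0
    uvw = (K_colour @ p[front].T).T
    u[front] = np.round(uvw[:, 0] / uvw[:, 2]).astype(int)
    v[front] = np.round(uvw[:, 1] / uvw[:, 2]).astype(int)
    in_view = front & (u >= 0) & (u < w) & (v >= 0) & (v < h)

    # Z-buffer: the nearest depth landing on each colour pixel.
    zbuf = np.full((h, w), np.inf)
    np.minimum.at(zbuf, (v[in_view], u[in_view]), z[in_view])

    # A point is occluded if a nearer surface claims its pixel.
    visible = np.zeros(n, dtype=bool)
    visible[in_view] = z[in_view] <= zbuf[v[in_view], u[in_view]] + z_eps

    colours = np.full((n, 3), (255, 0, 255), dtype=np.uint8)  # pink = occluded / out of view
    colours[visible] = colour_img[v[visible], u[visible]]
    return colours, visible
```

The `z_eps` tolerance trades off false occlusions (too small) against missed occlusions (too large); a value on the order of the sensor's depth noise is a reasonable starting point.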

[Figure: scene concept from the Intel whitepaper "Projection, Texture-Mapping and Occlusion with Intel® RealSense™ Depth Cameras"]
[Figure: Kinect4A point cloud of a transparent blue plastic bottle, distorted and with missing data]
[Figure: occlusion problem visualisation from the same Intel whitepaper]