User Guide
v8.3



Limitations

Typical limitations that apply to all 3D sensors


Last updated 10 months ago


Shades

A 3D sensor captures the scene from its own perspective, and objects in the scene occlude everything behind them. When the scene is viewed from a perspective different from the sensor's, some areas may therefore appear shaded (they have missing data because they were not visible to the sensor).

More Shades Examples

View from above of a scene captured by a structured light scanner. Both the projector and the 3D sensor cast their own shades, because the 3D sensor needs to capture the projection to generate data. The shade of every sub-sensor contributes to the shading of the whole system. When fusing with colour data we might even get a third shade, but the colour sub-sensor is typically mounted close to one of the other sub-sensors and casts a similar shade.
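The geometry of a shade can be illustrated with a toy 2D pinhole model (all numbers and names here are hypothetical, not AMAS parameters): an occluder between the sensor and a back wall hides a strip of the wall, and the hidden strip grows with the wall's distance.

```python
# Toy 2D model of a shade: a pinhole sensor at the origin, an occluder
# segment at depth `occluder_depth`, and a back wall at `wall_depth`.
# By similar triangles, the shaded strip on the wall is the occluder
# scaled by wall_depth / occluder_depth.
def shade_on_wall(occluder_x0, occluder_x1, occluder_depth, wall_depth):
    s = wall_depth / occluder_depth
    return occluder_x0 * s, occluder_x1 * s

# A 0.20 m wide occluder at 1 m hides a 0.40 m strip of a wall at 2 m:
x0, x1 = shade_on_wall(0.10, 0.30, 1.0, 2.0)
print(x0, x1)  # 0.2 0.6
```

This is why shades can look surprisingly large when the occluder sits close to the sensor: the shaded area scales linearly with the depth ratio.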

Transparency

Most 3D sensors have trouble sensing transparent objects such as glass or plastic bottles.

Occlusions

For 3D devices with multiple sub-sensors, what each sub-sensor sees of the scene may differ. Depending on the 3D sensing type and the device's internal handling, this may lead to missing texture, missing 3D data, or even wrong texture/3D data. Detecting such problems in the sensor or in other software is not always possible.

More Occlusion Examples

Occlusions between the colour and 3D sub-sensors result in wrong texture when fusing data.

If detected, occluded points may be removed (shown in pink). Note that this further contributes to missing data, in a similar way to shades.
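A common way to detect such occlusions is a z-buffer test: project each 3D point into the colour image, keep the nearest depth per pixel, and flag points that land behind a nearer point as occluded rather than assigning them the wrong texture. The sketch below assumes illustrative pinhole intrinsics (`fx`, `cx`, etc. are made-up values, not parameters of any specific sensor).

```python
import numpy as np

# Two 3D points (in colour-camera coordinates). The far point sits
# behind the near one along the colour camera's line of sight, so it
# must not take that pixel's colour.
fx = fy = 500.0          # focal length in pixels (illustrative)
cx = cy = 320.0          # principal point (illustrative)
W = H = 640

points = np.array([
    [0.10, 0.00, 1.0],   # near point (e.g. a box edge)
    [0.20, 0.00, 2.0],   # far point hidden behind it in the colour view
])

# Project into the colour image with the pinhole model.
u = np.round(fx * points[:, 0] / points[:, 2] + cx).astype(int)
v = np.round(fy * points[:, 1] / points[:, 2] + cy).astype(int)

# Z-buffer: keep the nearest depth per pixel.
zbuf = np.full((H, W), np.inf)
for ui, vi, z in zip(u, v, points[:, 2]):
    zbuf[vi, ui] = min(zbuf[vi, ui], z)

# A point is occluded from the colour camera (and would get wrong
# texture) if something nearer already claims its pixel.
occluded = [bool(zbuf[vi, ui] < z - 1e-6)
            for ui, vi, z in zip(u, v, points[:, 2])]
print(occluded)  # [False, True]
```

Dropping the flagged points avoids wrong texture at the cost of extra missing data, which is the trade-off described above.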

Scene concept from the "Projection, Texture-Mapping and Occlusion with Intel® RealSense™ Depth Cameras" Intel whitepaper
Kinect4A point cloud with a transparent blue plastic bottle: distorted and missing data
Occlusion problem visualization from the "Projection, Texture-Mapping and Occlusion with Intel® RealSense™ Depth Cameras" Intel whitepaper