SenseKit to Robot Calibration
A guide to calibrating the SenseKit with respect to the robot.
SenseKit to Robot Calibration aligns the SenseKit camera's coordinate frame with that of the robotic arm, ensuring seamless coordination between the two within the AMAS VR App. This precision is fundamental to an immersive and accurate virtual reality robotics experience. The sections below walk through the process.
The Calibration Process: A Closer Look
Prerequisites
Make sure that the SenseKit and RoboKit are properly set up and that you have the Charuco board handy, as per the previous instructions.
Grab or place calibration board
If your camera is mounted in a stationary way: go to the Interaction panel, switch into the Robot Control Mode, turn on the Grip Mode, and grab the calibration board with the robotic arm.
If your camera is mounted on the robot gripper/end-effector and moves with the robot arm: place the calibration board close to the camera in such a way that it is visible to the camera (for example, on the table with the robotic arm).
Open the calibration panel
1. Go to the Configuration panel and click on SenseKit.
2. Select the SenseKit you want to calibrate in the list of Active SenseKits.
3. In the SenseKit configuration page, click on the Calibrate button located at the bottom of the right-hand side panel.
4. Select SenseKit to Robot calibration.
5. Select the RoboKit for the arm you want the camera to be calibrated to.
Select Mount Type
Select the option that matches how your camera is mounted with respect to the robot (a background sketch of the underlying solve follows this list):
Base: hand-eye calibration, with the camera mounted in a stationary way with respect to the robot.
End Effector: eye-in-hand calibration, with the camera mounted on the gripper/end-effector and moving with the robot.
End Effector with Pose: like the above, but the camera data is sent with synchronized pose data; currently only available in custom Extend Robotics setups.
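For background, the Base and End Effector options correspond to the two standard hand-eye calibration configurations, often called eye-to-hand and eye-in-hand in the literature. AMAS performs the solve internally; the sketch below only illustrates how the same AX = XB problem can be set up with OpenCV's cv2.calibrateHandEye. The OpenCV function and its data layout are real; the wrapper, variable names, and pose sources are illustrative assumptions.

```python
import cv2
import numpy as np

# Background sketch, not the AMAS implementation. Inputs are per-sample
# rotation/translation lists; the names are illustrative assumptions.
#   R_g2b, t_g2b: gripper pose in the robot base frame (from the robot)
#   R_t2c, t_t2c: board pose in the camera frame (from board detection)
def solve_handeye(R_g2b, t_g2b, R_t2c, t_t2c, eye_in_hand=True):
    """Solve the AX = XB hand-eye problem with OpenCV (>= 4.1)."""
    if not eye_in_hand:
        # "Base" mount (stationary camera, eye-to-hand): OpenCV's solver
        # expects the inverted robot poses; the result is then the
        # camera-to-base transform instead of camera-to-gripper.
        inverted = [(R.T, -R.T @ t) for R, t in zip(R_g2b, t_g2b)]
        R_g2b = [R for R, _ in inverted]
        t_g2b = [t for _, t in inverted]
    R_x, t_x = cv2.calibrateHandEye(R_g2b, t_g2b, R_t2c, t_t2c,
                                    method=cv2.CALIB_HAND_EYE_TSAI)
    return R_x, t_x  # 3x3 rotation and 3x1 translation of the estimate
```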
Performing the Calibration
You should see the SenseKit Calibration panel with:
information such as: Calibration Type, Sample Size, Position Error and Rotation Error
buttons: Take Sample, Reset, Save and Cancel
The Position Error and Rotation Error will be shown as zero until the 4th sample is taken, because before that the software does not have enough data to calculate the error.
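The exact error metric AMAS displays is not documented here, but a plausible reading is the residual of the AX = XB constraint: each new sample adds a robot/camera motion pair, and with fewer than about three motions (four samples) the problem is too under-constrained for a meaningful residual. A minimal sketch, assuming 4x4 homogeneous matrices and hypothetical names:

```python
import numpy as np

def handeye_residuals(X, motions):
    """Mean residuals of the AX = XB constraint (illustrative metric only).

    X       -- 4x4 homogeneous estimate of the hand-eye transform
    motions -- list of (A, B) 4x4 pairs: A is a relative gripper motion,
               B the matching relative board motion seen by the camera
    """
    pos, rot = [], []
    for A, B in motions:
        D = np.linalg.inv(A @ X) @ (X @ B)  # identity when X is exact
        pos.append(np.linalg.norm(D[:3, 3]))  # position residual
        cos_t = np.clip((np.trace(D[:3, :3]) - 1.0) / 2.0, -1.0, 1.0)
        rot.append(np.degrees(np.arccos(cos_t)))  # rotation residual (deg)
    return float(np.mean(pos)), float(np.mean(rot))
```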
Make sure Robot Control Mode is enabled in the Interaction panel so that you can move the arm.
1. Move the end effector's position and rotate its joints to change the angle at which the camera observes the calibration board, then take your first sample by clicking on Take Sample.
2. Repeat Step 1 three more times to reach a sub-optimal point where the SenseKit partially aligns with the robot.
3. Press Save if you are happy with this first result; this takes you back to the Configuration Page of the SenseKit.
4. Finally, press Save Local or Save Local and Device to save the calibration done so far.
Usually the first four samples achieve only a partial calibration with high positional and rotational errors. We recommend repeating Step 2 another 5 to 20 times to fine-tune the calibration and decrease the error.
If the calibration goes wrong at any sample, you can press Reset to revert to the last save and try again from Step 1.
For best results, make sure the calibration board is completely visible to the camera during the whole process. While the Charuco calibration board supports partial views and occlusions, these limit the number of markers that can be detected and potentially decrease the accuracy of the calibration.
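If you want to sanity-check board visibility offline, the fraction of detected Charuco corners in a camera frame is a reasonable proxy. The sketch below uses the pre-4.7 cv2.aruco API (newer OpenCV versions moved these calls onto detector and board classes); the dictionary and board geometry are example values that must match your printed board.

```python
import cv2

# Example geometry only -- match the dictionary, square counts and sizes
# (in metres) to your printed Charuco board.
DICT = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_5X5_100)
BOARD = cv2.aruco.CharucoBoard_create(7, 5, 0.04, 0.03, DICT)

def board_coverage(gray):
    """Fraction of the board's interior corners detected in one frame."""
    corners, ids, _ = cv2.aruco.detectMarkers(gray, DICT)
    if ids is None or len(ids) == 0:
        return 0.0
    n, _, _ = cv2.aruco.interpolateCornersCharuco(corners, ids, gray, BOARD)
    return (n or 0) / ((7 - 1) * (5 - 1))  # 24 interior corners on a 7x5 board
```

Coverage that stays well below 1.0 across frames suggests repositioning the board or camera before taking samples.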
Alternatively, fine-tuning can also be achieved using manual calibration. See Manual SenseKit Calibration for further instructions.
Improving calibration accuracy
Calibration accuracy may be improved by the following (see the sketch after this list):
increasing the number of robot movements
maximizing the angular spread between robot movements
minimizing the distance between the camera and the calibration target
minimizing the distance moved by the robot arm between two positions
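The "angular spread" advice can be made concrete: compute the pairwise geodesic angles between the gripper orientations you sampled, and prefer pose sets whose angles are large and varied. A minimal sketch with a hypothetical helper, assuming 3x3 rotation matrices:

```python
import numpy as np

def pairwise_rotation_angles(rotations):
    """Pairwise geodesic angles (degrees) between 3x3 rotation matrices.

    Small minimum/mean angles mean the sampled poses are too similar to
    constrain the hand-eye solve well; aim for a wide spread.
    """
    angles = []
    for i in range(len(rotations)):
        for j in range(i + 1, len(rotations)):
            R_rel = rotations[i].T @ rotations[j]
            cos_t = np.clip((np.trace(R_rel) - 1.0) / 2.0, -1.0, 1.0)
            angles.append(np.degrees(np.arccos(cos_t)))
    return angles
```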
For a detailed technical description of the problem, see for example:
A Comparative Review of Hand-Eye Calibration Techniques for Vision Guided Robots