Calibration Manual

Camera Intrinsic calibration using MATLAB

Image Acquisition

If you want to calibrate the intrinsics of a Basler Ace camera, you need raw images from the camera. First, launch the caros_camera node as follows:

roslaunch caros_camera caros_camera.launch camera_name:="basler" basler_serial:="22084405" calibrated:="0" 

The caros_camera node supports dynamically reconfigurable parameters for all currently supported cameras, which means the camera parameters can be altered at runtime. The parameters can be reconfigured through the rqt_reconfigure GUI, which can be launched as follows:

rosrun rqt_reconfigure rqt_reconfigure

Please see the cfg folder of the caros_camera node to see which dynamic reconfigurable parameters are available for each camera.

To verify that the communication with the camera is running, you can try and view an image using image_view as follows:

rosrun image_view image_view image:=/basler/caros_camera/left/image_raw

NOTICE! You should adjust the exposure and gain settings so that the chessboard used for the calibration is clearly visible, neither over-exposed nor under-exposed (look at the chessboard corners; the corner-to-corner transitions should be sharp).

Raw images of the chessboard in different poses, with varying orientation and position, must now be acquired. To acquire the images you can use the image_saver node of ROS; first run it as follows:

rosrun image_view image_saver image:=/basler/caros_camera/left/image_raw _filename_format:=left%04d.png _save_all_image:=false      

Now, in order to save an image, you must make a service call to the image_saver node from a second terminal:

rosservice call /image_saver_1520437488645967646/save

NOTICE! The image_saver node is given an anonymous node name by default, so you must find out what the actual name is, for instance using rosservice list.

For each service call you make, an image is saved to the directory where you launched the image_saver. Take approximately 50-60 images.

MATLAB Procedure

Single Camera Calibrator App Steps

Start MATLAB on the OS of your choice. Then launch the Single Camera Calibrator App, found under ‘Apps’ in the top ribbon of MATLAB. The official documentation of the tool is available Here; it gives a good description of how to navigate and use it.

To begin calibration, you must add images. You can add saved images from a folder. The calibrator automatically analyzes the images to ensure they meet the calibrator requirements. The calibrator then detects the points on the chessboard.

Add images

On the Calibration tab, in the File section, click Add Images, and then select From file. Here you add the 50-60 images that were acquired earlier.

General Suggestions

  • You can select either a standard or fisheye camera model on the Calibration tab. In the Camera Model section, select Standard.
  • You can specify two or three radial distortion coefficients. On the Calibration tab, in the Camera Model section, with Standard selected, click Options and set Radial Distortion to 3 Coefficients.
  • Select the Compute Tangential Distortion check box to make the calibrator estimate the tangential distortion coefficients.
  • Try to obtain a lower overall mean reprojection error by removing images associated with a high reprojection error. At the same time, avoid removing too many images, since this might lead to a reprojection error that cannot be trusted to represent the real error. A rule of thumb for an acceptable calibration is that the overall mean reprojection error should be below 0.5 pixels. (A sketch of the equivalent programmatic calibration is shown after this list.)
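
For reference, the steps above can also be performed programmatically with the MATLAB Computer Vision Toolbox. The following is only a minimal sketch, not part of the calibration tooling; the image folder, the file naming and the square size (30 mm) are assumptions that you must adapt to your own setup.

% Sketch: programmatic equivalent of the recommended Camera Calibrator settings.
% Assumes the images saved by image_saver are located in the current folder.
images = imageDatastore(pwd);                    % left0000.png, left0001.png, ...
[imagePoints, boardSize, imagesUsed] = detectCheckerboardPoints(images.Files);

squareSize = 30;                                 % chessboard square size in mm (assumed)
worldPoints = generateCheckerboardPoints(boardSize, squareSize);

I = imread(images.Files{find(imagesUsed, 1)});   % any used image, to obtain the image size
[cameraParams, ~, estimationErrors] = estimateCameraParameters( ...
    imagePoints, worldPoints, ...
    'NumRadialDistortionCoefficients', 3, ...
    'EstimateTangentialDistortion', true, ...
    'ImageSize', [size(I, 1), size(I, 2)]);

cameraParams.MeanReprojectionError               % rule of thumb: should be below 0.5 pixels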

Exporting the calibration

Once the calibration is finished, the calibration parameters should be exported to your MATLAB workspace. Select Export Camera Parameters > Export Parameters to Workspace to create a cameraParameters object in your workspace. The object contains the intrinsic and extrinsic parameters of the camera and the distortion coefficients. You can optionally export the cameraCalibrationErrors object, which contains the standard errors of the estimated camera parameters, by selecting the Export estimation errors check box.

If you want to export the calibration to the ROS format (.yaml file) described Here, which is supported by caros_camera, you must first take the transpose of the obtained camera matrix to align it with the layout used by ROS. You can do this as follows (if you want to avoid scientific notation, use format longG):

K = cameraParams.IntrinsicMatrix'

To print out the distortion parameters do:

cameraParams.RadialDistortion

and

cameraParams.TangentialDistortion

IMPORTANT! When you copy these values to your ROS yaml distortion vector described Here, note that the 5 parameters must be specified in the order (k1, k2, t1, t2, k3). So the tangential distortion coefficients go between k2 and k3, not after k3.
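
To avoid ordering mistakes, the camera matrix and distortion vector can also be assembled directly in MATLAB. The snippet below is only an illustrative sketch, assuming the 3-coefficient radial model selected earlier; the printed output is not a complete camera_info yaml file.

% Sketch: collect the intrinsics in the layout expected by the ROS yaml file.
format longG                                     % full precision, no scientific notation

K = cameraParams.IntrinsicMatrix';               % 3x3 camera matrix in the ROS (row-major) layout

radial     = cameraParams.RadialDistortion;      % [k1 k2 k3] (3 coefficients selected)
tangential = cameraParams.TangentialDistortion;  % [t1 t2]

% ROS plumb_bob ordering: (k1, k2, t1, t2, k3)
D = [radial(1), radial(2), tangential(1), tangential(2), radial(3)];

fprintf('camera matrix:\n');
fprintf('%.6f %.6f %.6f\n', K');                 % pass K' so fprintf prints K row by row
fprintf('distortion: %.6f %.6f %.6f %.6f %.6f\n', D);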

Extrinsic Calibration

Preparing the system

The marker holder with its associated marker pattern must be mounted on the robot tool flange using the pneumatic tool exchange system. To attach or release the tool/marker holder you can either press digital output button 0 on the UR control panel (this requires the xPC realtime script on the UR controller to be stopped), or use the GUI of the robot_module, which works with the xPC realtime script running.

The calibration system relies on having communication with the robot and the camera that is to be calibrated. Specifically, the action server for joint-space movements, joint_trap_vel_action_server, must be available on the ROS network. For communicating with cameras the caros_camera node is used, which is part of the caros framework. The general documentation for the caros_camera node can be found here.

If you want to calibrate a Basler Ace camera extrinsically, you should launch the camera node as follows:

roslaunch caros_camera caros_camera.launch camera_name:="basler" basler_serial:="22084405" calibrated:="1" 

The caros_camera node supports dynamically reconfigurable parameters for all currently supported cameras, which means the camera parameters can be altered at runtime. The parameters can be reconfigured through the rqt_reconfigure GUI, which can be launched as follows:

rosrun rqt_reconfigure rqt_reconfigure

Please see the cfg folder of the caros_camera node to see which dynamic reconfigurable parameters are available for each camera.

To verify that the communication with the camera is running, you can try and view an image using image_view as follows:

rosrun image_view image_view image:=/basler/caros_camera/left/image_rect

NOTICE! You should adjust the exposure and gain settings so that the marker is clearly visible, neither over-exposed nor under-exposed.

To verify the robot communication, you can try to echo the joint states as follows:

rostopic echo /ur10_1/joint_states -c

Once communication for the robot and camera is running we are ready to start the calibration.

Starting the Calibration

To start the calibration you must first run the caros_calibration node. You can launch the node as follows:

roslaunch caros_calibration caros_calibration.launch num_measurements:="100" use_dh_params:="false"

The following parameters are available for the calibration node:

Parameter          Description                                                              Default
num_measurements   Number of measurements to take during the sampling procedure            100
use_dh_params      Whether the Denavit-Hartenberg parameters should be calibrated as well  true
camera_in_hand     Whether to calibrate camera-in-hand or marker-in-hand                   false

It is important to set the num_measurements parameter to the desired value when launching the calibration node, since it cannot be changed while the calibration is running.

Once the caros_calibration node is running, the user interface for the calibration must be started. This is done by running RobWorkStudio as follows:

cd ~/RobWork/RobWorkStudio/bin/release
./RobWorkStudio

RobWorkStudio will then automatically load the ReconCell Calibration Plugin. You should then be presented with the following GUI:

Calibration GUI

As can be seen in the above image, there are several options available. If you just want to calibrate fully automatically, simply click the blue ‘Calibrate’ button. If you want a supervised mode in which the robot movements are confirmed by the user, enable Supervisor Mode by setting the switch to ON.