Calibration and Sensor Fusion
Most modern autonomous systems in applications such as manufacturing, transportation, and construction employ multiple sensors. Sensor fusion is the process of bringing together data from multiple sensors, such as radar sensors, lidar sensors, and cameras. The fused data enables greater accuracy because it leverages the strengths of each sensor to overcome the limitations of the others.
To understand and correlate the data from individual sensors, you must develop a geometric correspondence between them. Calibration is the process of developing this correspondence. Use Lidar Toolbox™ functions to perform lidar-camera calibration. To get started, see What Is Lidar-Camera Calibration?
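The sketch below outlines the programmatic checkerboard-based workflow with Lidar Toolbox functions. It assumes paired image and point cloud files of a checkerboard target and known camera intrinsics; the file names, intrinsics values, and square size are placeholders, and exact signatures can vary by release.

```matlab
% Checkerboard-based lidar-camera calibration (minimal sketch).
% File names, intrinsics values, and square size are placeholders.
imageFiles   = {'img01.png','img02.png','img03.png'};    % calibration images
ptCloudFiles = {'scan01.pcd','scan02.pcd','scan03.pcd'}; % matching lidar scans
squareSize   = 81;  % checkerboard square size, in millimeters

% Camera intrinsics from a prior camera calibration (placeholder values).
intrinsics = cameraIntrinsics([800 800],[640 360],[720 1280]);

% Detect checkerboard corners in the images and the board plane in the scans.
[imageCorners3d,boardDim] = estimateCheckerboardCorners3d( ...
    imageFiles,intrinsics,squareSize);
lidarCheckerboardPlanes = detectRectangularPlanePoints(ptCloudFiles,boardDim);

% Estimate the rigid transformation from the lidar frame to the camera frame.
[tform,errors] = estimateLidarCameraTransform( ...
    lidarCheckerboardPlanes,imageCorners3d);
```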
You can also interactively calibrate the sensors by using the Lidar Camera Calibrator app.
Lidar Toolbox also supports downstream workflows such as projecting lidar points onto images, fusing color information into lidar point clouds, and transferring bounding boxes from camera data to lidar data.
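Each of these downstream workflows maps to a single function call once the transformation is known. A minimal sketch follows, assuming tform and intrinsics from the calibration step and a paired image and point cloud; the file names and the example bounding box are placeholders.

```matlab
% Downstream fusion steps (minimal sketch); file names are placeholders.
I = imread('img01.png');
ptCloud = pcread('scan01.pcd');

% Project the lidar points onto the image plane.
imPts = projectLidarPointsOnImage(ptCloud,intrinsics,tform);

% Fuse camera color information into the lidar point cloud.
ptCloudColored = fuseCameraToLidar(I,ptCloud,intrinsics,tform);

% Transfer a 2-D camera bounding box to a 3-D cuboid in the lidar frame.
bboxes = [100 100 50 80];  % example [x y w h] camera bounding box
lidarBboxes = bboxCameraToLidar(bboxes,ptCloud,intrinsics,tform);
```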
Apps
Lidar Camera Calibrator | Interactively estimate rigid transformation between lidar sensor and camera
Functions
Topics
- What Is Lidar-Camera Calibration? Fuse lidar and camera data.
- Guidelines to help you achieve accurate results for lidar-camera calibration.
- Overview of coordinate systems in Lidar Toolbox.
- Interactively calibrate lidar and camera sensors.
- Read and save images and point cloud data from a rosbag file, as outlined in the sketch after this list.
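The rosbag workflow uses ROS Toolbox to extract paired frames for calibration. The sketch below is a minimal outline; the bag path and topic names are placeholder assumptions.

```matlab
% Read one image and one point cloud from a rosbag (requires ROS Toolbox).
% The bag path and topic names are placeholders.
bag = rosbag('calibrationData.bag');
imgMsgs = readMessages(select(bag,'Topic','/camera/image_raw'), ...
    'DataFormat','struct');
pcMsgs  = readMessages(select(bag,'Topic','/points'), ...
    'DataFormat','struct');

% Convert the messages and save them for the calibration workflow.
I   = rosReadImage(imgMsgs{1});
xyz = rosReadXYZ(pcMsgs{1});
imwrite(I,'img01.png');
pcwrite(pointCloud(xyz),'scan01.pcd');
```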