LIDAR AND CAMERA CALIBRATION USING MOUNTED SPHERE

VALUE PROPOSITION

Combining data from multiple sensors has become an increasingly important part of robotic and autonomous systems. Cameras and lidars generate complementary data that can be fused to improve overall accuracy. Sensor fusion, however, requires that the precise extrinsic calibration between the sensors be known. Whenever sensors are adjusted or accidentally shifted, which can happen frequently during product development, the alignment must be re-estimated. Achieving extrinsic calibration in a fast and simple way is the goal of this technology.

 

 

DESCRIPTION OF TECHNOLOGY

This technology is a simple, fast, and robust method for performing extrinsic calibration between a camera and a lidar. The only required calibration target is a hand-held colored sphere mounted on a white board. The inventors developed neural networks that automatically localize the sphere relative to both the camera and the lidar. Then, using a localization covariance model, they calculate the relative pose between the camera and the lidar. Accurate calibration results were demonstrated against ground truth data.
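The final step above, computing the relative pose from matched sphere-center estimates, can be illustrated with a minimal sketch. This is not the inventors' covariance-weighted method; it is a simpler, unweighted least-squares alignment (the Kabsch/SVD algorithm) over hypothetical sphere-center observations expressed in each sensor's frame:

```python
import numpy as np

def estimate_extrinsics(cam_pts, lidar_pts):
    """Estimate the rigid transform (R, t) such that cam ~= R @ lidar + t,
    given matched (N, 3) arrays of sphere-center estimates from each sensor.
    Unweighted Kabsch/SVD alignment; a sketch, not the licensed method."""
    cam_pts = np.asarray(cam_pts, dtype=float)
    lidar_pts = np.asarray(lidar_pts, dtype=float)
    # Center both point sets on their centroids.
    mu_c, mu_l = cam_pts.mean(axis=0), lidar_pts.mean(axis=0)
    Pc, Pl = cam_pts - mu_c, lidar_pts - mu_l
    # Cross-covariance matrix; its SVD yields the least-squares rotation.
    H = Pl.T @ Pc
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection (det = -1) in the recovered rotation.
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = mu_c - R @ mu_l
    return R, t
```

In practice each sphere-center pair would come from the networks' camera-frame and lidar-frame localizations, and a covariance model (as in this technology) would down-weight less certain observations.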

 

BENEFITS

•       Automatic extrinsic calibration using sphere design

•       Convolutional neural networks

 

APPLICATIONS

•       Light Detection and Ranging (LiDAR)

 

IP STATUS

Software Copyright

 

LICENSING RIGHTS AVAILABLE

Full licensing rights available

 

DEVELOPER: Dr. Daniel Morris

 

TECH ID: TEC2021-0039

 


For Information, Contact:

Raymond Devito
Technology Manager
Michigan State University
devitora@msu.edu