Automated Extrinsic Calibration of Multi-Cameras and LiDAR
Xinyu Zhang, Yijin Xiong, Qianxin Qu, Shifan Zhu, Shichun Guo, Dafeng Jin, Guoying Zhang, Haibing Ren, Jun Li
Published 2024 in IEEE Transactions on Instrumentation and Measurement
ABSTRACT
In intelligent driving systems, the multisensor fusion perception system comprising multiple cameras and LiDAR has become a crucial component. Stable extrinsic parameters among the devices in a multisensor fusion system are essential for all-weather sensing with no blind zones. However, prolonged vehicle usage can introduce sensor offsets that are difficult to measure directly and that lead to perception deviations. To this end, we study unified multisensor calibration rather than, as in prior work, calibration between a single pair of sensors. By exploiting the mutual pose constraints between different sensor pairs, the method improves calibration accuracy by around 20% compared with pairwise calibration. The study can serve as a foundation for unified multisensor calibration, enabling the automatic joint optimization of all camera and LiDAR sensors onboard a vehicle within a single framework.
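The mutual pose constraints underpinning the abstract's accuracy claim can be illustrated with a minimal cycle-consistency sketch (all function and variable names here are hypothetical, and this is not the paper's implementation): for two cameras and one LiDAR, composing the camera-to-LiDAR extrinsics around a loop should yield the identity transform, and the deviation from identity gives a residual that a unified calibration can minimize jointly over all sensor pairs.

```python
import numpy as np

def pose_inv(T):
    """Invert a 4x4 rigid transform."""
    Ti = np.eye(4)
    R, t = T[:3, :3], T[:3, 3]
    Ti[:3, :3] = R.T
    Ti[:3, 3] = -R.T @ t
    return Ti

def make_pose(yaw, t):
    """Build a 4x4 transform from a yaw angle (rad) and a translation."""
    c, s = np.cos(yaw), np.sin(yaw)
    T = np.eye(4)
    T[:3, :3] = [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]
    T[:3, 3] = t
    return T

def cycle_residual(T_c1_l, T_c2_l, T_c2_c1):
    """Residual of the loop lidar -> cam1 -> cam2 -> lidar.

    For mutually consistent extrinsics, inv(T_c2_l) @ T_c2_c1 @ T_c1_l
    is the identity. Returns (rotation error in rad, translation error).
    """
    loop = pose_inv(T_c2_l) @ T_c2_c1 @ T_c1_l
    cos_a = np.clip((np.trace(loop[:3, :3]) - 1.0) / 2.0, -1.0, 1.0)
    return np.arccos(cos_a), np.linalg.norm(loop[:3, 3])

# Consistent extrinsics (illustrative values): residual is ~0.
T_c1_l = make_pose(0.30, [0.5, 0.0, 1.2])   # lidar -> camera 1
T_c2_c1 = make_pose(0.10, [0.2, 0.1, 0.0])  # camera 1 -> camera 2
T_c2_l = T_c2_c1 @ T_c1_l                   # lidar -> camera 2

rot_err, trans_err = cycle_residual(T_c1_l, T_c2_l, T_c2_c1)

# A drifted camera-2 extrinsic breaks the loop; a unified calibration
# would drive this residual down over all sensor pairs simultaneously.
T_c2_l_drift = make_pose(0.02, [0.03, 0.0, 0.0]) @ T_c2_l
rot_err_d, trans_err_d = cycle_residual(T_c1_l, T_c2_l_drift, T_c2_c1)
print(rot_err, trans_err, rot_err_d, trans_err_d)
```

With more sensors, one such residual per loop in the sensor graph couples every pairwise extrinsic, which is the kind of mutual constraint the paper credits for the accuracy gain over independent pairwise calibration.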
PUBLICATION RECORD
- Fields of study: Computer Science, Engineering, Environmental Science
- Source metadata: Semantic Scholar
REFERENCES
46 references (not listed)
CITED BY
12 citing papers (not listed)