Extrinsic calibration method for integrating infrared thermal imaging camera and 3D LiDAR

Dan Zhang (School of Building Environment Engineering, Zhengzhou University of Light Industry, Zhengzhou, China)
Junji Yuan (School of Building Environment Engineering, Zhengzhou University of Light Industry, Zhengzhou, China)
Haibin Meng (School of Building Environment Engineering, Zhengzhou University of Light Industry, Zhengzhou, China)
Wei Wang (School of Building Environment Engineering, Zhengzhou University of Light Industry, Zhengzhou, China)
Rui He (School of Building Environment Engineering, Zhengzhou University of Light Industry, Zhengzhou, China)
Sen Li (School of Building Environment Engineering, Zhengzhou University of Light Industry, Zhengzhou, China)

Sensor Review

ISSN: 0260-2288

Article publication date: 4 June 2024

Issue publication date: 26 June 2024

Abstract

Purpose

In the context of fire incidents within buildings, efficient scene perception by firefighting robots is particularly crucial. Although individual sensors can provide specific types of data, achieving deep data correlation among multiple sensors poses challenges. To address this issue, this study aims to explore a fusion approach integrating thermal imaging cameras and LiDAR sensors to enhance the perception capabilities of firefighting robots in fire environments.

Design/methodology/approach

Prior to sensor fusion, accurate calibration of the sensors is essential. This paper proposes an extrinsic calibration method based on rigid body transformation. The collected data is optimized using the Ceres optimization algorithm to obtain precise calibration parameters. Building upon this calibration, a sensor fusion method based on coordinate projection transformation is proposed, enabling real-time mapping between images and point clouds. In addition, the data collection effectiveness of the proposed fusion device is validated in experimental smoke-filled fire environments.
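The coordinate projection transformation described above can be sketched as follows: LiDAR points are first mapped into the thermal camera frame by the calibrated rigid body transformation (rotation R and translation t), then projected onto the image plane through the camera intrinsics. This is a minimal illustration, not the authors' implementation; the numerical values of R, t and K below are placeholders, not calibration results from the paper.

```python
import numpy as np

# Hypothetical calibration parameters (illustrative values only).
R = np.eye(3)                        # rotation: LiDAR frame -> thermal camera frame
t = np.array([0.1, 0.0, -0.05])      # translation in metres
K = np.array([[400.0,   0.0, 320.0],  # assumed thermal camera intrinsic matrix
              [  0.0, 400.0, 256.0],
              [  0.0,   0.0,   1.0]])

def project_points(points_lidar: np.ndarray) -> np.ndarray:
    """Map LiDAR points (N, 3) to pixel coordinates (M, 2) via the extrinsics."""
    pts_cam = points_lidar @ R.T + t          # rigid body transformation
    pts_cam = pts_cam[pts_cam[:, 2] > 0]      # keep points in front of the camera
    uv = pts_cam @ K.T                        # perspective projection
    return uv[:, :2] / uv[:, 2:3]             # normalise by depth

pts = np.array([[1.0, 0.2, 5.0],
                [0.5, -0.1, 3.0]])
print(project_points(pts))
```

In a fused system, each projected pixel location can then be associated with the thermal intensity at that coordinate, giving every LiDAR point a temperature attribute.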

Findings

The average reprojection error obtained by the extrinsic calibration method based on rigid body transformation is 1.02 pixels, indicating good accuracy. The fused data combines the advantages of thermal imaging cameras and LiDAR, overcoming the limitations of individual sensors.
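The reported figure of 1.02 pixels is a mean reprojection error: the average pixel distance between the points projected through the estimated extrinsics and their observed image locations. A minimal sketch of how such an error is typically computed (illustrative, not the paper's evaluation code):

```python
import numpy as np

def mean_reprojection_error(projected: np.ndarray, observed: np.ndarray) -> float:
    """Average Euclidean distance (in pixels) between projected and observed 2D points."""
    return float(np.mean(np.linalg.norm(projected - observed, axis=1)))

# Toy example with two point correspondences.
proj = np.array([[100.0, 200.0], [150.0, 250.0]])
obs = np.array([[101.0, 200.0], [150.0, 252.0]])
print(mean_reprojection_error(proj, obs))  # (1.0 + 2.0) / 2 = 1.5
```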

Originality/value

This paper introduces an extrinsic calibration method based on rigid body transformation, along with a sensor fusion approach based on coordinate projection transformation. The effectiveness of this fusion strategy is validated in simulated fire environments.

Acknowledgements

This work was supported by the Science and Technology Department of Henan Province through Henan Provincial Science and Technology Research Projects No. 242102320211, No. 232102320012 and No. 222102320232.

Declarations

Conflicts of interest: The authors declare that there is no conflict of interest regarding the publication of this article.

Citation

Zhang, D., Yuan, J., Meng, H., Wang, W., He, R. and Li, S. (2024), "Extrinsic calibration method for integrating infrared thermal imaging camera and 3D LiDAR", Sensor Review, Vol. 44 No. 4, pp. 490-504. https://doi.org/10.1108/SR-04-2024-0292

Publisher

Emerald Publishing Limited

Copyright © 2024, Emerald Publishing Limited
