Multi-data sensor fusion framework to detect transparent object for the efficient mobile robot mapping

Ravinder Singh (Department of Instrumentation and Control Engineering, Dr B.R. Ambedkar National Institute of Technology, Jalandhar, India)
Kuldeep Singh Nagla (Department of Instrumentation and Control Engineering, Dr B.R. Ambedkar National Institute of Technology, Jalandhar, India)

International Journal of Intelligent Unmanned Systems

ISSN: 2049-6427

Article publication date: 7 January 2019
Abstract

Purpose

Efficient perception of complex environments is a foremost requirement in mobile robotics. Glass walls and automated transparent doors have become a prominent feature of modern building interiors, and they cause various range sensors to perceive the environment incorrectly. The perception generated by multi-data sensor fusion (MDSF) of sonar and laser is fairly consistent in detecting glass, but it is still affected by issues such as sensor inaccuracies, sensor reliability, scan mismatching due to glass, the sensor model, probabilistic approaches to sensor fusion and sensor registration. This paper aims to address these issues.

Design/methodology/approach

This paper presents a modified framework, the Advanced Laser and Sonar Framework (ALSF), which fuses the sensory information of a laser scanner and sonar to reduce the uncertainty caused by glass in the environment by selecting the optimal range information with respect to a chosen threshold value. The conventional sonar sensor model is also modified to reduce wrong perception in sonar arising from diverse range measurements, and the laser scan-matching algorithm is modified to discard small clusters of laser points (with respect to range information) for more reliable perception.
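
The ALSF algorithm itself is not reproduced on this page; a minimal Python sketch of the two ideas described above, threshold-based selection between laser and sonar ranges and removal of small laser point clusters before scan matching, might look as follows (all function names, parameter values and the exact selection rule are illustrative assumptions, not taken from the paper):

```python
import numpy as np

def fuse_ranges(laser, sonar, threshold=0.5):
    """Illustrative ALSF-style per-beam selection between laser and sonar.

    Where the two sensors disagree by more than `threshold` metres
    (e.g. the laser beam passes through glass and over-reads), fall
    back to the sonar reading; otherwise keep the higher-resolution
    laser reading. The threshold value here is an assumption.
    """
    laser = np.asarray(laser, dtype=float)
    sonar = np.asarray(sonar, dtype=float)
    return np.where(np.abs(laser - sonar) > threshold, sonar, laser)

def drop_small_clusters(ranges, gap=0.3, min_size=3):
    """Invalidate clusters of fewer than `min_size` consecutive points.

    Consecutive readings whose ranges differ by less than `gap` metres
    are grouped into one cluster; tiny clusters (often spurious
    reflections near glass) are set to NaN before scan matching.
    """
    out = np.asarray(ranges, dtype=float).copy()
    start = 0
    for i in range(1, len(out) + 1):
        if i == len(out) or abs(out[i] - out[i - 1]) > gap:
            if i - start < min_size:
                out[start:i] = np.nan
            start = i
    return out
```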

Findings

With the modified sonar sensor model, the probability of occupied cells becomes consistent across diverse sonar range measurements. The scan-matching technique is likewise modified to reduce the uncertainty caused by glass and to lower the computational load, yielding efficient and fast pose estimation of the laser sensor/mobile robot for robust mapping. These modifications are combined in the proposed ALSF technique to reduce glass-induced uncertainty, inconsistent probabilities and heavy computation during the generation of occupancy grid maps with MDSF. Various real-world experiments were performed with the proposed approach implemented on a mobile robot fitted with laser and sonar, and the results are compared qualitatively and quantitatively with conventional approaches.
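
For readers unfamiliar with the occupancy grid terminology above, the sketch below shows the standard log-odds cell update that such maps are built on (textbook form; the paper's modified sonar model replaces the fixed update constants with a measurement-dependent weighting, which is not reproduced here):

```python
import math

def update_cell(logodds, hit, l_occ=0.85, l_free=-0.4):
    """Standard log-odds occupancy update for a single grid cell.

    `hit` is True when the (fused) range measurement terminates in the
    cell, False when the beam passes through it. The constants l_occ
    and l_free are conventional textbook values, not from the paper.
    """
    return logodds + (l_occ if hit else l_free)

def occupancy_probability(logodds):
    """Convert a cell's log-odds value back to a probability."""
    return 1.0 - 1.0 / (1.0 + math.exp(logodds))
```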

Originality/value

The proposed ALSF approach generates an efficient perception of complex environments containing glass and can be implemented in various robotics applications.

Citation

Singh, R. and Nagla, K.S. (2019), "Multi-data sensor fusion framework to detect transparent object for the efficient mobile robot mapping", International Journal of Intelligent Unmanned Systems, Vol. 7 No. 1, pp. 2-18. https://doi.org/10.1108/IJIUS-05-2018-0013

Publisher

Emerald Publishing Limited

Copyright © 2019, Emerald Publishing Limited