Fast head detection in arbitrary poses using depth information
ISSN: 0260-2288
Article publication date: 12 March 2020
Issue publication date: 26 May 2020
Abstract
Purpose
This study aims to develop a real-time algorithm that can detect people even in arbitrary poses. To cope with poor and changing lighting conditions, it does not rely on color information. The developed method is expected to run on computers with low computational resources so that it can be deployed on autonomous mobile robots.
Design/methodology/approach
The method is designed as a people-detection pipeline comprising a series of operations. Efficient point cloud processing steps with a novel head extraction operation provide possible head clusters in the scene. Classifying these clusters with support vector machines yields a fast and robust people detector.
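The pipeline summarized above can be sketched roughly as follows. This is purely illustrative: the height band, feature vector and SVM weights are hypothetical placeholders, not the authors' trained model or implementation.

```python
# Illustrative sketch of the detection pipeline: extract candidate head
# clusters from a depth point cloud, compute a feature vector per cluster
# and score it with a linear SVM decision function.
# All thresholds, features and weights here are hypothetical placeholders.

def extract_head_clusters(points):
    """Stand-in head-extraction step: keep 3D points (x, y, z in metres)
    whose height falls in a plausible head band."""
    return [p for p in points if 1.3 <= p[2] <= 2.0]

def svm_is_head(features, weights=(2.0, 1.5), bias=-2.0):
    """Linear SVM decision function f(x) = w . x + b; a positive score
    classifies the candidate cluster as a head."""
    score = sum(w * x for w, x in zip(weights, features)) + bias
    return score > 0

# Hypothetical feature vectors (roundness, height ratio):
print(svm_is_head((0.9, 0.8)))  # round, well-proportioned cluster -> True
print(svm_is_head((0.2, 0.3)))  # flat, elongated cluster -> False
```

In a real system the feature computation and the SVM parameters would come from geometric descriptors of the cloud and from training, respectively; the point here is only the shape of the pipeline: cluster extraction followed by per-cluster classification.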
Findings
The method is implemented on an autonomous mobile robot, and results show that it can detect people at a frame rate of 28 Hz with an equal error rate of 92 per cent. The detector also classifies people effectively in various non-standard poses.
Research limitations/implications
The main limitations are false positives caused by point clouds that resemble head shapes and false negatives caused by disruptive accessories (such as large hats). Both can be mitigated with sufficient training samples.
Practical implications
The method can be used in industrial and social mobile applications because of its robustness, low computational requirements and low power consumption.
Originality/value
The paper introduces a novel and efficient technique to detect people in arbitrary poses, under poor lighting conditions and with low computational resources. Solving all of these problems in a single, lightweight method makes the study fulfill an important need for collaborative and autonomous mobile robots.
Citation
Hacinecipoglu, A., Konukseven, E.I. and Koku, A.B. (2020), "Fast head detection in arbitrary poses using depth information", Sensor Review, Vol. 40 No. 2, pp. 175-182. https://doi.org/10.1108/SR-05-2019-0127
Publisher
Emerald Publishing Limited
Copyright © 2020, Emerald Publishing Limited