One-shot gesture recognition with attention-based DTW for human-robot collaboration
ISSN: 0144-5154
Article publication date: 23 August 2019
Issue publication date: 18 February 2020
Abstract
Purpose
This paper aims to present a one-shot gesture recognition approach that can serve as a highly efficient communication channel in human–robot collaboration systems.
Design/methodology/approach
This paper applies dynamic time warping (DTW) to align two gesture sequences in the temporal domain, using a novel frame-wise distance measure that matches local features in the spatial domain. Furthermore, a novel and robust bidirectional attention region extraction method is proposed to retain information in both the movement and hold phases of a gesture.
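To illustrate the mechanism described above, here is a minimal sketch of classic DTW with a pluggable frame-wise distance. This is not the authors' implementation: their distance measure matches local spatial features with an attention mechanism, whereas the `euclid` function below is a stand-in Euclidean distance over hypothetical frame descriptors, used only to show where such a measure plugs in.

```python
import numpy as np

def dtw_distance(seq_a, seq_b, frame_dist):
    """Align two sequences of frame descriptors with dynamic time
    warping, accumulating a pluggable frame-wise distance."""
    n, m = len(seq_a), len(seq_b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = frame_dist(seq_a[i - 1], seq_b[j - 1])
            cost[i, j] = d + min(cost[i - 1, j],      # skip a frame in seq_a
                                 cost[i, j - 1],      # skip a frame in seq_b
                                 cost[i - 1, j - 1])  # match both frames
    return cost[n, m]

# Stand-in frame-wise distance (the paper uses a local-feature measure).
euclid = lambda a, b: float(np.linalg.norm(a - b))

# A gesture and a time-warped (slower) copy align at zero cost.
a = np.array([[0.0], [1.0], [2.0]])
b = np.array([[0.0], [0.0], [1.0], [2.0], [2.0]])
print(dtw_distance(a, b, euclid))  # 0.0
```

Because the frame-wise distance is a parameter, the same alignment skeleton works whether frames are raw joint coordinates or the local feature descriptors the paper proposes.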
Findings
The proposed approach provides efficient one-shot gesture recognition without elaborately designed features. Experiments on a social robot (JiaJia) demonstrate that the approach can be flexibly integrated into a human–robot collaboration system.
Originality/value
To the best of the authors' knowledge, no prior work achieves efficient gesture recognition with a simple local feature descriptor while combining the advantages of local features with DTW.
Citation
Kuang, Y., Cheng, H., Zheng, Y., Cui, F. and Huang, R. (2020), "One-shot gesture recognition with attention-based DTW for human-robot collaboration", Assembly Automation, Vol. 40 No. 1, pp. 40-47. https://doi.org/10.1108/AA-11-2018-0228
Publisher
Emerald Publishing Limited
Copyright © 2019, Emerald Publishing Limited