Asanka G. Perera, Yee Wei Law, Ali Al-Naji and Javaan Chahl
Abstract
Purpose
The purpose of this paper is to present a preliminary solution to address the problem of estimating human pose and trajectory by an aerial robot with a monocular camera in near real time.
Design/methodology/approach
The distinguishing feature of the solution is a dynamic classifier selection architecture. Each video frame is corrected for perspective using projective transformation. A silhouette is then extracted and described by a Histogram of Oriented Gradients (HOG) feature vector, which is classified using a dynamic classifier. A class is defined as a pose-viewpoint pair, and a total of 64 classes are defined to represent a forward walking and turning gait sequence. The dynamic classifier consists of a Support Vector Machine (SVM) classifier C64 that recognizes all 64 classes, plus 64 SVM classifiers that recognize four classes each; the four classes assigned to each of these classifiers are chosen based on the temporal relationship between them, as dictated by the gait sequence.
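A minimal sketch of this dynamic selection scheme is given below, assuming off-the-shelf tooling (scikit-image's HOG descriptor and scikit-learn's LinearSVC). The successor_classes neighborhood function and all other names here are illustrative assumptions, not the authors' released code.

# Illustrative sketch only: the paper's exact features, kernels and
# training data are not reproduced here. Labels 0..63 encode the
# pose-viewpoint pairs of the walking/turning gait sequence.
import numpy as np
from skimage.feature import hog
from sklearn.svm import LinearSVC

N_CLASSES = 64  # 64 pose-viewpoint pairs

def hog_descriptor(silhouette):
    """Describe a perspective-corrected silhouette patch as a HOG vector."""
    return hog(silhouette, orientations=9, pixels_per_cell=(8, 8),
               cells_per_block=(2, 2))

def successor_classes(c):
    """Hypothetical 4-class temporal neighborhood of class c.

    The paper chooses the four classes reachable from c in the gait
    sequence; a plausible stand-in here is c plus the next three
    classes in a cyclic ordering.
    """
    return [(c + k) % N_CLASSES for k in range(4)]

def train_dynamic_classifier(X, y):
    """Train the global 64-class SVM plus one 4-class SVM per class."""
    c64 = LinearSVC().fit(X, y)
    local = {}
    for c in range(N_CLASSES):
        mask = np.isin(y, successor_classes(c))
        local[c] = LinearSVC().fit(X[mask], y[mask])
    return c64, local

def classify(descriptor, c64, local, previous_class=None):
    """Dynamic selection: fall back to C64 when no temporal context
    exists; otherwise use the cheap 4-class classifier keyed by the
    previously estimated class."""
    x = descriptor.reshape(1, -1)
    clf = c64 if previous_class is None else local[previous_class]
    return int(clf.predict(x)[0])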
Findings
The solution provides three main advantages. First, classification is efficient owing to dynamic selection (4-class rather than 64-class classification). Second, classification errors are confined to the neighbors of the true viewpoint: a wrongly estimated viewpoint is at most an adjacent viewpoint of the true one, enabling fast recovery from incorrect estimates. Third, the robust temporal relationship between poses is used to resolve the left-right ambiguities of human silhouettes.
Originality/value
Experiments conducted on both fronto-parallel videos and aerial videos confirm that the solution achieves accurate pose and trajectory estimation for both kinds of videos. For example, on the “walking on an 8-shaped path” data set (1,652 frames), the solution achieves estimation accuracies of 85 percent for viewpoints and 98.14 percent for poses.
Abstract
Purpose
Insects depend on the spatial, temporal and spectral distribution of light in the environment for navigation, collision avoidance and flight control. The principles of insect vision have been gradually revealed over the course of decades by biological scientists. The purpose of this paper is to report on bioinspired implementations of these sensing principles and reflexes, and on their flight tests aboard unmanned aerial vehicles (UAVs). The devices are used for the stabilization of UAVs in attitude, heading and position. The implementations were developed to test the hypothesis that the current understanding of insect optical flight control systems can be realized in practical engineered systems.
Design/methodology/approach
The designs were based on behavioral and anatomical studies of insects. The approach taken was to test the designs in flight on a UAV.
Findings
The research showed that stabilization in attitude, heading and position is possible using the developed sensors.
Practical implications
Partial alternatives to magnetic, inertial and GPS sensing have been demonstrated. Optical flow and polarization compassing are particularly relevant to flight in urban environments and in planetary exploration.
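For illustration, the e-vector geometry underlying polarization compassing can be sketched in a few lines. This is the generic Stokes-parameter construction for three polarizer readings at 0, 60 and 120 degrees, not the authors' sensor design.

# Minimal sketch of the e-vector geometry behind a polarization
# compass, not the authors' sensor. Intensities measured through
# linear polarizers at 0, 60 and 120 degrees determine the linear
# Stokes parameters S0, Q, U, from which the angle of polarization
# follows.
import math

def polarization_angle(i0, i60, i120):
    """Recover the e-vector angle (radians) from three polarizer readings.

    For partially linearly polarized light,
        I(theta) = (S0 + Q*cos(2*theta) + U*sin(2*theta)) / 2,
    so three measurements at 0, 60 and 120 degrees invert to S0, Q, U.
    """
    s0 = 2.0 * (i0 + i60 + i120) / 3.0          # total intensity
    q = 2.0 * (2.0 * i0 - i60 - i120) / 3.0     # linear Stokes Q
    u = 2.0 * (i60 - i120) / math.sqrt(3.0)     # linear Stokes U
    return 0.5 * math.atan2(u, q)               # angle of polarization

Because the sky's e-vector pattern is symmetric about the solar meridian, the recovered angle constrains heading relative to the sun's azimuth, which is how a polarization compass can serve as a partial alternative to a magnetic compass.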
Originality/value
For the first time, multispectral horizon sensing, polarization compassing and optical flow-based heading control have been demonstrated in flight.