Zeguo Yang, Mantian Li, Fusheng Zha, Xin Wang, Pengfei Wang and Wei Guo
Abstract
Purpose
This paper aims to introduce an imitation learning framework for a wheeled mobile manipulator based on dynamical movement primitives (DMPs). A novel mobile manipulator with the capability to learn from demonstration is introduced. The study then explains the whole process by which a wheeled mobile manipulator learns a demonstrated task and generalizes it to new situations. Two visual tracking controllers are designed: one for recording human demonstrations and one for monitoring robot operations. The study clarifies how human demonstrations can be learned by a wheeled mobile manipulator and generalized to new situations.
Design/methodology/approach
The kinematic model of a mobile manipulator is analyzed. An RGB-D camera is used to record the demonstration trajectories and observe robot operations. To keep the human demonstration within the camera's field of view, a visual tracking controller is designed based on the kinematic model of the mobile manipulator. The demonstration trajectories are then represented by DMPs and learned by the mobile manipulator with the corresponding models. A second tracking controller, also based on the kinematic model of the mobile manipulator, monitors and corrects the robot's operations.
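For readers unfamiliar with DMPs, the sketch below shows a minimal one-dimensional discrete DMP in Python (an Ijspeert-style transformation and canonical system, with forcing-term weights fit by locally weighted regression). It illustrates in general terms how a demonstrated trajectory can be encoded and then generalized to a new start and goal; the class name, gains and basis-function count are illustrative assumptions, not the implementation used in the paper.

```python
import numpy as np

class DMP1D:
    """Minimal discrete dynamical movement primitive (one dimension).

    Generic Ijspeert-style sketch for illustration only; gains, basis count
    and class name are assumptions, not the paper's implementation.
    """

    def __init__(self, n_basis=30, alpha_y=25.0, beta_y=6.25, alpha_x=4.0):
        self.n_basis = n_basis
        self.alpha_y, self.beta_y, self.alpha_x = alpha_y, beta_y, alpha_x
        # Gaussian basis centers placed along the canonical phase x in (0, 1].
        self.c = np.exp(-alpha_x * np.linspace(0.0, 1.0, n_basis))
        self.h = 1.0 / (np.diff(self.c, append=self.c[-1] * 0.5) ** 2)
        self.w = np.zeros(n_basis)

    def _psi(self, x):
        return np.exp(-self.h * (x - self.c) ** 2)

    def fit(self, y_demo, dt):
        """Learn forcing-term weights from one demonstrated trajectory."""
        y_demo = np.asarray(y_demo, dtype=float)
        T = len(y_demo)
        self.tau = T * dt
        self.y0, self.g = y_demo[0], y_demo[-1]
        yd = np.gradient(y_demo, dt)
        ydd = np.gradient(yd, dt)
        x = np.exp(-self.alpha_x * np.arange(T) * dt / self.tau)
        # Forcing term that would exactly reproduce the demonstration.
        f_target = (self.tau ** 2) * ydd - self.alpha_y * (
            self.beta_y * (self.g - y_demo) - self.tau * yd)
        s = x * (self.g - self.y0)
        for i in range(self.n_basis):  # locally weighted regression per basis
            psi_i = np.exp(-self.h[i] * (x - self.c[i]) ** 2)
            self.w[i] = (s * psi_i) @ f_target / ((s * psi_i) @ s + 1e-10)

    def rollout(self, y0=None, g=None, dt=0.01):
        """Reproduce the motion, optionally generalized to a new start/goal."""
        y0 = self.y0 if y0 is None else y0
        g = self.g if g is None else g
        y, z, x, traj = y0, 0.0, 1.0, []
        for _ in range(int(self.tau / dt)):
            psi = self._psi(x)
            f = (psi @ self.w) * x * (g - y0) / (psi.sum() + 1e-10)
            z += ((self.alpha_y * (self.beta_y * (g - y) - z) + f) / self.tau) * dt
            y += (z / self.tau) * dt
            x += (-self.alpha_x * x / self.tau) * dt
            traj.append(y)
        return np.array(traj)
```

In such a scheme, each Cartesian dimension of a recorded demonstration would be fit separately, and calling rollout with a new goal generalizes the learned motion to a new target pose.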
Findings
To verify the effectiveness of the imitation learning framework, several daily tasks are demonstrated to and learned by the mobile manipulator. The results indicate that the presented approach performs well in enabling a wheeled mobile manipulator to learn tasks from human demonstrations. The only thing a robot user needs to do is provide demonstrations, which greatly facilitates the application of mobile manipulators.
Originality/value
The research fulfills the need for a wheeled mobile manipulator to learn tasks from demonstrations instead of manual planning. Similar approaches can be applied to mobile manipulators with different architectures.
Fei Qi, Yiwei Ge and Xianjun Liu
Abstract
Purpose
This paper aims to present a kinematics performance analysis and control for a continuum robot based on a dynamic model to achieve control of the robot.
Design/methodology/approach
To analyze the motion characteristics of the robot, its kinematics model is derived by a geometric analysis method, and the influence of the robot's configuration parameters on the workspace is investigated. Moreover, a dynamic model is established using the principle of virtual work to analyze the mapping relationship between the bending shape and the forces/torques applied to the robot. To achieve better control of the robot, a control strategy for the continuum robot based on the dynamic model is put forward.
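As a rough illustration of a geometric kinematics model for a single continuum section, the sketch below computes the piecewise-constant-curvature forward kinematics commonly used in such analyses. The function name and the arc parameterization (curvature, bending-plane angle, arc length) are assumptions for illustration; the paper's specific model and its virtual-work dynamic formulation are not reproduced here.

```python
import numpy as np

def constant_curvature_fk(kappa, phi, length):
    """Base-to-tip homogeneous transform of one constant-curvature section.

    kappa  : curvature (1/m)
    phi    : bending-plane angle about the base z-axis (rad)
    length : arc length of the section (m)

    Textbook piecewise-constant-curvature mapping, given only as a generic
    example of geometric kinematic analysis for continuum robots.
    """
    theta = kappa * length  # total bending angle of the arc
    if abs(kappa) < 1e-9:   # straight configuration: pure translation along z
        p = np.array([0.0, 0.0, length])
    else:
        r = 1.0 / kappa     # bending radius
        p = np.array([r * (1 - np.cos(theta)) * np.cos(phi),
                      r * (1 - np.cos(theta)) * np.sin(phi),
                      r * np.sin(theta)])
    # Orientation: rotate into the bending plane, bend about y, rotate back.
    Rz = lambda a: np.array([[np.cos(a), -np.sin(a), 0.0],
                             [np.sin(a),  np.cos(a), 0.0],
                             [0.0, 0.0, 1.0]])
    Ry = lambda a: np.array([[ np.cos(a), 0.0, np.sin(a)],
                             [0.0, 1.0, 0.0],
                             [-np.sin(a), 0.0, np.cos(a)]])
    T = np.eye(4)
    T[:3, :3] = Rz(phi) @ Ry(theta) @ Rz(-phi)
    T[:3, 3] = p
    return T

# Example: a 0.2 m section bent with curvature 5 1/m in the x-z plane.
tip_pose = constant_curvature_fk(kappa=5.0, phi=0.0, length=0.2)
```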
Findings
Results of the simulations and experiments verify the proposed continuum structure and motion model: the maximum position error is 5.36 mm when the robot performs a planar bending motion, and the average position error in spatial circular motion is 5.84 mm. The proposed model can accurately describe the deformation movement of the robot and realize its motion control with small position errors.
Originality/value
The kinematics analysis and control model proposed in this paper can achieve precise control of the robot and can serve as a reference for the motion planning and shape reconstruction of continuum robots.