Myoelectric Control via Classifying Dynamic Motions

As computer technology spreads, computerized devices and intelligent systems have become part of everyday human life. Accordingly, the need to interface with these devices conveniently and easily has grown. To meet this need, numerous Human-Computer Interface (HCI) approaches have been studied. A myoelectric interface is an HCI strategy that uses myoelectric signals to transfer human thoughts or intentions to a device. Many studies have focused on the myoelectric interface because myoelectric signals are well suited to representing human movements.

The myoelectric interface offers several benefits: 1) familiarity of gestures, 2) no limitation on outdoor activities, 3) no interference with hand use, and 4) information about muscle force. These benefits are quite useful for real applications. For example, a myoelectric interface could control an Unmanned Ground Vehicle (UGV) or an Unmanned Aerial Vehicle (UAV) performing surveillance and dangerous tasks in military environments; compared with a data glove, the familiarity of hand gestures, the lack of restriction on outdoor activities, and the absence of interference with hand use are real advantages. Moreover, virtual reality (VR) and augmented reality (AR) are being pursued as next-generation technologies by major IT companies such as Google and Microsoft, and they also require interfacing techniques to connect humans with computer-generated virtual environments. The myoelectric interface can be helpful for VR and AR because it does not restrict human movement. We invented a dexterous myoelectric interface that controls a 7-DOF robot manipulator using the orientation of the forearm, muscle forces, and dynamic hand gestures obtained from IMU and EMG signals. The orientation, angular velocities, and linear accelerations of a human operator's forearm in 3D space can be obtained from an IMU sensor.
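
Many IMUs report orientation as a unit quaternion, which must be converted to roll/pitch/yaw angles before it can drive a manipulator command. The function below is a minimal, sensor-agnostic sketch of that conversion (the quaternion component order and axis convention are assumptions, not specifics of our device):

```python
import math

def quaternion_to_euler(w, x, y, z):
    """Convert a unit quaternion (w, x, y, z) to roll/pitch/yaw in radians.

    Assumes an intrinsic x-y-z (roll-pitch-yaw) convention; a real IMU's
    datasheet should be checked for its component order and axis frame.
    """
    # Roll: rotation about the x-axis
    roll = math.atan2(2.0 * (w * x + y * z), 1.0 - 2.0 * (x * x + y * y))
    # Pitch: rotation about the y-axis (clamp to avoid asin domain errors)
    sinp = max(-1.0, min(1.0, 2.0 * (w * y - z * x)))
    pitch = math.asin(sinp)
    # Yaw: rotation about the z-axis
    yaw = math.atan2(2.0 * (w * z + x * y), 1.0 - 2.0 * (y * y + z * z))
    return roll, pitch, yaw

# Identity quaternion corresponds to zero rotation
print(quaternion_to_euler(1.0, 0.0, 0.0, 0.0))  # (0.0, 0.0, 0.0)
```

Angular velocity and linear acceleration are usually read directly from the gyroscope and accelerometer channels, so only the orientation needs this kind of conversion.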

Muscle forces and dynamic hand gestures can be recognized from EMG signals. In this invention, dynamic hand gestures are classified to change manipulation modes; each dynamic gesture represents a different manipulation mode for the robot manipulator. Moreover, the estimated muscle forces, combined with the orientation information from the IMU sensor, are used to detect the activation of continuous commands.
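
The control logic described above can be sketched as a single cycle: a classified gesture switches the manipulation mode, and otherwise the forearm orientation is issued as a continuous command only while the estimated muscle force exceeds a threshold. The gesture names, mode labels, RMS force estimate, and threshold value below are all hypothetical placeholders, not the specific classifier or calibration used in the invention:

```python
import math

# Hypothetical mapping from classified dynamic gestures to manipulation modes
GESTURE_TO_MODE = {
    "wrist_flick": "translation",
    "fist_pump": "rotation",
    "finger_snap": "gripper",
}

def estimate_force(emg_window):
    """Estimate muscle force as the RMS of a window of EMG samples
    (a common proxy; the actual force estimator may differ)."""
    return math.sqrt(sum(s * s for s in emg_window) / len(emg_window))

def control_step(mode, gesture, emg_window, forearm_orientation,
                 force_threshold=0.2):
    """One control cycle.

    Returns (new_mode, command): a recognized gesture switches the mode
    and suppresses motion for that cycle; otherwise the IMU orientation
    is forwarded as a continuous command only while the estimated muscle
    force exceeds the (hypothetical) activation threshold.
    """
    if gesture is not None:
        return GESTURE_TO_MODE.get(gesture, mode), None  # mode switch only
    if estimate_force(emg_window) > force_threshold:
        return mode, forearm_orientation  # command active
    return mode, None  # muscles relaxed: no command sent
```

For instance, a "fist_pump" gesture would change the mode to "rotation" without moving the robot, while a relaxed arm (low EMG amplitude) keeps the robot stationary even as the forearm moves.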