
Grasp Decoding with a Wearable Muscle-machine Interface

A project log for Wearable, Soft Robotic Exoskeleton Gloves

The New Dexterity / Open Bionics wearable, affordable, soft exogloves are bionic devices for rehabilitation and human augmentation.

Lucas Gerez 10/04/2020 at 18:22

The wearable sleeve muscle-machine interface developed for the control of the proposed glove system can be employed to decode various gestures/grasps: pinch, power, tripod, key, and spherical grasps, as well as a rest state. With these five grasp types it is possible to perform most Activities of Daily Living (ADLs). Fig. 1 shows the five gestures. An example of the acquired signal for the power and pinch grasps can be seen in Fig. 2.

The five grasps are: a power grasp (a), a key grasp (b), a pinch grasp (c), a spherical grasp (d), and a tripod grasp (e). All the objects used are contained in the Yale-CMU-Berkeley grasping object set.

EMG and FMG values during the rest phase, power grasp and pinch grasp.

For user intention classification, machine learning based models can be developed that use the EMG and FMG data from eight different muscle sites, collected using the wearable sensorized sleeve. At a particular instance in time, the input data vector for training the learning model can be represented as:

xt = [xt1, xt2, xt3, xt4, xt5, xt6, xt7, xt8]

where xt1, xt2, and xt3 are the values of the EMG sensors E1, E2, and E3 at time instance 't', while xt4, xt5, xt6, xt7, and xt8 are the values of the FMG sensors F1, F2, F3, F4, and F5 at time 't'. The desired output of the learned model at time 't' can be represented as:

Ht ∈ {SP, PO, PI, TP, K, R}
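As a minimal sketch of how such an input vector can be assembled (the helper name and sensor ordering are assumptions for illustration, not taken from the project code):

```python
import numpy as np

def build_feature_vector(emg, fmg):
    """Concatenate 3 EMG readings (E1-E3) and 5 FMG readings (F1-F5)
    into the 8-element input vector x_t described above.

    Hypothetical helper: the real acquisition code in the project
    repository may structure this differently."""
    emg = np.asarray(emg, dtype=float)
    fmg = np.asarray(fmg, dtype=float)
    assert emg.shape == (3,), "expected 3 EMG channels"
    assert fmg.shape == (5,), "expected 5 FMG channels"
    # x_t = [xt1, xt2, xt3, xt4, xt5, xt6, xt7, xt8]
    return np.concatenate([emg, fmg])
```

A stream of such vectors, labelled with the grasp being performed, forms the training set for the classifier.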

where Ht = SP corresponds to the spherical grasp, Ht = PO to the power grasp, Ht = PI to the pinch grasp, Ht = TP to the tripod grasp, Ht = K to the key grasp, and Ht = R to the rest state of the hand at time 't'. For each intended grasp motion Ht, a predefined motor state vector Mt ∈ ℝ4 encodes the motor commands for that grasp strategy. For each grasp type, the corresponding Mt is triggered so that the proposed glove executes the matching grasping motion. For a robust classification outcome, we use the Majority Vote Criterion (MVC): a sliding window of size W = 10 is applied to the data while performing predictions, and the MVC classifies all the samples in the window as the class that received the maximum number of votes in that window.
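The sliding-window majority vote described above can be sketched as follows (an illustrative implementation, not the project's own code; the class labels match the grasp codes defined in the text):

```python
from collections import Counter, deque

class MajorityVoteFilter:
    """Majority Vote Criterion (MVC) over a sliding window.

    Each raw per-sample prediction is pushed into a window of the
    last W samples, and the reported class is the one with the most
    votes in that window. W = 10 as stated in the text."""

    def __init__(self, window_size=10):
        self.window = deque(maxlen=window_size)

    def update(self, predicted_class):
        """Add one raw prediction and return the majority class."""
        self.window.append(predicted_class)
        # most_common(1) yields the (class, count) pair with the
        # maximum number of votes in the current window.
        return Counter(self.window).most_common(1)[0][0]
```

For example, a single spurious 'PI' prediction in a run of 'PO' predictions is voted down, so the glove is not triggered into the wrong grasp by one noisy sample.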

The code for the wearable muscle-machine interface (trainer, predictions, and operation) is available at the GitHub repository of the Hybrid Exoskeleton Glove project:

https://github.com/newdexterity/Hybrid-Exoskeleton-Glove/tree/master/Code/Muscle-Machine-Interface
