Here is an artificial hand that merges user and robotic control to make life easier for amputees. EPFL scientists developed it using an algorithm that learns to decode the user's intention and translate it into finger movements of the prosthetic hand:
To train the machine-learning algorithm, the amputee performs a series of hand movements. Sensors placed on the amputee's stump detect muscular activity, and the algorithm learns which patterns of muscular activity correspond to which hand movements. Once the user's intended finger movements can be decoded, that information is used to control the individual fingers of the prosthetic hand.
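For readers curious what that training step might look like, here is a minimal sketch, not the EPFL team's actual code: it assumes the EMG signals from the stump have already been cut into fixed-length windows, each labelled with the movement the amputee was asked to perform, and fits a simple classifier on hand-crafted features.

```python
# Hedged sketch of decoding intended finger movements from EMG windows.
# The data layout (windows, labels) and feature choices are assumptions,
# not the method actually used by the EPFL researchers.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression


def emg_features(window: np.ndarray) -> np.ndarray:
    """Simple per-channel features for one EMG window (samples x channels)."""
    mav = np.mean(np.abs(window), axis=0)             # mean absolute value
    rms = np.sqrt(np.mean(window ** 2, axis=0))       # root mean square
    zero_crossings = np.sum(np.diff(np.sign(window), axis=0) != 0, axis=0)
    return np.concatenate([mav, rms, zero_crossings])


def train_decoder(windows, labels):
    """Fit a classifier mapping EMG features to movement labels
    (e.g. "index_flex", "pinch", "open")."""
    X = np.array([emg_features(w) for w in windows])
    decoder = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
    decoder.fit(X, labels)
    return decoder


def decode_intention(decoder, window):
    """Map a new EMG window to the user's intended finger movement."""
    return decoder.predict(emg_features(window).reshape(1, -1))[0]
```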
The robotic part of the control kicks in as soon as the user tries to grasp an object: when the object comes into contact with the hand's sensors, the algorithm tells the fingers to close around it. The video above shows the robotic hand in action.
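A rough sketch of that shared-control idea follows, again only an illustration under stated assumptions: the sensor and actuator interfaces (`read_emg_window`, `read_contact_sensors`, `set_finger_positions`) are hypothetical placeholders, and `decode_intention` is the helper from the training sketch above.

```python
# Shared control: decoded user intention drives the fingers, but once the
# hand's contact sensors detect an object, a simple grasp routine takes over
# and closes the fingers on it. Thresholds and interfaces are assumptions.

CONTACT_THRESHOLD = 0.2   # assumed normalized contact-sensor reading
CLOSE_STEP = 0.05         # how much further to close per control cycle


def intention_to_positions(intention):
    """Toy mapping from a decoded movement class to finger positions (0=open, 1=closed)."""
    presets = {
        "open": [0.0] * 5,
        "pinch": [0.8, 0.8, 0.0, 0.0, 0.0],   # thumb + index
        "power_grasp": [1.0] * 5,
    }
    return presets.get(intention, [0.0] * 5)


def control_step(decoder, read_emg_window, read_contact_sensors,
                 set_finger_positions, current_positions):
    contacts = read_contact_sensors()            # one value per fingertip, 0..1
    if max(contacts) > CONTACT_THRESHOLD:
        # Robotic takeover: an object touched the hand, so close all fingers
        # a little further each cycle to secure the grasp.
        new_positions = [min(1.0, p + CLOSE_STEP) for p in current_positions]
    else:
        # User control: decode the intended movement from the latest EMG
        # window and map it to target finger positions.
        intention = decode_intention(decoder, read_emg_window())
        new_positions = intention_to_positions(intention)
    set_finger_positions(new_positions)
    return new_positions
```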
[HT]