Robots are going to do many things for us in the future, including helping people get dressed. Zackory Erickson, Henry M. Clever, and their colleagues have developed deep haptic model predictive control for robot-assisted dressing. Their controller makes predictions using haptic and kinematic observations from the robot’s end effector.
Must-see robots & drones:
Deep Haptic Model Predictive Control for Robot-Assisted Dressing
The robot predicts the forces a garment will apply to a person’s body. As the researchers explain:
We evaluated our approach with a PR2 robot that attempted to pull a hospital gown onto the arms of 10 human participants. With a 0.2s prediction horizon, our controller succeeded at high rates and lowered applied force while navigating the garment around a person’s fist and elbow without getting caught.
[Source]
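To make the control idea a bit more concrete, here is a minimal sketch of a model predictive control loop of this kind, assuming a learned model that predicts garment forces from haptic and kinematic observations over a short horizon. The function names (`predict_force`, `choose_action`), the toy predictor, and the candidate-action scheme are hypothetical illustrations, not the authors’ code.

```python
import numpy as np

# Hypothetical stand-in for the learned model: given recent haptic/kinematic
# observations and a candidate end-effector motion, predict the force the
# garment would apply to the person over a short horizon (~0.2 s).
def predict_force(observation: np.ndarray, action: np.ndarray) -> float:
    # Toy quadratic placeholder; the paper trains a neural network for this.
    return float(np.sum((observation[:3] + 0.2 * action) ** 2))

def choose_action(observation: np.ndarray,
                  candidate_actions: np.ndarray,
                  force_limit: float = 10.0) -> np.ndarray:
    """One MPC step: evaluate each candidate end-effector motion, discard
    those whose predicted force exceeds the limit, and pick the one that
    makes the most progress along the (assumed) dressing direction."""
    best_action, best_progress = candidate_actions[0], -np.inf
    for action in candidate_actions:
        if predict_force(observation, action) > force_limit:
            continue  # would press the garment too hard against the person
        progress = action[0]  # assumed: x-axis is the dressing direction
        if progress > best_progress:
            best_action, best_progress = action, progress
    return best_action

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    obs = rng.normal(size=6)                        # e.g. end-effector forces + velocities
    candidates = rng.uniform(-1, 1, size=(25, 3))   # candidate displacements
    print("chosen action:", choose_action(obs, candidates))
```

In this sketch the short prediction horizon plays the same role as the 0.2s horizon in the paper: the controller only has to trust the force model for a brief window before it re-plans from fresh sensor readings.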