In the past few years, we have covered several ways to use Myo armbands to interact with robots and drones. This project uses gestures and speech to interact with a whole group of robots, with the gestures decoded onboard from two Myo armbands worn by the operator.
Wearable multi-modal interfaces for human multi-robot interaction
A group of robots can be selected with a single gesture: one robot is explicitly selected and in turn recruits the others near it. The robots navigate autonomously, but the operator can correct their course along the way. Want to cherry-pick individual robots from a group? No problem.
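The excerpt above does not spell out the recruitment rule, but a proximity-based version is easy to picture: the pointing gesture resolves to one robot, and every robot within some radius of it joins the group. The sketch below is a hypothetical illustration of that idea; the `Robot` class, `recruit_group` helper, and fixed recruitment radius are all assumptions, not the paper's actual implementation.

```python
from dataclasses import dataclass
import math


@dataclass
class Robot:
    name: str
    x: float
    y: float


def recruit_group(selected: Robot, robots: list[Robot], radius: float) -> list[Robot]:
    """Return the explicitly selected robot plus every robot within `radius` of it."""
    group = [selected]
    for r in robots:
        if r is selected:
            continue
        # Euclidean distance from the explicitly selected robot
        if math.hypot(r.x - selected.x, r.y - selected.y) <= radius:
            group.append(r)
    return group


# Example: the operator's pointing gesture resolves to robot "r1";
# robots close enough to it are recruited into the same group.
robots = [Robot("r1", 0.0, 0.0), Robot("r2", 1.0, 0.5), Robot("r3", 5.0, 5.0)]
group = recruit_group(robots[0], robots, radius=2.0)
print([r.name for r in group])  # ['r1', 'r2']
```

In the real system, group membership is decided from the decoded gestures rather than a hard-coded radius, but the one-robot-recruits-its-neighbours pattern is the core idea.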