Here is a framework that enables expressive motion sequences in humanoid robots, significantly improving their non-verbal communication. The approach relies on LLMs to generate socially appropriate gesture motion sequences, which the robot then executes via inverse kinematics.
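To make the two-stage pipeline concrete, here is a minimal sketch of how such a system might be wired together. This is not the paper's implementation: the canned keyframe table standing in for the LLM call, the (x, y) wrist-target format, and the 2-link planar arm are all illustrative assumptions.

```python
import math

def generate_gesture_keyframes(social_context: str) -> list[tuple[float, float]]:
    """Stand-in for the LLM stage: given a social context, return (x, y)
    wrist targets. In EMOTION this is an LLM conditioned on context and
    human demonstrations; here it is a fixed table purely for illustration."""
    canned = {
        "greeting": [(0.3, 0.4), (0.2, 0.5), (0.3, 0.4)],  # raise, lift, lower: a wave
        "thinking": [(0.1, 0.5), (0.15, 0.45)],            # hand drifts toward chin
    }
    return canned.get(social_context, [(0.4, 0.0)])        # default: arm at rest

def ik_2link(x: float, y: float, l1: float = 0.3, l2: float = 0.3) -> tuple[float, float]:
    """Closed-form inverse kinematics for a planar 2-link arm: returns
    (shoulder, elbow) angles in radians placing the wrist at (x, y)."""
    d2 = x * x + y * y
    cos_elbow = (d2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    cos_elbow = max(-1.0, min(1.0, cos_elbow))             # clamp unreachable targets
    elbow = math.acos(cos_elbow)
    shoulder = math.atan2(y, x) - math.atan2(l2 * math.sin(elbow),
                                             l1 + l2 * math.cos(elbow))
    return shoulder, elbow

# Pipeline: context -> keyframes (LLM stage) -> joint angles (IK stage)
for x, y in generate_gesture_keyframes("greeting"):
    shoulder, elbow = ik_2link(x, y)
    print(f"target=({x:.2f}, {y:.2f})  shoulder={shoulder:.2f} rad  elbow={elbow:.2f} rad")
```

The split mirrors the description above: the language model only has to reason about *what* the gesture should look like in task space, while a conventional IK solver handles *how* the joints achieve it.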
Apple’s AI researchers introduce EMOTION, a framework for expressive humanoid gestures.
⦿ LLMs interpret social context and generate motions from human demos
⦿ Robots execute motions via inverse kinematics
⦿ Gestures are refined through human feedback
Hardware: Fourier GR-1 pic.twitter.com/B5fxBttfpx
— The Humanoid Hub (@TheHumanoidHub) February 7, 2025
The framework was tested on the Fourier GR-1 humanoid robot. EMOTION can also incorporate human feedback to refine the generated gestures, and the researchers used it to generate and evaluate a range of expressive motions.
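One plausible way to fold human feedback into an LLM-driven generator is to append each critique to the prompt and regenerate. The sketch below assumes this; the `llm` callable and prompt wording are hypothetical, not the paper's actual prompts.

```python
def refine_with_feedback(llm, context: str, feedback_rounds: list[str]) -> str:
    """Hypothetical refinement loop: each round of human feedback is appended
    to the prompt so the next generation accounts for it. `llm` is any
    callable mapping a prompt string to a gesture description string."""
    prompt = f"Generate an expressive gesture sequence for: {context}"
    gesture = llm(prompt)
    for note in feedback_rounds:
        prompt += f"\nHuman feedback on the last attempt: {note}\nRevise the sequence."
        gesture = llm(prompt)
    return gesture

# Example with a dummy LLM that just reports the prompt it received:
dummy = lambda p: f"<gesture generated from {len(p)}-char prompt>"
print(refine_with_feedback(dummy, "greeting a visitor",
                           ["raise the arm higher", "slow the wave down"]))
```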
[HT]