New Embodied AI System Teaches Complex Movements Via Muscles

Researchers develop AI-powered muscle stimulation to guide users through novel tasks and skills.

Apr. 10, 2026 at 6:54pm

Embodied AI technology uses muscle stimulation to physically guide users through complex tasks, blending human and machine intelligence. (Chicago Today)

Researchers at the University of Chicago have developed a new 'embodied AI' system that uses electrical muscle stimulation (EMS) and multimodal AI to physically guide users through complex tasks and skills. The system adapts to the user's environment and intent, providing dynamic, context-aware muscle cues to help people learn new procedures, from operating unfamiliar appliances to performing physical therapy exercises.

Why it matters

This innovative approach to human-computer interaction has the potential to revolutionize fields such as healthcare, industrial training, and accessibility, as well as everyday life, by enhancing learning and performance through direct bodily engagement. It represents a shift from simply providing information to actively guiding the physical experience.

The details

The key innovation of the embodied AI system is its ability to transmit 'procedural knowledge' – the intuitive understanding of how to perform a task – directly to the user's muscles. By combining computer vision and large language models, the system can generate tailored muscle stimulation instructions based on the user's environment and intent, rather than relying on pre-programmed routines. In user studies, participants were able to successfully complete tasks like opening child-proof bottles and operating unfamiliar cameras with the assistance of the dynamically generated muscle cues.
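The article does not include the team's code, but conceptually the loop is: capture the scene, ask a multimodal model for a step-by-step plan matched to the user's stated intent, then translate each step into timed EMS pulses on specific muscle channels. The Python sketch below illustrates that flow only in outline; the channel names, pulse parameters, and the `plan_from_model` stub are hypothetical stand-ins, not the researchers' actual implementation.

```python
from dataclasses import dataclass

# Hypothetical EMS channel map: which electrode pair drives which movement.
CHANNELS = {
    "wrist_extensor": 0,   # lifts the hand upward
    "finger_flexor": 1,    # closes the grip
    "thumb_abductor": 2,   # pushes the thumb outward (e.g., child-proof cap)
}

@dataclass
class EMSCue:
    channel: int          # electrode pair index
    intensity_ma: float   # stimulation current in milliamps (assumed calibrated range)
    duration_s: float     # how long the pulse train lasts

def plan_from_model(image_bytes: bytes, intent: str) -> list[dict]:
    """Stand-in for a vision-language-model call that returns a step plan.

    A real system would send the camera frame and the user's stated intent
    to a multimodal model and parse its structured reply. Here we return a
    fixed plan so the sketch runs without any external services.
    """
    return [
        {"action": "thumb_abductor", "strength": 0.6, "seconds": 1.5},
        {"action": "finger_flexor", "strength": 0.8, "seconds": 2.0},
    ]

def cues_from_plan(plan: list[dict], max_ma: float = 8.0) -> list[EMSCue]:
    """Map abstract plan steps onto concrete, clamped stimulation cues."""
    cues = []
    for step in plan:
        channel = CHANNELS[step["action"]]
        intensity = min(step["strength"], 1.0) * max_ma  # never exceed the calibrated max
        cues.append(EMSCue(channel, intensity, step["seconds"]))
    return cues

if __name__ == "__main__":
    plan = plan_from_model(image_bytes=b"", intent="open this child-proof bottle")
    for cue in cues_from_plan(plan):
        print(f"channel {cue.channel}: {cue.intensity_ma:.1f} mA for {cue.duration_s}s")
```

The point of the sketch is the separation of concerns the researchers describe: the model supplies the context-aware plan, while a fixed, safety-bounded layer decides how that plan becomes actual stimulation.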

  • The University of Chicago research team recently earned a Best Paper Award at the ACM CHI 2026 conference for their groundbreaking work.
  • The team's previous project, SplitBody, which focused on reducing mental workload during multitasking via muscle stimulation, received a Best Paper Award at ACM CHI 2024.

The players

Yun Ho

A PhD student in the Department of Computer Science at the University of Chicago and a lead researcher on the embodied AI project.

Romain Nith

A researcher at the University of Chicago and a co-author on the embodied AI paper.

Pedro Lopes

A professor in the Department of Computer Science at the University of Chicago and the principal investigator overseeing the embodied AI research.

University of Chicago

The institution where the embodied AI research is being conducted.


What they’re saying

“I am curious about how people understand and build relationships with devices that communicate with them through body movements (rather than audio/visual). In 'embodied AI', I got to explore this question in the realm of physical assistance. It was especially insightful to have participants 'think aloud' as they used our system and learn how they interpret machine-induced movements.”

— Yun Ho, PhD student, Department of Computer Science, University of Chicago

What’s next

The research team has open-sourced their code, encouraging further development and innovation within the community. As the field of embodied AI evolves, the researchers are also prioritizing ethical considerations, such as user control and safety, to ensure the technology empowers users rather than simply replaces traditional instruction.

The takeaway

This new embodied AI system represents a significant advancement in human-computer interaction, moving beyond simply providing information to actively guiding physical experiences. By directly engaging the user's muscles, the technology has the potential to revolutionize learning, rehabilitation, accessibility, and performance across a wide range of industries and everyday tasks.