Binghamton University Researchers Develop Talking Robot Guide Dogs

New AI-powered system provides verbal communication and enhanced situational awareness for visually impaired users.

Apr. 9, 2026 at 12:19am

[Illustration] An abstract illustration captures the integration of AI, robotics, and human-centered design in the development of an advanced robotic guide dog system. (Binghamton University Today)

Researchers at Binghamton University have created a talking robot guide dog system that uses large language models to plan optimal routes, guide users safely to their destinations, and provide real-time verbal feedback about the surroundings. The system was tested with legally blind participants in a multi-room office environment, and participants expressed enthusiasm about the potential for robotic guide dogs to become part of everyday life.

Why it matters

This innovation in robotic guide dogs goes beyond the limited communication abilities of biological guide dogs, offering visually impaired individuals greater situational awareness, control, and two-way dialogue during navigation. As AI and robotics continue advancing, this research demonstrates how these technologies can be leveraged to enhance accessibility and independence for people with disabilities.

The details

The robot guide dog system developed by Shiqi Zhang and his team at Binghamton University's School of Computing uses large language models like GPT-4 to enable verbal communication between the robot and the user. The system can provide information about potential routes before departure, as well as narrate the surroundings and obstacles during travel. This allows for more control and situational awareness compared to traditional guide dogs, which can only respond to a limited set of commands.

  • The research paper on this system was presented at the 40th Annual AAAI Conference on Artificial Intelligence in 2026.
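The article does not describe the system's internals, but the verbal-feedback idea it reports can be sketched in broad strokes: the robot's perception output (destination, distance, detected obstacles) is formatted into a prompt for a language model, whose reply would then be spoken to the user. The names below (`NavigationState`, `build_narration_prompt`) are illustrative assumptions, not part of the published system.

```python
# Hypothetical sketch of LLM-driven narration for a robot guide dog.
# A perception snapshot is turned into a prompt; in a real system the
# prompt would go to an LLM such as GPT-4 and the reply to text-to-speech.

from dataclasses import dataclass, field


@dataclass
class NavigationState:
    """Snapshot of what the robot currently perceives (illustrative)."""
    destination: str
    distance_remaining_m: float
    obstacles: list[str] = field(default_factory=list)


def build_narration_prompt(state: NavigationState) -> str:
    """Format the perception snapshot as a prompt for an LLM narrator."""
    obstacle_text = ", ".join(state.obstacles) if state.obstacles else "none"
    return (
        "You are a guide-dog robot speaking to a visually impaired user. "
        f"Destination: {state.destination}. "
        f"Distance remaining: {state.distance_remaining_m:.0f} meters. "
        f"Obstacles ahead: {obstacle_text}. "
        "Describe the situation in one short, calm sentence."
    )


state = NavigationState("conference room", 12.0, ["open door", "chair"])
print(build_narration_prompt(state))
```

Because the language model handles the phrasing, the same pipeline also supports the two-way dialogue the article describes: a user's spoken question can simply be appended to the prompt alongside the perception state.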

The players

Shiqi Zhang

An associate professor at Binghamton University's School of Computing who led the development of the robot guide dog system.

Binghamton University

A public research university in New York where the robot guide dog system was developed by the School of Computing.


What they’re saying

“For this work, we're demonstrating an aspect of the robotic guide dog that is more advanced than biological guide dogs. Real dogs can understand around 20 commands at best. But for robotic guide dogs, you can just put GPT-4 with voice commands. Then it has very strong language capabilities.”

— Shiqi Zhang, Associate Professor, Binghamton University School of Computing

“This is very important for visually impaired or blind people, because situational and scene awareness is relatively limited without vision.”

— Zhang

“They were super excited about the technology, about the robots. They asked many questions. They really see the potential for the technology and hope to see this working.”

— Zhang

What’s next

The team plans to conduct more user studies, increase the system's autonomy, and have the robots navigate longer distances, both indoors and outdoors, to further develop the technology and prepare it for real-world deployment.

The takeaway

This robot guide dog system shows how advances in AI and robotics can expand accessibility and independence for people with disabilities. By giving visually impaired users greater situational awareness, control, and two-way communication during navigation, the research marks a meaningful step toward integrating assistive robots into everyday life.