
Robot guide dog assesses environment and speaks to users

A new AI-powered robot guide dog is able to not only physically guide blind and visually impaired people but also assess its surroundings and speak to its users as they move.

The RoboGuide prototype, developed at the University of Glasgow, combines sensing, navigation and speech technologies with an off-the-shelf Unitree four-legged walking robot.

RoboGuide uses an array of sensors located on the robot’s exterior to continuously scan and assess the surrounding environment.

Bespoke software designed by the researchers allows the system to work out optimal routes between points in its surroundings, while the sensor data is also analysed in real time to take moving obstacles into account.
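The article does not say which routing algorithm the researchers use. Purely as an illustration of the idea of planning a route and replanning when a sensor detects a new obstacle, here is a minimal A* planner on a 2D occupancy grid; the function names, grid and coordinates are all invented for this sketch.

```python
import heapq

def plan_route(grid, start, goal):
    """A* search on a 2D occupancy grid (0 = free, 1 = blocked).

    Returns a list of (row, col) cells from start to goal, or None
    if the goal is unreachable.
    """
    rows, cols = len(grid), len(grid[0])

    def h(cell):
        # Manhattan distance: an admissible heuristic on a 4-connected grid
        return abs(cell[0] - goal[0]) + abs(cell[1] - goal[1])

    best_cost = {start: 0}
    came_from = {}
    open_set = [(h(start), 0, start)]  # (estimated total, cost so far, cell)
    while open_set:
        _, cost, cell = heapq.heappop(open_set)
        if cost > best_cost.get(cell, float("inf")):
            continue  # stale queue entry
        if cell == goal:
            # Walk parents back from the goal to rebuild the path
            path = [goal]
            while path[-1] != start:
                path.append(came_from[path[-1]])
            return path[::-1]
        r, c = cell
        for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = nxt
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                ncost = cost + 1
                if ncost < best_cost.get(nxt, float("inf")):
                    best_cost[nxt] = ncost
                    came_from[nxt] = cell
                    heapq.heappush(open_set, (ncost + h(nxt), ncost, nxt))
    return None

# When a sensor sweep reports a new obstacle, a system like this can mark
# the affected cells as blocked and simply call plan_route again.
grid = [
    [0, 0, 0, 0],
    [0, 1, 1, 0],
    [0, 0, 0, 0],
]
path = plan_route(grid, (0, 0), (2, 3))
```

Replanning on every sensor update is the simplest way to handle moving obstacles; real robots often use incremental planners that reuse previous search work instead.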

Principal investigator Dr Olaoluwa Popoola said that one of the main drawbacks of existing robotic systems was that their navigation technology limited their usefulness for visually impaired users.

Systems that relied on GPS could be effective outdoors, for example, but might struggle indoors where the signal is weaker.

Other models that rely on cameras for 'sight' could be limited by line of sight, making them less effective at guiding people around obstacles or bends.

Robot can use AI to converse with its users

As well as cutting-edge sensor technology, the RoboGuide system uses AI large language model (LLM) technology to communicate.

LLMs have come to prominence recently through systems such as ChatGPT and are increasingly being used to provide natural-seeming conversation in a variety of settings and use cases.

In RoboGuide’s case, it enables the artificial guide dog to understand comments and questions from users, as well as provide information, instructions and warnings verbally.
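The article gives no details of how the speech pipeline is built. As a purely hypothetical sketch of the pattern it describes, sensor events and user questions wrapped in a prompt and answered verbally, here is a toy dialogue loop in Python; `query_llm` is a stub with invented canned replies standing in for a real language-model call.

```python
def query_llm(prompt: str) -> str:
    """Stub standing in for a real large language model call.

    The canned replies below are invented for this sketch; a real
    system would send the prompt to an actual LLM service.
    """
    canned = {
        "obstacle": "Warning: there is an obstacle just ahead. Please stop.",
        "exhibit": "We are now standing next to the first exhibit.",
    }
    for keyword, reply in canned.items():
        if keyword in prompt:
            return reply
    return "I'm sorry, could you repeat that?"

def respond(event: str) -> str:
    # Wrap the raw sensor event or user utterance in an instruction
    # prompt before querying the model, then return the spoken reply
    prompt = f"You are a robot guide dog. Respond briefly to: {event}"
    return query_llm(prompt)

warning = respond("obstacle detected by onboard sensors")
```

In a deployed system the reply would be passed to a text-to-speech engine, and the prompt would carry context such as the user's location and the robot's planned route.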

The technology has been developed with support from the Forth Valley Sensory Centre (FVSC) Trust and the Royal National Institute of Blind People (RNIB) Scotland.

It was tested for the first time with volunteers from the FVSC and RNIB at Scotland’s oldest museum, the Hunterian.

As well as successfully guiding the volunteers around the museum’s first floor, RoboGuide provided interactive spoken information on six of the exhibits.

Co-investigator Dr Wasim Ahmad said that the project’s aim now is to develop a system that can be adapted for use with a range of different robots to support the visually impaired wherever they need extra help.

