Visually impaired people may soon be able to use AI-powered robotic service dogs to navigate the world around them. Researchers from the University of Glasgow, along with industry and charity partners, have unveiled RoboGuide.
The robot uses artificial intelligence not only to help its users move independently, but also to tell them about what’s going on around them. RoboGuide employs a variety of cutting-edge technologies mounted onto an off-the-shelf robot body, according to the University of Glasgow.
RoboGuide relies on computer vision to navigate
RoboGuide isn’t the first robotic assistant developed for visually impaired people. Another example is Glide from MassRobotics resident startup and Pitchfire winner Glidance Inc.
The University of Glasgow researchers said RoboGuide addresses two significant challenges facing such robots. The first is that the technology these robots use to navigate their surroundings can limit their usefulness as guides.
“Robots which use GPS to navigate, for example, can perform well outdoors, but often struggle in indoor settings, where signal coverage can weaken,” said Dr. Olaoluwa Popoola, the project’s principal investigator at the University of Glasgow’s James Watt School of Engineering. “Others, which use cameras to see, are limited by line of sight, which makes it harder for them to safely guide people around objects or around bends.”
RoboGuide is equipped with a series of sophisticated sensors to accurately map its environment.
“We use computer vision and 3D technology, where it scans the whole environment and it understands where each object, each pillar, each obstacle is,” said Dr. Wasim Ahmad, project co-investigator.
The team has developed software that assesses the mapping data and uses simultaneous localization and mapping (SLAM) algorithms to determine optimal routes from one location to another. The software interprets sensor data in real time, enabling the robot to track and avoid moving obstacles.
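The article doesn’t describe the team’s planning software in detail, but the idea of finding an optimal route through a mapped environment can be illustrated with a breadth-first search over an occupancy grid. This is a minimal sketch, assuming a toy hand-written map; a real SLAM pipeline would build the grid from the robot’s 3D sensor scans and update it continuously as obstacles move.

```python
from collections import deque

def plan_route(grid, start, goal):
    """Breadth-first search for a shortest collision-free route on an
    occupancy grid (0 = free cell, 1 = obstacle). Returns the route as
    a list of (row, col) cells, or None if the goal is unreachable."""
    rows, cols = len(grid), len(grid[0])
    queue = deque([start])
    came_from = {start: None}  # predecessor of each visited cell
    while queue:
        cell = queue.popleft()
        if cell == goal:
            # Reconstruct the route by walking back through predecessors.
            path = []
            while cell is not None:
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 0 and (nr, nc) not in came_from):
                came_from[(nr, nc)] = cell
                queue.append((nr, nc))
    return None

# Toy floor plan: a wall across the middle with a single gap.
floor = [
    [0, 0, 0, 0],
    [1, 1, 0, 1],
    [0, 0, 0, 0],
]
route = plan_route(floor, start=(0, 0), goal=(2, 0))
```

Because breadth-first search expands cells in order of distance, the first time it reaches the goal it has found a shortest route; production planners typically use A* or similar heuristics for speed, but the principle is the same.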
University of Glasgow adds interactivity
In addition to improved navigation, RoboGuide includes the ability to talk to its user. The robot uses a large language model (LLM) — similar to the technology that powers ChatGPT — to understand and respond to questions from people around it.
LLMs are deep learning algorithms that process natural language inputs, such as spoken questions, and predict responses based on those inputs. RoboGuide’s AI is trained on massive language data sets so it can predict and generate context-appropriate responses to questions and commands.
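The prediction idea behind an LLM can be boiled down to: given preceding context, pick the most probable continuation learned from training text. The bigram counter below is only a conceptual sketch of that idea, with made-up training text; real LLMs use transformer networks trained on billions of tokens.

```python
from collections import Counter, defaultdict

# Made-up training text for illustration only.
training_text = (
    "the exhibit is ahead . the exhibit is closed . "
    "the stairs are ahead . the door is ahead ."
)

# Count how often each word follows each other word (a bigram model).
counts = defaultdict(Counter)
words = training_text.split()
for prev, nxt in zip(words, words[1:]):
    counts[prev][nxt] += 1

def predict_next(word):
    """Return the word most frequently observed after `word`."""
    return counts[word].most_common(1)[0][0]
```

Here `predict_next("the")` returns `"exhibit"` simply because "exhibit" follows "the" more often than any other word in the training text; an LLM makes the same kind of frequency-driven prediction, but conditioned on far longer contexts and far richer statistics.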
RoboGuide recently reached a significant milestone in its development. In December 2023, volunteers with visual impairments tested it for the first time.
The quadruped robot guided them through the Hunterian Museum in Glasgow. RoboGuide helped the volunteers tour exhibits on the first floor and provided interactive spoken information on six exhibits.
The results were promising. “One hundred percent I would use this in the future,” said volunteer Kyle Somerville (see video below). “As well, there are a lot of people I know that would definitely either want to try this or would definitely use it.”
More refinements to come
The research team plans to use data from this demonstration to further refine the RoboGuide platform, with the aim of bringing a more complete version to market.
“Ultimately, our aim is to develop a complete system which can be adapted for use with robots of all shapes and sizes to help blind and partially sighted people in a wide range of indoor situations,” said Ahmad. “We hope that we can create a robust commercial product which can support the visually impaired wherever they might want extra help.”
The World Health Organization has estimated that 2.2 billion people worldwide live with impaired vision. Assistive devices such as RoboGuide could make it significantly easier for their users to live more independently and enhance their quality of life.
About the author
Matt Greenwood is a writer with more than 20 years of experience in public-sector communications.