How researchers are combining depth camera and Internet of Things technologies with artificial intelligence to create a wearable that augments sensory awareness for people living with impaired vision.
Photoreceptors in Darryl Adams’ eyes are declining, weakening his peripheral vision and his ability to see in dim light or at night. He sees the world in out-of-focus fragments that he pieces together into a blurry mosaic.
But a coworker’s makeshift body sensor system that transforms sight into feeling is giving Adams hope.
“For years I have been thinking about how I could use technology to improve my vision or to potentially replace what I’m missing,” said Adams, a program manager who has worked at Intel for over 20 years.
Dubbed the Intel RealSense Spatial Awareness Wearable, the latest prototype is powered by an Intel Joule compute module, an Intel RealSense depth camera and several element14 tinyTILE boards featuring the Intel Curie module, combining data processing and haptic technologies. The prototype ‘sees’ objects ahead of and around the wearer, identifying their location: high, low, right or left.
When the wearer gets closer to an object, the system triggers thumb-sized vibrating sensors across the chest, torso and near the ankles. The closer the object, the more intense the vibration.
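The mapping described above — object position to a body zone, object distance to vibration strength — can be sketched in a few lines. This is a minimal illustrative sketch, not the actual Intel implementation: the zone names, the 3-metre range and the linear intensity scale are all assumptions.

```python
# Illustrative sketch of the depth-to-haptics mapping described above.
# Zone names, range and intensity curve are assumptions for clarity.

def locate_zone(x_norm: float, y_norm: float) -> str:
    """Map a normalized object position (0..1 on each axis,
    origin at top-left of the camera frame) to a body zone."""
    horizontal = "left" if x_norm < 0.5 else "right"
    vertical = "high" if y_norm < 0.5 else "low"
    return f"{vertical}-{horizontal}"

def vibration_intensity(distance_m: float, max_range_m: float = 3.0) -> float:
    """Return an intensity in [0, 1]: the closer the object,
    the stronger the vibration; beyond range, no feedback."""
    if distance_m >= max_range_m:
        return 0.0
    return 1.0 - (distance_m / max_range_m)

# Example: an object up and to the left, half a metre away,
# would buzz the upper-left sensors strongly.
zone = locate_zone(0.2, 0.3)       # "high-left"
level = vibration_intensity(0.5)   # roughly 0.83
```

In a real device the depth camera would supply the position and distance per frame, and each zone would drive one of the thumb-sized vibration motors.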
“The potential was immediately obvious,” said Adams, who has been helping improve the prototype since it was first publicly demonstrated at the 2015 International Consumer Electronics Show.
His field of vision is limited to less than 20 percent of a normal visual field, so he misses visual stimuli that are important for social interaction. He often misses handshakes or other subtle social cues, and people just seem to appear out of nowhere.
He said the wearable prototype increased his confidence about making sense of and appropriately responding to his immediate surroundings.
“It showed that I can rely on something other than my limited sight to understand what’s around me.”
Since the first prototype was built in 2015, Robert Cooksey, a designer and research scientist in Intel’s Maker and Innovator Group, and his teammates have refined the design to make it more portable. Worn on the body, the device provides ambient spatial awareness of the environment, augmenting peripheral vision, said Cooksey.
“Working with Darryl has been a gift,” said Cooksey. “He’s worn our prototypes more than anyone, and has really helped us understand how to make them work better.”
Putting Prototypes to the Test
While the technology can’t reverse blindness, Adams and Cooksey believe the world’s 285 million people with impaired vision, including 39 million who are blind, could benefit from it if product makers realise the potential.
In 2016, ophthalmology and visual science students at the University of Iowa, led by Dr. Stephen Russell, built a similar prototype based on the open-source design shared by Intel.
“Dr. Russell knows the technologies aimed at combating retinal degenerative diseases, and he saw Intel’s prototype as a simpler way to develop a device,” said Dylan Green, a senior undergraduate and research assistant working on the University of Iowa project.
Green’s team made several improvements to the Intel blueprint to create a more streamlined device that was easier to put on and operate. The team now has 10 prototypes – called LEO Belt, which stands for Low-vision Enhancement Optoelectronic Belt – that are being tested on people in a makeshift obstacle course.
“The main goal of our project is to bring more affordable alternatives to what’s out there today,” said Green. “We plan to go through all the medical hoops, show it’s a useful device and increase public awareness of it. We want to help speed the process of getting this technology out there.”
Green said the technology is evolving rapidly. His team already switched from using a Microsoft Surface tablet computer to an Intel Compute Stick, which is half the size of a deck of cards.
He said new compute modules and camera technology are getting smaller and more powerful, which will lead to better devices that enhance people’s lives.
Cambrian Explosion of Innovation
Cooksey credits colleague Rajiv Mongia, director of experience and outreach for Intel’s Maker and Innovators Group, for sparking the idea that led the Intel team to create the Spatial Awareness Wearable prototype. Mongia sees this innovation as an example of what’s driving the 4th industrial revolution.
“In the first three industrial revolutions, people were often challenged to get their hands on the technologies necessary to innovate. As a result, innovation was limited to those who had access to the financial or knowledge resources to pursue their dreams,” said Mongia.
“What’s driving the fourth industrial revolution is accessible technologies that make it easier for tens of millions of people to innovate.”
When powerful technologies are accessible to more people, more people can experiment, prototype and bring ideas to market. Mongia said this will create a wave of innovation similar to the Cambrian Explosion in biology, which gave rise to the variety of species we have today.
“Innovation in the next 10 years is going to be mind boggling.”
Using sensory substitution, as the Spatial Awareness Wearable prototype does, to help people make better sense of the visual world has tremendous potential, said Adams.