Google wants to turn an ordinary Android phone into an artificial intelligence-powered helper for blind people
Lookout is a new app that uses image recognition and artificial intelligence to describe a scene through a phone’s camera. Google (GOOG) announced the app at its annual I/O developer conference this week.
Google is still testing the app and said it will be released later this year, starting on Pixel devices.
If you walk into a building lobby with an Android phone held in your hand or worn around the neck on a lanyard, camera pointed outward, Lookout will detect people and objects like elevator doors, and read nearby text aloud. The app doesn’t need an internet connection.
Google says Lookout is designed to keep users aware of what’s going on around them without overwhelming them with information. It only labels the most important items. To stop the narration, a person can place a hand over the camera or give the phone a tap to pause the app.
The company says the app uses technology similar to Google Lens, its visual search tool, combining image recognition with several different machine learning models.
“What blind people need is access to information,” said Erin Lauridsen, director of access technology at LightHouse in San Francisco. “A lot of times the way these things are marketed and funded is all about making blind people ‘see,’ but it really is about taking visual information and translating it into non-visual information.”
Google is not the first company to use artificial intelligence and smartphones to help blind people. Microsoft has a similar app for iOS called Seeing AI.
Seeing AI can read printed or handwritten text, describe colors, and recognize denominations of currency. Unlike Google Lookout, Seeing AI has a facial recognition component. When you point the phone at a person, Seeing AI describes them and says how far away they are. If you’ve trained the app to recognize faces, it will tell you the person’s name. (Google said it may add facial recognition to Lookout in the future.)
App-based technology for blind people isn’t limited to big companies. The VocalEyes AI app, which can identify a person’s emotion or estimate their age from their face, was created by an 18-year-old high school student at an MIT summer program.
There are also older-style apps that still rely on humans instead of AI. For example, Aira gives members smart glasses with a built-in camera so trained agents can describe what they see through the camera. Meanwhile, Be My Eyes relies on volunteers to check out what’s around you through a live video chat.
“I think that AI is trendy overall, and it makes sense to see how it can help people who are blind, but I think there are a lot of unanswered questions,” said Lauridsen.