Many of Google’s recent projects have focused on improving the lives of people with disabilities. The company has now revealed a new app it’s working on called “Lookout,” which aims to help those with visual impairments.
Detailed on its blog earlier this week, Google says Lookout is designed to assist users who are either visually impaired or blind to be more independent. It does this by giving users audio cues for their surroundings, including people, objects, and even written text. Google explains:
After opening the app, and selecting a mode, Lookout processes items of importance in your environment and shares information it believes to be relevant—text from a recipe book, or the location of a bathroom, an exit sign, a chair or a person nearby. Lookout delivers spoken notifications, designed to be used with minimal interaction allowing people to stay engaged with their activity.
Lookout also has a few built-in modes for different scenarios: one for use around the home, one for being out at work or play, and a “scan” mode that reads aloud any text the app finds.
When you select a specific mode, Lookout delivers information relevant to that activity. If you’re getting ready to do your daily chores, you’d select “Home,” and you’ll hear notifications telling you where the couch, table, or dishwasher is. The app describes where those objects are in relation to you using clock-face directions; for example, “couch 3 o’clock” means the couch is on your right. If you select “Work & Play” when heading into the office, it may tell you when you’re next to an elevator or stairwell. As more people use the app, Lookout will use machine learning to learn what people are interested in hearing about, and will surface those results more often.
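Google doesn’t detail how Lookout generates these phrases, but the clock-face convention maps naturally onto a relative bearing around the user. Here’s a minimal sketch in Python (the function name and the 0°-straight-ahead, clockwise convention are assumptions, not Google’s implementation):

```python
def bearing_to_clock(degrees: float) -> str:
    """Convert a relative bearing (0 = straight ahead, increasing clockwise)
    into a clock-face phrase like the ones Lookout speaks.

    Each clock hour spans 30 degrees; 90 degrees (directly right) -> "3 o'clock".
    """
    hour = round((degrees % 360) / 30) % 12
    return f"{12 if hour == 0 else hour} o'clock"


# An object directly to the user's right:
print(f"couch {bearing_to_clock(90)}")  # -> "couch 3 o'clock"
```

Dividing the circle into twelve 30° sectors and rounding to the nearest one keeps the spoken cue short, which fits Lookout’s stated goal of minimal interaction.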
Google plans to make Lookout available to users later this year, but there’s currently no firm release date. The company shows the app in action in the video below.