Product: Lookout
Top Line: Google announced on its blog that it has developed Lookout, an AI-powered navigation app designed to provide more independence to people who are blind or visually impaired.
Close Up: Now available to people with Pixel devices in the U.S. (in English only), Lookout helps those who are blind or have low vision identify information about their surroundings. It draws on underlying technology similar to Google Lens, which lets users search and take action on the objects around them simply by pointing their phone. Since announcing Lookout at Google I/O last year, Google says it has been testing the app and improving the quality of its results.
Google says it designed Lookout to work in situations where people might typically have to ask for help, such as learning about a new space for the first time, reading text and documents, and completing daily routines like cooking, cleaning and shopping. While the device is held or worn (Google recommends hanging the Pixel phone from a lanyard around your neck or placing it in a shirt front pocket), Lookout tells you about people, text, objects and much more as you move through a space. Once you've opened the Lookout app, all you have to do is keep your phone pointed forward. You won't have to tap any further buttons within the app, so you can focus on what you're doing in the moment.
Google notes that, as with any new technology, Lookout will not always be perfect. The app detects items in the scene, takes a best guess at what they are, and reports that guess to you. Google asks users to send feedback to its disability support team at g.co/disabilitysupport so it can continue to improve the app.
Vital Stats: People with a Pixel device in the U.S. can download Lookout on Google Play. To learn more about how Lookout works, visit Google’s Help Center.