Coming soon to a phone near you: A new wave of accessibility tools

When is it coming? Available as a beta feature in Google’s Lookout app on June 2.

Three years ago, Google launched an Android app called Lookout as a sort of catchall toolbox for its blind and low-vision users.

If you open the app and point the camera at a snippet of text, for instance, the app will begin to read those words out loud. “Explore mode,” meanwhile, tries its best to describe what’s in front of the camera — it likes to describe the water bottle that perpetually lives on my desk as “tableware with text ‘The Washington Post.’ ”

Google’s latest addition to this toolbox is a feature that can describe what’s going on in certain kinds of photos.

“Imagine reading a news story with a photo or receiving an image on social media that people are reacting to, but not knowing what’s in that image,” said Scott Adams, product manager for Lookout, in a video meant for app developers. “That’s where the potential of [machine learning] comes in.”

For now at least, you probably shouldn’t expect Lookout to correctly explain gritty, complicated scenes. Adams said that, beyond offering basic captions like “dogs playing in the grass,” the tool will be able to interpret text — say, on a sign — and identify people in images by their “upper body clothing and whether they’re an adult or youth.” Even better, all of that image recognition will happen solely on your phone or tablet — no more sending info to far-flung servers.
