When the Echo Show launched, it opened the door for new Alexa abilities. Amazon is rolling out one such feature today, announcing new functionality called Alexa Show and Tell. With it, blind and low-vision users who own an Echo Show can ask Alexa to identify the items they’re holding in their hands.
Amazon worked with blind employees and with Vista Center for the Blind in Santa Cruz to develop this feature. While some items are obvious by touch alone, that isn’t always the case for packaged goods like canned food and boxed pantry items. The goal of Alexa Show and Tell is to help fix that by reading out the identities of those items to users.
“The whole idea for Show and Tell came about from feedback from blind and low vision customers,” said Alexa for Everyone head Sara Caplener in today’s announcement. “We heard that product identification can be a challenge and something customers wanted Alexa’s help with. Whether a customer is sorting through a bag of groceries, or trying to determine what item was left out on the counter, we want to make those moments simpler by helping identify these items and giving customers the information they need in that moment.”
Using Alexa Show and Tell is pretty straightforward. Users just need to stand in front of the Echo Show and ask “Alexa, what am I holding?” or “Alexa, what’s in my hand?” Alexa will give verbal and audio cues to help users position the item so that the Echo Show’s camera can see it, and once the device’s camera has a clear view, she’ll read out the product’s name.
Alexa Show and Tell is going live today, though for now it only seems to cover packaged pantry items, and it’s only available on first- and second-generation Echo Shows. It sounds like Amazon has plans to bring this functionality to more devices in the future, so we’ll keep an eye out for more.