Amazon is introducing a new feature to its Echo Show device designed to help blind and low-vision customers identify common household pantry items by holding them up in front of the device's camera and asking what they are. The feature uses a combination of computer vision and machine learning to recognize the objects the Echo Show sees.
The Echo Show is the version of the Alexa-powered smart speaker that tends to sit in customers' kitchens, since it helps with kitchen tasks like setting timers, watching recipe videos, or enjoying a little music or TV while they cook.
But for blind users, the Show will now have a new duty: helping them better identify those household pantry items that are hard to distinguish by touch, like cans, boxed foods, or spices.
To use the feature, customers can just say things like “Alexa, what am I holding?” or “Alexa, what’s in my hand?” Alexa will also give verbal and audio cues to help the customers place the item in front of the device’s camera.
Amazon says the feature was developed in collaboration with blind Amazon employees, including its principal accessibility engineer Josh Miele, who gathered feedback from both blind and low-vision customers as part of the development process. The company also worked with the Vista Center for the Blind in Santa Cruz on early research, product development, and testing.
“We heard that product identification can be a challenge and something customers wanted Alexa’s help with,” explained Sarah...