Amazon has added a new skill to its Alexa digital assistant called "Show and Tell", which lets camera-equipped Echo Show devices identify everyday items and kitchen ingredients through image recognition, helping visually impaired users determine what objects they are holding.
The feature is rolling out first in the United States and is available only on Echo Show devices fitted with a camera, since it depends on the camera to capture images for the system to analyze and identify.
To use it, a user simply holds an object up to the Echo Show's camera and asks Alexa what it is; the assistant identifies the object through image recognition and answers aloud.
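At its core, this kind of skill comes down to an image-classification step followed by a spoken response. As a rough illustration only, and not Amazon's actual implementation, the sketch below shows the general technique in Python using a pretrained torchvision classifier; the model choice, the saved camera frame "frame.jpg", and the identify helper are all assumptions made for this example.

```python
# A minimal sketch of the general technique behind a "Show and Tell"-style
# feature: classify a captured camera frame with a pretrained model, then
# report the result (a real device would speak it via text-to-speech).
# The model, label set, and file name are illustrative assumptions, not
# Amazon's actual pipeline.
import torch
from PIL import Image
from torchvision import models

# Pretrained ImageNet classifier; the weights object bundles its own
# preprocessing transforms and human-readable category names.
weights = models.ResNet50_Weights.DEFAULT
model = models.resnet50(weights=weights)
model.eval()
preprocess = weights.transforms()

def identify(image_path: str) -> str:
    """Return the most likely label for the object in the image."""
    img = Image.open(image_path).convert("RGB")
    batch = preprocess(img).unsqueeze(0)  # add a batch dimension
    with torch.no_grad():
        logits = model(batch)
    idx = logits.argmax(dim=1).item()
    return weights.meta["categories"][idx]

# A captured camera frame saved as "frame.jpg" (hypothetical path).
print(f"This looks like {identify('frame.jpg')}.")
```

The production system differs in obvious ways, such as running speech recognition on the user's question and synthesizing a spoken reply, but the capture-classify-respond loop above is the essential shape of the feature.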
According to Sarah Caplener, head of Alexa for Everyone, "Show and Tell" was designed based on feedback from blind, partially sighted, and other visually impaired users, in the hope that Alexa's image recognition can make it easier for them to identify the objects around them.
In the future, the feature may go beyond helping the visually impaired identify objects and also make it easier for users to learn how to use specific items: for example, scanning ingredients to suggest how to cook them, determining whether food has expired, or advising how ingredients should be stored.
In a similar vein, an internal Microsoft team has previously built smart glasses that help blind users understand the scene ahead: a camera on the glasses captures images, which are processed over a connected device, and the scene is then described to the wearer through audio prompts, giving them a sense of what is in front of them as if they could see it.