After Apple and Google introduced more accessibility features for Global Accessibility Awareness Day, Meta is also helping visually impaired users through the Ray-Ban Meta glasses, a collaboration with Ray-Ban, making it easier for them to "know" their surroundings.
Similar functions have in fact been offered before, such as Microsoft's Seeing AI development project or Google's Lookout app service. However, Meta's feature only requires the glasses' built-in camera to capture the scene in front of the user, which artificial intelligence then describes aloud. For example, it can report that the user is in a park with a path and grass ahead, or clearly describe the items the user is looking at in a kitchen.
This feature is mainly intended to assist the visually impaired, but general users can also use it to easily identify the scenes in front of them. It is expected to roll out in the United States and Canada in the coming weeks, and to expand to more regions later.
In addition, Meta announced a partnership with Be My Eyes to add a new "Call a Volunteer" feature, which assists visually impaired people who need help by matching them with volunteers through an online service, allowing the volunteers to help confirm details of the objects and scenes in front of them.
