Samsung’s virtual assistant Bixby also offers a Google Lens-like feature called Bixby Vision. The feature lets you easily translate words, look up products online, identify landmarks, and much more using just your smartphone’s camera. While the feature is already quite useful in its current form, Samsung is now adding three more accessibility features to make it even better: Quick Reader, Scene Describer, and Color Detector.
The new accessibility features were announced on Global Accessibility Awareness Day and are designed to help visually impaired users learn more about their surroundings. As its name suggests, the Quick Reader feature can read out written text (such as labels or signs) in real time. It can recognize over 1,000 common objects and items, including food, vegetables, and cleaning products, and it supports 57 languages.
The Scene Describer feature uses machine learning to provide descriptions of images, including captured scenes and downloaded pictures. It can be used to identify obstacles while navigating one’s surroundings or to describe any image captured with the phone’s camera.
And finally, the Color Detector feature is designed to help those with visual impairments identify the colors of items in the frame. This can come in handy when distinguishing between different colors of clothing, for example. The feature can identify and differentiate between 33 colors.
All three accessibility features have already rolled out to users and are available on Bixby Vision 3.5 and above. However, while Quick Reader and Color Detector are available in all regions, Scene Describer is currently limited to a select few regions.
Source: Samsung Newsroom