Today's Google presentation was not just about new phones, watches, and earbuds. Google also unveiled several AI-powered accessibility features that should help people use their Pixel devices regardless of their physical abilities or limitations.
The first one is Guided Frame, a feature designed to help blind users take great selfies. Guided Frame gives spoken suggestions on how to improve your shot, such as repositioning your face, choosing a better angle, or moving to a better-lit spot. With the latest update, Guided Frame gains improved object recognition, face filtering in group photos, and better focus on subjects. In addition, you can now launch Guided Frame from the Camera app without enabling Android's native TalkBack screen reader.
The next is Magnifier, a Pixel-exclusive app (available on the Pixel 5 and newer) that uses the camera to zoom in on objects around you. It can now search for specific words in your surroundings (for example, when reading a restaurant menu), select the best lens for particular scenarios, show results in a picture-in-picture view, and work as an illuminated mirror for selfies.
Google's foldable phones, including the latest Pixel 9 Pro Fold, now have a dedicated Live Transcribe mode that lets owners set their foldable in a tabletop posture for better visibility, making it easier for everyone to take part in the conversation while seeing its transcription.
Finally, the Live Caption and Live Transcribe features are now available in more languages. The latest additions include Korean, Polish, Portuguese, Russian, Chinese, Turkish, and Vietnamese for Live Caption, plus 15 languages for Live Transcribe's offline mode (over 120 languages in online mode), which comes in handy in areas with poor network coverage, such as subways and airplanes.
You can check out more details about the latest AI-powered accessibility improvements on Pixel devices in a blog post on Google's official website.