At its annual Google I/O conference, Google announced a range of new AI-powered experiences for Android devices in an attempt to "reimagine how you can interact with your phone." The updated Circle to Search will be able to help explain homework problems, Gemini will get better at understanding on-screen context, and more.
Starting today, Circle to Search, a feature introduced with the Galaxy S24 series, can help students with their homework by offering step-by-step instructions for solving physics and math problems. Future updates will add support for more complex problems involving symbolic formulas, diagrams, graphs, and more.
Upcoming Gemini updates will let you bring the assistant to the foreground over any app, drag content it generates into other applications, ask questions about what is on the screen, and more. Google says these capabilities will be available on "hundreds of millions of devices" over the next few months.
Speaking of Gemini, its local, on-device Nano model will soon go multimodal. That means Gemini Nano will be able to process not just text but also sounds, spoken language, images, and more. Multimodal capabilities in Gemini Nano will roll out to Pixel users later this year.
In addition, Gemini Nano will power TalkBack to give blind and low-vision users a clearer understanding of their surroundings. Because the model runs on-device, results appear much faster and do not require an active internet connection.
Finally, Gemini Nano will be better at detecting fraud, providing you with real-time alerts during a call. Google says the model can detect scam patterns while keeping all your conversations and related data solely on your device. The company will share more details about this feature later this year.
According to the official blog, customers will learn more about ecosystem updates and new features for Android 15 on the second day of Google I/O 2024.