Google is bringing a bundle of new features to Gemini Live, its AI assistant that you can have real-time conversations with. Next week, Gemini Live will be able to highlight things directly on your screen while sharing your camera, making it easier for the AI assistant to point out a specific item.
If you’re trying to find the right tool for a project, for example, you can point your smartphone’s camera at a collection of tools, and Gemini Live will highlight the correct one on your screen. This feature will be available on the newly announced Pixel 10 devices when they launch on August 28th. Google will begin rolling out visual guidance to other Android devices at the same time before expanding to iOS “in the coming weeks.”
Google is also launching new integrations that will soon allow Gemini Live to interact with more apps, including Messages, Phone, and Clock. Say you’re in the middle of a conversation with Gemini about directions to your destination, but you realize you’re running late. Google says you’ll be able to interrupt the chatbot with something like: “This route looks good. Now, send a message to Alex that I’m running about 10 minutes late.” From there, Gemini can draft the text to your friend for you.
Lastly, Google is launching an updated audio model for Gemini Live that the company says will “dramatically improve” how the chatbot “uses the key elements of human speech, like intonation, rhythm and pitch.” Soon, Gemini will change its tone based on what you’re speaking about, such as using a calmer voice if you’re asking about a stressful topic.
You’ll also be able to change how fast or slow Gemini talks, which sounds a bit similar to how users can now tweak the style of ChatGPT’s voice mode. And if you ask Gemini for a dramatic retelling of a story from the perspective of a particular character or historical figure, the chatbot may adopt an accent for a “rich, engaging narrative.”