
Gemini Live Can Guide You In The Real World Once You Let The AI See It

Google unveiled the new Pixel 10 phones, the Pixel Watch 4 models, and the affordable Pixel Buds 2A earbuds earlier this week, but Gemini AI was also a big star of the Made by Google event. Artificial intelligence sits at the center of all these products, and Google announced several new AI experiences to go with them. Magic Cue, Camera Coach, and conversational edits in Google Photos are the highlights, but Gemini Live also picked up a few upgrades, including the ability to see what you see and to point things out in the real world right on the Pixel's display.

The new Gemini Live feature will not be a surprise to any Pixel user who saw Google’s Project Astra demo at I/O 2025 a few months ago. At the time, Google shared concepts for Gemini Live, including the AI assistant’s ability to recognize objects in the real world and answer questions in real time while highlighting those objects on the screen.

Maybe you're trying to repair something by following an online tutorial and don't know which tool to pick from the box or table in front of you. Sharing your camera with Gemini Live and asking the AI to identify the tool the tutorial mentions could do the trick: the AI would watch the camera stream, identify the object, and highlight the right tool by drawing a rectangle around it on screen.
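For the curious, the basic pattern behind that demo is easy to sketch with Google's public Gemini API, even though Google hasn't said how Gemini Live itself implements it. The toy example below (Python, using the google-generativeai SDK) sends one photo to a vision-capable Gemini model, asks for the bounding box of a named tool, and draws the highlight rectangle. Gemini's documentation describes boxes as [ymin, xmin, ymax, xmax] values scaled to 0-1000; the model name, prompt wording, file names, and the "Phillips screwdriver" target are all illustrative assumptions, not anything Google has published about this feature.

```python
# A minimal sketch of the detect-and-highlight idea, not Google's implementation.
# Assumes the public google-generativeai SDK; file names and prompt are made up.
import json

import google.generativeai as genai
from PIL import Image, ImageDraw

genai.configure(api_key="YOUR_API_KEY")  # placeholder, supply your own key
model = genai.GenerativeModel("gemini-1.5-flash")

frame = Image.open("workbench.jpg")  # stand-in for a single live camera frame

# Gemini's docs describe bounding boxes as [ymin, xmin, ymax, xmax] on a 0-1000 scale.
prompt = (
    "Find the Phillips screwdriver in this photo. Reply with JSON only: "
    '{"box_2d": [ymin, xmin, ymax, xmax]} with coordinates scaled 0-1000.'
)
reply = model.generate_content(
    [prompt, frame],
    generation_config={"response_mime_type": "application/json"},
)
ymin, xmin, ymax, xmax = json.loads(reply.text)["box_2d"]

# Rescale the 0-1000 coordinates to pixels and draw the highlight rectangle.
w, h = frame.size
draw = ImageDraw.Draw(frame)
draw.rectangle(
    [xmin * w / 1000, ymin * h / 1000, xmax * w / 1000, ymax * h / 1000],
    outline="red",
    width=4,
)
frame.save("highlighted.jpg")
```

Gemini Live presumably does something far more elaborate, tracking objects continuously as the camera moves, but a request-detect-draw loop like this is the rough shape of the feature.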
