Google has announced that the full version of Search Live, now including video as well as audio, is beginning its rollout on mobile devices in the United States for users participating in the AI Mode Labs experiment. The feature, previously demonstrated at Google I/O in May, launched as an audio-only version last month and is now expanding to include video.
According to Google, Search Live is tightly integrated with Google Lens, the company’s visual search tool. Users can activate Lens in the Google app, tap the Live icon, and pose questions while pointing their camera at objects. This setup allows for dynamic, conversational interactions in which the search engine responds based on visual context, such as different angles or objects in motion, all within AI Mode.
Previously, users could communicate with Search Live only through voice; with the visual component added, Search can now see what you’re looking at, enabling a more interactive experience. For instance, you can ask questions about an object in front of your camera, and Search responds with relevant information, links, and insights based on what it visually perceives.
Screenshots demonstrate the feature in action, showing the interface and how the camera view is used during the search process. A recorded video also illustrates the real-time interaction, emphasizing the new capabilities.
In social posts, Google representatives highlighted practical use cases. Robby Stein mentioned that Search Live with video input can assist with science projects or textbook content by acting as a visual learning partner — simply point your camera, engage in a live voice conversation, and get immediate responses. Rajan Patel added that the feature works like having an expert on standby, capable of seeing what you see and providing helpful audio responses along with visual context and web links for further exploration.
Overall, this enhancement offers users a more immersive and responsive way to interact with search, combining voice, visual cues, and web resources seamlessly.