Updated: Google later stated, "Search Live is not yet available to users worldwide. The feature is currently available only to users in the United States and India, and is being tested in other markets. We apologize for the earlier, incorrect information."
Google announced earlier that the Search Live feature in Google AI Mode, which combines the phone's camera with real-time AI voice question answering, had officially launched globally. The feature was first unveiled at the Google I/O 2025 developer conference and debuted in the Google App in the United States last September. According to that announcement, users in more than 200 countries and regions could start experiencing this "what you see is what you ask" search mode wherever the Google AI Mode chatbot is available.
Breaking the "take a picture first, search later" limitation, enabling real-time visual dialogue.
Search Live's core concept is intuitive: point your phone's camera at an object, landmark, or any scene in front of you and ask the AI a question. The system analyzes the live camera feed and responds by voice.
This does away with the cumbersome Google Lens workflow of taking a still photo first and then selecting a region to search; finding answers now feels as natural as talking to a guide standing beside you.
Powered by the Gemini 3.1 Flash model: faster, more stable, and multilingual.
In addition to significantly expanding its service footprint, Google has also made key upgrades to the underlying computing technology of Search Live.
According to the official statement, Search Live has now fully switched to the latest Gemini 3.1 Flash model. The upgrade brings three notable improvements to the user experience:
• More natural conversational fluency: the model is better at handling follow-up questions and maintaining context.
• Low-latency, near-instant responses: the lightweight, high-speed Flash model responds faster and more reliably, which is crucial for visual question answering that must process a live video stream in real time.
• Native multilingual support: the new model was designed from the outset for strong multilingual processing, which is the core reason Google felt confident launching in more than 200 countries and regions at once.
How do I enable the Search Live feature?
The feature has been rolled out to the Google App on Android and iOS. Users can access it in either of two ways:
• Via the Google App: open the app and tap the "Live" button below the search bar to begin.
• Via Google Lens: in the Lens camera interface, look for the "Live" icon at the bottom of the screen; tapping it also activates the real-time visual dialogue mode.