Google announced a suite of AI-powered enhancements to its Search, Maps and Lens services at its recent Live from Paris event, and Lens is set to gain a particularly useful new feature in the coming months.
Soon, Google Lens users on Android will be able to search what they see in photos and videos by calling up Lens through the Google Assistant. The integration will work across websites and Android apps, letting people learn more about what's in an image – a building name, a recipe or a car model, for example – without having to navigate away from it. As Google put it in its Paris presentation, “if you can see it, you can search it.”
Confused? Check out the latest Google Lens update in action via the tweet below, which shows a user identifying the Luxembourg Palace from a friend’s video of the landmark.
We will be rolling out a ✨major update✨ in the coming months to help you search what you see on your mobile screen. Soon you’ll be able to use Lens through the Assistant to search for what you see in photos or videos on websites and Android apps. #googlelivefromparis pic.twitter.com/UePB421wRY — February 8, 2023
Google has yet to announce a release date for the new feature, although the company has promised to roll out the update “in the coming months” – which, for our money, probably means February or March 2023.
Significant improvements are also heading to Google's Multisearch feature. The ability to add a text query to a Lens image search is now available worldwide, in all supported languages and countries, and Google is also introducing the ability to find variations – like shape and color – of objects captured by Lens.
As explained by Google in Paris: “You can, for example, search for ‘modern living room ideas’ and see a coffee table you like, but you prefer it in a different shape – say a rectangle instead of a circle. You’ll be able to use Multisearch to add the text ‘rectangle’ to find the style you’re looking for.” See the feature in action below:
Multisearch is now available worldwide! Try this new way to search with images and text at the same time. 🤯 So if you see something you like but want it in a different style, color or cut, just take or upload a photo using Lens then add text to find it. 🔎 #googlelivefromparis pic.twitter.com/4yT6voiJkn — February 8, 2023
A new era of search?
Elsewhere at the Paris showcase, the company announced a number of AI-powered updates for Google Search and Google Maps.
For example, Google will soon integrate its “experimental conversational AI service,” Bard, into its search engine to offer users more accurate and convenient search results. As explained by Google in Paris, you’ll soon be able to ask questions like “what are the best constellations to look for when stargazing?” and then dig deeper to find the best time of year to see them, with helpful AI-generated suggestions.
The move follows Microsoft’s announcement of its redesigned, AI-powered Bing search engine, which uses the same underlying technology as ChatGPT.
As for Google Maps, the service’s Immersive View feature, which lets you virtually explore landmarks, is getting a major upgrade across five major cities around the world, while Live View – which uses your phone’s camera to help you explore a city through a neat AR overlay – is set for a similar expansion.
We’ll be testing all of the above features ourselves over the coming months, but for a recap of everything announced at Google’s Paris showcase, visit our Google Live from Paris blog.