During a recent Paris showcase, Google announced a new Google Lens update coming to Android mobile devices. "In the coming months," Google will let anyone using Lens on Android search anything on their screen. As Google put it during the presentation: "If you can see it, you can search it."
In the coming months, we will introduce a ✨major update✨ to help you search what’s on your mobile screen.
You’ll soon be able to use Lens via Assistant to search for what you see in photos or videos on websites and apps on Android. #googlelivefromparis pic.twitter.com/UePB421wRY
— Google Europe (@googleeurope) February 8, 2023
Users will be able to search for the name of a building, a food recipe, a car model, or any other image that might contain searchable information. The feature will also work within websites and apps, and you won’t need to leave the screen to search in Lens.
At the same event, Google announced an update to its Multisearch feature, which lets users combine an image search with text to refine the results. You can now add text specifying the type of results you’re looking for alongside an image.
Multisearch is now live globally! Try this new way to search with images and text at the same time. 🤯
So if you see something you like, but want it in a different style, color, or fit, snap or upload a photo with Lens, then add text to find it. 🔎#googlelivefromparis pic.twitter.com/4yT6voiJkn
— Google Europe (@googleeurope) February 8, 2023
In the tweet above, Google demonstrates performing an image search with Lens and then refining the results by adding a text query that specifies a different style.
At the same event, Google also unveiled Bard, its ChatGPT rival that will soon be featured in Search. Just this week, Microsoft likewise announced upcoming AI-powered improvements to its Bing search engine and Edge browser.