In late 2022, search engine giant Google was caught napping when the Microsoft-backed OpenAI burst onto the scene with the generative Artificial Intelligence (gen AI)-powered ChatGPT.
Since then, Google has made steady strides with its own gen AI chatbot, Bard, and the most recent Gemini update has helped it leap ahead of OpenAI's ChatGPT. To keep the momentum going, it is bringing all-new ways to search on phones.
Circle to Search
Yes, it works just as the name suggests. The user circles a particular object in a photo, and a pop-up Google search screen appears offering information about it.
It works not just on photos, but also on social media posts and videos (which have to be paused), letting users instantly pull up Google search results.
For instance, say you are watching a movie on an OTT app, spot a lovely-looking pair of aviator sunglasses, and want to know whether they are available online and at what price. Just pause the video and circle the object with a finger-swipe gesture, and a pop-up Google search screen appears with the relevant information. The interesting thing about this feature is that users do not have to leave the app to perform the search and get results.
Besides the circling gesture, users can also long-press the home button, or highlight, scribble on, or tap anything on screen to see helpful, high-quality search results without switching apps.
The Circle to Search feature is slated to roll out to select premium Android smartphones on January 31, starting with the Pixel 8, the Pixel 8 Pro, and the new Samsung Galaxy S24 series.
Point and search with the multi-search feature of Google Lens
This novel search method, which combines images and text to get results, was first showcased in 2022, but it remained in testing and was limited to select users in a few countries.
It has finally arrived on Google Lens. Initially, it will be available in the US starting today (January 18). People in other regions can test it out via the Search Generative Experience (SGE) option (the blue test-flask icon) in the top-left corner.
"The new multisearch experience will show results with AI-powered insights that go beyond just visual matches. This gives you the ability to ask more complex or nuanced questions about what you see, and quickly find and understand key information," Google said.
Users just have to tap the Google Lens icon in the search bar and take a snap; at that point, they can ask questions related to the object in the photo.