Google Lens will be “Ctrl + F” for the world, says the search giant

Google Lens already has a lot of fantastic AR tricks up its sleeve, but it will soon get its most useful feature to date, called Scene Exploration.

At Google I/O 2022, Google previewed the new feature, which it says acts like a “Ctrl + F” shortcut for finding things in the world in front of you. Point your phone’s camera at a scene, and Google Lens will soon be able to overlay useful information on products to help you make quick decisions.

In Google’s demonstration of the feature, Lens scanned a shelf of candy bars and overlaid them with information that included not only the type of chocolate (such as dark chocolate), but also their customer ratings.

In theory, this Google Lens feature could be hugely powerful and save a lot of time, particularly when shopping. Google says it works using some clever real-time technology, including knowledge graphs that pull together multiple streams of information to give you helpful tips on the spot.

The downside? Scene Exploration has no release date yet, with Google only saying it’s coming “in the future” without a precise timeline. That means it could follow earlier Google Lens promises, which took several years to mature. But it doesn’t seem like a huge leap from Lens’ existing shopping tools, so hopefully we’ll see the first signs of it sometime this year.

Analysis: one of AR’s most useful tricks so far

There’s no doubt that Scene Exploration mode has huge potential for shopping, as old-school in-store browsing will likely increasingly take place behind a phone screen – or perhaps, eventually, smart glasses.

But Google says there are more charitable applications, too. The feature could apparently help conservationists identify endangered plant species, or give volunteers a convenient way to sort donations.

Either way, this certainly looks like a powerful and intuitive evolution of another Lens feature that Google announced at I/O 2022, called Multi-Search. It lets you combine an image search with a keyword to help you find obscure products or items without having to know their name.

Multi-Search arrived on Google Search last month (check out the Android or iOS Search app) and will soon gain a more location-specific version called “Near Me”. Google’s example was taking a photo of a particular dish, then searching for local restaurants that serve that food.

It could be argued that this kind of feature makes us all helplessly reliant on the crutch of Google’s powerful Lens and search technology. But features like Scene Exploration and Multi-Search look like some of the most useful examples of AR we’ve seen yet, and their flexibility should be a boon to all kinds of users.

Now all we have to do is wait to see how long it takes them to fully materialize in Google Lens.