Google unveiled its multisearch feature



Source: blog.google

The new feature combines an image and a text query, which means users can search for things they don't have the words to describe: an unfamiliar piece of equipment, a wardrobe item in a different color, or something that shares a property with another item (for example, socks with the same print as a shirt).

The visual search feature is built on MUM and Google Lens technology. MUM makes the search multimodal: a user can submit visual (a photo) and textual (a query) information at the same time.

How to use multisearch

In the Google app for Android or iOS, tap the camera icon and take a picture, or select one of your existing screenshots. Then swipe up, tap "+ Add to your search," and enter your text.

For now, multisearch is available in beta in the U.S. for English-language queries.
