Source: blog.google
Google announced this technology back in May. Now, in the coming weeks, the team will release the first version of these features.
MUM (Multitask Unified Model) is a neural network that works with context and can understand information in both text and images. It will power relevant video search, multimodal search in Google Lens, and a refreshed view of the Search results page.
Here are the Search updates Google plans to roll out:
- Combined photo-and-text queries in Google Lens. It will become easier to get information without knowing the exact name of something, just by taking a picture of it. The "Context Search" feature should go live in the coming months.
- An "Image Search" button powered by Google Lens. When tapped, the app will recognize the content of the picture on the screen and show where to buy similar products. Lens will appear in the Google mobile app and in the desktop Chrome browser.
- New ways of displaying search results. Google will expand the Related Queries section to cover more of a search topic, and a "What you need to know" section will appear on the page. It will be shown in response to informational queries, surfacing the most important details on the subject.
- A new page design focused on ideas and finding inspiration. Users will be shown relevant videos, images, or article previews with pictures and animations.
- MUM will recognize the content of a video, not only breaking it down into key points but also surfacing related topics that are not directly mentioned in the video.
- Product showcases: matching items of different colors and styles that can be filtered by parameters.
In addition, Google Maps will get:
- a tool for governments and public organizations that will allow you to quickly assign addresses to people and businesses;
- a new wildfire layer showing up-to-date information on active fires.