Source – blog.google
The company announced MUM (Multitask Unified Model), a technology for processing complex search queries. Like BERT, MUM is built on the Transformer architecture, but it is reported to be 1,000 times more powerful, able to perform multiple tasks simultaneously, and capable of understanding information across 75 languages.
MUM is multimodal: it understands information from both text and images, and may extend to further modalities such as video and audio in the future. This is a significant advantage, since today a user needs an average of eight search queries to get complete answers to a complex problem.
With MUM, it will be possible to determine from a photograph alone whether a particular piece of equipment, for example, is suitable for the activity specified in a search query.
A public launch date has not yet been announced: Google is currently testing MUM internally, and the results appear promising.