For example, if I type "Apple" in English, I may want nutritional information about the fruit, or I may want information about the computer and phone brand. The algorithm therefore uses the terms surrounding the ambiguous word to deduce the correct intent. Google MUM relies on the same kind of architecture, but goes much further: a thousand times more powerful than BERT according to Pandu Nayak, Google MUM aims to understand how human beings communicate in order to better interpret searches and provide appropriate responses.
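To make the idea of context-dependent interpretation concrete, here is a minimal sketch, not Google's code, assuming the public Hugging Face transformers library and the bert-base-uncased checkpoint: it compares the contextual embedding of "apple" in a fruit-related sentence with the one produced in a brand-related sentence, which is the mechanism that lets a model infer the intended meaning from surrounding words.

```python
# Minimal sketch (assumption: Hugging Face transformers + bert-base-uncased,
# not Google's internal models): the same word "apple" gets different
# contextual vectors depending on the surrounding terms.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

def apple_vector(sentence: str) -> torch.Tensor:
    """Return the contextual embedding of the token 'apple' in the sentence."""
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state[0]  # (seq_len, hidden_size)
    tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
    return hidden[tokens.index("apple")]

fruit = apple_vector("I ate an apple and a banana for breakfast.")
brand = apple_vector("Apple released a new phone and a laptop this year.")
fruit2 = apple_vector("The apple pie recipe needs three fresh apples.")

cos = torch.nn.functional.cosine_similarity
print("fruit vs. brand:", cos(fruit, brand, dim=0).item())
print("fruit vs. fruit:", cos(fruit, fruit2, dim=0).item())
```

Typically the two fruit contexts yield more similar vectors than the fruit/brand pair, which is exactly the signal a search engine can use to disambiguate the query.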
The method used here, the T5 "Text-To-Text Transfer Transformer" framework, brings together lessons learned and improvements from Natural Language Processing models based on transfer learning. In essence, tasks are, for the most part, formulated in a format that uses text as both input and output.

Google and the in-depth understanding of search intentions

The diagram above gives an idea of how T5 works. We see that each task considered takes the form of a textual input.
The model is pre-trained to generate the target "output" text from the initial "input" text. These tasks include translation (green), linguistic acceptability (red), textual similarity (yellow), and information synthesis (blue). If the subject fascinates you and doesn't scare you, here is a resource produced by Google about transfer learning with T5. In short, Google MUM is an extension of BERT, from which it takes the machine learning mechanisms and pushes them further.
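As a rough illustration of this text-to-text formulation, the sketch below uses the public Hugging Face transformers library and the open t5-small checkpoint (an assumption; the article does not name a specific model) to feed the four task types to T5 as prefixed input strings and read the answers back as generated text.

```python
# Minimal sketch (assumption: Hugging Face transformers + the public t5-small
# checkpoint): every task is phrased as an input string with a task prefix,
# and the model answers by generating output text.
from transformers import T5ForConditionalGeneration, T5Tokenizer

tokenizer = T5Tokenizer.from_pretrained("t5-small")
model = T5ForConditionalGeneration.from_pretrained("t5-small")

# One input string per task; the prefix tells T5 which task to perform.
examples = {
    "translation": "translate English to German: The house is wonderful.",
    "linguistic acceptability": "cola sentence: The car drove quickly down the.",
    "textual similarity": "stsb sentence1: A man is playing a guitar. "
                          "sentence2: A person plays an instrument.",
    "summarization": "summarize: Google MUM builds on the T5 text-to-text "
                     "framework to interpret complex queries and answer them "
                     "across languages and formats.",
}

for task, text in examples.items():
    input_ids = tokenizer(text, return_tensors="pt").input_ids
    output = model.generate(input_ids, max_new_tokens=40)
    print(f"{task:>24}: {tokenizer.decode(output[0], skip_special_tokens=True)}")
```

The point of the design is that a single model and a single training objective (text in, text out) cover translation, classification, similarity scoring, and summarization alike, which is what makes transfer learning across tasks so direct.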