
Google’s MUM will start adding more context to searches


Google is getting a little help from its MUM (Multitask Unified Model) to add more context to search results.

MUM was announced at Google I/O in May and aims to transform how the web giant handles complex queries. The model uses the T5 text-to-text framework and is said to be “1,000 times more powerful” than BERT (Bidirectional Encoder Representations from Transformers), which itself was a major breakthrough when it was introduced in 2018.
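
MUM itself is not publicly available, but the "text-to-text" idea it builds on can be seen in the open-source T5 models. As a rough illustration only, using Hugging Face's Transformers library and the public t5-small checkpoint (neither of which Google has said powers Search), every task is framed as feeding text in and getting text out:

    # Illustrative sketch: MUM is not public, so this uses the open-source
    # t5-small checkpoint to show the text-to-text framing T5 introduced.
    from transformers import T5ForConditionalGeneration, T5Tokenizer

    tokenizer = T5Tokenizer.from_pretrained("t5-small")
    model = T5ForConditionalGeneration.from_pretrained("t5-small")

    # Every task is phrased as text in, text out; a prefix names the task.
    inputs = tokenizer("translate English to German: How high is the mountain?",
                       return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=20)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))

The same model handles translation, summarisation, or question answering simply by changing the task prefix, which is the property that lets a system like MUM tackle many kinds of queries at once.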

One example used by Google to explain the leap forward MUM introduces is mountain climbing.

Today, preparing to climb a specific mountain could mean multiple searches: its elevation, the average temperature during certain months, trail difficulty, what gear to take, and so on.

However, if you were talking to a human expert, you could simply ask: “What should I do differently to prepare for mountain X, given that I’ve previously climbed mountain Y?”

MUM could understand such context and provide all the relevant information in a single search.

Google has a notorious habit of announcing products that either never launch or end up in its ever-growing graveyard. Fortunately, Google appears to be fond of its MUM.

This week, Google announced a visual search upgrade that enhances Lens: you can still search for a pattern, as before, but now ask specifically to find it on anything from a new bag to underwear.

Perhaps more useful than spiffy new pants, though, is the ability to take a picture of a broken part you may not even know the name of and search “how to fix” to bring up relevant videos and guides. Anyone can become an amateur plumber, bike repairer, or anything else with a little help from MUM.

A new section called “Things to know” may also pop up to, you guessed it, add more context to searches.

The provided example is a search for “acrylic painting” which – alongside the usual results – also brings up some useful tips around how to do it, various styles, what household items can be used, how to clean up, and other relevant information. This feature would probably have saved me a few headaches and quite a few back-and-forth trips to various hardware stores for missing bits over the past month alone.

Some of MUM’s smarts are also coming to videos: MUM will enable related topics to be plucked from a video even if they’re not mentioned in the title or description. In an example video of macaroni penguins, the system is able to pluck out further information like how macaroni penguins find their family members and navigate predators.

Google says all of the announced features will roll out in the coming weeks and months.

(Photo by Adam Winger on Unsplash)

