Google's MUM to help improve visual search results

At its flagship Google I/O event earlier this year, the US-based company announced that it had reached a critical milestone in understanding information with the Multitask Unified Model, or MUM.

Thursday, September 30, 2021, 3 min read

Google will introduce a new way to search "visually" in the coming months, making it easier for users to express their questions in more natural ways.


Google, at its Search On livestream event on Wednesday night, shared details of how the tech giant is bringing the latest in artificial intelligence (AI) to its products, giving people new ways to search and explore information in more natural and intuitive ways.

Earlier this year, at its flagship Google I/O event, the US-based company announced that it had reached a critical milestone in understanding information with the Multitask Unified Model, or MUM.

The system is designed to provide expert-like answers to complex questions, requiring fewer searches to complete complex tasks.


"We've been experimenting with using MUM's capabilities to make our products more helpful and enable entirely new ways to search...In the coming months, we'll introduce a new way to search visually, with the ability to ask questions about what you see," Google said in a blogpost.


Citing an example, Google said that with the new capability, users can tap the Lens icon when looking at a picture of a shirt and ask Google to find the same pattern on another article of clothing, such as socks.

"This helps when you're looking for something that might be difficult to describe accurately with words alone...By combining images and text into a single query, we're making it easier to search visually and express your questions in more natural ways," it added.

Google said that in the future, MUM will unlock deeper insights users may not have known to search for, and connect them with content on the web they would not otherwise have found.


Google is also making it easier to find visual inspiration with a newly designed, browsable results page built for searches where users are looking for inspiration, it added.


Google said it is already using advanced AI systems to identify key moments in videos, like the winning shot in a basketball game, or steps in a recipe.


"Today, we're taking this a step further, introducing a new experience that identifies related topics in a video, with links to easily dig deeper and learn more. Using MUM, we can even show related topics that aren't explicitly mentioned in the video, based on our advanced understanding of information in the video," it said.

The first version of this feature will roll out in the coming weeks, and Google will add more visual enhancements in the coming months, it added.

Google said it is also making it easier to shop from the widest range of merchants, big and small, and helping people better evaluate the credibility of information they find online.


"All this work not only helps people around the world, but creators, publishers, and businesses as well. Every day, we send visitors to well over 100 million different websites, and every month, Google connects people with more than 120 million businesses that don't have websites, by enabling phone calls, driving directions and local foot traffic," it said.


Google added that as it continues to build more useful products and "push the boundaries of what it means to search, we look forward to helping people find the answers they're looking for, and inspiring more questions along the way".


Edited by Megha Reddy