
Google held an event today focused on its most famous tech, its search engine, to announce a batch of new changes. The event, called Search On, previewed updates arriving in the coming months that could change the way people search for information and what they can ask the search giant to find for them.

Google also demonstrated how it will use its new artificial intelligence system, MUM, or Multitask Unified Model, to expand its search capabilities so users can better explore complex topics.

Despite its wonky name, MUM, along with other artificial intelligence innovations, is intended to make search feel more natural and intuitive. To do this, MUM works to understand the separate components related to a subject, such as word descriptions, images, and videos, and to draw connections between them. So, to really understand what a lion is, for example, MUM pieces together what it sounds like, what it looks like, how it moves, what it eats, where it lives, and how it’s part of the cat family (but not the kind of cat you keep as a pet).

“People turn to search engines to find information they can trust,” Danielle Romain, VP of Trust at Google, said at Search On. “We also know that people don’t just want quick facts. For many topics, they want to really understand the information that’s out there.”

[Related: This new AI tool from Google could change the way we search online]

First off, MUM will allow users to search with imagery through Lens, which uses a smartphone camera. Google describes this search mode as “point-and-ask”: iOS users can point Lens at an object or image and ask questions about items or patterns in it from the Google app. For example, if you wanted to find socks with the same pattern as a shirt you own, you could tap the Lens icon, capture a visual of the shirt, and type in text asking Google to find the same pattern on socks.

This mixed-media tool, Google says, could help you look for things that you might not be able to describe accurately with words alone. It can also be useful for homing in on hard-to-pin-down patterns like chevron or paisley; not everyone knows those words off the top of their head. Beyond clothing, it could also tell you what a specific bike part is called in case you need to replace or repair it. It does this by finding similar images across the web that match what you’re asking for.

Google is keen on finding new ways to make search more immersive and expansive by connecting users with information related to their query that they might not otherwise have found. A new feature called “Things to know” will soon appear near the top of the first page of Google search results. It looks like a collapsible outline that shows how other people have explored the topic you’re interested in learning more about.

As an example, if you wanted to learn about creating acrylic paintings to decorate your apartment, you might search for “acrylic painting” on Google. The “Things to know” tab will show the aspects people are likely to look at first, like a step-by-step guide to acrylic painting or clean-up tips, to help you orient the search and navigate to the aspect you want to explore further.

Nestled in the first search results will also be a new tab that lets you zoom in and out of a topic. The tab will suggest either overarching themes and trends that help you broaden the search, like “painting styles,” or more specific queries that help you refine it, like “acrylic painting techniques.”

[Image: Google search’s new options to broaden or refine a query. Google]

Search will more seamlessly integrate and parse images and video

Google is redesigning its results page to be more browsable and visual, in hopes of sparking inspiration that could lead to more specific queries. Instead of a list of text-only links to websites, search pages will soon display images underneath those web links.

Google has said that engineers have been training MUM to recognize the relationship between words and images. At the event, the company announced that it is using MUM to parse videos, identify related topics not explicitly mentioned in the clips, and offer links for going deeper into those topics. For example, if you were watching a YouTube video about how macaroni penguins (yes, that’s the real name of a penguin species) find family and avoid predators, Google may recommend the topic “macaroni penguin’s life story” under the video, even if those exact words never come up in it. Previously, the company used AI to identify key moments in videos and put marks on the scrub bar to indicate when those moments happen.

The first version of this new video feature will start appearing in search in the next few weeks, and will be enhanced over the coming months.