Google Multisearch: New Search Technology, Including In Video Search
Google is synonymous with search, so much so that the company's name has been used as a verb for nearly two decades. Whenever someone has a question they can't answer, all they have to do is Google it and the information appears in seconds.
But even though "google" is so established that it earned a place in Merriam-Webster's dictionary, the company has continued to drive innovation and create new ways for users to discover information.
Google has made several announcements in recent months revealing new features, both readily available and upcoming. The Search On 2022 and Google For India 2022 conferences both revealed exciting new features that will enhance the ways we all interact with Google Search.
In this article, we’ll break down some of the exciting new features that Google will be rolling out to users.
Google Multisearch
Google is constantly looking to create new ways for users to interact with the information available on its platform. One of the most transformative additions to search in recent years is Google Lens, which premiered on Google Pixel 2 phones back in 2017. This feature essentially allows users to take a picture of an object and discover information about it.
Now Google is adding another new search feature that will fundamentally change how users search for information.
With the new Google Multisearch tool, users will be able to create search queries using both text and images from within Google Lens. The SERP will display existing pages that best match the image and text queries.
So if you upload a picture of a plant you don’t recognize and ask Google what type of plant it is, Google should use visual and textual results to give you the most relevant SERP. In this case, it would probably be an article from a website like Britannica that tells you all the various details about that specific plant.
Google Multisearch has been available in the U.S. since April 2022 for English speakers, and Google India announced in December that Indian users would be able to use the feature in English as well. Google plans on making multisearch available in more languages over the coming months.
Google Multisearch Near Me
In addition to the multisearch feature that beta testers have had access to since 2022, Google is adding the ability to use Google Lens to search for items within your vicinity. Google’s blog provides a great example of how this new feature can be used. For instance, if you take a picture of a patterned shirt using Google Lens, you can find stores near you selling that shirt or similar products.
Search In Video
Search In Video could be another great addition to Google's family of apps, one that changes how we interact with video content. When you enter a query into Google, you may receive a YouTube video suggestion in your SERP.
Many videos on YouTube run between 8 and 15 minutes so that content creators can maximize the monetization opportunities within each video. That might not sound long, but watching an entire 15-minute video is a significant commitment when you're looking for one particular piece of information. After all, what you're looking for may not even be in the video.
With the new Search In Video feature, you can search for keywords within a YouTube video. So if users are looking for a specific tidbit of information inside a video, they can search to see if that information is mentioned in the video. If the keywords are found, then you’ll be able to skip directly to the moment in the video where the topic is mentioned using a timestamp.
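Google hasn't published how the feature works under the hood, but conceptually it resembles searching a timed transcript for a keyword and returning the matching timestamps. The sketch below is a hypothetical illustration of that idea, not Google's actual implementation; the `Caption` structure and function names are assumptions for the example.

```python
from dataclasses import dataclass

@dataclass
class Caption:
    start_seconds: float  # timestamp where this caption segment begins
    text: str             # the words spoken in this segment

def find_keyword_timestamps(captions, keyword):
    """Return the start time of every caption segment that mentions the keyword."""
    keyword = keyword.lower()
    return [c.start_seconds for c in captions if keyword in c.text.lower()]

# A toy transcript for a 15-minute cooking video
transcript = [
    Caption(0.0, "Welcome back to the channel"),
    Caption(95.0, "Now let's talk about sourdough starters"),
    Caption(410.0, "Feeding your sourdough starter daily is key"),
]

print(find_keyword_timestamps(transcript, "sourdough"))  # [95.0, 410.0]
```

A viewer searching "sourdough" could then jump straight to the 1:35 or 6:50 mark instead of scrubbing through the whole video.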
Bilingual Search Results
India faces a unique situation. The country covers about 1.3 million square miles, roughly a third of the area of the United States, but has a population more than four times larger. India is also a multicultural and multilingual nation, with more than 400 native languages. Among the most widely spoken are Hindi and English, which are also the country's official languages.
Hundreds of millions of individuals also speak multiple languages interchangeably. This can make it difficult for Google to provide an optimal experience as Google’s SERP does not translate results unless you change the SERP language in your account settings. This is less than ideal for a nation with so many multilingual individuals.
To address this issue, Google is rolling out a new feature called Bilingual Search Results. Now, Google will display its SERP in both English and the user’s local language. This feature will first be available in English and Hindi, and four additional languages will be added throughout the year.
While the U.S. is nowhere near as multilingual as India, nearly 1 in 5 Americans speak more than one language—with Spanish being the country’s second most spoken language. This feature would be a great addition as more people begin to speak multiple languages interchangeably.
Bilingual Voice Search
Multilingualism can create unique challenges for technologies that rely on vocal cues. Most notably, code-switching can cause issues if the AI behind voice-controlled tech isn't trained to detect and understand it. If you're unfamiliar with the term, code-switching is a common phenomenon among multilingual individuals in which a speaker switches between languages over the course of a single sentence or conversation.
Google is releasing Bilingual Voice Search to help multilingual users make voice searches naturally as they speak. The new speech recognition model will help Hinglish speakers to effectively use Google voice search by detecting “accents, surrounding sounds, context, and speaking styles.”
Along with Google Multisearch and bilingual search results, this would be another welcome feature in many American households where "Spanglish" is commonly spoken.
Transaction Search in Google Pay
During Google For India 2022, Google Pay Director Sharath Bulusu announced that users would have a new way to access and view their digital payments. Using Google Pay’s new Transaction Search feature, users can say what type of expenses they are searching for, and Google will automatically filter their transactions by category and date to pull up the relevant transaction history.
This feature creates a more streamlined process for users to track their transactions and keep their budgets balanced. For instance, if you want to know how much money you've spent eating out, you can ask Google a natural-language question such as "How much have I spent at restaurants this month?" and Google will display every Google Pay restaurant transaction from the past month.
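Behind a query like that, the core operation is just filtering a transaction list by category and date range and summing the amounts. Here's a minimal sketch of that filtering step; the data shape and function name are assumptions for illustration, not Google Pay's actual API.

```python
from datetime import date

# Hypothetical transaction records, as a payments app might store them
transactions = [
    {"merchant": "Joe's Diner", "category": "restaurants", "amount": 24.50, "date": date(2022, 12, 5)},
    {"merchant": "Grid Electric", "category": "utilities", "amount": 80.00, "date": date(2022, 12, 7)},
    {"merchant": "Taco Stand", "category": "restaurants", "amount": 12.75, "date": date(2022, 12, 18)},
]

def spending_in_category(transactions, category, start, end):
    """Sum all transactions in a category between two dates, inclusive."""
    return sum(
        t["amount"]
        for t in transactions
        if t["category"] == category and start <= t["date"] <= end
    )

total = spending_in_category(transactions, "restaurants", date(2022, 12, 1), date(2022, 12, 31))
print(f"${total:.2f}")  # $37.25
```

The interesting part of the real feature is the natural-language layer that maps "restaurants this month" to a category and date range; the arithmetic underneath looks like the above.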
While a feature like Google Multisearch might be more innovative on its face, this is an excellent quality-of-life improvement that we're happy to see.
Project Relate
Project Relate is another fantastic innovation announced at both Search On 2022 and Google For India 2022. This application has been in development since 2018 and will help make Google Assistant and face-to-face verbal interactions more accessible for people with non-typical speech, which affects millions of individuals worldwide. Google's newest technology will help bridge the communication gap and create a more inclusive world.
Using the application, users create unique speech profiles that help Google Assistant better understand how they speak. The app features three modes: Listen, Repeat, and Assistant. Listen records what you're saying and transcribes it, Repeat plays back what you said in a digital voice, and Assistant activates the Google Assistant interface from inside the app. The beta version of the app has launched for English speakers in the U.S., Australia, Canada, New Zealand, and India—where the beta will also support Hindi.
If you’re an individual with non-typical speech and you’d like to participate in the beta, be sure to submit your application to Google.
Search With Live View
Google Live View has been out for a while now. The feature gives users the ability to visualize their environment and receive directions in a dynamic way. By using your phone's camera to show your surroundings, Live View can use AR technology to overlay directions based on your position relative to your destination.
Now Google is giving users the ability to see nearby destinations and search for points of interest while in Live View. So if you need to find an ATM, you can search for nearby ATMs and Live View will overlay directions to the location on your screen.
As of the launch back in September 2022, the feature is available in Los Angeles, London, New York, Paris, San Francisco, and Tokyo.
How Will Google’s New Search Features Affect SEO?
Many of these features lead us back to some fundamental SEO advice that brands need to follow in order to get the best return on their SEO efforts.
If you're not already, you need to be using both visual and textual elements on your pages. Now that users can search with words and images at the same time via Google Multisearch, this is even more important.
You also need to make sure that you’re providing optimized transcripts for any video content you’re publishing. Not only does this lead to a better user experience, but it can help search engines crawl your content and determine what ranking you deserve.
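One common way to make a transcript machine-readable is to embed it in schema.org `VideoObject` structured data on the page hosting the video. The snippet below is a minimal sketch of generating that JSON-LD; the example video details are invented, and this is one approach rather than a required format.

```python
import json

def video_jsonld(name, description, upload_date, transcript_text):
    """Build schema.org VideoObject JSON-LD so crawlers can read the transcript."""
    return json.dumps({
        "@context": "https://schema.org",
        "@type": "VideoObject",
        "name": name,
        "description": description,
        "uploadDate": upload_date,
        "transcript": transcript_text,
    }, indent=2)

# Hypothetical video page metadata
snippet = video_jsonld(
    name="How to Repot a Monstera",
    description="A step-by-step guide to repotting a monstera plant.",
    upload_date="2022-12-01",
    transcript_text="Hi everyone, today we're repotting a monstera...",
)
print(f'<script type="application/ld+json">\n{snippet}\n</script>')
```

Dropping the resulting `<script>` tag into the page's HTML gives search engines a clean, crawlable copy of the transcript alongside the video.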
Brands should also make sure their Google listings are up-to-date and accurate. If you’re a restaurant in LA and your hours aren’t accurate, it could cause you to lose potential customers who are using Search with Live View.
When looking over these new features, think about how they're relevant to your current SEO efforts and make adjustments accordingly.
If you’re looking for a strategic partner to help you navigate the changing SEO landscape, contact Fidelitas today to discuss how we can help improve your business’s online presence.