Google Artificial Intelligence Understanding Search Queries Better Than Before
What Is the Google BERT Update?
Google announced that it is applying the BERT model to Search, which can affect 1 in 10 queries. Google BERT is one of the major updates, and it will impact long search queries that are complicated due to context. Last year, Google introduced and open-sourced BERT, a neural network-based technique for understanding search queries that enables anyone to train their own state-of-the-art question answering system.
What Is BERT Algorithm?
BERT stands for Bidirectional Encoder Representations from Transformers. Transformers refer to models that process words in relation to all the other words in a sentence. This means that BERT models can interpret the appropriate meaning of a word by looking at the words that come before and after it. That leads to a better understanding of queries than processing words one by one, in order.
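The difference between one-directional and bidirectional context can be sketched in a few lines of Python. This is only an illustration of context windows, not the actual BERT architecture, and the function names are made up for this example:

```python
# A minimal sketch (not the real BERT model) of why bidirectional
# context matters: a left-to-right model deciding what "stand" means
# can only see the words before it, while a bidirectional model sees
# the whole sentence.

def unidirectional_context(tokens, i):
    """A left-to-right model sees only the words before position i."""
    return tokens[:i]

def bidirectional_context(tokens, i):
    """A BERT-style model sees the words both before and after position i."""
    return tokens[:i] + tokens[i + 1:]

query = "do estheticians stand a lot at work".split()
i = query.index("stand")

print(unidirectional_context(query, i))
# ['do', 'estheticians']
print(bidirectional_context(query, i))
# ['do', 'estheticians', 'a', 'lot', 'at', 'work']
```

With only the left context ("do estheticians"), the sense of "stand" is ambiguous; with the right context ("a lot at work") it clearly refers to the physical demands of a job, which is exactly the example Google gives below.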
How Will BERT Impact SEO?
There are lots of questions about this latest update, specifically how BERT will impact SEO. Google said that BERT is complex, but it will help people find more useful information, which is why it will affect both search rankings and featured snippets. For now, BERT will impact one in 10 English-language searches in the U.S., and it is coming to more languages and locales over time.
Furthermore, Google’s BERT update analyzes search queries, not web pages. I think On-Page SEO becomes more important in terms of using words in precise ways. Poorly written content may not fare well after this Google BERT update.
According to Google:
“Particularly for longer, more conversational queries, or searches where prepositions like “for” and “to” matter a lot to the meaning, Search will be able to understand the context of the words in your query. You can search in a way that feels natural for you.”
Examples of BERT Improvements:
Before launching this update, Google did a lot of testing and found that the BERT algorithm can better understand nuanced queries and the connections between words. Here are some examples that show how BERT is able to understand the intent behind your search queries.
When you search for “2019 brazil traveler to usa need a visa”, you can see how BERT helped Google show a more relevant result for this query, because the search is about a Brazilian traveling to the USA, not the other way around.
Here’s another example, the search query “do estheticians stand a lot at work”. Previously, Google’s systems understood the words “stand” and “stand-alone” as having the same meaning, which led to irrelevant search results. Now BERT models understand that “stand” relates to the concept of the physical demands of a job, and display more useful search results.
Search Queries With and Without BERT:
Here are some more before/after examples, in Google’s words, of queries with and without BERT that help to show how it understands the subtle nuances of language.
With the BERT model, Google can better understand that “for someone” is an important part of this query, whereas previously we missed the meaning, with general results about filling prescriptions.
In the past, a query like this would confuse our systems – we placed too much importance on the word “curb” and ignored the word “no”, not understanding how critical that word was to appropriately responding to this query. So we’d return results for parking on a hill with a curb!
While the previous results page included a book in the “Young Adult” category, BERT can better understand that “adult” is being matched out of context, and pick out a more helpful result.
One thing to note is that these examples are simply meant to illustrate the types of language understanding challenges that BERT helps with; there are of course many other queries where BERT will have an impact.
Google is applying BERT to make Search better for people across the world. The search giant said that understanding language is an ongoing challenge, and with BERT it is trying to do better, providing the most meaningful and helpful information for every search query you make.