Google is updating its search algorithm to better handle the potentially confusing search phrases it receives every day. The company has been working on an AI model called BERT (Bidirectional Encoder Representations from Transformers) to improve its understanding of conversational queries.
What’s the difference?
Certain ways of structuring a query would sometimes confuse the search engine: phrases like “can you get medicine for someone pharmacy” or “parking on a hill with no curb”. The reason is that Google’s previous algorithm treated a query as a group of words containing a few important keywords along with a bunch of excess information. With the new BERT system, parts of a query that would previously have been ignored are now taken into account.
For example, “parking on a hill with no curb” might have brought up results regarding parking on a hill that did have a curb, because only “parking”, “hill”, and “curb” would have been treated as important keywords. BERT is designed to understand the relevance of the word “no” in the phrase. Similarly, the query “can you get medicine for someone pharmacy” would have found results about acquiring medicine from a pharmacy, but would have left out the “for someone” section.
It’s this conversational tone that often trips up a Google search, forcing users to become more precise and sometimes a little creative with the way they structure their queries. BERT is meant to analyze how every word in the phrase relates to the others in the query, whether those words come ahead of or behind a primary keyword – hence the first part of the system’s name, Bidirectional.
The developers trained the AI by feeding it a large number of phrases in which 10 to 15 percent of the words were randomly hidden; the program then had to predict the missing words.
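The masking step of that training setup can be sketched in a few lines. This is an illustrative toy, not Google’s actual pipeline: the mask rate, the `[MASK]` placeholder, and the simple whitespace split are all assumptions made for the example; a real system would use a subword tokenizer and a neural network to predict the hidden words.

```python
import random

MASK_RATE = 0.15      # roughly the 10-15 percent mentioned above (assumed value)
MASK_TOKEN = "[MASK]" # placeholder standing in for a hidden word

def mask_tokens(sentence, seed=1):
    """Randomly hide ~15% of the words in a sentence.

    Returns the masked word list and a dict mapping each masked
    position to the original word (the prediction target the
    model would be trained to recover).
    """
    rng = random.Random(seed)  # fixed seed so the example is reproducible
    words = sentence.split()
    masked = list(words)
    targets = {}
    for i, word in enumerate(words):
        if rng.random() < MASK_RATE:
            masked[i] = MASK_TOKEN
            targets[i] = word
    return masked, targets

masked, targets = mask_tokens("parking on a hill with no curb")
print(masked)   # ['[MASK]', 'on', 'a', 'hill', 'with', 'no', 'curb']
print(targets)  # {0: 'parking'}
```

During training, the model sees only the masked sequence and is scored on how well it recovers the entries in `targets`; doing this over many phrases forces it to use context on both sides of each gap.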
Pandu Nayak, Google’s Vice President of Search, says the company still has some way to go before it can fully comprehend what a user wants when entering a query.