BERT is Coming
Providing a better experience to search users requires a deep understanding of what people are looking for. Think of it this way: you may have searched for something in the past that you didn't know how to articulate well. Maybe you had an idea of the word you were looking for but forgot its name, didn't know how to spell it, or were simply trying to learn more about a general subject.
Situations like these can be difficult for search engines, but Google is working hard to handle them better. Google has made a big leap in search technology: using natural language processing (NLP), they've created Bidirectional Encoder Representations from Transformers, or BERT for short. Unlike earlier models that read text in only one direction, BERT considers the words both before and after a given word to understand its meaning in context.
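To see why bidirectional context matters, here is a toy sketch in Python. It is not Google's model; the mini "corpus" of sense cues and the function names are invented for illustration. The idea is that an ambiguous word like "bank" is easier to pin down when a model can look at the words on both sides of it, not just the ones to the left:

```python
# Toy illustration (not BERT itself): disambiguating a word using
# context on both sides versus left-side context only.

# Hypothetical sense cues, invented for this sketch.
SENSES = {
    "bank": {
        "financial": {"money", "deposit", "loan"},
        "river": {"water", "fishing", "shore"},
    },
}

def disambiguate(word, left_context, right_context):
    """Pick the sense whose cue words overlap most with BOTH sides."""
    context = set(left_context) | set(right_context)
    senses = SENSES[word]
    return max(senses, key=lambda s: len(senses[s] & context))

def disambiguate_left_only(word, left_context):
    """A strictly left-to-right model never sees the right context."""
    return disambiguate(word, left_context, [])
```

In a sentence like "he sat by the bank and watched the water", the deciding cue ("water") appears to the right of the ambiguous word, so only the bidirectional version can use it. Real BERT learns these contextual relationships from huge amounts of text rather than from a hand-written table, but the intuition is the same.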
This technology enables anyone to train their own state-of-the-art question answering system. By applying BERT models to both rankings and featured snippets, Google can do a much better job of helping you find useful information. When it comes to ranking results, Google estimates that BERT will help it better understand one in ten English-language searches in the U.S.
Given that BERT will change how users find results for their search queries, it could also change how customers discover your business. We're excited to see how BERT shapes search moving forward and connects people with the information they're looking for.