After all the core updates this year, Google still has one more up its sleeve for the end of 2019. With the BERT update, Google is introducing one of the biggest changes to its search system since the introduction of RankBrain around five years ago. The update was officially announced in a company blog post. BERT stands for Bidirectional Encoder Representations from Transformers and is a neural network-based technique for natural language processing (NLP) pre-training. Google first presented the technology in a blog post back in November 2018 and also released it as open source.
The aim of BERT is to understand user queries even better. When a user enters a word, its context should be recognized and taken into account rather than the word being evaluated in isolation. For this to work, prepositions such as “for” or “to” become a much stronger focus of the algorithmic analysis and evaluation.
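Because Google released BERT as open source, this context sensitivity is easy to observe directly. The following minimal sketch assumes the community `transformers` library and the public `bert-base-uncased` checkpoint (neither of which the announcement mentions, and neither of which is the model Google runs in production search); it shows how the very same preposition receives a different internal representation depending on the sentence around it:

```python
import torch
from transformers import BertModel, BertTokenizer

# Public English base checkpoint, used here purely for illustration.
tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")
model.eval()

def embedding_of(sentence: str, word: str) -> torch.Tensor:
    """Return BERT's contextual vector for `word` inside `sentence`."""
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        output = model(**inputs)
    tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
    return output.last_hidden_state[0, tokens.index(word)]

# The same preposition "to" in two queries that differ only in the
# direction of travel; a bag-of-words model would treat them alike.
a = embedding_of("brazil traveler to usa needs a visa", "to")
b = embedding_of("usa traveler to brazil needs a visa", "to")

# Cosine similarity below 1.0: the vector for "to" depends on the
# words around it, which is what "bidirectional" buys.
print(torch.cosine_similarity(a, b, dim=0).item())
```

The point of the sketch is simply that a stop word like “to” no longer has one fixed meaning for the model; its representation is computed from the whole query at once.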
Google will first introduce the BERT update for the English language, with more languages to follow. According to Google, the update should deliver significantly better results for one in ten search queries in the USA alone. That may sound like a small share, but it is an enormous step forward. You can find some examples in the blog post.
In addition to the normal search results, Google is also planning to use BERT to improve featured snippets, i.e. the highlighted short answers that occasionally appear above the regular results. According to Google, BERT will deliver better featured snippets in over 20 countries, with the company explicitly mentioning the Korean, Hindi, and Portuguese languages.
Even though Google sees the BERT update as a big step forward in understanding its users’ queries, the company is aware that many searches still don’t return the perfect result. Google concedes that language understanding still has a way to go, and that further improvements can be expected over time.