BERT Explained - What Google’s Latest Update Means for Search | Reflect Digital

In late October, Google announced its latest update, named BERT, which will allow Google to better understand search intent and return more relevant results to users.

Google has called it “the biggest leap forward in the past five years, and one of the biggest leaps forward in the history of Search”.

But what is BERT? How does it work? And what should marketers and SEOs do in response?

What is BERT?

BERT (or the slightly less catchy Bidirectional Encoder Representations from Transformers) is an algorithm-based technique created by Google to better understand and process natural language.

Thanks to machine learning, BERT allows Google to better understand the full context of a search. For example, it takes greater account of how words like “from”, “to” and “for” affect the surrounding words and what a searcher is actually looking for, rather than returning results focused only on the main keywords of a search.

An example used by Google is the query “2019 brazil traveler to USA need a visa”. Before BERT, the results displayed information for US travellers heading to Brazil. After BERT, Google Search now understands the importance of the word “to” in that query, and returns information specific to a traveller going from Brazil to the US.
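The difference is easy to see in a toy sketch (my own illustration, not Google’s actual pipeline): traditional keyword matching typically strips common “stopwords” such as “to”, which erases the direction of travel from the query entirely.

```python
# Toy illustration only - not Google's real ranking pipeline.
# Classic keyword matching drops stopwords like "to", so the
# direction of travel disappears from the query.
STOPWORDS = {"a", "to", "the", "for", "from"}

def keyword_terms(query):
    """Reduce a query to its bag of non-stopword keywords."""
    return {w for w in query.lower().split() if w not in STOPWORDS}

brazil_to_usa = "2019 brazil traveler to usa need a visa"
usa_to_brazil = "2019 usa traveler to brazil need a visa"

# Both directions collapse to the identical keyword set, so a purely
# keyword-based ranker cannot tell the two intents apart.
print(keyword_terms(brazil_to_usa) == keyword_terms(usa_to_brazil))  # True
```

BERT, by contrast, processes the whole sentence at once and in both directions, so the little word “to” keeps its meaning and the two opposite intents stay distinct.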

Simply put, BERT lets Google return more accurate results for more nuanced, complex and conversational queries.

What searches does BERT affect?

As of late October 2019, BERT affects 10% of searches in the U.S. in English. Google plans to roll out BERT to more countries and languages over time.

BERT impacts both standard organic listings and featured snippets at the top of search results. 

Because of how BERT operates, Google can take learnings from one language and apply them to another. BERT is also live for any language that supports featured snippets, and Google specifically states that BERT is improving featured snippets in Hindi, Korean and Portuguese.

What should marketers and SEOs do in response to BERT?

Danny Sullivan, Google’s public Search Liaison, has stated that there’s nothing to optimise for with BERT.

Fundamentally, it is not possible to optimise for machine-learning technology, or “a neural network-based technique for natural language processing (NLP) pre-training”, as Google describes it. 

Danny Sullivan has again reinforced that SEOs should continue to write content for users. This makes perfect sense, as BERT is technology that helps Google better understand searches and thereby return the content that best meets the needs of the user.

As BERT will make it easier for Google to return content that better matches more complex queries, pages that address a specific user need may improve in search visibility after previously being overshadowed by broader and/or more generic pages.

This, however, remains to be seen. For now, SEOs and marketers should continue creating content that meets their users’ needs.

MEET THE AUTHOR.

FRANKIE PLUMMER

Frankie is passionate about driving quality organic traffic that generates real revenue. He takes the lead on SEO strategy and implementation for a wide range of clients, harnessing on-page, technical and off-page SEO to improve organic visibility, traffic, leads and sales.

Frankie likes to collaborate with clients, ensuring he understands their business in detail, helping him to meet their goals. Frankie also offers hugely popular Search Engine Optimisation education and training, helping clients better understand SEO and make informed decisions about their digital marketing.

More about Frankie
Square Pink

Let’s talk.

It’s time to move the game on.

To find out how we can help you take your business to the next level, contact our friendly and expert team today.
