Google Algorithm Update: What You Need to Know

Staying up-to-date with Google can help you earn higher traffic

Gayle Kurtzer-Meyers


Photo by Mitchell Luo on Unsplash

What Do You Need to Know About the Google Algorithm Update?
Hello BERT
Last year, on October 25, 2019, Google rolled out its most significant algorithm update in five years: Google BERT. Before we unlock the mystery behind this latest Google algorithm update, let’s first take a cursory glance at Google’s journey.
“In old days, instead of asking a teacher, people looked at the dictionary to know the complete definition of teacher. Now Google becomes our teacher and to know about Google, people Google it.” - Munia Khan

How did Google’s search engine evolve?

Since the commercialization of the Internet in the 1990s, internet marketers and SEO experts have been locked in a never-ending battle: the competition to rank their websites on the first page of search engine results pages (SERPs).

After Yahoo made one of the costliest business decisions of all time by failing to buy Google, the latter became the leading search engine by the mid-2000s.

Google’s ascension to the top meant that all the existing marketing strategies were revised, and internet marketers tried to “game” the search engine. Generating good traffic on Google through these techniques earned them high profits. Over time, however, poor SEO practices caused a massive decline in the quality of SERPs.

Thus, Google recognized the problem at hand, took matters into its own hands, and made certain adjustments to its search engine algorithms. Google’s objective was crystal clear: show the best possible results when a user entered a search query, i.e., display links that addressed their problem, whether they were looking for a product, a service, or a piece of information. Here are some interesting turns that the tech giant took after 2000.

· 2003: Google cracked down on spam links by separating “good” links from “bad” links.

· 2010: Google embraced social media, incorporating social media signals into its algorithm.

· 2011: Google released the Panda update to penalize websites containing thin content and excessive ads.

· 2012: Google came up with the Penguin update and took action against keyword stuffing.

· 2015: Google announced the implementation of machine learning in its algorithm.

Since 2015, the writing has been on the wall: Google is prioritizing a long-term strategy that emphasizes semantic search. Semantic search determines a searcher’s intent by figuring out the context of a query, which improves search accuracy. Last year’s update, Google BERT, is a precursor to the tech giant’s future ambitions.

What is Google BERT?

In layman’s terms, Google BERT helps Google understand the words in a sentence (a user query). But former Google algorithms did the same, so what’s the notable change? Well, the difference is that BERT deciphers all the nuances of the content in a sentence.

BERT is the product of Google’s intensive R&D efforts in AI. Google researchers laid the foundation of BERT in 2018.

Earlier Google search algorithms processed the words in a search query from left to right, whereas Google BERT applies bidirectional processing to determine the links between words and phrases.

[Image: Comparison of Google Search results for a query before and after BERT. Source: Google’s search blog]

Consider the above image from Google’s blog. Before the advent of the latest Google update, the word “stand” in the query was matched against the term “stand-alone.” However, “stand” reflects a different user intent in each query. Observe how Google BERT understands this context and shows an appropriate response.

Google also noted that BERT has a better understanding of prepositions, such as “to,” in search queries.
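To make the left-to-right versus bidirectional distinction concrete, here is a toy Python sketch (illustrative only, not Google’s code) showing which context words each kind of model can “see” when representing the preposition “to” in the visa query that Google’s announcement highlighted:

```python
# A toy illustration: which context words are visible when a model
# builds a representation for the preposition "to" in a query.

query = "2019 brazil traveler to usa need a visa".split()
i = query.index("to")

# A left-to-right model only sees the words before "to":
unidirectional_context = query[:i]

# A bidirectional model like BERT sees the words on both sides:
bidirectional_context = query[:i] + query[i + 1:]

print(unidirectional_context)  # ['2019', 'brazil', 'traveler']
print(bidirectional_context)   # ['2019', 'brazil', 'traveler', 'usa', 'need', 'a', 'visa']
```

The direction of travel matters here: only the bidirectional view can tell that the traveler is going *to* the USA, not coming from it.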

A deeper overview of Google BERT

BERT is an acronym for Bidirectional Encoder Representations from Transformers. It’s an open-sourced neural network technique belonging to the AI discipline of natural language processing (NLP). BERT is an extension of earlier pre-trained contextual representations, such as ULMFiT and ELMo. What makes it stand out is that BERT meets the following criteria:

✓ Deeply bidirectional and unsupervised language representation

✓ Pre-trained through a plain text corpus

These criteria carry weight because pre-trained representations are either contextual or context-free, and contextual models are further split into bidirectional and unidirectional models.

Context-free models generate a single word-embedding representation for each word in the vocabulary, regardless of context. For instance, consider a user who searches for queries containing the following words/phrases:

· Bank account

· Bank of the river

As humans, we know that both look similar but are miles apart in actual meaning. Machines, however, are still far from Skynet-level capabilities, so with a context-free model such as GloVe, the representation of “bank” is the same in both “bank account” and “bank of the river.”

Contextual models improve on this by determining each word’s representation from the other words in the sentence. If a sentence says, “I opened a bank account,” a unidirectional contextual model evaluates the word “bank” using only the words that precede it, i.e., “I opened a.” The word that follows “bank,” namely “account,” isn’t evaluated. This is the limitation of the unidirectional contextual model.

Enter the fresh Google algorithm update, BERT, which goes one step further by using a “deeply bidirectional” approach to evaluating sentences. It reads the whole sentence, “I opened a bank account,” from both directions and produces a far more accurate representation.
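You can observe this behavior yourself with the open-source BERT model Google released. Here is a minimal sketch using the Hugging Face transformers and PyTorch libraries (my choice for illustration; this is the public research model, not Google Search itself), showing that BERT gives “bank” a different vector in each context:

```python
import torch
from transformers import AutoModel, AutoTokenizer

# Load the open-source pre-trained BERT model (not Google's search stack).
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

def bank_vector(sentence: str) -> torch.Tensor:
    """Return BERT's contextual vector for the token 'bank' in a sentence."""
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        outputs = model(**inputs)
    tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
    return outputs.last_hidden_state[0, tokens.index("bank")]

v1 = bank_vector("I opened a bank account.")
v2 = bank_vector("I sat on the bank of the river.")

# A context-free model like GloVe would score exactly 1.0 here; BERT scores
# noticeably lower because the two senses of "bank" get different vectors.
print(torch.cosine_similarity(v1, v2, dim=0).item())
```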

This begs an important question: if bidirectional models are so effective, why were computer scientists sleeping on them for so long?

Well, for that, let’s go back to the unidirectional models. It is reasonably simple to train them to predict a word from the words that precede it. A similar approach isn’t viable with bidirectional models, however, because the target word (the one to be predicted) would indirectly “see itself” in a multi-layer model that looks at words from both sides.

A masking technique resolves this dilemma. It masks a specific word in the input, and a bidirectional approach is then used to predict that word. For instance, consider the following sentences.

John went to a [MASK #1]. He studied [MASK #2] and began a career as a web developer.

Here, the masked words are “university” and “computer science.” Google BERT predicts them by modeling the relationship between the two sentences, using the full context to guess the right words. In a way, it’s similar to how humans think and guess, and that is what a neural network is all about: mimicking the human brain.
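Because the model is open-sourced, you can try masked-word prediction directly. Below is a short demo using the Hugging Face transformers fill-mask pipeline (an illustration of the technique, not Google’s internal system), adapted to a single [MASK] for simplicity:

```python
from transformers import pipeline

# Masked-word prediction with the open-source BERT model.
fill_mask = pipeline("fill-mask", model="bert-base-uncased")

sentence = ("John went to a [MASK]. He studied computer science "
            "and began a career as a web developer.")

# Print BERT's top guesses for the masked word, with confidence scores.
for prediction in fill_mask(sentence):
    print(f"{prediction['token_str']:>12}  score={prediction['score']:.3f}")
```

Typical top guesses are plausible places such as “university” or “college,” though the exact scores depend on the model version you download.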

Historically, experts had sketched out these ideas earlier, but what transformed them into a working solution was the development of Cloud TPUs. The Transformer model architecture was another vital piece of the puzzle. Unsurprisingly, Google developed both after 2015.

How did Google BERT affect websites in real time?

BERT’s launch sparked sudden fluctuations in SEO analytics, affecting metrics such as site traffic, page ranking, and keyword performance. Here’s the analytics view of a nutrition and supplement review website that received higher traffic in the post-BERT period.

[Image: A Twitter screenshot showing Google BERT’s impact on a site’s traffic. Source: Twitter]

If you ended up on the wrong end of the stick and your website traffic was negatively affected, here’s what may have gone wrong:

· Your keyword research was good, but your content quality was low and didn’t resonate with your target audience.

· A competitor beat you with a more effective keyword strategy.

· Your content wasn’t a good match for informational searches.

· Your website was well optimized for search engines but didn’t align well with your audience.

How to optimize your website for Google BERT

According to Google, there’s a strong correlation between BERT and informational searches. Therefore, from now on, a better strategy would be to optimize your website for humans. If you go through the history of Google algorithm updates, it’s quite apparent that Google intends to eliminate SEO shortcuts that were rampant in the past two decades. During all these changes, Google never penalized brands that invested in content quality. Instead, it’s continuously trying to promote original and rich content.

As for the SEO experts, it’s time to dive into entity-based SEO. Google defines an entity as anything that is:

✓ Singular

✓ Unique

✓ Well-defined

✓ Distinguishable

Entity-based SEO works on the basis of contextual associations between words, phrases, and sentences. For instance, in the statement “Java is dead,” both “Java” and “dead” are entities, and a relationship binds them together.
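If you want to experiment with entity extraction yourself, one accessible option is the open-source spaCy library (my choice for illustration; Google doesn’t disclose its own entity tooling). A minimal sketch:

```python
import spacy

# Requires: pip install spacy && python -m spacy download en_core_web_sm
nlp = spacy.load("en_core_web_sm")

doc = nlp("Google rolled out BERT in October 2019 to improve search results.")

# Print every entity the small English model recognizes, with its label.
for ent in doc.ents:
    print(ent.text, ent.label_)
```

Which phrases get recognized, and with which labels, depends on the model; the point is that entities and their relationships, not raw keywords, become the unit of analysis.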

Final thoughts

While trying to “game” Google, many online marketers fail to grasp a critical point: there’s no need to take shortcuts. Embracing the changes in recent Google algorithm updates will ultimately improve your user experience and boost your conversion rates.

“Chase your dreams. Design your life in a way that people will have to search you on google and not social media.” - Nicky Verd
