What is BERT SEO and Natural Language Processing (NLP)?

Due to BERT, Natural Language Processing (NLP) is the new must-have ingredient in every mobile SEO campaign.

You may have heard the proverb “your words determine your destiny”. Likewise, in Search Marketing, how we use words on a page matters. Google’s BERT offers insight into how it organizes search results. It illustrates how the relationship between word entities and the use of language is shaping the future of SEO.

Mobile-First Indexing reinforces Google’s increased reliance on the searcher’s physical location (GPS / Google Maps). Google also knows the searcher’s language settings on their phone and within their Google account. This information helps shape the search results that are returned and makes them more personalized.

As soon as Google released its statement announcing the BERT-influenced search algorithm, we knew it merited time and attention. Since it aims to improve the interpretation of complex long-tail search queries and display more relevant search results, it is the biggest search update in five years. Only after internalizing what the words BERT encompasses actually mean can we do search marketing better.

Here is what we have arrived at as of today.

What is BERT?

BERT is an acronym for Bidirectional Encoder Representations from Transformers. It is a neural network-based technique for natural language processing pre-training. In simpler terms, it helps search engines better decipher the context of words in search queries.

BERT models exceed previous capabilities. They evaluate the full context of a word by looking at the words that come both BEFORE and AFTER it. This lets the model gather context that is particularly useful for understanding the intent that prompted the search query.
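To make the idea concrete, here is a toy sketch (plain Python, not BERT itself, and the keyword lists are made-up assumptions) of how looking at the words on both sides of a term can disambiguate it:

```python
# Toy illustration (not BERT): disambiguate a word by inspecting context on
# BOTH sides, the way a bidirectional model can.
def disambiguate_bank(tokens, index, window=3):
    """Guess the sense of 'bank' from surrounding words, left AND right."""
    left = tokens[max(0, index - window):index]
    right = tokens[index + 1:index + 1 + window]
    context = set(w.lower() for w in left + right)
    if context & {"river", "water", "shore"}:
        return "riverbank"
    if context & {"money", "deposit", "loan"}:
        return "financial institution"
    return "unknown"

tokens = "I walked to the bank to deposit my money".split()
print(disambiguate_bank(tokens, tokens.index("bank")))  # financial institution
```

A real bidirectional model learns these contextual signals from vast amounts of text rather than from hand-written word lists.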

What is a Google entity?

Google was granted Patent No. US 9,477,759 B2 on Oct. 25, 2016. It defines a Google entity as: “A thing or concept that is singular, unique, well-defined and distinguishable.” The patent also addresses Google’s question-answering abilities, which rely on entity references in unstructured data, and explains how Google finds relevant information within web pages.

It’s important to understand that an entity does not need to be a physical object; it can also be a color hue, a date, a fragrance, and more. An entity is anything that is singular. For example, an entity may be an individual, a physical building, a geo-location, a product item, a sound, a cognitive idea, an abstract concept, a factual element, or other things in existence. It can also be any combination of these, which aids how Google crawls and indexes a site.

What is entity recognition?

Entity recognition is what elevates search from strings to things. It helps the Google Search Appliance to identify and classify interesting entities in documents and then store those entities in its library or search index. A term can be the name of an entity, which is thereafter referred to as a “named entity”. Using NLP, SEOs can enrich metadata-poor content and increase their chances of winning rich results.
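As a simplified sketch of the idea (nothing like Google’s production systems; the lookup table is a made-up assumption), a minimal gazetteer-based recognizer might look like this:

```python
# A minimal, gazetteer-style named entity recognizer -- a simplified sketch.
# The lookup table below is an illustrative assumption.
GAZETTEER = {
    "google": "Organization",
    "minneapolis": "Location",
    "bert": "Technology",
}

def recognize_entities(text):
    """Return (term, entity_type) pairs for known terms found in the text."""
    found = []
    for token in text.lower().replace(",", " ").replace(".", " ").split():
        if token in GAZETTEER:
            found.append((token, GAZETTEER[token]))
    return found

print(recognize_entities("Google announced BERT at an event in Minneapolis."))
```

Real systems classify unseen entities from context as well, rather than relying on a fixed dictionary.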

What is entity-query matching?

Entity-Query matching expands Google’s advancing content perception abilities. How to use NLP in on-site optimization has become a top question in 2020. Start every content marketing campaign with extensive market research. This will help you understand the context that search engines may associate with user queries on each topic.

How to know which pages are credible enough to see results with NLP optimization?

If you already have quality content on a topic and rank for it, you have better chances. Find pages that have weak content but may be ranking due to inbound links, and start by improving those pages. Your content will serve real viewers’ needs better if it is written in natural language.

What is context-based Natural Language Processing?

Context-based Natural Language Processing starts with how search engines try to identify contexts that may be used to match particular user requests. It involves processing such user requests by leveraging a Dialog System Engine that relies on those contexts. These typically reference speech contexts that source a sequence of user requests or dialog system answers, which are stored and categorized.

How can Search Marketers use NLP to Drive Revenue?

Structured data, when implemented correctly and kept current, reflects what your content is about. For eCommerce sites, include contextual content that can answer search queries with purchase intent. Avoid NLP entities that may carry a negative implication, such as “depressed” or “mad”. Seeing the big picture, rather than just the individual values, will help you decipher granular differences between entities.

Search results pages, known as SERPs, feature extremely popular elements. The right content can help your business surface in features such as Google’s Local Packs and featured snippets. By adopting a BERT-focused mindset, your business can win client calls even from zero-click searches. Adding semantic processing to your content publishing workflow means using natural language processing to add useful, semantically structured information that describes your content. Numerous search strategies can include NLP to improve SEO and user engagement.

The natural-language processing field is growing in huge strides. Combined with machine learning and driven by deep-learning techniques, word-usage patterns emerge. Entities and nodes produce content structure from vast troves of text. Google then chooses how it wants to use them.

Google Knowledge Graphs display related entities. A knowledge panel of a widely-known person may display a connection between that individual, as fits your search query, and another individual. By clicking on “See the connection”, more information about both individuals will populate. To experiment yourself, search for your favorite celebrity and then select a suggested celebrity within the “People Also Search For” SERP results. This is one way to trigger these related entity nodes.
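The related-entity behavior can be sketched as a small graph search. The entity names and relations below are hypothetical placeholders, not real Knowledge Graph data:

```python
from collections import deque

# Toy sketch of related-entity nodes (hypothetical data): find the chain of
# connections between two entities, loosely like "See the connection".
RELATED = {
    "Actor A": {"Film X", "Actor B"},
    "Actor B": {"Film X", "Film Y"},
    "Actor C": {"Film Y"},
}

def connection(graph, start, goal):
    # Build an undirected adjacency map from the entity relations.
    adj = {}
    for node, neighbors in graph.items():
        for n in neighbors:
            adj.setdefault(node, set()).add(n)
            adj.setdefault(n, set()).add(node)
    # Breadth-first search returns the shortest chain of entity nodes.
    queue, seen = deque([[start]]), {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for nxt in adj.get(path[-1], ()):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None

print(connection(RELATED, "Actor A", "Actor C"))
```

The interesting part is that language plays no role in the graph itself; the entity nodes and their edges carry the meaning.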

Conversational AI chatbots and voice assistants rely on state-of-the-art natural-language models to answer people’s questions when a business is not available. These models can be trained to facilitate a consumer-centric approach to answering frequently asked questions.

Does every business need to care about Google search entities?

Yes, those who want to get found in organic search. The goal of how you use entities in your content should include motivating people to take actions that directly relate to your business goals.

Accurately representing your business name and entities helps customers find your business online. This starts with a completely optimized Google My Business listing with all entities filled in.

Most people would probably never think of it this way. The microphone on your phone is always listening. In the background, machine learning assesses what it hears from the entities it picks up. Recent versions of the iPhone and Android can show you the distance to the last place they heard you speak, or where to see the movie last announced in a radio or TV ad you heard. What else is it listening to?

Cortana, installed on my computer, has recently said to me several times, “I can learn to understand you much better if I can get familiar with the way you talk.” However, I wasn’t talking to it, nor did I think it was “listening”.

“In the past few months, we are seeing more and more evidence that Google’s Mobile-First Indexing is not just a change of the primary crawler, but a major shift in Google’s strategy for organizing information and processing queries. The relationship between languages and entities in Mobile-First Indexing, ‘Entity-First Indexing,’ can not be overstated.” – Cindy Krum of Mobile Moxie

What are the best ways to use words and phrases in content?

Words in your content become pairs of entities.

Words in web content should flow in the same manner as people speak, yet they need to be semantically correct in the terms that BERT is looking for. This will help you achieve the most comprehensive results in SERPs.

Look for single entities that best relate to your head search phrase. Check how to use entities correctly to add context value to readers. Currently Google has the best datasets to help algorithms learn.

SEOs who formerly lacked business acumen are today forced to understand business strategies and develop holistic digital marketing skills. The words and search phrases used in your content and communications matter. Every sentence should be useful and matter.

The Importance of a Natural Language Processing Model

Why would you use the NLP API Processor to improve entity-query matching?

The benefits of moving toward a more natural language approach are basically twofold.

  • It makes your content easier for people to read and act on.
  • It makes your content easier for search engines to understand and catalog.

Back to the QUESTION ANSWERING USING ENTITY REFERENCES IN UNSTRUCTURED DATA patent. It tells us that an entity is basically what people search for on Google. Embracing NLP can be huge for e-Commerce sites. Implement and test your product markup to help organize your landing pages in a predefined manner.
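As a hedged sketch of what product markup might look like, here is Python that assembles a schema.org Product JSON-LD block; the product name, description, and offer values are hypothetical:

```python
import json

# Build a schema.org Product JSON-LD block (the product values below are
# hypothetical, for illustration only).
product = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Example Medical Device",
    "description": "A hypothetical product used to illustrate markup.",
    "offers": {
        "@type": "Offer",
        "priceCurrency": "USD",
        "price": "149.99",
        "availability": "https://schema.org/InStock",
    },
}

# Emit the markup as it would appear inside a
# <script type="application/ld+json"> tag on the landing page.
print(json.dumps(product, indent=2))
```

Always validate markup like this with Google’s testing tools before relying on it.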

Consider what the above patent states:

“Ranking the one or more entity references based on the respective weighted sums; selecting an entity result from the one or more entity references based at least in part on the ranking of the one or more entity references; and providing an answer to the query based at least in part on the entity result.”

How the Google Assistant uses Natural Language Processing

Another Google Patent is more focused on processing mobile conversational search queries.

Google Assistant is an artificial intelligence-powered virtual assistant that can engage in a personalized conversation with a searcher. It is out-performing all others at correctly responding to conversational search queries. It relies in part on context from previous search sessions.

A more recent Google patent, granted November 19, 2019, is more directly about the Google Assistant. It is titled Context-based Natural Language Processing and informs us about natural language processing (NLP) methods that involve multi-modal processing of user requests. It was timed with the announcement that Google is using BERT to understand natural language more efficiently.

The technology giant is seeking to overcome its challenges when faced with query requests that “are not reasonably understandable if taken alone or in isolation”. Pandu Nayak, Google Fellow and Vice President of Search, published an article on Oct 25, 2019 titled Understanding searches better than ever before. He said, “We see billions of searches every day, and 15 percent of those queries are ones we haven’t seen before–so we’ve built ways to return results for queries we can’t anticipate”.

With the BERT algorithm, even small words can now help search intent be better understood.

“The word ‘to’ and its relationship to the other words in the query are particularly important to understanding the meaning. It’s about a Brazilian traveling to the U.S., and not the other way around. Previously, our algorithms wouldn’t understand the importance of this connection, and we returned results about U.S. citizens traveling to Brazil. With BERT, Search is able to grasp this nuance and know that the very common word “to” actually matters a lot here, and we can provide a much more relevant result for this query.” – Google

BERT Provides a Better Understanding of Words and Language

“Language understanding is key to everything we’re doing on search,” said Pandu Nayak. “This is the single biggest, most positive change we’ve had in the last five years.”

Google may decipher user requests better by identifying a speech type, entity nodes, or an environmental context that goes along with the user’s search. You can use these entity insights to demonstrate your industry niche expertise. The patent references “dialog systems” and mobile applications.

“Conventional dialog systems are widely used in the information technology industry, especially in the form of mobile applications for wireless telephones and tablet computers. Generally, a dialog system refers to a computer-based agent having a human-centric interface for accessing, processing, managing, and delivering information. Dialog systems are also known as chat information systems, spoken dialog systems, conversational agents, chatter robots, chatterbots, chatbots, chat agents, digital personal assistants, automated online assistants, and so forth.” – Patent US20160259775A1

Now that is a comprehensive list of communication entities that fall under “dialog system”! The patent explains how Google Assistant is designed to connect with searchers via NLP.

“A dialog system interacts with its users in natural language to simulate an intelligent conversation and provide personalized assistance to the users. For example, a user may generate requests to the dialog system in the form of conversational questions, such as “Where is the nearest hotel?” or “What is the weather like in Alexandria?,” and receive corresponding answers from the dialog system in the form of audio and/or displayable messages. The users may also provide voice commands to the dialog system…” – Patent US20160259775A1

Now that we have a foundational grasp of how Google Search uses NLP, it’s fascinating to consider how this may influence our approach to content creation and SEO.

There are multiple ways that NLP can be used to improve SEO and user engagement.

1. Ensure that your Website and Content are Topic Relevant

2. Focus on Semantic Content

3. Use Ontology Categorization and Relatedness in Content Connections

4. Use Google AutoML to Help Your Business Find Entities

5. Update Older Posts that are Lagging or Outdated

6. Plan for the Searcher who Relies on Voice-Activated Searching

7. Be Willing to Experiment with Structured Data

Now for a deeper look at the ways that Hill Web Marketing is using natural language processing in tandem with semantic web technologies and machine learning. In this way, we gain a true SEO advantage for eCommerce sales.

1. Ensure that your Website and Content are Topic Relevant

Oddly, many websites ramble on without a clear focus on a particular topic. If you want to sell ‘healthcare instrument innovations’, make sure that your website is relevant to the topic ‘medical devices’ and everything that is related and useful.

High-scoring web pages do more than just provide sales copy or direct answers to questions. They also contain supporting information. Many times one answer surfaces another question from the reader. Provide related answers and anticipate their needs. Include information you know they will need – and haven’t thought of before.

Google wants to provide solutions on the web; that is, “correct and relevant solutions” that are easy for readers to understand. A structured approach to your content creation combined with structured data markup will help a lot.

If you provide these needed answers and solutions in your content, you can expect the authority of your domain to grow. It also increases your chances of gaining a Google Rich Card.

2. Focus on Semantic Content

Semantic knowledge mapping displays related words grouped into categories.

Current Google algorithms rely more on contextual language than on verbatim keywords. Since the recent algorithm update, Google focuses on the whole context of a searcher’s query. By being the best answer to the right questions, your business will provide the most value.

Check your content length against your searcher’s intent. In-depth informational content meets one need; a quick answer meets another.

By conducting a semantic analysis of your natural language content, you can locate all of the words in your content that capture the real meaning of your text. Now you can further identify which text elements to assign to their logical and grammatical role. In this way you can build relationships between different concepts in your text that align with BERT.

The August 8, 2019 Semantic Knowledge Matching Based on BERT for Dispatching Fault Disposal Information article reveals how vast the BERT model is. Jixiang Lu and Tao Zhang say that “BERT is a pre-trained model based on massive Wikipedia data (2,500M words) and BookCorpus data (800M words)”.

It is able to garner contextual word representations. For one example, consider all that populates on Google Maps. It uses real-world entities that may have multiple properties associated with them.

3. Use Ontology Categorization and Relatedness in Content Connections

Winning content is also highly succinct, with more factual content written by authoritative sources. It is also engaging. When people can quickly and emotionally relate to your messages, it’s endearing. Focus on solving deeper human problems. SEOs can embrace Artificial Intelligence to improve the user experience and provide clearly worded solutions. B2B content also needs compelling conversion triggers and smart UX design.

Google is invested in building a knowledge-based library of concepts. This helps the technology giant better understand things such as what different businesses or entities are ‘Known for’, and define the connected relationships between entities. Web pages for specific entities may gain top positioning in search results when user engagement history indicates that search intent may include that entity within a query.

Your search strategy may need to include both organic search and paid advertising to reach digital shoppers.

4. Use Google AutoML to Help Your Business Find Entities

Check it out for yourself. Google tells us how AutoML Natural Language works:

1. Upload your documents. Label text based on your domain-specific keywords and phrases.

2. Train your custom model. Classify, extract, and detect sentiment.

3. Evaluate. Get insights that are relevant to your specific needs.

Google’s Natural Language API discerns syntax, entities, and sentiment in text, and organizes text into a predefined set of categories. If your content is meant for news articles or scholarly work, or you want to uncover the sentiment of your examples, the Natural Language API may be worth trying. In addition, we like having the ability to use our own labels via a custom classifier.
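For illustration, here is a sketch that ranks entities by salience from a dict shaped like the API’s analyzeEntities output; the sample payload and its scores are made up for the example:

```python
# Rank entities by salience from a response shaped like the Natural Language
# API's analyzeEntities output. The payload below is an illustrative sample,
# not a real API response.
sample_response = {
    "entities": [
        {"name": "BERT", "type": "OTHER", "salience": 0.62},
        {"name": "Google", "type": "ORGANIZATION", "salience": 0.31},
        {"name": "search", "type": "OTHER", "salience": 0.07},
    ]
}

def top_entities(response, limit=2):
    """Return the most salient (name, salience) pairs from the response."""
    ranked = sorted(response["entities"], key=lambda e: e["salience"], reverse=True)
    return [(e["name"], e["salience"]) for e in ranked[:limit]]

print(top_entities(sample_response))  # [('BERT', 0.62), ('Google', 0.31)]
```

Salience is the API’s estimate of how central an entity is to the text, which makes it a handy signal for checking whether your page emphasizes the entities you intend.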

5. Update Older Posts that are Lagging or Outdated

Existing walls of “traditional SEO copy-writing” that few readers ever consume need to crumble. Knowledge Graphs, entities, and natural language are all vital. They have a key place in evergreen content along with acknowledging how people are moving from written text to the quick gratification of visual media and video content.

A purely technical writer may lack topic knowledge or passion for what they are writing. That makes it harder to write with the best tone and choice of word phrases that people easily relate to. Know what search data can tell you about writing better content.

Then you’ll need an in-depth technical SEO audit to ensure there are no glitches that hold your content back.

The best SEO content strategies start simple:

Here is a script for extracting entities from your output.
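A rough, heuristic version (a stand-in for a real NLP pipeline, with the obvious limitation that sentence-initial words produce false positives) could look like this:

```python
import re
from collections import Counter

# A simple heuristic entity extractor: pull capitalized phrases out of text
# and count how often each appears. Sentence-initial words will produce
# false positives; a real NLP pipeline handles this properly.
def extract_entities(text):
    phrases = re.findall(r"\b[A-Z][a-zA-Z]+(?: [A-Z][a-zA-Z]+)*", text)
    return Counter(phrases)

sample = "Google released BERT. BERT helps Google Search understand queries."
print(extract_entities(sample).most_common())
```

Running it over your drafts gives a quick picture of which candidate entities your content actually emphasizes.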

6. Plan for the Searcher who Relies on Voice-Activated Searching

Numerous technologies may empower the Google Assistant. We can draw clues from Google about its use of BERT to match answers to questions. Using NLP supports better contextual search based on speech patterns and environmental contexts.

Voice-activated searches are a natural way for people to discover and purchase goods. This means that marketers should adjust their content strategy and SEO efforts to best align with how Google Search is evolving. If the searcher’s mindset and search preferences are to rely on a voice assistant app, this changes their search behavior from text input to spoken input.

During fine-tuning, BERT has a “maximum sequence length for each sentence is set to 128 and maximum prediction per sequence as 20”, according to Tao Zhang. So we can see how it manages longer search phrases better. This fits voiced search queries, as they tend to be longer, like natural sentences.
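As a toy illustration of that sequence cap (using naive whitespace tokenization rather than BERT’s WordPiece), a sketch might truncate input like this:

```python
# Toy illustration of the fine-tuning constraint quoted above: token
# sequences are capped at a maximum length (128), with two slots reserved
# for BERT's special [CLS] and [SEP] markers. Tokenization here is naive
# whitespace splitting, not BERT's WordPiece.
MAX_SEQ_LEN = 128

def prepare_sequence(text, max_len=MAX_SEQ_LEN):
    tokens = text.split()
    tokens = tokens[: max_len - 2]  # reserve room for [CLS] and [SEP]
    return ["[CLS]"] + tokens + ["[SEP]"]

long_query = "word " * 200  # 200 tokens, longer than the cap
seq = prepare_sequence(long_query)
print(len(seq))  # 128
```

Even long voiced queries fit comfortably under that cap, which is part of why BERT handles natural-sentence queries well.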

Voice search will grow, especially for the masses of mobile users who want hands-free, on-the-go convenience. They think fast and expect an efficient user experience. In anticipation of this explosive mobile search trend, BERT handles complex natural language queries better than anything to date.

7. Be Willing to Experiment with Structured Data

While Speakable markup is still in beta, it is something you can try. If you identify important questions that are commonly voice-activated, Speakable Schema markup could audibly answer back. As with other SEO strategies, there are no guarantees. A skilled search marketer will have their own experience from trying, testing, and tweaking. They can conduct a comprehensive site audit to find ways to improve your search strategies.
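As a hedged example, here is Python that assembles a Speakable markup block; the page name, URL, and CSS selectors are hypothetical placeholders:

```python
import json

# Sketch of a schema.org "speakable" block. The markup is in beta, and the
# page name, URL, and CSS selectors below are hypothetical placeholders.
speakable_markup = {
    "@context": "https://schema.org",
    "@type": "WebPage",
    "name": "What is BERT SEO?",
    "speakable": {
        "@type": "SpeakableSpecification",
        "cssSelector": [".summary", ".faq-answer"],
    },
    "url": "https://www.example.com/bert-seo",
}

# This JSON would sit inside a <script type="application/ld+json"> tag.
print(json.dumps(speakable_markup, indent=2))
```

The cssSelector (or an xpath) points voice assistants at the sections of the page best suited to be read aloud.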

Google algorithm updates impact SEO best practices, so it’s best to be flexible and learn quickly.

Use as much structured data markup as fits your context. There are multiple SEO tools built to extract entities. These entities, along with their unique identifiers, may be used to help describe your content to search engines.

Search engines seek to understand the entities that appear on your pages, their relationships to other entities, their connected relationships to attributes (properties) about those entities and the relationships to classifications of those entities. Your site’s architecture, ontologies, and structured data all help.

Named Entity Recognition in Query Google Patent No. US9009134B2

Its application status is active as of today, February 11, 2020. The patent further explains part-of-speech tagging in computer science and sentiment analysis.

What is Query Segmentation?

This patent answers and defines what query segmentation is: it typically refers to segmenting a query into units of a smaller size. It tells us that “Often there may be limitations on the types of segmented units possible, resulting in limited functionality in the method. Syntactic parsing generally focuses on identifying the linguistic structure of a query. Query classification generally falls into two groups: classification according to search intent, such as informational, navigational or transactional; and classification according to the semantics of a query.”

This could be applied to “purchasing” or “buyers”. Both forms of query classification consider the whole query. Once classified, there are usually no further diagnostics on the internal structure of the query.
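A minimal sketch of the first group, intent-based classification, might use keyword sets like these (the word lists are illustrative assumptions, not Google’s signals):

```python
# A minimal sketch of intent-based query classification (informational /
# navigational / transactional). The keyword sets are illustrative
# assumptions, not Google's actual signals.
TRANSACTIONAL = {"buy", "price", "order", "purchase", "deal"}
NAVIGATIONAL = {"login", "homepage", "website", "official"}

def classify_query(query):
    words = set(query.lower().split())
    if words & TRANSACTIONAL:
        return "transactional"
    if words & NAVIGATIONAL:
        return "navigational"
    return "informational"

print(classify_query("buy medical devices online"))          # transactional
print(classify_query("what is natural language processing")) # informational
```

Note that, as the patent observes, this treats the whole query as one unit and says nothing about its internal structure.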

We also learn more about how it seeks to overcome ambiguous topic entities and challenges when employing an unsupervised learning method.

“Since the topics in a model may be predefined, and the possible topics of a document may be given, a new method for learning a topic model, referred to as Weakly Supervised Latent Dirichlet Allocation (WS-LDA), may be employed.

For each word:

a. Draw topic assignment z_n ~ Multinomial(θ)

b. Draw word w_n ~ Multinomial(β_z_n), a multinomial distribution conditioned on topic z_n.”
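The generative draws quoted above can be sketched in a few lines; the topic mixture and word distributions below are made-up toy values:

```python
import random

# Sketch of the generative draws quoted above: for each word position, draw
# a topic z_n from Multinomial(theta), then a word w_n from
# Multinomial(beta_zn). The distributions below are made-up toy values.
random.seed(42)

theta = {"travel": 0.7, "finance": 0.3}  # document-level topic mixture
beta = {                                 # per-topic word distributions
    "travel": {"flight": 0.5, "hotel": 0.5},
    "finance": {"loan": 0.6, "bank": 0.4},
}

def draw(dist):
    """Sample one outcome from a {value: probability} distribution."""
    outcomes, weights = zip(*dist.items())
    return random.choices(outcomes, weights=weights)[0]

words = []
for _ in range(5):
    z_n = draw(theta)       # a. draw topic assignment
    w_n = draw(beta[z_n])   # b. draw word conditioned on the topic
    words.append(w_n)
print(words)
```

The “weakly supervised” part of WS-LDA comes from constraining these draws with known possible topics, which this toy sketch omits.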

We learn that one data source for matching entities may be “query log data such as a query log from a commercial web search engine”, or something similar. The patent offers example forms of implementing various embodiments to sort entities from computer-readable media. It provides a partial list which includes: “RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information”.

Machine translation help for entity extraction can be found in open-source tools on the web. It’s exciting to jump into custom entity modeling – especially because entity understanding helps us communicate better with real consumers. Cindy Krum says this best, so I’ll quote her again.

“Overall, entities provide Google a better and deeper understanding of topics because they give Google the ability to easily develop connection and relationships between different topics (entities). Deeper understanding of an Entity and its relationships, in turn, gives Google the opportunity to potentially serve information about the Entity in any language (with live translation from the Google language APIs if necessary), since now the language has only a supportive role for the query – like a modifier. Whatever Entity Understanding and Entity Relationships Google learns in one language can automatically be translated to other languages, especially in Google-hosted, position-zero results like the Knowledge Graph.”

The Google BERT algorithm tries to find the right connections between your published pages and the topics they identify. You can help Google find the right “entities” on your site.

New datasets can serve as benchmarks. True common-sense NLP technology incorporates new techniques, like structured knowledge models.

In human-to-human conversations, people can easily understand the questions asked and select the correct answer to respond with. Body posture, gestures, and tone of voice all make it clear what the conversation is really about. NLP is helping machines get better at doing the same. As I learn and experiment more, I will update this article to make it more useful. Your journey using NLP and your comments are welcome.

Find new ways to enrich your content.

Request your Schema markup Audit by a Minneapolis Pro Search Provider
