BERT infers questions in mobile Featured Snippets
I recently googled “Surveymonkey reviews” on my mobile phone.
At first, you see the normal ads…
But then, you see a double featured snippet, each headed by a question, even though the search query is simply “surveymonkey reviews”.
First, I noticed that Google shows the questions and double featured snippet only on mobile devices, not on desktop…
But when I looked at the result pages themselves to see how they answered the questions, I couldn’t find the questions anywhere. Google adds them automatically; they’re not in the content.
The content does include the answers Google shows, though:
Google is cooking something up. It seems to somehow know which questions people ask most about Surveymonkey… or any brand.
I think those questions come from People Also Ask (PAA) boxes. When I looked at the questions showing up for “surveymonkey”, guess what I found?
Correct! I found two of the featured snippet questions in the brand query PAAs.
Questions as a content staple
I think it’s time for brands to take PAAs seriously and create content for each one of them. It might seem a bit weird to write for a question like “is [my product] legit?”, and it could be that Google doesn’t even want to show the brand’s own answer. But I also think nobody is trying this yet, and somebody should.
Questions like “how much does [my product] cost?” or “is [my product] free?” are easy to answer.
I would include an FAQ section on the “about” page for that kind of content and address larger questions in full blog articles.
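If you mark that FAQ section up with structured data, Google can parse the question/answer pairs directly. Here is a minimal sketch of schema.org FAQPage markup in JSON-LD; the questions and answer texts are placeholders you would swap for your own:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "FAQPage",
  "mainEntity": [
    {
      "@type": "Question",
      "name": "Is [my product] free?",
      "acceptedAnswer": {
        "@type": "Answer",
        "text": "Yes, [my product] has a free plan; paid plans add more features."
      }
    },
    {
      "@type": "Question",
      "name": "Is [my product] legit?",
      "acceptedAnswer": {
        "@type": "Answer",
        "text": "A short, factual answer addressing trust signals goes here."
      }
    }
  ]
}
</script>
```

Each PAA question you target becomes one more `Question`/`acceptedAnswer` pair in the `mainEntity` array.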
One of the tools I like to use for that is alsoasked.com. It shows you all the PAA questions it finds so you can include them in your content.
If you look closely, alsoasked.com also surfaces the two questions that Google shows in the Featured Snippet!
BERT at its finest
So how does Google come up with these questions? BERT!
In fact, Google mentioned it in the introductory blog post (emphasis mine):
Last year, we introduced and open-sourced a neural network-based technique for natural language processing (NLP) pre-training called Bidirectional Encoder Representations from Transformers, or as we call it–BERT, for short. This technology enables anyone to train their own state-of-the-art question answering system.
https://blog.google/products/search/search-language-understanding-bert/
So improving question-answering systems is a core function of BERT.
Wikipedia tells us a bit more about them:
Question answering (QA) is a computer science discipline within the fields of information retrieval and natural language processing (NLP), which is concerned with building systems that automatically answer questions posed by humans in a natural language.
https://en.wikipedia.org/wiki/Question_answering
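To make the idea of an extractive QA system concrete, here is a deliberately naive sketch in Python: it scores each sentence of a passage by word overlap with the question and returns the best match. Real systems like BERT use learned contextual representations instead of raw word overlap; the function name, stopword list, and sample passage below are all illustrative assumptions, not anything Google actually runs.

```python
import re

def answer(question: str, passage: str) -> str:
    """Pick the passage sentence with the most word overlap with the question.

    A crude stand-in for extractive QA: real systems (e.g. BERT-based models)
    score candidate answer spans with learned embeddings instead.
    """
    stopwords = {"is", "the", "a", "an", "of", "to", "how", "what", "does", "do"}
    q_words = set(re.findall(r"\w+", question.lower())) - stopwords
    # Split the passage into sentences at ., !, or ? followed by whitespace.
    sentences = re.split(r"(?<=[.!?])\s+", passage.strip())

    def score(sentence: str) -> int:
        # Count how many non-stopword question words appear in the sentence.
        s_words = set(re.findall(r"\w+", sentence.lower()))
        return len(q_words & s_words)

    return max(sentences, key=score)

passage = (
    "SurveyMonkey is an online survey tool. "
    "The basic plan is free and allows up to ten questions per survey. "
    "Paid plans add custom branding and data exports."
)
print(answer("Is the basic plan free?", passage))
# → The basic plan is free and allows up to ten questions per survey.
```

The gap between this toy and a real QA model is exactly what BERT closes: understanding that “cost”, “pricing”, and “free” belong to the same question, even when the exact words don’t match.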
It gets better. Google calls Featured Snippets out specifically.
Well, by applying BERT models to both ranking and featured snippets in Search, we’re able to do a much better job helping you find useful information. In fact, when it comes to ranking results, BERT will help Search better understand one in 10 searches in the U.S. in English, and we’ll bring this to more languages and locales over time.
Same blog article as mentioned above.
In a nutshell, what we’re seeing is BERT at its finest. And I expect to see way more of that. Now, go and create content for PAAs!