Tech Bound #43: How to improve the E-A-T of a site


The 08/01 Google algorithm update & optimizing for E-A-T

Around August 1st, Google rolled out a massive core algorithm update that had severe impacts, both positive and negative.
What do we know about the Medic update?
The first voices raised said the update mostly hit sites in the health sector, naming it the “medic update”. While it is true that health was the category impacted the most, we think it was about YMYL sites. “Your money or your life” sites often handle sensitive information and transactions in verticals like finance, insurance, and health. The update even seems to have affected e-commerce sites and blogs selling and pitching YMYL products. Research by Dr. Peter Meyers from Moz confirms “the multi-day update impacted a wide range of verticals”.
SEMrush’s Sensor indicates that this update was not targeted at a specific vertical or industry. When we compare Law & Government with Arts & Entertainment, for example, we see that both graphs show volatility around August 1st:
(SEMrush Sensor: volatility across categories around August 1st)
Also check out the deviations feature of Sensor. It shows you whether the fluctuations are above normal or not.
The update was a heavy hitter. Some sites lost 90% of their traffic. The last update I remember hitting this hard was Penguin.
Here’s an example from a German health site:
(Implications of the Google 08/01 update on a German health site, MoM)
(Implications of the Google 08/01 update on a German health site, DoD)
Google released a lousy tweet about the update but didn’t say what it was about, leaving affected webmasters no hints about what to improve. As you can imagine, the frustration in the SEO community runs deep (check the responses to the tweet).
What can you do?
So what DO we know? The evidence is not straightforward, to say the least. The latest Google updates were very broad, meaning it wasn’t possible to identify a single key factor that all winning or losing sites had in common. This update is no different.
Google says there’s nothing you can do, which I would say is neither helpful nor true. Only a Sith thinks in absolutes ;-).
Many speculate that one of the main drivers of the update was E-A-T. Before we jump into what it is and how it works, remember that it’s often not a single factor that makes or breaks SEO. It’s an accumulation of positive or negative signals.
How to improve E-A-T
Some speculate that the update had to do with E-A-T, which stands for “expertise, authority, trustworthiness”. The term first appeared in the Google Search Quality Rater Guidelines, a key document for SEOs’ understanding of Google’s ranking factors.
E-A-T mainly applies to YMYL sites. Google wants to make sure sensitive information is handled by experts and authorities, partially to mitigate spam and partially to provide high quality search results. But what exactly impacts E-A-T and how that factors into rankings has always been blurry.
How do you signal trust, expertise, and authoritativeness? Here’s what we think a site with high E-A-T needs:
  • Must-have pages or “trust pages” that every serious business has like “About”, “Contact”, “Support”, etc.
  • (Good) reviews and ratings of the site or product(s).
  • Topical relevance of content. Do you create content about random topics or about a specific category? This, of course, doesn’t apply to every site, publishers and newspapers for example. But for E-A-T sites it’s important. Imagine a doctor writing about house slippers. Realistic? No! They would write about medicine, or even a specific part of it.
  • Is the user experience of the site good, or does it look spammy? Marie Haynes mentions the site as a loser in her article about the update, for example. I personally think the UX doesn’t look trustworthy, apart from the site not having an “About” page. That is definitely a factor for E-A-T, but it’s unclear how much it played into the update.
  • Authors writing for the site must have authority and be named on the page. It appears to help if the author has a Wikipedia page, is quoted in papers or has written scientific papers herself.
  • Links from reputable sites in the vertical.
  • Press coverage.
  • Regular updates and edits on the site.
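One concrete way to make several of the signals above (named authors, publisher identity, update dates) machine-readable is schema.org structured data embedded as JSON-LD. This is a minimal sketch under the assumption that explicit attribution markup supports E-A-T; all names and URLs are invented for illustration, and nothing here is a confirmed ranking factor:

```python
import json

def article_jsonld(headline, author_name, author_url, org_name, date_modified):
    """Build a minimal schema.org Article object with explicit
    author and publisher attribution (all values are examples)."""
    return {
        "@context": "https://schema.org",
        "@type": "Article",
        "headline": headline,
        "author": {
            "@type": "Person",
            "name": author_name,
            "url": author_url,  # e.g. an author bio page or Wikipedia entry
        },
        "publisher": {"@type": "Organization", "name": org_name},
        "dateModified": date_modified,  # surfaces regular updates/edits
    }

markup = article_jsonld(
    "Understanding Type 2 Diabetes",         # hypothetical YMYL article
    "Dr. Jane Example",                      # hypothetical named author
    "https://example.com/authors/jane",      # hypothetical bio page
    "Example Health",                        # hypothetical publisher
    "2018-08-15",
)

# Would be embedded in the page <head> as
# <script type="application/ld+json">…</script>
print(json.dumps(markup, indent=2))
```

The point is simply that attribution shouldn’t live only in visible byline text; marking it up makes it unambiguous to crawlers.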
To round it off, I want to quote Glenn Gabe: “You will not recover by putting band-aids on the situation. You will not recover in a few weeks, or a month or two. You must implement significant changes to improve your site and make sure they stick around for the long-term.”
Personal opinion about the 08/01 update
Let me add my personal opinion here (take it for what it is). Logically, it doesn’t make a lot of sense that this update is all about E-A-T. I don’t have enough data to prove it, but that would be an exaggerated reaction to missing E-A-T. Drops of 90% are usually reserved for link or spam penalties.
If E-A-T really was the main driver in this update, I don’t think Google would be doing itself a favor. For three reasons:
  • It doesn’t always take an expert to provide valuable information. Scientific papers, for example, often need to be interpreted by publications to make them accessible to the public. Expert opinions have higher value than those of non-experts, no doubt, but pushing non-expert sites down so hard is… harsh.
  • If the goal was to downgrade shady businesses, Google’s spam filters should be good enough. Why an extra update?
  • If the spam algos are failing, Google should learn enough from user behavior to distinguish between spammy and serious sites, at least in most cases.

It smells to me like something else is driving this update. Again, just my personal opinion.


Your weekly dose of awesome content

Tim Kadlec: “How fast is AMP really?”
“AMP’s biggest advantage is the restrictions it draws on how much stuff you can cram into a single page.” Google is cheating quite a bit with AMP, which fits its aggressive strategy right now.
Medium: “How AirBNB is putting AMP at the core of its digital strategy”
“Implementing AMP is not easy. Airbnb added AMP functionality only for certain page types, to be efficient with it. Then, the team rolled out a test on search results pages to measure impact.”
Two issues Airbnb experienced that surprised me were that AMP seems to conflict with deep linking and that it creates more requests on the server side. Eventually, Airbnb decided not to use AMP. That’s a reaction I’ve seen from quite a lot of companies. The technology is not problem-free.
The Verge: “Inside Google’s plan to make the whole web as fast as AMP”
Hongkiat: “Progressive Web Apps – The Future of the Modern Web?”
FTF: “UX: The new SEO ranking factor”
I don’t think UX is new as a ranking factor, but I agree it has become crucial. It can make or break a site completely. The number of well-ranking sites I see with a spammy-looking design is close to zero, as opposed to a couple of years ago.
L2Inc: “Memes are the new influencers”
Interesting to see that the more followers an influencer has, the more brands they work with, which means the more diluted the effectiveness becomes.
Goinflow: “Research: How the Top 50 U.S. eCommerce Companies Dominate SEO”
Interesting research from inflow about the top 50 e-commerce companies.
The secret history of Silicon Valley. Very interesting presentation from OG Steve Blank about the history of Silicon Valley.