15 experts reveal the trends for Technical SEO in 2020
Most things just don’t change fast enough to have an opinion every year. But technical SEO has reached a point of critical mass that makes it really exciting.
Maybe the scale already tipped over last year (who’s counting) but based on conference trends, content, and my personal perception, technical SEO is now so complex and deep that we’re seeing specializations within the specialization.
Consider how the conference Tech SEO Boost grew just from 2018 to 2019.
Fields like mobile SEO, internal linking, performance, rendering, machine learning with Python, or content optimization have become so deep that specialists can make a living by just focusing on those aspects of technical SEO alone.
Lastly, the sheer amount of content for the topic “technical SEO” is growing fast.
Reason enough for me to ask some friends and experts in the industry for their opinions about technical SEO in 2020.
In this article
- AJ Kohn
- Andrew Shotland
- Bartosz Goralewicz
- Barry Adams
- Cyrus Shepard
- David Sottimano
- Eric Wu
- Hamlet Batista
- Jackie Chu
- Lily Ray
- Micah Fisher-Kirshner
- Ruth Burr Reedy
- Patrick Stox
- JR Oakes
- Steven van Vessum
As my friend Bartosz from Onely says “2019 was a game-changer when it comes to technical SEO. There’s never been a year as dynamic when it comes to changes in Google and it’s a preview for the pace we can expect in 2020.“
Get a cup of coffee and get ready to be inspired 😉.
Python and machine learning: on the rise
In the last 1-2 years, we’ve seen more SEOs adopt Python for tasks like analyzing data and applying simple machine learning programs.
Patrick Stox agrees: “More SEOs getting into Python and machine learning, although I hope some will explore alternatives such as machine learning in other systems like Elasticsearch.”
Patrick sees “most of the on-page technical changes to be done with serverless functions, service workers at the edge, edge workers, whatever you want to call them. Some SEOs refer to this as Edge SEO. The advantage is that changes are made before the pages are served to bots and users. Several tools outside of standard SEO tools, like A/B testing systems, are starting to take advantage of this technology. So: crawl and identify issues, write rules to fix them, and the changes are made before the page is served to bots or users.”
SEO task automation: future = TBD
As SEOs, we dream of automating simple tasks so we can focus on the more exciting stuff. It’s been possible to write meta descriptions or set redirects at scale for a while now, but true automation is still a Yeti: some claim to have seen it, but evidence is thin.
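To make the "at scale" part concrete, here is a minimal sketch of the kind of rule-based automation the paragraph describes: templated meta descriptions and a bulk redirect map. The page data, URL patterns, and brand name are hypothetical examples, not from any real site.

```python
# Rule-based SEO automation sketch: templated meta descriptions
# and bulk redirects. All page data below is made up.

def build_meta_description(page, max_len=155):
    """Fill a template and truncate at a word boundary if too long."""
    text = f"{page['title']} – {page['summary']} Shop now at {page['brand']}."
    if len(text) <= max_len:
        return text
    truncated = text[:max_len].rsplit(" ", 1)[0]
    return truncated.rstrip(",.;:") + "…"

def build_redirect_map(old_urls, rewrite):
    """Map legacy URLs to their new locations via a rewrite function."""
    return {old: rewrite(old) for old in old_urls}

pages = [
    {"title": "Blue Running Shoes",
     "summary": "Lightweight trainers for daily runs.",
     "brand": "Example Store"},
]
for p in pages:
    print(build_meta_description(p))

redirects = build_redirect_map(
    ["/shop/old-category/item-1", "/shop/old-category/item-2"],
    lambda u: u.replace("/old-category/", "/new-category/"),
)
print(redirects)
```

This is exactly the class of task that automates well: deterministic rules, no judgment calls. The hard part — knowing *which* description or redirect is right — is where "true automation" still falls short.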
Micah Fisher-Kirshner is “optimistic that the end of year scripting automation examples and requests by SEOs to help automate what they do will become a common practice among the more technical field.”
SEO is a tricky field for automation, as so much has to be put into context, whereas our sibling discipline, search engine advertising, already shows lots of use cases, such as outlier detection, campaign optimization, and CTR prediction.
“Google uses information about search queries, historical ad performance and other contextual signals combined with machine learning, to predict whether or not someone will click on your ad. This predicted click-through rate helps determine the selection, ranking and pricing of your ads – meaning machine learning is already working to show the right ads to the right customers.” (Source: https://www.blog.google/products/ads/adwords-machine-learning-part-1)
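The predicted-CTR idea from the quote above can be sketched as a toy logistic model: a probability of a click computed from a handful of features. The features and weights here are invented for illustration; real ad systems use far richer signals and learned weights.

```python
# Toy predicted-CTR model: P(click) = sigmoid(w · x + b).
# Features and weights are purely illustrative.
import math

def predict_ctr(weights, features, bias=0.0):
    """Logistic model scoring the probability of a click."""
    z = bias + sum(w * x for w, x in zip(weights, features))
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical features: [position decay, historical CTR, query match]
weights = [2.0, 3.0, 1.5]

top_result = [1.0, 0.25, 1.0]   # position 1, decent history, exact match
low_result = [0.2, 0.05, 0.0]   # position ~8, weak history, loose match

print(predict_ctr(weights, top_result, bias=-2.0))
print(predict_ctr(weights, low_result, bias=-2.0))
```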
Eric Wu doesn’t expect a miracle to happen any time soon but thinks it’s healthy for SEOs to explore: “I believe Technical SEO will continue down similar paths as it has before. Sites that haven’t done much in Technical SEO will continue to need to do the fundamentals. Larger enterprises that are transitioning to modern JS frameworks will need to be more diligent on SSR. Lastly, the Tech SEOs in the industry will continue to push toward more automation of tasks, whether its 3rd party tools or rolling your own with python and Jupyter notebooks. I don’t believe self rolled automation will become mainstream in the SEO community for years, but I think the growing interest is healthy as it pushes us to empathize more with developers that we work with.”
Web performance: the classic
Faster = better is tough to argue with. It’s one of the few principles that hold true in almost every regard, whether SEO, conversion optimization, or marketing in general.
Google has been and continues to be one of the biggest proponents of faster web experiences. AMP, CrUX, and adding performance as a ranking signal for desktop and mobile devices are just a few initiatives of recent years.
Hamlet sees this trend intensifying: “As Chrome starts shaming slow sites, you can anticipate a great deal of effort in improving site page speed, particularly on mobile.”
Bartosz agrees: “Web performance is becoming more and more a part of Google’s algorithm. If terms like CrUX, User Metrics or Largest Contentful Paint don’t ring a bell for you, it’s time to fix it ASAP.“
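If CrUX and Largest Contentful Paint don't ring a bell yet, field data for both is queryable from the CrUX API. Here is a hedged sketch: the endpoint and payload shape follow Google's CrUX API documentation, but you need your own API key, and the origin below is a placeholder. Only the payload builder runs here; the network call is wrapped in a function.

```python
# Sketch: pulling Largest Contentful Paint field data from the
# Chrome UX Report (CrUX) API. Requires your own API key.
import json
import urllib.request

CRUX_ENDPOINT = "https://chromeuxreport.googleapis.com/v1/records:queryRecord"

def build_query(origin, metric="largest_contentful_paint"):
    """Build the JSON body for a queryRecord request."""
    return {"origin": origin, "metrics": [metric]}

def fetch_crux(api_key, origin):
    """POST the query and return the decoded JSON response."""
    payload = json.dumps(build_query(origin)).encode("utf-8")
    req = urllib.request.Request(
        f"{CRUX_ENDPOINT}?key={api_key}",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

print(build_query("https://example.com"))
```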
Performance is a broad field and, I would say, a science in itself. It shows different sub-trends like web packaging and HTTP/3.
For AJ Kohn, the trend goes away from AMP: “I’d be paying close attention to Web Packaging”, and Patrick sees a progression of the HTTP protocol for next year: “HTTP/3 will be a hot topic and probably speed in general.”
Schema and structured data: unsurprisingly evolving
Structured data is not a new topic but one of the fastest evolving ones.
Jackie Chu: “While not new, schema implementation with the explicit purpose to create a “diversity of results” was a recurring theme at Google’s Webmaster product summit. I think diversity of results means fresher, more volatile SERPs, continued iteration on rich result features and a continued focus on entities and their reputation online. Use structured data and a strong HTML footprint to reinforce your entity and subject matter expertise, while always paying attention to best practices around site speed, content delivery, and formatting.”
Micah sees “a continuation of an expanded use of Schema for content articles, with many more SEOs working with content teams as they see value in the traffic coming in.”
A big opportunity is speakable schema, which is “already available for non-news websites which presents a potential “fake news” problem for Google assistant. I’d assume that Google will step up fact checking efforts and these signals will be incorporated into assistant answer rankings and/or search results.” (David Sottimano)
Barry Adams of Polemic Digital says “We will see more types of structured data being adopted by Google, initially in beta before more widespread rollouts. Recent success with beta programs for Speakable, HowTo, QAPage, and FAQPage, shows that Google sees structured data as a cornerstone of its indexing approach for the foreseeable future.“
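To show what one of those beta types looks like in practice, here is a minimal sketch generating FAQPage markup (JSON-LD). The questions and answers are placeholders; the `@type` and property names are the ones schema.org documents for FAQPage.

```python
# Sketch: generating FAQPage structured data as JSON-LD.
# Q&A content is a placeholder.
import json

def faq_page_jsonld(qa_pairs):
    """Build a schema.org FAQPage object from (question, answer) pairs."""
    return {
        "@context": "https://schema.org",
        "@type": "FAQPage",
        "mainEntity": [
            {
                "@type": "Question",
                "name": q,
                "acceptedAnswer": {"@type": "Answer", "text": a},
            }
            for q, a in qa_pairs
        ],
    }

markup = faq_page_jsonld([
    ("What is technical SEO?",
     "Optimizing a site's crawling, rendering, and indexing."),
])
print(json.dumps(markup, indent=2))
```

The output would typically be embedded in a `<script type="application/ld+json">` tag in the page head.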
However, technical SEOs are divided. Some think Google will be able to detect structured data without schema, others say schema markup will continue to be vital.
Steven Van Vessum of ContentKing: “[…] we’ll see search engines relying less on Structured Data to understand entities, and show more and more enhanced snippets for pages that do not have Structured Data.”
A side effect of Google’s push into more rich snippets is schema abuse. I’ve observed the same thing anecdotally: some sites implement schema in their code without showing the data on the page and still get the rich snippet. David Sottimano’s expectation points in the same direction: “FAQ schema rich snippets will be tightened or pulled entirely because of abuse.”
Amazon SEO: a new opportunity?
Amazon competes more and more with Google in terms of ad revenue and as a search engine. It has become a starting point for purchasing journeys – a known weakness for Google, which is stronger at the top of the funnel.
According to Hamlet, “Amazon is now the first place people search for products and this trend will continue next year. In the e-commerce space, I think more SEOs will take an interest in learning about Amazon SEO. I also expect e-commerce focused SEOs to start paying more attention to the top of the funnel terms as Google Adwords advertisers continue to squeeze out every single commercial intent keyword.”
Ambiguity: more hints, fewer hard rules
The increasing use of Machine Learning has given birth to the concept of a “fluid search engine” that weighs many signals according to context. We’ve seen a lot of that in core algorithm updates this year, as it’s getting harder to recognize patterns.
Cyrus Shepard, founder of Zippy and previous guest at Tech Bound, sees that trend intensifying: “One of the biggest technical SEO trends that will undoubtedly grow in 2020 is the concept of “Ambiguity”. For years—and especially in the past 12 months—Google has moved more and more away from using ‘hard’ signals to treating SEO directives as ‘hints’.
Canonicalization is a hint
Links are a hint
Pagination is a hint
This is only a partial list, and we can expect to see it grow in the years ahead. While this can be frustrating to SEOs—who simply want Google to follow defined rules—adding complexity to its algorithms to improve search results sometimes means Google needs the freedom to ignore the SEO directives we feed them.
As technical SEOs, this means we may need to be more sophisticated about the “hints” we define and understand that Google’s algorithm continues to evolve far beyond the official documentation and published best practices that define the industry.”
Data accuracy: declining data quality
SEOs have grown frustrated with the data Google provides. Or maybe it’s just me. The SERPs continue to evolve at a rapid pace, but Search Console data can’t keep up. It’s increasingly difficult to quantify your performance and the experience searchers have.
Patrick sees no light at the end of the tunnel because “Data accuracy is going to get worse as more data is lost or invisible” and David agrees: “Legacy reports in GSC will be discontinued and substituted by worse replacements; professionals will need to look to 3rd parties for accurate data and usable interfaces, ex: crawl activity report”
Good times for 3rd party tools, I guess.
Zero-click SERPs: the biggest threat to SEO
Andrew Shotland, CEO and founder of localseoguide.com, says “Execs Will Start to Get “Zero-Click” SERPs.” I think it’s about time we step our tracking game up and stop looking at organic traffic and rankings only.
Andrew adds: “While those who have SearchEngineLand bookmarked have likely been following the whole “zero-click” SERP story as it has gone mainstream over the past year or so, there still are many with budget approval authority who just see organic traffic and revenue as either “up” or “down”. And last touch attribution models are woefully inadequate to give you a good idea of how a result you didn’t click on led to a conversion, particularly for businesses with physical locations. And since many ecomm SERPs are showing localized results, understanding how Google My Business, the original zero-click SERP feature, can lead to both online and offline conversions will be essential to execs who want to get a true picture of how their SEO is performing. Pretending it doesn’t matter, or it’s “free”, won’t suffice anymore.”
We need to accept that Google is not a search engine anymore but now a competitor to everyone.
David adds: “Expect organic click share to decrease yet again, expect ad placements and SERP features like https://www.google.com/search?q=mortgage+calculator&gl=us to take up more SERP real estate. Pay to play has been an ongoing theme and I wouldn’t expect it to stop unless Google is forced to comply with imposed regulations.”
JR Oakes expects “to see more summaries (instead of rich snippets) enter search results as Google is able to use the corpus of all content to provide more authoritative and multi-sourced answers.” It all goes back to entity extraction and understanding. “Google will start to understand and present the important connections/questions/attributes surrounding entities in unique ways according to the type of entity.”
Even further, he thinks “SERPs that have the intent of visual results will see a strong preference for the presentation of images over links, even more than how they are featured now.”
Indexing API: the new way to crawl the web
The future of indexing and crawling lies in APIs. It’s increasingly uneconomical for search engines to crawl the web as it grows. Pushing webmasters to use APIs to submit new content and content changes is the future.
Google’s job indexing API is just the start, and word on the street has it that you can use it for content other than jobs as well. Bing recently charged ahead with their indexing API, and I expect Google to follow soon.
Barry Adams: “There are some interesting things on the horizon for technical SEO that I feel could start making an impact in 2020. Google has been playing with its live indexing API for a while and I think in 2020 we could see this pilot program expand and even become public. This could have a major impact for tech SEOs, as optimizing sites for crawling becomes less important in favor of plugging sites into the indexing API.“
David Sottimano: “Google will release the public indexing API this year which would be a much better alternative to nonsensical crawling of a website and unreasonably consuming resources for both parties.”
Indexing APIs could also solve the issue of partially indexed sites. For Bartosz, “One of the biggest challenges we observed in 2019 (and is somehow still not popular in the SEO community) is that big brands like e.g. Walmart have up to 60% (or more in extreme cases!) of their products/content not indexed in Google. We are talking millions of dollars per month to be gained by simply…indexing your content.”
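Mechanically, pushing a URL to an indexing API is a small request. Here is a hedged sketch against Google's job-posting Indexing API: the endpoint and payload shape match Google's documentation, but authentication (an OAuth bearer token) is assumed and the URL is a placeholder. Nothing is sent unless you call `notify_google()`.

```python
# Sketch: notifying Google's Indexing API about a changed URL.
# Requires an OAuth access token with the indexing scope.
import json
import urllib.request

ENDPOINT = "https://indexing.googleapis.com/v3/urlNotifications:publish"

def build_notification(url, change="URL_UPDATED"):
    """'URL_UPDATED' for new/changed content, 'URL_DELETED' for removals."""
    return {"url": url, "type": change}

def notify_google(access_token, url):
    """POST the notification and return the decoded JSON response."""
    body = json.dumps(build_notification(url)).encode("utf-8")
    req = urllib.request.Request(
        ENDPOINT,
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {access_token}",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

print(build_notification("https://example.com/jobs/123"))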
Clean-up work and basics
You can’t ever do the basics well enough. I have yet to encounter a site that didn’t have some issue to clean up. Things break, no matter how mature your SEO is. In the past, I’ve seen great results come from simply improving basics like meta titles, 404s, duplicate content, and keyword targeting.
Lily Ray: “Many of the biggest technical SEO initiatives in 2020 will involve ‘clean up’ work.”
Emerging trends like Python and, hopefully soon, automation will help us do a better job at that. But because most sites have traditionally not done a good enough job, Google might take matters into its own hands.
Steven van Vessum from ContentKing: “In 2020, we’re going to see that search engines will further their attempt at auto-correcting technical SEO issues such as implementing wrong redirects, canonicals, hreflang, robots directives (Meta/X-Robots-Tag) and returning wrong HTTP status codes. While this approach is far from perfect, sites with serious issues (and an inability to fix these issues correctly) will benefit from this.
The topic graph will more tightly integrate with search utilizing your interest and expertise with subjects to influence types of results returned.”
JR opens up an interesting perspective on rendering: “Google will parse the DOM as well as the shadow DOM looking for content”, content-hungry as it is.
Accessibility: good for users AND search engines
To my mind, doing things for accessibility also does good for SEO. Descriptive alt tags, logical heading structures, and helpful anchor text are good for both search engines and users. I’m not aware of anything you’d do for accessibility reasons that would go against SEO.
However, JR says “Alt tags have a limited life expectancy as the utility for them will slowly disappear towards screen readers and browsers that can understand the image’s context.”
Ruth still sees the responsibility on the side of SEOs and webmasters: “Another big topic I’m seeing come up more and more in the last year is accessibility, not just for search engines, but for people with disabilities who are using alternative technologies like text readers to browse the web. I think SEOs need to be paying attention to things like how browseable (is that a word?) the sites we work on are using a text reader, and how things like color contrast affect visibility for the vision impaired, both because it’s the right thing to do but also because these are machine-readable ways that Google can understand whether a site is the best possible resource for everyone. I’m not saying that these things are a “ranking factor,” whatever that even means anymore, but just that they could be and they’re worth doing well.”
Machine learning: more complexity
Ever since Google introduced machine learning in search, it has gotten harder for SEOs to see through the complexity. Then BERT came out, and now everything seems possible.
David Sottimano agrees that: “it’s difficult to understand on a granular level what’s going on with search results but I’d expect additional unconventional metrics to influence SERPs.
Look at this query: ‘best sunglasses’ (here), why is Google ranking more “Men” related pages? Is it because Google understands that there’s a larger proportion of men looking at this search result? If so, where is that data coming from?
Google is doing really clever stuff (https://twitter.com/jroakes/status/1019935398766825472; see Tweet below) and I think we’re going to need a lot of critical thinkers to write about their findings to *try* and explain the state of search results in 2020.”
Wow. This is interesting. This is Google serps responding to seasonality. eg. The intent changes based on time of year, and so do rankings. pic.twitter.com/IHEhg6NWbY— M’kay 🍺 (@jroakes) July 19, 2018
Ruth sees a chance of better content winning because Google is able to understand quality at a higher level: “Finally, I’m really hoping that all the kerfuffle around BERT will drive more people to understand Natural Language Processing, with the result that content around the web gets shorter, simpler, and easier to read.”
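"Shorter, simpler, easier to read" is measurable. Here is a rough sketch using the classic Flesch Reading Ease formula with a crude vowel-group syllable heuristic — good enough for spotting trends across a content inventory, not for exact scores. The sample sentences are made up.

```python
# Sketch: scoring readability with Flesch Reading Ease.
# Syllable counting is a rough vowel-group heuristic.
import re

def count_syllables(word):
    """Approximate syllables as runs of vowels (minimum 1 per word)."""
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_reading_ease(text):
    """Higher score = easier to read."""
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    n = max(1, len(words))
    syllables = sum(count_syllables(w) for w in words)
    return 206.835 - 1.015 * (n / sentences) - 84.6 * (syllables / n)

simple = "The cat sat. The dog ran. We like short words."
dense = ("Comprehensive organizational restructuring necessitates "
         "interdisciplinary stakeholder prioritization methodologies.")
print(flesch_reading_ease(simple), flesch_reading_ease(dense))
```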
For JR Oakes, the implications of machine learning and, connected to that, Natural Language Understanding could even impact links: “I would expect to see better measurements of the veracity, uniqueness, and authority associated with content based on the tremendous amount of work being done on NLU (Natural Language Understanding). Potentially leading to less of a reliance on links.”
HREFlang: necessary evil?
HREFlang is one of the most messed-up tags – ever! It’s one of the biggest problems for international sites, and I can totally see Google using other signals to determine the right results for different languages on its own.
Barry: “It wouldn’t surprise me if in 2020 Google stops supporting hreflang meta tags. In its current form, hreflang is messy and prone to errors. I believe Google is looking for a simpler, cleaner solution to internationalized content, perhaps by supporting hreflang only in XML sitemaps, introducing an alternative approach, or upgrading Google’s indexing system to make sense of it without hreflang signposting.“
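The "hreflang only in XML sitemaps" route Barry mentions already exists today. Here is a minimal sketch emitting those annotations; the URLs and locales are placeholders, while the `xhtml:link rel="alternate"` element is the format Google documents for sitemaps. Note the reciprocity requirement: every language variant must list all alternates, including itself.

```python
# Sketch: emitting hreflang annotations inside XML sitemap <url> entries.
from xml.sax.saxutils import escape

def sitemap_url_entries(variants):
    """variants: list of (locale, url). Each entry lists all alternates."""
    entries = []
    for locale, url in variants:
        links = "\n".join(
            f'    <xhtml:link rel="alternate" hreflang="{alt_locale}" '
            f'href="{escape(alt_url)}"/>'
            for alt_locale, alt_url in variants
        )
        entries.append(f"  <url>\n    <loc>{escape(url)}</loc>\n{links}\n  </url>")
    return "\n".join(entries)

variants = [
    ("en-us", "https://example.com/us/page"),
    ("de-de", "https://example.com/de/seite"),
]
print(sitemap_url_entries(variants))
```

Generating the annotations from one source of truth like this avoids the classic hreflang failure mode: variant pages that don't link back to each other.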
As with canonical or title tags, if Google sees enough contradicting signals, it’s going to decide on its own what to do.
Traction in SEO businesses
Andrew Shotland says “Investors Will Start Taking SEO Businesses Seriously”.
“The open secret in SEO is that it’s hard to get investors interested in it as a category. Given the very public SEO travails of various name sites such as TripAdvisor and Expedia, venture capitalists and PE companies will realize SEO is a critical industry with a global TAM and ever-growing budgets. As part of this investor SEO wokeness, I expect to see a few “wow!” acquisitions in the space as big players like BrightEdge and SEMRush try to round out their offerings while fending off the hundreds of start-ups SEOs are collectively fantasizing about as they do a site audit for the thousandth damn time. Crazy call: SiteBulb, ScreamingFrog & Moz, yes Moz, get acquired.”
I don’t think it’s so crazy at all. We’ve seen Conductor being acquired by WeWork – THAT was crazy!
Technical SEO in 2020: More complexity, more opportunities
If I had to summarize the sentiment I get from these predictions, I’d say we’re facing more complexity but also more opportunities to differentiate sites from others through good work. The deeper technical SEO gets, the more we have to work on doing a good job at it. But we’re also going to see higher rewards in the form of traffic.
Because technical SEO is getting much deeper and more complex, I also see large sites or companies that don’t invest the resources to optimize falling behind and being taken over by smaller startups and leaner teams.
As always, time will prove if we’re right or wrong, but I’m excited for what’s to come!
If you’re curious about what I think for 2020 in general, check out my predictions for 2020 Marketing trends.