The changing dynamic of programmatic SEO
Wish you could get more traffic from Google?
Half the battle is understanding WHAT you need to fix on your site.
This is the power of Ahrefs Webmaster Tools - a cost-friendly alternative to expensive audits.
AWT shows which keywords your pages rank for, how Google sees your content, and what changes can boost your traffic.
Imagine what this could do for your business…
Visit ahrefs.com/webmaster-tools for this free tool.
Programmatic SEO is the most effective way to scale SEO. The idea is simple: build a lot of pages - sometimes hundreds of thousands - for a keyword pattern.
So far, programmatic SEO has been reserved for companies that get content from users (UGC), products (ecommerce retailers), or data (SaaS) because they didn’t have to create the content themselves. Content creation is the number one bottleneck when it comes to SEO growth.
Even though integrators could create a lot of templatized pages, they were held back by having to create the content themselves.
But now, AI tools significantly reduce content creation bottlenecks and allow companies to create programmatic content at scale even if they don’t aggregate user-generated content, products or data. The benefits go beyond just more traffic. Companies can run SEO split tests on programmatic content, target longtail queries, and use retargeting to improve customer acquisition costs.
Programmatic SEO for integrators
Programmatic SEO is the strategy of creating a large number of pages with the same structure to target a query pattern.
For example, Betterteam built thousands of pages for job descriptions.
Causal has 1,000+ pages for different Excel and Google Sheets formulae.
Gusto has hundreds of payroll calculators for different states.
Mind you, none of the three examples is an aggregator - they're all integrators! The approach to creating content for every programmatic page is very different.
Creating programmatic content follows the same four-step process:
- Define query syntax (pattern) based on your product
- Define page elements (features) based on user intent and what’s already performing well
- Writers create the content (sometimes, engineers build a calculator)
- Split test and refine
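The first three steps above can be sketched in code. This is a minimal illustration, not from any of the companies mentioned: the query pattern, template, and state names are hypothetical stand-ins for whatever your product's pattern is.

```python
# Minimal programmatic SEO sketch: one query pattern, one page template,
# many rendered pages. All names and copy here are illustrative.

QUERY_PATTERN = "payroll calculator {state}"  # step 1: define query syntax

PAGE_TEMPLATE = """\
<h1>{state} Payroll Calculator</h1>
<p>Estimate take-home pay in {state} after taxes and deductions.</p>
"""  # step 2: define page elements for the intent behind the pattern

def build_pages(states):
    """Render one landing page per state from the shared template (step 3)."""
    return {
        QUERY_PATTERN.format(state=state): PAGE_TEMPLATE.format(state=state)
        for state in states
    }

pages = build_pages(["California", "Texas", "New York"])
```

In practice the list of states would be a data source (a database table, a CSV, an API), and step 4 - split testing - happens on the live pages rather than in the generator.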
It's a product-led approach to SEO that makes the website a part of the product. Instead of a content strategy focusing on growing a blog or content hub, the strategy focuses on building landing pages.
Two new challenges for programmatic SEO
One of the three examples above used GPT-3 to build out all the content (yes, it ranks like a charm). Several of my clients have started doing the same, and I'd be surprised if more companies weren't thinking along the same lines. AI changes the programmatic SEO dynamic.
However, two new challenges arise: quality assurance at scale and tracking longtail SEO traffic.
It's no secret AI tools can hallucinate and get even simple facts wrong. CNET burned their fingers a bit on the AI content stove. It got basic facts wrong, likely due to missing human quality control, even though the AI content performed well. We should wonder what that says about Google's fact-checking algorithms, but the bigger point is editing and fact-checking cannot be automated (yet). Who guarantees the content is correct and accurate when you build thousands of pages with AI? Or even a hundred?
The best solution is still human editors. They might not be able to check every page, but they can at least review samples. I expect more solutions to come to market that make fact-checking scalable. Plagiarism detectors like Copyleaks, Unicheck and others can already provide some relief.
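Sampling for human review is easy to systematize. Here's a small sketch - the review rate, minimum sample size, and URLs are illustrative assumptions, and a fixed seed keeps the sample reproducible between audits:

```python
import random

def sample_for_review(page_urls, rate=0.05, minimum=25, seed=42):
    """Draw a reproducible random sample of pages for human fact-checking.

    rate: fraction of pages to review (5% is an arbitrary starting point).
    minimum: floor so small sites still get meaningful coverage.
    seed: fixed so two auditors pull the same sample.
    """
    k = min(len(page_urls), max(minimum, round(len(page_urls) * rate)))
    return random.Random(seed).sample(page_urls, k)

urls = [f"https://example.com/page-{i}" for i in range(1000)]
to_review = sample_for_review(urls)  # 50 pages at the 5% rate
```

Editors then fact-check only the sample; if the error rate in the sample is high, you tighten the prompt or the template before publishing more pages.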
Another solution to quality problems is investing the time to get the prompt right. In my experience, too many people jump the gun and ship the first output they get. Expect to iterate 10-20 times (sometimes more) before you get good output. At scale, testing and refining the prompt pays dividends.
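One way to make that iteration systematic is to grade every prompt variant's output with the same automated checks. A hedged sketch - the scoring rubric, drafts, and required terms below are made up for illustration; a real rubric would cover far more than term coverage and length:

```python
# Score generated drafts so different prompt variants can be compared
# apples-to-apples. Rubric: 1 point per required term mentioned,
# plus 1 point for meeting a minimum length. Purely illustrative.

def score_draft(draft, required_terms, min_words=150):
    """Return a simple quality score for one generated draft."""
    score = sum(term.lower() in draft.lower() for term in required_terms)
    score += len(draft.split()) >= min_words
    return score

# Hypothetical outputs from two prompt variants for the same page:
drafts = {
    "prompt_v1": "Payroll in Texas is simple...",
    "prompt_v2": "Texas payroll taxes include federal withholding...",
}
required = ["Texas", "payroll", "withholding"]
best = max(drafts, key=lambda p: score_draft(drafts[p], required))
```

Run every candidate prompt against the same sample of inputs, score the drafts, and keep the variant that wins - the same discipline as an SEO split test, applied one step earlier.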
Tracking longtail traffic has already become harder since Search Console filters queries for privacy reasons (sometimes over 50%). A client of mine with a huge site (hundreds of millions of pages indexed) gets so much unreported longtail traffic that Search Console is basically useless. Traffic tracking alone can be solved server-side, but identifying longtail keywords and rank tracking becomes impossible.
The best solution is to split traffic and rank tracking: measure traffic at the (organic) channel level and track keywords in third-party rank trackers based on your best estimate. In most cases, you can use common sense to derive what query a page is optimized for.
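On programmatic sites, that "common sense" can even be automated, because the slug usually follows the query pattern. A small sketch - the URLs are hypothetical, and this assumes slugs are the hyphenated query:

```python
# Derive a page's target query from its URL slug, assuming the slug
# follows the programmatic pattern (e.g. /payroll-calculator-texas).

from urllib.parse import urlparse

def query_from_url(url):
    """Turn the last path segment into a space-separated query string."""
    slug = urlparse(url).path.rstrip("/").rsplit("/", 1)[-1]
    return slug.replace("-", " ")

query_from_url("https://example.com/tools/payroll-calculator-texas")
# -> "payroll calculator texas"
```

Feed the derived queries into your rank tracker and you get keyword-level visibility back, even when Search Console hides most of the longtail.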
Despite new challenges, I'm bullish on AI content. It might change search behavior to some degree, but the opportunities to make SEO more efficient outweigh the downsides, in my humble opinion.