Automating content creation with generative AI is a promising solution for resource-strapped businesses and teams. But when it comes to SEO, cost-savings don’t matter as much as content quality.
As Google’s search algorithms place greater weight on helpful content, gauging AI-driven content’s value from an SEO standpoint is critical.
Let’s use Google’s template for search quality – experience, expertise, authoritativeness and trustworthiness or E-E-A-T – to assess and improve AI-generated content.
AI-generated content is not fleeting hype – it’s here to stay. In the last few months, many businesses have quickly incorporated AI into their content creation process.
But the rapid rise of AI tools has also encouraged some to generate content just for the sake of it, without regard for quality. And there are thousands of Google search results to prove it.
When you look up “Regenerate response” -chatgpt <keyword> on Google, you’ll get results for webpages that have copied and pasted ChatGPT content without much editing – evident from the “Regenerate response” phrase taken from the AI chatbot’s interface. (h/t Jennifer Slegg)
Below is a sample query for the health industry.
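The same footprint check can be run on your own drafts before publishing. Below is a minimal, hypothetical Python sketch (the phrase list and function name are my own, not part of any tool mentioned in this article) that flags leftover chatbot-interface phrases in draft content:

```python
# Hypothetical pre-publish check: flag leftover AI chat-interface
# phrases (such as "Regenerate response") in draft content.

AI_FOOTPRINTS = [
    "regenerate response",
    "as an ai language model",
    "i cannot browse the internet",
]

def find_ai_footprints(text: str) -> list[str]:
    """Return the footprint phrases found in the draft text."""
    lowered = text.lower()
    return [phrase for phrase in AI_FOOTPRINTS if phrase in lowered]

draft = "Top 10 vitamins for energy. Regenerate response"
print(find_ai_footprints(draft))  # → ['regenerate response']
```

A clean draft returns an empty list; anything flagged should be edited out (or, better, rewritten) before the page goes live.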
The ability to generate and publish content quickly at scale raises questions about how Google will adapt to such shifts and how SEOs can ensure their content will not be used by AI tools to outrank them.
In November 2022, Google’s Duy Nguyen said that the search engine has “algorithms to go after” those who post AI-plagiarized content. As such, we can safely assume that Google can detect AI content.
In the quality raters guidelines (QRG), Google clearly states that content copied, auto-generated, or otherwise created without adequate effort, originality, talent, or skill such that the page fails to achieve its purpose will be marked with the “lowest” quality rating.
At the moment, we also know that Google is not against AI-generated content per se. It’s against “spammy automatically generated content.” (This seemingly deviates from – and supersedes – what Google’s John Mueller said in April 2022.)
One way to verify Google’s stance on AI content is by looking at the SERPs today. How well is AI-driven content performing in organic search? The accounts vary.
In one example, Mark William Cook conducted an experiment that involved creating a website with 10K pages filled with 100% AI-generated content without human editing. The website tanked a few months after going live.
Then we have Bankrate’s AI-generated content that has been live for six months. SISTRIX assessed the performance of one of their articles and found that the content is faring well:
But why was AI-generated content successful in one situation and not so much in the other?
If we compare the websites, we’ll see that:
Another example is that of a brand-new test website I created last year with 30 blog posts, each around 1,000 words.
One blog post, which went live in October 2022, was written by someone with experience in the niche. In January 2023, I updated it by supplementing it with human-edited, AI-generated content. The post grew from 1,000 words to 5,000.
The website has no authority in the niche, so the performance did not change much. I only saw some initial spike in impressions, which then returned to normal.
(Note: Do not judge the performance of any AI-generated content based on the initial increase in impressions or clicks. We need to see its performance for at least three months.)
After looking at the above three scenarios (pure AI content; AI content + human editing + authority and trust; AI content + human editing), we can assume that AI content can work to some extent.
But AI content alone is not guaranteed to work, even if you generate longer content. It still needs other factors supporting it to signal trust to Google.
The winning formula is Bankrate’s (AI content + human editing + authority and trust).
This shows that in most instances, who wrote what doesn’t count. Instead, the quality of the content and the overall website trustworthiness matter. (Yes, you can rank without backlinks, but that’s a story for another day.)
Your strongest weapon against the flood of AI-generated content is your website’s overall authority and trustworthiness. But what does that look like?
The concept of E-E-A-T applies to three areas:
We know that trust is the most vital component of E-E-A-T. Untrustworthy pages have low E-E-A-T in the QRG, no matter how much they demonstrate experience, expertise or authoritativeness. Pages with the lowest E-E-A-T or lowest reputation are considered untrustworthy.
We can learn from Bankrate and others that followed the same pattern. While the QRG does not translate to direct ranking factors, it helps us gauge content quality according to Google’s standards.
If I were to evaluate whether a website that is using AI content today sends clear trust signals to Google, here’s what I would look at:
On the page level
Sitewide, there are generic trust signals to consider, including information about the website and its reputation. This translates to the following checklist.
On the site level
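As a rough illustration of how such a checklist could be applied at scale, here is a hypothetical Python sketch that scans a page’s HTML for a few common trust signals. The signal names and patterns are my own illustrative assumptions, not an official checklist from Google or anyone quoted in this article:

```python
import re

# Hypothetical sketch: detect a few common page-level and
# site-level trust signals in a page's HTML. Patterns are
# illustrative only; a real audit needs human review.

PAGE_SIGNALS = {
    "author byline": r'rel="author"|class="[^"]*byline',
    "person/org schema": r'"@type"\s*:\s*"(Person|Organization)"',
}

SITE_SIGNALS = {
    "about page link": r'href="[^"]*/about',
    "contact page link": r'href="[^"]*/contact',
}

def audit_trust_signals(html: str) -> dict[str, bool]:
    """Return which trust signals were detected in the HTML."""
    signals = {**PAGE_SIGNALS, **SITE_SIGNALS}
    return {name: bool(re.search(pattern, html, re.IGNORECASE))
            for name, pattern in signals.items()}

sample = '<a rel="author" href="/team/jane">Jane</a> <a href="/about-us">About</a>'
print(audit_trust_signals(sample))
```

For the sample snippet above, the byline and about-page signals are detected while the schema and contact-page signals are not. A script like this only surfaces missing markup; it cannot judge whether the signals are genuine, which is the part that still requires a human.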
In today’s world where AI is writing content, is authorship less important?
I’ve always supported hiring writers with experience in the niche/industry they are writing about and, ideally, with some online presence that reflects that experience.
Google’s Gary Illyes recently said that Google does not give too much weight to who writes your content. Yet, in Google’s quality raters’ guidelines, authorship is clearly mentioned in several instances.
For example, one of the reasons a parenting blog post was marked as “high quality” was that:
“The author of this blog post has become known as an expert on parenting issues (Expertise) and is a regular contributor to this and other media websites (positive content creator reputation).”
Authorship is still a crucial part of E-E-A-T. It may be more or less critical, depending on what industry you are optimizing for.
It is also important to highlight that if the business/website is responsible for the content (e.g., using ghostwriters), then the website’s reputation substitutes for the author’s.
Also, consider E-E-A-T as a filter rather than a ranking factor.

You need to pass that filter to be eligible to rank and perform in the SERPs (with varying importance based on industry) and to avoid being marked as “lowest quality” content.

If a person is experienced on the topic but didn’t write a well-crafted, informative piece of content, don’t expect it to rank well. Having experienced authors, particularly in certain industries, will protect your content from being filtered out.
Many websites were hit by the product review updates (there have been six so far) that aim to ensure high-quality product reviews are rewarded. Google defines such reviews as:
“[C]ontent that provides insightful analysis and original research and is written by experts or enthusiasts who know the topic well.”
Not having authors with first-hand experience is a disadvantage when writing product reviews. This is one of those situations where E-E-A-T and authorship are important.
While the importance of authorship and the overall website reputation varies depending on the niche, with AI in play, I’d project they will be even more critical in the future.
According to Google’s search quality rater guidelines:
“The Low rating should be used if the page lacks appropriate E-E-A-T for its purpose. No other considerations such as positive reputation or the type of website can overcome a lack of E-E-A-T for the topic or purpose of the page.”
In Arabic, we say, “The opposite reveals the truth.” To know the importance of E-E-A-T for SEO performance, let’s explore the lack of it.
Remember the 2018 Google “Medic” update that strongly hit many websites in the health and nutrition sectors? Analysis of the impacted websites shows they had one or more of the following:
On the other hand, websites that saw increased visibility after the update showed one or more of the following:
After the Medic update hit many websites, it was clear that you could have a solid technical foundation and a highly optimized website but still lose rankings due to a lack of E-A-T signals (now E-E-A-T).
There’s no reason to ignore AI tools completely. Despite the alarmist narratives on generative AI, the tech can’t stand on its own.
We must watch out for the risks of AI use, but that should not stop us from embracing opportunities to enhance our marketing efforts.
The key is to double down on the many things only we humans can do:
All these actions make us more credible and trustworthy sources of information than AI will ever be.
(And since we’re talking about AI, let’s leverage it for our benefit. If you’d like to audit a website for E-E-A-T, this script from Daniel Foley Carter is useful.)
Opinions expressed in this article are those of the guest author and not necessarily Search Engine Land. Staff authors are listed here.