How to demonstrate E-E-A-T in AI-generated content

The concept of E-E-A-T, Google's template for search quality, can help us assess and improve the quality of AI-generated content.

Automating content creation with generative AI is a promising solution for resource-strapped businesses and teams. But when it comes to SEO, cost-savings don’t matter as much as content quality.

As Google’s search algorithms place greater weight on helpful content, gauging AI-driven content’s value from an SEO standpoint is critical.

Let’s use Google’s template for search quality – experience, expertise, authoritativeness and trustworthiness or E-E-A-T – to assess and improve AI-generated content. 

The meteoric rise of generative AI

AI-generated content is not a fleeting trend – it’s here to stay. Over the last few months, many have quickly incorporated AI into their content creation process.

But the rapid rise of AI tools has also encouraged some to generate content just for the sake of it, without regard for quality. And there are thousands of Google search results to prove it.

When you search Google for “Regenerate response” -chatgpt <keyword>, you’ll get results for webpages that have copied and pasted ChatGPT content without much editing – evident from the “Regenerate response” phrase taken from the AI chatbot’s interface. (h/t Jennifer Slegg)

Below is a sample query for the health industry.

Regenerate response in health niche

The ability to generate and publish content quickly at scale raises questions about how Google will adapt to such shifts and how SEOs can ensure their content will not be used by AI tools to outrank them.

Google on AI content

In November 2022, Google’s Duy Nguyen said that the search engine has “algorithms to go after” those who post AI-plagiarized content – which suggests Google has at least some ability to detect AI-generated content.

In the quality rater guidelines (QRG), Google clearly states that content copied, auto-generated, or otherwise created without adequate effort, originality, talent, or skill – such that the page fails to achieve its purpose – will be marked with the “lowest” quality rating.

At the moment, we also know that Google is not against AI-generated content per se. It’s against “spammy automatically generated content.” (This seemingly deviates from – and supersedes – what Google’s John Mueller said in April 2022.)

One way to verify Google’s stance on AI content is by looking at the SERPs today. How well is AI-driven content performing in organic search? The accounts vary.

In one example, Mark Williams-Cook conducted an experiment that involved creating a website with 10K pages filled with 100% AI-generated content without human editing. The website tanked a few months after going live.

Mark Williams-Cook's SEO experiment

Then we have Bankrate’s AI-generated content, which has been live for six months. SISTRIX assessed the performance of one of its articles and found that the content is faring well:

SISTRIX analysis of Bankrate AI-generated article

But why was AI-generated content successful in one situation and not the other?

If we compare the websites, we’ll see that:

  • The website that tanked a few months after launch was brand new with little authoritativeness or reputation. Humans did not edit the content, so no fact-checking or proofreading was done.
  • Bankrate.com, on the other hand, is a well-established website with history and backlinks. More importantly, people edited and fact-checked the AI content thoroughly before publishing.

Another example is that of a brand-new test website I created last year with 30 blog posts, each around 1,000 words. 

One blog post, which went live in October 2022, was written by someone with experience in the niche. I decided to update it in January 2023 by supplementing it with human-edited, AI-generated content. The post grew from 1,000 words to 5,000. 

The website has no authoritativeness in the niche, so the performance did not change much. I only saw an initial spike in impressions, which then returned to normal.

(Note: Do not judge the performance of any AI-generated content based on the initial increase in impressions or clicks. We need to see its performance for at least three months.)

Sara Taher's website with AI-generated content

After looking at the above three scenarios (pure AI content; AI content + human editing + authoritativeness and trustworthiness; AI content + human editing), we can assume that AI content can work to some extent. 

But AI content alone is not guaranteed to work, even if you generate longer content. It still needs other factors supporting it to signal trust to Google.

In most instances, who wrote the content counts less than the quality of that content and the overall trustworthiness of the website. (Yes, you can rank without backlinks, but that’s a story for another day.)

Your strongest weapon against the flood of AI-generated content is your website’s overall authoritativeness and trustworthiness. But what does that look like?

E-E-A-T for AI content: SEO checklists

The concept of E-E-A-T applies to three areas:

  • The website as a whole.
  • The content on the page in question.
  • The author or the entity behind the content.

We know that trustworthiness is the most vital component of E-E-A-T. Untrustworthy pages have low E-E-A-T in the QRG, no matter how much they demonstrate experience, expertise or authoritativeness. Pages with the lowest E-E-A-T or lowest reputation are considered untrustworthy. 

We can learn from Bankrate and others that followed the same pattern. While the QRG does not translate to direct ranking factors, it helps us gauge content quality according to Google’s standards. 

If I were to evaluate whether a website that is using AI content today sends clear trust signals to Google, here’s what I would look at:

On the page level

  • Clear editorial disclaimer on how and where AI-generated content is used and whether it is human-edited and fact-checked.
  • The AI content is actually edited and fact-checked by writers.
  • Each piece of AI-generated content must offer a unique take and stand out from existing content on the topic. This is why human oversight is required. You cannot just rehash what’s in the top results, rephrase it, and consider it quality content. It’s not.
  • Links to external authoritative sources and entities. 
  • The content delivers on its promise to the user. The page content needs to fulfill its purpose.
  • The amount of effort, originality, and talent or skill that went into creating the content. (QRG)
  • For informational and YMYL content, accuracy and consistency with well-established expert consensus are important. (QRG)
  • Clear information on who wrote this content and why this person is qualified to do so (author bio). This varies in importance based on the topic at hand. Generally, who (what individual, company, business, foundation, etc.) is responsible for the website and its content should be clear. (One way to surface authorship and editorial review in markup is sketched after this list.)
  • The personal experience of the content creator on the topic at hand (if relevant).
  • The expertise of the content creator on the topic (e.g., financial advice).
  • Bonus: The content discusses other points of view.

Sitewide, there are generic trust signals to consider, including information about the website and its reputation. This translates to the following checklist.

On the site level

  • Authoritativeness: To what extent is this website known for the topic at hand?
  • Clear information about the company: The website has an About page, privacy policy, terms and conditions, return, exchange and shipping policies pages (if applicable), and ideally linked to from the footer. (A quick automated check for these pages is sketched after this list.)
  • The online reputation of the website: If the website is not the primary creator of the content, then the reputation of the content creator. (QRG)
  • Customer reviews of the website or the business.
  • The footer includes company information: If the company is part of a group of companies owned by one bigger entity, it’s a good idea to add this information in the footer.

How important is authorship for E-E-A-T?

In today’s world where AI is writing content, is authorship less important? 

I’ve always supported hiring writers with experience in the niche/industry they are writing about and, ideally, with some online presence that reflects that experience.

Google’s Gary Illyes recently said that Google does not give too much weight to who writes your content. Yet, in Google’s quality raters’ guidelines, authorship is clearly mentioned in several instances.

For example, one of the reasons a parenting blog post was marked as “high quality” was that: 

“The author of this blog post has become known as an expert on parenting issues (Expertise) and is a regular contributor to this and other media websites (positive content creator reputation).” 

Screenshot from rater guidelines shows Google's focus on expert authors contributing to E-E-A-T.

Authorship is still a crucial part of E-E-A-T. It may be more or less critical, depending on what industry you are optimizing for.

It is also important to highlight that if the business/website is responsible for the content (e.g., using ghostwriters), then the website’s reputation substitutes for the author’s.

Also, consider E-E-A-T as a filter rather than a ranking factor. 

You need to pass that filter to be eligible to rank and perform in the SERPs (with varying importance based on industry) and to avoid being marked as “lowest quality” content.

If a person is experienced on the topic but didn’t write a well-crafted, informative piece of content, don’t expect it to rank well. Having experienced authors, particularly in certain industries, will protect your content against being filtered out.

Many websites were hit by the product review updates (there have been six so far) that aim to ensure high-quality product reviews are rewarded. Google defines the latter as: 

“[C]ontent that provides insightful analysis and original research and is written by experts or enthusiasts who know the topic well.”

Not having the right authors with original experience is a disadvantage when writing product reviews. This is one of those situations where E-E-A-T and authorship are important.

While the importance of authorship and the overall website reputation varies depending on the niche, with AI in play, I’d project they will be even more critical in the future.

How important is E-E-A-T for a website’s performance?

According to Google’s search quality rater guidelines:

“The Low rating should be used if the page lacks appropriate E-E-A-T for its purpose. No other considerations such as positive reputation or the type of website can overcome a lack of E-E-A-T for the topic or purpose of the page.”

In Arabic, we say, “The opposite reveals the truth.” To know the importance of E-E-A-T for SEO performance, let’s explore the lack of it. 

Remember the 2018 Google “Medic” update that hit many websites in the health and nutrition sectors hard? Analysis of the impacted websites shows they had one or more of the following:

  • Missing About page.
  • No or poor online reputation.
  • Promoting medical treatments that go against the scientifically agreed-upon consensus.
  • Missing or questionable authors.
  • No links to reliable external sources.
  • Too many affiliate links and salesy content. (I can confirm from personal experience that having too many affiliate links without “nofollow” or “sponsored” attributes can lead to a manual action on YMYL topics. A quick way to audit this is sketched after this list.)

On the other hand, websites that saw increased visibility after the update showed one or more of the following:

  • Authors of the content are clearly labeled and have relevant expertise highlighted in their bios.
  • Signs of authoritativeness.
  • General transparency with users.
  • Links to reliable external sources.
  • Clear information where needed (e.g., About page, Contact page and other policy and brand pages).

After the Medic update hit many websites, it was clear that you could have a solid technical foundation and a highly optimized website but still lose rankings due to a lack of E-A-T signals (now E-E-A-T).

There’s no reason to ignore AI tools completely

Despite the alarmist narratives on generative AI, the tech can’t stand on its own.

We must watch out for the risks of AI use, but that should not stop us from embracing opportunities to enhance our marketing efforts.

The key is to double down on the many things only we humans can do:

  • Gaining real-life experiences.
  • Sharing our expertise with people in need of guidance.
  • Tapping others to lend authoritativeness on a topic we might not fully know about.

All these actions make us more credible and trustworthy sources of information than AI ever will be.

(And since we’re talking about AI, let’s leverage it for our benefit. If you’d like to audit a website for E-E-A-T, this script from Daniel Foley Carter is useful.)


About the author

Sara Taher
Contributor
Sara Taher is a Canada-based SEO consultant with over eight years of experience in the field. She is best known for the SEO tips and riddles she shares on her LinkedIn account. You can sign up for her newsletter and check out her blog here.
