Optimizing for AI search: Why classic SEO principles still apply

AI-powered search is reshaping SEO, but traditional tactics still hold power. Here's how to optimize for both AI and search engines.


Change is the only constant – a truth marketers, especially those in SEO, know all too well. 

The search landscape is shifting faster than ever, with AI-powered tools like ChatGPT, Perplexity, Claude, and Gemini reshaping how people find and interact with information.

Yet, despite all the innovation, the core principles of SEO remain as relevant as ever. 

While AI-enhanced search engines may seem like a revolutionary leap, they still rely on the foundational elements that have driven search success for years.

This article explores how time-tested SEO strategies – like site structure, indexing, and keyword optimization – still play a crucial role in an AI-driven search world. 

Because when it comes to SEO, the more things change, the more they stay the same.

What is old is new again

The core principles of SEO have remained remarkably consistent over the years. 

Keyword optimization, quality content, backlinks, and user experience have always been key factors in achieving high search engine rankings.

Nearly two years ago, in another Search Engine Land article, I quoted Bing’s Fabrice Canel from his announcement of Microsoft Bing’s partnership with OpenAI:

  • “SEO will never be dead, but it will change. Stick with the same SEO playbook as before.”

That advice still applies today.

Crawling and indexing

The crawling and indexing process remains fundamental to SEO, even in the era of AI. 

Search engine spiders crawl the web, indexing content to make it accessible for searches. AI-powered tools follow similar principles.

To ensure your content is effectively spidered and indexed, follow these best practices:

Create a clear site structure

A well-organized website with a logical hierarchy of pages makes it easier for AI tools to navigate and index your content. 

Use logical directory structures, descriptive URLs, proper header tags (especially H1s), and a sitemap to guide spiders through your site.
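As one illustration, the sitemap mentioned above is typically an XML file you submit to search engines or reference from robots.txt. A minimal sketch looks like this (the URLs and dates are placeholders, not real pages):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per canonical page; values below are placeholders -->
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2025-01-15</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/blog/ai-search-basics</loc>
    <lastmod>2025-01-10</lastmod>
  </url>
</urlset>
```

Keeping the sitemap limited to canonical, indexable URLs gives crawlers a clean map of the hierarchy your directory structure already expresses.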

Speed up your site

Fast-loading pages improve user experience and are favored by AI algorithms and traditional search engines. 

Compress images, leverage browser caching, and minimize unnecessary code to enhance your site’s performance.
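As a sketch of what the caching and compression advice looks like in practice, here is an illustrative nginx server fragment. The file extensions and cache durations are assumptions; adjust them for your own stack:

```nginx
# Illustrative nginx fragment - adjust types and durations for your site
gzip on;
gzip_types text/css application/javascript application/json image/svg+xml;

location ~* \.(css|js|png|jpg|jpeg|webp|svg)$ {
    expires 30d;    # long-lived browser cache for static assets
    add_header Cache-Control "public, max-age=2592000";
}
```

The same idea applies on Apache (via `mod_expires` and `mod_deflate`) or at a CDN layer.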

Avoid relying on JavaScript (JS) for content on public, unauthenticated pages

Minimizing JavaScript makes for a better user experience, and it’s also likely that some AI bots don’t render JavaScript at all. Google’s Martin Splitt recently pointed this out.

If you must use JS, prerender it server-side into HTML as much as possible to avoid any issues getting your content into AI databases.
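One common pattern for this is dynamic rendering: detecting likely crawlers by user agent and routing them to prerendered HTML. Below is a minimal Python sketch of the detection step. The bot tokens listed are examples (GPTBot, ClaudeBot, and PerplexityBot are real crawler names at the time of writing, but the list is illustrative and will drift), and `should_serve_prerendered` is a hypothetical helper name, not any framework’s API:

```python
# Example AI crawler tokens -- illustrative, not exhaustive; verify
# current names against each vendor's documentation.
AI_BOT_TOKENS = ("GPTBot", "ClaudeBot", "PerplexityBot", "Google-Extended")

def should_serve_prerendered(user_agent: str) -> bool:
    """Return True if the request looks like an AI crawler that may not
    render JavaScript, so static prerendered HTML should be served."""
    ua = (user_agent or "").lower()
    return any(token.lower() in ua for token in AI_BOT_TOKENS)
```

Simpler still is serving the same prerendered HTML to every visitor, which avoids maintaining a bot list at all.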

This reminds me of the days before Googlebot could render JS. 

I still remember back when people built sites in Flash and wondered why they weren’t appearing in search results!

Don’t block AI bots with robots.txt or other means

Jed White took this a step further, advising against overly restrictive bot blocking when using content delivery networks like Cloudflare or AWS.

As I heard repeated many times at a recent conference, if you block AI bots, you have no chance of appearing in the answers they generate.
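For reference, a permissive robots.txt that explicitly allows some well-known AI crawlers might look like the sketch below. The user-agent names are examples current as of this writing; check each vendor’s documentation before relying on them:

```
# Example robots.txt - verify current crawler names with each vendor
User-agent: GPTBot
Allow: /

User-agent: PerplexityBot
Allow: /

User-agent: ClaudeBot
Allow: /

# All other crawlers
User-agent: *
Allow: /

Sitemap: https://www.example.com/sitemap.xml
```

Remember that robots.txt is only one layer: firewall and CDN bot-management rules can still block these crawlers even when robots.txt allows them.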

Boost AI search visibility with fresh, relevant content

Like with Google and Bing, fresh, relevant content signals to AI algorithms that your site is active and authoritative. 

Regularly update your website with new articles to maintain visibility in AI-generated searches.

Dig deeper: How to optimize your 2025 content strategy for AI-powered SERPs and LLMs



AI search content optimization

AI search tools utilize advanced language models to understand and generate human-like text. They can: 

  • Process vast amounts of data.
  • Identify patterns.
  • Answer user queries precisely. 

Optimizing for these tools’ discovery mechanisms is crucial to ensuring your content appears in AI-generated searches.

Topic- or entity-based keyword research remains a cornerstone of effective SEO. 

Understanding the informational needs of your audience and knowing the specific terms and phrases users are likely to search for is key. 

Incorporate these keywords naturally within your content, HTML titles, and headers. 

AI models excel at contextual understanding, so focus on semantic relevance rather than keyword stuffing for better results.

The SEO community is split on the role of structured data, or schema tagging, in AI search.

Regardless, I recommend using markups like those found on Schema.org. 

That tagging certainly helps traditional search engines understand the context and meaning of your content. 

Since Google alone is responsible for over two-thirds of organic search traffic, using structured data tagging makes sense no matter what.

By providing clear and organized data, you enhance the chances of content being accurately interpreted and featured in search responses.
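As a concrete example, Schema.org markup is most often added as a JSON-LD block in the page’s `<head>`. The sketch below uses the Article type with placeholder values:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Optimizing for AI search",
  "author": { "@type": "Person", "name": "Jane Doe" },
  "datePublished": "2025-01-15"
}
</script>
```

Tools like Google’s Rich Results Test can validate markup like this before you deploy it.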

Dig deeper: AI search is gaining traction, but it isn’t replacing Google: Survey

AI search is evolving – but SEO best practices remain critical

As AI-powered search tools continue to evolve, the importance of basic SEO best practices cannot be overstated.

You can ensure your content remains discoverable in traditional and AI-generated searches by focusing on keyword optimization, structured data, and effective spidering and indexing.

Embrace old-school basic SEO principles as you adapt to new technologies.


Contributing authors are invited to create content for Search Engine Land and are chosen for their expertise and contribution to the search community. Our contributors work under the oversight of the editorial staff and contributions are checked for quality and relevance to our readers. The opinions they express are their own.


About the author

Elmer Boutin
Contributor
Elmer Boutin is the Head of SEO at Milestone Inc. where he supports website optimization efforts for clients. Following a career in the U.S. Army as a translator and intelligence analyst, he has worked in digital marketing for over 25 years doing everything from coding and optimizing websites to managing online reputation management efforts as an independent contractor, corporate webmaster and in agency settings. He has vast experience and expertise working for businesses of all sizes from SMBs to Fortune 5-sized corporations including Rocket Companies, PFS, Banfield Pet Hospital, Corner Bakery Cafe, Ford Motor Company, Kroger, Mars Corporation, and Valvoline; optimizing websites focusing on local, e-commerce, informational, educational and international.
