The definitive SEO audit part 1 of 3: Technical
In the first of his three-part series on SEO audits, columnist Dave Davies explores the technical aspects he reviews when conducting an audit.
Disclaimer: Every situation is unique. This outline of the elements of a technical SEO audit covers the common first points I look at for unpenalized sites hoping to increase their traffic. If you have a penalty or serious technical issues, this list is not exhaustive and will not cover all the areas you'll need to research or the methods you'll need to employ.
Performing an SEO audit or having one performed for you is an opportunity to understand what’s working for your website, what is not, and where opportunity lies. While every site is different, there are a number of areas to be reviewed that are common among nearly every one. This three-part series is going to cover:
- Technical SEO
- Content (or “On-site”) SEO
- Off-site SEO
Today's installment will discuss the technical SEO audit. The first place I almost always start when taking on a new site or performing an audit is with the technical side. It's in this area that you can find some of the quickest wins, improve your user experience and make the most of everything that follows. Here are some of the key areas to review and address:
1. Indexing
The first thing you want to check is how many pages are indexed by the search engines. Simply navigating to your desired search engine and entering [site:domain.com] will produce the number of your indexed pages and will display the pages the engine deems most important.
You may instinctively know whether this number roughly matches your total number of pages; if not, you can crawl the site yourself using tools such as Xenu (free) or Screaming Frog (free up to 500 URLs and £99 per year for unlimited). You'll need one or both of these tools further on in the auditing process, so you might as well download one of them now. (Xenu doesn't give you the number of pages by default, but it does create a sitemap, which you can paste into an Excel document and count the rows.)
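If you'd rather script a rough page count than run a desktop crawler, a few lines of Python will do. This is a minimal sketch, assuming the requests and beautifulsoup4 packages are installed; the start URL is a placeholder, and a dedicated crawler like Screaming Frog will handle redirects, canonicals and JavaScript far better.

```python
# Minimal breadth-first crawl of internal links to estimate how many pages
# a site exposes. START_URL is a placeholder; requests and beautifulsoup4
# are assumed to be installed.
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

START_URL = "https://www.example.com/"
DOMAIN = urlparse(START_URL).netloc
MAX_PAGES = 500  # safety cap so the crawl can't run away

seen = set()
queue = [START_URL]

while queue and len(seen) < MAX_PAGES:
    url = queue.pop(0)
    if url in seen:
        continue
    seen.add(url)
    try:
        resp = requests.get(url, timeout=10)
    except requests.RequestException:
        continue
    if "text/html" not in resp.headers.get("Content-Type", ""):
        continue
    soup = BeautifulSoup(resp.text, "html.parser")
    for a in soup.find_all("a", href=True):
        link = urljoin(url, a["href"]).split("#")[0]
        if urlparse(link).netloc == DOMAIN and link not in seen:
            queue.append(link)

print(f"Found {len(seen)} crawlable internal pages")
```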
It's worth the additional effort of reviewing the top-listed pages to make sure they make sense. If your home page or other key pages are not high on the list, you may have a penalty, or your internal linking structure may be so poor that priority isn't being passed to your most important pages.
2. Robots.txt
If you find your number of indexed pages isn’t right (and even if it is), the next step is to check your robots.txt file (located at domain.com/robots.txt). Here, you’re simply looking to make sure that you are not blocking the search engine crawlers from anything you’d like indexed.
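If you want to confirm programmatically that key URLs aren't blocked, Python's built-in robots.txt parser can check them against the rules Googlebot would see. A small sketch, with placeholder URLs:

```python
# Check whether specific URLs are blocked by robots.txt for a given crawler.
# Uses Python's standard-library robotparser; the URLs below are placeholders.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser("https://www.example.com/robots.txt")
rp.read()

for url in ["https://www.example.com/", "https://www.example.com/blog/"]:
    allowed = rp.can_fetch("Googlebot", url)
    print(f"{url}: {'allowed' if allowed else 'BLOCKED'}")
```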
3. Preferred domain
Many sites are accessible at http://domain.com/, http://www.domain.com/, https://domain.com/ and https://www.domain.com/, and perhaps even at all of those URLs with index.php, index.html or index.asp at the end.
While Google is far better than it once was at unifying different URLs, you'll still want to ensure that all of these separate URLs 301 redirect to a single one. This ensures there's never any confusion about which version of the page to rank, and it unifies all incoming link weight at one point.
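A quick way to verify this is to request each variant and confirm it ends up at your preferred URL via a permanent redirect. A sketch with placeholder domains:

```python
# Confirm that common URL variants permanently redirect to a single
# preferred domain. The domain names here are placeholders.
import requests

CANONICAL = "https://www.example.com/"
variants = [
    "http://example.com/",
    "http://www.example.com/",
    "https://example.com/",
    "https://www.example.com/index.html",
]

for url in variants:
    resp = requests.get(url, allow_redirects=True, timeout=10)
    first_hop = resp.history[0].status_code if resp.history else resp.status_code
    ok = resp.url == CANONICAL and first_hop in (301, 308)
    print(f"{'OK   ' if ok else 'CHECK'} {url} -> {resp.url} (first hop {first_hop})")
```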
4. Sitemaps
Until a few months ago, I'd have written that the only requirement was to produce an XML sitemap and submit it through your search console. As of February 2016, however, the Webmaster Guidelines changed to include a recommendation to have a human-readable sitemap as well. If you don't have a human-readable sitemap, Xenu will generate one for you.
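If your CMS doesn't produce an XML sitemap for you, even a short script over your crawled URL list will do the job. A minimal sketch with placeholder URLs (the urlset namespace is the standard sitemaps.org one):

```python
# Write a minimal XML sitemap from a list of URLs. The URLs are placeholders;
# in practice you'd feed in the list from your crawl.
urls = [
    "https://www.example.com/",
    "https://www.example.com/about/",
    "https://www.example.com/contact/",
]

lines = [
    '<?xml version="1.0" encoding="UTF-8"?>',
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">',
]
for u in urls:
    lines.append(f"  <url><loc>{u}</loc></url>")
lines.append("</urlset>")

with open("sitemap.xml", "w", encoding="utf-8") as f:
    f.write("\n".join(lines))
```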
5. Broken internal links
Both as an SEO and a user, I hate broken links. For a user, they’re understandably frustrating — and after all the hard work to get a visitor to your site, you don’t want to annoy them. From an SEO standpoint, a broken link is essentially wasted PageRank. Link juice tries to flow to the page, and when it can’t get there, it simply evaporates.
There are many tools you can use to hunt down broken links, including Google Search Console, Xenu and Screaming Frog. I find Xenu to be the fastest and easiest to use for this task.
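The same check is easy to script for a single page if you want a quick spot check between full crawls. A sketch, assuming requests and BeautifulSoup are installed, with a placeholder URL:

```python
# Report internal links on one page that return a 4xx/5xx status.
# PAGE_URL is a placeholder.
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

PAGE_URL = "https://www.example.com/"
domain = urlparse(PAGE_URL).netloc

html = requests.get(PAGE_URL, timeout=10).text
for a in BeautifulSoup(html, "html.parser").find_all("a", href=True):
    link = urljoin(PAGE_URL, a["href"])
    if urlparse(link).netloc != domain:
        continue  # external links are a separate check
    status = requests.head(link, allow_redirects=True, timeout=10).status_code
    if status >= 400:
        print(f"BROKEN ({status}): {link}")
```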
6. HTTPS
It might be obvious, but I’d be remiss not to include it. If a site isn’t secure, this is one of the first things I recommend. There are a few scenarios where there are pros and cons to weigh, such as large-scale publishers fearing the loss of their social share counts, but for most sites, it’s a given.
If you are pondering making the switch, however, I highly recommend selecting a time when a hiccup in rankings is tolerable. If your busiest season is during the holidays, now would be a far better time to make the switch than late October. Such hiccups are not as common as they once were and tend not to last long, but with any URL change, minimizing fallout must be a top priority.
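Once a site is on HTTPS (or while you're planning the move), it's also worth keeping an eye on the certificate itself. A small sketch using only Python's standard library; the hostname is a placeholder:

```python
# Connect over TLS and report when the site's certificate expires.
# HOST is a placeholder; only the Python standard library is used.
import socket
import ssl
from datetime import datetime, timezone

HOST = "www.example.com"

context = ssl.create_default_context()
with socket.create_connection((HOST, 443), timeout=10) as sock:
    with context.wrap_socket(sock, server_hostname=HOST) as tls:
        cert = tls.getpeercert()

expires = datetime.fromtimestamp(ssl.cert_time_to_seconds(cert["notAfter"]), tz=timezone.utc)
print(f"{HOST}: certificate expires {expires:%Y-%m-%d}")
```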
7. Mobile-friendliness
If your site does not provide a good user experience for mobile visitors, this can negatively impact your mobile rankings — and there’s a decent chance that will impact your desktop rankings in the near future.
You can test your site with Google’s own Mobile-Friendly Test tool. If you fail, it’s time to call your developer or roll up your sleeves.
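The Mobile-Friendly Test is the authority here, but if you want a quick first pass across many pages, the presence of a responsive viewport meta tag is a rough but useful heuristic. A sketch with a placeholder URL:

```python
# Rough heuristic only: flag pages that don't declare a viewport meta tag.
# This is no substitute for Google's Mobile-Friendly Test.
import requests
from bs4 import BeautifulSoup

PAGE_URL = "https://www.example.com/"  # placeholder
soup = BeautifulSoup(requests.get(PAGE_URL, timeout=10).text, "html.parser")
viewport = soup.find("meta", attrs={"name": "viewport"})
print("viewport:", viewport.get("content", "") if viewport else "MISSING")
```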
8. Google PageSpeed
Speeding up your site improves your perceived value in Google's eyes. It's important enough that Google provides a tool called PageSpeed Insights that will scan any page you enter and provide not just a score (on a scale of 0 to 100) but also recommendations on how to fix issues. Their grading of “OK” starts at 65, and their grading of “Good” starts at 85. Obviously, the higher the better.
Enjoy the process here; this is one of the very few times in SEO when you get to see a direct numeric value given immediately to an action you take. This tool is also accessible via “Site Speed” in Google Analytics.
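If you'd rather pull scores in bulk than paste URLs into the web tool one at a time, PageSpeed Insights also has an API. This sketch assumes the current v5 endpoint and response shape, both of which may change, and an API key is recommended for anything beyond occasional use; the page URL is a placeholder.

```python
# Query the PageSpeed Insights API (assumed v5 endpoint) for a page's
# performance score. The 0-1 score is scaled to the familiar 0-100 range.
import requests

PAGE_URL = "https://www.example.com/"  # placeholder
API = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

data = requests.get(API, params={"url": PAGE_URL, "strategy": "mobile"}, timeout=60).json()
score = data["lighthouseResult"]["categories"]["performance"]["score"]
print(f"PageSpeed performance score: {round(score * 100)}")
```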
9. Real page speed
Some may debate whether user behavior is a ranking signal. I am firmly in the “it is” camp, and so, beyond the PageSpeed score mentioned above (which can often be increased with little real-world speed gain for a visitor), measuring your actual load times and reducing or eliminating bottlenecks can greatly improve the experience on a site.
There are a number of tools to aid in this (including Google Analytics to a degree), but my personal favorite is the free-to-use Web Page Test. It will only do a single page at a time; however, it gives you a waterfall view of each resource call and the time it takes, making repairing issues far easier.
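For quick spot checks between WebPageTest runs, you can time a request yourself. This only measures the single HTML response (no images, scripts or rendering), so treat it as a floor rather than the full picture; the URL is a placeholder.

```python
# Measure time-to-first-byte and total HTML download time for one page.
# Ignores images, CSS, JS and rendering, so it understates real load time.
import time

import requests

PAGE_URL = "https://www.example.com/"  # placeholder

start = time.perf_counter()
resp = requests.get(PAGE_URL, stream=True, timeout=30)
ttfb = time.perf_counter() - start  # headers received
body = resp.content                 # force the body to download
total = time.perf_counter() - start

print(f"TTFB: {ttfb:.3f}s  total: {total:.3f}s  ({len(body)} bytes)")
```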
10. Internal link structure
Every link on a page reduces the value of the others. This is not to say you should run off and eliminate all your links, but when looking at your site, you must consider which pages are most important to both visitors and search engines and make sure those are the ones being prioritized.
We’ve all seen sites with hundreds of links in a massive, unusable navigation. This is not only a nightmare for the visitor (and likely horrible for your load times), but it also dilutes the passing of PageRank so much that it becomes unreliable at best. I always look for a focused internal linking structure that’s designed to pass visitors and weight to the more important targets.
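A quick way to spot navigation bloat is simply to count the links on a few key templates. A sketch with a placeholder URL:

```python
# Count total and unique links on a page as a rough proxy for navigation bloat.
import requests
from bs4 import BeautifulSoup

PAGE_URL = "https://www.example.com/"  # placeholder
soup = BeautifulSoup(requests.get(PAGE_URL, timeout=10).text, "html.parser")
hrefs = [a["href"] for a in soup.find_all("a", href=True)]
print(f"{PAGE_URL}: {len(hrefs)} links ({len(set(hrefs))} unique)")
```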
11. Broken external links
Google Search Console is the first place to look for your broken external links — that is, links to your site that hit a 404 page. Many sites have them, and they can be the result of the webmaster on the other site making an error, but that doesn’t mean you don’t need to fix them.
I regularly visit the Crawl Errors section of the Search Console for every client and fix any broken links by redirecting them to working pages. The reason? If you think about how long it takes to develop a good link, and then compare that with the time to set up a 301 redirect, the answer quickly presents itself.
This step is especially crucial if the site has ever undergone a redevelopment that has required new URLs.
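Once you've mapped old 404ing URLs to their replacements, it's worth confirming that each redirect actually lands on a working page, as in this sketch (the URL pairs are placeholders for your own old-to-new mapping):

```python
# Confirm that redirected legacy URLs land on working pages.
# The URL pairs below are placeholders.
import requests

redirects = {
    "https://www.example.com/old-page/": "https://www.example.com/new-page/",
    "https://www.example.com/old-category/": "https://www.example.com/new-category/",
}

for old, expected in redirects.items():
    resp = requests.get(old, allow_redirects=True, timeout=10)
    ok = resp.url == expected and resp.status_code == 200
    print(f"{'OK   ' if ok else 'CHECK'} {old} -> {resp.url} [{resp.status_code}]")
```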
12. Duplicate titles & descriptions
Duplication of title and meta description tags isn't just bad for SEO; it's also a bad experience for searchers. Screaming Frog is the easiest tool I've found for quickly detecting and reporting duplicate titles and descriptions (or H1 tags, or… ). To use it, you simply run the crawl, select the element you're interested in finding duplicates of (H1, for example), and order by occurrences.
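If you'd like a scripted second opinion, the same duplicate check is easy to run over a list of URLs, such as the ones from your crawl. A sketch with placeholder URLs:

```python
# Group pages by <title> and flag any title used on more than one URL.
from collections import defaultdict

import requests
from bs4 import BeautifulSoup

urls = [  # placeholders; substitute your crawled URL list
    "https://www.example.com/",
    "https://www.example.com/about/",
    "https://www.example.com/contact/",
]

by_title = defaultdict(list)
for url in urls:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    title = soup.title.get_text(strip=True) if soup.title else ""
    by_title[title].append(url)

for title, pages in by_title.items():
    if len(pages) > 1:
        print(f"Duplicate title ({len(pages)} pages): '{title}'")
        for page in pages:
            print(f"  {page}")
```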
Other common technical SEO issues
As noted previously, each site is different, and your site may have technical issues not covered here. The above list details the first dozen things I review, and I often find new issues while I’m doing so. I mention this as a reminder to keep your eyes open as you go through the data from the various tools and look for anything that doesn’t quite seem right.
Here are a couple of common areas I didn’t include above and why:
- Keywords in URL. There's no doubt that having keywords in your URL has a positive impact on rankings. I didn't include it in the list because redirecting URLs can have a negative impact on rankings, and sites with good links to their internal pages can do more harm than good by changing their URL structure just to chase a small advantage. If you're creating new pages, however, including keywords in the URL is recommended.
- Above-the-fold content. Content that’s meant to be seen is given a higher priority in Google’s eyes than content buried “below the fold” (that is, content you need to scroll to see). From a pure SEO standpoint, it would be great if all the keywords, H1 tags and juicy content could be placed right below the logo. Back in the real world, however, there are a lot of core usability and design practices that don’t allow for this. It is something I always consider, but if the user is better served by a design that doesn’t have a lot of above-the-fold content, then we need to understand that some SEO weight may be lost and find different ways to get it back.
More SEO Audits
In my next two articles, I’ll be looking at SEO audits for content and off-site signals. The off-site portion will cover links, social and other SEO areas that aren’t directly on your site.
I invite your feedback, so if you have any items in your own checklists that apply to virtually every site, feel free to let me know on social media.