How to fix the SEO issues that keep you from achieving your goals

Effective SEO requires a holistic approach. JR Oakes deep-dives into his approach from his session at SMX Report.


At this year's SMX Report, I provided an overview of the SEO issues that hold us back from achieving our goals. I took a holistic look at the resources, communication and mental constructs around SEO that often hinder progress.

Oftentimes we look for the quick fixes that drive major ranking improvements. These still exist, but the relationships that connect us to clients, and the website to users, are where the most sustained value can be found.

Here are some questions to ask before we even get started with the fixes:

Is the company ready?


At LOCOMOTIVE, we work with a wide range of clients. One of the key benefits is that we see a wide range of problems, and we also develop insights into how different companies handle SEO from an implementation perspective. Three of the key factors for our clients with the greatest YoY organic wins are:

  1. Sufficient resources to implement,
  2. Acceptance of value of SEO across various stakeholding teams,
  3. Openness to testing and failing.

Technical SEO recommendations will impact a wide range of teams across your organization, from developers to content teams and more. If those teams are barely keeping up with a two-year backlog of issues, fresh technical SEO recommendations will probably never see the light of day.

I once had to make a business case to an IT lead for the SEO team to have access to Search Console and Google Analytics. The level of distrust of SEO was so high within the IT team that they actively worked to thwart any requests from our team. This was at the kickoff of our engagement. If you have teams actively working against your SEO priorities, it will not work.

If it takes you six months to build an accepted business case for adding two lines to the site's robots.txt file to exclude paths with a poor content experience, you will be very limited in what you are able to accomplish with SEO.

Is the SEO team ready?

It is not always clients creating a bottleneck to SEO growth. It is often the SEO agency team. Every SEO team should focus strongly on the following three areas:

  1. Communicating issues clearly,
  2. Prioritizing projects well,
  3. Testing and reporting on impact.

If you have ever sent a client raw output from Screaming Frog as a CSV and asked them to fix the 32,000 301 URIs, you are doing it wrong. This tells their teams:

  1. I don’t value your time.
  2. I don’t understand what is required to fix these issues.

It is on the SEO consultant to go through this list, look for site-wide 301s in the footer, clean up parameters (e.g., https://example.com/?sid=12345), and provide the client a clear, concise list of 301s that must be handled in themes or components, and 301s that must be resolved in content. A first pass at that triage can even be scripted, as sketched below.
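As a rough illustration, here is a minimal Python sketch of that triage, assuming pandas and a Screaming Frog 3xx export; the "Address" column name is an assumption, so match it to your file:

```python
from urllib.parse import urlsplit

import pandas as pd

# Load a Screaming Frog 3xx export. Column names vary by report and
# version, so "Address" here is an assumption -- adjust to your file.
df = pd.read_csv("screaming_frog_301s.csv")

def has_query(url: str) -> bool:
    return bool(urlsplit(str(url)).query)

# Redirects triggered only by tracking/session parameters (e.g. ?sid=12345)
# usually need one templating or component fix, not thousands of edits.
df["parameter_only"] = df["Address"].map(has_query)

# Group the remaining redirects by top-level directory so site-wide
# template links (e.g. a footer link) surface as one large bucket.
df["path_prefix"] = df["Address"].map(
    lambda u: "/".join(urlsplit(str(u)).path.split("/")[:2])
)

summary = (
    df[~df["parameter_only"]]
    .groupby("path_prefix")
    .size()
    .sort_values(ascending=False)
)
print(summary.head(20))
```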

To help prioritize issues in our tech audits, we like using a tool called Notion.

Notion allows us to prioritize all issues for clients, create views that are filtered to the relevant teams, and finally add very clear information on the what, why, and how as tickets to resolve each issue.

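If you want to populate those tickets programmatically, the official notion-client Python package can create database pages. A hedged sketch; the token, database ID and property names ("Issue," "Team," "Priority") are placeholders that must match your own workspace schema:

```python
from notion_client import Client  # pip install notion-client

notion = Client(auth="secret_token")  # placeholder integration token

notion.pages.create(
    parent={"database_id": "your-database-id"},  # placeholder
    properties={
        "Issue": {"title": [{"text": {"content": "Remove sid parameter from internal links"}}]},
        "Team": {"select": {"name": "Development"}},
        "Priority": {"select": {"name": "High"}},
    },
    children=[
        {
            "object": "block",
            "type": "paragraph",
            "paragraph": {
                "rich_text": [{"type": "text", "text": {"content":
                    "What: strip ?sid= from templated links. "
                    "Why: it fragments analytics and splits link signals. "
                    "How: update the link helper in the shared header component."}}]
            },
        }
    ],
)
```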

Outside of technical audits, we use the ICE (Impact, Confidence, Ease) method to help qualify and prioritize recommended growth projects.


This allows you and the client to quickly prioritize quick wins versus projects that will take significant resources.

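For context, ICE scores each project on Impact, Confidence and Ease (commonly 1-10 each) and combines them; one common convention multiplies the three, as in this minimal sketch with made-up projects:

```python
# Minimal ICE scoring sketch: Impact, Confidence and Ease are each
# scored 1-10 and multiplied, so quick wins (high ease) bubble up.
projects = [
    {"name": "Fix faceted-navigation crawl traps", "impact": 8, "confidence": 7, "ease": 3},
    {"name": "Add canonicals to parameter URLs", "impact": 6, "confidence": 8, "ease": 8},
    {"name": "Full site redesign", "impact": 9, "confidence": 4, "ease": 1},
]

for p in projects:
    p["ice"] = p["impact"] * p["confidence"] * p["ease"]

for p in sorted(projects, key=lambda p: p["ice"], reverse=True):
    print(f'{p["ice"]:>4}  {p["name"]}')
```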

Finally, showing the value of the projects you have worked on is critical to gaining the trust of internal teams as well as resources to tackle larger projects.

Using Google Data Studio can make this very easy and efficient. By creating easy-to-reuse templates, regular expressions for URIs, and sharing with the right stakeholders, you can quickly and easily show the value of the work to gain buy-in for larger projects.


SEO Issues

Most SEO issues can be broken down into a few categories: links, content, experience and relevance. I like talking about these items, rather than using technical jargon, because it helps relate back to things that are meaningful and understandable for non-SEOs.


Links, specifically <a> elements, are a vote that another URI is important. If you posted something to Reddit, you would not expect the post to gain wide visibility after receiving only one upvote. Links to pages on your website are similar.

Links are also discovery mechanisms for search engine crawlers. They help them find good, as well as bad, URIs. It is our job as SEOs to help them discover the good, but keep them from discovering anything bad. In this case, bad could mean a URI with no content, or a page specifically designed for your logged-in users. Essentially, "bad" pages are ones you don't want searchers landing on.


Having an up-to-date dynamic XML sitemap is the first step in solving for the “good” URIs. XML sitemaps help search engines find the content you want them to show to their users. An XML sitemap should list ALL URIs you want users to find on your website and nothing else.


Google gives site owners a tool called the Coverage report in Search Console. This will show you URIs that are indexed but aren't in your sitemap. If your XML sitemap holds all the "good" URIs, then this is a good place to look to see why other URIs are being indexed, and whether they should be.

The Coverage report will also show you URIs that were submitted in your sitemap but that Google decided not to include in its search results. This is often because something like your robots.txt file or a meta robots tag is telling search engines you don't want them to show the URI. In other cases, the URI is either not the best URI on your site for its topic, or it doesn't align as a good answer to topics with searcher demand.

You can use Google Analytics as ground truth for the pages users actually find. Again, using your XML sitemap as the baseline for your "good" URIs, comparing the pages users land on from organic search results to the pages in your sitemap is a good exercise for finding URIs you should include in your sitemap, or that you should be excluding. A minimal version of that comparison is sketched below.
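A minimal sketch of that comparison, assuming an XML sitemap file and a Google Analytics export of organic landing pages; the "Landing Page" column name and the example.com domain are assumptions:

```python
import xml.etree.ElementTree as ET

import pandas as pd

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

# "Good" URIs: everything declared in the XML sitemap.
tree = ET.parse("sitemap.xml")
sitemap_urls = {loc.text.strip() for loc in tree.findall(".//sm:loc", NS)}

# Organic landing pages exported from Google Analytics. GA exports
# relative paths, so we prefix the (assumed) domain.
ga = pd.read_csv("organic_landing_pages.csv")
landing_urls = {"https://example.com" + p for p in ga["Landing Page"]}

# Pages earning organic entrances but missing from the sitemap:
print("Candidates to add:", sorted(landing_urls - sitemap_urls)[:20])

# Sitemap pages that never receive organic landings:
print("Candidates to review:", sorted(sitemap_urls - landing_urls)[:20])
```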


If search engines are finding URIs they shouldn’t find, you should consider:

  1. Removing links to the URIs.
  2. Blocking search engines from accessing the path or URI in robots.txt.
  3. Asking search engines not to index the URI via a header or meta robots flag.
  4. Blocking access to the URI at the server level (e.g., 403 Forbidden).
  5. Removing the URI via the Search Console removal tool.

It is worth noting that site owners should carefully weigh options 2 and 3 above: blocking a URI in robots.txt will prevent search engines from reading and processing a meta robots or header noindex directive.
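For option 3, the noindex can be delivered as an X-Robots-Tag HTTP header instead of a meta tag, which is handy for non-HTML resources like PDFs. A minimal sketch using Flask (the route is a hypothetical logged-in-only page); remember the path must stay crawlable in robots.txt for the header to be read:

```python
from flask import Flask, make_response

app = Flask(__name__)

# Hypothetical logged-in-only path: let crawlers fetch it (so the
# directive is readable), but ask them not to index or follow it.
@app.route("/account/settings")
def account_settings():
    resp = make_response("Account settings")
    resp.headers["X-Robots-Tag"] = "noindex, nofollow"
    return resp

if __name__ == "__main__":
    app.run()
```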

If search engines aren’t finding a URI that you want users to find, consider:

  1. Adding more links from other pages to the URI.
  2. Asking other websites to link to the URI.
  3. Including the URI in your XML sitemap.

Content

Cartoon by HubSpot is licensed under CC BY-NC-SA 2.0

Content is the most often abused word in SEO. It is treated as an unhelpfully broad noun in most communications from Google and other SEOs: "Just make your content better." What if we treated it as a verb instead: to satisfy? Content is not text. In fact, there are millions of pages ranking right now with very little written content. Content "enhancement" by adding some entities or LSI keywords is not a far step from the keyword spamming of years past.

One of our biggest visibility wins in the last two years came from simply adding a downloadable PDF to certain pages where a PDF was strongly associated with the user intent of the page. Content is all about listening and designing an experience that satisfies what the user was trying to learn or do, as clearly and efficiently as possible.

From a technical perspective, there are things we can do to help measure how well pages satisfy users.

Parameters

Many companies use parameters to track usage across a website, or to perform other functions like sorting content or establishing user state. This can lead to situations where many URIs representing the same page are tracked in reporting tools.


For example, Google may send traffic to two separate versions of the same webpage because the site uses a sid parameter in internal links. This makes our lives harder as marketers: instead of seeing that a page has 880 user sessions and is an important page, the data is fragmented across multiple URIs.

To resolve these situations, we have a few tools:

  1. Excluding certain parameters in Google Analytics.
  2. Including a canonical link element pointing to the non-parameter version in the HTML.
  3. Updating internal links to remove unnecessary parameters.

It is important to note that internal links will almost always be a stronger signal for search engines than a canonical link element. Canonical link elements are a hint at the correct URI version. If Google finds forty internal links to https://example.com/page.html?sid=1234, even if you have https://example.com/page.html specified as the canonical version, the linked version will most likely be treated as the correct URI. A small helper for cleaning internal links is sketched below.
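A small helper along those lines, using only the Python standard library; the list of parameters treated as tracking-only is an assumption you would tailor per site:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Parameters assumed to carry no content meaning on this hypothetical site.
TRACKING_PARAMS = {"sid", "utm_source", "utm_medium", "utm_campaign"}

def clean_internal_link(url: str) -> str:
    """Drop tracking/session parameters so internal links and the
    canonical element point at the same URI."""
    parts = urlsplit(url)
    kept = [
        (k, v)
        for k, v in parse_qsl(parts.query, keep_blank_values=True)
        if k not in TRACKING_PARAMS
    ]
    return urlunsplit(
        (parts.scheme, parts.netloc, parts.path, urlencode(kept), parts.fragment)
    )

print(clean_internal_link("https://example.com/page.html?sid=1234"))
# -> https://example.com/page.html
```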

Feedback

Incorporate feedback mechanisms into your pages that report back to your analytics tools.

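How the vote reaches your analytics tool will vary; on a Universal Analytics property, one option is forwarding "Was this helpful?" votes server-side through the Measurement Protocol. A hedged sketch (the property ID and event names are placeholders):

```python
import uuid

import requests

def record_feedback(page_path: str, helpful: bool, tid: str = "UA-XXXXX-Y"):
    """Forward a 'Was this helpful?' vote to Universal Analytics via the
    Measurement Protocol. tid is a placeholder property ID."""
    requests.post(
        "https://www.google-analytics.com/collect",
        data={
            "v": "1",
            "tid": tid,
            "cid": str(uuid.uuid4()),  # anonymous client ID
            "t": "event",
            "ec": "feedback",                          # event category
            "ea": "helpful" if helpful else "not_helpful",  # event action
            "el": page_path,                           # label = page voted on
        },
        timeout=5,
    )

record_feedback("/docs/setup", helpful=False)
```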

Using this feedback can help you to sort pages that have the following issues:

  • Outdated content,
  • Content that didn't answer the user's question,
  • The wrong page linked in navigation,
  • Confusing content or the wrong media used.

Custom Metrics

Consider using custom metrics like read time, persona, jobs-to-be-done or logged-in vs. logged-out content to enhance how you report on your pages in analytics tools.

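As one example, read time can be estimated at publish time and attached to each page as a custom dimension or metric. A minimal sketch; the 238-words-per-minute figure is a commonly cited average adult silent-reading speed, not a standard:

```python
import math
import re

def estimated_read_minutes(html_text: str, wpm: int = 238) -> int:
    """Rough read time from visible word count. Stripping tags with a
    regex is crude but adequate for bucketing pages in analytics."""
    words = len(re.sub(r"<[^>]+>", " ", html_text).split())
    return max(1, math.ceil(words / wpm))

print(estimated_read_minutes("<p>" + "word " * 1200 + "</p>"))  # -> 6
```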

Ensure you are tracking site search queries in your analytics tools. Site search, over and over again, has proven to be a wonderful tool to diagnose:

  • Important pages that should be in the navigation.
  • Content you should be covering but are not.
  • Seasonal trends or outlier issues.

Cannibalistic Content

Cannibalistic content is problematic because you can lose control of the designed experience for users, and search engines can get confused and rotate through the URIs they show for specific search terms. Combining highly similar pages is a great strategy for both users and search engines.

If you click on a single search query in Search Console, Google will show you all the pages on your site that have competed for that query over the given time frame.


This can sometimes be confusing, because in many cases Google may display sitelinks in search results, causing multiple URIs to appear for a single query.

Focusing on non-brand queries (search queries that don't contain a discrete brand or product name) is often more fruitful for finding pages with truly cannibalistic content. If you are handy with Python, this data can be pulled from the Search Console API to create spreadsheets that count the number of URIs receiving clicks for the same query, as in the sketch below.
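A hedged sketch of that pull, assuming google-api-python-client, pandas and OAuth credentials already saved to token.json; the site URL and date range are placeholders:

```python
import pandas as pd
from google.oauth2.credentials import Credentials
from googleapiclient.discovery import build  # pip install google-api-python-client

creds = Credentials.from_authorized_user_file("token.json")
service = build("searchconsole", "v1", credentials=creds)

response = service.searchanalytics().query(
    siteUrl="https://example.com/",  # placeholder property
    body={
        "startDate": "2021-01-01",
        "endDate": "2021-03-31",
        "dimensions": ["query", "page"],
        "rowLimit": 25000,
    },
).execute()

df = pd.DataFrame(
    [
        {"query": row["keys"][0], "page": row["keys"][1], "clicks": row["clicks"]}
        for row in response.get("rows", [])
    ]
)

# Queries where more than one URI earns clicks are cannibalization candidates.
pages_per_query = (
    df[df["clicks"] > 0]
    .groupby("query")["page"]
    .nunique()
    .sort_values(ascending=False)
)
print(pages_per_query[pages_per_query > 1].head(25))
```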

Screaming Frog now has a content duplicates report that will allow you to crawl your website and quickly review content that is either exact or near-duplicate.


Finally, giving SEM teams a dedicated path for paid landing pages is a good strategy to ensure that disparate teams are not inadvertently creating cannibalistic content.


Experience


The experience users have on a website can affect visibility as well as revenue. In many cases, if a change is good for revenue, it will also be good for visibility, as search engines increasingly incorporate experience into their metrics for quantifying user satisfaction.

Page Speed

Two of the best ways to make page speed a priority for a company are to tie it to revenue loss or to position it as an area where a competitor is winning.

Google Analytics has limited page speed metrics, and for smaller sites it can give strongly biased averages from small timing samples. But by increasing the timing sample rate, and aligning metrics like document interactive time with meaningful revenue decline, you can get the ammo you need to prioritize speed work. One way to pull that data is sketched below.
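One way to pull those timing metrics next to revenue is the Analytics Reporting API v4. A sketch under the same credential assumptions as earlier; the view ID is a placeholder, and the metric names come from GA's Site Speed group:

```python
from google.oauth2.credentials import Credentials
from googleapiclient.discovery import build  # pip install google-api-python-client

creds = Credentials.from_authorized_user_file("token.json")
analytics = build("analyticsreporting", "v4", credentials=creds)

# ga:avgDomInteractiveTime pairs document interactive timing with revenue
# per landing page; ga:pageLoadSample shows how trustworthy each average is.
response = analytics.reports().batchGet(body={
    "reportRequests": [{
        "viewId": "123456789",  # placeholder view ID
        "dateRanges": [{"startDate": "90daysAgo", "endDate": "yesterday"}],
        "dimensions": [{"name": "ga:landingPagePath"}],
        "metrics": [
            {"expression": "ga:avgDomInteractiveTime"},
            {"expression": "ga:pageLoadSample"},
            {"expression": "ga:transactionRevenue"},
        ],
    }]
}).execute()

for row in response["reports"][0]["data"].get("rows", []):
    page = row["dimensions"][0]
    interactive, sample, revenue = row["metrics"][0]["values"]
    print(page, interactive, sample, revenue)
```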


One of my favorite reports to share with developers is the Measure report from web.dev. Not only does it provide an overview with prioritized issues and guides to resolution, but it also links to a Lighthouse report so developers can drill down into the details of individual issues.


Web.dev also provides a link to a handy CrUX Data Studio dashboard that will make it easy to see improvements and celebrate with a larger set of internal stakeholders.

Microsoft Clarity

Clarity is a wonderful free tool from Microsoft that connects with Bing Webmaster Tools and provides a rich tapestry of experience metrics as well as individual session recordings. There is no better way, in my opinion, to understand user experience than reviewing session recordings. You can see when people are reading, when they have to close 15 popups, when the hamburger menu keeps closing unexpectedly, and whether other things are getting in the way of what you want them to do.


Understanding Intent

In your analytics tool, adding second page and exit page to landing page reports can give you really good information on what users want and their path to getting it. Does the landing page include links to the information they were looking for? Did users end up navigating to another page for an answer that should have been on the landing page?


Hidden Issues

Getting in the habit of opening the Developer Tools Console in Chrome when visiting pages is a good way to spot hidden errors that may be impacting users or your metrics.

Errors here can lead to:

  • Incomplete tracking information,
  • Missing content,
  • Insecure pages,
  • Poor page performance.
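To spot these errors at scale rather than page by page, you can sweep a URL list with a headless browser and collect the console log. A minimal sketch using Selenium 4 with Chrome; the URLs are placeholders:

```python
from selenium import webdriver  # pip install selenium

# Chrome-only sketch: surface console errors the way a human would see
# them in DevTools, but across a list of URLs.
options = webdriver.ChromeOptions()
options.add_argument("--headless")
options.set_capability("goog:loggingPrefs", {"browser": "SEVERE"})

driver = webdriver.Chrome(options=options)
for url in ["https://example.com/", "https://example.com/pricing"]:
    driver.get(url)
    for entry in driver.get_log("browser"):
        print(url, entry["level"], entry["message"])
driver.quit()
```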

Relevance

Relevance, to me, is how well a page aligns to and covers what the user was looking for. This is not at the keyword level; rather, does the page provide the answer or solution to the overriding intent behind the user's search query?

In Google Data Studio, you can quickly pull user searches and landing pages along with other informative metrics like clicks and impressions.


Downloading these to a CSV and using a simple pivot table in Excel or Google Sheets allows you to get a high-level view of what the best relevance engine on the planet, Google, thinks your page is about. The same pivot is easy to script, as sketched below.
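If you would rather script the pivot than build it by hand, pandas does the same thing in a few lines; the column names below are assumptions to match to your export:

```python
import pandas as pd

# CSV exported from the Data Studio table; "Query", "Landing Page",
# "Clicks" and "Impressions" are assumed column names -- match your export.
df = pd.read_csv("queries_landing_pages.csv")

# For each landing page, the queries Google shows it for most often --
# i.e., what Google currently thinks the page is about.
top_queries = (
    df.sort_values("Impressions", ascending=False)
      .groupby("Landing Page")
      .head(10)[["Landing Page", "Query", "Impressions", "Clicks"]]
)
print(top_queries.to_string(index=False))
```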


Since this page on the LOCOMOTIVE website is designed to sell technical SEO services, we can quickly see where the page is relevant for queries it perhaps shouldn't be.

This is an opportunity for us to update the page with more text describing the types of analyses we offer, speak to the benefits of technical SEO, and talk about our credentials as an agency. Some of the searches were aligned with the goal of the page; the rest were an opportunity for us to produce more educational content that goes deeper into the details and mechanics of technical SEO.


In addition, understanding your authority and expertise, as a search engine would see it, is critical to understanding what you can be relevant for. Around 2019, Google started elevating rankings for some absurdly unoptimized sites. Many of these were local government sites that had never seen an SEO, and rarely a developer or designer. Google had simply gotten better at understanding the authority attributed to websites.

Search engines can also use the entire corpus of a site's text content, authors, links and so on to understand the expertise a site has for a given subject area. Writing new content that is aligned with your website's subject-matter expertise, or your civic authority, will almost always perform better than content that isn't. This also aligns with the concept that "a rising tide floats all boats": over time, the more you demonstrate your expertise in new content, the more additive the impact on all content in that subject area.

Finally, the last two areas around relevance include knowing what you can be relevant for, and understanding when Google adds relevance for you.

If I worked at an energy company and it was suggested that we name a new product plan "unlimited utilities," then unless substantial money was spent building awareness of the name, it is very unlikely users would ever find our landing pages via Google searches, because Google understands the phrase as a navigational term for a specific company.


I like to think that Google just includes things that it knows about me in my search text. For example, Google knows that I am in Raleigh, so it effectively adds a +raleigh to my search.

I am sure it is more complex than this, but as a mental construct, it is useful to consider that Google feeds your location, search history and so on into the processing of your search to provide results more tailored to you.


Wrapping Up

Effective SEO requires a holistic approach. These are the key elements to coming at it from every angle:

  1. Companies need the team buy-in and resources to succeed with SEO.
  2. SEO teams should focus on clarity of communication and efficient prioritization.
  3. The key areas to consider in SEO strategies are Links, Content (page satisfaction), Experience, and Relevance.
  4. GIGO (garbage in, garbage out) is a real thing. Taking the time to go slow with accurate XML sitemaps, custom metrics, user feedback mechanisms, etc., can make your life easier and give you data to inform growth.
  5. Spend some time watching user sessions. You will thank me.
  6. Work hard to ensure your pages solve a problem or provide the right answer.
  7. Look at how your page’s content aligns with user searches provided by Google. 
  8. Write to support and build your site’s subject matter expertise. Credibility is key.

Want to watch the full session and others from SMX Report on-demand? Register here.




About the author

JR Oakes
Contributor
JR Oakes is the VP of Strategy at LOCOMOTIVE. He has been a member of the SEO industry since 2011. Prior to that, he was an architectural glass designer, specializing in leaded and carved glass. He attended the Design School at NC State University. JR has been active in the SEO community in Raleigh, helping to organize the Raleigh SEO Meetup and Beer and SEO Meetup. In addition, he is one of the founders and moderators of /r/TechSEO on Reddit. An avid technophile, he enjoys applying new technologies to existing problems and posts a lot of open-source code on his GitHub profile. He has worked with some of the largest brands in the world and is passionate about sharing his knowledge and work with others. He has written for Search Engine Land, Search Engine Watch, opensource.com, and several other industry tools and publications.
