Google Webmaster Tools can be a powerful ally. But, if you make a mistake or put this power in the wrong hands, it can mean trouble for your search engine optimization. In this post, I provide a basic SEO Guide to Webmaster Tools to help get you started if you aren’t taking full advantage of WMT yet.
It is very important to point out that some of these mistakes are more detrimental than others. Also, multiple articles could be written about each of these tools and reports. This SEO guide to Webmaster Tools is a simple overview with a little insight.
Messages: Spam Warnings & Other Notifications
Priority: Medium / High
Many of us know that Google sends a message to your Webmaster Tools account if there is an issue with your site. If you don’t check your Webmaster Tools messages frequently, you could miss an important piece of information.
An example of an important message would be an unnatural link notification. A message such as this could be an indicator of a major issue, or it could amount to almost nothing at all. It really all depends on how Google plans on dealing with your particular situation. Regardless, if you get any type of notification, it is important to figure out why.
In the settings tab, you can do three things: set a geographic target, a preferred domain and a crawl rate.
According to Google, “If your site has a neutral top-level domain, such as .com or .org, geotargeting helps Google determine how your site appears in search results, and improves our search results for geographic queries. If you don’t want your site associated with any location, select Unlisted.”
Make sure to set this up so that it targets your intended geographic market.
Google states, “If you specify your preferred domain as http://www.example.com and we find a link to http://example.com, we’ll consider both links the same.”
I always recommend setting a preferred domain based on the way you want your website indexed. To do this, you may need to verify ownership of both the www and non-www versions of your domain.
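The preferred-domain setting only tells Google which version you prefer; it helps to back it up with a server-side redirect so every visitor and crawler lands on the same version. As a rough sketch (assuming an Apache server with mod_rewrite enabled, and example.com as a placeholder domain), an .htaccess rule forcing the www version might look like this:

```apache
RewriteEngine On
# Send the non-www host to the www version with a permanent (301) redirect
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

This way, the redirect and the Webmaster Tools setting reinforce each other instead of leaving Google to guess.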
The last option in this area allows you to modify crawl rate. Google says, “Our goal is to crawl as many pages from your site as we can without overwhelming your server’s bandwidth. You can change the crawl rate (the speed of Google’s requests during the crawl) for sites at the root or subdomain level – for example, www.example.com and http://subdomain.example.com. The new custom crawl rate will be valid for 90 days.”
In most cases, webmasters will let Google optimize the crawl rate for the website. But in some cases, Googlebot may cause issues that make it necessary to alter the rate.
Every website owner wants good sitelinks. If you don’t know, sitelinks are the links that show up under your domain name in Google search results.
Below are some small sitelinks.
Below are some large sitelinks.
Sitelinks are determined based on how much authority the domain has for a particular query. In many cases, we build sitelinks to assist with online reputation management because it pushes negative information further down the page.
No matter what your sitelinks look like, this section of Webmaster Tools allows you to demote your sitelinks. So, if there is one that you do not want listed for some reason, you can remove it.
A word of caution here. I have seen people demote one sitelink and then lose all of their sitelinks for months. Make sure you really want to demote that sitelink before doing so.
Webmaster Tools has a setting that allows you to specify URL parameters and request that Google crawl certain URLs and not crawl other URLs. This is an incredibly powerful tool. If you make a mistake, it can mean that a large chunk of your site is removed from the index.
I always recommend that people stay away from this tool in general. In my humble opinion, it is a better option to use rel=canonical, noindex/nofollow, a 301 redirect or robots.txt for most of the issues this tool alleviates.
But, if you need to use this tool, make sure you set it up correctly. Also, make sure the person enabling the tool knows the URL structure of the website inside and out.
If the site has a clean URL structure and the right person is using this tool, it can of course work well.
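To illustrate the rel=canonical alternative mentioned above (the URL and parameter names here are hypothetical), a parameterized page can point search engines at its clean version with a single tag in the head:

```html
<!-- On http://www.example.com/shoes?sort=price&sessionid=123 -->
<link rel="canonical" href="http://www.example.com/shoes">
```

Google then consolidates the parameterized variations onto the canonical URL, without the risk of accidentally de-indexing a whole section of the site.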
Change of Address
If you move, you need to tell Google, and you need to do it correctly!
According to Google, “If you’ve moved your site to a new domain, you can use the Change of address tool to tell Google about your new URL. We’ll update our index to reflect your new URL. Changes will stay in effect for 180 days, by which time we’ll have crawled and indexed the pages at your new URL.”
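The Change of address tool works best when paired with permanent redirects from the old domain to the new one. A minimal sketch, assuming an Apache server with mod_rewrite and placeholder domain names:

```apache
RewriteEngine On
# 301-redirect every URL on the old domain to the same path on the new domain
RewriteCond %{HTTP_HOST} ^(www\.)?old-domain\.com$ [NC]
RewriteRule ^(.*)$ http://www.new-domain.com/$1 [R=301,L]
```

Preserving the path in the redirect (rather than sending everything to the new home page) helps Google map old URLs to their new equivalents.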
Excessive Crawl Errors
If your website is not working and errors are occurring on a regular basis, this needs to be dealt with. In so many cases, I see sites with thousands of errors that are never addressed. Each error that affects usability can mean a potential lost customer.
This report shows you pages crawled per day, kilobytes downloaded per day and time spent downloading a page. Incidentally, there is a new Google Analytics report that also shows time downloading a page. The Analytics report, in my opinion, is much cooler. But overall, I like this set of reports. Think about it like this:
- The more pages crawled the better (shows Google is interested and checking out content)
- The more kilobytes downloaded the better (unless Google is wasting time on an area that should not be touched)
- The more time spent downloading a page, the worse off you are. As you probably know, page speed is an SEO factor. However, in some cases certain media just might take a long time to download.
Note: these rules are not applicable to all websites in all situations. They are just general guidelines.
Priority: Medium / High
In this report you can see the URLs that are blocked by robots.txt. In some cases, you will see areas of your site that are blocked which should not be. So, when you see this, unblock them!
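For reference, robots.txt is just a plain text file at the root of the domain. A simple hypothetical example that blocks an admin area and internal search results while leaving everything else open:

```text
User-agent: *
Disallow: /admin/
Disallow: /search/
```

If the blocked URLs report shows content pages matching one of your Disallow rules, that rule is the first thing to review.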
Fetch As Google
The fetch as Google tool allows you to retrieve a page of your website as if you were Google. This can be very helpful if you want to verify whether or not a page is accessible. Sometimes, with very large sites, there are so many things going on, it can be great to have this simple tool to turn to. It can give you a straight answer. Can Google grab the page or not?
This tool also has the option to fetch pages as Googlebot-Mobile. This can be very helpful, particularly because of the elements that need to be put in place for the different forms of mobile optimization.
The index status report shows you how many URLs are indexed out of all of those that Google can find on the website.
Here is an example of one way you can look at this report: say your website has 300 URLs in the sitemap; these are URLs you want to get indexed and probably the only ones you are aware of on the site. But, the index report shows that you have 3,000 URLs that are indexed out of 20,000 potential URLs.
Inconsistencies such as this scream of issues with canonical URLs, duplicate content or just a webmaster who does not know what he or she is doing.
When someone injects code into your forum or comments area, that is an issue. Google will see this and deem it malware. When someone then visits your website, there is a chance a warning will be displayed saying the site is not safe for users. For this very reason, it is important to check your malware report.
The search queries tab gives you a rough idea of the number of Impressions and Clicks your URLs are getting in the Google Index. It shows your top queries and even breaks them down by Mobile, Image, Video, Web, Location and Traffic.
This report can give you an idea of rankings and traffic. But, it is very unreliable. Overall, it just provides a rough idea of where things are. If the chart has a spike or a trough, then it is a good idea to investigate.
Links To Your Site
Priority: Medium / High
We all know a link from a bad website can hurt you. We also probably all know that if you have too much anchor text for a keyword, you can trigger an algorithmic penalty. In the link report in Webmaster Tools, you can see who has linked to you. You can also see your top anchor text. If you are not ranking for the term that is in your top anchor text, there is a good chance you need to get rid of some of those links.
This is the area where you can export all of the links pointing to your website and then review them. When you need to disavow links prior to a reinclusion request you’ll be spending some time in this area. In particular, when I am in this report, I look for low-quality websites that have tons of links pointing at the site. Usually, a global link from a low-quality site is a red flag for Google. We could go on and on about how to evaluate links here, but the important thing to know is that this tool exists.
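If you want to go beyond eyeballing the export, a short script can surface the domains with the most links pointing at your site. This is a sketch only; it assumes you have reduced the export to a plain list of linking-page URLs (all sample URLs below are made up):

```python
from collections import Counter
from urllib.parse import urlparse

def top_linking_domains(urls, n=10):
    """Count how many backlink URLs come from each linking domain."""
    domains = Counter(urlparse(u).netloc.lower() for u in urls)
    return domains.most_common(n)

# Hypothetical sample of linking URLs from a Webmaster Tools export
links = [
    "http://spammy-directory.example/page1",
    "http://spammy-directory.example/page2",
    "http://spammy-directory.example/page3",
    "http://news-site.example/article",
]
print(top_linking_domains(links, 2))
# [('spammy-directory.example', 3), ('news-site.example', 1)]
```

A domain that accounts for a disproportionate share of your links, especially a low-quality one, is exactly the kind of global-link red flag worth a closer look.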
Good internal linking really helps Google find pages. Also, when it is done correctly, it can increase your rankings. The idea is that each link to a page, whether internal or external, is a vote to rank that page higher in Google. So, if this is the case, consider the value internal linking can bring to the table.
The more links you point to a page, the higher the authority of that page in Google’s eyes. So, make sure to link to your most important pages for search often. But, just like everything in life, you don’t want to overdo it. Keep your internal linking within reason, and keep each page’s share of internal links in line with the rest of the site.
Priority: Medium / High
When it comes to sitemaps, you should be submitting at least one to Webmaster Tools; that would be the basic XML sitemap. Outside of this, you may also submit an image sitemap, news sitemap, video sitemap or mobile sitemap.
Regardless, check this area to make sure your sitemap is submitted and there are no errors. The more sitemaps the better, if you have the content to support them.
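If you are building that basic XML sitemap by hand, the format is simple. A minimal example with a single URL (the domain and date are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2013-05-01</lastmod>
  </url>
</urlset>
```

Each additional page gets its own url element; once the file is live, submit its location in this area of Webmaster Tools.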
Priority: Medium / High
Google has a URL removal tool that allows you to request that a URL be taken out of the index. Of course, you can only use this tool if you own the website. But, the issue is that people use this tool when they don’t understand it. In one case, I saw an employee use this tool to remove the home page of a very, very large website. No one could understand the huge drop in traffic until I found the request.
Recently, I saw a developer use this tool to remove over 50 URLs. Each of these URLs had PageRank and would have been better served by a 301 redirect, or by simply being modified and left alone. This tool is very powerful; don’t let just anyone use it.
This report is a real gem. It allows you to see forms of duplicate content on the website from Google’s perspective. Click on the duplicate title tags link and you’ll see a list of pages that share the same title. This is a great way to find canonical URLs and other forms of duplicate titles and descriptions on the site.
The content keywords report tells you the keywords that are most frequently used on the website. Of course, this has SEO implications. Post-Panda, it has become important to have a theme associated with your website.
By having keyword groups that have a logical theme, you stand a better chance of ranking well in Google. If you are focusing on a difficult keyword, many SEOs will want to see that in the content keywords report.
The structured data report tells you how many structured data items Google found on the site and how many pages it found structured data on. It also tells you how many types of structured data are on the site. I believe structured data sets companies apart and will be a very big part of SEO as we move into the future. This tab allows you to get a clear view of structured data that is on your website.
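For context, structured data is usually added with schema.org markup directly in the HTML. A hedged sketch using microdata for an event (all names and values here are made up):

```html
<div itemscope itemtype="http://schema.org/Event">
  <span itemprop="name">SEO Workshop</span>
  <meta itemprop="startDate" content="2013-06-01T19:00">
  <span itemprop="location" itemscope itemtype="http://schema.org/Place">
    <span itemprop="name">Example Conference Center</span>
  </span>
</div>
```

Items marked up this way are what the structured data report counts, so the report is a quick check that Google is actually picking your markup up.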
The data highlighter tool is a nice option for those who don’t feel comfortable with code. It allows you to tag structured data on your website without making any actual changes to the site. Instead, Google just saves this information and applies it to the site for you. Right now, it only works for event structured data, but they may add to this tool’s abilities in the future.
In this other resources area, we have three main things. Let’s touch on them briefly below.
This tool allows you to check that Google can correctly interpret and display your structured markup.
This free local platform from Google has now become Google+, but it is still listed here as Google Places. Please update, Google!
This is the place to upload your product data. With the recent AdWords integration, it is important to connect this account with AdWords to do well in Google Shopping. If you want to learn more about this, there is a nice starter guide here.
This page shows you statistics for your author profile, should you have one correctly hooked up to the website. I actually use this a good amount when setting up rel=”author” for a new website just to make sure it is all correct and good to go.
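The markup itself is a simple link from the page to the author’s Google+ profile (the profile URL below is a placeholder for your own profile ID):

```html
<!-- In the page body or byline -->
<a rel="author" href="https://plus.google.com/your-profile-id">About the Author</a>
```

Once Google confirms the two-way connection between the site and the profile, stats start appearing on this page.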
Did you know you can add a Google search bar to your website? It is true, and this is one place where you can get information on this. I’ve set it up for a few sites, and it works well. It has the advantage of allowing for site search reporting.
This tool is really similar to Fetch as Google. It allows you to fetch the website and then preview it from three perspectives: On-Demand Desktop Search Instant Preview, Pre-Render Desktop Search Instant Preview and Mobile Search Instant Preview.
In this section, they will also tell you if there are errors fetching resources. We are using the Ignite Visibility domain here as an example. There are a few issues, as you can see, but we are about done with a full redesign, so we will let them slide. You can see 9 errors fetching resources at the bottom.
Priority: As Low as you Can Go
Site Performance is no longer supported by Google. Google provides this information on the subject.
“Site Performance was a Webmaster Labs feature which we’re no longer supporting. Try these other resources for understanding and improving site performance:
- Google Analytics Site Speed measures page load time as experienced by your visitors and allows you to measure other user-defined timings.
- PageSpeed Insights analyzes the content of your pages and provides suggestions to improve performance.”
I am a big fan of the PageSpeed Insights report.
Disavow Links Tool
The Disavow Links tool can be very helpful or harmful, depending on how you use it. Matt Cutts, head of Web Spam at Google, has gone on record saying it is a power tool. But why? Well, if you block a good link you may lose rankings, and if you block a bad link it may help your SEO.
So, if you are using this tool, you really need to know the difference between a good link and a bad link before you use it. Most SEO experts have criteria for identifying bad links.
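For reference, the disavow file itself is a plain text file you upload to the tool, with one URL or domain per line. A hypothetical example:

```text
# Low-quality directory with sitewide links (comment lines start with #)
domain:spammy-directory.example
# A single bad page rather than a whole domain
http://another-site.example/bad-link-page.html
```

The domain: prefix disavows every link from that site, so use it only when you are confident nothing good comes from there.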
We touched on many points in this guide, but there is so much more these tools can be used for, and many more ways to benefit from these tools and reports. Getting started is the first step! Search Engine Land has a complete archive of Google Webmaster Tools updates and WMT feature news to utilize as a resource as well.
Opinions expressed in the article are those of the guest author and not necessarily Search Engine Land.