A Webmaster View Of Google’s Latest Search Quality
Since Google posted about how their search quality process works, I figured I'd take the webmaster view and tell you what webmasters and SEOs have been seeing from Google's search quality lately.
As many of you know, my other search blog, the Search Engine Roundtable, covers webmaster- and SEO-related discussions from the forums. That allows me to document the changes Google makes to search quality, from small to large. I wanted to share with you several of the changes and updates that you, the SEOs and webmasters, noticed in Google's search results over the past thirty days.
The timeliest story is that Google is updating right now. Dozens of webmasters have posted at several search forums that they have noticed major changes to their rankings in Google. Whether they lost or gained rankings, there appears to be a major algorithm update underway at Google. Of course, Google is constantly shuffling the search results around, but typically when I see this level of discussion at the forums, I know it is a sign of a major algorithm update.
Since February, Google has been shifting results around based on the case sensitivity of your search. Today, I posted an example of a search query whose Google results differ when the case is changed from all lowercase to title case. Google says that case makes no difference in the search results. So either Google is testing a new feature and will do away with that search guideline, or something else is causing the search results to differ.
I don't even know where to begin with the various "Minus X" theories being put out there. I believe in them, as I have said before, but there are so many, including -6, -30, -60, -950 and so on. Branko of SEO Scientist offered some theories behind the penalties based on data he compiled from his client base.
Earlier this month, many webmasters, including myself, noticed Googlebot's crawl rate drop. The reports were widespread at the time, but the discussion died down as soon as Googlebot became more active again. Initially, I thought maybe Googlebot had become more efficient and needed to crawl less frequently. But now I think it was just a reporting error in Google Webmaster Tools.
The wildest thing going on right now is the Google Floating Four behavior. In short, a search result will sit in the fourth position, drop to a lower position, come back to the fourth spot, and then drop to a lower position again. Tedster came up with some theories and posted them in a WebmasterWorld thread. Personally, I am not sure what to believe here. It does seem strangely tied to Google Universal Search, where Google places a universal search result in that fourth position. But honestly, the results I have seen are not universal search related, so I really don't know what to think.
Finally, to end on a fairly sad note: it appears that most SEOs believe competitors can hurt their Google rankings through links. I ran a poll that drew over 135 responses, in which 70% believed a competitor can hurt one's Google rankings with the use of links. Only 11% said they don't believe that, while the remaining 19% had no idea what to believe. In other words, 89 percent of SEOs either believe, or cannot rule out, that a competitor can hurt your Google rankings through the use of inbound links.