2011: The Year Google & Bing Took Away From SEOs & Publishers

Increasingly over the years, search engines — Google in particular — have given more and more support to SEOs and publishers. But 2011 marked the first significant reversal that I can recall, with both linking and keyword data being withheld. Here’s what happened, why it matters and how publishers can push back if Google and Bing don’t change things.

Where We Came From

Some might believe that search engines hate SEOs, hate publishers and have done little over the years to help them. They are mistaken, either deliberately ignoring the gains or, more likely, simply unaware of how far things have come.

When I first started writing about SEO issues nearly 16 years ago, in 1996, we had little publisher support beyond add URL forms. Today, we have entire toolsets like Google Webmaster Central and Bing Webmaster Tools, along with standalone features and options, which provide:

  • Ability to submit & validate XML sitemaps
  • Ability to view crawling & indexing errors
  • Ability to create “rich” listings & manage sitelinks
  • Ability to migrate a domain
  • Ability to indicate a canonical URL or preferred domain
  • Ability to set crawl rates
  • Ability to manage URL parameters
  • Ability to view detailed linkage information to your site
  • Ability to view keywords used to reach your site
  • Notifications of malware or spam issues with your site

There’s even more beyond what I’ve listed above. The support publishers enjoy today was simply unimaginable to many veteran SEOs who were working in the space a decade ago.

The advancement has been welcomed. It has helped publishers better manage their placement in those important venues of the web, the search engines. It has helped search engines with errors and problems that would hurt their usability and relevancy.

That’s why 2011 was so alarming to me. After years of moving forward, the search engines took a big step back.

The Loss Of Link Data

One of the most important ways that search engines determine the relevancy of a web page is through link analysis. This means examining who links to a page and what the text of the link — the anchor text — says about the page.
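To make that concrete, here’s a minimal sketch, in Python, of the raw extraction that link analysis starts from: pulling each link’s target and its anchor text out of a page’s HTML. It’s an illustration only, built on the standard library, and not a claim about how Google or Bing actually implement this:

    from html.parser import HTMLParser

    class AnchorCollector(HTMLParser):
        """Collect (href, anchor text) pairs from a page's HTML."""

        def __init__(self):
            super().__init__()
            self.links = []    # finished (href, anchor text) pairs
            self._href = None  # href of the <a> tag we're inside, if any
            self._text = []    # anchor text fragments seen so far

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                self._href = dict(attrs).get("href")
                self._text = []

        def handle_data(self, data):
            if self._href is not None:
                self._text.append(data)

        def handle_endtag(self, tag):
            if tag == "a" and self._href is not None:
                self.links.append((self._href, "".join(self._text).strip()))
                self._href = None

    parser = AnchorCollector()
    parser.feed('<p>Read about <a href="http://example.com/">Rick Santorum</a> here.</p>')
    print(parser.links)  # [('http://example.com/', 'Rick Santorum')]

Tallies of pairs like these, aggregated across billions of pages, are what the engines mine when they decide what a page is about.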

However, for years Google has deliberately suppressed the ability for outsiders to see what links tell it about any particular page. Want to know why THAT result shows up for Santorum? Why Google was returning THAT result for “define English person” searches? Sorry.

Google won’t help you understand how links have caused these things. It refuses to show all the links to a particular page, or the words used within those links to describe a page, unless you are the page’s owner.

Why? Google’s rationale has been that providing this information would make it harder for it to fight spam. Potentially, bad actors might figure out some killer linking strategy by using Google’s own link reporting against it.

It’s a poor argument. Despite withholding link data, it’s painfully easy to demonstrate how sites can gain good rankings in Google for competitive terms such as “SEO” itself by simply dropping links into forums, onto client pages or into blog templates.

Given this, it’s hard to understand what Google thinks it’s really protecting by concealing the data. But until 2011, there was an easy alternative. Publishers and others could turn to Google-rival Yahoo to discover how people might be linking to a page.

Goodbye Yahoo Site Explorer

Yahoo launched its “Yahoo Site Explorer” back in September 2005, both as part of a publicity push to win people away from Google and to provide information to publishers. The tool allowed anyone to see what link data Yahoo had about any page in its listings.

Today, Yahoo still supposedly wants to win people away from Google. But because Yahoo’s web search results are now powered by Bing, Yahoo has little reason to provide tools to support publishers. That’s effectively Bing’s problem now.

Yahoo closed Yahoo Site Explorer at the end of last November, saying as it still does on the site now:

Yahoo! Search has merged Site Explorer into Bing Webmaster Tools. Webmasters should now be using the Bing Webmaster Tools to ensure that their websites continue to get high quality organic search traffic from Bing and Yahoo!.

That’s not true. Yahoo Site Explorer was not merged into Bing Webmaster Tools. It was simply closed. Bing Webmaster Tools doesn’t provide the ability to check on the backlinks to any page in the way that Yahoo Site Explorer allowed.

The closure supposedly came after Yahoo “listened to your feedback” about what publishers wanted, as it posted earlier this year. I don’t know what feedback Yahoo was hearing, but what I’ve heard has been people desperately pleading with Yahoo or Bing to maintain the same exact features that Yahoo Site Explorer provided — and pleading for well over a year.

Yahoo-Bing Deal Has Reduced Competition & Features

When the US Department Of Justice granted its approval for Yahoo to partner with Microsoft, that was supposed to ensure that the search space stayed competitive. From what the Department Of Justice said in 2010:

After a thorough review of the evidence, the division has determined that the proposed transaction is not likely to substantially lessen competition in the United States, and therefore is not likely to harm the users of Internet search, paid search advertisers, Internet publishers, or distributors of search and paid search advertising technology.

I’d say dropping Yahoo Site Explorer did harm to both users of internet search and internet publishers. Yahoo Site Explorer was a distinctive tool that only Yahoo offered, one that allowed both parties named by the DOJ to better understand the inner workings of the search engines they depend on. Dropping it also removed competitive pressure on Google to offer a tool of its own.

Indeed, things have gotten worse since Yahoo Site Explorer closed. At the end of last December, Bing officially confirmed in its help forum that it no longer supports the link command.

Next To Go, The Link Command?

The link command allows you to enter any page’s web address prefaced by “link:” in order to find links that point at that page. It’s a long-standing command that has worked for many major search engines as far back as late 1995, when AltaVista launched.
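In practice, it’s just the operator plus an address typed into the search box. For example, to ask a search engine for the links pointing at the Rick Santorum campaign site used in the example below, you’d type:

    link:www.ricksantorum.com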

Google still supports this command to show some (but not all) of the links it knows about that point at pages. I’d link to Google’s documentation of this, but the company quietly dropped that documentation some time around May 2008.

Here’s how the command still works at Google. Below, I used it to see what links Google says point to the home page of the official Rick Santorum campaign web site:

The first arrow shows you how the command is being used. The second arrow shows you how Google is reporting there are 111 links pointing to the page. Imagine that. Rick Santorum, currently a major Republican candidate for US president, and Google says only 111 other pages link to his web site’s home page.

The reality is that many more pages probably link to it. Google’s counting them but not showing the total number to people who care about what exactly is being considered. I’ll demonstrate this more in a moment, but look at the worse situation on Bing:

One link. That’s all Bing reports that it knows about to those in the general public who may care to discover how many links are pointing to the Rick Santorum web site.

It’s Not Just An SEO Thing

People do care, believe me. I actually started writing this article last Monday and got interrupted when I had to cover how Google might have been involved with a link buying scheme to help its Chrome browser rank better in Google’s own search results.

I doubted that was really the main intent of the marketing campaign that Google authorized (Google did err on the side of caution and punished itself), but the lack of decent link reporting tools from Google itself left me unable to fully assess this as an independent third party.

As soon as that story was over, renewed attention was focused on why Rick Santorum’s campaign web site wasn’t outranking a long-standing anti-Santorum web site that defines “santorum” as a by-product of anal sex.

Major media outlets were all over that story. My analysis was cited by The Economist, CNN, The Telegraph, The New York Times, MSNBC and Marketplace, to name only some.

But again, I — or anyone who really cared — was unable to see the full links that Google knew about pointing at both sites, much less the crucial anchor text that people were using to describe those sites. Only Google really knew what Google knew.

Third Party Options Good But Not The Solution

If you haven’t heard more complaints over the closure of Yahoo Site Explorer, and the pullback on link data in general, that’s because there are third-party alternatives such as Majestic Site Explorer or the tool I often use, SEOmoz’s Open Site Explorer.

These tools highlight just how little the search engines themselves show you. Consider this backlink report from Open Site Explorer for the Rick Santorum campaign’s home page:

The first arrow shows how 3,581 links are seen pointing at the page. Remember Google, reporting only 111? Or Bing, reporting only 1?

The next two arrows show the “external” links pointing at both the Santorum home page and the anti-Santorum home page. These are links from outsiders, pointing at each page. You can see that the anti-Santorum page has four times as many links pointing at it as the Santorum campaign page, a clue as to why it does so much better for a search on “santorum.”

But it’s not just number of links. Using other reports, I can see that thousands of links leading to both sites have the text “santorum” in the links themselves, which is why they both are in the top results for that word.

Because the anti-site has so many more links that say “santorum” and “spreading santorum,” that probably helps it outrank the campaign site on the single word. But because the official site has a healthy number from sources including places like the BBC saying “rick santorum” in the links, that — along with its domain name of ricksantorum.com — might help it rank better for “rick santorum.”
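The tallying itself is simple once you have a backlink export; seeing the data is the hard part. Here’s a minimal Python sketch of the kind of anchor text count described above. The file name and the anchor_text column are hypothetical stand-ins for whatever your third-party tool actually exports:

    import csv
    from collections import Counter

    def anchor_text_counts(csv_path):
        # Count how often each anchor text appears in a backlink export.
        counts = Counter()
        with open(csv_path, newline="", encoding="utf-8") as f:
            for row in csv.DictReader(f):
                counts[row["anchor_text"].strip().lower()] += 1
        return counts

    # Hypothetical CSV export from a third-party backlink tool:
    for text, count in anchor_text_counts("santorum-backlinks.csv").most_common(5):
        print(count, text)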

It’s nice that I can use a third-party tool to perform this type of analysis, but I shouldn’t have to. It’s simply crazy — and wrong — that both Google and Bing send searchers and publishers away from their own search engines to understand this.

For one, the third-party tools don’t actually know exactly what the search engines themselves are counting as links. They’re making their own estimates based on their own crawls of the web, and those crawls don’t exactly match what Google and Bing know (though they can be pretty good).

Not Listing Links Is Like Not Listing Ingredients

For another, the search engines should simply be telling people directly what they count. Links are a core part of the “ingredients” used to create the search engine’s results. If someone wants to know if those search results are healthy eating, then the ingredients should be shared.

Yes, Google and Bing will both report link data about a publisher’s own registered site. But it’s time for both of them to let anyone look up link data about any site.

The Blekko search engine does this, allowing anyone logged in to see the backlinks to a listed page. Heck, Blekko will even give you a badge you can place on your page to show off your links, just as Yahoo used to. But for Google, it’s “normal” for its link command to not show all the links to a page. Seriously, that’s what Google’s help page says.

Google, in particular, has made much of wanting people to report spam found in its search results. If it really wants that type of help, then it needs to ensure SEOs have better tools to diagnose the spam. That means providing link data for any URL, along with anchor text reporting.

Google has also made much of the need for companies to be open, in particular pushing the idea that social connections should be visible. Google has wanted that because, until Google+ launched, Google had a tough time seeing the type of social connections that Facebook knew about.

Links are effectively the social connections that Google measures between pages. If social connections should be shared with the world, then Google should be sharing link connections too, rather than coming off as hypocritical.

Finally, it doesn’t matter if only a tiny number of Google or Bing users want to do this type of link analysis. That’s often the pushback when this issue comes up: that so few make these types of requests.

Relatively few people might read the ingredients labels on the food they eat. But for the few that do, or for anyone who suddenly decides they want to know more, that label should be provided. So, too, should Google and Bing provide link data about any site.

Goodbye Keyword Referrer Data

While I’m concerned about the pullback on link data, I’m more concerned that last October, Google stopped reporting to publishers the keywords people used to find their web sites whenever those searchers were logged into Google.

Link data has long been suppressed by Google. Holding back on keyword data is a new encroachment.

Google has said this was done to protect user privacy. I have no doubt many in the company honestly believe this. But if it was really meant to protect privacy, then Google shouldn’t have deliberately left open a giant hole that continues to provide this data to its paid advertisers.

Worse, if Google were really serious about protecting the privacy of search terms, then it would disable the passing of referrers in its Chrome browser. That hasn’t happened.

Unlike the long examination of link data above, I’ll be far briefer about the situation with Google withholding keyword data. That’s because I already wrote over 3,000 words looking at the situation in depth last October, and that analysis still holds up. So please see my previous article, Google Puts A Price On Privacy, to understand more.

Google’s Weak Defense

Since my October story, the best defense that Google’s been able to concoct for withholding keyword data from non-advertisers is a convoluted, far-fetched argument that makes its case worse, not better.

Google says that potentially, advertisers might buy ads for so many different keywords that even if referrer data was also blocked for them, the advertisers could still learn what terms were searched for by looking through their AdWords campaign records.

For example, let’s say someone did a search on Google for “Travenor Johannisoon income tax evasion settlement.” I’ve made this up. As I write this, there are no web pages matching a Google search for “Travenor Johannisoon” at all. But…

  • If this were a real person, and
  • someone did that search, and
  • if a page appeared in Google’s results, and
  • someone clicked on that page…

then the search terms would be passed along to the web site hosting the page.
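Mechanically, those terms travel in the HTTP referrer. Before Google’s encryption change, a click from its results sent the site a referrer like http://www.google.com/search?q=the+search+terms, and any analytics package could read the query back out of it. Here’s a minimal Python sketch of that extraction, using the made-up search from above:

    from urllib.parse import urlparse, parse_qs

    def search_terms_from_referrer(referrer):
        # Pull the query out of a Google search referrer, if present.
        parsed = urlparse(referrer)
        if "google." not in parsed.netloc:
            return None
        terms = parse_qs(parsed.query).get("q")
        return terms[0] if terms else None

    ref = "http://www.google.com/search?q=Travenor+Johannisoon+income+tax+evasion+settlement"
    print(search_terms_from_referrer(ref))
    # Travenor Johannisoon income tax evasion settlement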

Potentially, this could reveal to a publisher looking at their web analytics that there might be a settlement for income tax evasion involving a “Travenor Johannisoon.” If the publisher started poking around, perhaps they might uncover this type of information.

Of course, it could be that there is no such settlement at all. Maybe it’s just a rumor. Anyone can search for anything; a search doesn’t make something a fact.

More likely, the search terms are so buried in all the web analytics data that the site normally receives that this particular search isn’t noticed at all, much less investigated.

Extra Safe Isn’t Extra Safe

Still, to be extra safe, Google has stopped passing along keyword data when people are signed in. Stopped, except to its advertisers. Like I said, Google argues that advertisers might still see this information even if they, too, were blocked.

For instance, say someone runs an ad matching any searches with “income tax evasion” in them. If someone clicked on the ad after doing a search for “Travenor Johannisoon income tax evasion settlement,” those terms would be passed along through the AdWords system to the advertiser, even though the referrer might pass nothing to the advertiser’s web analytics system.

So, why bother blocking?

Yes, this could happen. But if the point is to make things more private, then blocking referrers for both advertisers and non-advertisers would still make things harder. Indeed, Google still has other “holes” where “Travenor Johannisoon” might find his privacy exposed just as happens potentially with AdWords.

For example, if someone did enough searches on the topic of Travenor and tax evasion, that might cause it to appear as one of Google Instant’s suggested searches.

So why bother blocking?

Also, while Google blocks search terms from logged-in users in referrer data, those same searches are not blocked from the keyword data it reports to publishers through Google Webmaster Central. That means the Travenor search terms could show up there.

So why bother blocking?

Nothing has changed my view that, despite Google’s good intentions, its policy of blocking referrers only for non-advertisers is incredibly hypocritical. Google purports this is done to protect privacy, but it puts its own needs and its advertisers’ desires above privacy.

Blocking referrers is a completely separate issue from encrypting the search results themselves. That’s good and should be continued. But Google is deliberately breaking how such encryption works to pass along referrer data to its advertisers. Instead, Google should block them for everyone or block them for no one. Don’t play favorites with your advertisers.

What Google & Bing Should Do

Made it this far? Then here’s the recap and action items for moving forward.

Bing should restore its link command, if not create a new Bing Site Explorer. Google should make sure that its link command reports links fully and consider its own version of a Google Site Explorer. With both, the ability for anchor text reports about any site is a must.

If there are concerns about scraping or server load, make these tools available only to those who are logged in. But Yahoo managed to provide such a tool. Blekko is providing such statistics. Tiny third-party companies are doing it. The major search engines can handle it.

As for the referrer data, Google needs to immediately expand the amount of data that Google Webmaster Central reports. Currently, up to 10,000 terms (Google says up to 1,000, but we believe that’s wrong) for the past 30 days are shown.

In November, the head of Google’s spam team, Matt Cutts — who’s also been involved with the encryption process — said at the Pubcon conference that Google is considering expanding the time period to 60 days or the queries to 2,000 (as said, we think — heck, we can see — they already provide more than this). Slightly more people wanted more time than more keywords shown.

I think Google should do more than 60 days. If it’s going to block referrers, it should provide continuous reporting and hold that data historically on behalf of sites. Google is already destroying historical benchmarks that publishers have maintained. It has already allowed data to be lost for publishers who didn’t start going in each day to download the latest information.

So far, all Google’s done is provide a Python script to make downloading easier. That’s not enough. Google should provide historical data, covering a big chunk of the terms that a site receives. It’s the right thing to do, and it should have been done already.
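Until that happens, the burden is on publishers to snapshot the report before the rolling 30-day window discards it. Here’s a minimal sketch of a daily archiving job; fetch_query_csv is a hypothetical stand-in for however you actually download the report (Google’s script, for instance):

    import datetime
    import os
    import shutil

    ARCHIVE_DIR = "search-query-archive"

    def archive_todays_queries(fetch_query_csv):
        # Copy today's top-query CSV into a dated archive, so history
        # survives past the 30-day window that Webmaster Central reports.
        os.makedirs(ARCHIVE_DIR, exist_ok=True)
        source = fetch_query_csv()  # returns the path to the downloaded CSV
        dated_name = "queries-%s.csv" % datetime.date.today().isoformat()
        destination = os.path.join(ARCHIVE_DIR, dated_name)
        shutil.copy(source, destination)
        return destination

Run something like this from a daily cron job and the data stops disappearing. But publishers shouldn’t have to build it at all.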

What Publishers Can Do

An anti-SOPA-style effort like the one that targeted GoDaddy isn’t going to work with the search engines. That’s because the two biggest things that publishers could “transfer” out of Google and Bing are their ads and their web sites. But there’s no place to transfer these that wouldn’t hurt the publishers with incredible amounts of lost traffic.

This doesn’t mean that publishers are powerless, however.

Bing is desperate to be seen as the “good” search engine against “evil” Google. Publishers should, whenever relevant, remind Bing that it’s pretty evil not to have maintained its own version of Yahoo Site Explorer, much less to have dropped the link command.

Mention it in blog posts. Mention it in tweets. Bring it up at conferences. Don’t let it die. Ask Bing why it can’t do what little Blekko can.

As for Google, pressure over link data is probably best expressed in terms of relevancy. Why is Google deliberately preventing this type of information from being studied? Is it more afraid that doing so will reveal weaknesses in its relevancy, rather than potential spam issues? Change the debate to relevancy, and that gets Google’s attention — plus the attention of non-publishers.

There’s also the issue of openness. Google shouldn’t be allowed to preach being “open” selectively, staying closed when it suits Google, without some really good arguments for remaining closed. On withholding link data, those “closed” arguments no longer stand up.

As for the referrer data, Google should be challenged in three ways.

First, the FTC will be talking to publishers as part of its antitrust investigation into Google’s business practices. Publishers, if asked, should note that by withholding referrer data from everyone except Google’s advertisers, Google is potentially harming competing retargeting services that publishers might prefer to use. Antitrust allegations seem to really get Google’s attention, so make that wheel squeak.

Second, question why Google is deliberately leaving a privacy hole open for the searchers it’s supposedly trying to protect. If Google’s really worried about what search terms reveal, the company needs a systematic way to scrub potentially revealing queries from everything: suggested searches, reporting in Google Webmaster Central, AdWords reporting as well as referrer data.

Finally, withhold your own data. Are you opted in to the data sharing on Google Analytics that launched back in 2008? If so, consider opting out:

To opt out, log in and select an account, then select “Edit Analytics Account” next to the name of the account in the Overview window. You’ll see options as shown above and as explained on this help page.

Opting out means you can’t use the benchmarking feature (fair enough, and no loss if you don’t use it) or Conversion Optimizer. If you still want Conversion Optimizer, don’t opt out. Alternatively, tell Google that you should have a choice to share data solely for use with that product but not other Google products.

There might be other drawbacks to not sharing that I’m missing. But we haven’t been sharing here at Search Engine Land since the beginning of the year. So far, we’re not having any problems.

Google loves data. Withholding your own is another way for publishers to register their displeasure about having data withheld from them. And it’s the type of thing that Google just might notice.


About The Author: Danny Sullivan is a Founding Editor of Search Engine Land. He’s a widely cited authority on search engines and search marketing issues who has covered the space since 1996. Danny also serves as Chief Content Officer for Third Door Media, which publishes Search Engine Land and produces the SMX: Search Marketing Expo conference series. He has a personal blog called Daggle (and keeps his disclosures page there). He can be found on Facebook, Google+ and microblogs on Twitter as @dannysullivan.





  • http://andrewnealjenkins.com Andrew Jenkins

    Amen!

  • http://www.hireseomanager.com Steven

    The “Not Provided” blocking actually makes us LESS secure. Google hacking has been popular for years for footprinting websites and gathering sensitive information about potential targets.

    A Google search with something like intitle: index.of passwd.bak provides a wealth of information to hackers. Up until this change, it was easy to examine analytics to see if anyone was pounding on your site or had found documents on the server that should not be there. To avoid detection, pen testers would not click on the results for that reason. Now all they have to do is log in with a bogus Google account and examine search after search that you will never see.

  • http://www.sitesell.com KenEvoy

    What a phenomenal article, Danny. It takes someone of your stature to make Google and Bing take notice of insights such as these. But… will they do anything about it?

    Well, humans make up a company. I’m hopeful that Googlers read this and realize that they are party to this. Realistically, though, I think Google’s corporate DNA has probably mutated beyond repair.

    Their more likely response is a phone call to you asking, “What the ____ did you write THAT for?”

    Opportunity knocks. Will Bing (or others) open it?

    I believe that Google is nearing a tipping point. The “good guy” image is eroding, replaced by a growing impression of “Google be evil.”

    Many Webmasters have long felt that Google’s openness with Webmasters was merely good PR and clever manipulation. I had considered them cynics in earlier years. Wrong…

    They were “canaries in the mine.”

    My own “good guy” illusion of Google ended when they unilaterally added a “Back to Google” link on every AdSense ad, with no way for publishers to opt-out and with no payment for that click back to google.com.

    It was theft, pure and simple. It stole free exposure on billions of ad impressions and it did not pay what other advertisers paid on those same ads for clicks.

    At that time, the uproar was instant, viral and LOUD. Google removed that indefensible feature. If memory serves, they did so without apology, saying merely that they would revisit the policy at some time in the future.

    What was most striking, though, is how quickly they pulled it… within hours of the growing uproar. Anyone who knows anything about feature releases knows that a company the size of Google could only act that quickly if they were ready with a contingency plan.

    Otherwise, the time to analyze an issue like this, to meet and decide, and to do another release would normally take days if they had not already anticipated this reaction as a strong possibility.

    You can safely bet that a “kill this if it hits the fan” button was in place.

    That can only mean that they KNEW it was wrong. But it was a “CLICK-GRAB” that they could not resist, despite knowing that this was evil theft that hurt publishers.

    Google’s contingency plan worked wonderfully. The uproar disappeared. It’s a brilliant “shut down the controversy” policy, one they use repeatedly to quell noise when it gets too loud (it’s one that politicians should study ;-) ).

    —–SIDEBAR—–
    The recent “paid links” scandal by Google was also rapidly shut down by Google’s “blame the agency” response (always a good scapegoat) AND, more importantly, by their decision to penalize itself (Chrome) for search results for a couple of months.

    This “self-penalty” was actually well perceived by many. Clever marketers that they are, this was the only option. Turn lemons into lemonade.

    It was more than damage control. They came out looking like heroes of their principles. Be realistic and picture the conversation in their emergency meeting over this issue.

    A hero does not do the right thing because it was the best strategic move. Heroism is the selfless and optional act of doing right at great peril. There was nothing selfless or optional about Google’s self-penalty.
    —–SIDEBAR—–

    Google’s pulling of the “back to Google” AdSense link theft was a watershed moment for me. I was a firm believer in Google and its “don’t be evil” motto, of its idealistic view of its mission on the Web.

    But…

    Any company that would plunder publishers so blatantly has a mindset that is all wrong. It’s a strong symptom of disease. Many actions since then have only confirmed the diagnosis.

    Yes, they still SAY all the right things when it suits them. But they turn around and DO the wrong things for their own benefit.

    When called on it, they either stonewall or emit ridiculous explanations. And to prevent being called out, they have a new strategy… take away data. Taken together, it’s the mushroom strategy…

    “Keep them in the dark and feed them BS.”

    Google’s withholding link data is strong self-defense. The lofty, noble motives that they claim are false, as you dissected so sharply, Danny.

    I agree totally with your comment that withholding link data is about their fear that it will “reveal weaknesses in its relevancy, rather than potential spam issues.” Your analysis of search results for “Santorum” show how to exploit the same weakness in Google’s algorithm that “Googlebombers” do…

    Create enough links for a search term that is not very competitive (ex., “santorum”) that lead searchers to a misleading and/or damaging page that, in fact, has little or nothing to do with the actual SEARCH INTENT of that term. (Google defines link-bombing more narrowly, to its own advantage, but it’s the same weakness.)

    By doing so, the “linking” factors within Google’s algorithm dominate the small amount of data generated by ALL the other algorithmic factors (since the search term has little competition) combined, manipulating the damaging page into top spot of the SERP.

    Is that a Googlebomb? It depends if you accept Google’s definition of it. I don’t. “Santorum” is a perfect example of DEFEATING SEARCH INTENT (no one is searching for a fictitious word) through a well-coordinated, emotionally charged link scheme. The same technique is used to denigrate trade names commercially.

    Essentially, your own name, be it personal or product, is at risk. Why Google does not shut down this weakness that allows others to damage one’s most important asset (your name) is beyond me.

    Instead of fixing the weakness, they choose to hide it, deny it or (if the noise and embarrassment is loud enough) publicly announce it has been fixed… until the next time.

    What about withholding keyword data?

    Their “privacy” reason must be a lie, since other actions are not consistent with the claim. It would be more accurate to simply say…

    “We have decided to sell privacy to advertisers, but not give it for free to anyone else anymore. It’s our engine and we can do as we like.”

    We would not like it, but we’d at least admire the honesty. Instead, we don’t get the data AND we don’t like the deception.

    They’ve learned a lot since the AdSense fiasco. They are phasing out the data gradually. Only about 10% of searchers are “logged in” currently, so the loss does not seem great. But with their determined push on Google+, tying it across all their tools, that percentage will increase.

    In a few years, there will be virtually no keyword data from Google, not enough to be statistically useful.

    —–SIDEBAR—–
    Hopefully, this will give pause to those using Google Plus. If Google ever controlled both social AND search, given the mentality of this company, it would be long-term disastrous.
    —–SIDEBAR—–

    Danny, your demolition of their justification needs no further comment. It exposes Google’s hypocrisy for what it is.

    They want ONE-WAY openness only.

    They will continue to manipulate through mistruth, hiding in the darkness of no-data that they create.

    What else did Google take away from us this year, Danny?

    There are many more worrisome Google trends, perhaps the most important being the loading of more and more “Google product” at the top of the SERPS, pushing the organic results below the fold in some cases.

    Publishers are being marginalized.

    This is the biggest takeaway threat of all.

    Publishers cannot afford to be hidden by Google’s quest for ever-greater quarterly gains (a pressure that Brin and Page swore they would never succumb to, when they went public).

    Sure, we can (and should) choose better monetization options than AdSense (there are many — AdSense should be seen as “starter-earning” until you find far better ways to monetize traffic). BUT…

    We cannot afford to be hidden so Google can make yet more billions. THAT is no longer a partnership between publisher and search engine.

    It’s called “being used” without recourse. Which brings me to the final part of your article…

    What is our recourse? What should we do?

    When Google stole exposure and clicks from publishers, they DIRECTLY AND OBVIOUSLY took money out of publishers’ pockets for their own benefit.

    The issues was obvious, inflammatory and simple. The grassroots uproar was spontaneous and it was not going to go away.

    So Google backed down.

    But these issues? They are complicated. The debates are tortuous.

    Arguing with Google is pointless. And movements like withholding information from Google Analytics will not be adopted by many. (If it is, Google will take counter-measures under the guise of some principle or another.)

    In any event, Google gathers so many googol-bytes of data from so many sources that there’s no way to hold back enough to get their attention.

    Expect no groundswell of action here by publishers…

    Yes, money IS being taken from publishers’ pocket, so it should matter just as much as the AdSense caper. But it’s far too subtle and complicated for spontaneous outrage.

    I’d like to expand on your suggestions, Danny, and offer four possible courses of action that MIGHT make a difference…

    1) Anti-trust — Google is vulnerable here, as you point out. Strong arguments can be made about the many ways that Google takes advantage of its monopoly on search to disadvantage others.

    WHO, though, will push this? Who will inform the rather uninformed regulators so that they cannot be snowed by Google’s team of “weathermen?”

    2) Bing — If Bing had a strategic bone in its body, it would recognize its opportunity. THIS is the time for Bing to step up with a true partner mentality and deliver information, tools and support that would win over Webmasters and publishers.

    We live in a digital world. The right product from the right company with the right mentality can sweep Google aside almost as quickly as it swept other engines aside, as rapidly as Facebook did to MySpace. OK, maybe not that quick. But it is certainly possible.

    Google swept into its position with a combination of high-quality Webmaster relations and a superior product. We were the early adopters who made Google “cool.” We helped Google cross the chasm and spread to the “user on the street.”

    Bing should realize who needs to be courted and won over if it hopes to supplant Google.

    3) Media? Not likely. The issues are too complicated for their readers. They prefer “santorum” jokes (expect spreadingsantorum.com to fade in results now that the media is getting too loud — if history holds, expect a manual fix that may or may not be admitted).

    Rule out media unless someone in a high position of authority in the world of search engines leads a strong and public movement that gets the media’s attention through clear, simple encapsulations of the “Google is increasingly evil” trend.

    Publishers and webmasters need a rallying point, someone with instant and strong street-AND-media cred. (Um, Danny? ;-) )

    Flash-mob terminations of Google Plus accounts, for example, would generate media attention (and therefore, Google’s).

    But this needs to be organized. Who will take that lead?

    4) At times like this, “black swans” emerge. What is the next great technology that can revolutionize search? (It’s not Blekko — the quality is not there, at least not yet.)

    There’s nothing for us to do here except jump on “the next Google” with the same intensity that we jumped on the “don’t be evil” Google that we used to love.

    Capitalism is self-correcting. As we’ve seen on Wall Street, those corrections need a little push, be it by legislators or grass-root movements. When Google’s abuses become too great, those opportunities appear.

    The end-user is the ultimate arbiter. Will s/he get sick of seeing Google “becoming” the Web? Will Google lose credibility when a search reveals only Google products? Will nausea be induced by Google Plus being shoved down our throats at ever step of one’s interaction with Google?

    I sure hope so. But I don’t suggest we hope that “something happens.”

    Great opportunities exist for the individual online. Publishers, large and small, have tremendous potential if they deliver original, high-value content that delivers what the end-user wants.

    However, Google’s mindset, their tilting of the playing field, is worrisome. Your voice is loud, Danny. I hope they hear it because “what was withheld in 2011” is only a harbinger of what’s to come.

    Google is no longer a company with WIN-WIN in mind.

    Who will lead the measures required that helps Google see the light of fairness or that helps a competitor emerge that does realize that no one can own the Net?

    Warm regards and with thanks for an especially stellar article,
    Ken Evoy
    Founder, SiteSell.com

  • http://FrugalZeitgeist.com F.Z.

    Setting up my withholding of data from Google Analytics right this minute… Thanks for the in-depth article.

  • http://www.lauraalisanne.com lauraalisanne

    This post (heck it’s a full-blown tome) is so comprehensive and helpful, that I’m asking my entire marketing team to read it ASAP.

  • http://www.seoconsultant.ie Ivan

    Really nice post!!!

    Our Dear ‘ Do no Evil – Google’, no matter how we loved it got really bad. It is especially clear with new guys like Blekko that show more about inbound links that Google ever did.

    It will result in people steadily moving from the Google fanboy club into the group of people who will be aware of it and say aloud: ‘Google is very evil’. It did happen to Microsoft, and their share price NEVER went up after the majority of people agreed that ‘$MSFT is Evil’.

    Pity, because I did like Google all these years. But when you document what we have suffered, I am not really sure I can say I like them anymore. The moment when the majority of people think negatively is a point of no return, unless you have someone of a Steve Jobs character to reinvent the company. And the perception of Google isn’t really getting any better overall.

  • Chas

    Great article, Danny;
    As much as Google and Facebook would like to own the internet, they don’t. Microsoft Advertising is a disaster, and I certainly will never use them, again. There are alternatives to
    find backlinks, as you mentioned. There is Alexa, although they are slow on updates and there is Compete(I am not familiar with how good they are; I just know it’s an alternative).
    There are also alternatives to Adwords, such as jumpfly and 7search.
    I will be closing my gmail account this year and plan on patronizing Goliath as little as possible.
    I use Gigablast as my default search engine & only go to the behemoths when I am not happy with the results. I will also check out Blecko~ thanks for the tip.

  • rypher21

    This makes me wonder; I’m thinking of ways to cope with these changes.

  • http://linkwhatishr.com Mamun-ar-Rushid

    Interesting article; we were not aware of this before. Sincere thanks for a helpful article, and looking forward to your further assistance for beginners.

  • http://www.tielict.com/blog/ Tom Mghendi

    Withholding your own data by opting out of data sharing in Google Analytics may work, but probably for a short time only. The platform is Google’s; if they want to use the data, they still can, regardless of whether you have opted out or not. The loss would be yours – not being able to use Conversion Optimizer and the benchmarking feature.

    Since blekko and other smaller players provide some of this data, we may as well let Google go on with this potentially self-destructive path. With time, blekko’s (or any other tool’s) quality will improve enough to be a viable alternative. And this would be good for the internet (or web) ecosystem.

  • http://www.facebook.com/SEOManoj Manoj Pallai

    Sometimes, even just a few days ago, I felt this type of issue. So which one is right? And how can we measure the exact number of links?

  • https://plus.google.com/u/0/108529133658461221225/ Neil Grainger

    Lots of interesting points here. I’ve been thinking about the keyword referrer data problem and I think Google’s reasons might be a cover, but not for the nefarious reasons people think. I think they’ve hit a technological stumbling block. As the likes of Google+ becomes more popular, more and more people will be logged into a Google account. This is a secure connection by default and as result no referrer data will be sent. Google hasn’t really implemented anything, they’ve just hit a problem they haven’t got a fix for and have decided to call it a privacy feature as a bit of marketing before questions start being asked.

  • http://ianmacfarlane.com/ Ian Macfarlane

    Hi Danny

    A great article, a definite (and troublesome) trend.

    A small correction re Bing – they removed support for the link: (and linkdomain:) operators a long time ago. What that search query you’re showing with one result is doing is a simple keyword match, not a link search.

  • http://www.nathanielbailey.co.uk Nathaniel Bailey

    wow that was one uber long and informative article Danny :)

    Not sure if YSE going away is going to help Google though. I can’t see them creating any tools like that, as it would go against the changes they have made to show us (as public searchers) even less information about sites unless signed into WMT.

    So YSE going means one thing and one thing only in my books, and that’s that top sites such as OSE and Majestic should now be getting more traffic. Plus I think the info offered at OSE is, and always was, much better than that of YSE, so it’s no loss to me, and I’m sure a lot of others feel the same in that respect.

  • http://www.seo-theory.com/ Michael Martinez

    Of course, Yahoo! Site Explorer never could show anyone which links Google had indexed or was allowing to pass value.

    SEOmoz’ Linkscape doesn’t show anyone which links Google has indexed (or which links Bing has indexed) or allows to pass value.

    Majestic SEO doesn’t show anyone which links Google and Bing have indexed or allowed to pass value.

    Gazing at backlink reports doesn’t explain search rankings to people. Complaining about the loss of a feel-good tool doesn’t do anything to advance the search engine optimization industry out of the dark ages.

  • http://www.fangdigital.com Jeff Ferguson

    Thanks, Danny… at long last, a respectful, well thought-out, researched, and argued article about these issues that actually provides enough data and insight to actually change my mind on the issues. I was on the side of Google on many of these because, frankly, Google doesn’t have to provide any explanation at all for the changes it makes to its product, especially to the community that is designed to take advantage of its system rather than just use it.

    Consumers should always be the real change catalyst for any business and, while we are both, SEOs can often ride on the side that the engines are there for us and us alone. However, as you say early in your article, Google has done plenty for us over the years to make our jobs easier, which again would provide a solid argument for why we should just shut up when they make changes that might make our jobs a little harder for a while.

    However, your logical arguments on why their “official statements” don’t hold water are enough for me to join the fight. Rather than you and your publication playing the role as the victim or presenting this case as some sort of conspiracy theory, you present facts, and I respect you that much more for your efforts.

  • Kevin Hill

    Google is entrenching, and becoming more evil. At HFT, we saw just about 35% of our organic traffic keywords being blocked. And for what reason – none that I can reasonably understand.

    What I can understand is that google now has a very expensive version of GA that you can pay for. They are also seeing that their competition has diminished from 2 to 1. And time, and time again, Yahoo and Bing have demonstrated that they just can’t grab market share.

    Google loves data. And now the next monetization is getting ready to happen. I have no doubt that Google will relent, and give us amazing tools to look at data.

    At a price….

    And that, my friends, is how Google explodes and becomes 10 times more profitable than it already is.

  • Klais

    Exceptional post, Danny.

    As far as Google search privacy goes, keep in mind the “loophole” goes beyond just PPC: all mobile and tablet organic searches are currently un-encrypted, and pass referring keyword data as before.

    As explained in my November 2011 SEL column (below), I think Google’s made a strategic decision to purposely avoid inhibiting mobile web development, and to continue providing the mobile keyword data marketers need to build relevant mobile content.

    By year-end, I think that will change: Mobile/tablet organic search privacy will become strategically important, and referring keyword data stripped accordingly. But right now it means PPC and Mobile are both strategically important to Google — and should be to EVERY marketer as well.

    http://searchengineland.com/give-thanks-google-hasnt-secured-mobile-search-data-yet-101819

    Brian @ Pure Oxygen Mobile

  • http://www.freshlols.com F.L.

    Thank you Danny Sullivan for saying what I’m sure everyone has been thinking for the past 12 months. A MUST READ article for every new/old SEO.

    Goodbye SEO and hello Social Media HaXoRing.

  • http://www.brickmarketing.com Nick Stamoulis

    2011 was certainly a year of great changes within the SEO industry. Our practices had to be revisited and reworked in so many ways. The lack of keyword data was really a huge blow. Google said that it would affect a small percentage of searches, but that’s not what I’m seeing. Depending on the industry, it really could be a significant number. We rely on that data to make not only SEO decisions but business decisions in general.

  • http://www.seochemist.com Oli

    There have been several small blows, but fortunately most ethical SEO seems to have been untouched.

    I am however looking forward to seeing what 2012 will bring!

  • http://www.messagecrafters.net TeriPatrick

    Question: Are there any serious efforts underway to re-imagine the Internet?

    If not, it’s about time to start that discussion. One reason we face challenges now is that the decisions about how to organize access to information are made by technology experts – not people with expertise in business, publishing, community, education, etc. It’s time to think outside the algorithm.

    Here are few things that need to change:

    1: Get rid of incentives to clog the web with garbage content. The SEO game, as played by Google’s rules, has had many negative unintended consequences. One of the biggest is the discouragement of true content experts. Why volunteer your time to share your research and insights when you have to expend untold hours marketing your content to rise above the garbage that has been created for no other purpose than to improve search rankings?

    This is a serious issue with far-reaching implications. The internet should improve access to high-quality information. Instead it has begun to bury it under an avalanche of garbage content.

    2. Create financial incentives to contribute high-value resources – including well-organized access to business sites. I don’t know exactly what this should look like, but here is one idea: A local community forum that is maintained by a paid staff and funded by local advertisers. It is interactive, encouraging community engagement – reporting on local issues, community groups, local sports, bands, etc. People in the community who contribute high-quality content receive a financial reward for their efforts. That increases the quality and makes the site a draw for community members. The site should offer tools to help organize meetings, garage sales, fundraisers – to improve engagement. Local businesses would benefit because they would finally have a forum for reaching their local target audience. The ad spend would be more effective for them. The community benefits because they have a way to stay informed and connected locally.

    Here is my point: Google capitalized on a world of free information by organizing access to it. That access has great value – but the approach has begun to destroy the quality of the information. Think newspapers. Great that we can read them for free. Not so great that journalism is no longer a paid profession. Reporters are disappearing along with the newspapers – with huge long-term implications.

    Somehow the model has to change to provide a financial incentive for people without trust funds to make a career out of creating or organizing access to high-quality information. Google is all-in with the algorithm approach. This is one instance in which a computer really can’t replace human effort.

  • http://www.dealhorizon.com John C Sharp

    Great article. Anyone that has recently integrated with any of the APIs from either Yahoo or Google is either nodding or applauding while reading this: the truth is, the data being provided, even by these paid APIs, is almost always incomplete or inaccurate.

    It is possible that Google will indeed come out with a new service that enables paid access to data. But for those of us that ALREADY pay thousands of dollars a month to consume data from subscription APIs, the trust level is dropping fast. Will all the data be shown? Doubtful. It isn’t shown now, and the only way of determining that is to do the kind of analysis this author has done.

    There are alternatives – Majestic SEO, Heardable.com, and others – hopefully, some of these companies will benefit from the lack of data, along with their consumers.

  • http://rockfi.sh steveplunkett

    Thanks for the wrap-up Danny, head down in FOY campaigns, great summary, now I can get back to work.

    =)

  • http://siteexplorer.co S.E.

    Wow, I don’t like where this is heading. Great write-up and incredibly verbose.

    +1 for Blekko – I hope somehow they manage to make a much bigger name for themselves.
