• http://www.livinginthailand.net/ Neale

    Looks like I might need to start doing a little pay-per-click just to see what terms people are using to find my sites.

  • http://twitter.com/BIGELLOW Bob Bigellow

    I’m having a difficult time, reading this article, determining which side of the issue you’re on. In many cases, you talk about Google “blocking” referrer data. In other cases, you suggest they’re “leaking” referrer data.

    Both terms are misleading in their own ways.

    When it comes to “blocking” referrer data, this is done by the browser, not by Google. By design, browsers do not pass referrer information when it’s from a secure connection. So, when Google switched to secure search (whether it was the specialized domain or the main site), it was the browsers that were now blocking the referrer data.

    The problem is, people who pay for ads are paying for two things. First, they’re paying for the ad itself. Second, they’re paying for the ability to track the success of that ad. Since the ads are based on a bidding system, it is critically important that advertisers have access to the kind of analytics that would be needed to determine how valuable a search term is, so that they aren’t over-bidding.

    So, to be a good business partner to advertisers, Google needed to capture this data on behalf of the advertisers paying for the inclusion. So, they effectively work around the browser blocking issue. They are able to do this because: 1) they know the owner of the ad, and 2) the ad is hosted on their own site, so they can employ any number of techniques to capture data on behalf of the advertiser.

    It’s probably safe to say that people click on organic results more than they click on ads. Whichever tracking method is used (an interstitial page or capturing data based on a click event), it adds a certain amount of computing overhead. For Google to employ this same technique on organic results would require a huge increase in processing by its servers without any compensation. I’m sure they are taking this into account while looking for alternative solutions.

    The other issue is the fact that there’s little point in capturing this data for domains that have not been claimed by a webmaster. So, this leaves them with an engineering incentive to only add tracking to organic results for those who have claimed the domain name for purposes of the Webmaster Tools. It’s very possible they have considered that this additional analytics (since it would require more resources) might be part of a premium service. For instance, you might pay a small fee to get an “Enterprise” version of Google Webmaster Tools or access via Google Analytics. Considering there is some overlap between Google Analytics and Google Webmaster Tools, perhaps they are looking to streamline these services a bit. Imagine being able to log into Google Analytics even if you haven’t added any code to your site, but have just claimed the domain name. Imagine you have access to the types of analytics found in Google Webmaster Tools. Then you add some code to your site, and now you have more data. By using AdWords, even more data. By paying a premium, even more data still.

    It’s a tricky situation. If they switched to SSL all the time, let the browsers block the referrer data, and made no attempt to capture it themselves on behalf of advertisers, they’d lose their advertisers to search engines that don’t have SSL search and can provide these analytics to their paying customers. Advertisers would call it “blocking referrer data”. If they circumvented it for paid advertisements AND organic results, they’d have privacy advocates claiming they are “leaking referrer data” and making SSL search less useful. So, they found a middle ground: let it be blocked for organic results, let it be captured for their paying customers who rely on the data.

    Now, it turns out, this problem is being solved in the open with a new standard for browsers to support. It looks like they may use this standard to allow the referrer data to flow, even in SSL situations. So, they’re just doing this at the pace of standards implementation, which can take months or years.

    On the flip-side, you might ask… why secure search only for logged in users? It’s likely for two reasons. First, someone who is logged in now gets personalized results. Information from Google+ is embedded in results. Maybe one day, a search on Google.com will also show emails matching the terms from Gmail, documents matching the term in Google Docs, files matching the term in Google Drive, etc… With this combination of both public and private data in results, it makes sense to secure search for logged in users. Users who are not logged in are only seeing public data, so it’s less of a concern about securing that search. If they are particularly paranoid that people might know what they are searching for, they’d just log in… or use the special domain to encrypt their search.

    The second reason might simply be resources. Encrypting results takes processing power. Google handles a TON of traffic. So, it only makes sense for them to roll this out in stages. First, with logged in users. Then, convince a bunch of people to always be logged in via Google+. Once that flattens, they can introduce it into another browser. It may very well be that Google’s data shows that more people are searching on Google while using Chrome than while using Firefox. So, it would make sense to slowly roll out this feature further by allowing it to be implemented in Firefox as the next small step. Then, after they see how the boat holds up to that amount of increased SSL traffic, they can implement it into Chrome. Maybe the final step would be to just have it active for everyone, whether logged in or not. That time might be a long way off if they need to beef up resources to handle the load and traffic, or if they need to first solve the referrer problem before they upset the webmasters and advertisers of the world with simply not enough data to extrapolate from.

    So, just to be clear, which side of the fence are you on? Do you see Google as “blocking referrer data” while siding with advertisers and webmasters, or do you see Google as “leaking referrer data” while siding with privacy advocates? Or, are you just sitting on the fence and playing both sides against each other?

  • http://searchengineland.com/ Danny Sullivan

    Bob, Google SSL Search’s blocking of referrer data is not done by the browser. It’s done by Google. Deliberately.

    With your browser, going from SSL to non-SSL should pass no referrers; going from SSL to SSL will pass them. 

    With Google SSL search, Google deliberately prevents referrers from passing in either case with organic listings but allows them to pass for ad clicks. 
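    In code terms, the two behaviors look something like this — a rough sketch of the policies described above, not anything Google actually runs:

```python
# Default browser behavior for the Referer header: an HTTPS page sends a
# referrer to HTTPS destinations, but not to plain-HTTP ones.
def browser_sends_referrer(source_scheme: str, dest_scheme: str) -> bool:
    return not (source_scheme == "https" and dest_scheme == "http")

# Google SSL search's behavior as described above: organic clicks never pass
# a referrer, while ad clicks do (routed through Google's own redirect).
def google_ssl_sends_referrer(click_type: str) -> bool:
    return click_type == "ad"

# HTTPS search result -> HTTPS site: a plain browser would pass the referrer,
# but Google strips it for organic listings anyway.
assert browser_sends_referrer("https", "https") is True
assert google_ssl_sends_referrer("organic") is False
assert google_ssl_sends_referrer("ad") is True
```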

    This was explained in the article above, but it’s even further explained in these background articles:

    http://searchengineland.com/google-to-begin-encrypting-searches-outbound-clicks-by-default-97435
    http://searchengineland.com/google-puts-a-price-on-privacy-98029

    Before October 2011, Google didn’t suggest that referrer data was somehow private information that needed to be blocked. It only claimed that after starting SSL search by default. If that’s the case, then it doesn’t matter what the advertisers may or may not want. If the data is private, you block it.

    The argument about what Google has to do for advertisers to do analytics is flawed. They can track the success of an ad through the Google AdWords system without having it tied to an IP address in the way that providing a referrer allows. That is, advertisers could be limited to exactly the same “safe” release of data that publishers now get with Google Webmaster Central.

    Of course, doing this prevents Google from allowing advertisers to make use of retargeting, which is perhaps one reason Google has allowed this hole to remain.

    The argument about capturing data for domains that have not been “claimed” by a webmaster is also flawed. The data is there. Google has it all. It’s not like it suddenly starts recording it only after a domain is verified in Google Webmaster Central. It’s all part of Google’s regular clickthrough logging.

    The issue seems mainly to be that Google doesn’t want to spare the machine power to make this data accessible to site owners for longer than 30 days. It could do this; it doesn’t consider it a priority.

    It’s possible this is all part of a secret uber plan to get site owners to pay. I don’t think that’s the case. I think that some within Google wanted to drop all referrers, because that would have been most secure. I think that the advertising side of Google screamed bloody murder that this couldn’t happen. And the advertising side won.

    That’s my suspicion trying to read the tea leaves. I don’t have any inside sources saying this. I do know that it would be incredibly difficult for Google to launch a “premium” analytics tool that releases the same data that we’ve been told for months is now too sensitive to share through referrer strings.

    I know exactly the argument why this might apply to Google Search Plus Your World. Again, it was in one of my background articles above. It doesn’t add up. You can read that for yourself, why it doesn’t, here:

    http://searchengineland.com/googles-search-plus-your-world-to-launch-beyond-us-113840

    The resources argument also doesn’t add up, in terms of encryption power. Google has given the go ahead, as I explained, for Firefox to make encrypted search the default for millions of searches each day for logged out users. If they have a resource problem over this, that would have been flagged exactly as it was in 2010, as I also explained, when this was first proposed.

    As for the side I’m on, it’s privacy. Google has declared that search terms themselves are potentially private and therefore can’t be shared with publishers. But it has left all these loopholes. It’s OK to share with advertisers. It’s OK to share through AdWords. It’s OK to share through Google Webmaster Central. Leaving loopholes isn’t good, when it comes to privacy.

    So to go back to what I wrote before on this subject in early January:

    http://searchengineland.com/googles-results-get-more-personal-with-search-plus-your-world-107285

    “Blocking referrers is a completely separate issue from encrypting the search results themselves. That’s good and should be continued. But Google is deliberately breaking how such encryption works to pass along referrer data to its advertisers. Instead, Google should block them for everyone or block them for no one. Don’t play favorites with your advertisers.”

    And after Search Plus Your World launched:

    “Today’s change does nothing to change my view that Google needs to revisit the referrer blocking and either make it a block for everyone, including advertisers, or find a better way to filter search terms that get made visible in various ways.”

    The ideal solution in my book would be this. Referrers are blocked for everyone. Publishers won’t like that, but I’ve written before that referrers are likely to eventually get blocked in browsers anyway. It’s like fighting against the tide over that. But while referrers are blocked, Google expands the data it reports to publishers through Google Webmaster Central, the “safe” method I’ve mentioned before — and examines ways to ensure that the possibly “private” searches are somehow filtered out.

  • daveintheuk

    I still fail to see the point of this (from a user perspective, I can see why it is attractive to Google to hoard the keyword data – which they *can* tie to a person).

    To use your example, assuming a user searches for “erectile dysfunction”, you would hope that the user will be directed to a page about erectile dysfunction. The vast majority of sites don’t use SSL, so let’s assume the site the user ends up on doesn’t – anybody could see where the user ends up; they just lose the keyword information… they still have the user’s IP and the fact they are browsing a page about erectile dysfunction.

    The webmaster of the page, however, loses some valuable data through this – perhaps people are searching for “erectile dysfunction in men under 25” or “erectile dysfunction amongst users of drug X” – that is valuable data for the webmaster (who again, I am pretty sure, doesn’t care *who* is searching for those terms – just what terms people are searching for).

    The reality is the vast majority of webmasters do not want to tie keywords to users; they just want to use the information in aggregate. Google, however, does have the means, motive and opportunity to use this data.

    This is just smoke and mirrors for Google to manipulate the search results while it continues to gather and use swathes of data about the users it pretends to care about.

  • Francisco Debs

    Google is trying to hide the keyword from other display and advertising networks. Web site owners are just caught in the middle.

  • http://twitter.com/lizstraws Elizabeth Strawford

    I’ll refrain from wading into the main argument here, but I’d like to say I have always found Google Webmaster Tools search query reports to be a valuable tool, and I’m sure we can see larger selections of data in the new SEO reports in Analytics. Whether this is going to change or not, I don’t know.

  • http://www.askforeman.com/ Stephen Foreman

    I have been hit by this as much as anyone else, and it’s becoming harder and harder, without third-party tools and logging keyword campaigns through site design, to see where my target audience is coming from. Having said that, although we now can’t see the source of the traffic, at least the traffic is still there and it’s not a Google Panda type scenario instead…

  • https://plus.google.com/115081267187799351846?rel=me Michael Hart

    “The reality is the vast majority of webmasters do not want to tie keywords to users; they just want to use the information in aggregate”

    This is exactly what Webmaster Tools is for. https://www.google.com/webmasters/tools

    The only real flaw I see with Webmaster Tools is the data “expires”, though it can be exported and/or integrated with Google Analytics, whichever you prefer.

  • daveintheuk

    WMT is okay, but:

    1. Why should Google have a monopoly and control over this data?
    2. It is limited, both in scope and in that it expires.
    3. You can’t track individual sessions (I don’t care *who* they are; but it is handy to see a user’s journey with the KW data).

    What annoys me most is that Google plays the “protecting users privacy” card with this, when they are actively using the data themselves, combined with personally identifiable information – and giving it away to advertisers. It’s their hypocrisy that angers me most. Yet another case of Google tilting the playing field to their advantage.

  • https://plus.google.com/115081267187799351846?rel=me Michael Hart

    2 is resolved by exporting your data. 2 and 3 are resolved by Google Analytics (Analytics has lots of features you don’t seem to know about).

    Honestly, I’m glad Google strips this data… Scumbag web developers will eagerly abuse referrer data, not only in regard to privacy but also to create spammy websites.

    Google has found a solution that still gives developers the data (even organizes it for them in ways FAR superior to alternatives), but ensures that user privacy is guaranteed and spam online is just a little bit harder to create.

  • daveintheuk

    Why should I have to export incomplete data from WMT and hack together a solution when I could do it in any number of existing packages before this ludicrous change? Do you think a world where the only place KW data is available is WMT is preferable to the choice that webmasters currently have?!

    You’ve missed the point when talking about GA functionality; I’m well aware of the functionality – but now I cannot get complete KW data in there alongside that functionality, as the data is not being provided.

    If you think for one moment that Google’s intentions here were anything but selfish, I commend you for your optimism. Even if this does stop a few rogue spammers, the price is too high for other webmasters.

  • https://plus.google.com/115081267187799351846?rel=me Michael Hart

    “…
    but now I cannot get complete KW data in there alongside that functionality as the data is not being provided.”

    YES YOU CAN. You clearly did not read what I previously wrote. Nonetheless, you CAN link your Google Analytics profiles to Google Webmaster Tools sites. The data is easy to access using custom reports or the default reports Google provides, just like before.

  • les_madras

    I believe Google is doing this to keep search terms data from Facebook.  The “like” button is everywhere, and Facebook presently records referrer data but does not yet use it for ad targeting.  Their big growth driver post-IPO was to do behavioral and search-intent targeting of ads based on the user’s navigation history.  With this change, Google is taking away search-intent, which is the most valuable of all targeting information.

    In the absence of search-intent information, Facebook ads will not be effective in the top 3 ad sectors of the economy: Health, Auto and Finance.  Nobody talks about any of these on Facebook.

  • donthe

    You’re obviously not a webmaster or you wouldn’t be making such ridiculous statements. Using GWT data within GA is not helpful.

    You need to wait two, usually three, days for the data to arrive. Two days is ridiculous when you are trying to analyze a drop or increase in traffic.
    In addition, the landing page and the query are not interlinked. It’s just rows of keywords and then rows of URLs. No manipulation of the data is possible. I have never found it helpful for anything.

  • https://plus.google.com/115081267187799351846?rel=me Michael Hart

    While the delay is annoying, it’s simply that… annoying.

    I’m fairly certain you can use custom reports and use the data just as you ever would; I’m not sure what you’re trying to do (you’re extremely unclear), but I’ve been able to reproduce all of my old reports simply by changing a few options on my old custom reports.

  • http://www.tennis-artikelen.nl/tennistas.html Tennistas

    I think this is bad news and I fear it’s only getting worse. I think within two or three years we’ll all be paying for all analytics…

  • http://www.twitter.com/daveculbertson Dave Culbertson

     I think this is spot-on. It’s really about keeping keyword data out of Facebook’s hands, and the hands of third party targeting networks.

  • Dudibob

    Google does seem to be blocking some AdWords data too.  They show you the statistics for the keyword idea that you are bidding on – let’s use the old example of ‘red widget’ on a broad match.  So you get the statistics on ‘red widget’, but what are the actual queries Google is showing your ads for under this umbrella? Well, if you look at the keyword overview, a lot of the data has been blocked.

    If you use AdWords, try it yourself: in the keywords, see search terms, all keywords, and you’ll see some data – but the data doesn’t add up to the rest.  Scroll to the bottom and read the mouse-over help text from the ‘Other Search Term’ totals and you get a number of bullet points; the one of interest is:

    The user has blocked their referrer URL from being passed on to the destination website. 

    So this effectively makes broad keywords dead as we can’t see what crap they’re bringing through to apply negatives!

  • mikitiki

    It’s not true – you can use encrypted search in another language.

    For example, https://encrypted.google.com/webhp?hl=de for Germany.

    Just add your country extension at the end, or use the EFF’s HTTPS Everywhere extension for Firefox. Your country usually appears in the bottom-right corner, and when you click it, the extension takes you to the encrypted search, adding the country extension as I showed above.

    Thanks

  • http://www.askforeman.com/ Stephen Foreman

    This is really a genuine problem with almost a monopoly in the search engine market. Google has developed such a strong product because of the tools it offers, and in the long run, as you have suggested, it could start to charge for its services. Then, unless there is a valid alternative with free tools, people could be forced to pay for an SEO advantage. Then a new startup will offer free tools and the cycle will begin again.

    Like the Matrix, but for the SEO sector!

  • http://profile.yahoo.com/5WRLAUI2PMD65EIECPSSQGGSPU Christian

    I’d like some clarification:

    Would these “not provided searches” only show up in the event that the user has “Secure Search” activated within Firefox and they are logged into their Google Account or is this change independent of the user being logged into their Google account?

    I am also confused. So keywords are now being considered protected data, even though I would have no idea who the users are until after they convert – and even then, in most analytics scenarios, you only see them as “groupings” of conversions from KWs and not individual user sessions with names, addresses, etc. (for various reasons).

    I think this type of move to block KWs coming in from search will have a negative impact on analytics, since I won’t be able to follow user sessions from the KW level through the conversion funnel, which is going to make tracking ROI much murkier.

    It will also make sites harder to optimize through SEO to drive conversions, since I’ll have a good idea of rank, but not of much else.  If I don’t know which KWs are converting best, it is more difficult to know where to invest additional SEO spend.

    Long run I could see this having a negative impact on SEO investment, as businesses chase more trackable business via PPC or other means. It will be tougher to justify SEO programs.

    We don’t/can’t use Google Analytics, and Webmaster Tools is not a conversion tracking suite.

  • Vijay (Prithvi) Singh Chauhan

    I am not getting 1.5K-odd visitors every day, but my daily pageviews reach 600, and of that, 90% is search traffic. Even I face this problem, and it’s really bad because it keeps us from tracking our growth on a particular keyword. Out of 520 pageviews yesterday, 38 of them were filed under not provided. I just wonder in what terms this can benefit Google..
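    Just to put that in perspective, the share works out like this (using the numbers above):

```python
pageviews = 520     # yesterday's pageviews, per the figures above
not_provided = 38   # pageviews whose keyword arrived as "(not provided)"

share = not_provided / pageviews
print(f"(not provided) share: {share:.1%}")  # -> (not provided) share: 7.3%
```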

  • http://twitter.com/AnalyticsNinja Yehoshua Coren

    While I “sort of” hear the privacy issue as it relates to keyword data from referrers, for the most part it falls on deaf ears.  Ultimately, I think that the obfuscation of referral data is a bad thing for the web.  Publishers want to make sure that their sites are as relevant as possible for searchers.  Time and again I have seen publishers discover that certain pages were ranking for certain keywords when indeed they had more relevant pages on their site to present to searchers.  What do publishers do in that case?  SEO.  They can make onsite changes so that the most relevant content is available to searchers.  Without keyword data from search referrers, this process is not possible.  The argument that Google gives for why advertisers need search referral data, namely “to measure the effectiveness of their campaigns and improve the ads and offers they present you”, is absolutely true in the world of organic search as well.  I believe that the privacy concern of referral data is far outweighed by the negative impact that these changes have on webmasters / publishers.  The average user definitely cares about privacy.  But they also care about their experience on the internet not frustrating the hell out of them as relevancy decreases.

  • http://www.fulltraffic.net/ Fede Einhorn

    People advertising in Google are still able to monitor which keywords work better, as Google stores on its servers all the data about the “clicker” and the ad clicked, even if the browser hides the referrer. It’s not like you click on a result that is hyperlinked to http://www.domain.com; you click on a http://www.google.com/blabla, which means that first Google saves EVERYTHING about you and the click, then you are redirected to http://www.domain.com.
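    A sketch of what such a click-through link carries – the parameter names here are illustrative of the /url redirect format, not an exact spec:

```python
from urllib.parse import urlparse, parse_qs

# A click on an ad is a link to a google.com/url?... redirect,
# not directly to the destination site.
click = "https://www.google.com/url?q=red+widgets&url=http://www.domain.com/"

params = parse_qs(urlparse(click).query)
destination = params["url"][0]   # where the user actually ends up
logged_term = params["q"][0]     # what Google can record before redirecting

assert destination == "http://www.domain.com/"
assert logged_term == "red widgets"
```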

  • luckydad2

    Danny-

    Thanks for this post.

    In my humble opinion, however, you mislead the reader in your second and third sentences:

    “It will also further reduce the ability for publishers to know how people find their sites in Google — except for Google advertisers. A loophole in Google Secure Search continues to provide them with this data.”

    That’s not entirely correct. If Vinny Visitor searches for “Amazon” on Google encrypted, he gets many organic search results – links to Amazon.com, Wikipedia, Amazon’s Twitter account, etc. – and some paid search ad links to Amazon.com.

    If Vinny Visitor clicks on the paid search ad link for Amazon.com, Google sends on the referrer information, including the search term, to advertiser Amazon.com, even though Vinny Visitor searched on Google encrypted.

    BUT if Vinny Visitor clicks on the Organic search result link for Amazon.com, Google does not send on the referrer information or the search term to (Advertiser) Amazon.com, _because_ Vinny Visitor searched on Google encrypted.

    See what I mean? Google is still sending a visitor to an “Advertiser” but _via_ an organic search result. Your first two sentences make it sound like Google sends referrer and search terms (data) to Advertisers _regardless_ of how Google is sending them to the Advertiser. That is not true and, again IMHO, misleading.

    You correct this in your first reply comment by saying:

    “With Google SSL search, Google deliberately prevents referrers from
    passing in either case with organic listings but allows them to pass for
    ad clicks.”

    But even that could be improved – I’d rather you say:

    “With Google SSL search, Google deliberately prevents referrers _including search terms_ from
    passing in either case with organic listings but allows them to pass for
    ad clicks.”

    Google does not always provide advertisers with referrer and search term data – it only always provides that data for searchers who click on advertisers’ paid search ad links.

    Google does not send referrer data to advertisers when visitors are logged in to Google, or are using Google encrypted  AND  the visitor clicks on an _organic_ search result link.

    IF a visitor to Google is not logged in AND is not using Google encrypted, Google does send on referrer information to the website, regardless of whether the visitor clicks on a paid search link or an organic search link.
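    To condense those rules into one place (a sketch of the policy as I understand it, not Google’s actual implementation):

```python
def referrer_sent(encrypted_search: bool, logged_in: bool, click_type: str) -> bool:
    """Does Google pass the referrer (with the search term) to the site?
    Per the rules above: ad clicks always get it; organic clicks get it
    only when the search was neither encrypted nor from a logged-in user."""
    if click_type == "ad":
        return True
    return not (encrypted_search or logged_in)

assert referrer_sent(encrypted_search=True, logged_in=False, click_type="ad") is True
assert referrer_sent(encrypted_search=True, logged_in=False, click_type="organic") is False
assert referrer_sent(encrypted_search=False, logged_in=False, click_type="organic") is True
```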

     

  • Harry Clark

    Search engines such as Google and Bing are constantly informing all of us SEOs and publishers that we need to provide content that is relevant to the user, but how can we do that when they are taking such vital information away from us? Frustrating, to say the least.

    However, I do understand that they want to protect users’ online security, so maybe I’m just moaning because it makes my job that little bit harder.

  • rare poetry

    Amazing, nice, so cool.