Firefox To Use Google Secure Search By Default; Expect More “Not Provided” Keywords To Follow

The popular Firefox browser is on track to use a secure method of searching Google by default, a change that will help prevent potential “eavesdropping” on what people are searching for. It will also further reduce the ability of publishers to know how people find their sites in Google — except for Google advertisers. A loophole in Google Secure Search continues to provide them with this data.

“We are currently testing the change to use SSL for built-in Google searches in our Firefox nightly channel. If no issues are uncovered, it will move through our Aurora and Beta release channels before eventually shipping to all our Firefox users. This will include migrating the changes to our non-English versions of Firefox, as well,” said Johnathan Nightingale, Director of Firefox Engineering, when I emailed Firefox about the posted change.

How The Change Happened

Privacy advocate Christopher Soghoian noted the change on his blog today. Back in February 2011, he pushed for secure search to be the default in Firefox. At that time, Google Chrome engineer Adam Langley said that using a secure version of Google known as Google Encrypted Search wouldn’t work:

We would welcome Firefox giving their users the option to use encrypted search. However, at this time we don’t feel that our encrypted search offers the features and speed that our users expect and so we wouldn’t want it to be the default. We are working towards making encrypted search as fast and complete as unencrypted search, but we’re not there yet.

Since then, Google made a different method of secure searching, known as Google SSL Search, the default at for signed-in users. That renewed discussions about making secure search at Google the default for Firefox users. Both Langley and another Google employee, Mike Graboski, made comments that suggested Google had no issues with Firefox making the switch.

From Langley:

We’re happy to be offering SSL search for our signed-in users on, and we’ve received a lot of positive feedback. We want to make it available on other Google domains as well, but we’re still working on that.

From Graboski:

Google’s search team is ok with Firefox using for search suggestions, so please use this endpoint.  Thanks!

Google confirmed for me that Graboski’s statement is correctly interpreted as a go-ahead for the Firefox team to make the switch, if it wanted to.

The change was formally made yesterday. As the Firefox statement notes, unless there are issues that crop up, all Firefox users who search using Firefox’s built-in features, such as its search box, will have their searches done using a secure connection.

The only exceptions to this will be for Firefox users who have changed their default search engine from Google to something else or for those using the Russian version of Firefox, which uses Yandex as its default search engine.

Impact On Consumers

The shift means more security for millions of Firefox users. It will make it harder for outsiders to potentially eavesdrop on what someone is searching for.

Just as secure connections protect someone’s credit card numbers when buying things online, secure connections also mean that what someone is searching for can be seen only by Google and the person who is searching, with two important exceptions: Google’s advertisers and those who use Google Webmaster Central.

Those exceptions can’t be dismissed, even though the privacy risks with either of them are relatively small. When Google turned Google SSL Search on by default last year for logged-in users, it pitched this as protecting privacy. Nevertheless, it went out of its way to leave a loophole open for advertisers. It also seems to be ignoring the hole with Google Webmaster Central.

Make no mistake: searching was made massively more secure by Google’s move, and Firefox’s change will extend that security to yet more people. But if the goal is to fully protect privacy, Google would upgrade the entirely different Google Encrypted Search service, and Firefox would use that.

The Privacy Loopholes

Let’s revisit secure searching at Google, to understand how with both flavors offered, search data — including potentially very private searches — can escape despite encryption.

Google has two secure searching products, Google Encrypted Search and Google SSL Search. With either, no one can eavesdrop on the searching you do with Google. That’s a big, welcome change. But when you click on a listing or ad at Google, what you searched for will be contained in what’s called “referrer data” that your browser passes along to the destination site.

For example, do a search for “erectile dysfunction,” click on a listing, and that search term is in the referrer data that normally gets sent to the site you visit by Google. The same thing would happen if you used Yahoo or Bing, by the way. It’s part of how browsing software itself works.

In most cases, the site you visit isn’t going to know who you really are. They get a fairly anonymous string of numbers called an IP address. But with some work, or perhaps by combining the IP address with cookie data or other information, they might be able to figure out more about who you really are.

Another way that search terms are revealed is through two Google programs for publishers: Google AdWords and Google Webmaster Central. With Google AdWords, you purchase ads, and you can see the search terms that people use when clicking on those ads. With Google Webmaster Central, you’re shown the search terms people used to reach your site over the past 30 days.

Neither of these programs links IP addresses with search terms, so there’s really no good way for publishers to match searches back to a particular person. These are helpful and relatively “safe” ways Google helps publishers without harming user privacy.

Think of it all as having a continuing “search conversation” with Google. Secure search prevents anyone from hearing the full conversation. But in some instances, when you click through to a particular site, referrer data allows that site to hear a tiny fragment of that talk. Even then, it still probably doesn’t know it was you who said it.

In short, letting search terms “escape” or “leak” via referrer data is still fairly private for the vast majority of searches that happen out there, I’d say. Despite this, Google decided this data was so sensitive that it blocked non-advertisers from getting it back in October. That makes it all the more puzzling that it hasn’t blocked its advertisers, as well.

SSL Vs Encrypted

Both versions of Google’s secure search leak referrers. Google Encrypted Search does this for technical reasons. Google SSL Search does it because Google deliberately wants referrers to be passed along to its advertisers.

Google Encrypted Search was launched by Google in May 2010. Originally, you could enable it by going to Note the additional S in the https prefix. That indicated the secure version of Google search was being used. However, the service caused problems for some schools that wanted to use other Google products. It was moved to a new location,

When you use Google Encrypted Search, referrers are blocked entirely with one key exception: if you go from Google Encrypted Search to another secure site. It’s a technicality in how browsers work. When you have a secure connection to one site, no referrer data is passed along to the next unless that site also opens a secure connection for you.

This is a tiny security issue. That’s because it’s rare that you’d go from Google Encrypted Search to another secure site, since most sites don’t run secure servers that turn up in search results.

Google SSL Search largely came about in October 2011. That was when Google announced that, by default, it would enable a secure searching connection for anyone who was logged into Before then, I’m pretty sure you could go to and establish a secure connection if you wanted, but it’s hard to pin this down. But really, October 2011 was the key date. Suddenly, millions of people searching on Google found they had a secure connection on by default.

How about referrer data? Google made a point to block this for anyone who clicked on its “editorial” or non-paid listings, saying this was designed to protect privacy. However, it continued to provide referrer information to its advertisers. Click on an ad after searching for “erectile dysfunction,” and an advertiser would receive both what you searched for and your IP address linked to that search.

Why Google didn’t block ALL referrers was perplexing. If search terms themselves were potentially private, as Google started arguing, then letting any of them out was bad. At best, Google concocted an odd, far-fetched defense that advertisers could run so many ads that potentially, they still might see search data.

I’ve found this unconvincing, as I explain more in 2011: The Year Google & Bing Took Away From SEOs & Publishers. That story also explains why, if search terms are so sensitive, Google should be filtering them in some way from Google Webmaster Central, as well. Also see my other article, Google’s Results Get More Personal With “Search Plus Your World”, for more about this.

The bottom line is that both versions of Google secure search allow referrers to escape, but Google Encrypted Search does this far less than Google SSL Search. If Firefox was really serious about privacy, it would use that. But it can’t, not easily, and some of the reason for that comes back to Google.

Secure Searching Beyond The US

Firefox isn’t just used by those in the US. There are versions of it for countries all over the world. It’s better if these country-specific versions point to the correct country-specific versions of Google (except, as mentioned, the Russian version, which uses Yandex).

Google Encrypted Search is really a US/English-language service. There’s no ability to change the interface language from English to German that I can see. To even try this, you have to log in. When logged in, even if you set your language to German, Google Encrypted Search keeps speaking English back to you as the overall interface language.

In contrast, using Google SSL Search means that Firefox can point to the domains where SSL search is formally supported already:, Google UK, Google France and Google Germany. The latter three got support in January. In March, Google announced that it would be coming to more countries over the course of several weeks. I can see it already works now for places like Google Australia, Google Poland and even Google Iceland, even though Google hasn’t formally announced this.

For Firefox, using Google SSL Search makes more sense. Country-specific versions of Firefox can use the right Google SSL Search for the right country, something that Google Encrypted Search wouldn’t allow.

Why Not Kill All Referrers?

Another way that Firefox could make things more secure would be to kill all referrer data within the browser itself. It could do this, and then there would be no leakage of terms from Google or from other sites when people surf the web.

I asked Firefox about this, but it didn’t provide any answer on that question, only the quote I have above.

I asked Microsoft the same for Internet Explorer, but I haven’t heard back yet.

Google told me that it doesn’t have anything to announce about this, in relation to its Chrome browser.

Fallout For Publishers

The move will be further bad news for publishers, who have come to depend on search term data passed along by referrers. It’s not uncommon to hear sites report that 20% or more of their search queries are now reported as “not provided” due to Google’s blocking.

Yesterday, I even published an example on my personal blog of how 35% of my search terms are now withheld, based on traffic for March 19.

The Firefox change to Google SSL Search means that this “not provided” percentage will only climb higher for all publishers. It wouldn’t be so bad if Google provided this data on a long-term basis through Google Webmaster Central. As I explained, this is a safe way for Google to tell publishers how people are reaching their sites through search while also protecting user privacy.

Unfortunately, Google only lets you gather this data back for 30 days. If publishers haven’t been tapping into it regularly, they can’t maintain trends that they’ve had before.

I continue to wish that Google would expand this data. The lack of attention here gives the impression that Google really doesn’t care that much about supporting publishers in this regard. That includes even Google advertisers, who also have “free” listing data that’s been lost.

Why Doesn’t Chrome Offer Google Secure Search?

Another twist to this story is that Firefox’s move means that it’s going to be offering a more secure way to search Google than Google’s own Chrome browser does.

By default, Chrome won’t initiate a secure connection with Google Search. If you’re logged in, however, it will maintain the default secure connection with Google.

Will this change? “We don’t have anything to announce about Chrome at this time,” Google told me.

On Secure Search, It’s Google: 2, Bing & Yahoo: 0

While I have issues with Google for allowing some search terms to leak through referrer data, Google deserves serious kudos for offering secure search overall. Its two big rivals, Bing and Yahoo, don’t. As Soghoian put it, when I said it seemed kind of crazy that Google has two ways of secure searching with some referrer leakage:

Better for Google to have two secure search sites, than Microsoft and Yahoo, which have zero.

How about it, Microsoft? The company told me:

Bing does not offer SSL.  To protect themselves from being unknowingly redirected we recommend people install OpenDNS.

Of course, if you really want to be secure, you could always try Duck Duck Go. You can force a secure search there by going to (surprisingly, this isn’t the default). As for referrers, it doesn’t pass any on.



About The Author: Danny Sullivan is a Founding Editor of Search Engine Land. He’s a widely cited authority on search engines and search marketing issues who has covered the space since 1996. Danny also serves as Chief Content Officer for Third Door Media, which publishes Search Engine Land and produces the SMX: Search Marketing Expo conference series. He has a personal blog called Daggle (and keeps his disclosures page there). He can be found on Facebook, Google+ and microblogs on Twitter as @dannysullivan.

  • Neale

    Looks like I might need to start doing a little pay per click just to see what terms people are using to find my sites.

  • Bob Bigellow

    I’m having a difficult time, reading this article, determining which side of the issue you’re on. In many cases, you talk about Google “blocking” referrer data. In other cases, you suggest they’re “leaking” referrer data.

    Both terms are misleading in their own ways.

    When it comes to “blocking” referrer data, this is done by the browser, not by Google. By design, browsers do not pass referrer information when it’s from a secure connection. So, when Google switched to secure search (whether it was the specialized domain or the main site), it was the browsers that were now blocking the referrer data.

    The problem is, people who pay for ads are paying for two things. First, they’re paying for the ad itself. Second, they’re paying for the ability to track the success of that ad. Since the ads are based on a bidding system, it is critically important that advertisers have access to the kind of analytics that would be needed to determine how valuable a search term is, so that they aren’t over-bidding.

    So, to be a good business partner to advertisers, Google needed to capture this data on behalf of the advertisers paying for the inclusion. So, they effectively work around the browser blocking issue. They are able to do this because: 1) They know the owner of the ad 2) The ad is hosted on their own site, so they can employ any number of techniques to capture data on behalf of the advertiser.

    It’s probably safe to say that people click on organic results more than they click on ads. So, whichever tracking method is used (whether it is an interstitial page or capturing data based on a click event), there is going to be a certain amount of computing overhead. For them to employ this same technique on organic results would require a huge increase in processing needed by Google’s servers without any compensation. I’m sure they are taking this into account while looking for alternative solutions.

    The other issue is the fact that there’s little point in capturing this data for domains that have not been claimed by a webmaster. So, this leaves them with an engineering incentive to only add tracking to organic results for those who have claimed the domain name for purposes of the Webmaster Tools.

    It’s very possible they have considered that this additional analytics (since it would require more resources) might be part of a premium service. For instance, you might pay a small fee to get an “Enterprise” version of Google Webmaster Tools or access via Google Analytics. Considering there is some overlap between Google Analytics and Google Webmaster Tools, perhaps they are looking to streamline these services a bit. Imagine being able to log into Google Analytics even if you haven’t added any code to your site, but just claimed the domain name. Imagine you have access to the types of analytics found in Google Webmaster Tools. Then, by adding some code to your site, you have more data. By using AdWords, even more data. By paying a premium, even more data still.

    It’s a tricky situation. If they switched to all SSL all the time, allowed the browsers to block the referrer data, and made no attempt to capture this themselves on behalf of advertisers, they’d lose their advertisers to search engines that don’t have SSL search and can provide these analytics to their paying customers. They’d call it “blocking referrer data”. If they circumvent it for paid advertisements AND organic results, they’d have privacy advocates claiming they are “leaking referrer data” and making SSL search less useful. So, they found a middle ground. Let it be blocked for organic results, let it be captured for their paying customers who rely on the data.

    Now, it turns out, this problem is being solved in the open with a new standard for browsers to support. It looks like they may use this standard to allow the referrer data to flow, even in SSL situations. So, they’re just doing this at the pace of standards implementation, which can take months or years.

    On the flip-side, you might ask… why secure search only for logged in users? It’s likely for two reasons. First, someone who is logged in now gets personalized results. Information from Google+ is embedded in results. Maybe one day, a search on will also show emails matching the terms from Gmail, documents matching the term in Google Docs, files matching the term in Google Drive, etc… With this combination of both public and private data in results, it makes sense to secure search for logged in users. Users who are not logged in are only seeing public data, so it’s less of a concern about securing that search. If they are particularly paranoid that people might know what they are searching for, they’d just log in… or use the special domain to encrypt their search.

    The second reason might simply be resources. Encrypting results takes processing power. Google handles a TON of traffic. So, it only makes sense for them to roll this out in stages. First, with logged in users. Then, convince a bunch of people to always be logged in via Google+. Once that flattens, they can introduce it into another browser. It may very well be that Google’s data shows that more people are searching on Google while using Chrome than while using Firefox. So, it would make sense to slowly roll out this feature further by allowing it to be implemented in Firefox as the next small step. Then, after they see how the boat holds up to that amount of increased SSL traffic, they can implement it into Chrome. Maybe the final step would be to just have it active for everyone, whether logged in or not. This time might be a far way out if they need to beef up resources to handle the load and traffic, or if they need to first solve the referrer problem before they upset the webmasters and advertisers of the world with simply not enough data to extrapolate from.

    So, just to be clear, which side of the fence are you on? Do you see Google as “blocking referrer data” while siding with advertisers and webmasters, or do you see Google as “leaking referrer data” while siding with privacy advocates? Or, are you just sitting on the fence and playing both sides against each other?

  • Danny Sullivan

    Bob, Google SSL Search’s blocking of referrer data is not done by the browser. It’s done by Google. Deliberately.

    With your browser, going from SSL to non-SSL should pass no referrers; going from SSL to SSL will pass them. 

    With Google SSL search, Google deliberately prevents referrers from passing in either case with organic listings but allows them to pass for ad clicks. 

    This was explained in the article above, but it’s even further explained in the background articles I’ve linked to before.

    Before October 2011, Google didn’t suggest that referrer data was somehow private information that needed to be blocked. It only claimed that after starting SSL search by default. If that’s the case, then it doesn’t matter what the advertisers may or may not want. If the data is private, you block it.

    The argument about what Google has to do for advertisers to do analytics is flawed. They can track the success of an ad through the Google AdWords system without having it tied to an IP address in the way that providing a referrer allows. I.e., advertisers could be limited to exactly the same “safe” release of data that publishers now get with Google Webmaster Central.

    Of course, doing this prevents Google from allowing advertisers to make use of retargeting, which is perhaps one reason Google has allowed this hole to remain.

    The argument about capturing data for domains that have not been “claimed” by a webmaster is also flawed. The data is there. Google has it all. It’s not like it suddenly starts recording it only after a domain is verified in Google Webmaster Central. It’s all part of Google’s regular clickthrough logging that is done.

    The issue seems mainly to be that Google doesn’t want to spare the machine power to make this data accessible to site owners for longer than 30 days. It could do this; it doesn’t consider it a priority.

    It’s possible this is all part of a secret uber plan to get site owners to pay. I don’t think that’s the case. I think that some within Google wanted to drop all referrers, because that would have been most secure. I think that the advertising side of Google screamed bloody murder that this couldn’t happen. And the advertising side won.

    That’s my suspicion trying to read the tea leaves. I don’t have any inside sources saying this. I do know that it would be incredibly difficult for Google to launch a “premium” analytics tool that releases the same data that we’ve been told for months is now too sensitive to share through referrer strings.

    I know exactly the argument why this might apply to Google Search Plus Your World. Again, it was in one of my background articles above. It doesn’t add up, and you can read for yourself why in that article.

    The resources argument also doesn’t add up, in terms of encryption power. Google has given the go ahead, as I explained, for Firefox to make encrypted search the default for millions of searches each day for logged out users. If they have a resource problem over this, that would have been flagged exactly as it was in 2010, as I also explained, when this was first proposed.

    As for the side I’m on, it’s privacy. Google has declared that search terms themselves are potentially private and therefore can’t be shared with publishers. But it has left all these loopholes. It’s OK to share with advertisers. It’s OK to share through AdWords. It’s OK to share through Google Webmaster Central. Leaving loopholes isn’t good, when it comes to privacy.

    So to go back to what I wrote before on this subject in early January:

    “Blocking referrers is a completely separate issue from encrypting the search results themselves. That’s good and should be continued. But Google is deliberately breaking how such encryption works to pass along referrer data to its advertisers. Instead, Google should block them for everyone or block them for no one. Don’t play favorites with your advertisers.”

    And after Search Plus Your World launched:

    “Today’s change does nothing to change my view that Google needs to revisit the referrer blocking and either make it a block for everyone, including advertisers, or find a better way to filter search terms that get made visible in various ways.”

    The ideal solution in my book would be this. Referrers are blocked for everyone. Publishers won’t like that, but I’ve written before that referrers are likely to eventually get blocked in browsers anyway. It’s like fighting against the tide over that. But while referrers are blocked, Google expands the data it reports to publishers through Google Webmaster Central, the “safe” method I’ve mentioned before — and examines ways to ensure that the possibly “private” searches are somehow filtered out.

  • daveintheuk

    I still fail to see the point of this (from a user perspective, I can see why it is attractive to Google to hoard the keyword data – which they *can* tie to a person).

    To use your example, assuming a user searches for “erectile dysfunction”, you would hope that the user will be directed to a page about erectile dysfunction. The vast majority of sites don’t use SSL; so let’s assume the site the user ends up on doesn’t – anybody could see where the user ends up, they just lose the keyword information… they still have the user’s IP and the fact they are browsing a page about erectile dysfunction.

    The webmaster of the page however loses some valuable data through this – perhaps people are searching for “erectile dysfunction in men under 25” or “erectile dysfunction amongst users of drug X” – that is valuable data for the webmaster (who again, I am pretty sure doesn’t care *who* is searching for those terms – just what terms people are searching for).

    The reality is the vast majority of webmasters do not want to tie keywords to users; they just want to use the information in aggregate; Google does have the means, motive and opportunity to use this data.

    This is just smoke and mirrors for Google to manipulate the search results while it continues to gather and use swathes of data about the users it pretends to care about.

  • Francisco Debs

    Google is trying to hide the keyword from other display and advertising networks. Web site owners are just caught in the middle.

  • Elizabeth Strawford

    I’ll refrain from wading into the main argument here, but I’d like to say I have always found Google Webmaster Tools search query reports to be a valuable tool and I’m sure we can see larger selections of data in the new SEO reports in Analytics. Whether this is going to change or not I don’t know.

  • Stephen Foreman

    I have been hit by this as much as anyone else and it’s becoming harder and harder, without third party tools and logging keyword campaigns through site design, to see where my target audience is coming from. Having said that, although we now can’t see the source of the traffic, at least the traffic is still there and it’s not a Google Panda type scenario instead…

  • Michael Hart

    “The reality is the vast majority of webmasters do not want to tie keywords to users; they just want to use the information in aggregate”

    This is exactly what Webmaster Tools is for.

    The only real flaw I see with Webmaster Tools is the data “expires”, though it can be exported and/or integrated with Google Analytics, whichever you prefer.

  • daveintheuk

    WMT is okay, but:

    1. Why should Google have a monopoly and control over this data?
    2. It is limited, both in the scope and in that it expires.
    3. You can’t track individual sessions (I don’t care *who* they are; but it is handy to see a users journey with the KW data).

    What annoys me most is that Google plays the “protecting users privacy” card with this, when they are actively using the data themselves, combined with personally identifiable information – and giving it away to advertisers. It’s their hypocrisy that angers me most. Yet another case of Google tilting the playing field to their advantage.

  • Michael Hart

    2 is resolved by exporting your data. 2 and 3 are resolved by Google Analytics (Analytics has lots of features you don’t seem to know about).

    Honestly, I’m glad Google strips this data… Scumbag web developers will eagerly abuse referrer data, not only in regard to privacy but also to create spammy websites.

    Google has found a solution that still gives developers the data (even organizes it for them in ways FAR superior to alternatives), but ensures that user privacy is guaranteed and spam online is just a little bit harder to create.

  • daveintheuk

    Why should I have to export incomplete data from WMT and hack together a solution when I could do it in any number of existing packages before this ludicrous change. Do you think a world where the only place KW data is available is WMT is preferable to the choice that webmasters currently have?!

    You’ve missed the point where you are talking about GA functionality; I’m well aware of the functionality – but now I cannot get complete KW data in there alongside that functionality as the data is not being provided.

    If you think for one moment that Google’s intentions here were anything but selfish, I commend you for your optimism. Even if this does stop a few rogue spammers, the price is too high for other webmasters.

  • Michael Hart

    “…but now I cannot get complete KW data in there alongside that functionality as the data is not being provided.”

    YES YOU CAN. You clearly did not read what I previously wrote. Nonetheless, you CAN link your Google Analytics profiles to Google Webmaster Tools sites. The data is easy to access using custom reports or the default reports Google provides, just like before.

  • les_madras

    I believe Google is doing this to keep search terms data from Facebook.  The “like” button is everywhere, and Facebook presently records referrer data but does not yet use it for ad targeting.  One of their big growth drivers post-IPO was to do behavioral and search-intent targeting of ads based on the user’s navigation history.  With this change, Google is taking away search-intent, which is the most valuable of all targeting information.

    In the absence of search-intent information, Facebook ads will not be effective in the top 3 ad sectors of the economy: Health, Auto and Finance.  Nobody talks about any of these on Facebook.

  • donthe

    You’re obviously not a webmaster, or you wouldn’t be making such ridiculous statements. Using GWT data within GA is not helpful.

    You need to wait two, usually three, days for the data to arrive. Two days is ridiculous when you are trying to analyze a drop or increase in traffic.
    In addition, the landing page and the query are not interlinked. It’s just rows of keywords and then rows of URLs. No manipulation of the data is possible. I have never found it helpful for anything.

  • Michael Hart

    While the delay is annoying, it’s only that… annoying.

    I’m fairly certain you can use custom reports and use the data just as you ever would; I’m not sure what you’re trying to do (you’re extremely unclear), but I’ve been able to reproduce all of my old reports simply by changing a few options on my old custom reports.

  • Tennistas

    I think this is bad news, and I fear it’s only getting worse. I think within two or three years we’ll all be paying for analytics…

  • Dave Culbertson

    I think this is spot-on. It’s really about keeping keyword data out of Facebook’s hands, and out of the hands of third-party targeting networks.

  • Dudibob

    Google does seem to be blocking some AdWords data too.  They show you the statistics for the keyword idea that you are bidding on; let’s use the old example of ‘red widget’ on a broad match.  So you get the statistics on ‘red widget’, but what are the actual queries Google is showing your ads for under this umbrella? Well, if you look at the keyword overview, a lot of the data has been blocked.

    If you use AdWords, try it yourself: in the keywords view, choose see search terms, all keywords, and you’ll see some data; however, the data doesn’t add up to the rest.  Scroll to the bottom and read the mouse-over help text on the ‘Other Search Term’ totals and you get a number of bullet points, the one of interest being:

    The user has blocked their referrer URL from being passed on to the destination website. 

    So this effectively makes broad match dead, as we can’t see what crap it’s bringing through in order to apply negatives!

  • mikitiki

    It’s not true; you can use encrypted search in another language, for example for Germany.

    Just add your country extension at the end, or use the EFF HTTPS Everywhere extension for Firefox.

    Your country usually appears in the bottom-right corner, and when you click it, the extension takes you to the encrypted search, adding the country extension as I showed above.


  • Stephen Foreman

    This is really a genuine problem with what is almost a monopoly in the search engine market. Google has developed such a strong product because of the tools they offer, and in the long run, as you have suggested, they could start to charge for their services. Then, unless there is a valid alternative with free tools, people could be forced to pay for an SEO advantage. Then a new startup will offer free tools and the cycle will begin again.

    Like the Matrix, but for the SEO sector!

  • سعيد الجهني

    country extension as I showed above

  • Christian

    I’d like some clarification:

    Would these “not provided” searches only show up when the user has Secure Search activated within Firefox and is logged into their Google Account, or is this change independent of whether the user is logged into their Google Account?

    I am also confused. So keywords are now being considered protected data, even though I would have no idea who the searchers are until after they convert; and even then, in most analytics scenarios, you only see them as “groupings” of conversions by keyword, not as individual user sessions with names, addresses, etc. (for various reasons).

    I think this type of move to block KWs coming in from search will have a negative impact on analytics, since I won’t be able to follow user sessions from the KW level through the conversion funnel, which is going to make tracking ROI much murkier.

    It will also make sites harder to optimize through SEO to drive conversions, since I’ll have a good idea of rank, but not of much else.  If I don’t know which KWs are converting best, it is more difficult to know where to invest additional SEO spend.

    In the long run I could see this having a negative impact on SEO investment, as businesses chase more trackable business via PPC or other means. It will be tougher to justify SEO programs.

    We don’t/can’t use Google Analytics, and Webmaster Tools is not a conversion-tracking suite.

  • Vijay (Prithvi) Singh Chauhan

    I am not getting 1.5K-odd visitors every day, but my daily pageviews reach 600, and 90% of that is search traffic. Even I face this problem, and it’s really bad because it prevents us from tracking our growth for a particular keyword. Out of 520 pageviews yesterday, 38 of them were filed under “not provided”. I just wonder in what way this can benefit Google.

  • Yehoshua Coren

    While I “sort of” hear the privacy issue as it relates to keyword data from referrers, for the most part it falls on deaf ears.  Ultimately, I think that the obfuscation of referral data is a bad thing for the web.  Publishers want to make sure that their sites are as relevant as possible for searchers.  Time and again I have seen publishers discover that certain pages were ranking for certain keywords when indeed they had more relevant pages on their site to present to searchers.  What do publishers do in that case?  SEO.  They can make onsite changes so that the most relevant content is available to searchers.  Without keyword data from search referrers, this process is not possible.

    The argument that Google gives for why advertisers need search referral data, namely “to measure the effectiveness of their campaigns and improve the ads and offers they present you,” is absolutely true in the world of organic search as well.  I believe that the privacy concern of referral data is far outweighed by the negative impact that these changes have on webmasters and publishers.  The average user definitely cares about privacy.  But they also care about their experience on the internet not frustrating the hell out of them as relevancy decreases.

  • Fede Einhorn

    People advertising on Google are still able to monitor which keywords work better, as Google stores on its servers all the data about the “clicker” and the ad clicked, even if the browser hides the referrer. It’s not like you click on a result that is hyperlinked directly to the destination site; you click on a Google redirect link, which means that Google first saves EVERYTHING about you and the click, and then you are redirected to the destination.
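    The redirect mechanics described above are also what analytics packages hook into on the destination side: they pull the keyword out of the referrer’s `q` parameter, and when a secure search leaves no query string behind, the visit gets bucketed as “(not provided)”. A minimal sketch of that extraction (the function name and the exact bucketing label are illustrative, not taken from any particular analytics product):

    ```python
    from urllib.parse import urlparse, parse_qs

    def keyword_from_referrer(referrer):
        """Return the search term an analytics tool could extract from a
        referrer URL, or "(not provided)" when no query data survives."""
        if not referrer:
            # No referrer at all (e.g. the browser suppressed it entirely).
            return "(not provided)"
        parsed = urlparse(referrer)
        if "google." not in parsed.netloc:
            return None  # not a Google search referral
        terms = parse_qs(parsed.query).get("q")
        return terms[0] if terms and terms[0] else "(not provided)"

    # Plain HTTP search referral: the keyword is visible.
    keyword_from_referrer("http://www.google.com/search?q=red+widget")  # 'red widget'
    # Secure-search referral with the query stripped: keyword is gone.
    keyword_from_referrer("https://www.google.com/search")  # '(not provided)'
    ```

    The key point is that the destination site only ever sees whatever survives in the referrer string; everything else stays on Google’s servers.
    
    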

  • luckydad2


    Thanks for this post.

    In my humble opinion, however, you mislead the reader in your second and third sentences:

    “It will also further reduce the ability for publishers to know how
    people find their sites in Google — except for Google advertisers. A
    loophole in Google Secure Search continues to provide them with this data.”

    That’s not entirely correct. If Vinny Visitor searches for “Amazon” on Google encrypted, he gets many organic search results (links to Amazon’s site, Wikipedia, Amazon’s Twitter account, etc.) and some paid search ad links.

    If Vinny Visitor clicks on a paid search ad link, Google sends on the referrer information, including the search term, to the advertiser, even though Vinny Visitor searched on Google encrypted.

    BUT if Vinny Visitor clicks on an organic search result link, Google does not send on the referrer information or the search term to the site (the “Advertiser”), _because_ Vinny Visitor searched on Google encrypted.

    See what I mean? Google is still sending a visitor to an “Advertiser” but _via_ an organic search result. Your first two sentences make it sound like Google sends referrer and search terms (data) to Advertisers _regardless_ of how Google is sending them to the Advertiser. That is not true and, again IMHO, misleading.

    You correct this in your first reply comment by saying:

    “With Google SSL search, Google deliberately prevents referrers from
    passing in either case with organic listings but allows them to pass for
    ad clicks.”

    But even that could be improved – I’d rather you say:

    “With Google SSL search, Google deliberately prevents referrers _including search terms_ from
    passing in either case with organic listings but allows them to pass for
    ad clicks.”

    Google does not always provide advertisers with referrer and search term data – it only always provides that data for searchers who click on advertisers’ paid search ad links.

    Google does not send referrer data to advertisers when visitors are logged in to Google, or are using Google encrypted, AND the visitor clicks on an _organic_ search result link.

    IF a visitor to Google is not logged in OR is not using Google encrypted, Google does send on referrer information to the website, regardless of whether the visitor clicks on a paid search link or an organic search link.
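    The rules spelled out above reduce to a small decision table. As a sketch of just that logic (the function and parameter names are mine, not Google’s, and this encodes only what the comment describes):

    ```python
    def referrer_data_passed(secure_search: bool, click_type: str) -> bool:
        """Whether referrer and search-term data reach the destination
        site, per the paid-vs-organic rules described above."""
        if click_type == "ad":
            return True  # paid-ad clicks always pass the data through
        if click_type == "organic":
            # Organic clicks pass the data only on non-secure searches.
            return not secure_search
        raise ValueError("click_type must be 'ad' or 'organic'")
    ```

    The asymmetry is the whole complaint in this thread: the one cell of the table that returns `False` is the secure-search organic click.
    
    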


  • Harry Clark

    Search engines such as Google and Bing are constantly informing all of us SEOs and publishers that we need to provide content that is relevant to the user, but how can we do that when they are taking such vital information away from us? Frustrating, to say the least.

    However, I do understand that they want to protect users’ online security, so maybe I’m just moaning because it makes my job that little bit harder.

  • rare poetry

    Amazing, nice, so cool.
