  • http://www.everfluxx.com/ Everfluxx

    Is it just me, or does this suck?

  • http://www.diarizing.com Stefano Gorgoni

    @Everfluxx Clearly it’s not just you ;) Every step back in transparency from search engines sucks.

  • Antti Nylund

    My jaw dropped when I read their announcement.

    Less confusing?

    Simplified interpretation?

    Based on whose feedback? I can’t imagine anyone who would think that 246,000 is confusing compared to 250,000…

  • pierrefar

    Very nice title you got there :)

    Numbers in Webmaster Tools are representations of what we call buckets (the +/-10% ranges you refer to in http://www.seroundtable.com/google-webmaster-tools-accuracy-12768.html ). These representations work a lot like bucket names, and all we’re doing here is changing the names slightly to remove some confusion. There isn’t really a gain or loss in transparency or accuracy because, although the names changed slightly, they still represent the same buckets. The goal, now as before, is to make webmasters aware of significant changes to long-term trends and not bog them down in minute fluctuations.

    Had we changed the sizes of the buckets by making them bigger, then yes, you could make an argument about loss of accuracy, but we didn’t change the bucket sizes. (A short sketch of this relabelling follows below.)

    Hope this clarifies things a bit more.

    Thanks,
    Pierre
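
    To make the relabelling described above concrete, here is a minimal Python sketch. The +/-10% bucket width is taken from the discussion above; the round_sig and bucket helpers are illustrative assumptions, not how Webmaster Tools is actually implemented.

        # Rough sketch: relabel a bucket to fewer significant digits.
        # The +/-10% width is an assumption from the discussion above;
        # none of this reflects Google's actual code.
        from math import floor, log10

        def round_sig(x, sig):
            """Round x to `sig` significant digits."""
            if x == 0:
                return 0
            return int(round(x, -int(floor(log10(abs(x)))) + sig - 1))

        def bucket(x, width=0.10):
            """The assumed +/-10% range a raw count falls into."""
            return (x * (1 - width), x * (1 + width))

        raw = 24_912_345              # hypothetical "real" impression count
        print(round_sig(raw, 3))      # 24900000 -- old-style label
        print(round_sig(raw, 2))      # 25000000 -- new-style label
        print(bucket(raw))            # the underlying range is the same either way

    Both labels name the same underlying bucket, which is the point above: the precision of the estimate does not change, only how many digits of the bucket’s name are displayed.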

  • http://www.everfluxx.com/ Everfluxx

    @pierrefar: OK, I think I finally understand what you did now. No, seriously. It took me a while to figure this out, but if I understand you correctly: the new numbers are in the same ranges as before; you just changed the range names to show one or two significant digits instead of two or three.

    For example, “24,900,000” already represented a number in the “25M +/- 2.5M” range (i.e., from 22,500,000 to 27,500,000); let’s assume the “real” number is 24,999,999. GWT will now show that number as “25,000,000”. Is that correct? (See the quick check below.)

    If so, it is indeed true that there is no “real” loss of accuracy, since the previous representation was already an approximation, and the degree of precision (+/-10%) remains unchanged. Nonetheless, I think it is very counter-intuitive and hard to grasp for the average Joe webmaster!

    Just my $0.02 (+/- $0.002)
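
    For completeness, a quick check of the arithmetic in the example above; the 25M centre, the +/-10% width and the 24,999,999 figure are taken from the comment, the rest is illustrative.

        # Sanity check of the worked example above (assumed values).
        center = 25_000_000
        low, high = center * 0.9, center * 1.1   # 22,500,000 .. 27,500,000
        real = 24_999_999                        # the assumed "real" number
        assert low <= real <= high               # still inside the same bucket
        print(abs(center - real) / real)         # ~4e-08, far below the 10% bucket width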

  • http://www.andreamoro.co.uk/ Andrea Moro

    What idiocy is this? I always complain about the accuracy of GWT, so what’s the point of rounding the numbers up?
    Are you listening to your users? I don’t think so. Why don’t you show us where all those requests to make such a change are?

  • http://wefollow.com/echwa Damien Anderson

    I believe this change helps Google more than webmasters. Who would submit feedback saying ‘Er, Google… can you dumb down those numbers? My math is bad.’ Please!!

    Every change made to restrict, simplify or obscure data is beneficial in driving more value to Google’s paid programs. I have always found the difference in the Global and Local search ‘estimates’ between the Keyword Tool and WMT funny.

    According to a friend who worked at Overture, they used to purposely play with the numbers to suit their needs.

  • http://seo-cubed.com SEO SEO SEO

    It appears that even prior to this change the data was sampled. Now that rounding is applied on top of that sample, the data is much less usable. It was already pretty useless for any website with more than 10,000 visits per month, and it’s now even less useful for such large sites with heavy traffic, though it still seems okay for smaller websites.

    My take is that instead of trying to measure your CTR, just focus on traffic by keyword, and perhaps on bounce rate by keyword as well. Generally with SEO there is a lot of data out there, and when there is so much to do, it’s best to focus on a few key metrics and get good at improving those rather than chasing other metrics like this CTR data out of GWT.