• https://www.jonidbendo.com Jonid Bendo

    Really interesting that the world wide web now revolves around Google. I really love Google, but I hate to see worthy sites getting this kind of treatment without a clear reason. What I am not aware of, and could not seem to find in the article, is whether he actually tried contacting Google support about this case.

  • https://plus.google.com/u/0/+JakubŠikola Jakub Šikola

    Blaming Google for its changes is the same as blaming a car manufacturer for cancelling production of your favourite van.

  • http://www.andreapernici.com/ Andrea Pernici

    I think it’s related to bulletin boards.
    I bet that if you asked more than 1,000 bulletin boards, you would see a similar pattern of lost traffic all over the world. Maybe I’m wrong, but maybe not.

  • http://www.silvar.net/ Miguel Silva Rodrigues

    A great read as usual from Danny.
    There are a few more things to consider too:
    1. The Internet keeps growing and growing, and with it grows competition. Hence rankings are not supposed to remain stable. The only way to grow “forever” is to become a solid brand offering unique propositions and managing to stay ahead of the competition.
    2. Google tries to balance the need for recent information with trust in established websites. While MetaFilter is an established site, it isn’t necessarily an authority on the many subjects brought up in its Ask section. We know Google is still tweaking how it weighs authority, so this is an important thing to consider.
    I’m sure there are other things escaping us. Who knows, even Googlers might not be able to explain exactly what’s up. After all, Google Search is a huge concoction of algorithms that could qualify as a “complex system” – http://en.wikipedia.org/wiki/Complex_system#Complex_adaptive_systems

  • tambourette

    Here’s an idea: try to build a site as if Google didn’t exist, without relying on organic traffic from it. That way, penalties come and go but you still stay in business.

    When a site has to close down because it’s losing traffic from Google, it has a poor business model.

    Having said that, I empathise with MetaFilter and if Google pointed out what’s wrong on a site, it would help a lot.

  • atentat

    What the hell is Google support?

  • https://www.jonidbendo.com Jonid Bendo

    help forum => https://support.google.com/websearch/?hl=en#topic=3378866

    where you are given the option to contact them based on your problem… dunno if they actually take those into consideration though :). Also, big businesses almost always have a Google representative; I am not aware whether MetaFilter had one, which is why I asked :)

  • http://www.rankontoponline.com/ Troy Curfman

    It also could be that no one has done an on-site analysis to see whether they are sending link juice to outside sources. Their own description uses the words “anyone can contribute a link or comment to”, which suggests sending out to too many sites from one page, with no filtering of any sort. I’d never been there before, and just looking at the on-page content, it looks like too many outgoing links and too easy a platform for sending links out. Use fewer outgoing links, keep them on inner pages, and pass less link juice. Restructure. That’s just what I saw in two minutes; looking at the source code, I think “link farm”, even though it’s a credible site.
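
    A quick way to sanity-check the outbound-link concern Troy raises: a minimal sketch using only Python’s standard library. The thread URL at the bottom is a placeholder, and counting raw links is of course only a crude proxy for anything Google actually measures.

        # Count total vs. external links on a single page.
        from html.parser import HTMLParser
        from urllib.parse import urlparse
        from urllib.request import urlopen

        class LinkCounter(HTMLParser):
            """Collects href targets from anchor tags."""
            def __init__(self):
                super().__init__()
                self.hrefs = []

            def handle_starttag(self, tag, attrs):
                if tag == "a":
                    for name, value in attrs:
                        if name == "href" and value:
                            self.hrefs.append(value)

        def audit_links(page_url):
            html = urlopen(page_url).read().decode("utf-8", errors="replace")
            parser = LinkCounter()
            parser.feed(html)
            site = urlparse(page_url).netloc
            external = [h for h in parser.hrefs
                        if urlparse(h).netloc not in ("", site)]
            print(f"total links:    {len(parser.hrefs)}")
            print(f"external links: {len(external)}")

        # Example call (hypothetical thread URL):
        # audit_links("http://ask.metafilter.com/example-thread")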

  • http://www.seo-theory.com/ Michael Martinez

    Okay, clearly there is not enough data to reverse engineer the problem (at least, not that has been shared publicly). And while many SEO providers have certainly brought shame upon the industry with their smarmy link building tactics, Matt Haughey needs to come down off his high horse and be less judgmental about what search engine optimization is. Otherwise, he doesn’t deserve this kind of attention from you or anyone else who has a legitimate interest in deciphering Google’s behavior toward seemingly good Websites. There are better battles to pick out there.

    What he is trying to do *IS* part of search engine optimization. You can’t practice SEO while you look down your nose at it.

  • Marcus Aurelius

    I feel slightly bad on a personal level. Had I been the one who recommended they start this business I might feel worse. If the concierges stop sending guests to your restaurant, that’s just part of the deal. If a local radio show host who routinely plugs your service for free moves to another city, that’s just part of the deal. Free business generation sources don’t always last forever. The risk is quantifiable and taken on by choice by those who build the business.

  • Bryan J Busch

    If MetaFilter didn’t benefit from AdSense, they’d have no employees, and then there’d be spam on it.

  • http://christophermeinck.com Christopher Meinck

    As a forum owner who has seen a continued decline since 2012, I’ve been chasing my tail trying to fix issues. Did we have too many ads? By allowing indexation of member profiles, did we create what Google deemed thin content? Recent site audits point to technical problems (broken links) and duplicate content, but I can’t be sure. Google seems to have kept indexing URLs from our previous software while also indexing the new URLs (despite proper 301s). None of this showed up in GWT. The clean-up efforts continue, and I was hopeful we’d see a bump for the massive changes made over the past year. Instead, we saw yet another 5% decline as a result of Panda 4.0.

    I don’t see Google’s reasoning for not disclosing when Panda refreshes. Additionally, if they were interested in helping webmasters, the two suggestions you outlined would go a long way. It seems they are more interested in confusing folks than helping legitimate site owners.

    It seems to me that if Google were to provide this information to webmasters, we’d be able to improve our websites: “We’ve detected a problem, here’s a sample URL.”

    I don’t see this happening, and my hopes for a reversal wane with each passing update.

  • Steve H.

    Interesting analogy, but not entirely accurate. Car manufacturers do provide extensive support and information to suppliers: how to proceed effectively with new implementations in useful time.

  • http://www.hushes.com/ Cheryl

    Not knowing is mind-numbing, especially for a small or medium-sized site that doesn’t have the kind of brand recognition MetaFilter has. I spent 50+ days trying to figure out why my site took a nosedive, and the advice I received was (considering my business) downright insulting. That’s another story ;-) So, yes, I completely understand how frustrating the process is when you are trying to ‘fix’ something in the dark. I agree with most of Mr. Sullivan’s suggestions. As to someone else’s comment that one should build as if Google didn’t exist: been there, done that. I hear you. However, when you consider “negative SEO” tactics, you cannot do that and still successfully defend your brand, e.g., trademarks, copyrights, etc.

  • AK

    I also got hit on that exact date, Nov 17th, and I’ve never been able to figure out why!! Only six months before, I had changed my theme and my traffic had doubled. On the 17th, it was cut in half again! It’s completely unfair if they tell MetaFilter what happened and I never find out!! I’m just so frustrated that the people who PURPOSELY spam Google get to see their exact errors via Manual Actions, while here I am, trying to create a useful site with no spam and no SEO whatsoever, and I get hit and don’t even get told what caused the problem. ARRGGGGG!

  • Ric

    Wow, so Google is the ‘militarized police’ of the Internet world! Shooting your dog and killing your advertising, all while ‘no-knock’ locking out your content with no explanation!

    Man, maybe it’s really time to stop using all their products. Becoming a fascist search engine is no small disservice!

  • RyanMJones

    I hate to say this, but the look and feel, design, and elements of the site may have something to do with it. If you printed out known spam sites and MetaFilter’s Ask section, taped them to a wall, and stepped back 50 feet, they’d all look very similar. That’s probably what’s happening (in a very simplistic explanation) at the algorithm level.

    In other words, it has many features in common with whatever training set of bad sites they used to train the Panda algorithm, and few in common with the known good sites in that set.

    With sites of this type, it’s tough: the number of spammy sites in the corpus vastly outweighs the number of quality sites, yet they mostly share the same characteristics. Extra steps are needed to differentiate yourself from the similar-looking sites that aren’t as high quality.

    A different commenting engine, a different layout, authorship for profiles: these are all ideas that could help.
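
    To make Ryan’s “training set” framing concrete: a minimal sketch, assuming scikit-learn and a handful of invented page-level features. Google’s actual Panda features and model are not public; this only illustrates how a classifier trained on mostly spammy examples can misfire on a legitimate site that shares their surface characteristics.

        # Illustrative only: invented features and a tiny invented dataset.
        from sklearn.linear_model import LogisticRegression

        # Each row: [outbound links per 100 words, ad slots, avg words per page]
        training_pages = [
            [12.0, 8, 150],   # known spam: link-heavy, ad-heavy, thin
            [15.0, 6, 120],   # known spam
            [10.0, 7, 200],   # known spam
            [1.0, 1, 1200],   # known good: long-form, few links/ads
            [0.5, 2, 900],    # known good
        ]
        labels = [1, 1, 1, 0, 0]  # 1 = spam, 0 = quality

        model = LogisticRegression().fit(training_pages, labels)

        # A Q&A page: many user-contributed links, modest text per page.
        # On these surface features it resembles the spam rows more than
        # the good rows, so it can get flagged despite being high quality.
        qa_page = [[8.0, 3, 300]]
        print(model.predict(qa_page))        # likely [1], i.e. spam-like
        print(model.predict_proba(qa_page))  # [P(quality), P(spam)]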

  • Mike Grant

    This article has a lot of words without actually saying anything. You say it could be spam, but you didn’t take the initiative to investigate. You say it could be the backlink profile, but you didn’t do a deep dive. It could be algorithmic, but you claim not to know that, either.

    So, what message are you trying to deliver here? That innocent people get penalized, too? Well, that’s no surprise to anyone.

  • http://alanbleiweiss.com/ alanbleiweiss

    One critical notion that everyone seems to discount or even ignore is that most sites have several problems, not all of which are tied to a single specific “named” update or penalty. If a site is weakened by a handful of factors, to the point where it’s on the precipice, any otherwise “minor” update to Google’s algorithms can trigger a plummet.

    And since we don’t have access to Google’s algorithms, the only proper approach is to dive in deep and address issues across the board. Wild guesses and a few hours spent looking at a handful of things will rarely identify the cause when the cause is actually many distinct issues.

  • http://www.lostsaloon.com/ Ferdinand

    I totally agree with Danny that responding to and/or notifying webmasters when they are penalized, with an explanation of why or what happened, would go a long way here. More often than not, it is possible for Google to know exactly which filter degraded the site, if that is what happened.

    There are also several other things that can happen when algorithmic updates are rolled out. A few of them:

    1. New updates can include new properties/factors/signals that are taken into account, while old signals are dropped or carry less weight. Google might not want to disclose these specific factors and signals, as they could be considered intellectual property. They are not going to give you any information that would let you build a competing search engine.

    2. It is virtually impossible to test the “new and improved” filters against millions of domains/webpages and diagnose how each of them behaves, and even harder to find the outliers. A filter that works for 99% of sites will still throw up quite a few false positives, mostly because of the varied nature of content across websites; every website has a unique set of content in some fashion. Most of the time it is impossible to create exceptions for each and every false positive (see the quick arithmetic after this list).

    3. The web is anything but static. Hundreds of thousands of new pages and websites are created every single minute. It is quite possible that your content now ranks lower because newer, better content was unearthed by the new algorithm; call it the natural regression of content. When your website contains thousands of pages, and many of those pages drop even a single position, it adds up quickly.
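
    To put rough numbers on the false-positive point above: a minimal sketch with invented figures, purely for illustration (neither the size of Google’s index nor any filter’s accuracy is public).

        # Invented illustration: even a very accurate filter misfires at web scale.
        sites_evaluated = 200_000_000   # hypothetical count of sites a filter touches
        false_positive_rate = 0.01      # hypothetical 1% misfire rate ("99% works")

        wrongly_demoted = sites_evaluated * false_positive_rate
        print(f"{wrongly_demoted:,.0f} legitimate sites demoted in error")  # 2,000,000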

    Again, we also have to consider some quirks around “when” Google should notify you. Even following the suggestions Danny made (which I mostly agree with), there are a few issues:

    1. Position on Google’s search results is hardly constant. It varies wildly by query, location, personalization, etc. Google cannot notify you every time you drop in rankings, because that happens whenever a new post is made or new content is indexed that is better than yours. Yet dropping from 1st to 2nd can cost as much as 30% of that page’s traffic (see the sketch after this list).

    2. Queries also die, usually over time but sometimes suddenly. People do not search for the same things they did 10 years ago. If your site is optimized for “Palm Pilot”, the drop you see may have nothing to do with Google. Such a drop would normally be gradual, but it could still fall within the scope of an algorithmic update.

    3. The notification process works best when there is a hard, deliberate penalty. Most of the time it is a combination of several different factors and filters that cannot be deciphered without a time-consuming audit of that particular page/domain.
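
    A rough illustration of why a single-position drop costs so much: a minimal sketch using assumed click-through rates. Published CTR curves vary widely; these numbers are placeholders, not measurements, so the resulting percentage is illustrative only.

        # Assumed CTR for the top two organic positions (placeholder values).
        ctr_by_position = {1: 0.30, 2: 0.20}

        monthly_searches = 10_000  # hypothetical query volume for one page

        before = monthly_searches * ctr_by_position[1]  # while ranked 1st
        after = monthly_searches * ctr_by_position[2]   # after slipping to 2nd
        lost = before - after
        print(f"visits lost: {lost:,.0f} ({lost / before:.0%} of search traffic)")
        # visits lost: 1,000 (33% of search traffic)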

    As somebody mentioned earlier in the comments, it was your decision to build a business on the good graces of Google. You also profited from it at some point, when Google backed you or was not “good enough” to catch you. Just because you profited in the past does not mean it will last forever. That is like telling the cop who pulled you over for speeding, “But I speed all the time, and I was never caught before.”

    My own sites saw traffic drops of about 30-40% with the new Panda 4.0 update over the last several days, and I could not figure out what I might have done wrong. I am pretty sure it has little to do with anything I did or did not do, and more to do with “algorithmic evolution” in content recognition. This might sound like defending Google, but I am certainly not trying to do that intentionally.

  • RyanMJones

    Just noticed that a LOT of comments have been deleted from this article. Just curious as to why.

  • Durant Imboden

    Has MetaFilter been penalized by Google (as suggested by this story’s headline), or is it just the victim of an algorithm change that favors a different type of content? And how has MetaFilter been affected by Panda 4.0?

  • Ralph Slate

    That is not “Google Support” – there is no Google Support for web sites because Google says that the sites are not its customers.

    The forum is a vile, nasty place. If you post there, be prepared to be sneered at, ridiculed, accused of doing bad things, and then be prepared to wade through all kinds of bad advice. This is from people who have been anointed by Google as “Top Contributors”.

    For example, for any problem, you will almost always have someone tell you that you are penalized because you have above-the-fold advertising. Not “too much” – some of the moderators are pushing the idea that any advertising is too much.

    When I posted my site there – 15 years old, very popular, and I have never engaged in any dodgy activity – someone dragged up a single link posted on some Czech sex site and dismissed me as a spammer.

    Even for those who can get past that, the moderators will home in on minor problems with your site and declare them the cause. For example, maybe you aren’t redirecting all your traffic to http://www.sitename.com, so that sitename.com appears as a different page. Even though by all accounts Google can figure this out, the moderators shame you for something like this.

    I can understand that Google doesn’t want to devote resources to answering questions from webmasters, 90% of whom are probably trying to spam and 99.9% of whom don’t deserve the #1 spot. However, they should clearly identify when you get an “algorithmic action”, and they should either tell you which tripwire you’re tripping, or at the very least allow you to pay them $50 to have a person review your site and tell you why you’re being pushed down.

    I spent almost five months in such a hell – with a 40% drop in traffic on April 24, 2012. Yes, that was the Penguin day. Every single moderator on their site said that my problem was a Panda problem, not a Penguin problem – despite the clear drop on a notable date. I did get John Mueller – a Google employee – to comment, and his advice was “I don’t see anything wrong, my advice is to make your site the best of its kind”. Duh. There was a very clear penalty as my long-tail pages were ranking #1 on the 2nd page, behind spammy garbage. Then, on October 13, 2012, the pattern reversed itself. Yes, I had made a bunch of changes, but to this day I don’t know if I came back due to a change I made, or due to Google fixing their algorithm.

  • http://theseonut.com/ Adam

    True, but how is that Google’s problem? Metafilter decided to base their business model on Google rankings/traffic. Google did not decide that for them.

  • http://www.monicawright.com Monica Wright

    They aren’t deleted; the Disqus spam filter is overreacting. We are releasing many comments now.

  • http://searchengineland.com/ Danny Sullivan

    Google support for publishers is through Google Webmaster Tools, via the link in the article above. You don’t have to go into the forums; you can directly request things like reconsideration of your site from an actual Google employee.

  • Ralph Slate

    Reconsideration requests are for manual actions. There is no support for a site that gets whacked by an algorithmic action other than the forums.

  • Matt Jones

    You say “site operators should be notified”, but I suspect Google just hears “blackhat SEO operators should get feedback on how to be moar evil”. Bit of an arms race going on there…

  • RyanMJones

    This is the issue. You need to tell people if they’ve been penalized, but you can’t tell them exactly what triggered the penalty or else you’re just giving them a recipe for how to beat it next time. Finding a balance is not easy.

  • Rob Waldeck

    But the drop-off happened on a single day, leading people to believe that one thing done that day caused the issue.

  • Matt McGee

    I don’t speak for Danny, but I’d suggest that the point of this article isn’t to solve MetaFilter’s problem, it’s to explain (to MetaFilter and the many tech sites/readers that don’t know) how and why penalties happen and the struggle — especially for non-SEOs — to understand how to fix things when Google sometimes doesn’t say what you did wrong.

  • TrevorAGreen

    Google makes over 90% of its revenue from ads, pays nothing to the people who create the content that populates its results, penalizes people for selling ads to monetize their content, and then penalizes thin content because it wants more quality free content in its index that users will give it credit for finding, so they keep coming back and clicking on those ads. Google is built to suck as much value as possible out of the content of others without paying for it. They are a parasite masquerading as a symbiote. There is no reason the majority of their billions shouldn’t be the property of the people whose content makes up their index. They need to start paying the costs they are skirting. The free piggyback ride on the collective intellectual property of the world needs to end.

  • Paige

    Agreed. People want to complain about Google’s overreaching power (and look down on SEOs as Haughey did) while cashing in on organic search. You can’t have it both ways.

  • http://searchengineland.com/ Danny Sullivan

    Mike, the article has a lot of words that actually answer the things you believe are missing. You should reread it.

    I’ve said I don’t know exactly the situation because I don’t. No one knows other than Google, which is explained in the story.

    I did say you can make assumptions about what the problems might be and analyze those. Which I did, in fairly decent depth in the sections about ads and links.

    The message I’m trying to deliver was right there in the opening paragraph: when someone is penalized by Google, it can be hard for them to figure out exactly what’s wrong to fix things. And that theme is continued, expanded and fully explained and documented in this article — with a conclusion that it sure would be nice if Google gave better feedback.

  • http://searchengineland.com/ Danny Sullivan

    Google has periodically asked for people to report potential false positives with both Panda and Penguin. Reconsideration can also be used as a mechanism to alert Google to the fact that you believe the filter is off. Even though it’s meant for manual actions, it is one of several methods of contacting them.

  • Jonah Stein

    Assuming they got caught up in a Panda data refresh, the issue is either brand queries or deep links.

    MetaFilter gets about 40k searches/month for brand-related keywords (according to Keyword Planner), and about 8k of those are for “ask metafilter”. While that would certainly seem to be enough to mark it as a brand in Google’s mind, it is possible that they slipped below the desired ratio of brand/navigational search traffic to keyword-driven SEO traffic.

    Alternatively, too few of their newer URLs “earned” enough deep links, and they suddenly dipped below that quality threshold. The way the math in the Panda patent seems to work, roughly, is something like this:

    score ≈ (BT / OT) × (DL / TIP)

    where BT is brand traffic (successful navigational search queries), OT is total organic SEO traffic, DL is deep links, and TIP is total indexed pages.
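
    A toy rendering of that speculated ratio, purely to make the arithmetic concrete: the variable names follow the comment above, the numbers are invented, and nothing here is a confirmed Google formula.

        # Speculative Panda-style quality ratio, per the comment above.
        def panda_ratio(brand_traffic, organic_traffic, deep_links, indexed_pages):
            """(BT / OT) * (DL / TIP): higher suggests a stronger brand signal."""
            return (brand_traffic / organic_traffic) * (deep_links / indexed_pages)

        # Hypothetical before/after: brand queries stay flat while organic
        # traffic and indexed pages grow faster than deep links, so the
        # ratio slips even though nothing was done wrong.
        print(panda_ratio(40_000, 400_000, 50_000, 500_000))    # 0.01
        print(panda_ratio(40_000, 800_000, 60_000, 1_000_000))  # 0.003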

    If the issue is unrelated to Panda, Google probably just wants them to rel=nofollow all external links because a small percentage appear to be spammy.