• netmeg

    I just want to point out that “Michigan Wedding Fireworks” isn’t me.

  • http://twitter.com/AdreanaAtTAP Adreana Langston

    There is a part of this story that I do not understand. Is Google also penalizing the site mauioffseasonDOTcomFORWARDSLASHhotels for practicing the proliferation of spam?

  • Ashley Berman Hale

    Danny – don’t be a dipshit. False reporting and scare tactics help no one.

    John did point out several issues of bad spam to Mozilla. Further, the penalty referenced UGC. No one is being treated ‘unfairly’ here.

    Regarding the BBC case you are dead WRONG that it was a single link. It was egregiously spammy link tactics to a single article. Not the same.

    The idea here is that you need to properly manage websites and clean out crap content – not just nofollow links and think that your job is done.

  • http://www.michaelmerritt.org/ Michael Merritt

    Danny, there is at least one Mozilla blog that has do-follow comment links: http://blog.mozilla.org/dmandelin/2010/01/14/tracemonkeyfx36-hacks/

    That could be where it’s coming from.

  • http://searchengineland.com/ Danny Sullivan

    I see nofollow links there.

  • klippers

    Good article. It amazes me just how vague Google is with this stuff, and Google isn’t transparent by any means. You can tell just how confused Chris More is in his comments, as anyone would be.

  • http://twitter.com/marcusbowlerhat Marcus Miller

    You know, I agree with this to some extent. It’s a two-minute job to set up some simple comment rules and filtering in WordPress for a one-man band, so if you are a big property, then you have a responsibility not to auto-publish junk. A page full of junk is a page full of junk, and I don’t really want that returned in my search results.
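
    (As an illustration of the kind of simple comment rules Marcus means, here is a minimal sketch in Python rather than WordPress’s own PHP hooks; the keyword list, thresholds and function name are hypothetical, not anything WordPress or Google prescribes.)

    ```python
    import re

    # Phrases and thresholds here are illustrative, not canonical.
    SPAM_KEYWORDS = {"payday loans", "free online shopping", "cheap viagra"}
    MAX_LINKS = 2  # hold link-stuffed comments for human review

    def should_hold_for_moderation(comment_text, author_name):
        """Return True if a comment should be held instead of auto-published."""
        text = comment_text.lower()
        name = author_name.lower()
        # Rule 1: known spam phrases in the body or the author name.
        if any(kw in text or kw in name for kw in SPAM_KEYWORDS):
            return True
        # Rule 2: more links than an ordinary comment would carry.
        if len(re.findall(r"https?://", comment_text)) > MAX_LINKS:
            return True
        # Rule 3: keyword-stuffed author names ("best cheap hotels online").
        if len(author_name.split()) > 4:
            return True
        return False

    # A spammy author name trips rule 1 even when the body looks clean.
    print(should_hold_for_moderation("Great post, thanks!", "Payday Loans Today"))  # True
    ```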

  • http://searchengineland.com/ Danny Sullivan

    Ashley, you seem to be missing the point. Google has said that it is being more transparent about why publishers might be penalized. And yet, here’s Mozilla getting a notice that it has an action against it for “user generated spam” but without further guidance as to exactly where or what.

    That’s something that should have come along with the original notice, arguably. It’s certainly something Google has said Mozilla should get, if it goes back and does a reconsideration request, as I explain. Why it makes sense to make people do an unnecessary second step is unclear. It’s a time waste all around, for both the publisher and Google.

    John didn’t point out examples of bad spam. He pointed out things that look like spam but which may not be. In particular, as the comment links have nofollow in them (per Google’s own guidelines, and per John’s advice in response), these shouldn’t be spam.

    If Google is now defining spam as any comments at all, regardless of nofollow or not, then we have a real problem. That’s not something Google’s said before. It’s not the case that just any “crap content” gets you a manual penalty action, from what I’ve seen. That might be your gut opinion, but it’s not what Google has been saying.

    When it comes to the crap content you think just needs to be weeded out, Google has the Panda Update to fight that type of stuff. But Panda is an automated system, not a manual action. It’s a completely different process. If Mozilla were hit by Panda, they’d not have gotten a notice.

    In short, John seems focused on the idea that this is indeed about Mozilla having UGC that allows bad links to pass credit. That’s also the focus of Google’s guidelines about UGC, to use nofollow or other means to prevent that stuff from passing link credit.

    Given that, I’m not sure what the false reporting is here, nor the scare tactics. Mozilla got a penalty. That’s a fact. How big a penalty is unclear, given “granular” can mean anything. Exactly why it got a penalty is unclear, as well.

    As for the BBC, you can read our articles that led to the help discussion for yourself. Someone from the BBC got a notice about a penalty. It involved one single page that had unnatural links pointing at it. I see in my story, I’d written:

    “Eventually, Mueller answered that the warning came from having one single link that was deemed spammy and that “granular” action was taken.”

    I’ll fix that — I’d meant to write that one single page on the site generated that action. And that’s still an issue. Sending a notice to a publisher with thousands, if not millions, of pages, and then expecting them to magically know which single page in that bunch is the problem isn’t useful for the publisher or Google. Google should tell the publisher exactly what the issue is and where, and the publisher can then more effectively solve the problem without this time waste of forum posts and reconsideration requests.

  • http://twitter.com/marcusbowlerhat Marcus Miller

    But then again, it would be nice if they could give targeted examples, not so much for mozilla, but for the little people who don’t always realise they are getting things wrong.

  • http://searchengineland.com/ Danny Sullivan

    You’d think so. But then again, if you look here:
    http://googlewebmastercentral.blogspot.com/2009/08/optimize-your-crawling-indexing.html

    That’s Google’s own official blog for webmasters, designed to give publishers advice on how to stay in good graces with Google. Note the comment from “Electronics gadgets online,” which thanks Google for some of the good advice it is sharing.

    You’d think some simple comment rules would catch that, right? Or the post from “help computer services” or maybe taking care of “Trekking in Nepal” over here:

    http://googlewebmastercentral.blogspot.com/2013/03/cheat-sheet-for-friends-and-family.html

    Google has typically warned people about UGC to make use of nofollow. Mozilla seems to have done a lot of that. It certainly may have places where the nofollow isn’t being used, but if that’s the case, as I’ve said earlier, it sure would be easier if Google just told it where.

    And in general, I’d certainly agree that when you come across a blog post with tons of spammy comments, it just looks bad. But that really hasn’t been Google’s focus in sending out manual penalty notices.

  • http://www.facebook.com/coachcijaye Cijaye DePradine

    Danny thanks for this article. But I do have to agree with Ashley. A nofollow reference isn’t the answer to the real problem here.

    It’s proper site management, moderation and overall common sense. If you have a WordPress site, for example, you should be using the best possible counter-attack plugins to ward OFF the poor content AND you should be setting alerts for all new comments and getting in there regularly to get rid of anything that gets past the filters.

    I would also add that you should do the same for links. If you can set alerts for new links or at least watch for them by monitoring your inbound links regularly you can do your due diligence in removing bad links AS they happen instead of letting them build up over months or years without attention.

    Just some brief food for thought on these particular matters. That’s all.
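
    (Cijaye’s alert idea can be as simple as diffing periodic backlink exports. Here is a minimal sketch in Python, assuming plain text files with one linking URL per line; the file names are made up.)

    ```python
    def load_links(path):
        """Read a backlink export: one linking URL per line."""
        with open(path) as f:
            return {line.strip() for line in f if line.strip()}

    # File names are hypothetical; point these at real periodic exports
    # from Webmaster Tools or a backlink tool.
    yesterday = load_links("inbound-links-2013-02-03.txt")
    today = load_links("inbound-links-2013-02-04.txt")

    # Anything in today's export but not yesterday's is new; review it
    # for spam while it is still fresh, as suggested above.
    for url in sorted(today - yesterday):
        print("New inbound link, review for spam:", url)
    ```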

  • Tom

    If you ever wondered what it would be like to take the 900 lb gorilla model from the world of Microsoft and apply it to the web as a whole, here you go.

    Another good idea: limit usernames to 6-7 characters if they’re allowed to be links, or allow any length if they aren’t linkable to an arbitrary location.

  • Tom

    We don’t have enough information to know if they have (or not), but the Google ToS seem to indicate that they reserve the right to. Although it may not be as directly punitive as all this: my understanding is the algorithm has shifted to analyze and take into account the quality of backlinks, not just the quantity and PageRank of the source. So junk like this, which appears mostly in the comment sections of blogs, would be worth next to nothing, or maybe even weigh you down, but quality, unpaid-for links in the meat of a site’s content would be worth much, much more.

  • http://searchengineland.com/ Danny Sullivan

    Nofollow is the primary thing that Google has focused on, in relation to UGC. It comes up plenty in its page on UGC. It also seems to be the primary way it seeks to escape penalizing itself. Otherwise, [site:blogger.com payday loans] is a good place to start to see what a nightmare Google itself would face. Or, you know, visit any popular YouTube page. The comments far outweigh the actual “content” on that page, and there’s plenty of spam in there.

    I agree that good publishers should seek to curb comment spam beyond just using nofollow. But the bigger issue remains that if Google is trying to help publishers deal with spam, to the degree of sending out alarming penalty notices, it needs to be more precise about what and where that spam is.
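
    (For what it’s worth, the nofollow guideline can be enforced mechanically on user-generated HTML. A minimal sketch in Python; the regex approach is illustrative only, and a production system would use a proper HTML sanitizer.)

    ```python
    import re

    def nofollow_links(html):
        """Ensure every anchor in user-generated HTML carries rel="nofollow"."""
        def fix_anchor(match):
            tag = match.group(0)
            if 'rel="' in tag:
                # Append nofollow to an existing rel attribute if missing.
                return re.sub(
                    r'rel="([^"]*)"',
                    lambda m: m.group(0) if "nofollow" in m.group(1)
                    else 'rel="%s nofollow"' % m.group(1),
                    tag,
                )
            # No rel attribute at all: add one before the closing bracket.
            return tag[:-1] + ' rel="nofollow">'
        return re.sub(r"<a\b[^>]*>", fix_anchor, html)

    comment = 'Nice post! <a href="http://spam.example/pills">cheap pills</a>'
    print(nofollow_links(comment))
    # Nice post! <a href="http://spam.example/pills" rel="nofollow">cheap pills</a>
    ```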

  • http://searchengineland.com/ Danny Sullivan

    To clarify, I see some links there that are nofollow, some that aren’t. The issue was never that I was saying that all of Mozilla was perfectly dealing with spammy links. It was that the list of results it was given included some pages where it was doing things right, which is confusing, but more confusing still is just not being told precisely what the problem is.

  • Ashley Berman Hale

    They are being more transparent. Most of their penalty notifications only pointed to the guidelines. I think pointing out the issue is sufficient.

    The ORIGINAL message pointed to UGC spam and it gave ideas of where to look.

    John pointed out spam comments, but also spam content. And the links have not been, and are not today, all nofollowed. Furthermore, the issue goes beyond just links. It’s just crappy content. You can’t have a page of total spam, nofollow the links, and think it is OK. There’s no usability angle there!

    “If Google is now defining spam as any comments at all, regardless of nofollow or not, then we have a real problem.” Where the hell are you getting this stuff Danny? Seriously! How are you even logically jumping to this? Come on, you’re a smart guy…

  • Ashley Berman Hale

    Word!

  • http://twitter.com/marcusbowlerhat Marcus Miller

    Sure, it’s seemingly a case of do as we say, not as we do.

    Certainly, some examples would not hurt. You can’t lock a man up without telling him what he has done wrong and if Google wants ‘us’ to clean up the web, then why not help us in that task?

  • Ashley Berman Hale

    Marcus – I think you have to keep scalability in mind, plus putting the onus back on the site owners.

    I hate it when Google’s penalty messages only list the vague “against guidelines” line. However, when they point out something as specific as UGC and then even give examples of where to look, I don’t think there’s a lot of room to complain. I believe fully that webmasters should know their site better than anyone and be able to dig around and manage their site without a full blueprint from Google. Especially since, when it comes to search, Google’s customers are the users – not the site owners. So in this case, any data at all is a positive move for webmasters.

  • http://twitter.com/marcusbowlerhat Marcus Miller

    I am not sure scalability works as a reason here. If it is a manual penalty, they have the data and examples, so why not share? Now, if it is malicious, then okay, let’s not give a “get out of jail free” card, but some of these cases could be cleared up more easily with more transparency.

    There is no right and wrong here, and obviously each case is unique and Google has to cope with the scalability of this process, but if they want webmasters to do better, then they have a duty to educate, and so far in the whole evolution of the penalty they have not done a very good job of that (despite continued promises of more disclosure).

    As you say, if the type of problem, location and examples are given then that’s pretty much a fair cop in my mind.

  • keaner

    Lots smarter than you, it would seem.

  • keaner

    “setting alerts for all new comments” – You can really tell who doesn’t manage big sites and who does.

  • http://searchengineland.com/ Danny Sullivan

    Ashley, Google is on the record, per my quote of Jake Hubert, saying they’ll tell you precisely what’s wrong with your site. That’s a huge and welcome change in a world where Google has stepped up notifications. I applaud that move. It’s in line with what they’ve said about sharing more specific details for publishers who come away confused by these notices.

    The original message, if it pointed to UGC spam at all, wasn’t shown that way in the webmaster forum. If it contained helpful specifics for Mozilla, then it’s odd that Chris More from Mozilla felt he had to go ask for advice from Google both in the forum and in a tweet he made to Matt Cutts.

    He was clearly confused. Google doesn’t want him or any publisher to be confused, which is why it is being more transparent and why it has said it will provide a “clear answer” if a reconsideration request is filed.

    But as I’ve explained, it would save everyone a lot of time if it just provided that clear answer as part of the penalty notification. I get that you think things are clear and transparent enough. I can only point out that two major publishers within a short period of time, Mozilla and the BBC, clearly haven’t felt they understood what the problem was.

    I also get that in your mind, Google is somehow penalizing for bad usability or crappy content beyond content that contains links. All I can tell you is that you seem to be confusing its use of algorithmic penalties to fight poor content (i.e., Panda) or pure spam (i.e., Penguin, among others) with its use of manual penalties to go after things like bad links.

    And as for the “where am I getting this stuff” question, I’m getting it from you. If, as you’re telling me, sites with lots of UGC content are now going to face manual spam penalties because of “non-link spam,” that’s a marked shift from what Google has done before. It just hasn’t taken many of these types of actions for that type of behavior, that I recall.

    So if you believe that’s the case, as you seem to, then UGC sites ranging from Mozilla to YouTube have some issues. Because pick a popular YouTube video page, and you’ll see cruddy content all over it — and yet many of those pages rank just fine.

    I get that you’ve been active in the help forums for a long time, so perhaps you’re feeling you need to be defensive of Google. I can appreciate that, because goodness knows Google takes plenty of flak for things it can’t control and often doesn’t get enough credit for good things that it can do.

    But this issue of publishers being confused about what Google doesn’t like about their sites when getting notifications has been going on for ages. It’s been miraculous that Google has finally gotten over its paranoia about telling people what is wrong. That paranoia is somewhat justified, given the crap it faces out there. But it’s still a better move to just say what the problem is.

    I mean, in the end, if you sit back — Mozilla is a huge site. A trustworthy site. You’d think Google would want to explain precisely to the company what the issues are from the start, rather than forcing the publisher to turn to Twitter and the help forums for more help. That’s not efficient.

  • http://searchengineland.com/ Danny Sullivan

    A manual penalty only goes out when an actual human being at Google has reviewed and found a specific problem, which is logged in a database at Google in order to assign a penalty and to review whether a penalty should be extended, when it expires.

    So in terms of scalability, there isn’t an issue. The information is all there. The person sending the penalty knows what it is. The only thing that’s not happening is that when the notice goes out, it’s not explaining much or all of the problem.

    If a publisher goes back to Google after getting one of these notices, according to Google, THEN Google will provide a clear answer of what the problem is.

    That’s the real scalability issue. It forces a second round and possible false starts.

  • Ashley Berman Hale

    Danny – the original message that Chris posted said:

    “Google has detected user-generated spam on your site. Typically, this kind of spam is found on forum pages, guestbook pages, or in user profiles.”

    So yes, very clearly pointed to UGC spam.

  • Ashley Berman Hale

    Probably, bro.

  • http://searchengineland.com/ Danny Sullivan

    Perhaps we’re miscommunicating. To me, that message is saying, “Hey, you have a UGC spam problem” but not “Hey, look at this page — see that garbage? That’s causing you to get a penalty.” It’s saying generally there’s a problem but not specifically where it’s at.

    As a result, More, with a huge number of pages he’s responsible for, has to start his own detective work to figure out which page or pages might be the issue.

    Something specific generated that notice. Google knows what it is. If they tell him, he can solve the specific problem plus apply, hopefully, a more general remedy as well.

  • Ashley Berman Hale

    I like data too – and generally, more information is good. But I also think some of that weight needs to fall on the webmasters. Google’s given us more information over time – but when you get more is it really appropriate to respond by stamping your foot and just wanting more and more?

    I’m used to that behavior from my daughters, but just hope to see something different from grown ups.

  • http://searchengineland.com/ Danny Sullivan

    If you had 100,000 daughters, and one of them did something wrong, and the school called you and said “your daughter is misbehaving,” yeah, I think you’d expect a little guidance about which one.

    If you’re a big publisher like Mozilla or the BBC, having to effectively guess at what Google thinks is wrong with your site is a waste of time, especially when Google knows exactly what the issue is, clearly doesn’t have some trust issue about telling you and, in fact, will tell virtually any site what’s wrong if they ask after getting a notification, as long as they file a reconsideration request.

    The information is already there, already available, but Google is inefficiently providing it. It could do a better job, and hopefully it will.

    That doesn’t mean it has done a bad job providing advice and resources. It has grown its support immensely since I first started writing about Google at its launch in 1998, back in the days when it famously talked about not being worried about spam at all, not believing there was spam.

    Things have changed, and for the better. That doesn’t mean there isn’t room for improvement, and you don’t get that improvement unless you cover some of the reasons why it’s needed.

  • http://www.michaelmerritt.org/ Michael Merritt

    Maybe he’s just giving some general advice on where to look? He looked up a specific query and pointed out the existence of pharmaceutical terms on Mozilla blogs. That’s certainly more precise help than a lot of people get, which I understand because John probably has other things to do than spend time in the help forums all day. Also, I thought he was pretty precise on the spammy comments point.

  • http://www.michaelmerritt.org/ Michael Merritt

    This I agree on. I’m unsure how they’d go about doing that, though John’s response would suggest they already have the algorithm to determine spammy comment sections.

  • Doc Sheldon

    Good article, Danny. I can see your point about Google’s professed transparency falling short. But like Ashley, I don’t feel sorry for Mozilla, after it allowed that kind of crap content to stand, nofollowed or not. Google clearly has a problem dealing with webmasters in an open and efficient manner, but many webmasters also seem to have a problem keeping their sites clean for their users. To me, scalability doesn’t enter into it. I may moderate the UGC on my own site myself, while others, with hundreds of thousands of pages of UGC, can’t do that efficiently. So hire more people or tighten up your operation to stop it before it becomes an issue.

  • Alan

    I am with Ashley on this one.

    Firstly, just because Mozilla is a big site doesn’t mean it needs preferential treatment from Google. So does that mean Google is going to be telling everyone exactly what they are doing wrong? It is a bit elitist of you to think that way, Danny. You are coming across as if “the little people” don’t matter. Maybe you are falling for the big-brand doctrine Google seems to be espousing these days.

    Secondly, Mozilla’s blog area has been used for comment spam and for building linkwheel-type structures for years now. At least the last five years’ worth, that I can remember.

    Mozilla is showing that it is a lazy organization that doesn’t want to put in the effort of cleaning up its own backyard. They could hire even a mediocre SEO to track down the offending pages for them if they don’t have the expertise in-house. Quite frankly, if I were Google, I wouldn’t let Googlebot come within a Cat 5 cable of the joint.

  • http://searchengineland.com/ Danny Sullivan

    Alan, if Mozilla is a lazy organization, you should spend some time over at Blogger and YouTube. Heck, go to the Google webmaster blog. You’ll find plenty of spammy comments that are allowed to get by on Google’s own turf.

    As for tracking down the offending page, when Google decides that one single page out of 21 million on your domain is spam — which ultimately turned out to be the case here — it’s not a question of your SEO skills in locating it. It’s that pretty much only Google knows where it is, so they ought to tell you.

    As for elitism, I’ve written nothing saying that Google should only provide details for “big” publishers. I’ve been consistent in saying that if it’s going to send notices, it should provide an explanation of what’s wrong to any publisher it notifies.

    I mentioned the issue of “big” publishers like Mozilla only because in their cases, it’s potentially even harder to track down where a single page of spam that Google’s concerned about may exist, if Google doesn’t help. That’s not saying they should get preferential treatment — again, I think all publishers should get an explanation. I’m just trying to point out how hard it can be to spot spam when you’re dealing with thousands of pages, rather than tens or hundreds.

    That’s why Google just telling publishers what’s wrong, rather than this guessing game, would be good. And it should tell that to all publishers with a penalty.

  • http://searchengineland.com/ Danny Sullivan

    I guess I’m at a loss at this point that some commenting here think I’m somehow not saying to clean up spam beyond nofollow.

    That was in the headline of my story, that this was about UGC spam. It was in the second line of the opening paragraph, that people should pay attention to what users are doing on their site. It was in my conclusion, to be sure you’re screening your UGC stuff — with a focus on making sure you’re using nofollow — but I didn’t say that was exclusively the universal solution.

    So to be perfectly clear, I’m not saying nofollow is the universal panacea, just apply and you’re cured. You do have to pay attention to what’s happening on the site overall, because non-link spam might be happening.

    The bigger point for me remains that if Google knows exactly what page is an issue, or what area is the problem, telling the site owner saves time all around in getting the problem solved.

  • http://twitter.com/BarefootSEO_ piers Ede

    How coincidental that Mozilla is a major rival of Google in the browser war. They really are the end….

  • Alan

    OK, Mozilla is a lazy organization when it comes to its blog area. It has always been such, but yeah, Blogger et al. are worse. I didn’t say they were the pinnacle of laziness.

    And I don’t know why I am arguing with you on this one. I like it when you actually have a go at Google for something.

    However, with Mozilla it isn’t just one page that has been spammed to death; there are other examples, and although I haven’t been there in ages, I do remember them being part of many of those crappy linkwheel diagrams you used to see all over the web.

  • http://searchengineland.com/ Danny Sullivan

    Alan, the notice was the result of one page of spam. There’s been an update in the thread about this. One single page of spam generated that notice, just like one single page with unnatural links pointing at it generated a notice for the BBC. In both cases, these large publishers with millions of pages effectively had no clue where to begin.

    Mozilla may have broader comment spam issues going on that it may want to address, even though those apparently aren’t causing penalties. I’m not excusing that.

    And I don’t know why we’re arguing, either :) I mainly wanted to stress that I’m not saying big organizations should get special treatment. I just think, generally, if Google knows what’s wrong, then tell people.

  • http://twitter.com/marcusbowlerhat Marcus Miller

    I think it is hard to disagree with that. They have the data, since they are penalising on it, and it is the best possible example for the site owner of what they (or their users) are doing wrong. Why not share?

  • abigail_rocket_blast

    Thought experiment: imagine that you are responsible for the BBC website, http://www.bbc.co.uk/. Google has told you that a single page on it has a dodgy link profile, but you don’t know which one. Ashley – how would you, personally, set about identifying which one?

  • Ashley Berman Hale

    Abigail – I’d look for outliers, changes in the regular patterns. I’d look for spikes in links found and for anomalies in traffic. It’s not always guaranteed – but you need to be familiar with the data and poke around.
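
    (A minimal sketch of that outlier hunt in Python, assuming you’ve already exported an inbound-link count per page; the data, scoring and cutoff are illustrative.)

    ```python
    from statistics import median

    # Hypothetical export: page URL -> count of inbound links found.
    links_per_page = {
        "/hotels": 9500,   # the kind of spike worth investigating
        "/about": 120,
        "/blog/post-1": 80,
        "/blog/post-2": 95,
        "/contact": 40,
    }

    counts = list(links_per_page.values())
    med = median(counts)
    # Median absolute deviation: a robust yardstick for normal variation.
    mad = median(abs(c - med) for c in counts) or 1

    for page, count in sorted(links_per_page.items()):
        score = (count - med) / mad
        if score > 10:  # arbitrary cutoff for "far outside the pattern"
            print("Outlier, investigate: %s (%d links)" % (page, count))
    ```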

  • abigail_rocket_blast

    Thanks. I’m sure you’re right and you would track it down eventually, but I’m inclined to agree with Danny in that it would save a great deal of everyone’s time and effort if you were just told which page to fix in the first place!

  • Doc Sheldon

    I can’t disagree with that, Danny. I guess I was just looking at the situation in a literal sense of whether the penalty (as referenced in the title “Google Hits Mozilla With Spam Penalty Over User Generated Content”) was justified or not. It sounds like we agree that it was. We certainly agree that Google’s obscure notifications are counter-productive.

  • Ashley Berman Hale

    Of course it would save time – but should we expect it? I’m more of a “teach a man to fish” sort of learner.

    Google is providing a free service to webmasters, with free tools, documentation, etc. They don’t have to offer any of that. They also don’t have to give you any notification of any manual action – but they do. Sure, I’d love more data but I also think it’s mighty righteous to demand it. Do you get that sort of information/assistance from other free services?

  • abigail_rocket_blast

    Probably not, but then I can’t think of another free service whose whims can have such a potentially catastrophic effect on a business.

    Good point, though – it would probably be more cost-effective for most businesses with large sites to pay for this information than to waste expensive man-hours searching fruitlessly for the answer. Maybe they could offer it as a “premium” service for those without the time or inclination for that sort of on-the-job learning?

  • http://searchengineland.com/ Danny Sullivan

    Google dismissed the idea of paid support many, many years ago. That has all been discussed and long forgotten by so many.

    It’s very good that it does provide so much free support, and I’ve applauded that growth over the years.

    But it’s not all altruistic. For one, there’s huge PR value in providing this support. I’ve seen any number of articles over the years from prominent people and companies complaining that Google has screwed up indexing them — sometimes with great validity — so having support tools helps counter bad PR.

    It’s also Google’s job. They pay, in virtually all cases, absolutely nothing for the content they index. They walk into publisher sites, help themselves to the content on those sites and use those listings as the basis of an incredibly profitable ad business.

    The unwritten contract in all this is that publishers themselves get huge amounts of traffic from Google, if they allow the Google crawlers in. But those crawlers aren’t perfect, they do make mistakes, and Google needs help.

    That’s what the whole webmaster support system is about. That’s why Google also provides it for free. And it is not unreasonable at all to say that if Google knows which single page out of 22 million is the spam page, that it would just point at that and save everyone time.

    If it had done that in this case, Mozilla could have fixed the page within minutes, plus been alerted to the broader spam problem.

    Since it didn’t do that, we got this…

    1) Mozilla has to go to the webmaster forums and Twitter asking for help.

    2) Volunteers on the webmaster forums, including yourself, have to make guesses as to what’s wrong.

    3) None of those guesses were right about the page in question, which means that if Mozilla had relied only on that, it still wouldn’t have solved the spam issue but rather wasted time. It might have prevented a future spam warning, but it wouldn’t have resolved the actual one that was issued.

    4) Because this is a major company getting hit over a common issue, I and others spend time writing about it.

    5) Two different people from Google have to go into the forum and answer the question, to finally get the right answer out.

    That’s a huge time suck, including for Google, because the “more data” you seem to think it unreasonable to provide — the actual spam page out of over 20 million good pages on the site — wasn’t given.

    That’s the point here — that if Google is going to send notices out, because Google itself has decided that it sees a value in doing so, then Google needs to make sure those notices are actionable. Not a needle-in-the-haystack chase.

  • http://www.seojus.com/ justin

    I would have to agree with Danny here. Google is a gazillion-dollar company. They have the manpower and resources to do anything they want. If you have violated their guidelines, you should get a full report of what is wrong. It would be like getting arrested, but the cop doesn’t tell you what you’re being arrested for. The whole penalty issue is completely messed up and not productive at all.

    Google isn’t just a nice little volunteer that is out there helping people; they are a business making millions of dollars off of their searchers, publishers, and advertisers, so yes, you do deserve a detailed report of what you did wrong. By not identifying the crime, you’re just wasting time and money. Apply the penalty, send the notification with what is wrong, and give a certain amount of time to fix it; if they do, then lift the penalty, and if they don’t, then leave it applied. If their main goal is to create a better user experience, then they need to inform the webmasters who are creating this user experience of what violations are being committed.

  • http://www.drakondigital.com/ Drakon Digital

    First, Child Protective Services needs to get involved. Naming your child “free online shopping” is definitely a form of abuse. I’m picturing Johnny Cash singing “A boy named Free Online Shopping”.

    Second, I understand Google’s reservations about informing potential spammers what they are doing wrong and how they are being found. However, it shouldn’t be that hard for Google to spot spammers and the tactics used to manipulate search results. I find spammy links every single day. Often, I am appalled that Google has not noticed and acted on them. If I can find dozens (sometimes hundreds) of them a day, their staff of PhDs and MBAs shouldn’t have as much trouble as they appear to.

    Third, Google penalizing sites for not conforming to their guidelines and then not informing them what was done wrong is like the police arresting you and not telling you what crime you committed. How are we supposed to correct the problem and take steps to limit the chances of its recurrence if we don’t know what to do? When I was spanked bare-bottom with the soup spoon as a kid, my mom always made sure I knew what I did wrong. And guess what… I learned to correct that behavior very quickly.

    p.s. Matt Cutts, pay me half of what one of your fancy PhDs earns in a year and I’ll send you a spreadsheet as long as Santa’s wish list chock-full of spam techniques (that are currently earning top rankings for many large companies) that are so easy to spot even a lowly SEO could do it.

  • http://www.facebook.com/coachcijaye Cijaye DePradine

    Well said. :)