• Gridlock

    “an e-commerce company and website launched in 2013 that sells nutritional supplements”

    Shut it down!

    Well, that was easy. Got any harder ones?

  • Ehrlich Alba

    is site one selling the jellyfish that’s supposedly “the miracle weight loss” thing? LOL

  • http://www.ericward.com/lmp Eric Ward

    The challenge is so many penalized sites exist in the middle, between these two extreme examples. Many sites are able to earn a few good links, but will it be enough?

  • http://www.seopros.org Terry Van Horne

    Actually, you never really have to take the site down…. change all the page names, break the canonicalization of the domain/home page, and change the default (preferred) site setting in GWT…. bingo bango bongo…. you have accomplished the same thing as changing domains. All manual and algorithmic dampeners and penalties are placed at the page level, so changing the name of the page (best to 410 the old page) removes the penalty **UNTIL** the site is re-indexed (Panda) or an algorithmic update occurs. The penalty is removed, but the cause is not gone!
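
    For illustration only, here is a minimal Python/Flask sketch of the “rename the page, 410 the old URL” step; the framework choice and the retired path names below are hypothetical, not anything Terry specified.

        # Minimal sketch, assuming a Flask app; old/new path names are placeholders.
        from flask import Flask, abort

        app = Flask(__name__)

        # Old page names that carried the penalized history: answer 410 Gone so
        # crawlers drop them quickly instead of treating them as temporarily missing.
        RETIRED_PATHS = {"/old-landing-page", "/old-category/widgets"}

        @app.route("/", defaults={"path": ""})
        @app.route("/<path:path>")
        def serve(path):
            full_path = "/" + path
            if full_path in RETIRED_PATHS:
                abort(410)  # 410 Gone for the retired page name
            return "New page content for " + full_path  # renamed pages live at new URLs

        if __name__ == "__main__":
            app.run()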

  • http://www.ericward.com/lmp Eric Ward

    Interesting. The technical equivalent of “whack-a-mole”? If the original cause of the penalty remains, would this be sustainable?

  • http://www.seopros.org Terry Van Horne

    No, basically it removes the penalty or dampening **until** the page is **re-indexed**, in the case of Panda, or a **refresh** of Penguin occurs. In the case of a manual action, it basically starts the site from scratch w/o any links. That said…. there is nothing to stop you from contacting the sites that meet your criteria and re-establishing the link; however, if it is a manual action I’d wait until the reconsideration passes before contacting the sites you want to retain.

  • https://plus.google.com/+JohnBritsios/about John Britsios

    Terry, according to the Googler John Mu, it does not seem that what you said above is fully accurate: http://www.youtube.com/watch?v=ArBkHv4r4Yc&feature=share&t=21m1s

  • http://www.seopros.org Terry Van Horne

    Actually, he is saying that if you leave the site with no changes…. as I said above, we are changing all the page names. I have actually done this a few times, and in the case of the manual action the penalty was removed. I suggest you read my answer a little more closely before saying that what I said is not accurate…. and John Mueller has been known to give inaccurate info in those HOAs. For instance, I can show you one where he says iframe page links are not followed…. I have since seen a few instances where that is not the case…

  • http://mattmikulla.com/ Matt Mikulla

    I’m trying to follow detail by detail here, Terry.

    Are you recommending changing the preferred domain in GWT site settings after switching the canonicalization of the domain?

  • http://www.annapurnadigital.com/ Kasy Allen

    Very interesting. It’s one of those things that I’ve never really thought of, but now that I know about it, I feel like I’ve had an “Ah-ha” moment! I suppose the same could be said for moving from http to https (if the business wanted/needed https).

  • Kevin Lee

    Interestingly, the above scenario could occur for site number two as a result of negative SEO. It’s unfortunate that we live in a world where negative SEO can work and the vast majority of small businesses don’t have anyone watching Webmaster Tools. I’d love to see a stat on what percentage of domains even have an active Webmaster Tools account. My guess is that it’s very low.

  • https://plus.google.com/+JohnBritsios/about John Britsios

    If I did not misunderstand John Mueller, he is not only talking about manual actions. There are also Penguin and an everflux link evaluation algorithm, which we must all take into account IMO. I have not been following John Mueller for a long time, but in the few HOAs I watched I did not spot any inaccuracy in his advice; I have gotten very valuable input so far. If he said something wrong on some occasion, that does not mean he must always be wrong. And one thing I must add here is that when Matt Cutts or John Mueller says something, they are responsible for what they say, not for what we can or want to understand.

  • http://www.seopros.org Terry Van Horne

    Exactly. That way you can break the links to the home page.

  • http://www.seopros.org Terry Van Horne

    Taking into account the info John pointed to from John Mueller’s Hangout, changing from HTTP to HTTPS is not enough of a change to get Google to drop the links to the site. The change of file/page names is actually the key (410 is better than 404 for quick removal from the index).

    Removing the domain canonical and editing the preferred site setting in GWT has the same effect of removing the links to the home page… those links are now pointed at what Google **often** treats as a different site. I also know for a fact this does work for algorithmic dampening as well. We have used this technique successfully with some Panda-hit sites (changing page names and fixing the pages). As to Penguin… well, the jury is out on whether you ever fully recover what you lost… or whether the dampening subsides enough that new linkage starts to move the site.
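
    As a side note on the 410-versus-404 point, a small Python sketch (using the requests library; the URLs below are hypothetical) can confirm that retired URLs really answer 410 and are not quietly redirecting:

        # Minimal check: are the old URLs returning 410, 404, or a redirect?
        import requests

        OLD_URLS = [
            "https://example.com/old-landing-page",
            "https://example.com/old-category/widgets",
        ]

        for url in OLD_URLS:
            resp = requests.get(url, allow_redirects=False, timeout=10)
            if resp.status_code == 410:
                print(url, "-> 410 Gone (explicit removal signal)")
            elif resp.status_code == 404:
                print(url, "-> 404 Not Found (drops out, but usually more slowly)")
            elif resp.status_code in (301, 302, 307, 308):
                print(url, "-> redirects to", resp.headers.get("Location"))
            else:
                print(url, "-> unexpected status", resp.status_code)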

  • https://plus.google.com/+JohnBritsios/about John Britsios

    Terry, may I ask what you mean by “fixing page”?

  • http://ydraw.com/ Ydraw

    My question is this: don’t you think that eventually Google is going to make some adjustments for those sites that have been hit over backlink building techniques? I feel like they are just going to scare everyone for a year and locate all the spammy sites with their disavow tool. Once that is over they will revert back some. Google’s updates have really hurt their search quality. I see junk sites that have not been touched in years outranking good quality websites.

  • Juliette Paradise

    Hi Eric, good article and I definitely agree with your assessments in the face of manual penalties. What about sites that saw significant drops in performance after Penguin, but don’t seem to have been hit with an actual penalty? Would you still recommend axing the site and starting anew? Thanks.

  • Chris Koszo

    @linkmoses:disqus, do you have any first-hand experiences where a penalized site recovered by the new links that were created? I’d be curious how many good links it took to outweigh the bad. I don’t think a handful are enough, even if it’s from WSJ or something. Google needs to see a pattern of good behavior, maybe even based on time (you spent 1 year spamming hard, you must get 1 year’s worth of amazing PR and links to get back to level ground?)

    What do you guys think? Anyway, lots of good ideas in this post and comments!

  • http://ignitevisibility.com John E Lincoln

    Great article, Eric. We have been working on so many penalties lately. I have probably gotten 10 or 15 manual actions revoked at this point. Some are so easy, it only takes an email. Others can take up to 6 months and tens of hours of link clean-up. I would agree that Google generally bases the severity of the penalty on how bad the link portfolio looks. But in some cases it almost seems like it just depends on whether you get lucky enough to get someone on the manual action team who is easygoing, as opposed to a member of the manual action team who is out for blood. In one case for sure I can tell you they were just way too hard on a website, and in another far too easy. I’m talking a difference of weeks as opposed to 6 months to get back in, following the exact same reinclusion process for both sites, and both had similar link profiles.

    What is really tough at this point is that John Mueller went on record saying that the penalty can follow you now if they detect the same content on another domain, even if you don’t put redirects or rel canonical in place. http://ignitevisibility.com/google-penalty-follows-changing-domains-regardless-redirects/ So it really seems like we have to get rid of a penalty or build a new site with ALL new content.

  • https://plus.google.com/+JohnBritsios/about John Britsios

    Chris, I have had such experiences already. But I need to clarify here that it is not about the quantity of links; it is about quality links. If you cross the threshold of tolerance (e.g., of Penguin), you can get back under the threshold by overriding the offensive links with new quality links. I need to clarify, though, that when I say quality links I do not mean high-PageRank sites/pages. Another way may be to try to remove the offensive links physically. But if you do both, you are on the best path to a very effective recovery, and not just a recovery. Makes sense?

  • http://www.businessfluid.com Carl Bischoff

    Hey Eric, thanks for the post. Such a can of worms, and so many businesses have been slammed by all this. If Google would just supply some examples of areas to work on with the manual penalties, it would make our lives so much easier. Telling a business owner to kill a site is shattering for them. Deciding to start again after a few failed reinclusion requests is certainly valid, as the cost of chasing a needle in a haystack, in both SEO time and lost sales, can be crippling. I think it is plain wrong for Google to destroy a legitimate business over some crappy links, regardless of who built them; sure, responsibility is required, and they have made that point pretty clearly now. One positive which does not favour Google: this has forced businesses to get more creative with their marketing and not be so Google-centric.

  • Jon_Wade

    So, Google’s disavow does not work? I thought that was the solution to having to kill a domain off.

  • http://www.CheesyCorporateLingo.com/ Patrick Reinhart

    We have been receiving a lot of inquiries the last few months about this exact thing from prospective clients. It’s a hard call to make. One individual came in with a long-standing website and business that had received a manual penalty from using a link building firm, but it still ranked highly for a few key terms, which was the hard part. At the end of the day we still decided to ditch the domain while leaving a landing page telling people who knew or had bookmarked the site that it had moved to a new domain. The new site has been outperforming the old one, so it was definitely the right call.

    Tough decision either way, though. If it’s a long-standing site you have to take into consideration all of the other marketing material that is out there with that website on it, which is when a landing page identifying the new domain is a good idea.

  • http://www.discoverafricagroup.com/ Andre Van Kets

    Hey Eric, excellent analogy using the two extremes. But as you point out, most websites find themselves somewhere in the middle.

    One of our sites faced a similar scenario to your site two in December 2013. We’d built up literally hundreds of natural links from the early days of our online travel business (est. 2002), yet we had also built up hundreds of bad links from a poor choice of SEO agencies (2008-2012).

    With regard to the question you raise: is it worth salvaging a site that’s somewhere in the middle?

    From my personal (and painful) experience: yes.

    In fact, we managed to submit a successful reconsideration request for a domain with ~1,800 back-linking domains (half of which we deemed unnatural when we did the painstaking clean-up job) in 10 days straight.

    And we managed that without employing the services of a “Google penalty removal agency”.

    It took a metric tonne of hard work, a structured and methodical plan, a committed team of employees, friends, wives and girlfriends (who gave up their weekends and evenings to meet our target), and a good dose of humility.

    If there’s an avenue for sharing our story, I’d love to do so. There were so many incredible lessons learnt from the process.

    - Andre

  • LucasWagland

    Great article. This should be shown to all clients that think SEO is too expensive. It gives a good example of the actual hard work that quality practitioners undertake. I know I am preaching to the converted here, but what is the cost of good SEO? A hell of a lot cheaper than recovering from bad SEO.

  • http://www.ericward.com/lmp Eric Ward

    I’d want to see a database of their backlinks first.

  • http://www.ericward.com/lmp Eric Ward

    I do have first-hand experience in building credible links while the client engaged in takedowns and disavows. It took 4 months, but they ended up in the top three. The site was not ecom.

  • http://www.ericward.com/lmp Eric Ward

    Andre, I’d love to talk with you about this.

  • http://www.brickmarketing.com/ Nick Stamoulis

    I have spoken with several site owners, and sometimes killing the domain and starting from scratch is actually going to be less painful and cost less than spending months, if not years, trying to salvage a ruined link profile. It’s a big business decision and one that should not be taken lightly. If you have taken the time to build up a real brand, sometimes suffering through the SEO mess is worth it because your website has more going for it than just organic positioning.

  • sharithurow

    Hi Eric-

    Excellent article! I am not a fan of the disavowal tool, but I will admit that just having it available makes clients feel better.

    Sometimes, we just have to balance the amount of time and effort (and EXPENSE) it will take to start over vs. the time and effort (and EXPENSE) it will take to fix.

    The truth often stares people in the face and they just don’t want to accept it: sometimes it is just better for people to start over.

    I recently had that conversation with a client whose developers (many technical people honestly, sincerely believe they have information architecture skills but do not) did a terrible job with the site architecture.

    I recommended that they fix the architecture. If they fixed the architecture, the downstream effect would be that they wouldn’t have all of this extra work to do all of the time. The more they put it off? The more work they would have to do after they fixed the architecture.

    I feel your article is analogous to an architecture situation.

    BTW, I emphasized “expense” because that seems to be the word that businesses tend to respond to the most.

    Nice work!

  • sharithurow

    Eric was trying to illustrate a point. Instead of being unfairly critical, I find it better to try to get the big picture. I understand what Eric was trying to communicate.

    As columnists, we are allotted a specific amount of space and words to communicate some rather complex subjects. I get the feeling (having been in the in-between situations that Eric described) that showing all of the details could have caused multiple problems: (a) not enough space to present enough details to make the point, and (b) providing too many details that might make his clients uncomfortable.

    I think Eric made his point(s) very well.

  • Webado

    Ah, but that would have to be done, at the very least, without the benefit of any 301 redirects at all between old URLs and new ones; this includes canonical URLs. Moreover, the old URLs would need to disappear altogether and respond with 410 or 404. This includes the original homepage, so you need some creative server response manipulation.

    Furthermore, page content itself needs to be revised, not merely left identical to what it was. So it’s not as simple as that; a lot of work is still required, and it might or might not work in the long run. It is basically still equivalent to building a new site from the ground up on a new domain.
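
    To make the “no 301, old homepage answers 410” idea concrete, here is a rough Python/WSGI sketch; it assumes, purely hypothetically, that the old canonical host was the bare domain and the new preferred host is www.

        # Rough sketch: serve 410 on the old host, revised content on the new one.
        from wsgiref.simple_server import make_server

        OLD_HOST = "example.com"  # old canonical host (hypothetical placeholder)

        def application(environ, start_response):
            host = environ.get("HTTP_HOST", "").split(":")[0].lower()
            if host == OLD_HOST:
                # Answer 410 Gone rather than redirecting, so nothing is
                # forwarded from the old homepage to the new one.
                start_response("410 Gone", [("Content-Type", "text/plain")])
                return [b"Gone"]
            # New preferred host (e.g. www.example.com) serves the revised content.
            start_response("200 OK", [("Content-Type", "text/html")])
            return [b"<html><body>Revised homepage content</body></html>"]

        if __name__ == "__main__":
            make_server("", 8000, application).serve_forever()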

  • http://www.psychics.co.uk/ Craig Hamilton-Parker

    Should you kill a site and redirect to a new domain if it is hit by Panda?

  • Steve

    Hi Eric, thanks for the article. Could you tell us what you actually mean by killing a site? I assume killing a site means taking it off the server. But what if the site one company gets a new domain name for its business with the same products (content), redirects the old domain name to the new domain, and then hires a good SEO firm? Is that going to work out for the site one company? How is Google going to treat the new site?

  • sandeep_jha

    Site one needs to be closed because it never earned trust from authority websites, and you would need to remove its bad profile links.

    Site two needs to remove the few links that were made by the second SEO company, because there is trust in its previous link profile, and you can also send a reconsideration request after the unnatural link removal.

  • https://plus.google.com/+JohnBritsios/about John Britsios

    That is not a good idea at all. You’d better fix the site issues, then sit back and wait for the next Panda update.

  • http://www.psychics.co.uk/ Craig Hamilton-Parker

    I’ve been waiting three years, though, as I got hit by the first Panda algorithm. I have changed every page on the site and moved from static pages to WordPress, plus removed and no-indexed hundreds of good pages that have been heavily scraped or copied. (I’m now trying to mimic Wikipedia for page layouts to try and get the reading age level up.)

    I used to have a massively busy site, but now just a trickle. It’s frustrating, as every page is my own original content and I have not bought links or done anything suspect. http://psychics.co.uk It’s so hard to know the right course, as SEOs give so much conflicting advice.

  • https://plus.google.com/+JohnBritsios/about John Britsios

    Just curious, Craig. Are you sure it was Panda? And did you file DMCAs against those content infringements? And why do you think trying to mimic Wikipedia for page layouts can help?

  • Steflea Petru

    10 Things to Expect from Your SEO Copywriter. From the perspective of a business owner, webmaster, or marketing manager, the change exhibited by the Internet is profoundly exciting, yet profoundly disturbing. The information (and misinformation and disinformation) it offers, the business benefits it promises, and the rules it is governed by change at such a rapid rate that it’s almost impossible to keep up.

  • http://www.psychics.co.uk/ Craig Hamilton-Parker

    Yes, I’m sure it was Panda; mine was one of the first sites hit, and a reliable SEO expert confirmed it as Panda when he looked at my Webmaster Tools. The first thing I did was send out DMCA notices, thousands of them, as some pages, such as my astrology pages and crystal pages, were copied everywhere.

    I am using Wikipedia as my page layout model to try and gain a bit more of an authoritative look to the pages. I link to authority sites and also to sites with a sceptical agenda to give balance to my own ideas. References at the bottom of a page also qualify my web sources and books cited.

    And it is also a way to refresh the pages. But even with all these hours of work, the site does not seem to budge in the search ranks. It’s a frustrating task, though, as I was previously spending all my time writing good content for the site, not endlessly trying to keep Google happy. In the past, the algorithm seemed to know my site was the source of the original content; now it has no idea. Google is broken.

    You just cannot win with Google. I believe the real reason they hit sites, sites like mine anyway, is that we get too much free traffic. Instead, we should all be spending everything on advertising with Google. They are not a search engine plus advertising; they are an advertising platform plus search.