• http://www.sefati.net Alireza Sefati

    I work for an ecommerce company that has its own ecommerce website but also has a ton of co-brands and partners, and there is a lot of duplicate content all over the net. Also, a lot of products lack content. Then they wonder why they are not ranking better.

    As an SEO manager, my push has been to fix the content problem, but Panda is helping me push harder, so we’ll see how it all goes.

    btw, this article is worth linking to. On my blog I am going to summarize it and link to it.

  • Rob Snell

    Thanks @AS — Getting LOTS of good feedback from retailers. We ALL are guilty of not writing UNIQUE CONTENT for all our products!

    Lots more smaller retailers with datafeeds were affected, but didn’t have enough traffic to make the SISTRIX list. I’m swapping notes with several BIG fellas, but the consensus is TAKE OUT THE TRASH and WRITE YOUR OWN CONTENT! ;)

  • jhopkins

    “On one retailer’s top 100 category pages, only 4% of the text was unique text.” This is very interesting!

    Is there a certain tool or process you use to evaluate the content on these category pages? I am running into a very similar problem on my own site.

  • Rob Snell

    Yep. Copy and paste into a text editor that can do a word count on ALL words and on SELECTED words. Try Wordwrangler, WORD, or even a Google doc.

    Then view the Google cache of one of your pages and click the TEXT ONLY VERSION.

    Here’s a link to the TEXT ONLY cache of last month’s column:


    Here is a Google doc I just made to show how I do it:
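    For anyone who wants to script the check instead of counting by hand, here is a rough sketch of the same idea: paste the text-only cache of a page in as one string and the template/boilerplate text shared across your pages as another, then compute what share of the page’s words fall outside the template. The function name and sample strings below are made up for illustration; they are not from Rob’s doc.

    ```python
    def unique_text_percentage(page_text: str, boilerplate: str) -> float:
        """Percent of the page's words that are NOT part of the shared template text."""
        page_words = page_text.split()
        if not page_words:
            return 0.0
        # Case-insensitive set of template words to subtract out.
        boilerplate_words = set(boilerplate.lower().split())
        unique = [w for w in page_words if w.lower() not in boilerplate_words]
        return 100.0 * len(unique) / len(page_words)

    # Hypothetical example: 13 words on the page, 7 of them template words.
    page = "Free shipping on all orders Shop our hand-tied trout flies for spring creeks"
    template = "Free shipping on all orders Shop our"
    print(round(unique_text_percentage(page, template), 1))  # prints 46.2
    ```

    This is a crude word-level approximation (it ignores word order and repeated phrases), but it is enough to flag category pages where almost nothing outside the template is unique.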


  • http://www.blindsonline.com B.O.L.

    Like a lot of e-retailers we spend a good deal of time working on traffic strategies that may not be the best ideas, based on your article. We’ll rethink our tactics for 2011 and I’m sure our customer experience on our site will improve along with our rankings. Thanks Rob!

  • http://www.gamerstube.com Joe Youngblood

    Hey Rob, just a word of caution: none of this is proven yet, although I tend to agree with most of your post. Some people will take this and say ‘this is our solution’ when it could be something else. I would hope that when/if you see a recovery with these tactics, you would share the findings. Panda 2 started April 6th (ish), so some retailers may have been hit there.

    I do agree that the ‘entire-domain’ penalty angle is a bit overstated. I’ve seen several times over that when someone supplies a description (manufacturer, author, realtor, service provider, etc.), Google seems to rank one strong domain with that shared description and then pushes the pages with copies to pages 3–6. I was hoping to make a blog post with pretty charts about this, but thanks to NDA issues I’m having trouble getting everything together.

    In an ironic twist, when I see the issue described above, I notice that page one is suddenly filled with much thinner pages, like yellow page listings, that just don’t have the description, and sometimes these are on weaker domains/pages than the ‘shared’ content that Google has pushed down. I’ve also noticed that pages with ‘shared content’ that also have deeper content, such as customer reviews, seem to still rank on page 1 while the thinner versions are penalized.