• http://www.fathomseo.com Mike Murray

    Thanks for the cloaking update and detailed perspective. Engines may frown on cloaking, but I’m not convinced they’re going to catch it or deal with it as much as people may think. In a way, it’s like duplicate content: I see plenty of that going on, even with national brands, and those sites still rank well. With cloaking, I do wonder if it makes sense to offer different versions to honor regulatory requirements. But I imagine site design could address that.

  • http://www.cumbrowski.com Carsten Cumbrowski

    Serving different content to spiders than to users. What do you mean by “content”? The actual written text or the whole HTML source code?

    Suppose I serve the user a fully designed page that is bad from an SEO point of view (a lot of junk code, too much clutter before the actual text, wrong HTML tags), and serve the spider the same text and everything else, but in short, clean, proper HTML. Is that not cloaking?

    Is leaving out, for the spider, parts of the navigation that are shown to the user (for example, a CSS drop-down nav) considered different content? I mean, it’s duplicate stuff anyway, which would most likely be filtered out by the search engine.

    What is your take on that kind of “cloaking”? Ethical? Gets you banned?

  • http://www.thinkseer.com/blog wilreynolds

    While hiding copy behind Flash isn’t traditionally seen as cloaking, in the spirit of the definition it is basically showing the search engines one thing and the user another. I wrote about SAAB doing this a while back… what is your take?


  • http://www.linkedin.com/in/leevikokko Leevi Kokko

    Hiding copy behind Flash should be totally OK for sites such as mtv.com, because both the HTML and Flash versions use the same content source.

    This snippet is from http://www.simplebits.com/work/mtv/:

    “The shiny new version of the site required the latest-and-greatest Flash plugin, and MTV.com found it important to leave no one behind. Readers who haven’t upgraded to the latest version of Flash would still receive the site’s content, identical thanks to transforming the same XML that drives the Flash version into nicely formatted XHTML/CSS templates. In addition, search engines would better index the site’s content and Flash-less browsers and devices would benefit as well.”

    So essentially, while the search spider indeed sees a different version of the site than the average user does, the difference is only in user experience, not in the actual content.

    I think this particular case is a beautiful example of how proper use of standards and industry best practices can boost your online business.
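The single-source approach described in that snippet is easy to sketch. Below is a minimal illustration (the element names and content are invented, not taken from MTV.com’s actual feed): one XML document acts as the content source, and a small transform renders it as plain XHTML for spiders and Flash-less browsers, while the same XML could feed the Flash front end.

```python
# Sketch of the "one XML source, two presentations" pattern described above.
# The XML schema here is hypothetical; MTV.com's real feed surely differs.
import xml.etree.ElementTree as ET

SOURCE_XML = """
<page>
  <title>New Videos</title>
  <item><headline>Video One</headline><blurb>First clip.</blurb></item>
  <item><headline>Video Two</headline><blurb>Second clip.</blurb></item>
</page>
"""

def xml_to_xhtml(xml_text):
    """Render the shared XML source as simple, indexable XHTML."""
    root = ET.fromstring(xml_text)
    parts = ["<h1>%s</h1>" % root.findtext("title")]
    for item in root.findall("item"):
        parts.append("<h2>%s</h2>" % item.findtext("headline"))
        parts.append("<p>%s</p>" % item.findtext("blurb"))
    return "\n".join(parts)

print(xml_to_xhtml(SOURCE_XML))
```

Because both renderings are generated from the same source, the spider and the user see the same words, which is the crux of the argument that this is not cloaking.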

  • http://www.stephanspencer.com Stephan Spencer

    When I say “content” I am referring to the copy, title tags, alt text, etc. — basically anywhere a spammer could insert keyword-rich gibberish.

    Regarding your hypothetical scenario of moving HTML around and swapping different HTML tags in and out, I consider that cloaking.

    In your second hypothetical scenario, I would likewise consider leaving out parts of the navigation only for the spider to be cloaking.

    There is a distinction between operating in non-ethical territory and being at risk of a ban. Just because you are being ethical doesn’t mean you won’t inadvertently trip an automated algorithm. I think moving HTML around on the page or replacing content within various HTML containers is dangerous, even though it could be ethical from the standpoint that you are making visible to the spiders content that might otherwise be trapped within JavaScript or Flash.

    You bring up a really good point that Flash-based content or navigation really does need to have an alternative version that is accessible to the spiders. Whether that would be considered cloaking by the search engines, I think depends on the implementation. Are graceful degradation and progressive enhancement cloaking with regard to the use of Flash with SWFObject and DIV tags? Some people are quick to label them as such regardless of use or intent. While they can be used for cloaking purposes, if used conservatively and with their intended purpose in mind, then I don’t consider them to be cloaking. But at the end of the day it is not really what I think that matters. It is what Google and the other engines think.

    I know Google doesn’t like Flash. They don’t see it as a very accessible technology. It goes against their philosophy of making the world’s information universally accessible, because it is not friendly to the visually impaired, to people on antiquated browsers and handhelds, and so forth. So I think that progressive enhancement of Flash would be seen as a good thing.

    Flash isn’t going away any time soon so there needs to be a workaround and I see progressive enhancement as the best workaround that we have available. It is my understanding that engines use human review to determine intent for the use of SWFObject and perhaps other forms of progressive enhancement. If the search engines thought that progressive enhancement was inherently evil, they wouldn’t bother with human review; they’d simply nuke it every time.
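    The SWFObject pattern mentioned above is worth illustrating. Here is a minimal markup sketch (the file names and element ID are invented for illustration): the DIV holds plain, indexable HTML, and SWFObject swaps it for the Flash movie only when a capable plugin is detected. Used this way, spiders and Flash-less visitors read the same content the movie presents, which is the conservative, intended use Stephan describes.

    ```html
    <!-- Fallback content: indexable HTML that spiders and non-Flash users see. -->
    <div id="promo">
      <h2>Spring Lineup</h2>
      <p>Full text of the promotion, identical to what the Flash movie shows.</p>
    </div>

    <script type="text/javascript" src="swfobject.js"></script>
    <script type="text/javascript">
      // Replace the div with the movie only if Flash 9 or later is installed;
      // otherwise the HTML above stays in place for users and spiders alike.
      swfobject.embedSWF("promo.swf", "promo", "600", "400", "9.0.0");
    </script>
    ```

    The risk of a spam label arises only if the fallback DIV contains different (keyword-stuffed) text than the movie, which is exactly the intent question human reviewers are said to assess.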

  • http://scottj.info/ Scott Johnson

    It’s good to air the facts on cloaking. There is a proper place for such technologies on the web, and to forbid them outright is ridiculous. I enjoy reading articles like this one, articles that challenge the status quo with cold hard facts.