• http://www.blueridgeonline.com bheafner

    Hi Matt, I just read your article on Google finally contacting webmasters when it sees red flags while crawling websites. Bravo, this is greatly needed, and it must be a tremendous job and a huge expense to better the WWW for us all. As I read what Google considers cloaking (pasted below), it totally contradicts what the built-in SEO tools in the latest copy of Microsoft Expression suggest; in fact, Expression flags it as an error on your webpage while you build. How long will it take, and how many companies will go under, until we come to a standard SEO practice, like the 802.11g rule? It affects everyone doing business on the Internet, and yet the only people who pay are misled webmasters like myself. Please look into this. I would love to know how both Microsoft and Google would answer my question; it would also make a great article.

    Sneaky JavaScript redirects
    When Googlebot indexes a page containing JavaScript, it will index that page but it cannot follow or index any links hidden in the JavaScript itself. Use of JavaScript is an entirely legitimate web practice. However, use of JavaScript with the intent to deceive search engines is not. For instance, placing different text in JavaScript than in a noscript tag violates our Webmaster Guidelines because it displays different content for users (who see the JavaScript-based text) than for search engines (which see the noscript-based text). Along those lines, it violates the Webmaster Guidelines to embed a link in JavaScript that redirects the user to a different page with the intent to show the user a different page than the search engine sees. When a redirect link is embedded in JavaScript, the search engine indexes the original page rather than following the link, whereas users are taken to the redirect target. Like cloaking, this practice is deceptive because it displays different content to users and to Googlebot, and can take a visitor somewhere other than where they intended to go.
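    To make the quoted guideline concrete, here is a minimal, hypothetical sketch of the pattern it describes (the page and URL are invented for illustration, not taken from the article): crawlers index the keyword-stuffed noscript text, while JavaScript immediately sends human visitors somewhere else.

    ```html
    <!-- Hypothetical example of the violation described above. -->
    <html>
      <body>
        <script>
          // Visitors with JavaScript enabled never see this page's content;
          // they are redirected to a different destination than the one indexed.
          window.location.href = "https://example.com/actual-destination";
        </script>
        <noscript>
          <!-- Different text served only to crawlers and no-JS users -->
          Cheap widgets, best widgets, buy widgets, discount widgets...
        </noscript>
      </body>
    </html>
    ```

    A compliant alternative is a plain HTML link or a server-side 301 redirect, so users and Googlebot end up at the same destination.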