Why Many Web 2.0 Developers Are Search Morons


Web 2.0 sites have caught on because they offer slick user interfaces, with cool AJAX effects and lots of user-friendly controls and really useful mashups of content. But many Web 2.0 developers are apparently clueless when it comes to search engine optimization—often to the point where they’re actively harming their presence in search engines.

For example: I really love Schmap—a little company that makes robust local guides, mashing up interactive maps with articles, events, pictures and local listings for restaurants, hotels, museums, clubs, etc., and enabling some sharing utilities through their Schmapplets. They’re cool not least because of their strong integration with Flickr. But the disappointing thing for me is that they have not built themselves to be optimized for search. For a small, up-and-coming local guide site that’s courting as much coolness as they can, this is a serious oversight.


Schmap appears to have tried to form their hyperlinks in an optimized manner—the links avoid query strings and they pass good keyword text. Example:

https://www.schmap.com/miami/

But, when following the links into their local guides, you quickly find basic flubs which can keep search engines from spidering their rich content. The link to that Miami guide, just as with the others, redirects to a URL like this one:

https://www.schmap.com/miami/home/

That redirect is showing up as a 302 “temporary” redirect. To be friendlier to search engines, it would need to be a 301 “permanent” redirect, or better still, it shouldn’t redirect at all. But that redirect isn’t the only problem with the links into their top guides. On the homepage, if you click on the “browse by country” links, little dynamic menus pop up with more links, and none of those links appear to be accessible to spiders.
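Fixing the redirect status is usually a one-line server change. Here’s a hypothetical sketch of what that could look like, assuming an Apache server with mod_rewrite (I have no idea what Schmap’s servers actually run):

# Hypothetical .htaccess sketch: assuming Apache with mod_rewrite.
# The R=301 flag marks the redirect as permanent, so search engines
# pass the link credit along to the target URL instead of holding it
# at the old one, as a 302 causes.
RewriteEngine On
RewriteRule ^miami/$ /miami/home/ [R=301,L]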

Once you reach the main page for each of the guides after going through the redirects, all the links and text in the main body of their pages are apparently dynamically written onto the page via JavaScript—yet another barrier to the spiders finding those links and spidering their internal pages. Indeed, we can see that Google hasn’t indexed this page.
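To illustrate the basic issue (a made-up sketch, not Schmap’s actual markup, and the URL is mine): a link that only exists after a script runs is invisible to a crawler that doesn’t execute JavaScript, while a plain anchor in the HTML source is visible to any spider.

<script type="text/javascript">
// A spider that doesn't execute JavaScript never discovers this link:
document.write('<a href="/miami/restaurants/">Restaurants</a>');
</script>

<!-- By contrast, a plain anchor in the HTML source is crawlable: -->
<a href="/miami/restaurants/">Restaurants</a>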

Now, at first glance, an SEO expert might question whether it would be beneficial at all for Schmap to make their content indexable, since a lot of their content is syndicated from Wcities and therefore duplicates text that’s also displayed on many other websites. But I’d guess that Google might not frown too much on Schmap’s site for this, because they’re combining other useful things for users on the same page, such as maps and photos, so these pages might provide sufficient enhancement to pass Google’s quality evaluations.

Why should Schmap and other mashup sites be concerned about their search engine friendliness? Frankly, small companies can’t afford to leave money on the table with stupid oversights like this. I can nearly guarantee that if Schmap were to properly optimize their content, they’d be getting substantially more traffic than they’re currently seeing, and a good share of that traffic would turn into clicks on the ads they display on their pages—enhancing their bottom line. More traffic would nearly guarantee that Schmap would build up a larger user base, too—increasing the sort of stats that can sometimes result in fantastically huge dollars if Schmap were inclined to go public or get acquired by another company some time down the road. Don’t all mashup developers dream of making fantastic revenue or getting acquired for huge dollars? That happens much more easily if you have great traffic.

Schmap has a little over 5,000 pages indexed in Google, but I estimate that this is a small fraction of what they really could have, considering the amount of content they offer.

Schmap is not alone in having optimization issues, though. The trend toward whiz-bang AJAX applications leads a number of developers to neglect thorough, high-quality design in favor of the special effects available through dynamic, browser-based apps. People searching for info in search engines could be finding their sites a whole lot more often if this weren’t the case. Just a quick survey of a few top map mashup sites shows that they are similarly non-optimal:

www.housingmaps.com (9 pages indexed in Google)
www.mywikimap.com (1,380 pages indexed in Google)
stormadvisory.org (1,240 pages indexed in Google)

One site that’s a mashup poster child for search engine optimization is Wikimapia:

wikimapia.org (617,000 pages indexed in Google!)

Their PageRank is 7!

How does Wikimapia do it? Pretty elegant approach. Although at first glance you’d think their homepage is 100% AJAX, if you look into the page’s source code, you’ll find that they have links in DIV tags down into the countries, and those links and pages provide an alternate navigation link hierarchy for search engines to follow. They’ve thus exposed all of their unique content out on simple HTML pages for the search engines to be able to find.

Pages like this:

wikimapia.org/country/USA/Texas/Dallas/

provide link paths down into the content that their users have added:

wikimapia.org/87338/

Wikimapia demonstrates what every map mashup should be doing: provide an alternate link hierarchy, outside of your JavaScript/Java/AJAX/Flash applications, which allows search engines to crawl through to a page for every unique piece of content on your site. You don’t have to dumb down your whiz-bang AJAX applications to accomplish this.
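In markup terms, the pattern can be as simple as this hypothetical sketch, modeled loosely on what Wikimapia’s page source reveals (the element names and paths here are mine, not theirs):

<!-- The whiz-bang AJAX application stays intact, script-driven as before: -->
<div id="map-application"></div>

<!-- A parallel, plain-HTML link hierarchy gives spiders a crawl path
     down to a page for every unique piece of content: -->
<div class="browse">
  <a href="/country/USA/">USA</a>
  <a href="/country/USA/Texas/">Texas</a>
  <a href="/country/USA/Texas/Dallas/">Dallas</a>
</div>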

Do this to help users find content on your site, and do it to help your bottom line.

There are many articles on how to optimize AJAX and other dynamic websites. Check them out, and once you’ve fixed your mashup site, you’ll increase how much money you can make from your hobby programming.

I hope Schmap will do this! They’re a fine service, and I’m delighted each time one of their editors contacts me, requesting that I allow them to include my pix in their local guides. If Schmap contacts you about including some of your Flickr pix in their guides, I advise that you let them do it. They link the pix back to your Flickr page for the picture, and they give you a credit attribution. This can help build traffic to your Flickr photo pages, and when Schmap eventually fixes their SEO blunders, you’ll likely get even more value out of the exposure.




