Key Takeaways From SMX Advanced For In-House SEMs

Perhaps for the first time ever I’m guilty of being slack. Either that or I’ve been seriously buried (a condition I’m sure most in-house SEMs are quite familiar with). Although there was some great live blogging of SMX Advanced, nobody to my knowledge summarized key takeaways for in-house search marketers. So here are the topics from SMX Advanced last month that I consider most compelling:

  • Google is starting to fill out and submit forms on some web sites in order to reach the content behind them, and it may also index the pages returned when forms are submitted incomplete. It may be worth reviewing the forms on your site(s) through the eyes of a search bot (see the first sketch after this list).
  • Google will accept cookies when that is necessary to access content. If a page is accessible without cookies and there is an alternative cookie-based version of the same page, Google is probably indexing only the non-cookie version, so it usually makes sense to focus your search engine optimization efforts there (a cookieless fetch, like the second sketch after this list, shows roughly what a bot sees).
  • Page layout is important as a ranking signal. Ideally, a page’s layout should not break when scripts are turned off; the Firefox NoScript add-on is a great tool for testing this. Yahoo! also suggested not hiding cascading style sheets (CSS) from bots, since that could be seen as an attempt to hide information about a page’s layout.
  • Yahoo! has improved its crawling of XML sitemap files. If your sitemap submissions to Yahoo! failed in the past, you may want to resubmit them.
  • After acquiring a domain for SEO purposes, don’t update the domain’s host, DNS settings, and content all at once, since that can reset the domain’s PageRank. Instead, make the changes gradually.
  • User agent detection is acceptable when you use it to serve different content to different technologies. For example, using user agent detection to serve a mobile version versus a desktop web version is fine (see the third sketch after this list). User agent detection for other purposes is not recommended.
  • Google recommended using one set of data to power multiple views; presenting different data in alternate content can look spammy to a search engine. For example, when using SWFObject, the alternate text should match the Flash content.
  • The AdWords landing page quality score may be at least partially based on organic search ranking signals. If that’s the case, then advertisers could skirt high minimum bid requirements by switching to new destination URL domains—that is, until the minimum bids are increased for those domains.
  • For A/B/n (“nth”) testing of web pages, it is best to assign each visitor’s test path with a cookie instead of including test parameters in URLs; otherwise, your test pages may be indexed (see the last sketch after this list).
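
A few of these points are easy to sanity-check yourself. For the forms item, here is a minimal Python sketch that fetches a page and lists each form’s method and action, roughly the way a crawler would see them. The URL and user-agent string are placeholders I made up for illustration, not anything Google has published; GET forms with a few simple fields are the kind a crawler is most likely to try submitting.

```python
# Sketch: list the forms on a page as a crawler might see them,
# using only the Python standard library. URL and UA are placeholders.
from html.parser import HTMLParser
from urllib.request import Request, urlopen

class FormLister(HTMLParser):
    """Collect the method and action of every <form> on a page."""
    def __init__(self):
        super().__init__()
        self.forms = []

    def handle_starttag(self, tag, attrs):
        if tag == "form":
            attrs = dict(attrs)
            self.forms.append(
                (attrs.get("method") or "get", attrs.get("action") or "")
            )

url = "http://www.example.com/"  # placeholder: a page on your own site
req = Request(url, headers={"User-Agent": "Mozilla/5.0 (compatible; examplebot)"})
page = urlopen(req).read().decode("utf-8", errors="replace")

lister = FormLister()
lister.feed(page)
for method, action in lister.forms:
    # GET forms with simple fields are the likeliest crawl candidates.
    print(f"form method={method.upper()} action={action!r}")
```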
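
For the cookie item, a quick way to see what a cookie-refusing bot gets is a plain cookieless fetch. This rough sketch uses Python’s standard library, which sends no cookies unless you attach a cookie jar; the URL and the content marker are assumptions for illustration.

```python
# Sketch: fetch a page without cookies and check whether real content
# comes back. The URL and the marker string are placeholders.
from urllib.error import HTTPError
from urllib.request import Request, urlopen

url = "http://www.example.com/products"  # placeholder page
try:
    # No CookieJar is attached, so no cookies are sent -- this
    # approximates a bot that refuses cookies.
    body = urlopen(Request(url)).read().decode("utf-8", errors="replace")
except HTTPError as err:
    print(f"Cookieless fetch failed with HTTP {err.code}; bots may be shut out.")
else:
    # Placeholder check: a string that appears only on the real page.
    if "Expected headline" in body:
        print("Cookieless fetch got real content; optimize this version.")
    else:
        print("Fetch succeeded, but the content looks like a cookie gate.")
```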
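
For the user agent detection item, a minimal sketch of the acceptable pattern: classify the device from the user agent and pick a layout, without ever special-casing search engine bots. The substring hints and template names are illustrative assumptions, not a definitive list.

```python
# Sketch: route by device technology, never by bot-vs-human.
# Hints and template names are illustrative only.
MOBILE_HINTS = ("iphone", "android", "blackberry", "windows ce", "opera mini")

def template_for(user_agent: str) -> str:
    """Pick a template by device class; never special-case search bots."""
    ua = user_agent.lower()
    if any(hint in ua for hint in MOBILE_HINTS):
        return "mobile.html"  # same content, layout sized for small screens
    return "desktop.html"

print(template_for("Mozilla/5.0 (iPhone; ...)"))    # -> mobile.html
print(template_for("Mozilla/5.0 (Windows NT ...)")) # -> desktop.html
```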
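
Finally, for the testing item, here is a rough sketch of cookie-based test bucketing, assuming a generic request/response cycle: each visitor is assigned a path once via a cookie, so every variant is served from the same indexable URL. The cookie name and variant labels are made up for illustration.

```python
# Sketch: bucket visitors into test paths with a cookie instead of a
# URL parameter. Cookie name and variant labels are placeholders.
import random
from http.cookies import SimpleCookie

VARIANTS = ("control", "variant_b", "variant_c")

def assign_variant(cookie_header: str):
    """Return (variant, Set-Cookie value or None) for this visitor."""
    jar = SimpleCookie(cookie_header)
    if "test_path" in jar and jar["test_path"].value in VARIANTS:
        return jar["test_path"].value, None  # returning visitor keeps path
    variant = random.choice(VARIANTS)        # new visitor: pick a path once
    out = SimpleCookie()
    out["test_path"] = variant
    out["test_path"]["path"] = "/"
    return variant, out["test_path"].OutputString()

variant, set_cookie = assign_variant("")  # new visitor, no cookie yet
print(variant, set_cookie)  # e.g. variant_b test_path=variant_b; Path=/
```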

If you were at SMX Advanced, I hope I got a chance to meet you and I hope that my perspective here is helpful—even if it is a bit delayed thanks to my responsibilities at my “day job!” See you at the next conference!

Andrea Harris is search engine marketing manager at Carfax Vehicle History. The In House column appears on Wednesdays at Search Engine Land.

