Microsoft’s Search Engine Optimization Advice for Bing
Microsoft recently published a PDF about Search Engine Optimization called Bing: New Features Relevant to Webmasters. This is the second SEO-related offering in as many weeks. During SMX Advanced, Microsoft launched the IIS Search Engine Optimization Toolkit. I’m working on a review of that as well, but it requires Vista or Windows 7 and IIS 7, so installation takes a bit of effort. I talked with Microsoft about the information in the PDF and got some clarity and additional information. The PDF primarily describes the user interface changes launched with Bing, and how those might impact site owners, but also touches on search engine optimization.
Google has provided SEO advice for a while, both in their help center and in their Search Engine Optimization Starter Guide, so check those out if you’re interested in hearing the official search engine stance on SEO. Google’s document is much more instructive, providing their top guidelines for creating websites that can be easily crawled and indexed by search engines.
What can we learn from Microsoft’s advice for webmasters?
The document starts by describing the primary difference in Bing vs. Live Search that impacts site owners: the first search results page no longer shows results 6 through 10 for the searcher’s query. Instead, the page includes “categorized” results after the list of 5 sites that match the initial query. So, for instance, if you search for [vw beetle], results 6 through 8 are for [volkswagen beetle for sale] and results 9 through 12 are for [used volkswagen beetle].
Expanded long tail opportunities
The document clarifies that more than 300 factors are used in determining this categorization and that what categories appear is entity-dependent. In other words, if you search for [buffy the vampire slayer], you may see a category for “wallpaper” because lots of people search for Buffy wallpaper, and you might also see categories for “episodes” and “characters” because Microsoft has identified “Buffy the Vampire Slayer” as a TV show.
Microsoft says that this new way of organizing results provides new long tail opportunities for site owners. They suggest determining what entities you most want your site to rank for and then optimizing for the long tail queries within those entities. For instance, if your site is about cars, you’ll want to optimize for the car brands (VW, Volvo, etc.) but also car-related categories such as dealers, used, and for sale. That will give you additional opportunities to rank on the first page of results.
Bing has also introduced a concept called “best match”. When Bing is reasonably confident that the first result is what the searcher is looking for, it will be noted as the “best match” and could contain additional details, such as a box for searching within the site, phone numbers, and links to pages within the site. In some instances, no results other than the best match appear on the page, which Microsoft says their team chooses when they have high confidence in the result and the query volume is high. Of course, this isn’t great news for the site owner at position 2.
If you don’t want your site to appear as the best match, don’t want the internal site search surfaced, want to correct the phone number that appears, or want to remove links, you can request that by emailing email@example.com.
Document preview provides additional content from the site in a hover and, in Microsoft’s words, “helps searchers find the content they want faster, without leaving the SERP until they are ready”.
They say this helps increase qualified traffic, but some webmasters might think that it helps searchers find the content they want without leaving the SERP at all. For them, Microsoft provides a way to opt-out of the feature. Just add the following to the <head> section of each page:
<meta name="msnbot" content="nopreview">
If you don’t want this feature used on any page of the site, it might be easier to return the nopreview directive in the server’s HTTP response headers instead.
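As a sketch of how a site-wide header could be configured, here is an Apache example using mod_headers. Note the header name is my assumption: the X-Robots-Tag header is the standard way search engines accept robots directives via HTTP, but the PDF itself only names the meta tag explicitly, so verify Bing honors it before relying on this.

```apache
# Assumption: Bing accepts the nopreview directive via the
# X-Robots-Tag response header (the PDF only documents the meta tag).
# This sends the header on every response served by this Apache host.
<IfModule mod_headers.c>
    Header set X-Robots-Tag "nopreview"
</IfModule>
```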
Not everything from the document preview comes from the site. Some information could be from external sources as well. For instance, if the site is in Flash and Microsoft has trouble extracting data, they might turn to a third-party source. And they may use local information such as an address or phone number from an external source. You can request that third-party data not be used for your listing by emailing firstname.lastname@example.org.
Microsoft says that Flash-based sites are responsible for 21% of all empty descriptions in their index. They say they are doing “limited data extraction” and are now able to generate descriptions for one-third of those. They may also use anchor text from incoming links. Of course, Google has been using incoming anchor text as source data for missing titles for a while, and has continued to evolve its ability to extract Flash data. No word on whether Microsoft is using Adobe’s crawler API or another technology for this extraction.
Use of microformats
While the document encourages the use of structured data, such as microformats, Microsoft tells me that they aren’t currently using this data for crawling, indexing, or ranking. This stance mirrors Google’s and Yahoo’s, both of whom are encouraging structured data use, but are not yet using the data for web search.
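For readers unfamiliar with microformats, here is a brief illustration of the kind of structured data being encouraged, using the hCard format: ordinary HTML made machine-readable through standardized class names. The business name, address, and phone number below are invented for the example.

```html
<!-- hCard microformat: class names (vcard, fn, org, adr, tel, url)
     follow the hCard spec; all contact details here are fictional. -->
<div class="vcard">
  <span class="fn org">Acme Used Volkswagens</span>
  <div class="adr">
    <span class="street-address">123 Main St</span>,
    <span class="locality">Seattle</span>,
    <span class="region">WA</span>
  </div>
  <span class="tel">555-0100</span>
  <a class="url" href="http://example.com">example.com</a>
</div>
```

Marking up data this way costs little, and while none of the major engines were using it for web search ranking at the time, publishing it positions a site for whenever they do.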
Instant answers, the “OneBox”-style results that provide data to answer the query, aren’t new to Bing, but may be gaining prevalence. For now, Microsoft tells me that you can request evaluation of your data for inclusion using their support forum.
Local results are more heavily featured in Bing. You can add your site to their Local Listing Center to ensure you have the opportunity to be featured in these results.
Microsoft provides some interesting data about searcher behavior, which they tell me is based on a number of internal studies derived from analysis of their search logs and toolbar logs. For instance, they found that searchers refine their query, bounce back to the search results, or abandon the search 50% of the time. The breakdown of that 50% is below.
They also found that searchers repeated 24% of their queries during a session (which led to the addition of search history in the left column of the search results page).
All in all, this document doesn’t provide a lot of new information about SEO. But I applaud Microsoft for recognizing that changes like replacing results 6 through 10 with categorized results and adding a hover preview with additional content from the site could impact search traffic, and for providing information about those features. It’s difficult to say just how much these changes will affect traffic patterns. The biggest change, of course, would come if Bing achieves Microsoft’s goal of increasing search share.
Opinions expressed in this article are those of the guest author and not necessarily Search Engine Land. Staff authors are listed here.