In 2012, I started a series, How The Major Search And Social Engines Are Using The Semantic Web, which took us to a point in time around September 2012. Since then, there have been further interesting developments.

In this article, I am going to focus on recent developments that are search engine and/or Google specific, then take a further look back at search engine history on the assumption (for you history and strategy lovers) that a strategy that succeeded once may well be used again in similar circumstances.


Google & The Semantic Web

In the interim, since September 2012, Google has taken further steps toward becoming more semantic-web-like in nature, and its SERPs have continued migrating toward those of an answer engine.

For example, Google has added explanations to the Knowledge Graph. Subsequent to that, on November 9, 2012, GoodRelations was adopted into schema.org. Aaron Bradley wrote an excellent article about it, and you can read more details there if you missed it.

The Knowledge Graph was rolled out globally on December 4, 2012, and various interesting changes were made to flight search and associated activities. All very interesting and, while not exactly earth shattering, well worth noting.

Google Knowledge Graph

Google Data Highlighter

On December 12, 2012, Google rolled out a new tool for event data, called the Data Highlighter. Upon a cursory read, it appears to be a tagging tool: a human trains the Data Highlighter on a few pages of their website until Google picks up enough of a pattern to handle the remainder of the site by itself.

Better yet, you can see all of these results in the structured data dashboard. The event data appears to be marked up in a way that is compatible with schema.org. However, there is a caveat here that some folks may not notice.

No actual markup is placed on the page, meaning that none of the structured data created with the Data Highlighter tool is consumable by Bing, Yahoo or any other crawler on the Web; only Google can use it!
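
For contrast, here is a rough sketch of what the same kind of event data looks like when it is marked up directly in the page with schema.org microdata. Marked up this way, it is visible to any crawler, not just Google. (The event name, date and venue below are invented purely for illustration.)

```html
<!-- Hypothetical example: event data expressed as schema.org microdata
     directly in the page, so any crawler (Google, Bing, Yahoo, Yandex)
     can consume it -- unlike Data Highlighter "tagging", which leaves
     no markup on the page at all. -->
<div itemscope itemtype="http://schema.org/Event">
  <a itemprop="url" href="http://www.example.com/events/spring-concert">
    <span itemprop="name">Spring Concert</span>
  </a>
  <meta itemprop="startDate" content="2013-04-21T19:30">Sun, April 21, 7:30 p.m.
  <div itemprop="location" itemscope itemtype="http://schema.org/Place">
    <span itemprop="name">Example Concert Hall</span>,
    <span itemprop="address">123 Main St., Anytown</span>
  </div>
</div>
```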

Google is essentially hijacking semantic markup so that only Google can take advantage of it. Google has the global reach and the ability to execute well-thought-out and brilliantly strategic plans.


You may think: how odd. Why would Google go to all that effort to create the schema.org standard for all three search engines (actually four at this point: Bing, Yahoo, Yandex and Google), and then create a tool useful only to Google? It appears to be such a charitable gesture on their part.

A Little Google History On Sitemaps & Schema.Org

Perhaps some history can help us understand this better. Schema.org was first announced on June 2, 2011, marking the first time the three major search engines had come together to produce a standard since sitemaps.org. Depicted below are the home pages of both schema.org and sitemaps.org. Note the striking similarity.

Sitemaps.org & Schema.org Homepages

Even the terms and conditions appear similar, as can be seen in the figure below. OK, so now you may be thinking: so what, they used the same template. It is much more interesting than that. As you know, Google is brilliant at applying successful strategies, and if one succeeds once, it is very likely to pursue the same tactic again. (All you historians out there know how history tends to repeat itself.) Perhaps where Google is going with structured markup can be gauged by what it did with sitemaps.org.

Robots.Txt File

On a similar note, www.example.com/robots.txt is the standard place to declare where a site's sitemap resides; however, Google provides the option to submit the sitemap directly (most search engines do, actually). Given that the search engines allow direct sitemap submission (and may even prefer it), it is interesting that there is no longer a strict need for a robots.txt file for this purpose, and many sites do not have one.
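
For reference, a minimal robots.txt that declares a sitemap looks something like the sketch below (example.com and the sitemap path are placeholders). A Sitemap directive here is discoverable by any crawler that fetches the file, whereas direct submission through Webmaster Tools only informs the engine you submit to.

```
# Minimal robots.txt served at http://www.example.com/robots.txt
# The Sitemap directive lets ANY crawler discover the sitemap,
# not just the engine(s) you submit it to directly.
User-agent: *
Disallow:

Sitemap: http://www.example.com/sitemap.xml
```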

In some sense, however, this is strongly reminiscent of the action being taken by Google with regard to the Data Highlighter for events. Net effect: it makes this information available to Google, but not necessarily to others.

There seems to be a lot of controversy around this, rising and falling depending on the situation; you can see a recent example noted in Matt McGee’s article. (For the rest, I will leave it to the historians to make the projections.)

Historically, it also seems that for search engines and websites/webmasters alike, best practice is to adhere to standards; that is what best practices ultimately boil down to, and it is why standards are so important! Ultimately, one would assume, they should be enforceable.

How Small Merchants Can Remain Competitive In Google Shopping

Small merchants can remain competitive in Google Shopping by adding rich snippets and more. Below are some mechanisms worth looking into, as it appears Google is making it easier for small merchants in Google Shopping to keep offering inventories that include more eclectic and interesting products!

If you understand the big picture, it is easier to predict not only what to do, but also how and why, with regard to getting good SERP visibility. For example, I am embedding a portion of Google’s webmaster advice on rich snippets and products; you can find the link here. I checked the last update, which was December 3, 2012.

If we are to believe what we see in Google Webmaster Tools above, it means you can get into Google Shopping for free using structured markup! If anyone experiments with this, I would love to see responses in the comments.
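
To make the idea concrete, here is a minimal, hypothetical sketch of the kind of schema.org Product/Offer markup that Google's rich snippets documentation for products describes. The product name, price and ratings are all invented for illustration.

```html
<!-- Hypothetical product page snippet using schema.org Product/Offer
     microdata, the sort of structured markup discussed in Google's
     rich snippets documentation for products. All values are invented. -->
<div itemscope itemtype="http://schema.org/Product">
  <span itemprop="name">Hand-Carved Wooden Chess Set</span>
  <img itemprop="image" src="chess-set.jpg" alt="Hand-carved chess set" />
  <div itemprop="offers" itemscope itemtype="http://schema.org/Offer">
    <span itemprop="priceCurrency" content="USD">$</span>
    <span itemprop="price" content="149.00">149.00</span>
    <link itemprop="availability" href="http://schema.org/InStock" />In stock
  </div>
  <div itemprop="aggregateRating" itemscope
       itemtype="http://schema.org/AggregateRating">
    Rated <span itemprop="ratingValue">4.7</span>/5
    based on <span itemprop="reviewCount">23</span> reviews
  </div>
</div>
```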

Considering how fast things are changing with schema.org microdata, there is a need for automation to remain both competitive and compliant.

On Google’s part, this would offer an ideal mechanism for small business owners with unique, boutique-style products or eclectic products to enter Google Shopping. This would enrich and augment the current inventory on Google Shopping. On the part of the small merchant, it would now be necessary to strongly consider this option.

On a similar note, I also found this Google link, posted the same day, to be of interest. It is a great guide on best practices; note the sitemap and robots.txt stipulations, as well as those on rich snippets. It, too, was last updated December 3, 2012.

It may be that much of this follows on the heels of Microsoft’s Scroogled campaign, in which, among much else, Microsoft claims that PageRank is being replaced by “pay to rank” as an algorithm. That may have elements of truth, but PageRank is now only one of a couple of hundred signals Google uses to rank pages.

Essential & Professional Practices For 2013

Based on the information above, it seems to me that retailers can profit by following my suggestions below.

  • Stick to industry standards and standards recommended by the search engines.
  • Always provide unique and interesting content on your pages.
  • Keep your information fresh and relevant.
  • Provide clean sitemaps and make use of the <lastmod> element (see the sample sitemap after this list).
  • Mark up your pages for rich snippets. If you are not sure how to do that, engage experts to help you, or look for automated software that can do it for you!
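
As an illustration of the <lastmod> point above, a minimal sitemap entry might look like the following sketch (the URL and dates are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Minimal sitemap illustrating the <lastmod> element; values are placeholders. -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/products/chess-set</loc>
    <lastmod>2012-12-03</lastmod>
    <changefreq>weekly</changefreq>
  </url>
</urlset>
```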

Lastly, in terms of 2013 predictions, count on the Data Highlighter being expanded to more than just events!

 

Opinions expressed in the article are those of the guest author and not necessarily Search Engine Land.

About The Author: Barbara Starr of SemanticFuse is a semantic strategist and software engineer, providing semantic SEO and other related consulting services. Starr is a technology expert and software designer, specifically in the semantic search and semantic Web arenas.





Comments
  • daveintheuk

    Of course they are. Google’s stated objective is to “find and organise the world’s information”… but it is starting to feel more and more like their objective is to “take and control access to the world’s information’.

    Webmasters should stop feeding the hand that bites.

  • Ajedi32

    Who are they biting? The webmasters? On the contrary, the whole reason that webmasters are willing to provide this information to Google is because Google is helping them by helping other people find their website.

  • daveintheuk

    All webmasters with “not provided”. Any webmaster/business owner in travel, local, shopping or finance, by either forcing them to “pay to play” or shaving off any profitable traffic for Google’s own products.

  • http://twitter.com/Greekgeek Greekgeek

    All webmasters for whom Google refuses to accept standard, semantic markup for authorship unless they have a Google plus account. Google’s ever-changing rules for authorship now ask us to use email validation and/or the nonstandard ?rel=author syntax in order to get our authorship to appear in search results. In effect, Google is holding our online identity hostage, forcing us to use Google Plus for our online profile. If we don’t, or if we don’t use Google’s ?rel=author code instead of regular, standard semantic markup, it may even put someone ELSE as the author of our articles and work in search results (as it has on many articles of myself and many others, as noted in a searchengineland article last month.) The effect of this is to make sure our authorship is only visible through Google’s tools and is not visible on other search engines.

    The scary part is that Google actually did a bait and switch on this. For 2011 and most of 2012, standard semantic markup to indicate authorship worked fine, and our author names and photos appeared in Google search results. So we got used to it. But starting last fall, Google started removing our author names and photos and replacing them with popular Google Plus writers who use the same publishing platforms that we do, so that, for instance, Seth Godin is now listed by Google as the author of the majority of my work. I get it: Google wants me to upgrade my account to Google Plus. But I refuse to be blackmailed. Hey, maybe all those fans of Seth Godin will be tempted to read my stuff?

  • Ajedi32

    …I literally didn’t understand a word of what you just said.

  • Ajedi32

    What’s wrong with using `?rel=author`? I realize that it’s non-standard, and I agree that Google should be using standardized markup (http://schema.org/), but how is that “holding our online identity hostage”?

  • ChrisFree

    “it means you can get into Google Shopping free using Structured Markup”
    While a wonderful thought, I believe that is a misinterpretation of the Webmaster Tools statement.

    “Submit your product listings for free” means submit your product listings for free … to Google search for possible inclusion on search results pages (not on Google-Shopping).

    Structured (product) data remains as potential enhancements to organic search results.

    Google Shopping (Product Listing Ads) remains as separate paid ad space.

  • ScottyMack

    There are tons of things we may not like as far as what Google is doing (and in everyday life), but there are battles worth fighting and ones that you have lost before you have even entered the fray. I guess it all boils down to whether you want to adapt and prosper or rebel and fail with your principles intact.

  • http://www.esocialmedia.com Jerry Nordstrom

    Well sure kind of – You find, organize and then put your ads in between the users and access to the data. What I find most interesting is how many people so willingly give away their personal information with little regard to its value.

  • http://alexwebmaster.com/ Alex Garrido

    I see your frustration, yet as with the FTC antitrust case… you are not forced to be on Google. You have the freedom to leave Google if you are not happy with the service they are providing you. Of course, no one will argue that Google is manipulating its service to benefit themselves, but again, Google is a private business that offers a near stationary product/service for free. If someone gives me a plate of soup, I do not complain about it being cold. I can choose to eat it or throw it away.
    Just a friendly thought :D

 
