Schmidt: Listing Google’s 200 Ranking Factors Would Reveal Business Secrets

Google’s been under fire from various quarters, including the New York Times, over calls that it should open up about how its ranking algorithm works. I think offering the exact formula would be difficult, plus it would potentially give away trade secrets. But how about just listing some of the basic “ingredients”? Nope, said Google CEO Eric Schmidt this week.

You can watch Schmidt’s answer below. It came during a lunchtime meeting with reporters at the Google Zeitgeist conference on Tuesday. The exchange is as follows:

Sullivan:

You say that you have like these 200 factors, why not at least just list them?

Schmidt:

Because we change them. What would happen is, you’ve asked me this question for the eight years I’ve worked with you, so it’s the same question. Why don’t we publish these things. And the fundamental answer is we’re always changing. We’re always changing, and if we started saying here’s how the black box works, then all of a sudden huge incumbencies would come out about this change and that change, and we just don’t want that pressure.

Sullivan:

I’m not saying this is how the factors are actually measured up or weighted.

Schmidt:

But even the list.

Sullivan:

But 50 of those factors have never changed.

Schmidt:

Let’s just be honest and say you and I disagree.

Sullivan:

OK…

Schmidt:

It’s a business secret of Google.

Brian Womack of Bloomberg:

But that’s not very open.

Schmidt:

Again, openness, I’ll take your definition of open, let’s start with how does your firm operate with openness. The company ? has some secrets ?

Those question marks at the end are because I couldn’t quite hear the last part.

Ingredients Alone Aren’t A Recipe

Now let me back up with some further perspective. I agree, Google’s exact ranking formula is a business secret. But I wasn’t asking for the secret recipe, exactly how all the things are mixed to create Google’s special sauce. I was asking what harm there was in listing the 200 various ingredients that are in the sauce.

As mentioned, a New York Times editorial earlier this year suggested that Google’s secret sauce ranking algorithm — the actual recipe — should be opened up to government scrutiny.

I found that so laughable that I wrote a parody of it, The New York Times Algorithm & Why It Needs Government Regulation. The ranking algorithm does change constantly, exactly as Schmidt said. Does a government monitor sit within Google to constantly watch it?

Plus, it IS a business secret. If Coke doesn’t have to hand over its formula, if KFC doesn’t have to reveal its special blend, if the New York Times doesn’t have to document exactly how a story comes to be written in all its various details, why would Google have to give up its crown jewels?

But Coke does tell you the basic ingredients that are in its drink. I think if Google were to list more of its 200 ranking signals, it would become clearer to critics that the algorithm isn’t simply a list of companies that Google thinks should rank higher or lower.

What’s On The List?

Heck, I can make some of the list off the top of my head, from things Google’s actually said over the years:

  • Presence of search term in HTML title tag
  • Presence of search term in HTML body copy
  • Use of bold around search term
  • Use of header tags around search term
  • Presence of search term in anchor text leading to page
  • PageRank of a page
  • PageRank / authority of an entire domain
  • Speed of web site

That not enough? SEOmoz runs a regular survey to compile factors. WebmasterWorld forum members compiled a list last year. Here’s another compilation of potential signals. Google offers advice through its own site. It had no problem loudly telling the world earlier this year that site speed was a new factor being considered.

Providing a list of all the ingredients doesn’t seem that crazy, especially if it helps to better illustrate the complexity of Google’s ranking system and the difficulty of manipulating it to favor something in particular.
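To make that “ingredients versus recipe” point concrete, here’s a toy sketch in Python of how a handful of signals like the ones listed above might be combined into a score. To be clear, every signal name, weight and the simple weighted sum below is invented for illustration; it is not Google’s code or anything close to it. The point is that a list of signals, by itself, tells you nothing about the weights, normalization or interactions that make up the actual recipe.

    # Illustrative only: a toy relevance scorer using a few publicly
    # acknowledged signal types. Names, weights, and the weighted sum
    # are invented for this sketch, not Google's actual system.

    SIGNAL_WEIGHTS = {
        "term_in_title": 3.0,        # search term in the HTML title tag
        "term_in_body": 1.0,         # search term in the body copy
        "term_in_anchor_text": 2.5,  # search term in inbound anchor text
        "page_pagerank": 4.0,        # link-based authority of the page
        "domain_authority": 2.0,     # link-based authority of the domain
        "site_speed": 0.5,           # how quickly the site responds
    }

    def score_page(signals: dict) -> float:
        """Combine normalized signal values (0.0 to 1.0) into one score."""
        return sum(SIGNAL_WEIGHTS[name] * signals.get(name, 0.0)
                   for name in SIGNAL_WEIGHTS)

    # Two hypothetical pages matching the same query:
    page_a = {"term_in_title": 1.0, "term_in_body": 0.8, "page_pagerank": 0.3}
    page_b = {"term_in_body": 1.0, "page_pagerank": 0.9, "domain_authority": 0.7}

    print(score_page(page_a))  # 5.0  (3.0 + 0.8 + 1.2)
    print(score_page(page_b))  # 6.0  (1.0 + 3.6 + 1.4)

Even in this cartoon version, publishing the dictionary keys (the ingredients) doesn’t hand anyone the weights or the scoring logic (the recipe), which is exactly the distinction I was making to Schmidt.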

By the way, search start-up Blekko not only lists some of its ranking factors but also the scores a particular page has earned according to each of them. Blekko: New Search Engine Lets You “Spin” The Web explains more about this.

Perhaps if Blekko grew as popular as Google, doing this might leave it open to revealing too much about its ranking system to competitors and site owners who might try to dominate listings. But it’s another example of how factors could at least be listed.

But Facebook Should Open Up?

The “business secret” response stands in especially sharp contrast to another big item that came out of that meeting: that Google wants Facebook to open up its social data. Schmidt said on this topic (and you can watch the video further below):

We want our core products to get better because of social information. The best thing that could happen would be if Facebook would open up its network and we just used that information to improve our ads and our search …. Failing that, there are other ways in which we could to get that information, is what we’re working on.

In some ways, Facebook’s social connection data can be seen as its own special sauce. It no more wants to enable a competitor than Schmidt does. So why the double-standard?

For one, Google’s got no problem having a double-standard on openness. If it’s behind in an area, it plays the open card. If it’s ahead, not so much. My past article, Google: As Open As It Wants To Be (i.e., When It’s Convenient), gets into this more.

Then again, people should be able to take their data — social or otherwise — wherever they want. Google’s own Data Liberation Front group pushes to make this happen more within Google, and the company deserves praise for that effort. Similarly, if people want to take their Facebook contact data elsewhere, or even their “Likes,” they should be able to do that.

Schmidt: Read Facebook’s Terms Of Service

That’s the crux of what Schmidt really got at about Facebook this week. After his statement on Facebook above, I’d asked him why he couldn’t already use much of this data that Facebook has opened up through things like the Open Graph API. His answer (which comes at the end of the second video clip below):

Read the terms of service … trust me, read the terms of service.

Presumably, he’s saying Facebook’s terms are somehow preventing Google from interacting with it more.
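For context on what “using the data Facebook has opened up” actually looks like, here’s a minimal Python sketch of a Graph API call of the kind Facebook publicly documented, fetching a user’s “Likes.” This is purely illustrative of the sort of access the terms of service govern, not anything Google is or isn’t doing, and the access token is a placeholder a user would have to grant through Facebook’s OAuth flow.

    # Illustrative sketch: fetching a user's "Likes" via Facebook's Graph API
    # as publicly documented at the time. ACCESS_TOKEN is a placeholder; the
    # request only succeeds with a token the user has actually authorized.
    import json
    import urllib.request

    ACCESS_TOKEN = "PLACEHOLDER_TOKEN"
    url = "https://graph.facebook.com/me/likes?access_token=" + ACCESS_TOKEN

    with urllib.request.urlopen(url) as response:
        likes = json.load(response)

    # The response is a JSON object with a "data" list of liked pages,
    # each carrying a name, category, and id.
    for item in likes.get("data", []):
        print(item.get("name"), "-", item.get("category"))

The technical access is straightforward; the constraint Schmidt seems to be pointing at is what a developer is allowed to do with the data once it’s fetched.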

Going through the terms myself, I see this key part:

You will not directly or indirectly transfer any data you receive from us to (or use such data in connection with) any ad network, ad exchange, data broker, or other advertising related toolset, even if a user consents to that transfer or use.

Certainly, that seems to prevent Google from tapping into the data to improve its own ads, which is one of the two things Schmidt mentioned wanting social data for. As for search, which he also mentioned, I’m missing the problem. I wish he’d just explained it.

In the end, Facebook’s social connections are no more its special sauce than Google’s ranking factors are. Both are ingredients, not an exact recipe for success. Just knowing all my friends or what I like doesn’t mean giving away a good algorithm that helps decide what to show me in my social news feed. Just knowing the 200 ranking factors doesn’t mean you can now crawl the web and best Google.

So publishing a list of ranking factors? I see much more PR advantage for Google in doing that, especially when it talks so much about being open, than in saying no with the “business secret” excuse.

How about some video?

Here’s Schmidt talking about Google’s ranking factors:

Here’s Schmidt talking about obtaining Facebook’s social data and what Google might do with it:

And for the really interested, TechCrunch editor Mike Arrington has posted a video of the entire lunch with reporters along with his overall highlights of the talk.

New York Times reporter Claire Cain Miller also has a write-up from the lunch.

  • http://www.jonathanwagner.ca vectoron

    Eric’s answer is horrible. The real answer should be that they don’t want system gaming. If Google revealed the “list” you would get the same problem we had in the past with meta keyword stuffing. Users would not benefit from Google releasing the list; SEO companies would benefit because they would know how to game the system to artificially raise rank. The whole premise behind Google is to logically index and rank content without the need for any kind of SEO. Just because you can tailor your content to get a higher rank doesn’t mean your content is more relevant, it just means you have gamed the system. In many ways SEO is actually degrading the quality of results, and if Google released the list, it would just be worse.

  • http://www.npromote.com Davor Bomestar

    @vectoron – And in many ways SEO is actually helping the quality of results. When I look for local results, I can’t find some basic information because all the local sites are very poorly optimized. With better and broader SEO awareness, many SERPs would be a lot more relevant.

  • David_lou

    Surely there is a difference between user data portability and the details of an algorithm.
