The New York Times Algorithm & Why It Needs Government Regulation


The New York Times is the number one newspaper web site. Analysts reckon it ranks first in reach among US opinion leaders. When the New York Times editorial staff tweaks its supersecret algorithm behind what to cover and exactly how to cover a story — as it does hundreds of times a day — it can break a business that is pushed down in coverage or not covered at all.

When the New York Times was a pure newspaper, it was easy to appear agnostic about its editorial coverage, with no reason to play favorites with one business or another. But as the New York Times has branched out, making investments in external companies, it has acquired pecuniary [that means financial, by the way] incentives to favor those over rivals.

The New York Times argues that its behavior is kept in check by competitors like The Wall Street Journal or the Tribune Company. But the New York Times has become the default newspaper for many internet and print readers, with home delivery in over 340 markets; it is, in fact, the nation’s largest seven-day newspaper. Competitors are a click away, but a case is building for some sort of oversight of this gatekeeper of news.

In the past few years, the New York Times has faced accusations that it is too liberal or that what it writes can’t be believed. It once employed a reporter named Jayson Blair, who resigned in May 2003 after admitting that he simply made things up. There have been other controversies involving the newspaper, enough to fill a Wikipedia page of them. Mysteriously, that page was deleted just a month ago.

These accusations and concerns may have merit. Or maybe they don’t. We can’t be bothered to look into the possible motivations of those making them, to investigate how true they might really be. Some of those accusing the New York Times of being unfair may not deserve coverage or may be seeing a liberal bias because of their own conservative ones.

Still, the potential impact of the New York Times algorithm on the internet economy, not to mention the US economy, the US government and the world as a whole is such that it is worth exploring ways to ensure that the editorial policy guiding the New York Times is solely intended to improve the quality of journalism and not to help other businesses that the New York Times owns or the bottom line of its for-profit owners.

Some early suggestions for how to accomplish this include having the New York Times explain, with some specified level of detail, the editorial policy that guides what it decides to cover, what it decides not to cover, and why it chooses to write a particular headline with a particular angle; to show every version of a story as it is written from start to finish; and to reveal what has been edited out. Another would be to give some government commission the power to examine all these aspects, perhaps even the power to reside within the newsroom and ensure fairness.

The New York Times provides an incredibly valuable service, and the government must be careful not to stifle its ability to innovate. Forcing it to publish the algorithm or method it uses to evaluate stories would allow any business or government agency to game the rules in order to obtain positive coverage — destroying its value as a newspaper. Requiring every editorial tweak to be approved by regulators could drastically slow its improvement and regular reporting operations.

Forbidding the New York Times to branch out into new areas — such as when it invests in things like a job search engine (here) or a company behind a free blogging tool (here) — might reduce the diversity of its revenue, even if it does help ensure the company won’t favor these products over those produced by rival companies on the internet that it also covers.

With these caveats in mind, if the New York Times is to continue to be the main map to the news and information highway, it concerns us all that it leads us fairly to where we want to go.

And Now, Without The Satire…

For those who didn’t see it, the piece above is a satirical rewriting of yesterday’s New York Times editorial, The Google Algorithm, which suggested that Google needs to be regulated, since it’s an important “gateway” that faces recent accusations of bias.

I’ve been covering the search space closely for nearly 15 years, from before Google itself even existed, so I have seen these types of claims far longer and examined them in far more depth than what went into that New York Times editorial.

My guess is that the editorial staff (the staff that writes the newspaper’s editorials, which are opinion pieces, which is confusing when the newspaper also has an editorial staff that writes “editorial” stories elsewhere that are supposed to be unbiased) spent about an hour or so discussing recent Google news, then someone was probably assigned to write the editorial and invested all of about three hours on it.

That’s not much time or care for a major and well-respected newspaper (in many quarters) to decide the government should evaluate “fairness” when it comes to making editorial judgments in search results, be they from Google or any other search engine.

Editorial Independence

Search engines are very similar to newspapers. They have unpaid “organic” listings, where usually (though not always) a computer algorithm decides which pages should rank at the top. The exact method isn’t important. What’s important is that those unpaid listings are a search engine’s editorial content: content it alone has decided should appear, based on its editorial judgment.

Search engines also have paid listings, advertisements, which aren’t supposed to influence what happens on the editorial side of the house. We even have FTC guidelines ensuring proper labeling of ads, intended to protect against “advertorials” in search results.

It’s a church-and-state divide with good search engines, just as it is with good newspapers.

What the New York Times has suggested is that the government should oversee the editorial judgment of a search engine. Suffice to say, the editorial staff of the New York Times would scream bloody murder if anyone suggested government oversight of its own editorial process. First it would yell that it has no bias, so oversight is unnecessary. Next it would yell even more loudly that the First Amendment of the US Constitution protects it from such US government interference.

First Amendment Covers Search Engines

Guess what. The First Amendment protections of freedom of speech and freedom of the press apply to more than newspapers. In fact, they apply to search engines. The courts have said so, most clearly back in May 2003, in the SearchKing case:

PageRanks are opinions–opinions of the significance of particular Web sites as they correspond to a search query … accordingly, the court concludes Google’s PageRanks are entitled to full constitutional protection.

The Gatekeeper Argument

Ah, but that was 2003. Google wasn’t so powerful or deemed as a “gatekeeper” then. Now it controls everything!

Wrong. Even back then, Google was being paraded as being too big, too powerful (see 14 “Is Google Evil?” Tipping Points Since 2001). Heck, consider this newspaper article from December 2002:

Google has become enough of a Web gatekeeper that its leads now prop up plenty of commercial sites….

Google uses many variables in its automated ranking process, but a key one is the number of other pages linking to a particular page, because links can serve as a kind of endorsement. Google regularly shuffles its rankings to reflect changes in its own methods or in the Web’s link structure and content.

That makes its free listings a risky platform on which to build a business. Google also routinely slashes the rankings of sites that appear to be using tricks, like hiding keywords in invisible text that can be read only by search engines….

The free ride may not last, however. Ms. Johnson of Forrester says larger companies have been discovering the power of search engines and site optimization. As was the case on eBay when big retailers moved in, search listings are becoming less democratic. “It’s going to be more and more difficult for small sellers to get noticed,” she said. “The free listings lunch may be ending soon.”

That’s from the New York Times, by the way — reporting how nearly eight years ago, Google was a “gatekeeper” where the “free lunch” for small businesses might be ending.
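The link-counting idea the Times describes is the heart of PageRank: a page earns rank from the pages that link to it, because links act as endorsements. As a rough illustration only — Google’s actual system uses many more signals, and the tiny link graph below is invented for the example — here’s a minimal power-iteration sketch in Python:

```python
# Minimal PageRank power iteration: a page's score is the sum of the
# endorsements it receives from pages linking to it.
# The link graph below is invented purely for illustration.
links = {
    "a": ["b", "c"],  # page "a" links out to "b" and "c"
    "b": ["c"],
    "c": ["a"],
}
damping = 0.85  # probability a "random surfer" follows a link
pages = list(links)
rank = {p: 1.0 / len(pages) for p in pages}  # start with equal scores

for _ in range(50):  # iterate until the scores settle
    new = {p: (1 - damping) / len(pages) for p in pages}
    for page, outlinks in links.items():
        share = damping * rank[page] / len(outlinks)
        for target in outlinks:
            new[target] += share  # each link passes along a share of rank
    rank = new

# "c" is linked to by both "a" and "b", so it ends up ranked highest.
print(sorted(rank, key=rank.get, reverse=True))  # → ['c', 'a', 'b']
```

Note how the ordering falls out of the link structure alone, with no human judgment per page — which is exactly why the Times could call a ranking both automated and an editorial “endorsement” at the same time.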

What’s happened since then? Small businesses, along with big businesses, have continued to do just fine. Google has handled hundreds of BILLIONS of searches over that time.

If Google were seriously abusing its “gatekeeper” status, you’d expect to hear billions of complaints about anti-competitive behavior across those billions of searches. Instead, the loudest voice we hear is a tiny shopping search engine virtually no one has heard of, Foundem, screaming that we need “search neutrality” because Google has supposedly tried to keep it down for competitive reasons.

That’s right. Google’s smart enough to build this supposed search monopoly, but when it decides to whomp the competition, it’s Foundem that it finds threatening. Not Amazon. Not eBay. Not Yahoo. Not Bing. Foundem.

Please.

By the way, many people forget that Yahoo was once considered the web’s gatekeeper. There were even calls in the late 1990s for the government to step in, since Yahoo’s editors could make or break a business depending on whether it listed them or how it did so.

Somehow, businesses survived Yahoo. Somehow, businesses have survived Google’s supposed gatekeeping. A key part of that comes down to the fact that online businesses do receive traffic from outside search engines, in particular from Facebook and Twitter, which send some sites more traffic than Google does.

Google’s Plenty Transparent

Still, shouldn’t Google share more about how it creates its algorithm? Compared to the New York Times, Google’s a model of transparency. Consider:

  • Google will list EVERY site that applies for “coverage,” unlike the New York Times, which regularly ignores potential stories.
  • If Google blocks a site for violating its guidelines, it alerts many of them. The New York Times alerts no one.
  • Google provides an entire Google Webmaster Central area with tools and tips to encourage people to show up better in Google; the New York Times offers nothing even remotely similar.
  • Google constantly speaks at search marketing and other events to answer questions about how it lists sites and how to improve coverage; I’m pretty sure the New York Times devotes far less effort to this area.
  • Google is constantly giving interviews about its algorithm, such as this one in February, along with providing regular videos about its process (here’s one from April) and blogging about important changes, such as when site speed was introduced as a ranking factor earlier this year.

Heck, back in June 2007, Google allowed New York Times reporter Saul Hansell into one of its search quality meetings, where some of the core foundations of the algorithms are discussed. Very few outsiders have ever attended one of these meetings. Hell, I’ve never attended one (Google’s joked with me that I’d understand too much). Letting Hansell into that meeting would be the same as if the New York Times let Google CEO Eric Schmidt sit in on one of its daily news budget meetings or, better, listen in on a discussion between an editor and a reporter about the approach to a particular story.

Google’s Subject To Other Regulations

The question isn’t whether we need a new search “truth commission” any more than we need a newspaper “truth commission.” The question really is whether Google is acting anti-competitively. We have plenty of antitrust laws that can already be applied (and have been, as in the case of the proposed Google-Yahoo deal in 2008).

Exhibit A for some of the anti-competitive complaints is the resurgence in accusations that Google “favors itself” with specialized search products.

Google’s offered more than web search for a very long time. Image search, for example, stretches back to 2001. It is a search company. It is supposed to offer search products. It makes no sense to expect those products to merely list web pages. If people are doing shopping searches on Google, it should evolve its product to include a specialized shopping tool. That’s what its users want.

Sure, that might hurt other shopping sites out there. Or, it might not, if they offer a better shopping search than Google. But it’s a ridiculous argument that Google should somehow send every shopping query out to another shopping search engine.

Imagine if you did a web search for something, say “iphone,” and every link you got led to Bing, Yahoo and other search engines, which in turn showed their results for iPhone. That’s crazy. You came to Google for answers, to be led directly to sites with those answers, not to be sent to another search engine and forced to search again.

Where Google’s potentially anti-competitive is if it no longer sends people AWAY from its site (or maybe not: no one complains that Yahoo recirculates much more traffic back into its own properties). Its shopping search product sends people away to individual merchants, for free. To understand this more, I’d recommend watching my short talk below, The Search Platform: Friend Or Vampire?

That’s not to say Google’s without faults. It is. In particular, things like “Places” pages that recirculate back into Google or the fact that many of the external sites it lists carry its AdSense ads are causes for concern if not outright complaint.

But by and large, Google’s been a net positive actor, from where I measure things. It deserves better than a knee-jerk reactionary editorial from what’s supposed to be one of the leading newspapers of the world.

Postscript: Google is now also out today with its own take on the idea of “search neutrality” in an official blog post: Our op-ed: Regulating what is “best” in search?


Opinions expressed in this article are those of the guest author and not necessarily Search Engine Land. Staff authors are listed here.


About the author

Danny Sullivan
Contributor
Danny Sullivan was a journalist and analyst who covered the digital and search marketing space from 1996 through 2017. He was also a cofounder of Third Door Media, which publishes Search Engine Land and MarTech, and produces the SMX: Search Marketing Expo and MarTech events. He retired from journalism and Third Door Media in June 2017. You can learn more about him on his personal site & blog. He can also be found on Facebook and Twitter.
