The Myth Of Great Search Engine Results

Too much time is spent on the new features the various major search engines roll out or the latest deals they cut. Here at Search Engine Land, we can be as guilty of that as anyone. To help correct it, I'll be spending more and more time highlighting poor-quality search results that I encounter, in hopes of nudging the industry to improve things.

Canaries In The Search Mine

I've spoken and written for years about how, when it comes to search engines, there are two "canaries in the coal mine" that catch a whiff of something bad emanating from them.

The first group is librarians and research professionals. They're acutely aware when search counts don't make sense, when something important in a field they know isn't being listed, and of other issues.

The far larger group is site owners and search marketers. The common joke is that when they spot search engine spam, "spam" stands for Someone Positioned Above Me. Thus, it's easy to dismiss what they see as just being colored by self-interest.

Sure, there’s some of that. But these people are also often subject experts. As surely as Cypher in the Matrix could look at computer code and say, “All I see now is blonde, brunette, redhead,” a subject expert like a site owner or search marketer can look at results and know when they don’t smell right, when something’s wrong.

About two weeks ago, our Reviewing Some Bad Google Search Results With Sergey Brin article highlighted a few bad results I could see in my subject area of expertise, that of search engines. Today, I’ll bring in another example, that of “search term research.”

Benchmarking Against Expert Knowledge

Search term research is one of the core aspects of search marketing, and I've covered the various tools out there for years. Here at Search Engine Land, we maintain a Search Term Research page devoted to the topic. It's a good page. There are probably better ones, and maintaining these types of pages is always difficult. Still, it's kind of a benchmark for me. If I don't see it ranking, what's the quality of stuff that IS ranking above it?

As it turns out, our page isn't on Google at all. Not at all. And it's, um, our fault. The All In One SEO Pack plug-in we use with WordPress here set all of our category pages to be excluded from Google. It wasn't that way originally. Back in the summer, a new version of the plug-in overwrote how we'd previously configured it. I should have known better, too, because I even retweeted a warning about this. Everything's fixed today, and we'll see how things go.
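Silent noindex misconfigurations like this are easy to audit. As a minimal sketch (not tied to any particular plug-in, and using only the standard library), a script can parse each page's robots meta tag and flag any page that carries a noindex directive:

```python
from html.parser import HTMLParser


class RobotsMetaParser(HTMLParser):
    """Collects the content of any <meta name="robots"> tags on a page."""

    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            attrs = dict(attrs)
            if attrs.get("name", "").lower() == "robots":
                self.directives.append(attrs.get("content", "").lower())


def is_noindexed(html):
    """Return True if the page HTML carries a robots noindex directive."""
    parser = RobotsMetaParser()
    parser.feed(html)
    return any("noindex" in d for d in parser.directives)


# A category page that a plug-in has quietly set to noindex:
blocked = '<html><head><meta name="robots" content="noindex,follow"></head></html>'
normal = '<html><head><meta name="robots" content="index,follow"></head></html>'
print(is_noindexed(blocked))  # True
print(is_noindexed(normal))   # False
```

Run across a site's category pages on a schedule, a check like this would have caught the overwritten setting the day it happened rather than months later.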

Still, that page as well as my own knowledge of the area gives me a good benchmark I can use against other pages that do appear in the search results. So how’s it look?

Google: How About Some China Wholesale?

Over at Google, a search for search term research leads off with the Google AdWords Keyword Tool, which is an excellent first choice. It’s a dependable tool, offered for free, with great data. Some more tools follow that, then two older articles (from 2007 & 2006, respectively) on conference presentations about the topic. Those are kind of iffy to be in the top results given their age, but certainly they’re relevant. Then I get another tool, a fresher article that’s not super-substantial, a compilation list of articles and a nice conference presentation.

Overall, it’s not bad. Not fantastic, but not bad. Where things really fall down is when you go to the second page of results.

OK, few people go past the first page of results. I know this. But still, that second page of results? It contains what Google is presenting as among the very best out of 126 million pages on the web for this topic. The very best. And what do we get on the second page?

search term research - Google Search

For those who can’t see the image above for some reason, the rundown:

  • Link to a keyword research tool, which makes sense
  • Really weird local results about companies in New York that somehow seem related to search term research
  • A really bad directory listing of resources
  • An OK page listing some tools
  • Agenda for a session on the topic for a conference in 2006
  • A press release from someone speaking on the topic in 2006
  • Another keyword research tool
  • The most amazingly bad result: some “China Wholesale Supplier” with search term research products. More on this in a bit…
  • A review of one particular tool from 2006
  • An article I wrote on the topic in 2007

I think the amazingly poor quality of these results is self-evident. Let’s look at that China Wholesale page. Again, out of 126 million possible matches, this is what Google thinks is the 18th best of all of them for the topic of search term research:

Search Term Research - China Wholesale Supplier

I can’t even figure out what this is! One arrow points to how the page is in the “Search Term Research” category of the hosting web site. The other two point at what’s listed in this category, oil paintings of some soccer stars.

How on earth has Google, with its supposedly awesome attention to search quality, allowed this to show so high in the results?

But Bing’s Worse!

At least Google can fall back on the “others are worse” excuse. Let’s go to Bing:

search term research - Bing

Ugh. The rundown:

  • Terms of service for those looking into broadband research?
  • A tool, OK
  • Wikipedia article on research in general. Not search term research — just research
  • An undated article with bad advice that the best way to do research isn’t to do it at all. Just write! See what terms generate visits after you write. That’s terrible advice, because if you haven’t written using important terms, you’ll probably never see the traffic for them in the first place to know you should use them.
  • That tool listed in number two? This is an article about it from the company that owns the tool
  • Coverage of a search term research session at a conference from 2007
  • A compilation of articles on the topic
  • A page for marketing terms. Not search term research, just marketing terms
  • Another page with coverage of a search term research conference session in 2007
  • A press release about a biospace research project.

Did I say ugh? I’ll say it again. Ugh. It’s self-evident that many of these pages are clearly NOT the best on the topic out of the millions of pages that Bing could pick.

For some reason, I see completely different results than this when I use Safari, rather than Firefox:

search term research - Bing-2

These are generally better, but I still get weird outliers like one for a Utah History center or a place to buy essays.

Yahoo: 50% Irrelevant

How about Yahoo? Ugh again:

search term research - Yahoo! Search Results

Pages for biotech research, autism research, research at Oregon Health & Science University plus two for Lexis/Nexis show up. That’s 50% of the top results completely off target for what I searched for. Not just 50% so-so results. They’re just totally not right. At all.

Ask’s OK, If You Can Stand The Ads

Ironically, given that I wrote it off as a serious search engine last year, Ask seems to have fairly decent results for the topic. There’s a bad press release from someone speaking on the topic, but everything else is good or at least related to the subject.

Of course, there’s a giant ad unit containing five paid listings shoved between the first result and the rest. Then another five at the bottom. Then one more paid listing after that, without the disclosure required by the Federal Trade Commission.

Things Feel Worse, But Hard To Quantify

Sadly, we still have no commonly accepted measurements of relevancy across search engines, and it’s an area that gets harder and harder to assess as more material is blended in alongside web page results. That’s something I’d still like the search engines to collaborate on: some independent, regular assessment of their quality.
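The building blocks for such an assessment do exist in the information retrieval literature. One standard approach is to have human raters judge each result on a graded scale and score the ranking with normalized discounted cumulative gain (NDCG), which rewards putting the most relevant pages highest. A minimal sketch, assuming hypothetical judgments on a 0–3 scale:

```python
import math


def dcg(relevances):
    """Discounted cumulative gain: graded relevance, discounted by rank position."""
    return sum(rel / math.log2(rank + 2)
               for rank, rel in enumerate(relevances))


def ndcg(relevances):
    """Normalize against the ideal (best possible) ordering of the same judgments."""
    ideal = dcg(sorted(relevances, reverse=True))
    return dcg(relevances) / ideal if ideal else 0.0


# Hypothetical rater judgments (0 = irrelevant, 3 = perfect) for one
# engine's top five results on a query:
judged = [3, 2, 0, 0, 1]
print(round(ndcg(judged), 3))  # 0.976
```

The hard part isn’t the math; it’s agreeing on a shared, independently run pool of queries and raters, which is exactly the collaboration that hasn’t happened.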

To me, it feels like they’re getting worse, not better. But I can’t document that. What I can do is demonstrate without much difficulty, for areas where I have subject expertise, how bad they can be. They get by because along with the bad, there’s enough good. But they should be better than this.



About The Author: Danny Sullivan is a Founding Editor of Search Engine Land. He’s a widely cited authority on search engines and search marketing issues who has covered the space since 1996. Danny also serves as Chief Content Officer for Third Door Media, which publishes Search Engine Land and produces the SMX: Search Marketing Expo conference series. He has a personal blog called Daggle (and keeps his disclosures page there). He can be found on Facebook, Google+ and microblogs on Twitter as @dannysullivan.






  • contactdjy

    Interesting article – thanks. I also feel that search results are getting worse – for what it’s worth. Totally subjective but a consistent gut feel. I mainly use Google search.

  • Matt Cutts

    Danny, your Google search results look a little strange. Have you done anything weird to your search settings or location? Maybe used someone else’s cookie to join a Google UI experiment or something? I ask because I’m getting different results when I do the search. I asked my wife to do the query [search term research] as a sanity check and she sees the same results that I do. Here’s a screenshot of what I see:

    Those results are actually quite good, in my opinion. I might switch out 1-2 results for a different one, but overall I think the results I see are quite good.

  • Mikkel deMib Svendsen

    > Sadly, we still have no commonly accepted measurements of relevancy across search engines,

    It’s interesting – I remember we have had this chat for years now and still we have no widely accepted method for measuring search quality.

    Years ago I remember you came up with some pretty good ideas for how to conduct quality comparison (can’t find the link, though …) and I also know all the engines do something internally.

    So why haven’t the engines accepted a public standard? Do you know if any of the major engines are currently for it or against it?

  • Vanessa Fox

    Matt, I’m seeing the same results Danny is. I’ve seen a bunch of other weird stuff lately (blatant spam ranking really well). Can forward a few examples if you’d like.

  • Matt Cutts

    Vanessa–definitely feel free to forward them on. I’m in Google’s TGIF right now, but I’ll check into why Danny/you are seeing different rankings than I see on my computer, my wife’s computer, and my phone. :|

  • markjackson

    I see what Matt sees.

  • dazzlindonna

    I see what Matt sees, but as my first page of results – and they seem to match what Danny describes as his first page of results. Which makes me wonder if maybe Matt was comparing his first page to Danny’s screenshot of the second page?

  • Marjory

    I’m confused. Aren’t the results supposed to be personalized based on search behavior? Why should anyone see the same results?

  • Jonahstein

    I see what Matt sees… except he has one more listing at the bottom and I get the related search terms box.

  • Matt Cutts

    Okay, a fellow Googler pointed out that I misread the post. Danny’s Google screenshot is page 2 of Google’s results and the rest of the screenshots are from the first page of other search engines. dazzlindonna, you’re right–I was comparing my first page to Danny’s screenshot of the second page.

  • Danny Sullivan

    Matt, that’s it exactly. The first page of results ARE fairly good, as I noted. It’s the second page that’s largely a mess. I’ve bolded now that I’m talking about the second page (though I did go on at length to say this). The results there are just funky. And while I know this is the second page, and few people go to them (as I said in the story), still, these are results 11-20 that Google is saying are the best out of 125 million results — and many of them just don’t measure up for that.

  • alanmitchell

    What I can’t understand is why the general standard of paid search ads is also so poor and irrelevant to users’ search queries, despite thousands of advertisers having an almost open brief to write engaging ads to compete for the user’s click. Have a look at some examples I found when searching for hotels in Sydney:

    It’s a myth that paid search is ‘saturated’ and ‘so two years ago’ – there are still countless opportunities everywhere for advertisers wanting to focus on relevancy.

  • ronald

    Let’s do a test. Since Google is claiming to organize the worlds information let’s use that.
    what is information [Res: OK]
    information is what [Res: OK]
    information what is [Res: 7: ACA - What is Chiropractic?]

    From the results we can see that they value/order the most commonly used phrases. But their mistake is that they then treat all data as equal, which can easily be exploited by SEO.
    Bing doesn’t make that mistake.

  • Andrew Goodman

    Danny, don’t you think you’ve chosen an example with a high degree of difficulty?

    (1) it’s three common words, that have many synonyms — search, term, and research. that’s bound to confound many engines. it’s also low volume so there wouldn’t be a wealth of data on it. and you don’t enclose it in quotes. isn’t the more common usage “keyword research”? I see a pretty solid Google SERP on that term.

    (2) it’s a sort of incestuous one — a search by someone looking to discuss search marketing keywords, with search marketers. so – that field is bound to *attract* more spam from the 100,000 SEO’s who use these products and want to experiment with placement techniques.

    I’m sure you’ll move onto more practical consumer examples like

    [chiropractor toronto] (should be very good, common business type, large city)


    [chiropractor schenectady] (should have a bit more trouble, longer tail, smaller city)

    I’ll be looking forward to more insights.

    Certainly for the sites I work with, I see a lot of SPAM in the results — Sites Positioned Above Me!! If only Google knew what some of these companies were doing, I’m sure they wouldn’t rank so well. The question is, after weeding out all that stuff, what’s left. Without human curation I think it is pretty hard for search engines to make judgment calls about intent as it intersects with quality/relevancy, across so many different types of queries, using only “signals” of same (esp. on long tail stuff).

    A final point here would be — I’d love to compile a list of places users should be searching to get better results *if not Google*. On the above, for example, my instinct would tell me that Yelp would be a much better resource.

    That turns out not to be the case for schenectady chiropractors (no reviews), but somewhat so for Toronto (some reviews, larger market).

    As for reviews and community helping the consumer find the quality vendors, on sites like Yelp…. Google has an inconsistent approach in featuring these, and in some sense competes with them. So they do well for awhile in Google SERP’s, and then get buried. An exception is TripAdvisor. But that may be for good reason (more volume, longer track record to establish intent signals).

    But on a query for “St Lucia hotels,” where TripAdvisor comes in #1 as they often do… you’d still be worried if junky sites showed up, say, in positions 11-20. On this query, I didn’t find that to be the case. Maybe Google has a better time of it reading signals for queries like this both in terms of their popularity, and in terms of consistent consumer intent.

  • Helen

    Thanks so much Danny for bringing this up! I’ve been seeing a LOT of irrelevant search results lately. More and more of them are coming from facebook, some directories, various portals, but not the actual sites I am looking for….

  • Stupidscript

    Marjory (October 23rd, 2009 at 10:23 pm ET) makes a good point. Where is the personalization based on search history that we’ve been hearing so much about? Or does it only occur with a sustained search exercise … based on the current session, and not including previous sessions?

  • Mark Cramer

    The problem you’re facing, which is true for virtually all queries, is that “search term research” can mean different things to different people at different points in time. The result set, if generated correctly, should represent those documents with the greatest likelihood of satisfying the user’s information need, but there will always be ambiguity. The “Utah History Research Center” is perhaps not a great example, but if (strangely) 5% of searchers find that one useful, then perhaps it makes sense to put it on page 2.

    There’s also the interesting trade-off between the total relevance and the “risk” of the result set. There’s an interesting paper you might enjoy which essentially says that the quality of the result set can be improved by introducing results that do not correlate strongly with others on the page. In other words, diversity in the result set can improve the search experience by reducing overall risk.

    Therefore, while I’m not going to defend all of the results above (some are pretty bad), it makes sense that some of the results might be outside of what you might consider “relevant” because someone else just might think otherwise.

    The fundamental issue is one of ambiguity combined with an overabundance of content. With 126 million matches it’s virtually impossible to make every result at the top relevant to the particular user, at that point in time, while completely eliminating all that are irrelevant.

  • liveambitions

    Here’s my 2 cents:

    1. Those websites on page 2 are probably strong, and thus resulting in top rankings. There may be other web pages with better relevancy, but the strength of these weaker websites may not be great enough to overpower the stronger websites.

    2. “Search term research” may have 125 million results, but it doesn’t necessarily mean that this phrase is popular. Unpopular phrases don’t get much play in article titles and content. So, Google has to do its best to come up with whatever is available for indexing.

    3. Let’s not forget that Google’s algorithm is still driven by computers. This is a technology that is still being worked on. We’re kind of expecting a computer to do a human’s job. Only a human can look at a listing and say that it’s truly relevant or not. I think Google’s done a great job thus far.

    For example, if you run a search for “apple”, how in the world is a computer supposed to know what kind of apple you’re talking about? Unless the computer can read the searcher’s mind, it’s only going to be able to spit out what it’s programmed to do. Search results are sometimes only going to be as good as the search queries.


  • Danny Sullivan

    Thanks, everyone, for all the comments. Some responses.

    Andrew, yep, those are three common words. But search engines have long used proximity as a ranking signal. Even if you don’t deliberately seek a phrase using quotation marks, they’ve done this. Google moved that way long ago. One reason was because people DON’T use quotation marks or common commands. So in my book, that doesn’t excuse things. In fact, do this:

    Now I’m quoting those terms. I still see the same crummy results, including the China Wholesaler site at #20. It has nothing — nada — to do with those words other than having those three words in that exact order on the page. If all it takes to get into position 20 on Google these days is to use the exact phrase once, our jobs as search marketers just got a lot easier!

    True, “keyword research” is a far more popular query on this subject than “search term research.” I still don’t think that excuses such bad results.

    I disagree on the incestuous issue. Google’s listing things like slides and agendas. No one went out with an agenda to deliberately rank for these words. And on Yahoo, half the top results weren’t even related to search term research. I mean the Autism page ranks because it’s about Autism Research and has the words “enter a search term” also on the page. That’s the type of false positive I’d expect from AltaVista circa 1999, not Yahoo in 2009.

    On [chiropractor toronto], sure, that’s better. Though not sure how much I trust one of the pages in Google’s top results that has a big “Toronto’s Chiropractors Directory rated on the TOP on YAHOO and GOOGLE!” heading at the top of the page. Or another with all the “useful links” at the bottom leading from a page about Toronto chiropractor services to “plano dentist” or “LA oral surgeon.” That screams out link exchange. Guess that stuff is working again in Google.

    I guess my main point is that for many searches, I can find outliers that make you just go “huh?” And that’s the myth I’m talking about, the idea that some people have that when you get a top ten list, these really are the top ten pages on the web for what you searched for. In reality, they’re the top ten best guesses, and sometimes those are terribly wrong. And in general, I feel like things are getting a little more wrong these days than right. I just don’t have stats to back that up.

    Stupidscript, personalized search is making a huge difference in search results, I’ve been finding. Logged in, I can see pages that were buried back in the 4th page of search results leapfrogging to page 1. That’s great, one of the pluses to personalized search. But the core non-personalized results should still be pretty strong, too. And if you’re not seeing them, you’ve got to be logged in and have web history enabled. See this for more:

    Mark, agreed, it’s a huge challenge for a search engine to know exactly what someone seeks for any query that’s entered. But I don’t think “search term research” is that ambiguous. At best, the key issue is whether you want tools, reviews of tools or articles about the practice. And we do see that variety. But the quality can be iffy. And I really doubt 5% of those searching on this topic find “Utah History Research Center” to be relevant. That just doesn’t make sense to me.

    I also agree that diversity can be useful. But that might be part of the problem. The diversity dial might be cranked up too much.

    liveambitions, Google’s search technology is over a decade old now. That’s like 100 search dog years. The oddities on page 2, the weirdness on Bing and Yahoo, these are things you’d see back a few years ago that were corrected. They feel like a step backwards from where we’ve been, not an issue that they just haven’t reached the right mark yet.
