Direct answer fail: Google gives only one side of proposed gun ammo & magazine law




Want to know about California’s Proposition 63, a measure to control gun ammunition sales and large magazines, which is on the state ballot this month? Google has the answer. It’s a “deceptive ballot initiative that will criminalize millions of law-abiding Californians.” So much for balanced search results.

Google presents that answer at the very top of its results, when searching for “Prop 63” or “Proposition 63,” as shown below:

[Screenshot: Google search results for "Proposition 63"]

To answer “What’s happening here,” as Medium CEO Ev Williams asked when spotting this four days ago: for all its smarts, Google is still pretty dumb.

Over the years, Google has increased the frequency of showing direct answers in its search results — something it calls “Featured Snippets.” The idea is that mobile users especially want fast facts, not to have to click through to a website.

That's also a potential advantage for its forthcoming Google Home assistant. It should let Google Home answer questions well beyond what rival Amazon Echo can, because Google draws on the entire web rather than a more limited set of curated resources, especially Wikipedia.

For example, here's Google Home using a featured snippet to answer a question from the web, one I'd previously tested with Amazon Echo, which couldn't answer it:

[Embedded video: Google Home answering the question]

To get these answers, Google effectively guesses, even with all that machine learning, at which site might have a definitive answer. The downside is that when Google goes wide beyond curated sources, it makes mistakes. God only loves Christians. Dinosaurs are an indoctrination tool. A not-safe-for-work answer for eating sushi. These are real things that Google featured snippets have gotten wrong in the past.

Heck, Google will still tell you that Barack Obama is "King of the United States," based on our own article about how a featured snippet originally screwed up this answer.

[Screenshot: Google featured snippet answering "king of the united states"]

These types of mistakes are embarrassing in web search results. They’re going to be even worse with Google Home, where Google will start reading aloud some of these crazy answers without at least the back-up of other search results. Potentially, that could even hinder the product.

Indeed, as Google Home raises the profile of featured snippets, we're likely to see companies actively working to spam them (more than they do now), or even rival groups trying to get their "side" served as Google's preferred answer.

We’ve asked Google for comment and will update if one comes.


Opinions expressed in this article are those of the guest author and not necessarily Search Engine Land.


About the author

Danny Sullivan
Contributor
Danny Sullivan was a journalist and analyst who covered the digital and search marketing space from 1996 through 2017. He was also a cofounder of Third Door Media, which publishes Search Engine Land and MarTech, and produces the SMX: Search Marketing Expo and MarTech events. He retired from journalism and Third Door Media in June 2017. You can learn more about him on his personal site & blog. He can also be found on Facebook and Twitter.
