Why Siri Can’t Find Abortion Clinics & How It’s Not An Apple Conspiracy

“I’m standing in front of a Planned Parenthood,” the CNN reporter says, “And Siri can’t find it when I search for abortion clinic.” No, it can’t. It’s not because Apple is pro-life. It’s because Planned Parenthood doesn’t call itself an abortion clinic.

Welcome To Search Scandals, Apple

It’s been interesting to watch the Siri Abortiongate scandal blow up in Apple’s face over the past few days. Apple is learning for the first time what it’s like to run a search engine. People hold you accountable for everything, even when the information isn’t from your own database.

Google is a battle-scarred veteran in these matters. Why does an anti-Jewish site show up in response to a search on “Jew”? Why did President George W. Bush’s official biography rank for “miserable failure”? Why do you get THAT result for a search on “Santorum”?

Sometimes, Google opens up to explain some of these oddities, which tend to have reasonable explanations. Not always, though. The company stayed close-mouthed about why a suggested search for “climategate” suddenly disappeared, taking ages to finally offer an explanation. That silence harmed it.

Inside Siri

The same silence is harming Apple now. Sure, the company has issued statements to various outlets saying there’s nothing intentional happening, that it’s merely a bug that needs to be fixed. Here’s one of the statements, from Apple CEO Tim Cook, given to NARAL Pro-Choice America:

Our customers use Siri to find out all types of information and while it can find a lot, it doesn’t always find what you want. These are not intentional omissions meant to offend anyone, it simply means that as we bring Siri from beta to a final product, we find places where we can do better and we will in the coming weeks.

Here’s another, given to the New York Times:

“Our customers want to use Siri to find out all types of information, and while it can find a lot, it doesn’t always find what you want,” said Natalie Kerris, a spokeswoman for Apple, in a phone interview late Wednesday. “These are not intentional omissions meant to offend anyone. It simply means that as we bring Siri from beta to a final product, we find places where we can do better, and we will in the coming weeks.”

But opening up on how exactly Siri works would help. It would help a lot. Without that, the speculation continues, as you can see in this article from The Raw Story:

Kerris did not, apparently, explain why Siri, although still in beta, has no difficulty locating escort services, plastic surgeons that will perform breast augmentation procedures or hospitals to direct users if they have erections lasting longer than five hours (a condition known as priapism).

I’ve got no inside knowledge of how Siri works. Heck, we weren’t allowed to attend Siri’s launch event, despite Search Engine Land being the leading news site focused on search. But because I’ve covered search for so long, I can take a pretty good shot at explaining what’s wrong: why Siri will suggest where you can get Viagra or bury a body, but not where you can find an abortion.

It Can Find Viagra, But Not…

Let’s start with the ACLU’s post, which says:

If Siri can tell us about Viagra, it should not provide bad or no information about contraceptives or abortion care.

Personally, I can’t get Siri to search for Viagra. It insists on searching for “biography” no matter how I say “Viagra.” But here’s an example of what the ACLU is upset about, taken from the Siri Failures, Illustrated blog post from Amadi:

[Screenshot: Viagra on Siri]

But ask for contraceptives, and no luck. Here’s another example from Amadi:

[Screenshot: No results for Plan B]

Actually, that’s not entirely the case. If I ask it for condoms, I get an answer:

[Screenshot: Condoms on Siri]

Asking for a brand name, like Trojans, however, doesn’t help me.

Siri Doesn’t Understand Many Things

What’s going on here? First, Siri doesn’t have answers to anything itself. It’s what we call a “meta search engine,” which is a service that sends your query off to other search engines.

Siri’s a smart meta search engine, in that it tries to search for things even though you might not have said the exact words needed to perform your search. For example, it’s been taught to understand that teeth are related to dentists, so that if you say “my tooth hurts,” it knows to look for dentists.

Unfortunately, the same thing also makes it an incredibly dumb search engine. If it doesn’t find a connection, it has a tendency to not search at all.

When I searched for condoms, Siri understood those are something sold in drug stores. That’s why it came back with that listing of drug stores. It knows that condoms = drug stores.

It doesn’t know that Plan B is the brand name of an emergency contraception drug. Similarly, while it does know that Tylenol is a drug, and so gives me matches for drug stores, it doesn’t know that acetaminophen is the chemical name of Tylenol. As a result, I get nothing:

[Screenshot: No results for acetaminophen]
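
Again, I have no inside knowledge of Siri’s internals, but this behavior is exactly what you’d expect from a hand-built lookup table that maps known terms to searchable business categories. Here’s a minimal sketch in Python of that idea; every term, category and function name in it is my own invention for illustration, not anything from Apple:

```python
# Hypothetical sketch of how Siri might map spoken terms to local
# business categories before handing a search off to Yelp. All of
# the terms and categories below are invented for illustration.

TERM_TO_CATEGORY = {
    "condoms": "Drugstores",   # known: maps to a searchable category
    "tylenol": "Drugstores",   # known brand name
    "tooth": "Dentists",       # "my tooth hurts" -> look for dentists
    # "plan b" is absent: nobody linked the brand to contraception
    # "acetaminophen" is absent: never linked to Tylenol
}

def interpret(query):
    """Return a business category for the query, or None if nothing matches."""
    for term, category in TERM_TO_CATEGORY.items():
        if term in query.lower():
            return category
    return None  # no connection found, so Siri doesn't search at all

for q in ["where can I buy condoms", "where can I buy Plan B",
          "where can I buy acetaminophen"]:
    category = interpret(q)
    if category:
        print(f"{q!r} -> search Yelp for nearby {category}")
    else:
        print(f"{q!r} -> no match; give up instead of searching")
```

A table like this explains both halves of the behavior: “condoms” works because somebody added that link, and “Plan B” fails because nobody did.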

Conspiracy Or Generally Confused?

Is Siri also against headaches? I don’t think so, but it’s easy to pursue one line of questioning in various ways, such as everything about abortion, and come away with a skewed view that Siri is pro-life rather than just buggy in general.

Indeed, it can be horrifying. That search for the morning-after pill? Siri comes up with an almost mocking-sounding “Is that so” response. It’s worse, bone-chillingly worse, when “I was raped” is searched for:

[Screenshot: Siri’s responses to “I was raped”]

Now search for “I was laughing,” and you get the same type of responses:

[Screenshot: Siri’s responses to “I was laughing”]

What’s happening here? Does Siri really understand that “raped” means rape and is mocking someone? Or are we seeing the canned responses it gives when it doesn’t know what you want at all, shifting into a conversational mode that some engineers thought might be funny?

I’m pretty sure it’s the latter. And people did think this was funny when Siri came out as part of the iPhone 4S; they would ask it all types of jovial things. It’s not funny when you’re talking about rape, but Siri really doesn’t know that’s what you’re talking about.
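
To make that concrete, here’s a minimal sketch of the kind of last-resort conversational mode I’m describing. The quips are paraphrased from what users have reported seeing, and the logic is purely my guess:

```python
import random

# Hypothetical sketch of Siri's last-resort conversational mode. The
# quips are paraphrased from user reports; the logic is my guess.

KNOWN_SEARCHES = {"condoms": "Drugstores", "tooth": "Dentists"}
FALLBACK_QUIPS = ["Is that so?", "Really!", "I don't know what you mean."]

def respond(statement):
    for term, category in KNOWN_SEARCHES.items():
        if term in statement.lower():
            return f"Searching for nearby {category}..."
    # Nothing matched, so shift into banter. "I was raped" and
    # "I was laughing" both land here for the same reason: Siri
    # has no idea what either statement means.
    return random.choice(FALLBACK_QUIPS)

print(respond("I was raped"))     # a glib quip, from ignorance, not malice
print(respond("I was laughing"))  # same quip pool, same reason
```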

Past Tense, Different Word

But wait, what about this:

[Screenshot: Rape resources on Siri]

If you’re really looking hard for some smoking gun, then maybe this is one. As Amadi’s post said, this is proof that Siri does know what rape is.

Sure, it’s proof that Siri knows the exact word “rape” is linked to sexual abuse treatment centers. It’s not proof that Siri understands that “raped,” which in spelling is a different word from “rape,” carries the same meaning in the past tense.

Humans know this stuff easily. For search engines, it’s hard. It’s perhaps harder for Siri, ironically, because it tries to make life easier for people by not requiring them to be direct.

In Google Voice Actions for Android, if you wanted rape resources, you would literally say “search rape resources,” and you’d get web search results. You can do exactly the same with Siri, if you want to be literal. But because Siri tries to be helpful, it can also be limiting.
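
Here’s a sketch of the difference, assuming (and it is only an assumption) that Siri does an exact-match lookup with no stemming, plus the literal “search …” escape hatch just described. A real fix would be to run a stemmer or lemmatizer over the query first, so “raped” and “rape” collapse to the same key:

```python
# Hypothetical illustration of why "rape" finds treatment centers while
# "raped" finds nothing: an exact-match keyword table with no stemming.
# A real fix would stem or lemmatize the query first ("raped" -> "rape").

KEYWORD_TO_RESOURCE = {
    "rape": "Sexual Abuse Treatment Centers",
}

def lookup(query):
    words = query.lower().split()
    # The literal escape hatch: "search ..." skips interpretation
    # entirely and goes straight to web results, as with Voice Actions.
    if words and words[0] == "search":
        return "web search for: " + " ".join(words[1:])
    for word in words:
        if word in KEYWORD_TO_RESOURCE:  # exact spelling only
            return KEYWORD_TO_RESOURCE[word]
    return None

print(lookup("rape"))                   # -> Sexual Abuse Treatment Centers
print(lookup("I was raped"))            # -> None: "raped" != "rape"
print(lookup("search rape resources"))  # -> the literal, always-works route
```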

No Abortion Clinics, No Tool Stores….

Meanwhile, as for Siri not finding those sexual abuse centers, I’m guessing that’s because there were simply none nearby that expressly defined themselves that way. That leads me to Siri’s now-infamous inability to find abortion clinics:

[Screenshot: No results for abortion clinic]

Guess what? It also cannot find hardware stores when I try to find them by asking for a tool store, even though there are plenty of hardware stores near me:

[Screenshot: No results for tool store]

In both cases, Siri understands that I want to do a local search, which means it should run my search over at Yelp, the partner it uses for local listings.

But Yelp Has Them!

If I search at Yelp for abortion, I get plenty of matches — one of them a local Planned Parenthood clinic:

[Screenshot: Planned Parenthood in Yelp’s results for “abortion”]

So is Siri deliberately suppressing this information? No. Notice all the bold mentions of “abortion” in those listings. Those come from comments people have left; they’re not the names of the businesses.

Siri’s not finding abortion clinics because Planned Parenthood and other places that perform abortions don’t call themselves that: the word isn’t in their names, nor have they been assigned to a category for it. That’s the best guess I have.

Planned Parenthood is in the “Medical Center” category, and while Siri may have linked businesses in that type of category to a variety of medical procedures, for whatever reason, abortion isn’t one of them.

Similarly, for whatever reason, Siri hasn’t linked “tool” to the “Hardware Stores” category. The reason, as is the case with abortion, is almost certainly not because of a conspiracy against tools.
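
Here’s a minimal sketch of a local lookup that matches only against business names and assigned categories, which is my best guess at what Siri is doing through Yelp. The business records are invented for illustration:

```python
# Hypothetical sketch of a local lookup that matches a query only against
# a business's name and assigned categories, never its review text.
# The business records below are invented for illustration.

businesses = [
    {"name": "Planned Parenthood", "categories": ["Medical Center"],
     "reviews": ["I came here for an abortion and the staff were kind."]},
    {"name": "Ace Hardware", "categories": ["Hardware Stores"],
     "reviews": ["Great selection of tools."]},
]

def local_search(query):
    q = query.lower()
    # Review text, where Yelp's bolded "abortion" mentions live, is
    # never consulted here, so those matches can't surface.
    return [b["name"] for b in businesses
            if q in b["name"].lower()
            or any(q in c.lower() for c in b["categories"])]

print(local_search("abortion clinic"))  # -> []: not in any name or category
print(local_search("tool store"))       # -> []: "tool" isn't a linked category
print(local_search("hardware"))         # -> ['Ace Hardware']: name and category
```

Under that assumption, nothing is being suppressed; the query simply never touches the text where the word actually appears.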

But You Do Get Abortion Clinic Listings

It’s almost good that Siri isn’t able to tap into the Yelp comments to help extend its search, because some might be annoyed to get a match for a car service or a Japanese grill. Others might see a church listing coming up first and assume a further attempt to push a pro-life agenda.

Indeed, one of the things that kicked off all this attention on Siri and abortion searches was an article at The Raw Story, where a search in Washington DC yielded “abortion clinics” that really were pro-life centers. From the story:

[Screenshot: Siri’s clinic results in Washington DC]

Whoa. What’s going on there? I don’t know. It’s especially weird in that, to even find these companies in Yelp, you have to hunt and hunt for them. In fact, nothing I did brought up the first listing, though I did find the second.

Looking at that listing, I note that it’s not assigned to any particular category, nor is the word “abortion” mentioned on the page. It makes me wonder if Yelp, lacking good firsthand information about this business, has instead pulled in information from the business’s own web site, which includes terms like “abortion,” to help classify it.

In some other cases, Siri — depending on Yelp information — does seem to get it right. From a comment on the story at The Raw Story:

I highly doubt it was intentional, probably more to do with places not listing the word “abortion” in their titles. i just tried it and she pointed me right to the nearest clinic in boston, for whatever that’s worth.

And another:

I was unable to reproduce the problem here in rural Texas, not far From Austin.  The first listing that Siri came up with was to the Killeen Women’s Health Center, the web link for which took me to the site for the Austin Women’s Health Center, a legitimate clinic offering a full range of reproductive choices and services.

In some cases, Yelp is clearly passing along to Siri listings that it believes to be abortion clinics. But that information, it seems, is pretty limited.

Confusing Human & Computer Results

I’ll end with one more thing. How can Siri be dumb enough not to list an abortion clinic near that CNN reporter, yet clever enough to suggest that if you want to bury a body, you should try dumps and swamps?

[Screenshot: Siri suggesting places to hide a body]

That’s again down to programmers thinking this would be funny. They’ve hard-coded this type of query to react that way, and it was funny. That’s probably also the case with the searches for escort services that do work.

But now, when a serious issue like abortion search comes up, it causes confusion between the things Siri can figure out automatically (with a lot of weakness) and the things it seems incredibly clever about (with some human help).
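
Here’s a sketch of those two paths side by side, with hand-authored patterns checked before any automated matching. The pattern and reply are paraphrased from what users have reported seeing; the mechanism itself is my guess, not Apple’s actual code:

```python
# Hypothetical sketch of hand-authored "easter egg" replies checked before
# any automated search. The pattern and reply are paraphrased from user
# reports; the mechanism itself is my guess, not Apple's actual code.

EASTER_EGGS = {
    "hide a body": "What kind of place are you looking for? "
                   "Dumps, swamps, mines, reservoirs...",
}

def answer(query):
    q = query.lower()
    # Hand-written responses win first. That's why Siri can seem
    # brilliantly clever about bodies (a human wrote that answer)
    # yet hopeless about clinics (automated matching came up empty).
    for pattern, reply in EASTER_EGGS.items():
        if pattern in q:
            return reply
    return "fall through to ordinary, and much weaker, automated search"

print(answer("Where can I hide a body?"))
```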

No doubt Apple will fix things so that searches for abortion clinics will bring back relevant resources. No doubt there will be plenty of other things that remain buggy — and even when it comes out of beta, you can expect that. That’s the nature of search. Just ask Google.

Now have a chuckle. Stephen Colbert did a wonderful send-up of the whole Siri/abortion issue last night:

[Video: “Conservative Siri,” The Colbert Report, via www.colbertnation.com]

For related news, see coverage from around the web at Techmeme.

