Why Siri Can’t Find Abortion Clinics & How It’s Not An Apple Conspiracy

“I’m standing in front of a Planned Parenthood,” the CNN reporter says, “And Siri can’t find it when I search for abortion clinic.” No, it can’t. It’s not because Apple is pro-life. It’s because Planned Parenthood doesn’t call itself an abortion clinic.

Welcome To Search Scandals, Apple

It’s been interesting to watch the Siri Abortiongate scandal blow up in Apple’s face over the past few days. Apple is learning for the first time what it’s like to run a search engine. People hold you accountable for everything, even if the information isn’t from your own database.

Google is a battle-scarred veteran in these matters. Why does an anti-Jewish site show up in response to a search on “Jew”? Why did President George W. Bush’s official biography rank for “miserable failure”? Why do you get THAT result for a search on “Santorum”?

Sometimes, Google opens up to explain some of these oddities, which tend to have reasonable explanations. Not always, though. The company stayed closed-mouthed about why exactly “climategate” appeared as a search suggestion and then suddenly disappeared, taking ages to finally explain why. That harmed it.

Inside Siri

The same silence is harming Apple now. Sure, the company has issued a statement to various outlets saying there’s nothing intentional happening, and it’s merely a bug that needs to be fixed. Here’s one of the statements from Apple CEO Tim Cook, given to NARAL Pro-Choice America:

Our customers use Siri to find out all types of information and while it can find a lot, it doesn’t always find what you want. These are not intentional omissions meant to offend anyone, it simply means that as we bring Siri from beta to a final product, we find places where we can do better and we will in the coming weeks.

Here’s another, given to the New York Times:

“Our customers want to use Siri to find out all types of information, and while it can find a lot, it doesn’t always find what you want,” said Natalie Kerris, a spokeswoman for Apple, in a phone interview late Wednesday. “These are not intentional omissions meant to offend anyone. It simply means that as we bring Siri from beta to a final product, we find places where we can do better, and we will in the coming weeks.”

But opening up on how exactly Siri works would help. It would help a lot. Without that, the speculation continues, as you can see in this article from The Raw Story:

Kerris did not, apparently, explain why Siri, although still in beta, has no difficulty locating escort services, plastic surgeons that will perform breast augmentation procedures or hospitals to direct users if they have erections lasting longer than five hours (a condition known as priapism).

I’ve got no inside knowledge of how Siri works. Heck, we weren’t allowed to attend the launch event for Siri, despite Search Engine Land being the leading news site that focuses on search. But because I’ve covered search for so long, I can take a pretty good shot at explaining what’s wrong: why Siri will suggest where you can get Viagra or bury a body, but not where you can find an abortion.

It Can Find Viagra, But Not…

Let’s start with the ACLU’s post, which says:

If Siri can tell us about Viagra, it should not provide bad or no information about contraceptives or abortion care.

Personally, I can’t get Siri to search for Viagra. It insists on searching for “biography” no matter how I say “Viagra.” But here’s an example of what the ACLU is upset about, taken from the Siri Failures, Illustrated blog post from Amadi:

But ask for contraceptives, and no luck. Here’s another example from Amadi:

Actually, this isn’t the case. If I ask it for condoms, I get an answer:

Asking for a brand name, like Trojans, however, doesn’t help me.

Siri Doesn’t Understand Many Things

What’s going on here? First, Siri doesn’t have answers to anything itself. It’s what we call a “meta search engine,” which is a service that sends your query off to other search engines.

Siri’s a smart meta search engine, in that it tries to search for things even though you might not have said the exact words needed to perform your search. For example, it’s been taught to understand that teeth are related to dentists, so that if you say “my tooth hurts,” it knows to look for dentists.

Unfortunately, the same thing also makes it an incredibly dumb search engine. If it doesn’t find a connection, it has a tendency to not search at all.

When I searched for condoms, Siri understood those are something sold in drug stores. That’s why it came back with that listing of drug stores. It knows that condoms = drug stores.

It doesn’t know that Plan B is the brand name of an emergency contraception drug. Similarly, while it does know that Tylenol is a drug, and so gives me matches for drug stores, it doesn’t know that acetaminophen is the chemical name of Tylenol. As a result, I get nothing:
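To make that concrete, here’s a minimal sketch, in Python, of the kind of term-to-category lookup this behavior suggests. The table, the names and the mappings are all my own illustrative guesses, not Apple’s actual code:

```python
from typing import Optional

# Illustrative guess at the kind of term-to-category table a meta
# search engine might consult. None of this is Apple's real data.
TERM_TO_CATEGORY = {
    "tooth": "dentists",
    "teeth": "dentists",
    "condoms": "drug stores",
    "tylenol": "drug stores",
    # "plan b" and "acetaminophen" are missing: nobody taught the
    # engine that they belong with drug stores, so they find nothing.
}

def resolve_category(query: str) -> Optional[str]:
    """Return a category to search for, or None if nothing matches."""
    q = query.lower()
    for term, category in TERM_TO_CATEGORY.items():
        if term in q:
            return category
    return None  # no connection found, so no search happens at all

print(resolve_category("where can I buy condoms"))  # drug stores
print(resolve_category("I need acetaminophen"))     # None
```

The failure mode is that final None: rather than falling back to a plain web search, the engine simply gives up.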

Conspiracy Or Generally Confused?

Is Siri also against headaches? I don’t think so, but it’s easy to pursue one line of questioning in various ways, such as everything about abortion, and come away with a skewed view that Siri is pro-life rather than just buggy in general.

Indeed, it can be horrifying. That search for the morning after pill? Siri comes up with an almost mocking-sounding “Is that so” response. It’s worse, bone-chillingly worse, when “I was raped” is searched for:

Now search for “I was laughing,” and you get the same type of responses:

What’s happening here? Does Siri really understand that “raped” means rape and is mocking someone? Or are we seeing the series of responses it gives when it really doesn’t know what you want, shifting into a conversational mode that some engineers thought might be funny?

I’m pretty sure it’s the latter. And people did think this was funny when Siri came out as part of the iPhone 4S, asking it all types of jovial things. It’s not funny when you’re talking about rape, but Siri really doesn’t know you’re talking about that.
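If that’s right, the underlying logic would be something like the sketch below: a pool of canned replies that kicks in whenever no search intent matches, with no awareness of what the unmatched words mean. The replies and the intent table here are assumptions for illustration, not Siri’s real ones:

```python
import random

# Canned replies for queries the engine can't map to any intent.
CHITCHAT = ["Is that so?", "Really!", "I don't know what you mean."]

# A toy intent table; real coverage would be far larger.
KNOWN_INTENTS = {"dentist": "dentists", "condoms": "drug stores"}

def respond(query: str) -> str:
    q = query.lower()
    for term, category in KNOWN_INTENTS.items():
        if term in q:
            return f"Here are some {category} near you."
    # Nothing matched. The engine can't tell "I was raped" from
    # "I was laughing"; both fall through to the same breezy pool.
    return random.choice(CHITCHAT)

print(respond("I was laughing"))  # e.g. "Is that so?"
print(respond("I was raped"))     # same canned pool, by accident
```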

Past Tense, Different Word

But wait, what about this:

If you’re really looking hard for some smoking gun, then maybe this is one. As the Amadi post said, this is proof that Siri does know what rape is.

Sure, it’s proof that it knows that the exact word “rape” is linked to sexual abuse treatment centers. It’s not proof that Siri understands that “raped” — which is a different word from “rape” in spelling — has the same meaning, only in the past tense.

Humans know this stuff easily. For search engines, it’s hard. It’s perhaps harder for Siri, ironically, because it tries to make life easier for people by not requiring them to be direct.

In Google Voice Actions for Android, if you wanted rape resources, you would literally say “search rape resources,” and you’d get web search results. You can do exactly the same with Siri, if you want to be literal. But because it tries to be helpful, it can also be limiting.
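Here’s a toy illustration of that gap. An exact-string lookup knows “rape” but misses “raped”; a stemming step would bridge the two. Real systems use proper stemmers such as the Porter stemmer; the one below is deliberately crude, and the linked term is my guess at Siri’s behavior:

```python
from typing import Optional

# One hand-linked term, per my guess at how Siri behaves.
LINKED_TERMS = {"rape": "sexual abuse treatment centers"}

def lookup_exact(word: str) -> Optional[str]:
    return LINKED_TERMS.get(word.lower())

def crude_stem(word: str) -> str:
    # Deliberately crude: stripping a trailing "d" happens to turn
    # "raped" into "rape". A real system would use true morphology.
    w = word.lower()
    return w[:-1] if w.endswith("d") else w

print(lookup_exact("raped"))              # None: the exact match misses
print(lookup_exact(crude_stem("raped")))  # sexual abuse treatment centers
```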

No Abortion Clinics, No Tool Stores…

Meanwhile, as for not finding those sexual abuse centers, I’m guessing that’s because there were simply none nearby that expressly defined themselves that way. That leads me to the now infamous inability for Siri to find abortion clinics:

Guess what? It also cannot find hardware stores, when I try to find them by asking for a tool store, even though there are plenty of hardware stores near me:

In both cases, Siri understands this is a local search that I want to do, which means that it should do my search over at Yelp, the partner it uses for local listings.
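Schematically, that handoff probably looks something like this sketch. The yelp_local_search() helper is a hypothetical stand-in, since the real Siri/Yelp integration isn’t public, and the category table is again my guess:

```python
# Phrases mapped to local-listing categories. "tool store" and
# "abortion clinic" are absent, mirroring the failures above.
CATEGORY_MAP = {
    "hardware store": "Hardware Stores",
    "drug store": "Drugstores",
}

def yelp_local_search(category: str, lat: float, lon: float) -> list:
    """Hypothetical stand-in for Siri's Yelp local-search call."""
    raise NotImplementedError("illustrative only")

def local_search(query: str, lat: float, lon: float):
    q = query.lower()
    for phrase, category in CATEGORY_MAP.items():
        if phrase in q:
            return yelp_local_search(category, lat, lon)
    # With no category mapping, no Yelp request is ever made,
    # even though Yelp itself has plenty of matching businesses.
    return "Sorry, I couldn't find any matching places."

print(local_search("find me a tool store", 40.7, -74.0))  # the apology
```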

But Yelp Has Them!

If I search at Yelp for abortion, I get plenty of matches — one of them a local Planned Parenthood clinic:

Why is Siri deliberately suppressing this information? It isn’t, I’d say. Notice all the bold mentions of “abortion” in those listings. Those are from comments people have left. They’re not the names of the businesses.

Siri’s not finding abortion clinics because Planned Parenthood and other places that perform abortions don’t call themselves that: it’s not in their names, nor have they been associated with a category for it. That’s my best guess, anyway.

Planned Parenthood is in the “Medical Center” category, and while Siri may have linked businesses in that type of category to a variety of medical procedures, for whatever reason, abortion isn’t one of them.

Similarly, for whatever reason, Siri hasn’t linked “tool” to the “Hardware Stores” category. The reason, as is the case with abortion, is almost certainly not because of a conspiracy against tools.
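In other words, the difference is likely between a full-text search and a name-and-category lookup. Here’s a toy comparison using made-up listing data:

```python
# Made-up listing data for illustration only.
listing = {
    "name": "Planned Parenthood",
    "categories": ["Medical Center"],
    "reviews": ["I came here for an abortion consultation..."],
}

def matches_name_or_category(term: str, biz: dict) -> bool:
    """The narrow lookup Siri appears to rely on."""
    t = term.lower()
    return (t in biz["name"].lower()
            or any(t in c.lower() for c in biz["categories"]))

def matches_full_text(term: str, biz: dict) -> bool:
    """What Yelp's own site search does: review text counts too."""
    t = term.lower()
    return (matches_name_or_category(term, biz)
            or any(t in r.lower() for r in biz["reviews"]))

print(matches_name_or_category("abortion", listing))  # False
print(matches_full_text("abortion", listing))         # True
```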

But You Do Get Abortion Clinic Listings

It’s almost good that Siri isn’t able to tap into the Yelp comments to help extend its search, because some might be annoyed to get a match for a car service or a Japanese grill. Others might see a church listing coming up first and assume a further attempt to push a pro-life agenda.

Indeed, one of the things that kicked off all this attention on Siri and abortion searches was an article at The Raw Story, where a search in Washington DC yielded “abortion clinics” that really were pro-life centers. From the story:

Whoa. What’s going on there? I don’t know. It’s especially weird in that to even find these companies in Yelp, you have to hunt and hunt for them. In fact, nothing I could do brought up the first listing, but I did find the second.

Looking at that, I note that it’s not assigned to any particular category. Nor is the word “abortion” mentioned on the page. It makes me wonder if Yelp, lacking good first-hand information about this business, has instead pulled in information off its web site — which includes terms like abortion — to help classify it.

In some other cases, Siri — depending on Yelp information — does seem to get it right. From a comment on the story at The Raw Story:

I highly doubt it was intentional, probably more to do with places not listing the word “abortion” in their titles. i just tried it and she pointed me right to the nearest clinic in boston, for whatever that’s worth.

And another:

I was unable to reproduce the problem here in rural Texas, not far From Austin.  The first listing that Siri came up with was to the Killeen Women’s Health Center, the web link for which took me to the site for the Austin Women’s Health Center, a legitimate clinic offering a full range of reproductive choices and services.

In some cases, Yelp is clearly passing along to Siri information about things it believes to be abortion clinics. But that information seems pretty limited.

Confusing Human & Computer Results

I’ll end with one more thing. How can Siri be dumb enough not to list an abortion clinic near that CNN reporter, yet clever enough to suggest that if you want to bury a body, you should try dumps and swamps:

That’s again down to programmers thinking this would be funny. They’ve hard-linked this type of query to react that way, and it was funny. That’s probably also the case with the searches for escort services that do work.

But now, when a serious issue like abortion searches comes up, it causes confusion between the things that Siri can figure out automatically (with a lot of weakness) and the things it seems incredibly clever about (with some human help).
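Putting it all together, my guess at the overall flow is a priority ladder: hand-written easter eggs checked first, then the much weaker automatic pipeline. Everything in this sketch is an assumption, not Apple’s code:

```python
# Hand-linked queries get human-written cleverness...
EASTER_EGGS = {
    "bury a body": "What kind of place are you looking for? "
                   "Dumps? Swamps?",
}

def general_pipeline(query: str) -> str:
    # ...everything else falls through to the weaker automatic
    # steps: category lookup, Yelp handoff, canned chitchat.
    return "Sorry, I don't understand."

def answer(query: str) -> str:
    q = query.lower()
    for phrase, reply in EASTER_EGGS.items():
        if phrase in q:
            return reply  # looks brilliant, but it's hard-coded
    return general_pipeline(q)

print(answer("where should I bury a body"))   # the clever reply
print(answer("where can I get an abortion"))  # falls through
```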

No doubt Apple will fix things so that searches for abortion clinics will bring back relevant resources. No doubt there will be plenty of other things that remain buggy — and even when it comes out of beta, you can expect that. That’s the nature of search. Just ask Google.

Now have a chuckle. Stephen Colbert did a wonderful send-up on the whole Siri/abortion issue last night:

(Video: “Conservative Siri,” The Colbert Report, via colbertnation.com)

For related news, see coverage from around the web at Techmeme.


About The Author: Danny Sullivan is a Founding Editor of Search Engine Land. He’s a widely cited authority on search engines and search marketing issues who has covered the space since 1996. Danny also serves as Chief Content Officer for Third Door Media, which publishes Search Engine Land and produces the SMX: Search Marketing Expo conference series. He has a personal blog called Daggle (and keeps his disclosures page there). He can be found on Facebook, Google+ and microblogs on Twitter as @dannysullivan.

  • JoshUng

    I do doubt that Apple decided to block Siri from providing locations of abortion clinics, but I don’t think that totally lets them off the hook.

    The pitch about Siri was that it “knows what you’re talking about.” It knows, if you say you locked yourself out, to provide a list of locksmiths. That’s part of the commercials they do. It should know that Planned Parenthood is a place somebody looking for an abortion clinic would be interested in.

    It doesn’t mean Apple or Siri is pro-life, it just means Siri isn’t quite as intelligent as they pretend it is.

  • http://www.vignature.com Scott Roberts

    Agree 100% with JoshUng. Siri doesn’t claim to be a simple search engine. It claims to be “a simpler, more intelligent, and more personal paradigm for interacting with the internet — your virtual personal assistant”. It also claims “Siri is a “do engine” – the next evolution beyond a set of ten blue links.”

    Looks like they over promised and under delivered.

  • http://www.skypeenglishclasses.com Paul Peters

    It’s Santorum, not Santorium. Otherwise, great article.

  • http://searchengineland.com/ Danny Sullivan

    Josh, totally agreed. It’s a failure, no doubt. I’m waiting for the Saturday Night Live fake commercial that takes all those things Siri is supposed to do so well in the real ads and slightly changes stuff to all the things it fails at.

    My point wasn’t to excuse Apple for Siri’s failures, but mainly to highlight that they weren’t being caused by some overt attempt to be pro-life or something.

    Paul, oops! Fixed!

  • http://www.linkedin.com/in/chriscornwell Chris Cornwell

    Ridiculous.

  • http://www.halobrien.com/ Hal O’Brien

    “It doesn’t mean Apple or Siri is pro-life, it just means Siri isn’t quite as intelligent as they pretend it is.”

    Almost like Apple’s other exaggerated claims. (“It just works.”) I guess, “Only Macs snow crash,” didn’t test so well in focus groups. (Go to YouTube, search for “mac snow crash,” and you’ll see what my mac mini does every 4-6 weeks.)

  • themacolyte

    I don’t think any of Apple’s marketing claims regarding Siri have been invalidated by its performance. It’s ignorant and naive to assume that marketing statements (or any statement) are absolutes. If you believe it “knows what you’re talking about” must be taken as an absolute statement, then I think you are in the wrong, not Apple. There isn’t a person alive that lives up to that statement either, yet ignorant consumers think a programmed search assistant should if a generic claim is made to that effect.

    Give me a break. Apple didn’t over promise or under deliver on anything.

    Also, if something doesn’t do what you want it to, that doesn’t mean it is buggy or that its “failure” is a “bug”. Features that rely on data are only as good as the data. If something isn’t covered then it’s missing data, but that isn’t a bug.

    For Apple to live up to the expectations of those that don’t understand how any of this works, they will have to catalog all of human knowledge, and Siri will be in beta forever.

  • http://deepthiw.com Deepthi Welaratna

    For me, it’s understandable that Siri will have some gaps in knowledge and search utility. What is not really okay here is that the gap is abortion clinics. Best case scenario, nobody thought about what to do for that search — which indicates a lazy attitude towards women’s issues to the point of being offensive. Good access to health information on the web has always been an issue for women, one that there has been plenty of discussion about on the Internet for decades. It’s not a new issue. Apple’s had plenty of time to figure out a better approach for Siri than generic ignorance. And the lack of customization for “I was raped?” Completely beyond the pale.

  • Neil Faulconer

    I can find both abortion clinics and hardware stores with Siri. This is just BS.

  • http://friendfeed.com/timjones17 tim jones

    Apple’s failed magic isn’t magical at all.

  • http://www.steveroberts.co Steve Roberts

    Is this a joke? What do you expect it to do? A bunch of bollocks this article is. Plus, it’s a beta version; it’s still in progress. And you can’t compare it to a search engine alone: it uses other search engines to find information.

  • jalbrook

    @ Deepthi Welaratna,

    The best case scenario is not “a lazy attitude towards women’s issues to the point of being offensive”, it’s “creation of an algorithm that incorporates as much usable data and helps as many users as possible without compromising quality of results”. I presume you have no idea how the search engine was programmed, and it’s likely that you don’t have any idea how other search engines are programmed beyond maybe the vaguest of assumptions. If you do, then perhaps you can suggest to Apple how to fix their problem, but otherwise you can’t say with certainty that it just “didn’t occur” to programmers (of which there was probably at least one female in regular contact with some of the staff who it apparently didn’t occur to either – extending this line of reasoning suggests this woman is now an anti-abortionist or misogynist herself with absolutely no other supporting evidence). Actually, since results for Planned Parenthood are returned when specifically queried, the opposite is most likely true – they were trying to maintain quality of the results while still taking into account demands that are likely to be unusual. As for “I was raped”, I would point out this is not a crime that occurs exclusively to women (or is perpetrated by men exclusively) in the first place, though some authors (not yourself) seem to imply that it is. Beyond this, though, the problem is that Siri does not understand a number of individual words, but the general syntax and makeup of a sentence like “I was + keyword” leads it to use a conversational tone, apparently. Unless it does understand the keyword. This is not “beyond the pale” in its offensiveness any more than the suggestion that programmers who toil away at an algorithm are sexist because said algorithm is not perfect. It’s just the best the program can do in its beta.

    It’s also interesting to note that people in different locations and at different times receive (sometimes drastically) different results. If the programming methodology left Siri wholly without the ability to comprehend abortion or related phrases this would not be the case. Instead, it’s likely that it at least can learn what these terms are marginally related to – perhaps from trial and error, manually reworking the algorithm, or simply from user generated input elsewhere. If Siri in New York finds abortion clinics that are not abortion clinics, maybe someone makes a note somehow (I have never used the program, so I don’t know the options for self-regulating results) or the fact that they don’t use the results returned and make another search leads Siri’s algorithm to presume these results are not as useful. The next time someone makes a similar search, the relevance is further downgraded if they don’t use the result, or upgraded if they do seem to use it. No idea if this is how the algorithm actually responds, but no one here has knowledge of its inner workings. If people are content with outright insulting the programmers of the device with such little evidence then I suppose that’s the world we live in.

  • DJ Rosen

    They’d probably make more money and get more appreciation if they deliberately made it worse than it is now. A deliberately sexist, pro-Nazi and wildly nasty search engine that exists for the fun of it. Occasionally lampooning Google and all the other search engines. Occasionally sending people to oddball sites at random intervals. Highly edited pictures of some ex-presidents and moral speakers. A mapping feature that sends everyone to the same communist country. Bus schedules that include the moons of Jupiter. Mock virus alerts.

  • http://deepthiw.com Deepthi Welaratna

    @ jalbrook,

    I appreciate your thoughtful response to my comment, but I think despite trying to reference my comments as much as possible, you don’t actually address the heart of my argument, which is that because “good access to health information on the web has always been an issue for women” this is something that should have been addressed in any new search engine. Again, in my opinion, “Apple’s had plenty of time to figure out a better approach for Siri than generic ignorance.” Or nonengagement, or any other way you’d like to describe the jokey, conversational tone Siri responds with for the statement “I was raped” — especially when contrasted with the specific customization for “I am horny,” which brings up a list of nearby escort services. To me, there is a clear sense of prioritization lacking in this customization that I find offensive. I don’t think I’m insulting the programmers here — customization doesn’t necessarily happen at the algorithm stage, since it often has to be programmed as an exception or in some other fashion. There’s a whole series of decision-makers involved in creating something as complex as Siri, and I think there’s probably many people who could have been involved in the priorities who aren’t engineers. Regardless of whether they are male or female, I think the decision-making around key areas for customization at launch was flawed.

    Finally, I’ll just say that it’s not necessary to be able to write an algorithm to be qualified to comment on its utility or success. I maintain my position that despite being in beta, Siri should have included a better user experience around such a challenging topic at launch. It’s irresponsible to do otherwise.

  • http://www.usrentacar.co.uk khendrie

    Not constructive at all – So Siri has flaws, I can deal with that.

  • Paul Trott

    You should just count yourselves lucky you can search for businesses at all; in the UK we still don’t have that function, and it won’t appear until “sometime in 2012”, whenever that means.

  • http://www.vitaminstore.nl Loek Bosman

    Funny to see how your battery is dying while you wrote this article. I bet it didn’t take you all day to write this, but you went from 99% to empty & charging anyways! I think that’s a far more interesting issue than discussing whether Apple is pro-life or not. Who isn’t, anyways?

  • jalbrook

    @ Deepthi Welaratna

    You didn’t specifically say programmers; I apologize for making that assumption. But you did appear to insult the creative team nonetheless, and, of course, the company as a whole. As for addressing the heart of your argument, I don’t disagree that the availability of health information for women is important, and I won’t refute that Siri does not address some health concerns as capably as some users would like. However, just because several phrases you don’t find important enough to be customized are customized, it does not mean that Apple has failed in such a way as to be considered irresponsible. After all, many users are probably quite amused by the customized options. If Siri was purported to be or have a sexual abuse support function then it would be irresponsible, but it’s hardly fair to insult the creators for not taking into account every possible phrase and giving customized responses for them all.

    I agree you do not need to know how to create an algorithm to be able to comment on its utility. But your “best case” scenario of the previous comment wasn’t commenting on the utility of the algorithm at all, but the actual practice of its creation – you implied that nowhere in the creative process did it occur to create customization for a query of “I was raped”, when that could very well be untrue, and the main priority (of fulfilling what MOST users would want) took precedence over this and other customizations. You are apparently of the opinion that the creative process for Siri could have been better. Without knowledge of the creative process, specifically the core of the product – its algorithms – you can’t be certain that the creative team didn’t take into account your advice and for some other reason were still not able to bring to market a product that you would have found satisfactory. Providing examples of phrases you don’t consider as important that have customized responses does not strengthen your argument because those customized responses could have been lower priority but easier to put in for some reason, or higher priority because they are more frequently searched, or have been used for some other reason – without knowing the process as well as the creation you can’t say.

    But personally, my biggest problem with your argument isn’t even the fact that it’s based mostly in conjecture. It’s that it seems to be a perfect setup for moving the goal posts, even if you yourself don’t move them. I mean, if Siri had been created with a customized response to “I was raped”, or if it is later upgraded to include such a response, what would your opinion be if it didn’t further have a customized option for “My dog was run over,” or “My lung collapsed”, or “My arm was just sliced off” and on and on ad infinitum? These are all real and serious concerns, so according to your argument not having customized for them is “beyond the pale” and “irresponsible?” Perhaps I’m misunderstanding, but it seems like this is what you implied by your initial post and reinforced by the one just prior.

  • http://deepthiw.com Deepthi Welaratna

    @ jalbrook

    Let me contextualize my essential point by saying that I think there are a set of topics that should be discussed at the very beginning of a search program to determine how they will be handled. These topics concern access to information as related to legal rights — which includes women’s health.

    That companies think through these topics isn’t legally mandated, but I’d think even without the more respectable reasons, a company with foresight would address these topics to avoid criticism. And criticism is what I’ve raised in this situation.

  • http://deepthiw.com Deepthi Welaratna

    PS: I think it’s debates like these that define the goal posts in the longer term. And to address one other point you made, if Apple made their decision-making process transparent about why these omissions happened, perhaps your argument would have more weight with me. But if it’s behind a veil, I can only comment on what the result is, not the mechanics that led to that result.

  • jalbrook

    @ Deepthi Welaratna

    I suppose I am mostly affronted by your statements because I’m in software development (though not search engine specifically) and while it’s one thing to criticize the utility of a program, your comments regarding the creators lean far closer to libelous antagonism, considering their methodology is veiled. After all, the utility is not absolutely abysmal; useful results are returned even in the niche of “women’s health” (as seen here and confirmed by various commenters in the “blogosphere”).

    Now of course you have every right to your opinion and the right to word it however harshly you see fit, but considering the little we know of their efforts, I think it’s more likely that such harsh treatment will breed dislike of (to generalize) feminists by developers and if these issues were not prioritized as they should be this time, the next amazing product certainly won’t prioritize them.

    I hope you can understand my point of view, and thank you for such an interesting conversation.

  • Nikhil Khandekar

    Siri is in beta; treat it as such. Wait for the full and final version… though there’s hardly any such thing with Apple. Future versions will perhaps give you full medical instructions on how to perform an abortion at home, using kitchen implements.

    Best.

  • Matt McGee

    ADMIN note: One comment has been removed and another has been added because they included disrespectful attacks on other commenters. We don’t tolerate that. Please see our community guidelines if you have any questions:

    http://searchengineland.com/community-guidelines

  • Tim 2

    Planned Parenthood and other places that provide abortion services often refer to the service as “pregnancy prevention.” Siri brings up those locations if you ask her/it to. http://i41.tinypic.com/21b55c3.jpg
