• JoshUng

    I do doubt that Apple decided to block Siri from providing locations of abortion clinics, but I don’t think that totally lets them off the hook.

    The pitch about Siri was that it “knows what you’re talking about.” It knows that if you say you locked yourself out, it should provide a list of locksmiths. That’s part of the commercials they run. It should know that Planned Parenthood is a place somebody looking for an abortion clinic would be interested in.

    It doesn’t mean Apple or Siri is pro-life, it just means Siri isn’t quite as intelligent as they pretend it is.

  • http://www.vignature.com Scott Roberts

    Agree 100% with JoshUng. Siri doesn’t claim to be a simple search engine. It claims to be “a simpler, more intelligent, and more personal paradigm for interacting with the internet — your virtual personal assistant”. It also claims: “Siri is a ‘do engine’ – the next evolution beyond a set of ten blue links.”

    Looks like they over promised and under delivered.

  • http://www.skypeenglishclasses.com Paul Peters

    It’s Santorum, not Santorium. Otherwise, great article.

  • http://searchengineland.com/ Danny Sullivan

    Josh, totally agreed. It’s a failure, no doubt. I’m waiting for the Saturday Night Live fake commercial that takes all those things Siri is supposed to do so well in the real ads and slightly changes them to all the things it fails at.

    My point wasn’t to excuse Apple for Siri’s failures; rather, it was mainly to highlight that they weren’t being caused by some overt attempt to be pro-life or something.

    Paul, oops! Fixed!

  • http://www.halobrien.com/ Hal O’Brien

    “It doesn’t mean Apple or Siri is pro-life, it just means Siri isn’t quite as intelligent as they pretend it is.”

    Almost like Apple’s other exaggerated claims. (“It just works.”) I guess, “Only Macs snow crash,” didn’t test so well in focus groups. (Go to YouTube, search for “mac snow crash,” and you’ll see what my mac mini does every 4-6 weeks.)

  • themacolyte

    I don’t think any of Apple’s marketing claims regarding Siri have been invalidated by its performance. It’s ignorant and naive to assume that marketing statements (or any statements) are absolutes. If you believe “knows what you’re talking about” must be taken as an absolute statement, then I think you are in the wrong, not Apple. There isn’t a person alive who lives up to that statement either, yet ignorant consumers think a programmed search assistant should if a generic claim is made to that effect.

    Give me a break. Apple didn’t over promise or under deliver on anything.

    Also, if something doesn’t do what you want it to, that doesn’t mean it is buggy or that its “failure” is a “bug”. Features that rely on data are only as good as the data. If something isn’t covered, then it’s missing data, but that isn’t a bug.

    For Apple to live up to the expectations of those who don’t understand how any of this works, they will have to catalog all of human knowledge, and Siri will be in beta forever.

  • http://deepthiw.com Deepthi Welaratna

    For me, it’s understandable that Siri will have some gaps in knowledge and search utility. What is not really okay here is that the gap is abortion clinics. Best case scenario, nobody thought about what to do for that search — which indicates a lazy attitude towards women’s issues to the point of being offensive. Good access to health information on the web has always been an issue for women, one that there has been plenty of discussion about on the Internet for decades. It’s not a new issue. Apple’s had plenty of time to figure out a better approach for Siri than generic ignorance. And the lack of customization for “I was raped”? Completely beyond the pale.

  • Neil Faulconer

    I can find both abortion clinics and hardware stores with Siri. This is just BS.

  • http://friendfeed.com/timjones17 tim jones

    Apple’s failed magic isn’t magical at all.

  • http://www.steveroberts.co Steve Roberts

    Is this a joke? What do you expect it to do? This article is a bunch of bollocks. Plus, it’s a beta version; it’s still in progress. And you can’t compare it to a search engine alone. It uses other search engines to find information.

  • jalbrook

    @ Deepthi Welaratna,

    The best case scenario is not “a lazy attitude towards women’s issues to the point of being offensive”, it’s “creation of an algorithm that incorporates as much usable data and helps as many users as possible without compromising quality of results”. I presume you have no idea how the search engine was programmed, and it’s likely that you don’t have any idea how other search engines are programmed beyond maybe the vaguest of assumptions. If you do, then perhaps you can suggest to Apple how to fix their problem, but otherwise you can’t say with certainty that it just “didn’t occur” to the programmers (of whom there was probably at least one woman, in regular contact with some of the staff who it apparently didn’t occur to either; extending this line of reasoning suggests this woman is now an anti-abortionist or misogynist herself, with absolutely no other supporting evidence). Actually, since results for Planned Parenthood are returned when specifically queried, the opposite is most likely true: they were trying to maintain the quality of the results while still taking into account demands that are likely to be unusual. As for “I was raped”, I would point out this is not a crime that occurs exclusively to women (or is perpetrated exclusively by men) in the first place, though some authors (not yourself) seem to imply that it is. Beyond this, though, the problem is that Siri does not understand a number of individual words, but the general syntax and makeup of a sentence like “I was + keyword” apparently leads it to use a conversational tone, unless it does understand the keyword. This is not “beyond the pale” in its offensiveness any more than the suggestion that programmers who toil away at an algorithm are sexist because said algorithm is not perfect. It’s just the best the program can do in its beta.

    It’s also interesting to note that people in different locations and at different times receive (sometimes drastically) different results. If the programming methodology left Siri wholly without the ability to comprehend abortion or related phrases, this would not be the case. Instead, it’s likely that it can at least learn what these terms are marginally related to — perhaps from trial and error, manual reworking of the algorithm, or simply from user-generated input elsewhere. If Siri in New York finds abortion clinics that are not abortion clinics, maybe someone makes a note somehow (I have never used the program, so I don’t know the options for self-regulating results), or the fact that they don’t use the results returned and make another search leads Siri’s algorithm to presume these results are not as useful. The next time someone makes a similar search, the relevance is further downgraded if they don’t use the result, or upgraded if they do seem to use it. I have no idea if this is how the algorithm actually responds, but no one here has knowledge of its inner workings. If people are content with outright insulting the programmers of the device with such little evidence, then I suppose that’s the world we live in.

  • DJ Rosen

    They’d probably make more money and get more appreciation if they deliberately made it worse than what it is now: a deliberately sexist, pro-Nazi and wildly nasty search engine that exists for the fun of it. Occasionally lampooning Google and all the other search engines. Occasionally sending people to oddball sites at random intervals. Highly edited pictures of some ex-presidents and moral speakers. A mapping feature that sends everyone to the same communist country. Bus schedules that include the moons of Jupiter. Mock virus alerts.

  • http://deepthiw.com Deepthi Welaratna

    @ jalbrook,

    I appreciate your thoughtful response to my comment, but I think despite trying to reference my comments as much as possible, you don’t actually address the heart of my argument, which is that because “good access to health information on the web has always been an issue for women,” this is something that should have been addressed in any new search engine. Again, in my opinion, “Apple’s had plenty of time to figure out a better approach for Siri than generic ignorance.” Or non-engagement, or any other way you’d like to describe the jokey, conversational tone Siri responds with for the statement “I was raped” — especially when contrasted with the specific customization for “I am horny,” which brings up a list of nearby escort services. To me, there is a clear sense of prioritization lacking in this customization that I find offensive. I don’t think I’m insulting the programmers here — customization doesn’t necessarily happen at the algorithm stage, since it often has to be programmed as an exception or in some other fashion. There’s a whole series of decision-makers involved in creating something as complex as Siri, and I think there are probably many people who could have been involved in setting the priorities who aren’t engineers. Regardless of whether they are male or female, I think the decision-making around key areas for customization at launch was flawed.

    Finally, I’ll just say that it’s not necessary to be able to write an algorithm to be qualified to comment on its utility or success. I maintain my position that despite being in beta, Siri should have included a better user experience around such a challenging topic at launch. It’s irresponsible to do otherwise.

  • http://www.usrentacar.co.uk khendrie

    Not constructive at all – So Siri has flaws, I can deal with that.

  • Paul Trott

    You should just count yourselves lucky you can search for businesses at all. In the UK we still don’t have that function, and it won’t appear until “sometime in 2012”, whatever that means.

  • http://www.vitaminstore.nl Loek Bosman

    Funny to see how your battery is dying while you wrote this article. I bet it didn’t take you all day to write this, but you went from 99% to “empty & charging” anyway! I think that’s a far more interesting issue than discussing whether Apple is pro-life or not. Who isn’t, anyway?

  • jalbrook

    @ Deepthi Welaratna

    You didn’t specifically say programmers, I apologize for making that assumption. But you did appear to insult the creative team nonetheless, and, of course, the company as a whole. As for addressing the heart of your argument, I don’t disagree that the availability of health information for women is important, and I won’t refute that Siri does not address some health concerns as capably as some users would like. However, just because several phrases you don’t find important enough to be customized are customized, it does not mean that Apple has failed in such a way as to be considered irresponsible. After all, many users are probably quite amused by the customized options. If Siri was purported to be or have a sexual abuse support function, then it would be irresponsible, but it’s hardly fair to insult the creators for not taking into account every possible phrase and giving customized responses for them all.

    I agree you do not need to know how to create an algorithm to be able to comment on its utility. But your “best case” scenario of the previous comment wasn’t commenting on the utility of the algorithm at all, but the actual practice of its creation — you implied that nowhere in the creative process did it occur to anyone to create a customization for a query of “I was raped”, when that could very well be untrue, and the main priority (of fulfilling what MOST users would want) took precedence over this and other customizations. You are apparently of the opinion that the creative process for Siri could have been better. Without knowledge of the creative process, specifically the core of the product — its algorithms — you can’t be certain that the creative team didn’t take into account your advice and for some other reason was still not able to bring to market a product that you would have found satisfactory. Providing examples of phrases you don’t consider as important that have customized responses does not strengthen your argument, because those customized responses could have been lower priority but easier to put in for some reason, or higher priority because they are more frequently searched, or could have been used for some other reason — without knowing the process as well as the creation, you can’t say.

    But personally, my biggest problem with your argument isn’t even the fact that it’s based mostly in conjecture. It’s that it seems to be a perfect setup for moving the goal posts, even if you yourself don’t move them. I mean, if Siri had been created with a customized response to “I was raped”, or if it is later upgraded to include such a response, what would your opinion be if it didn’t further have a customized option for “My dog was run over,” or “My lung collapsed,” or “My arm was just sliced off,” and on and on ad infinitum? These are all real and serious concerns, so according to your argument, is not having customized responses for them “beyond the pale” and “irresponsible”? Perhaps I’m misunderstanding, but it seems like this is what you implied by your initial post and reinforced by the one just prior.

  • http://deepthiw.com Deepthi Welaratna

    @ jalbrook

    Let me contextualize my essential point by saying that I think there are a set of topics that should be discussed at the very beginning of a search program to determine how they will be handled. These topics concern access to information as related to legal rights — which includes women’s health.

    That companies think through these topics isn’t legally mandated, but I’d think even without the more respectable reasons, a company with foresight would address these topics to avoid criticism. And criticism is what I’ve raised in this situation.

  • http://deepthiw.com Deepthi Welaratna

    PS: I think it is debates like these that define the goal posts in the longer term. And to address one other point you made: if Apple made their decision-making process transparent for why these omissions happened, perhaps your argument would have more weight with me. But if it’s behind a veil, I can only comment on what the result is, not the mechanics that led to that result.

  • jalbrook

    @ Deepthi Welaratna

    I suppose I am mostly affronted by your statements because I’m in software development (though not search engine specifically) and while it’s one thing to criticize the utility of a program, your comments regarding the creators lean far closer to libelous antagonism, considering their methodology is veiled. After all, the utility is not absolutely abysmal; useful results are returned even in the niche of “women’s health” (as seen here and confirmed by various commenters in the “blogosphere”).

    Now of course you have every right to your opinion and the right to word it however harshly you see fit, but considering the little we know of their efforts, I think it’s more likely that such harsh treatment will breed dislike of (to generalize) feminists by developers, and if these issues were not prioritized as they should have been this time, the next amazing product certainly won’t prioritize them.

    I hope you can understand my point of view, and thank you for such an interesting conversation.

  • Nikhil Khandekar

    Siri is in beta, treat it as such. Wait for the full and final version…though there’s hardly any such thing with Apple. Future versions will perhaps give you full medical instructions on how to perform an abortion at home, using kitchen implements.


  • Matt McGee

    ADMIN note: One comment has been removed and another has been edited because they included disrespectful attacks on other commenters. We don’t tolerate that. Please see our community guidelines if you have any questions:


  • Tim 2

    Planned Parenthood and other places that provide abortion services often refer to the service as “pregnancy prevention.” Siri brings up those locations if you ask her/it to. http://i41.tinypic.com/21b55c3.jpg