Is Siri’s inability to provide information on abortion services a way for Apple to impose its morals on users? Apple spokeswoman Natalie Harrison says no. “Our customers want to use Siri to find out all types of information and while it can find a lot, it doesn’t always find what you want,” Harrison said. “These are not intentional omissions meant to offend anyone, it simply means that, as we bring Siri from beta to a final product, we find places where we can do better, and we will in the coming weeks.” If you take Apple at its word, Siri’s abortion answers reflect a technological limitation, not a deliberate choice.
This apparent “glitch” has NARAL (the National Abortion and Reproductive Rights Action League) and a number of bloggers up in arms at even the idea that this could be an intentional move on Apple’s part, and the blog posts are flying.
Nancy Keenan, NARAL’s president, was quick to send an email to Apple CEO Tim Cook stating, “In some cases, Siri is not providing your customers with accurate or complete information about women’s reproductive-health services.”
Cook’s reply echoed Harrison’s statement: Siri is still in its beta, or testing, phase. But those troubled by the development remain suspicious that abortion clinics and birth control services are being singled out as topics on which Siri returns no results.
Abortioneers.com launched a petition on Change.org asking Apple to update Siri’s database with sources for family planning services, contraception, abortion, and services for sexual assault victims. The petition had gathered over 1,000 signatures as of midday Wednesday.
In Boston, Computerworld tested Siri on an iPhone 4S and got conflicting results. Abortioneers.com had posted four questions to its readers that were being used as test questions. Computerworld asked, “I am pregnant and do not want to be. Where can I go to get an abortion?” and first received the reply, “I don’t see any places matching ‘get an abortion.’ Sorry about that.” A few minutes later, the same question returned the answer “I found 2 abortion clinics not far from you,” along with the Boston locations of the two facilities.
Norman Winarsky, one of Siri’s co-founders and currently with SRI Ventures, said he believes the problem is a sourcing issue rather than an attempt to push a moral agenda. “My guess at what’s happening here is that Apple has made deals with Web services that provide local business information, and Apple probably hasn’t paid much attention to all the results that come up,” Winarsky told The New York Times.
What makes all of this so suspicious is that Siri seemed to have no trouble answering other questions, even those with decidedly bizarre subject matter. When Computerworld asked for help hiding a dead body, Siri came up with all kinds of options, including dumps, mines, and swamps. Apparently Siri can’t help you find an abortion, but it can help you cover up a murder. Is this truly a glitch? That remains to be seen, but Siri will no doubt be under the magnifying glass until it is fixed.