Apple's personal assistant Siri has entered the spotlight over its apparent bias in what it will search for and direct users to.
The voice-activated iOS 5 application gained some recent notoriety after people discovered that it could not find anything when they asked it to direct them to an abortion clinic, despite the fact that abortion is legal in the US and such medical offices exist in most US cities.
The bias doesn't stop there, however, as Siri has no problem finding crisis pregnancy centres, which are often run by churches and right-wing groups that are clearly opposed to abortion.
What is interesting is that Siri appears to recognise the words "abortion clinic": it will say it cannot find any even when you ask where you can go to get an abortion without mentioning the word "clinic" at all. This suggests the problem lies not in how the question is phrased, but in how Siri is programmed to answer any variation of it.
If users search for "Planned Parenthood", however, Siri will find some locations. This is a less controversial way of asking about abortion services, as the Planned Parenthood Federation of America lobbies for abortion rights as well as providing a range of other sexual and reproductive health services. However, the fact that the word "abortion" elicits no location assistance from Siri suggests that Apple might be censoring it in some way.
If you thought that Siri might just not be intelligent enough to answer most questions, it's worth checking out some of the things that it will find. According to a report by The Young Turks Network, it will help find marijuana providers and prostitutes.
Siri will also help you find a place to hide a dead body, offering such idyllic spots as dumps, swamps, mines, reservoirs and metal foundries. And if you want to jump off a bridge, Siri will tell you the location of the nearest bridge.
Yet it won't tell you where to find the nearest abortion clinic.
Apple has since issued a statement saying that the omission was not intentional.
"Our customers want to use Siri to find out all types of information, and while it can find a lot, it doesn't always find what you want," said Natalie Kerris, a spokesperson for Apple, according to the BBC. "These are not intentional omissions meant to offend anyone, it simply means that as we bring Siri from beta to a final product, we find places where we can do better and we will in the coming weeks."