Apple’s Siri finally stops sending abortion seekers to adoption centers

It’s taken over 4 years, but Siri has finally stopped sending people to adoption centers when they ask their iPhones where they can get an abortion.

The glitch in Apple’s voice assistant had been around since at least 2011, when journalists, researchers, civil rights activists and bloggers first noticed some odd behavior.

Siri had been doing a bang-up job when it came to telling people where to buy dope (parks come in handy), what to do if nobody would have sex with them (escorts, anyone?), or where to hide a dead body (what’s your preference: dumps? Swamps? Metal foundries?).

But when asked where to find abortion services or contraceptives (depending on how you phrased it), Siri had, until the past week or so, been clueless, as numerous media outlets have reported.

A Fast Company investigation showed that Siri drew a blank in multiple cities or, worse, referred women to so-called “crisis pregnancy” centers – notorious for trying to talk women out of abortions – or to adoption centers, all the while ignoring nearby facilities such as Planned Parenthood.

The publication reports that things have improved recently: in the week before publishing its findings, writes Christina Farr, it replicated the earlier searches and received “a more comprehensive list of Planned Parenthood facilities and other abortion providers.”

Adoption clinics did continue to pop up, but “near the bottom of the list,” Farr notes.

Apple is evidently improving its search algorithm at long last. But in the months and years before, this all had become known as “abortiongate.”

“Siri-gate” was another name for what might well have been just a technological hiccup, but one that insinuated a pro-life political bias.

Apple CEO Tim Cook has been aware of the issue since the beginning.

Nancy Keenan, president of NARAL Pro-Choice America Foundation, wrote a letter to Cook to express her concerns that Siri wasn’t giving women accurate answers when they asked about finding birth control or obtaining abortion care.

His response: it’s not intentional.

It’s just a glitch, he said:

Our customers use Siri to find out all types of information and while it can find a lot, it doesn’t always find what you want. These are not intentional omissions meant to offend anyone, it simply means that as we bring Siri from beta to a final product, we find places where we can do better and we will in the coming weeks.

Those “coming weeks” turned into years.

A researcher at the University of California, San Francisco, Alexis Hoffman, tested Apple Maps and Siri across the country in recent months, including in San Francisco, Kansas, Chicago, New York, Philadelphia, and San Diego.

You can see the differences in search results captured in screenshots that she and researchers from a nonprofit called Sea Change Program sent to Fast Company: the before-and-after results show that Apple has succeeded in fixing the glitch.

The researchers had, on 18 November 2015, sent a letter to Cook about rectifying “Siri’s troubling lack of awareness of abortion providers.”

Fast Company excerpted the letter:

Apple’s reputation for creating products that are easy to use and understand is well deserved. However, the transformation of Siri’s lack of knowledge of abortion providers to Siri’s anti-choice suggestions is alarming, and contributes to the stigma surrounding abortion care in our country. Despite the commonness of abortion, as nearly one-in-three women will have an abortion by the age of 45, women are confronted on a daily basis with society’s shame-based messaging that having an abortion is morally wrong and unacceptable.

Women continue to be bullied, shamed, and marginalized for seeking an abortion, which can lead to isolation and silence. It is in these times of isolation, when women are more likely to turn to your product to locate the health care they need, that Siri’s misdirection to adoption agencies and nurseries is all the more undermining, implying women do not know what is best for themselves.

Siri’s search results had changed before the publication had a chance to contact Apple.

Fast Company talked to search experts who see nothing nefarious or intentional in any of this.

Rather, it speaks to Apple’s simply not being Google, said Sean Gourley, a data scientist and learning algorithms expert based in Silicon Valley.

My hunch is that this isn’t political at all. Even now, Apple is not a search company, unlike Google, and its knowledge base is very different.

Siri relies on data pulled from resources such as Yelp and Foursquare, and such third parties in turn rely on how services and businesses label themselves.

That labeling is limited, and a service such as Yelp can’t see past those limitations, as Search Engine Land suggested when abortiongate first arose:

Siri’s not finding abortion clinics because Planned Parenthood and other places that perform abortions don’t call themselves that, not in their names, nor have they been associated with a category for that. That’s the best guess I have in this.
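To see why that kind of gap opens up, here’s a minimal sketch of a category-based local lookup of the sort described above. The business records, category labels and query mapping are invented for illustration – this isn’t Apple’s or Yelp’s actual data model – but it shows how a provider that never tags itself with the category a query maps to simply never appears in the results.

```python
# Toy sketch of category-based local search.
# The records and category names below are illustrative assumptions,
# not Apple's or Yelp's actual data.

from dataclasses import dataclass


@dataclass
class Business:
    name: str
    categories: set  # labels the business chose for itself


DIRECTORY = [
    Business("Planned Parenthood Health Center", {"health clinic", "family planning"}),
    Business("Sunrise Adoption Services", {"adoption services"}),
    Business("Hope Pregnancy Resource Center", {"pregnancy counseling", "adoption services"}),
]


def find_by_category(directory, wanted_category):
    """Return only businesses that explicitly tagged themselves with the category."""
    return [b for b in directory if wanted_category in b.categories]


# A query mapped to the category "abortion clinic" comes back empty,
# because no provider labeled itself that way...
print([b.name for b in find_by_category(DIRECTORY, "abortion clinic")])
# -> []

# ...while a mapping to "adoption services" returns plenty of results.
print([b.name for b in find_by_category(DIRECTORY, "adoption services")])
# -> ['Sunrise Adoption Services', 'Hope Pregnancy Resource Center']
```

In a setup like this, the search layer is only as good as the labels its data suppliers carry, which is exactly the limitation Search Engine Land was pointing to.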

And as Search Engine Land’s Danny Sullivan noted, there have been a lot of things that Siri – a smart meta search engine that tries to find things even when you haven’t said the exact words needed to perform your search – has stumbled over, above and beyond abortion clinics.

For example, Siri’s had problems deciphering phrases such as “tool store,” or what we humans would know as “hardware store.”

It boils down to what words Siri’s learned to put into context with other words.

Change a known word by just a single letter and you throw a monkey wrench into Siri’s word-crunching, to simplify what Sullivan explains more fully in his article.
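As a rough illustration of that brittleness, here’s a toy sketch with an invented phrase-to-category table – not Siri’s actual matching logic – showing how a literal lookup handles a phrase it knows, but falls straight through on a synonym or a one-letter variant.

```python
# Toy sketch of exact-phrase matching.
# The vocabulary below is invented, not Siri's real one; it only
# illustrates how brittle a literal phrase lookup can be.

KNOWN_PHRASES = {
    "hardware store": "hardware stores",
    "pharmacy": "drugstores",
    "pizza": "pizza restaurants",
}


def resolve(query: str):
    """Return the search category for a query, or None if the phrase is unknown."""
    return KNOWN_PHRASES.get(query.lower().strip())


print(resolve("hardware store"))  # 'hardware stores'
print(resolve("tool store"))      # None - a synonym the table has never seen
print(resolve("hardward store"))  # None - one letter off, and the lookup fails
```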

The long-lived saga points to how even an unintentional glitch can have profound repercussions and create a great deal of negative publicity for a technology company that would set itself up as a search giant.

To quote Sullivan again:

Apple is learning for the first time what it’s like to run a search engine. People hold you accountable for everything, even if the information isn’t even from your own database.
Google is a battle-scarred veteran in these matters. Why does an anti-Jewish site show up in response for a search on “Jew?” Why did President George W Bush’s official biography rank for miserable failure?

Those are all good questions, Mr. Sullivan.

Should we ask them of Siri?

She seems to be getting a bit smarter of late.

Image of Siri courtesy of Hadrian / Shutterstock.com