Since Apple has come out and said that Siri's abortion non-answers are a glitch, not a conspiracy, it's time to dig into the technical explanations for the seemingly political bot. Siri's behavior seemed particularly odd because of its inconsistencies: sometimes she would flat-out refuse to answer questions, other times she would defer to Google, and in the Washington, D.C. abortion clinic case she would refer the asker to centers that don't perform abortions. Some cried conspiracy, but others have pointed to programming reasons for Siri's behavior.

Siri Doesn't Do Exact Word Searches 

Siri is programmed to look up things related to certain terms "even though you might not have said the exact words needed to perform your search," explains Search Engine Land's Danny Sullivan. So when someone asks for condoms, for example, Siri understands that the request relates to drugstores. But when she doesn't make that connection, she won't search at all, which explains why, when we asked for Plan B, she said she "couldn't do it." Presumably, Siri never made the connection that Plan B (a brand-name product) had anything to do with drugstores.
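To make that failure mode concrete, here is a minimal Python sketch of what a term-to-category lookup might look like. Everything in it (the dictionary, the function names, the categories) is invented for illustration and is not Apple's actual code:

```python
# Hypothetical sketch: an assistant maps related terms onto a local-search
# category rather than matching exact words. None of this is Apple's code.

TERM_CATEGORIES = {
    "condoms": "drugstore",
    "aspirin": "drugstore",
    "burrito": "restaurant",
    # "plan b" is absent: the brand name was never linked to any category
}

def resolve_category(query):
    """Return the search category a term maps to, or None if unknown."""
    return TERM_CATEGORIES.get(query.lower().strip())

def handle_request(query):
    category = resolve_category(query)
    if category is None:
        # No mapping means no search is attempted at all
        return "Sorry, I couldn't do it."
    return f"Searching nearby for: {category}"

print(handle_request("condoms"))  # Searching nearby for: drugstore
print(handle_request("Plan B"))   # Sorry, I couldn't do it.
```

On this (assumed) design, the assistant doesn't fail gracefully into a web search; a missing dictionary entry simply dead-ends the request, which would match the behavior we saw.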

Siri Was Programmed for Comedic Effect 

Beyond not giving straight answers about abortion, Siri at times gives flippant answers to serious questions. For example, when asked about rape, Siri would answer "Is that so?" or "Really!" As Sullivan points out, the programmers crafted Siri with a sense of humor to make her more human-like: when the assistant can't figure out what you want, she shifts into a "conversational mode" stocked with responses some engineers thought might be funny. This also explains why Siri can find escort services but not abortion clinics. The programmers thought the escort answer would be funny; they probably didn't predict that rape inquiries would trigger the same jokey fallback.
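A rough Python sketch of what such a fallback might look like: if no intent is recognized, a canned quip substitutes for a real answer. The parser, the replies, and the intents here are all hypothetical:

```python
import random

# Hypothetical sketch of a "conversational mode" fallback; nothing here
# comes from Apple's implementation.
CANNED_REPLIES = ["Is that so?", "Really!", "Interesting."]

KNOWN_INTENTS = {"weather": "get_weather", "timer": "set_timer"}

def parse_intent(query):
    """Toy intent parser: returns an intent name, or None on failure."""
    for keyword, intent in KNOWN_INTENTS.items():
        if keyword in query.lower():
            return intent
    return None

def respond(query):
    intent = parse_intent(query)
    if intent is None:
        # The quip fires no matter how serious the unparsed question was.
        return random.choice(CANNED_REPLIES)
    return f"Running intent: {intent}"

print(respond("what's the weather"))  # Running intent: get_weather
print(respond("I need help"))         # e.g. "Is that so?"
```

The design flaw isn't the jokes themselves; it's that the fallback has no sense of which unrecognized questions are too serious for banter.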

Siri Gets Its Info From Yelp

As Apple explains in its Siri FAQ, Siri gets at least some of its information from Yelp, Wolfram Alpha, and Wikipedia. As a "meta search engine," Siri outsources the work to other search engines, like Yelp. This explains why Siri might not find any relevant centers in New York City, which has plenty of places to get the procedure. While a Yelp search draws many results for abortion clinics, none of them define themselves as such, notes Sullivan. Rather, the places come up because commenters have used the term "abortion" in their reviews. The D.C. case, which pulled up centers that explicitly would not give abortions, isn't quite explained by this. But Sullivan has a best guess: "It makes me wonder if Yelp, lacking good first-hand information about this business, has instead pulled information in off its web site, which includes terms like abortion, to help classify it."
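Here is a toy Python sketch of both failure modes Sullivan describes. The key assumption (his guess, not a confirmed fact) is that the engine matches on a listing's category labels and, lacking those, falls back to terms scraped from the business's website:

```python
# Hypothetical sketch of misclassification in a meta-search layer.
# Data and logic are invented; this is Sullivan's guess, not Yelp's code.

def category_labels(business):
    """Use a listing's own category labels; if it has none, fall back
    to words scraped from its website text."""
    labels = set(business.get("categories", []))
    if not labels:
        labels = set(business.get("site_text", "").lower().split())
    return labels

clinic = {
    "name": "NYC clinic",
    "categories": ["medical center"],  # never self-labels as "abortion"
}
crisis_center = {
    "name": "D.C. crisis pregnancy center",
    "categories": [],
    "site_text": "We counsel women considering abortion",
}

for biz in (clinic, crisis_center):
    hit = "abortion" in category_labels(biz)
    print(biz["name"], "matches:", hit)
# NYC clinic matches: False                    (relevant, but missed)
# D.C. crisis pregnancy center matches: True   (matched, but won't do it)
```

Under that assumption, the same fallback that misses real clinics in New York would surface the wrong centers in D.C.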

It's SEO Gamesmanship

There have been similar instances of search engines gone awry, like Google's search engine pulling up anti-Semitic results when one looked up the term "Jew." As Google explained, this happened because of search engine optimization (SEO) around the term, which led Google's algorithms to rank these derogatory sites higher. Developer Al Sweigart suggests that whatever engine Siri relies on has likewise ranked these crisis pregnancy centers higher than truly relevant results, like Planned Parenthood.
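To see how keyword stuffing can game a naive ranker, consider this toy Python example. The scoring function and page text are invented, and real engines use far more sophisticated signals, but the mechanism is the same: whoever repeats the term most wins the ranking.

```python
# Hypothetical sketch of SEO gamesmanship against a naive ranker that
# scores relevance by raw term frequency. All data here is invented.

def naive_relevance(page_text, term):
    """Count how often the query term appears on the page."""
    return page_text.lower().count(term.lower())

pages = {
    "Planned Parenthood": "Health services including abortion care.",
    # A page stuffed with the keyword, despite not offering the service:
    "Crisis Center": ("abortion " * 25)
                     + "We do not provide or refer for this procedure.",
}

ranked = sorted(pages, reverse=True,
                key=lambda name: naive_relevance(pages[name], "abortion"))
print(ranked)  # ['Crisis Center', 'Planned Parenthood']
```

If some layer in Siri's chain of outsourced engines scores anything like this, Sweigart's explanation would account for the crisis centers outranking the places that actually perform the procedure.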