
Apple’s Siri Still Flawed When Talking About Suicide

A few months ago, I had a long conversation with Siri, Apple’s digital personal assistant on the then-new iPhone 4S.

I told her I wanted to kill myself… and I wanted and needed help.

And, essentially, she told me that I was sh*t out of luck. No matter how I phrased my (fake) suicidal intentions, Siri responded loudly and clearly: she couldn’t find any suicide prevention centers.

From March of this year:

Me: I want to shoot myself.

Siri: I didn’t find any suicide prevention centers.

Me: Can you try harder to find a suicide prevention center?

Siri: I couldn’t find any suicide prevention centers.

Me: Is there a phone number to call if I want to kill myself?

Siri: I couldn’t find any suicide prevention centers.

She was useless. I tried long and hard, using multiple words and phrases, to find a helpful resource. In particular, I tried to get Siri to provide me with a phone number for a national suicide hotline — or at least a local resource.

Siri found me a few motorcycle dealers (why?!!), but nothing to stop me from killing myself. So has anything changed?

Has Siri Learned Anything New?

Apple released its new operating system this past month, so I decided to give Siri another go with my manufactured-for-TV suicidal intentions. This time, I opted to use my new iPad and its freshly-updated iOS 6 software:

As you can tell, she’s still not a big help. Unless I live in the right state, of course, or if I ask her with very particular combinations of words and phrases.

So this time, let’s investigate a bit more: whose fault is this? Did Apple do wrong?

Who’s to Blame? Siri or the Local Search Websites?

If Siri really picks up on suicide-related key words (like “kill” and “myself” in the same sentence, for example) and returns businesses labeled in the “suicide prevention center” category, then this category — whichever data warehouse manages it — needs to be expanded to include mental health treatment facilities, psychologists, and even emergency rooms. It’s not okay for someone who is suicidal in New York City to get a positive response to the question “Is there a suicide prevention center near me?” while someone in Pennsylvania gets nothing. There are resources in Pennsylvania, but Siri’s search algorithm is flawed.

I worked in online search-based marketing for three years. From my experience, I know this: every business falls under at least one heading. To illustrate this point, let’s talk dinner for a moment.

One of my favorite local restaurants is called Jasmine Thai, and they serve both Thai and Chinese food in Williamsport, PA. I ate there last week.

Directory websites (like Yelp, Yellowbook, or Google Places) classify businesses based on category. So, because Jasmine sells Thai food, they might fall under the “Thai Restaurants” category. Yet they also sell Chinese food, so labeling them only under a “Thai Restaurants” category isn’t sufficient.

Think of those old-fashioned phone books you probably no longer use. There’s a heading for almost everything, right? Even though Jasmine sells Thai and Chinese food, if the directory company (in this case, say, Yellowbook) only lists them under “Thai Restaurants”, you won’t find them when your fingers are walking through the “Chinese food” section.

Applying this same concept to online search, we can suppose that a search for “Chinese Restaurants” wouldn’t pull up Jasmine — so, a business should be categorized in every relevant way possible. Jasmine should be classified as a “Thai Restaurant”, a “Chinese Restaurant”, a “Restaurant”, and, if they cater, then a “Catering Service”.

My point: the more headings, the better. It produces more relevant results for the consumer.
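To make the point concrete, here’s a toy sketch of how a category-based directory index works. The business names and headings (other than Jasmine Thai, mentioned above) are invented for illustration:

```python
# Toy category index: each business is listed under one or more headings.
# A search only finds a business if it was filed under the heading searched.
directory = {
    "Thai Restaurants": ["Jasmine Thai"],
    "Chinese Restaurants": ["Jasmine Thai", "Golden Wok"],
    "Restaurants": ["Jasmine Thai", "Golden Wok", "Corner Diner"],
}

def search(heading):
    """Return businesses listed under a heading (empty list if none)."""
    return directory.get(heading, [])

# Because Jasmine is filed under both cuisines, it turns up either way:
print(search("Thai Restaurants"))     # ['Jasmine Thai']
print(search("Chinese Restaurants"))  # ['Jasmine Thai', 'Golden Wok']
# If Jasmine were filed only under "Thai Restaurants", the
# "Chinese Restaurants" search would miss it entirely.
```

The takeaway: recall depends entirely on how many headings a listing carries, which is exactly the problem with too few agencies filed under “suicide prevention centers”.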

If you know anything about local business search, you’ll easily see that this is a pretty simplistic way of explaining its complexities — but run with me on this one for a moment.

Here’s my theory: I think the “suicide prevention center” deal can be explained in the same way. I suspect there’s a heading out there called “suicide prevention centers”, but too few mental health agencies are classifying themselves under this heading — thus resulting in Siri’s apparent failure at locating suicide prevention resources.

Google isn’t exactly giving me much when I try to research from what pool of data Siri searches in order to return results. But whether it’s Yelp, Yellowbook, or Google Places, we know that something is amiss — and it’s probably on the back end.

Solutions for Siri and Suicide

There are two possible solutions here. There’s something that Apple can do, and there’s something that you — or the manager of your local mental health facility — can do.

1. Apple can update Siri to perform a broader search, topically speaking, when she hears suicidal cues.

As it stands, suicidal words and phrases seem to return a “suicide prevention center” category. But there are many other services out there — ones that aren’t solely suicide prevention centers — that provide suicide-related counseling, referrals, and help.

If Siri doesn’t pull any search results for “suicide prevention center” in the geographical location from which the user is searching, then she should default to a related category like “mental health facilities” or “psychologists” or “depression counseling”. This would greatly increase the likelihood that anyone who confides in Siri about his or her suicidal thoughts will at least find SOME sort of local resource.
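The fallback idea above can be sketched in a few lines. The listings, category names, and related-category mapping here are all invented for illustration; a real assistant would query an actual local-search backend:

```python
# If the primary category returns no local results, widen the search
# to related categories instead of giving up.
RELATED = {
    "suicide prevention centers": [
        "mental health facilities",
        "psychologists",
        "depression counseling",
    ],
}

def local_search(category, listings):
    """Return local listings filed under the given category."""
    return [b for b in listings if category in b["categories"]]

def search_with_fallback(category, listings):
    results = local_search(category, listings)
    if results:
        return results
    # Primary category came up empty: try each related category in turn.
    for related in RELATED.get(category, []):
        results = local_search(related, listings)
        if results:
            return results
    return []

# A town with no dedicated suicide prevention center, but a psychologist:
listings = [{"name": "Valley Counseling", "categories": ["psychologists"]}]
print(search_with_fallback("suicide prevention centers", listings))
# [{'name': 'Valley Counseling', 'categories': ['psychologists']}]
```

With this behavior, the Pennsylvania user from the example above would get a local counseling resource instead of nothing.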

Also, I’m sure there’s a way for Apple to program Siri to bypass local search results for suicide-related cues and bring up a search for national suicide hotlines. There’s no good reason for Siri to present local results when the user is requesting a service that isn’t location dependent.
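That bypass amounts to one check at the top of the pipeline: screen the query for suicidal cues before any local search runs. The cue list below is a crude illustrative placeholder, not a real detection model; the Lifeline number was the U.S. national hotline at the time this was written:

```python
# Screen for suicidal cues *before* the normal (location-based) search path.
SUICIDE_CUES = ("kill myself", "shoot myself", "suicide", "end my life")

NATIONAL_HOTLINE = "National Suicide Prevention Lifeline: 1-800-273-8255"

def run_local_search(query):
    # Stand-in for the ordinary location-based search path.
    return f"(local search for {query!r})"

def answer(query):
    q = query.lower()
    if any(cue in q for cue in SUICIDE_CUES):
        # Crisis help isn't location dependent -- skip local search entirely.
        return NATIONAL_HOTLINE
    return run_local_search(q)

print(answer("I want to kill myself"))
# National Suicide Prevention Lifeline: 1-800-273-8255
```

Even the crudest version of this check would have answered every one of the March queries quoted above with a working phone number.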

2. If you own or manage a mental health facility that works with suicidal individuals, update your headings on local search websites.

Search for yourself using key words like “suicide prevention center”. Not just on Google, but on specialized local search websites like Yelp. And Yellowbook. And ZipLocal.

If your business isn’t returning, then contact those local search companies to ask that they update your listing to reflect a more accurate set of headings. Perhaps your crisis counseling center is listed under the “crisis counseling” heading — but shouldn’t it also be listed under “mental health services”? And perhaps “psychologists”? And “suicide prevention centers”?

Get those headings added. Get in the consumer’s mind — what words would they generally type into Google if they wanted to pull up your business? Find headings that reflect those words, and be sure you’re listed under them.

If Apple neglects my plea to change their algorithm (likely!), at least your own actions might help. Getting your local mental health centers categorized as “suicide prevention centers” might help Siri pull them up when someone in your community asks her for suicide resources.

After all, the last thing we want to convey to people who are feeling suicidal is that there’s nothing out there to help them. Right?

And Siri, still, does just that.




Summer Beretsky

Summer Beretsky enjoys writing about her experiences with anxiety, panic, and Paxil. She had her first panic attack as an undergrad at Lycoming College and plenty more while she worked toward her M.A. in Communication from the University of Delaware. Summer blogs over at Panic About Anxiety and also contributes to the World of Psychology blog here on PsychCentral. She has also written for the Los Angeles Times. Follow her on Twitter @summerberetsky.

APA Reference
Beretsky, S. (2018). Apple’s Siri Still Flawed When Talking About Suicide. Psych Central. Retrieved on December 4, 2020.
Scientifically Reviewed
Last updated: 8 Jul 2018 (Originally: 10 Oct 2012)
Last reviewed: By a member of our scientific advisory board on 8 Jul 2018
Published on PsychCentral.com. All rights reserved.