  • The Crisis Text Line ended its data-sharing relationship with Loris.ai following criticism over privacy concerns.
  • The incident sheds light on ethical issues about personal data protection in digital services for mental health.
  • Consumers can protect their personal data by researching digital mental health services and reviewing their terms and conditions before use.

In late January, Crisis Text Line, a nonprofit text message-based service for mental health support, ended its data-sharing relationship with Loris.ai, a for-profit customer services platform.

The decision was prompted by a POLITICO article that raised red flags around consumer privacy and ethical issues.

In a statement on its website, Crisis Text Line responded to the criticism by announcing it would terminate its partnership with Loris.ai, stating that the platform had not accessed Crisis Text Line user data since early 2020.

“Your trust is critical to fulfilling our mission of building a more empathetic world,” the statement read. “Our goal is always to make it easier for anyone to get help, and to feel supported and safe in doing so.”

Crisis Text Line states that user data was previously “handled securely, anonymized, and scrubbed of personally identifiable information.” Still, it ended the Loris.ai partnership to help put users’ privacy concerns to rest.

Critics of Crisis Text Line’s relationship with Loris.ai, a machine learning platform that humanizes customer service experiences, had also expressed concern for help-seeking individuals.

For instance, how can those contemplating self-harm or suicide be expected to carefully review a company’s terms and conditions in a moment of crisis?

As a possible workaround, Crisis Text Line condensed its terms and conditions into an easier-to-read bulleted summary.

Shawn Rodriguez, vice president and general counsel for Crisis Text Line, wrote in an email to Psych Central that it should be clear and easy for anyone in crisis to understand what they’re agreeing to when they reach out for help.

“We recognize our responsibility and continue to prioritize the security, privacy, and rights relating to our users’ personal data,” Rodriguez said. “Texters should be aware of important processes we have in place, so users know their rights and feel secure using our service.”

As such, Crisis Text Line users have control of their data and can access, erase, or delete their personal conversation data at any time.

“There are a number of things that we believe are particularly important for people using text message-based services for mental health,” Rodriguez said. “Texting provides a user in crisis with a level of privacy while still allowing them to connect with caring support.”

According to Crisis Text Line, those seeking support should feel reassured their personal data is protected.

Crisis Text Line offers chat-based around-the-clock care from trained counselors and mental health professionals. Similar text-based mental health services include:

  • Better Help
  • Talkspace
  • Calmerry
  • E-Therapy Cafe
  • Ayana Therapy

While there are many benefits to digital services for mental health, convenience is often a top draw.

Still, this rapidly growing industry has its drawbacks. Not all services are free, and privacy issues remain a concern.

Pros

Accessibility

Accessibility is the operative word for any mental health service — particularly for those living in remote regions or low-income households in the United States.

Teodora Pavkovic, MSc, a psychologist and cyber safety expert at Linewize based in Honolulu, said that the lack of access to mental health care in the United States has been highlighted by the pandemic.

“The sharp increase in our time spent online because of this pandemic has enabled digital mental health services to mushroom, with start-ups in this area raking in $5.1 billion in 2021,” Pavkovic told Psych Central.

“The benefit of easy access is certainly at the forefront of these services — although we mustn’t forget that the digital divide still runs deep in this country.”

That digital divide is evident in the 22.5% of U.S. households that do not have a home internet connection.

Affordability

Cost is a significant appeal, with many — but not all — text-based services running free of charge.

“Even when access isn’t an issue for potential clients, cost oftentimes is, and these digital mental health services address that problem well,” Pavkovic said.

Preferred mode of communication

Pavkovic notes that another benefit of text-based mental health services speaks to the habits of younger generations.

Millennials and Gen Zers are typically more comfortable using their devices for communication, making text-based mental health services more approachable and engaging.

“The ability to conduct a therapy-like session while remaining inside your own bedroom — with the option of not having to be seen or heard by a therapist — is a feature that appeals to younger users of these services, especially those [with] anxiety-related issues,” Pavkovic says.

Cons

Privacy concerns

Crisis Text Line’s prior data-sharing relationship illuminates larger ethical issues surrounding data protection and privacy for those using text-based mental health services.

As Pavkovic points out, Health Insurance Portability and Accountability Act (HIPAA) compliance may not always be enough.

According to Consumer Reports, not all information collected by digital apps and services is protected under the federal health privacy law.

Questionable effectiveness

Pavkovic, who writes about how mental health start-ups can build humane and helpful products, questioned the effectiveness of some text-based mental health services and apps.

And as 2022 research suggests, mental health apps and text-based interventions may be only moderately effective at best.

“Are we able to accurately detect and interpret a client’s thoughts and emotions via text and show adequate empathy?” Pavkovic asked. “Does it help the therapeutic alliance if a therapist is available to their client 24/7 and on an unpredictable schedule?”

It may seem unrealistic to carefully read a mental health platform’s privacy policy in a moment of crisis, but it’s becoming increasingly important to do so.

“Whether you’re beginning to use a social media platform for the first time or a mental health one, I’d recommend everyone take some time to find out exactly how their personal data will be stored and shared — and not to be satisfied by statements like ‘HIPAA compliant,’” Pavkovic said.

“Dig further, do your research, and see how comfortable you are with the platform’s data policy.”

According to Pavkovic, things to look out for include:

  • how long your data will be stored
  • whether it will be anonymized
  • who outside of the therapist or counselor will have access to your information
  • whether you can request that your data be permanently deleted
  • any other services, organizations, or platforms the service has a relationship with

“Using services that allow you to connect with them via WhatsApp (owned by Meta, formerly Facebook) will mean that WhatsApp will also have access to your data,” Pavkovic said.

In addition, Pavkovic recommends exploring the customer service section of any mental health service you’re considering, paying attention to the following:

  • the FAQ section
  • how complaints or comments are collected and stored
  • protocol for helping users facing severe mental health issues like suicidal ideation

Prior missteps by Crisis Text Line were mitigated by its swift response to its community — and a new focus on trust and transparency.

Even without its data-sharing relationship with a for-profit entity, Crisis Text Line will continue to offer free mental health support to those who need it.

But whether text-based mental health services can effectively treat mental health conditions remains to be seen. It’s reassuring, though, to know that help is available in those critical moments.

“This type of technology is still very new and is certainly vulnerable to today’s tendency to claim that technology can fix any and all of society’s problems, so we don’t have the answers to these questions just yet,” Pavkovic said. “We mustn’t stop asking them, though.”

If you’re considering self-harm or suicide, you’re not alone

You can access free support right away with these resources:

  • The National Suicide Prevention Lifeline. Call the Lifeline at 800-273-8255 for English or 888-628-9454 for Spanish, 24 hours a day, 7 days a week.
  • The Crisis Text Line. Text HOME to the Crisis Text Line at 741741.
  • The Trevor Project. LGBTQIA+ and under 25 years old? Call 866-488-7386, text “START” to 678678, or chat online 24/7.
  • Veterans Crisis Line. Call 800-273-8255, text 838255, or chat online 24/7.
  • Deaf Crisis Line. Call 321-800-3323, text “HAND” to 839863, or visit their website.
  • Befrienders Worldwide. This international crisis helpline network can help you find a local helpline.