The dramatic growth of mental health apps has created a risky industry

WHEN CAROLINA ESCUDERO was severely depressed, seeing a therapist in person proved difficult. So she joined BetterHelp, a popular therapy app. She paid $65 a week but spent most of her time waiting for her assigned counsellor to respond. She got two replies in a month. “It was like texting an acquaintance who has no idea how to deal with mental illness,” she says. BetterHelp says its service does not claim to operate around the clock, that all its therapists have advanced degrees and “thousands of hours of hands-on clinical work”, and that users can easily switch therapists if scheduling proves difficult.


It has seldom been more urgent to help people cope with mental-health problems. The incidence of depression and anxiety soared during the pandemic — by more than 25% globally in 2020, according to The Lancet, a medical journal. That, combined with the broader shift to online services, has produced a boom in mental-health apps. The American Psychological Association estimates that 10,000 to 20,000 are available for download. But there is growing evidence that privacy risks to users are being ignored. Nor is anyone checking whether the apps actually work.

Mental-health tech companies raised nearly $2 billion in 2020, according to CB Insights, a data company. Their products address issues ranging from everyday stress to severe bipolar disorder. Telemedicine apps such as BetterHelp and Talkspace connect users with licensed therapists. Subscription-based meditation apps such as Headspace are also common. In October Headspace bought Ginger, a therapy app, in a deal valued at $3 billion. As large companies pay more attention to the mental health of their employees, some apps are partnering with them to serve entire workforces. One such app, Lyra, supports 2.2 million employees worldwide and is valued at $4.6 billion.

Beneath the surface, however, trouble lurks in some corners of the industry. In October 2020 hackers who had raided Vastaamo, a popular Finnish startup, began blackmailing some of its users. Vastaamo had asked therapists to back up patient notes online, but reportedly failed to anonymize or encrypt them. The hackers threatened to publish details of extramarital affairs and, in some cases, thoughts of paedophilia on the dark web, and reportedly demanded ransom payments in Bitcoin from around 30,000 patients. Vastaamo has filed for bankruptcy, but the episode has deterred many Finns from sharing personal information with doctors, says Joni Siikavirta, a lawyer representing the company’s patients.

Other cases may yet emerge. There are no universal standards for storing “emotional data”. John Torous of Harvard Medical School, who has reviewed 650 mental-health apps, describes their privacy policies as disastrous. Some share information with advertisers. “After I signed up for BetterHelp, I saw targeted ads using words I had typed in the app to describe my personal experiences,” says one user. BetterHelp says it shares only device IDs associated with “generic event names” with marketing partners, only for measurement and optimization, and only with users’ consent. It says no private information, such as conversations with therapists, is passed on.

As for effectiveness, the apps’ methods are notoriously difficult to assess. Woebot, for example, is a chatbot that uses artificial intelligence to reproduce the experience of cognitive behavioural therapy. The product is marketed as clinically validated, based in part on a scientific study which concluded that humans can form meaningful bonds with bots. But that study was written by people with financial ties to Woebot. Of the ten peer-reviewed studies published so far, Woebot says, eight had a lead investigator with no financial ties to the company. Any co-authors with financial connections are disclosed, it adds.

Mental-health apps are meant to be used alongside clinical care, not in place of it. With that in mind, the European Commission is reviewing the field. It is preparing to promote a new standard for all health apps, which would grade their safety, usability and data security on a letter scale. Liz Ashall-Payne, founder of ORCHA, a British startup that has reviewed thousands of apps, including for the National Health Service, says 68% failed to meet the company’s quality criteria. Time to go back to the couch?


This article appeared in the business section of the print edition under the heading “Psyber Boom”
