CCHR Warns of Mental Health Apps’ Human Rights Violations and Surveillance Alarm


The mental health watchdog documents a trail of civil and human rights violations affecting consumers, including college students urged to use mental health apps that help fuel a $35 billion psychotropic drug industry

By CCHR International
The Mental Health Industry Watchdog
November 18, 2022

College students have been the target of marketing urging them to use mental health apps, without realizing that their personal information can be hawked to social media outlets and used for data mining. The apps also usher in a new age of surveillance. Patients who receive health care services are protected by the Health Insurance Portability and Accountability Act of 1996 (HIPAA), which prohibits disclosure of sensitive patient health information without the patient’s consent or knowledge.[1] That is not the case with mental health apps, which can serve as a conduit for getting users onto psychotropic drugs (some apps prescribe them directly) that can have serious adverse effects. Former nurses at one psychiatric prescription drug app said they feared it was fueling a new addiction crisis by making the stimulant Adderall and other amphetamines so easy to obtain.[2]

There are approximately 10,000 to 20,000 mental health apps available, according to the American Psychological Association.[3] They make up the largest share of the “disease-specific” app market segment. They range from apps offering exercise programs for better mental health to those providing assessment, diagnosis, and treatment, including prescriptions for powerful and potentially debilitating drugs. The global mental health apps market was expected to grow from $4.71 billion in 2021 to $5.54 billion in 2022.[4] These apps, then, can further fuel the nearly $26 billion a year in psychotropic drug sales in the U.S.[5]

Roger Severino, a former director of the Department of Health and Human Services’ Office for Civil Rights, the federal agency charged with enforcing health privacy rules, pointed out that HIPAA protections do not apply to health apps:

“Tell a psychologist, ‘I’m depressed,’ and HIPAA restricts how that information can be used. But type those same words into an app that has no connection to a covered entity [doctor, insurance company, etc., which is liable under HIPAA], and HIPAA doesn’t protect you.”[6]

Mental Health Apps Cause Civil Rights Violations

TAO Connect is just one of the dozens of mental health apps permeating college campuses in recent years. As The Washington Post reported, “When data from a mental health app is shared or sold to other parties, a wealth of information can be used for purposes beyond the health needs of students. Insurers can use it to calculate premiums, employers can use it to assess risk, advertisers can use it to tailor ads to consumer preferences or conditions, and all can exploit students’ weaknesses.”[7]

The Post found many apps fail to fulfill their promises. One app asked for access to contacts and the phone’s microphone so it could call a loved one if the user hadn’t left his or her room for an extended period. Researchers left the app running for a month. It harvested all the information for marketing purposes but never once called a loved one.[8]

The Talkspace app links users to a licensed therapist.[9] It faces a securities fraud lawsuit, accused of misleading investors before it went public in 2021 by misrepresenting its financials and growth, although the company claims there is no merit to the allegations.[10] 

In June 2022, a group of three U.S. senators—Elizabeth Warren (D-MA), Cory Booker (D-NJ), and Ron Wyden (D-OR)—wrote letters to executives of Talkspace and another mental health app (BetterHelp) demanding details on how they are protecting the data privacy of their patients and how patients’ data is shared with third parties, including data brokers, online advertisers, and social media companies. The senators are concerned with how mental health apps can “access and use highly confidential personal and medical information.”[11]

Former employees and therapists at Talkspace told The New York Times that conversations meant to be anonymous between medical professionals and their clients were regularly reviewed by the company, which mined them for information. Because the text conversations are considered medical records, users are unable to delete the transcripts.[12]

The Psychiatric Drug Push


Such apps can become virtual pharmacies and psychiatric drugs can be prescribed that are potentially addictive, such as the potent stimulant Adderall (mixed amphetamine salts), or antidepressants that can have suicidal side effects, especially for college-aged students.[13]

The Cerebral mental health/therapy app also matches consumers virtually with a therapist and is targeted toward younger users.[14] 

The San Francisco online-prescribing company is being investigated by the Drug Enforcement Administration and the U.S. Justice Department for possible violations of the Controlled Substances Act for its prescribing of Adderall. The company told news organizations it has not been accused of violating the law but earlier this year, it paused prescribing Adderall and other controlled drugs for attention deficit/hyperactivity disorder (ADHD).[15]

The service is marketed as a one-stop shop for individuals to get diagnoses, counseling, and prescriptions. However, one of Cerebral’s former executives, Matthew Truebe, is suing the company for over-prescribing ADHD drugs.[16] 

A Bloomberg investigation of Cerebral found evidence of harmful “overtreatment.” For example, a Cerebral user alleged she was prescribed three antidepressants, an anticonvulsant, and an antipsychotic over the course of three months. One of these medications was prescribed after just an 18-minute video visit.

Another consumer told CBS News that in her first roughly 15-minute appointment with a Cerebral prescriber she was prescribed three drugs and in a second, equally brief appointment, was given two more drugs. She worsened and alerted her prescriber about nightmares she was having about hanging herself but said she was brushed off. The next day, a family member found her hanging from a dog leash in her bathroom. She does not remember anything from the incident and thought she had been “dreaming.”[17]

Cerebral clinicians prescribe stimulants to users with no face-to-face evaluation and, allegedly, are expected by the company to provide a prescription by the end of a 30-minute call. ADHD drugs such as Adderall are highly addictive, and it was alleged that Cerebral used this fact to improve customer retention.[18]

Truebe, Cerebral’s former Vice President of Product and Engineering, alleged that Cerebral planned to increase customer retention by prescribing stimulants to 100% of its ADHD patients. According to the complaint filed on April 27, 2022, CEO Kyle Robertson asked Cerebral employees to track the retention rates of ADHD patients who were being prescribed stimulants by Cerebral versus those who were not. Robertson then asked employees to find ways to prescribe stimulants to more ADHD patients to increase retention.[19] 

The app also allegedly takes users’ credit card information before showing them the therapists they will be matched with. Some consumers were charged by the company and then matched only with therapists or counselors who did not meet the criteria they needed. When asking for a refund, users had to call and email multiple times, only to be offered a 30% refund. Users also report difficulty canceling their memberships, as they allegedly must fill out a form and email customer service at a certain time of day just to start the cancellation process.[20]

The Washington Journal of Law notes that many apps brand themselves as “wellness apps” instead of health apps to avoid having to seek Food and Drug Administration (FDA) approval. The article points out that such apps may undergo little rigorous clinical testing. A meta-review of 145 studies of mental health apps found little evidence of efficacy for most of them. One app was found to have actively caused clinical harm.[21]

Another Avenue for Promoting Chemical Imbalance Fraud

With little, if any, oversight of the mental health app industry, what patients are told makes the field even riskier. Healthline published a “quick look at the best depression apps” advising readers of which ones to consider. In the same article, it tells prospective consumers, “Some of the common causes of depression are family history, hormone or chemical imbalance, trauma, and substance use.”[22]

A groundbreaking study published in Molecular Psychiatry in July 2022 thoroughly debunked the chemical imbalance theory. Scientists at University College London reviewed 17 major studies published over several decades and found no convincing evidence to support the theory. Professor Joanna Moncrieff and colleagues said, “The serotonin theory of depression has been one of the most influential and extensively researched biological theories of the origins of depression. Our study shows that this view is not supported by scientific evidence. It also calls into question the basis for the use of antidepressants.”[23]

Jumping Out of the Frying Pan into the Fire

The issue is not whether mental health apps are as “effective” as in-office visits with psychiatrists or psychologists, since both are based on ineffective “science.” Choosing either can be akin to jumping out of the frying pan and into the fire.

Many smartphone apps connect to sensors in wearable technologies. Passive data are obtained automatically through sensors, either on the smartphone or via a wearable device, ranging from simple device-use metrics to global positioning system (GPS) location, and now even voice tone (via microphone) or facial expression (via camera) data.

It goes further, with sensors used to enforce compliance with psychiatric drug regimens. For example, Abilify MyCite is an antipsychotic pill (aripiprazole) with an embedded sensor that signals a wearable patch when ingested; the patch then transmits data to a smartphone app to confirm the patient has taken his or her “medication.”[24] To use the MyCite system, the patient must download and use the MyCite app on a smartphone.[25]

In 2020, psychologist Lisa Cosgrove and others published a paper, “Digital Phenotyping and Digital Psychotropic Drugs: Mental Health Surveillance Tools That Threaten Human Rights,” in the journal Health and Human Rights. According to the researchers:

“There are currently no clinical trial data to show that the sensor can either consistently track real-time ingestion or increase medication adherence. In fact, on the company’s website, the following statement is made: ‘There may be a delay in the detection of the Abilify MyCite tablet and sometimes the detection of the tablet might not happen at all.’”[26]

Further, “It is noteworthy that in 2014, aripiprazole was the best-selling drug in the United States, costing, on average, over US$800 for a month’s supply and generating over US$7.5 billion in sales from October 2013 through September 2014. After the patent expired in the United States, sales revenues dropped by almost US$7 billion in 2015, which is when [pharmaceutical manufacturers] Otsuka and Proteus first submitted an application [to the FDA] for market approval of the digital version. The generic oral version of aripiprazole costs approximately US$20 per month, while Abilify MyCite costs almost US$1,700 for a month’s supply.”[27]

Cosgrove and co-researchers concluded that:

“The advent of digital psychotropic drugs marks a new age in surveillance and poses risks to privacy and human rights, possibly in ways yet unimagined.”[28]

While manufacturers claim that patients consent to use Abilify before taking the pill and its tracking device,[29] it can easily be abused as enforced treatment, which United Nations human rights bodies say constitutes torture. For patients with court-ordered treatment or other involvement in the courts, “consent” takes on a different meaning if it means obtaining freedom, child custody, or a lighter sentence.[30] They may agree to take such a “medication” if it relieves other restrictions placed on them.

Concerns about institutionalization and other coercive practices were a major focus of child psychiatrist Dainius Pūras during his six-year tenure as the United Nations Special Rapporteur on the right to health. He emphasized the urgent need to abandon outdated practices in mental health care, including medicalization, coercion, and institutionalization.[31]

These digital technologies, therefore, “promote practices that violate the right to freedom, including freedom from coercive or degrading treatment. For example, if patients are incentivized to take the digital version of a psychotropic drug (such as by being offered outpatient treatment as an alternative to compulsory inpatient treatment, or as a condition of parole), the line between incentivizing and coercion becomes blurred.”[32]

Apps Built on No Science, No Cures

What is so disturbing is that mental health apps are based on entirely subjective data about behavior and emotions. It is so subjective and arbitrarily determined that some experts can make scaremongering claims such as that “roughly half of the American population is affected by mental health issues.”[33]

Applying this arbitrary, scientifically unsupported idea that nearly 50% of users are mentally ill creates a potentially huge captive market for psychiatric treatment.

The consumers being profited from are not informed that the “disorders” they may be diagnosed with, which are listed in the Diagnostic & Statistical Manual of Mental Disorders (DSM), cannot be detected and confirmed by any physical or neurobiological test, brain scan, or X-ray, the way medical diseases can.

Dr. Darshak Sanghavi, a clinical fellow at Harvard Medical School, stated:

“There is no blood test or brain scan for major depression. No geneticist can diagnose schizophrenia.”[34]

“All diagnosis is done only from symptoms that are made up by groups of psychiatrists. Neither the DSM nor psychiatry itself have any validity,” warned Keith Hoeller, Ph.D.[35]

Nor can a consumer expect any cure. The website of the pharmaceutical front group Mental Health America states, “There’s no cure for mental illness….”[36] Even the Mayo Clinic admits “…psychiatric medications don’t cure mental illness….”[37] WebMD also reports “Drugs cannot cure mental illnesses.”[38]

As the Cosgrove paper points out:

“The lack of biomarkers, or objective measurements, to determine mental disorders has plagued psychiatry and resulted in concerns about the validity of psychiatric disorders.”

Consequently, psychiatrists and neuroscientists are turning their attention to digital phenotyping (tracking a set of observable characteristics or traits of an organism), promoted as an objective way to measure—and supposedly predict—traits, behavior, and mood. Digital phenotyping technology uses sensors that can track an individual’s behavior, location, and speech patterns (e.g., intonation).[39]

The standard and validity of psychiatric diagnoses were already shockingly poor. Phenotyping now represents a whole new level of pseudoscience.

Cosgrove and co-researchers point out that by analyzing human-computer interaction (for example, the use of a smartphone), the measurement focus is not on content (what you type) but on how you type. These interactions—the patterns and timings of user activity on touch-screen devices—are aggregated and analyzed using machine learning. The results of these interactions are referred to as digital “biomarkers.” Scrolling, clicking, tapping, and other touch-screen behaviors are analyzed with machine learning to predict cognition and mood.

“The data being gathered and analyzed by tech giants through nontransparent surveillance can now be used to categorize people as ‘at risk’ of committing crimes, including benefit fraud.”[40]

Mental health apps are based on hypotheses about behaviors and “disorders” arrived at by opinion, not science.

It is a lucrative business. The mental health app market is estimated to reach $17.46 billion by 2032.[41] Venture capital firms invested more than $2.4 billion in digital behavioral health apps in 2020—more than twice the amount invested in 2019.[42] 

Therapist pay is also an incentive. The average annual pay for someone providing online mental health therapy in Los Angeles ranges between $65,164 and $80,207.[43] Therapists make up to $10,770 a month, nearly three times the monthly income of the average American full-time worker.[44]

The average starting salary for psychiatrists increased by 7% from an average of $279,000 in 2021 to $299,000 in 2022.[45] Psychiatrists have the highest annual mean (average) wage—$250,000—out of all mental health jobs, according to new data from the U.S. Bureau of Labor Statistics. Clinical and counseling psychologists, the second highest earners, make just shy of $100,000 per year.[46]

The Way Forward

Consumers must be better informed of the risks when using a mental health app. The operative word here is “health,” as in free from illness or injury—complete well-being. Unfortunately, psychiatric treatment not only doesn’t treat anything “physical,” but it can also harm and create ill health.

The United Nations Human Rights Council recommends that the focus of mental health systems and services be widened beyond the biomedical model to a holistic approach that considers all aspects of a person’s life, and that forced mental health institutionalization and treatment be prohibited and replaced with rights-based mental health services in the community that respect individual decision-making.[47]

For further information, read: The Brave New World of Artificial Intelligence in Mental Health

CCHR’s Mental Health Declaration of Human Rights, written in 1969, also includes many of the rights that college students and mental health treatment consumers should be aware of.


[2] “How mental health apps can accelerate the psychiatric prescribing cascade,” Lown Institute, 18 Mar. 2022.

[6] Thomas Germain, “Mental Health Apps Aren’t All As Private As You May Think,” Consumer Reports, 2 Mar. 2021.

[7] Deanna Paul, “Colleges want freshmen to use mental health apps. But are they risking students’ privacy?” Washington Post, 2 Jan. 2020.

[8] Ibid.

[10] Rebecca Torence, “JPM 2022: Talkspace faces securities fraud class-action suit as consumer revenue declines,” 13 Jan. 2022.

[11] Abraham Jewett, “Talk therapy apps under fire over data privacy concerns,” Top Class Actions, 28 June 2022.

[12] Matthew Roza, “Therapy app Talkspace allegedly data-mined patients’ private conversations with therapists,” 10 Aug. 2020.

[13] Harris Meyer, “How much should you trust BetterHelp, Talkspace, Cerebral and other mental health start-ups touted by celebrities?” Sharp Brains.

[14] Smitha Gundavajhala, “Cerebral Exposes Cracks in Mental Health Apps,” Washington Journal of Law, 12 May 2022.

[15] Op. cit., Sharp Brains.

[16] Op. cit., Washington Journal of Law.

[17] Anna Werner, “Expert alarmed by mental health app Cerebral’s speedy sessions and prescriber qualifications,” CBS News, 7 Sept. 2022.

[18] Op. cit., Washington Journal of Law.

[19] Ibid.

[21] Op. cit., Washington Journal of Law.

[23] Joanna Moncrieff, Ruth E. Cooper, Tom Stockmann, Simone Amendola, Michael P. Hengartner and Mark A. Horowitz, “The serotonin theory of depression: a systematic umbrella review of the evidence,” Molecular Psychiatry, 20 July 2022; “Depression is probably not caused by a chemical imbalance in the brain – new study,” The Conversation, 21 July 2022.

[26] Lisa Cosgrove, Ph.D., et al., “Digital Phenotyping and Digital Psychotropic Drugs: Mental Health Surveillance Tools That Threaten Human Rights,” Health and Human Rights, Dec. 2020.

[27] Ibid.

[28] Ibid.

[30] Ibid.

[31] Op. cit., Health and Human Rights.

[32] Ibid.

[34] “Prozac Backlash by Joseph Glenmullen, M.D.”; Dr. Darshak Sanghavi, “Health Care System Leaves Mentally Ill Children Behind,” The Boston Globe, 27 Apr. 2004.

[35] Keith Hoeller, “Thomas Szasz Versus the Mental Health Movement,” Mad in America, 17 Sept. 2022.

[38] “Drugs to Treat Mental Illness,” WebMD, 26 Oct. 2022.

[39] Op. cit., Health and Human Rights.

[40] Ibid.

[42] “In a Murky Sea of Mental Health Apps, Consumers Left Adrift,” California Healthline, 21 June 2021.

[46] “Psychiatrists Make the Highest Annual Mean Wage in Behavioral Health,” Behavioral Health Business, 20 Oct. 2022.