Abstract
In recent years, the availability of software that is targeted toward the general public and designed to assist in the diagnosis and treatment of mental illness or to promote general mental health has expanded greatly. Regulation of more traditional health care providers and health care-associated devices is well established by statute, regulatory guidelines, and common law precedents. Applications (apps), in contrast, pose a novel regulatory challenge. This review examines the current regulatory guidelines for mobile mental health apps, as well as the current state of case law in this realm.
Society has a long-standing tradition of regulating those who claim to diagnose and treat mental health conditions, as well as the products, such as drugs and devices, used in diagnosis and treatment. Mobile health software that is targeted toward lay people, however, is a recent development that currently faces much less regulatory oversight. Modern computer hardware and software have been available for a relatively short time, and their capabilities will only continue to expand. Society is now faced with questions about whether, and how, to regulate the novel services that these software programs provide. Any regulation of emerging software must promote safety while allowing for innovation. In this review, we consider what currently constitutes the regulated diagnosis or treatment of mental illness, as applied to popular mobile health software.
In the United States, medical boards, psychology boards, nursing boards, and other similar regulatory bodies are assigned the task of licensing and overseeing clinicians. The goal of these boards is to protect the public by ensuring that the care licensees provide meets minimal standards of quality. To diagnose or treat medical conditions without the approval of such a board is generally considered the practice of medicine without a license. Doing so can result in criminal and civil penalties. In addition to maintaining appropriate licensure, clinicians must be aware of a body of federal law that has been written to regulate the medical profession, such as the Health Insurance Portability and Accountability Act (HIPAA)1 and the laws and regulations associated with the Drug Enforcement Administration (DEA). Clinicians also must be aware of state laws that regulate many facets of health care provision. Finally, clinicians should have some awareness of the judicial rulings that have established common law precedents for the practice of medicine in their area. In short, the practice of psychiatry and other mental health disciplines is highly regulated at both the federal and state levels.
In recent years, expansions in technology have introduced computer software that is intended to assist in the diagnosis or treatment of mental illness or to provide coaching or other services to individuals who are in mental distress. There is reason to believe that some patients will benefit from working with computer programs to augment the effectiveness of more traditional treatment programs. There is also reason to believe that, in some instances, patients could derive benefit from working with computer programs without the involvement of a licensed practitioner.
The development of software that can diagnose and treat mental illness carries significant and obvious appeal for patients and society. A person who is trained to conduct psychotherapy, for instance, can provide attention and individualized treatment to only a relatively small number of patients at any given time. Those patients must make themselves available at a particular time and, usually, at a particular location. Between sessions, the clinician is available in only a limited fashion, for instance by phone or pager. The clinician must also charge a fee that is sufficient to provide an appropriate income for his or her level of education and training.
A computer program, in contrast, can overcome many of the barriers to traditional treatment. Once the software has been written, a computer application (app) can be installed and run on an unlimited number of devices simultaneously. A computer program can be run at a time and place of the user's choosing. Economies of scale also make it possible for a computer program to provide treatment at a low per-user cost. The potential for developing scalable technologies that can autonomously provide benefit to an unlimited number of patients at low cost is alluring.
Although technology offers new benefits, these must be balanced against new risks. The most obvious risk is that the treatment provided will be ineffective, or even harmful. This could occur when users self-select treatment that is inappropriate for their condition, for instance when an individual with undiagnosed hypothyroidism or schizophrenia attempts a course of treatment that is targeted toward major depressive disorder. It could also occur when the treatment that is offered simply does not work, even when applied under ideal conditions.2 Given the sheer amount of available software that targets psychiatric conditions and the fact that no credentialing is needed to develop such software, this possibility is of particular concern. As detailed elsewhere, treatment by computer programs could also introduce privacy concerns,3–7 malfunctions, and exposure to hijacking by hackers who write malicious code,8 and could generate other novel problems to which traditional modes of treatment are not subject.
The Evidence that Software May Provide Treatment
Mobile and connected technologies are already playing an increasing role in mental health care. In late 2016, there were more than 250,000 health apps9 and more than 10,000 mental-health-specific ones.10 Although clinical evidence for most of these apps is lacking,11 patients are interested in and using them today.12 The increasing prevalence of smartphone ownership, accessibility and low cost of apps, convenience of app use, and potential to offer evidence-based care has propelled interest in mental health apps from consumers, technology industries, clinicians, policy makers, and researchers.13
Although many available apps have not been tested for efficacy and safety, research on mobile health for mental health is now increasing exponentially14 and highlights both the feasibility and the potential of these technologies. Research on feasibility has demonstrated high rates of usability, acceptance, and patient support across psychiatric disorders, ranging from eating disorders15 to bipolar16 and psychotic disorders,17 and in populations ranging from children18 to seniors.19 Recent research also supports the preliminary efficacy of apps to offer adjunctive therapeutic support in several conditions, including depression,20 schizophrenia,21 and substance abuse,22 among others. This preliminary evidence of efficacy is limited, however, by factors such as small sample sizes and short study durations, which make the results difficult to generalize.
As an example, one study supporting the use of a smartphone app to aid recovery from alcoholism included 349 patients who met Diagnostic and Statistical Manual of Mental Disorders, Fourth Edition (DSM-IV)23 criteria for alcohol dependence and randomized them to treatment as usual or use of a custom app. Over 12 months, the group with the app showed a reduction in risky drinking days and an increased duration of abstinence, but the study was not blinded, app users received more support from coaches, outcomes were based on self-report rather than objective measurement, and it was not possible to determine which of the app's many features were driving efficacy.22 Later, when this same app was deployed in the real-world settings of 14 substance abuse programs without any research support, only 3 of the programs were still using the app after two years because of the difficulty of supporting and sustaining app use in a busy clinical environment.24
In addition, data on harm and unintended consequences of these apps are sparse. One study that sought to reduce rates of alcohol abuse via a smartphone app in a college population found that the app actually increased rates of drinking in young men who were using the app to record and compare how much they could drink.25 However, as research on mental health smartphone apps continues to expand rapidly,14 the actual utility and efficacy of apps will become clearer.
New uses for apps will also continue to expand their clinical potential, as well as their risks and unknowns. For example, there is now increasing investigation into tracking behavior through phones' sensors (e.g., use of global positioning system (GPS) technology to monitor activity levels and call logs to observe social patterns) to monitor and predict relapse in mental health conditions.24 New forms of digital-based therapy delivered via apps are also the subject of ongoing clinical studies and have already produced a growing evidence base.25 The current broad interest in mental health apps, combined with the rapidly rising research base, suggests that these technologies will continue to play a larger role in the future of the field.
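To make the passive-sensing idea above concrete, the following is a minimal sketch, in Python, of how one day's GPS samples might be reduced to a single mobility feature (total distance traveled). The data structure, function names, and choice of feature are illustrative assumptions for this review, not a description of any particular app or study cited here.

# Minimal sketch: summarizing one day's GPS samples into a single
# mobility feature (total distance traveled). Illustrative only;
# real digital-phenotyping pipelines use richer features and models.
from dataclasses import dataclass
from math import radians, sin, cos, asin, sqrt
from typing import List

@dataclass
class GpsSample:
    timestamp: float  # seconds since epoch
    lat: float        # latitude, degrees
    lon: float        # longitude, degrees

def haversine_km(a: GpsSample, b: GpsSample) -> float:
    """Great-circle distance between two samples, in kilometers."""
    dlat = radians(b.lat - a.lat)
    dlon = radians(b.lon - a.lon)
    h = sin(dlat / 2) ** 2 + cos(radians(a.lat)) * cos(radians(b.lat)) * sin(dlon / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(h))  # Earth radius ~6371 km

def daily_distance_km(samples: List[GpsSample]) -> float:
    """Sum successive leg distances over one day's samples, sorted by time."""
    ordered = sorted(samples, key=lambda s: s.timestamp)
    return sum(haversine_km(p, q) for p, q in zip(ordered, ordered[1:]))

# A sustained drop in this feature relative to a person's own baseline
# is one hypothetical signal a monitoring app might flag for follow-up.

In the research described above, simple features like this one, tracked against a patient's own baseline, could serve as inputs to the relapse-prediction models under investigation.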
Literature Search
A literature review was conducted with the goal of finding case law related to mobile mental health applications, as well as background information regarding regulatory approaches to such apps.
To search for lawsuits related to mobile health apps, we conducted a search of LexisNexis in consultation with a LexisNexis research advisor. The search terms included “mobile health” or “mhealth” (132 results), “smartphone app” and “health” (6 results), “computer program” /p “negligen*” (69 results), “iPhone” /p “medical” (48 results), and “wrongful death” and “computer program” and “health” (30 results). We also conducted a Google search using the terms “mobile health lawsuits” and “smartphone health app lawsuits.”
Mental health applications are novel, but the concepts of self-help and coaching have long-standing histories. We therefore searched case law related to these topics, because it may have relevance to applications that claim to offer similar services. In a search conducted for lawsuits related to life coaching, we used the terms “life coaching” (31 results) or “life coach” (100 results) in LexisNexis. Lawsuits related to self-help were searched on LexisNexis using “self help” /s book or seminar or retreat (368 results) and narrowed the results to only those that contained the term “self help” at least twice (227 results). Other search terms included “David Burns” and “Feeling Good” (1 result), as well as “Erhard seminars lawsuit” on Google, “Tony Robbins Lawsuit” on Google, “Erhard Seminars” on LexisNexis (10 results), and “Tony Robbins” on LexisNexis (11 results).
Results that appeared to be relevant based on the title, case overview, or search terms in context were reviewed for applicability to the regulation of mobile health software.
The Current Federal Regulatory Approach
The primary federal regulators of software that assists in diagnosis and treatment are the U.S. Food and Drug Administration (FDA) and the U.S. Federal Trade Commission (FTC). The FDA has the authority to regulate medical software on the front end, before it is released to the public. The FTC can take action against software companies that make claims about their products that are unsupported by evidence. Examining the approaches these two regulatory bodies have taken provides insight into what type of software is currently thought to merit regulatory oversight.
Although the FDA had authority to take action against misbranded or adulterated medical devices beginning in 1938, it was not until the passage of the Medical Device Amendments of 1976 to the Federal Food, Drug, and Cosmetic Act (FD&C Act)26 that the administration could review these products for safety and effectiveness prior to their entry into the marketplace.27 Under the law (section 201(h) of the FD&C Act, 21 U.S.C. § 321(h)), a medical device means:
… an instrument, apparatus, implement, machine, contrivance, implant, in vitro reagent, or other similar or related article … intended for use in the diagnosis of disease or other conditions, or in the cure, mitigation, treatment, or prevention of disease … or intended to affect the structure or any function of the body of man … and which does not achieve its primary intended purposes through chemical action [Ref. 28].
As early as 1989, the FDA developed a policy governing computer software that meets the regulatory definition of a medical device under the law. On February 9, 2015, the FDA released its most recent guidance (which is not legally binding but nevertheless is of great significance to industry), describing the way it intends to regulate mobile health software, referred to as “mobile apps” throughout the document.29 The administration notes that many mobile apps with some connection to health do not meet the statutory definition (such as those meant for patient education), but that it intends “to apply its regulatory oversight to only those mobile apps that are medical devices and whose functionality could pose a risk to a patient's safety if the mobile app were to not function as intended” (Ref. 29, p 13).
The medical device definition itself was amended at the end of 2016 by the 21st Century Cures Act (CURES)30 to exclude certain types of software functions from the device definition entirely. One such exclusion is for software functions “for maintaining or encouraging a healthy lifestyle … unrelated to the diagnosis, cure, mitigation, prevention, or treatment of a disease or condition,” functions that likely would not meet the existing definition of a medical device in any case.31 The new law also provides a pathway for determining the regulatory status of products with multiple functions, at least one of which is excluded from regulation under the law while another is included. The FDA now notes, in its mobile app guidance document, that it is “assessing how to revise this guidance to represent our current thinking” in light of the new law (Ref. 29, p 1). CURES specifically provides that it does not limit the FDA's authority either to exercise enforcement discretion with respect to any device subject to regulation or to regulate software as a device if it meets the criteria of the most stringently controlled subset of devices: those that require premarket approval.32
A review of the categories of mobile health apps that the FDA intends to regulate helps clarify where the line has been drawn with regard to regulatory oversight. Apps that connect to and control existing medical devices, such as those that control insulin delivery via an insulin pump, are subject to oversight. Apps that use attachments to transform a mobile platform into a regulated device would qualify as well; one example would be attaching a glucose strip reader to a mobile platform so that it functions as a glucose meter. Also, apps that perform patient-specific analysis and offer patient-specific diagnosis or treatment would be regulated. Examples given for this category include radiation therapy planning software and computer-aided detection image-processing software.
In contrast, the FDA discusses several categories of apps that would not be subject to regulation (Ref. 29, p 15). Apps that offer coaching or prompting for patients to manage their health in their daily environment would be exempt. For example, an app that offered prompting to a patient with diabetes about diet and exercise would not be subject to review. Apps that offer tools to track and organize health information without providing recommendations to alter or change previously prescribed treatment would be exempt. In addition, apps that allow patients to access their health information easily, help patients communicate with their providers about potential medical conditions, perform simple calculations used in routine clinical practice, and enable individuals to interact with their electronic health records are considered exempt. Under the CURES legislation, some of these types of apps are now categorically excluded from regulation as devices.33
In its guidelines, the FDA offers several specific examples of apps relevant to psychiatric conditions to which it would extend enforcement discretion based on their low level of risk to the public, even though they might fit the definition of medical devices (Ref. 29, Appendix B). Apps that help psychiatric patients maintain behavioral coping skills by providing a “skill of the day” or that offer patients behavioral techniques or audio messages during episodes of acute anxiety would be exempted. Apps that provide educational information, reminders, and motivational guidance to individuals recovering from addiction would be exempt, as would apps that alert a patient with an addiction when he or she is near a preidentified high-risk location. Apps that prompt the collection of behavioral or symptomatic data that have been predefined by a health care provider and then store that information for later review would be exempt, as would those that suggest diagnoses and advice about when and where to seek a health care provider based on checklists of common symptoms. In addition, apps that track and promote medication adherence would not be regulated. The FDA pointedly notes, however, that exempt apps “supplement” professional clinical care, rather than replace or discourage professional treatment (Ref. 29, p 16 and FN p 26). It also refers specifically to “diagnosed psychiatric conditions [emphasis added]” (Ref. 29, Appendix B, p 23).
As these examples suggest, a relatively wide range of mobile software with useful applications in psychiatry is exempt from regulatory review by the FDA. The examples also indicate that the principal concern is the risk that the software could pose to the patient, regardless of whether the app meets the technical definition of a medical device. Apps that are geared toward general wellness without reference to specific diagnosis or treatment are now clearly exempt from FDA oversight.31 In contrast, however, an app that recommends changes to medication dosages or discontinuation of treatment based on active or passive symptom monitoring would be subject to FDA review.
The FTC also plays a role in regulation at the national level by acting against deceptive advertising by medical app makers. The largest such action resulted in a $2 million settlement with Lumosity, which offered a “brain training” app advertised as protecting against dementia and reducing cognitive impairments from a range of disorders, including traumatic brain injury, post-traumatic stress disorder (PTSD), and attention deficit hyperactivity disorder (ADHD).34 The FTC has also taken action against an app that purported to calculate a user's risk of melanoma by using a smartphone camera,35 and against another that purported to treat acne by shining light from a smartphone onto the user's face.36 More recently, the FTC took action against a smartphone-enabled breathalyzer because of its unfounded accuracy claims.37 These actions protect consumers against financial loss and, in some cases, physical harm from products that make unsubstantiated claims.
Judicial Oversight
No lawsuits were found that related specifically to mobile health software that purports to diagnose or treat psychiatric conditions. It appears that even when software offers a service that could be construed as “treatment,” it has not thus far been subject to the same type of judicial scrutiny as licensed health care providers. Overall, our search shed little light on how courts will approach psychiatric mobile health apps. The litigation so far has focused on device accuracy, on whether there is a duty to prevent driver distraction, and on whether the government may use, in a criminal prosecution, data that were gathered (perhaps inadvertently) by devices in a home, or whether collecting information in this way is barred as an unreasonable search or seizure.
Fitbit, a company that manufactures mobile heart rate monitoring equipment, is facing a class action lawsuit related to the alleged inaccuracy of its devices.38 The plaintiffs' legal team funded a study, conducted by researchers at California State Polytechnic University (Pomona, CA), that reportedly demonstrated an average discrepancy of 20 beats per minute between Fitbit equipment and an electrocardiogram reading when evaluating 43 test subjects during exercise. The defense team has argued that the study is biased and that it is “the product of flawed methodology.”39 This case appears to be one of the first major lawsuits related to popular wearable mobile health devices.
Several other legal proceedings related to nonmedical mobile devices have gained attention in the national media. Their outcomes may set interesting precedents with regard to mobile applications more generally. In one such lawsuit, a driver who was using Apple's FaceTime app while driving struck the rear of a vehicle while traveling 65 miles per hour, killing a five-year-old girl. The driver faces manslaughter charges. The parents of the victim sued Apple, alleging that Apple should have designed the software to be disabled when it detected that a user was driving. They also alleged that Apple “failed to warn its users that its product was likely to be dangerous when used or misused in a reasonably foreseeable manner.”40
In another case, police in Bentonville, Arkansas, obtained a warrant requiring Amazon to turn over any recorded audio data from an Amazon Echo device for use in a murder investigation. The Echo “wakes” to a voice command and thus is always listening to ambient sound. Because it sometimes records audio clips in error, the police hoped that it had recorded evidence related to an alleged murder that took place in a private home. Amazon refused to produce the requested data, which may or may not have existed. Investigators were able to use a “smart home” water meter to show that 140 gallons of water were used between 1 and 3 a.m. on the night in question, which they claim supports the hypothesis that the alleged murderer washed away evidence at around that time.41
Software and Coaches
Several companies that offer online software-based services to address topics such as social anxiety have incorporated human coaches into their offerings. It remains an open question whether doing so might militate in favor of classifying the service as treatment and subjecting it to closer regulatory scrutiny. A first-pass analysis, however, suggests that incorporating coaches may not play a major role in deciding whether a service offers treatment of a mental illness. Of note, some services connect users with a licensed health care provider who makes assessments and provides treatment over a mobile platform. Such services are not addressed here because they clearly establish a clinician–patient relationship and would thus be expected to fall under the more traditional regulations governing the practice of medicine and telemedicine.
Looking outside of the world of technology, coaches have offered their services without formal licensure for some time. Coaching is not well defined, but it is typically seen as a systematic approach that draws from multiple disciplines to promote “ongoing self-directed learning and personal growth of the client.”42 Life coaches make up one large category of individuals who work one on one with clients to help them reach their personal goals.
Because life coaching does not follow a medical model or claim to treat mental illness, it is unclear how often individuals with mental illness present for life coaching. A search on LexisNexis for lawsuits related to a life coach's failing to identify psychiatric illness in a client, failing to refer, or experiencing a client suicide yielded no relevant results; the few published cases were of marginal relevance. In one case, a life coach was sued for “defamation, breach of contract of confidentiality, negligence, and intentional infliction of emotional distress” after filing a report of suspected child abuse.43 The lawsuit was dismissed because of the absence of any evidence that the report was submitted in bad faith. Another lawsuit against a life coach involved allegations of unwanted touching and race-based harassment, among other complaints.44 That suit was allowed to proceed to trial; whether its outcome was a judgment after trial or a settlement is not known.
Expanding from coaching to self-help more generally, we undertook a nonexhaustive online search for lawsuits related to several popular self-help seminars, including those of Tony Robbins and Erhard Seminars. We also searched for any cases related to the widely used Feeling Good self-help book written by psychiatrist David Burns.45 The goal of this search was to determine whether clients' use (or misuse) of self-help has generated lawsuits related to adverse mental health events. The search revealed no examples of successful lawsuits. Erhard Seminars generated lawsuits alleging that the program had induced psychosis46 or caused the wrongful death of a participant.47 In the first case, the plaintiff's lack of physical injuries and the fact that the suit was brought against a successor corporation led to a ruling against the plaintiff. In the second, a jury found Erhard Seminars to have been negligent but failed to find the necessary causation between that negligence and the participant's death.47
Conclusions
When evaluating which mobile health software offers regulated treatment or diagnosis, several themes emerge. First, what an app claims to do matters. Many apps that target symptoms without claiming to diagnose, treat, or mitigate disease are exempt from FDA oversight. Thus, for example, an app that teaches users skills to deal with acute episodes of anxiety, general techniques to elevate mood, or tips to promote effective sleep seems unlikely to face FDA oversight if it avoids making any connection between the states of mind involved and a disease or condition. If, in contrast, an app claims to offer a course of treatment for bipolar mania, then it is likely to be held to a higher standard by the FDA. The FTC concerns itself primarily with the claims app developers make, because its mandate is to prevent consumer fraud generally. It may require substantiation of claims even if no disease or condition is involved.
The FTC action against Lumosity and the class action lawsuit against Fitbit are real-world examples of this inaccurate-claims category, irrespective of any medical disease or condition. Lumosity does not seem to pose serious risks to the health of its user base; rather, the FTC action concerns consumer fraud, namely Lumosity's use of claims not sufficiently supported by evidence. Similarly, the Fitbit suit was not prompted by serious adverse health outcomes; its central concern is the allegation that the company misled its customers.
A second theme is that the potential for harm to a user matters. The FDA has indicated that apps posing a high potential for harm are subject to review. FTC actions against the melanoma-detection app and the inaccurate smartphone-enabled breathalyzer also indicate concern for the potential for physical harm. Our literature review did not identify any lawsuits or regulatory actions related to bad outcomes stemming from the use of mental health apps. As the Apple FaceTime suit shows, however, in the case of a serious bad outcome (such as a death), plaintiffs' lawyers may find novel claims that enable a lawsuit, and whatever the resolution of such a suit, defending against it would be costly. As mobile health apps continue to grow in number and importance, it will remain important to follow developments in case law and regulatory guidelines. The growing use of mobile apps may also become relevant to the standard of care and thus may involve the testimony of forensic psychiatrists when a negative outcome occurs.
Footnotes
Disclosures of financial or other potential conflicts of interest: None.
© 2018 American Academy of Psychiatry and the Law