SPECIAL ARTICLE

Commentary: Let's Think About Human Factors, Not Human Failings

Douglas Mossman, MD

Journal of the American Academy of Psychiatry and the Law Online March 2009, 37 (1) 25-27

Abstract

Doctors typically think about medical errors as potential causes of malpractice litigation, as failures by individuals, and as evidence of personal incompetence that may deserve sanctions. Other professions take a different view: designing safer systems, rather than criticism and punishment, is the way to prevent unintentional mishaps. In his article, Jeffrey Janofsky shows how psychiatrists can think about making care systems safer for patients. He also provides a splendid example of how forensic psychiatrists should conceptualize legal and medical problems encountered in clinical practice.

When I was a boy, auto fatalities occurred at a rate of 6 deaths per 100 million vehicle miles traveled.1 Corporate executives of the major U.S. automakers publicly insisted that the cause of all these deaths was bad drivers. As one vice president of General Motors told the New York Times in 1965, “If the drivers do everything they should, there wouldn't be any accidents, would there?” (Ref. 2, p 227). But that same year, in his classic, Unsafe at Any Speed,3 Ralph Nader articulated a different view: cars had “designed-in dangers” that could be eliminated, and requiring cars to have many already-developed safety features (e.g., seat belts) would save many lives.

In 1966, Congress authorized the federal government to set standards for motor vehicle safety. The first director of the National Highway Safety Bureau, William Haddon, Jr., was a physician who recognized that standard public health measures could prevent motor vehicle injuries.1 Over the next several years, vehicles were required to have all the safety features we now take for granted: seat belts, energy-absorbing steering wheels, shatter-resistant windshields, cushioned interior surfaces, and air bags. After four decades of car improvements, plus safer road design, better public awareness about auto safety, and sensible driving requirements (e.g., child safety seats), the auto fatality rate has fallen to 1.28 deaths per 100 million vehicle miles,4 less than one-fourth the rate in the 1960s.

A significant number of patients are harmed by medical errors,5 but physicians and policy-makers have only recently recognized that health care-induced deaths and injuries are a public health problem. Like other physicians, forensic psychiatrists often look at health care errors and adverse medical events through the lens of malpractice law, which explicitly focuses on failures by individuals. As the Mississippi Supreme Court put it, “Medical malpractice is legal fault by a physician or surgeon. It arises from the failure of a physician [a single individual] to provide the quality of care required by law” (Ref. 6, p 866).

Physicians may know intellectually that “to err is human,” but we don't feel that way about our professional actions. The professional socialization of physicians instills an ideal of error-free practice,7 from which it follows that good physicians should be virtually infallible. Legalistic and self-critical thinking has led physicians to believe that medical error occurs only because of negligence. Physicians often personalize this even further, concluding (consciously or unconsciously) that medical errors reflect underlying character flaws.8

When medical errors occur, the consequences are sometimes tragic, because clearly innocent victims (that is, patients) pay for those errors with their bodies and lives. The reaction, in both legal and medical settings, is to find those individuals who are to blame and punish them. Although this response is understandable, it is ultimately counterproductive. Modern medical practice is a complex affair, and we now know, from looking at how safety improvements have occurred in other high-risk enterprises, that:

…fear, reprisal, and punishment produce not safety, but rather defensiveness, secrecy, and enormous human anguish. Scientific studies…make it clear that, in complex systems, safety depends not on exhortation, but rather on the proper design of equipment, jobs, support systems, and organisations. If we truly want safer care we will have to design safer care systems [Ref. 9, p 136].

Jeffrey Janofsky's presidential address10 contains a vivid, clear description of efforts to design a safer care system. It also serves as a splendid example of the intellectual contributions that forensic psychiatrists can make concerning the legal and medical problems that we encounter in our practice.

Suicide is the most frequently identifiable impetus for psychiatric malpractice litigation11 and the second most frequent “sentinel event” reported to the Joint Commission on Accreditation of Health Care Organizations (JCAHO).12 Texts and articles that address prevention of malpractice lawsuits13,14 usually focus on methods of assessment and individualized interventions—that is, potential actions and decisions by individual caregivers that might avert suicide attempts. Recently, however, other perspectives on suicide have entered forensic psychiatry's intellectual arena. These perspectives recognize a clash between the still-prevalent, blame-the-individual ethos of courts and medical organizations, and the systems-oriented ethos of fields that study error scientifically.15,16

The program that Janofsky describes is designed to reduce handoff errors that arise when patient care data are imperfectly transmitted from one caregiver to the next. Harm to patients frequently results from faulty communication,17 and a few years ago, the JCAHO began requiring hospitals to develop standards for improving handoffs out of a recognition that poor handoffs are the single largest source of medical error.18

A focus on better communication as an anti-suicide strategy makes sense, both from what research on hospital errors tells us in general and from the discovery by Janofsky and his colleagues that in implementing observation practices, “most critical observation failure modes were caused by communication failures” (Ref. 10, p 21). Janofsky also notes that suicide observation practices are plagued by a fundamental communication problem: a striking lack of consistency in the terms used to define and describe the type of observation taking place. To address this, Janofsky and his colleagues have adopted an easy-to-understand, clearly defined set of labels for four potential levels (or intensities) of observation.

Notwithstanding my enthusiasm for Janofsky's contribution, I wish I were more confident that the enterprise he describes will reduce inpatient suicides. The following five comments summarize my reservations, which in many cases relate to concerns and problems that Janofsky explicitly acknowledges.

First, we know that a simple procedure used for years to reduce aviation accidents, a checklist, can also reduce medical mishaps and complications from anesthesia,19 central line placement,20 and surgery.21 But procedures in anesthesia and the mechanics of central line placement are united across all care sites by similarities in equipment and human anatomy. Are psychiatric units and the patients who occupy them similar enough to make generalizations about useful, error-reducing processes? How readily could the workflow diagram that Janofsky has produced be adapted to other psychiatric inpatient settings?

Second, some hospital adverse events (e.g., certain types of infections) are so frequent that one can measure the impact of an error-reducing intervention at a single institution in just a few months. At any given psychiatric inpatient unit, however, suicides are rare. To find out whether implementing better communication and clearer nomenclature for observation levels would really reduce inpatient suicides, one might need to conduct a study that involves monitoring outcomes at a large number of institutions. Is such a study feasible? To have a good chance of detecting a benefit (i.e., to have adequate statistical power), how large might the study have to be, and how long would it have to last?
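
A back-of-the-envelope power calculation suggests the scale such a study would require. The sketch below uses a standard two-proportion sample-size formula; the baseline rate of one inpatient suicide per 1,000 admissions and the assumed 30 percent reduction are purely illustrative placeholders (not figures drawn from Janofsky's article or from this commentary), and the function name is my own.

```python
# Rough sample-size sketch for detecting a drop in a rare adverse-event rate.
# All rates are illustrative placeholders; substitute empirically supported
# estimates before drawing any real conclusions.
from scipy.stats import norm

def admissions_per_arm(p_baseline, relative_reduction, alpha=0.05, power=0.80):
    """Approximate admissions needed per arm to detect a fall from p_baseline
    to p_baseline * (1 - relative_reduction) with a two-sided two-proportion
    z-test at the given alpha and power."""
    p1 = p_baseline
    p2 = p_baseline * (1 - relative_reduction)
    z_alpha = norm.ppf(1 - alpha / 2)   # 1.96 for alpha = 0.05
    z_beta = norm.ppf(power)            # 0.84 for 80% power
    variance_sum = p1 * (1 - p1) + p2 * (1 - p2)
    return (z_alpha + z_beta) ** 2 * variance_sum / (p1 - p2) ** 2

# Hypothetical inputs: 1 inpatient suicide per 1,000 admissions at baseline,
# and an intervention assumed to cut that rate by 30%.
n = admissions_per_arm(p_baseline=0.001, relative_reduction=0.30)
print(f"Roughly {n:,.0f} admissions per arm")  # on the order of 150,000
```

Even under these generous assumptions, the answer runs to well over 100,000 admissions per study arm, which is why a single-unit before-and-after comparison is unlikely to be informative and why any definitive test would have to pool outcomes across many institutions over many years.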

Third, Janofsky describes the limitations of current publications on inpatient suicides and the inability of most would-be investigators to obtain data that might illuminate why inpatient suicides occur. Is there any prospect for this to change? Might a government-initiated effort provide a framework within which data on suicide (along with many other adverse hospital events) might be available for examination by independent researchers?

Fourth, though many physicians might come to appreciate the insights of human factors analysis, few possess the expertise to apply human factors techniques to their own workplaces. Human factors researchers have taken an interest in the activities of some medical specialists.22 Might psychiatrists find ways to interest these researchers in the dilemmas of our specialty?

Finally, as Janofsky notes, suicide attempts are intentional behavior, and the inpatient who attempts to harm himself is trying to undermine or sabotage staff members’ efforts. Yet the human factors literature assumes that all personnel involved want the system to work and want to prevent adverse outcomes. This raises the question of whether the techniques used in human factors analysis offer the right approach to inpatient suicide. If “sabotage” is the right metaphor for inpatient suicide, would some other analytic or conceptual framework—one drawn from the criminology literature, perhaps—be better suited for preventing inpatient suicide?

Good scientific articles both provide useful ideas and inspire new questions. By this criterion, Janofsky's contribution is one that The Journal is rightly proud to publish.


References

  1. Achievements in public health, 1900–1999: motor-vehicle safety: a 20th century public health achievement. MMWR 48:369–74, 1999
  2. Newman GR: Car safety and car security: an historical comparison. Crime Prevent Stud 17:217–48, 2004
  3. Nader R: Unsafe at Any Speed: The Designed-In Dangers of the American Automobile. New York: Grossman Publishers, 1965
  4. Department of Transportation: U.S. Secretary of Transportation Mary E. Peters announces new data showing record low highway fatalities. Press release, December 11, 2008. Available at http://www.dot.gov/affairs/dot17508.htm. Accessed December 24, 2008
  5. Kopp BJ, Erstad BL, Allen ME, et al: Medication errors and adverse drug events in an intensive care unit: direct observation approach for detection. Crit Care Med 34:415–25, 2006
  6. Hall v. Hilbun, 466 So.2d 856 (Miss. 1985)
  7. Hilfiker D: Facing our mistakes. N Engl J Med 310:118–22, 1984
  8. Leape LL: Error in medicine. JAMA 272:1851–7, 1994
  9. Berwick DM, Leape LL: Reducing error in medicine. BMJ 319:136–7, 1999
  10. Janofsky JS: Reducing inpatient suicide risk: using human factors analysis to improve observation practices. J Am Acad Psychiatry Law 37:15–24, 2009
  11. Bender E: Psychiatrists can minimize malpractice-suit anxiety. Psychiatr News 38:11, 2003
  12. Joint Commission on Accreditation of Health Care Organizations: Sentinel event statistics—September 30, 2008. Available at http://www.jointcommission.org/SentinelEvents/Statistics. Accessed December 25, 2008
  13. Simon RI, Sadoff RL: Psychiatric Malpractice: Cases and Comments for Physicians. Washington, DC: American Psychiatric Publishers, 1992
  14. Appelbaum PS, Gutheil TG: Clinical Handbook of Psychiatry and the Law (ed 4). Philadelphia: Lippincott Williams & Wilkins, 2007
  15. Steiner JL: Managing risk: systems approach versus personal responsibility for hospital incidents. J Am Acad Psychiatry Law 34:96–8, 2006
  16. Gutheil TG: Commentary: systems, sensitivity, and “sorry.” J Am Acad Psychiatry Law 34:101–2, 2006
  17. Kitch BT, Cooper JB, Zapol WM, et al: Handoffs causing patient harm: a survey of medical and surgical house staff. Joint Comm J Qual Patient Saf 34:563–70, 2008
  18. Landro L: Hospitals combat errors at the ‘hand-off’: new procedures aim to reduce miscues as nurses and doctors transfer patients to next shift. Wall Street Journal. June 28, 2006, p D1
  19. Charlton JE: Checklists and patient safety. Anaesthesia 45:425–6, 1990
  20. Gawande A: The checklist. New Yorker. December 10, 2007. Available at http://www.gawande.com/articles.htm. Accessed December 25, 2008
  21. Haynes AB, Weiser TG, Berry WR, et al: A surgical safety checklist to reduce morbidity and mortality in a global population. N Engl J Med 360:491–9, 2009
  22. Weinger MB, Slagle J: Human factors research in anesthesia: patient safety techniques to elucidate factors affecting clinical task performance and decision making. J Am Med Inform Assoc 9(Suppl 1):S58–63, 2002