My assignment was to discuss the history of forensic psychiatry, but I face two serious constraints. First, the time and space available do not lend themselves to anything like a complete review; and, second, previous speakers, whose presentations are published elsewhere in this issue of the Journal, have covered a number of points already. Therefore, my approach will be to highlight some data points and landmarks in roughly chronological order and to try to give a sense of the overall sweep of forensic psychiatric history, focusing on some striking case examples.
One thesis for this history could be that the more things change, the more they remain the same. Consider this statement on the role of the expert witness: "If you have clearly ascertained that [the defendant] is in such a state of insanity that he is permanently out of his mind and so entirely incapable of reasoning, and no suspicion is left that he was simulating insanity when he killed his mother, you need not concern yourself with the question how he should be punished…he should be kept under close observation and, if you think it advisable, even kept in restraint…" [Ref. 1, p 316].
What are the essential themes here? First, the importance of careful assessment: “clearly ascertained.” Then, the critical forensic question of possible malingering: “simulating insanity.” Next, the importance of avoiding the “ultimate issue”: “how he should be punished.” Finally, the matter of clinical management: possible needs for restraint.
From what era does this thoroughly timely and modern analysis come? It dates from approximately 180 A.D. The author was Macer, writing near the time of Marcus Aurelius.1 Under Aurelius, a lunatic might be restrained by relatives, but if he escaped and did harm, the relatives might be executed. This arrangement established a curious form of "kinship malpractice," with a very harsh penalty indeed.
Let us visit some other ancient times. It may be impossible to identify the earliest expert witness, but it is on record that Antistius examined the corpse of Julius Caesar and opined that only the thoracic sword thrust was fatal; the other 22 stab wounds were not.
Legend relates that Ulysses, seeing a delegation coming to summon him to the war against Troy, sought to escape involvement by feigning madness: he hitched up his plow and sowed salt as if it were seed. The delegates, suspecting a trick, put his infant son in the path of the plow, and when he turned the plow aside, they diagnosed malingering. We now use somewhat less drastic methods.
Turning to other lands and other eras, consider the so‐called “truth pellet” used in Africa. Suspects in a serious crime were assembled and, with appropriate ceremony and pomp, were informed that the identity of the criminal would be decided by the magic pellet of truth, which would poison the guilty. In an emotionally charged atmosphere, small pellets were placed in each suspect’s mouth and, shortly thereafter, removed. The pellet was merely a scrap of leather, but, since the criminal’s mouth would have gone dry out of fear of discovery, the pellet not moistened by saliva indicated the guilty one.
In ancient China, one of a group of merchants on a journey apparently robbed the others. The local magistrate announced that the criminal would be discovered by a magic bell, hidden behind an aperture in a curtain: touched by the guilty, the bell would ring. After all had put their hands through the aperture, the bell had not rung, but the magistrate then demanded to look at their hands. The bell had been coated with soot, and the guilty party, fearing detection, had not touched it. In a remarkable reversal of symbolism, the criminal was the one with clean hands.
Current controversy about a person's competence to be executed contrasts sharply with the recommendation of the ancient Jewish Talmud that a criminal being executed first be made drunk, to spare him or her the horror of the penalty.
In ancient India, around 880 B.C., the laws gave special consideration to retarded persons and to children younger than 15. Islamic law provides that murder by a minor or a mentally ill person is involuntary homicide, subject only to compensation for the loss. In ancient Greece, the legal code of Draco (whose name gives us the adjective "draconian" for harsh codes) distinguished murder from involuntary homicide.
In medieval Europe, forensic theory held that individuals who were mentally ill had sold themselves to the devil. So persistent was this view that the last witch execution in Germany occurred relatively recently, in 1775. In late 16th-century Florence, popular views of the mentally ill were not much different from, and no less stigmatizing than, those held today. In public language, the subject of forensic attention was a "madman, crazy, raving, insane, demented, fatuous, short on brains, stupid in the brain, nuts, lacking brains, out of his mind and out of his feelings." When physicians came along, the examinee was described as "ruled by humors, delirious and affected by mania."2 Monomania became an immensely popular term in France from the 1820s to the 1850s. One author suggested that this was in part a matter of advancing professional status and self-interest; of course, we never have that problem today.3
In 1839, in France, the concept of monomania was so widespread that a journal noted: "One no longer says: It is his hobbyhorse or his fancy; one says, like a grave physician, 'It is a monomania'" (Ref. 3, p 386). This last detail demonstrates that, if psychiatry can contribute nothing else to the problem, it can always be counted on to supply the requisite jargon.
In England, between 1760 and 1845, 350 criminal defendants alleged a mental disturbance. The usual basis for the claim was little more than their own statements or statements by neighbors or relatives who had allegedly observed their conditions.4 Fewer than one‐fourth used medical witnesses.
The development of the "science" of phrenology was another important step, although its clearly fallacious basis meant that its value went unrecognized. We should recall, however, that it paved the way for later work on the localization of function in the brain, leading ultimately to modern "regional" neurology and neuropsychology.
The philosopher Immanuel Kant, during the 1790s, touched on important themes of accountability, freedom, and proper use of the faculties of knowledge.5 In a remote precursor of the Diagnostic and Statistical Manual, he distinguished four kinds of psychoses relevant to forensic psychiatry: amentia (chaotic thought); dementia (delusions of reference, not separating fantasy from reality); insania (disturbed judgment—flight of ideas); and vesania (disorder of reason like schizophrenia).
Emil Kraepelin6 advanced theories of naturalism and context‐dependence of mental events. Some important implications were:
Forensic psychiatry is a medical (hence quantitative) science.
Delinquency is naturalized as a social illness; punishment is society’s revenge on misbehavior.
Criminal behavior is regarded as mental illness.
In England at the turn of the 19th century, Thomas Percival, in his Medical Ethics,7 had this to say about expert testimony: When it becomes [medical practitioners’] painful office to deliver evidence, on such occasions, justice and humanity require that they should scrutinize the whole truth and nothing extenuate nor set down aught in malice (emphasis in the original) [Ref. 7, p 303].
It is striking how closely this early formulation comes to our present goals of “honesty and striving for objectivity.”8
Gold9 advanced an important idea regarding the co‐evolution of general and forensic psychiatry. The psychiatric witness is essentially a product of the 19th century. Before that time, the law did not see a need for psychiatric testimony, since judges set the standards. After 1825, the independent medical witness began to replace the treater as witness.
Famous Cases
I now touch on a handful of famous cases, some already addressed in other parts of this publication. I will mention a few noteworthy points.
Edward Oxford, who attempted to assassinate Queen Victoria, was first sent to Bedlam and then transferred to the new Broadmoor Institution.9 Three years later, he was offered a discharge if he would agree to leave the country. He went to Australia and was never heard of again. (A similar outcome occurred with Prosenjit Poddar, the killer of Tatiana Tarasoff.) Oxford was diagnosed as having a "lesion of the will."
James Hadfield was understood to have tried to kill King George III, a crime charged as high treason. It appears that he wished neither to commit suicide directly nor to harm the king, so he fired his shot beside the king. This may have been a very early example of what we now call "attempted suicide by cop." He apparently believed he would be killed, since an attack on the king was a capital offense. We might compare Gary Gilmore, who committed a murder in the only state that still used a firing squad for executions; being shot by a firing squad was allegedly a lifelong fantasy of his.
A central luminary in forensic psychiatric history was Isaac Ray. His 1838 Treatise on the Medical Jurisprudence of Insanity10 became an international classic (and would surely have won the Guttmacher Award). Remarkably, Ray was a general practitioner in Maine. He had no formal training in the law but had come to study insanity via medical jurisprudence. He reminds us of another great general practitioner, Morton Birnbaum, who more or less invented the right to treatment.11 No such right existed at the time, but Birnbaum thought there should be one; the paper he wrote on the subject became a central theme in the patients' rights movement.
In 1843 (the same year in which the famous M'Naghten case was tried), a man named Abner Rogers murdered the warden of the prison in which he was confined. He pled "not guilty by reason of insanity" based on an overdose of chloroform he had been given during a previous surgery. He was ultimately acquitted and sent to the Illinois Asylum—clearly a loss for the prosecution. The defeated prosecutor was so tall and thin that it seemed to some that he had the then-unfamiliar entity now known as Marfan syndrome. His name was more familiar: Abraham Lincoln.
The 1850s and 1860s were marked by bitter debates, then as now with political overtones, about definitions of mental disease. These debates often turned on whether there could be a lesion of the mind without a lesion of the brain. In other contexts, physicians debated whether alcoholism should be formally designated a disease and thus lose its moral status as a vice. At the time, psychiatry was in a virtual war with the neurologists (as it is with psychologists now) over the treatment of mental illness; the American Neurological Association was essentially in competition with the American Psychiatric Association for the treatment of the mentally ill.
Related to this controversy, in the England of the 1860s, the diagnoses of "nervous shock" and "railroad spine" emerged—ancestors, in effect, of both posttraumatic stress disorder and the question of psychosomatic injury.7 These entities were the focus of an upsurge in civil compensation claims rather than criminal cases. Indeed, the toll of death and destruction from railway accidents was so great that all major English railway companies had their own teams of doctors and surgeons. This arrangement led to a close linkage between physicians and forensic practice and to a confusion of treaters and experts. A core question was what to make of symptoms in the absence of demonstrable physical injury. The typical symptoms of these entities were skin sensitivity, sleep disturbance, mutism, stuttering, chorea, and paralyses. Initially, these symptoms were seen as the result of "twists and wrenches of the spine during collisions" or "concussions of the spine." The "secondary symptoms" occurring without obvious physical change became collectively referred to as "nervous shock."7
Predictably, malingering was also hotly debated, and concern was expressed about "imaginary symptoms," which often look to a modern reader like flashbacks.7 The 1870s and 1880s saw the emergence of the functional-versus-organic distinction.
An important case involving a presidential assassination arose in 1881 with the trial of Charles Guiteau, who shot and killed President James Garfield. Although Guiteau was obviously insane, he was found guilty and hanged. An autopsy showed "fairly good evidence for syphilis" of the brain.7 A vital issue at trial had been whether brain disease or mind disease was present.
The forensic field—and, of course, society in general—struggles with the problem of the psychopath. Early terms for this entity include the apt "moral insanity" and, in the French literature, manie sans délire, suggesting a disturbance without cognitive impairment. Excellent descriptions appear in Thomas Mann's Confessions of Felix Krull, Confidence Man12 and in the movies Kiss of Death and The Onion Field. A truly instructive discussion occurs in the musical West Side Story,13 in the song "Gee, Officer Krupke"—a veritable condensed history of psychopathic theory and terminology.
The discussion of M'Naghten would not be complete in this election year without Queen Victoria's comment: she did not believe that anyone who wanted to murder a Conservative politician could be insane.9
More recent developments have involved medicolegal societies, such as the American Academy of Psychiatry and the Law (AAPL), which attempt to bridge the gap between the radically different disciplines of psychiatry and law. Forensic psychiatry has benefited from a series of giants in the field: Karl Menninger, Manfred Guttmacher, William Alanson White, A. Louis McGarry, Seymour Pollock, Bernard Diamond, Jonas Rappeport, and, of course, Robert Sadoff. AAPL now has about 2,000 members, and a number of other organizations now inhabit the clinical-law interface. Despite this growth in the number of practitioners and in expertise, I suggest that the fundamental issues remain exactly the same as in the early days of the profession, as I have tried to show: the clinical foundation of the field; honesty and striving for objectivity; and the problems of malingering, psychopathy, and public criticism.
Acknowledgments
In preparing this piece, I acknowledge my indebtedness and offer my thanks to Liza Gold, Jacques Quen, Douglas Smith, and Ralph Slovenko, all of whom provided background information.