Extracting information with drugs

An extract from Dark Persuasion: A History of Brainwashing from Pavlov to Social Media, by Joel E. Dimsdale.

Empirically testing putative truth drugs is problematic. Laboratory studies are “pretend models,” but the models are still quite interesting, albeit inconclusive. The experiments simulate interrogations but cannot ethically reproduce the conditions that a captured soldier, for instance, would face if examined under a truth drug concerning a topic that must not be disclosed.

In 1924, New Orleans reporters assiduously memorized wrong answers to a series of questions and then were interviewed under scopolamine. Their efforts to provide duplicitous information failed, and they gave truthful answers instead.1

Such studies, however, do not prove that the drug works in real-world circumstances. People certainly get tipsy on scopolamine, but can the drug compel them to talk? More problematically, once they start talking, will they tell the truth or just blather on? The old expression In vino veritas implies that people disclose things when they are intoxicated, but of course people also reveal nonsense when they’re drunk.2 There is also concern that the sedated patient might become more suggestible to the interrogator, revealing what he or she thinks the interrogator wants to know as opposed to the truth.

A group of Yale investigators tried a different design.3 Recruiting volunteers to participate in “an interesting psychological experiment,” they asked their subjects to complete a psychiatric evaluation. Some were judged to be “normal” while others revealed various emotional issues like perfectionism or sexual anxieties. The first researcher asked the volunteers to talk about one event they associated with humiliation or guilt. Then, he told them to invent a cover story to disguise the event and to resist telling the true story when questioned by a second investigator. The second interviewer, who was told only the rough theme of the event (for example, “money issues”), then gave the volunteer amytal. Would subjects disclose the real event or would they be able to hold onto their cover story?

One of the nine subjects interviewed, a graduate student, felt guilty about how he was spending the money his financially strapped parents had saved for his education. The true story was that he was spending money on political causes. His cover story was that he used his parents’ money to pay for his girlfriend’s abortion. Under amytal, he maintained the cover story and did not disclose how he really was using his parents’ money.

Three other subjects held onto their cover stories (that is, amytal couldn’t extract their true secret). These three individuals were those who had been deemed the healthiest, psychologically, of the bunch. The other subjects, who had various emotional issues, had difficulties maintaining their cover stories. Amytal pierced through the cover story to elicit the true story in two subjects and partially exposed bits of the truth in the other four. The researchers concluded that susceptibility to amytal depended upon the individual’s psychological health: “The essential powers forcing us to confess or to resist are within us.”

Subsequent investigators ingeniously extended the Yale study. Choosing healthy undergraduates as subjects, they tested the students repeatedly in response to intravenous barbiturates, alcohol, scopolamine, morphine, amphetamines, atropine, and mescaline (whew!). Before they were given any drugs, the subjects were told to write down something personal – like their mother’s name – and resist disclosing it during the interview. A research assistant then gave them a plausible-sounding military secret that they were also told not to reveal (“The troops are arriving Tuesday afternoon”). Finally, the investigators used the Yale “cover story” procedure again; the subject was asked to recall a humiliating event but to invent a cover story for it. Over the course of four to eight hours of drug-assisted interrogation, the investigators tried to extract the information from the subjects. None of them gave up the military secret or disclosed the personal history item, but two subjects partially betrayed their cover stories. In other words, within the context of a decidedly artificial experimental situation, truth drugs could not convincingly “crack” the students even though they “became semi-comatose, mildly delirious, panicky, markedly loquacious, euphoric, or underwent transient dissociative reactions.”4

Another test of amytal took place in an army hospital in New Jersey.5 The patients were known to be guilty of various military infractions but denied their guilt. They were interviewed by a psychiatrist who told them that they must submit to an amytal interrogation but that whatever they disclosed during the examination could not be used in court. So, the question was whether the psychiatrist could reliably extract the truth about an actual crime. Note that this study is much closer to a real-world military interrogation than the artificial experiments on college students. The psychiatrist took pains to build a relationship with the prisoners before starting the amytal. Furthermore, the psychiatrist didn’t rush in to query the prisoner about the crime but rather talked around the topic for a while to put the prisoner at ease, saying, “We’ll talk about that later.”

Even with these efforts, amytal could not be relied upon to compel the truth. Interesting features conspired against amytal’s effectiveness. As the authors observed, questioning was not successful unless a good rapport had been established. Even in those cases, however, high doses of amytal were necessary to elicit cooperation, but the result of that was a loss of clarity and a murky dialog because the patient mumbled or embarked on long, tangential fantasies. Even more important, as the authors pointed out, “Testimony concerning dates and specific places are untrustworthy and often contradictory because of the patient’s loss of time-sense. Names and events are of questionable veracity. Contradictory statements are often made without the patient actually trying to conceal the truth.” 

In a classic review of drugs used in interrogation, psychiatrist Louis Gottschalk pointed out some important lessons from the literature.6 He emphasized that the drugs’ effectiveness is not just due to their pharmacological properties. Many people respond to placebos – if told that a drug will have a certain effect, like ameliorating pain or compelling truth telling, 30 percent of people will experience that effect even if the drug is inert. Similarly, the way that the drug is administered is enormously important. Administering the drug in a nonthreatening manner and in low doses (at least initially) helps the subject relax his or her guard. Gottschalk also warned that it is a mistake to leap into talking about explosive topics; instead, interviews are more productive if they focus on relatively benign matters in the beginning, at least until the subject develops some degree of trust.

In another paper, Gottschalk was adamant that “there is no ‘truth serum’ which can force every informant to report all the information he has.” Instead, people can lie or distort the truth while under the influence of drugs. He thought that suggestible individuals, those who are awed by authority or plagued by guilt or depression, might be less successful in withholding information, but they still could unconsciously distort the information and/or confuse fantasies with facts. “It would be very difficult under these circumstances for an interrogator to distinguish when the verbal content was turning from fact to fantasy, when the informant was simulating deep narcosis but actually falsifying, which of contrary stories told under narcosis was true, and when a lack of crucial information coming from a subject under a drug meant the informant had none to offer.”7

These observations led Gottschalk to believe that training could help military personnel resist interrogation under truth drugs. His suggestions about how to protect troops assumed enormous importance during the peak of the Cold War – in the Korean War and the subsequent decade.

‘The informant should know that a drug of itself cannot force him to tell the truth, although it may make him talkative, overemotional, mentally confused, or sleepy … There is no need for the informant to become panicky at any bizarre or uncomfortable reactions he may experience, for these reactions … [are transitory].

The informant can confound the interrogator by … [simulating] drowsiness, confusion and disorientation early during the administration of the drug. He can revel in fantasies; the more lurid the better. He can tell contradictory stories. He can simulate psychosis … By these devices he can raise serious doubts in the interrogator’s mind as to the reliability of the information given by him.’8

But Gottschalk was mistaken about the very nature of the problem. The government’s next challenge was not how to protect our soldiers from revealing secrets; it was the opposite. How could we protect soldiers from an enemy unswervingly dedicated to implanting ideas into them – that is, converting them? That problem emerged five years later in Korea and China.

Dark Persuasion: A History of Brainwashing from Pavlov to Social Media is published by Yale University Press. This extract comes with their kind permission. For your chance to win a copy of the book, see @psychmag on Twitter. 


1.     “‘Truth Serum’ Test Proves Its Power,” New York Times, October 22, 1924.

2.     The Latin expression In vino veritas means “In wine, there is truth.” There are numerous ancient references along the same lines. In Germania, chapter 22, Tacitus observed that Germanic tribes believed that people are more truthful when they are drunk and lack the power to dissemble. “They disclose their hidden thoughts in the freedom of the festivity.”

3.    F. Redlich, L. Ravitz, and G. Dession, “Narcoanalysis and Truth,” American Journal of Psychiatry 107 (1951): 586–93.

4.    L. D. Clark and H. K. Beecher, “Psychopharmacological Studies on Suppression,” Journal of Nervous and Mental Disease 125 (1957): 316–21.

5.    M. J. Gerson and V. M. Victoroff, “Experimental Investigation into the Validity of Confessions Obtained under Sodium Amytal Narcosis,” Clinical Psychopathology 9 (1948): 359–75.

6.     Louis Gottschalk, “The Use of Drugs in Interrogation,” in The Manipulation of Human Behavior, ed. A. D. Biderman and H. Zimmer (New York: Wiley, 1961), 96–141.

7.    Louis Gottschalk, “The Use of Drugs in Information-Seeking Interviews,” in Drugs and Behavior, ed. L. M. Uhr and J. G. Miller (New York: Wiley, 1960).

8.     Louis Gottschalk, “The Use of Drugs in Interrogation,” in The Manipulation of Human Behavior, ed. A. D. Biderman and H. Zimmer (New York: John Wiley and Sons, 1961), 134.