October marks the 50th anniversary of Yale University psychology researcher Stanley Milgram’s first published paper on his infamous shock machine experiments. (Ah, the 1960s in psychology research — when ethics were just something left to philosophers, not psychologists or doctors.)
You probably remember the experiment from your Introduction to Psychology class. Milgram designed a set of experiments in which the subject sat next to an electrical “shock machine” that was supposedly hooked up not to the subject, but to another person hidden from view. It had a set of switches that, when pressed, would deliver shocks of greater and greater voltage to the other person.
The subject was designated “the teacher” and the other person was “the learner.” A man in a lab coat — “the experimenter” — directed the teacher to administer a shock of increasing intensity each time the learner answered a question incorrectly.
What Milgram claimed to have found was that people are easily subjugated and will readily follow instructions to “do evil” to another human. But a more nuanced review of Milgram’s experiments shows something quite different.
In case you don’t remember — the shock machine wasn’t actually hooked up to anything. And the person supposedly being shocked was a confederate actor who just faked being in pain as the shocks went up in intensity.
Christopher Shea writing for The Boston Globe has the story:
Milgram’s experiments made their first appearance in print in October 1963, in the Journal of Abnormal and Social Psychology. That article focused on an experiment in which the person supposedly being shocked took the shocks silently at first, then pounded on the door if the voltage reached 300 volts, and again at 315 — and then went silent.
Sixty-five percent of subjects nevertheless kept turning the electricity up to the highest voltage.
But that finding is only from the published research Milgram presented. He ran dozens of other experiments that were variations on this theme, and most of those experiments’ results never made it into a journal.
In research, it’s called the “file drawer effect,” which is a type of publication bias that happens when a researcher files away research that doesn’t support their hypothesis or demonstrates negative outcomes. And apparently Milgram did a bit of this:
In more than half the experiments, at least 60 percent of the subjects disobeyed the experimenter before reaching the maximum — a statistic that might change your impression about how bovinely compliant the subjects were.
There is also a question about whether the subjects thought they were actually hurting anyone: Milgram reported that three-quarters of them believed in the setup, but that includes the 24 percent who said they had “some doubts.”
Furthermore, Milgram’s experimental practices in the lab often varied — sometimes significantly — from what he said he did in the published research. “…[S]ometimes the experimenter would comply with the subjects’ demand that he go behind the screen to check on the suddenly silent “learner”; when that happened, the experimenter would come back to report that he was fine. That important detail was omitted in Milgram’s write-ups.”
And the experimenters in the studies often went way beyond simply providing verbal “prods” to the subjects to administer the shock. Sometimes subjects were downright badgered and shamed into following the rules:
But listening to archived tapes, Perry heard the experimenter downright “badgering people,” repeating the prods and introducing new ones. “You hear a moving of the goal posts,” she says. In one set of experiments involving female subjects, she says, the experimenter insisted 26 times that one woman continue, turned the shock machine back on after another subject turned it off in protest, and got into an argument with a third.
Milgram also did a horrible job of debriefing his subjects, failing to tell the vast majority of them that the shocks were completely faked (instead, he just told them the shocks “weren’t as bad as described”). Partly because of Milgram’s dubious ethical behavior, universities across the country created new guidelines that make modern replications of his experiment much more difficult to conduct (although one has been done).
And one of the remaining problems with Milgram’s work — conducting a lab experiment on a small group of people and then generalizing the results to all people’s behavior outside a laboratory setting — is that psychologists still engage in this same problematic practice today. Researchers still conduct artificial lab experiments on a specific group of people — college students — and then generalize those findings to all people, in all situations.
For further information…
Gina Perry has a Kindle book that goes into more detail about Milgram’s experiments: Behind the Shock Machine: The Untold Story of the Notorious Milgram Psychology Experiments
Read the Boston Globe article: Stanley Milgram and the uncertainty of evil
- Burger (2009) replicated one component of one Milgram experiment, demonstrating that even in modern times, people in a lab setting would press a shock button past an arbitrarily set threshold. However, I would argue that Burger set the voltage threshold — “150 volts” — low enough that he could reasonably suspect most people would go over it. After all, few people die from getting a quick shock from their household outlet, which is 110-120 volts.
And surprisingly, Burger didn’t ask the subjects if they were aware of the Milgram shock machine experiment, excluding only those who volunteered the information on their own or had taken 2 or more college-level psychology classes. That still could mean many of the subjects were aware of Milgram’s original experiment and simply never mentioned it… meaning they could’ve also known that the Burger shock machine wasn’t real. [↩]
- In Milgram’s experiments, compliance rates were higher when the experiments were conducted at his Yale University lab than when similar experiments were conducted in a run-down office building in the city — suggesting that the prestige of the institution matters, too. [↩]
Last reviewed: By John M. Grohol, Psy.D. on 30 Sep 2013
Published on PsychCentral.com. All rights reserved.
Grohol, J. (2013). Psychology Secrets: People Aren’t as Evil as the Milgram Obedience Experiment Suggested. Psych Central. Retrieved on November 21, 2014, from http://psychcentral.com/blog/archives/2013/10/01/psychology-secrets-people-arent-as-evil-as-the-milgram-obedience-experiment-suggested/