Nonviolence means avoiding not only external physical violence but also internal violence of spirit. You not only refuse to shoot a man, but you refuse to hate him.
— Martin Luther King, Jr.
From August 7th, 1961, through the end of May 1962, in the basement of a classroom building at Yale University, Stanley Milgram conducted more than 20 variations of his infamous obedience to authority experiments. He shocked the world with data on how readily people would punish others when cajoled or intimidated by an experimenter. This was a pivotal point in psychology because it was empirical evidence of man’s inhumanity to man — something no one, then or now, really wanted to hear.
The experiments began only months after the start of the trial of the German Nazi war criminal Adolf Eichmann, who claimed he was only acting on orders. Milgram wanted to know why people would obey an authority figure. In the experiment, Milgram told subjects to deliver electric shocks to a “learner” who gave a wrong answer to a question. What he found disturbed the psychological community, and then the rest of humanity.
In the most well known of these experiments, no shocks were actually delivered, but the subjects believed they had been. An unseen confederate of the experimenter would yell out as the increasingly strong “shocks” were given. At one point, after excessive screaming and begging for mercy, the confederate went silent, as if they had lost consciousness or died.
When subjects became distressed and asked to be relieved of the responsibility of the experiment they had been paid a few dollars to participate in, they were simply told they must continue. The result?
They did. Nearly two thirds, 62 to 65 percent, gave what would have been lethal shocks.
This experiment has been extensively written about and reproduced across cultures, with both male and female subjects. Nearly 3,000 subjects in at least 11 other countries have participated. The results are always about the same: Two thirds to three quarters of the subjects deliver all the shocks. Each new crop of psychology students is incredulous. It boggles them that someone could shock and perhaps kill another person for a few dollars in the interest of science.
When Milgram was a student at Harvard, his dissertation took him to France to study conformity, a precursor to his work at Yale. Now, more than 50 years after this original work in Paris, his ghost has returned — not in a classroom basement of the Sorbonne, but as a reality TV show: “The Game of Death.”
In a documentary by Christophe Nick, the host and audience persuade contestants to deliver what they believe to be nearly lethal electric shocks to fellow players. Those the contestants thought were receiving the shocks were actually faking it: paid actors pretending to be nearly electrocuted. As this CBS video shows, it is quite realistic.
Sound familiar? It should. The show was adapted directly from Milgram’s experiment to demonstrate the potentially abusive power exerted by the lure of television. It did just that. In “The Game of Death,” 81 percent of contestants — a higher percentage than Milgram found — “shocked” the confederate with jolts up to the maximum of 460 volts, enough to kill. The rest refused. Is this the direction of reality TV? The documentary suggests that ratings-hungry producers are limited only by what they can get contestants to do.
But there is something missing. The focus has been on the number of people who did the deed. We now know a lot about how obedient people are to authority, even in the face of common sense, but we have yet to learn about the hope embedded in people’s capacity to remain conscious of the impact of their decisions. In other words: What do we know about those who refused?
In her review of “The Man Who Shocked The World: The Life and Legacy of Stanley Milgram,” Jenny Diski puts the issue squarely in front of us:
Why did some people refuse when others didn’t? Yes, we are inclined to comply (easy life, fear of group disapproval, reprisals, wanting to be in with the top guys), but what is it about the 35 per cent of refusers that made them eventually able to refuse? It was really only half an experiment, and the less useful half.
So, who are the 35 percent? What do we know about them?
Not much, but we are learning. Lawrence Kohlberg, a contemporary of Milgram’s, interviewed some of the original Yale subjects. Kohlberg proposed that there are three levels of moral reasoning — pre-conventional, conventional, and post-conventional — each comprising two stages.
Post-conventional reasoning is primarily concerned with social justice, while conventional judgment centers on social conformity and law and order. Kohlberg found that a higher level of moral reasoning might have been a factor in subjects’ refusing to participate or continue. In the original study, about 75 percent of subjects at the post-conventional level (stages 5 and 6) disobeyed, versus 13 percent of subjects grouped as conventional (stages 3 and 4). Other researchers have found similar results when examining rates of obedience and disobedience to authority figures. To support his work on moral reasoning, Kohlberg used a quote from an icon of disobedience, Dr. Martin Luther King, Jr.:
One may well ask: ‘How can you advocate breaking some laws and obeying others?’ There are two types of laws: just and unjust. One has not only a legal but a moral responsibility to obey the just laws. One has a moral responsibility to disobey unjust laws. An unjust law is a human law not rooted in eternal and natural law. Any law that uplifts human personality is just; any law that degrades human personality is unjust.
A slide presentation from Milgram’s original series shows other variations, including a photo depicting a confederate being “shocked” in the same room as the subject, a condition that greatly lowered the compliance level. It was harder to comply when you saw someone’s pain.
A 1995 study by researchers Modigliani and Rochat used more ethically appropriate guidelines for putting subjects in potentially stressful conditions (the lack of such safeguards was the main criticism of Milgram’s experiments, and the reason he was denied tenure at Harvard). The study revealed that the earlier in the experiment a participant showed some resistance, the greater the likelihood he would end up defying the experimenter. More than that, 2009 research by Jerry Burger replicated Milgram’s studies (with appropriate ethical guidelines) and found that those who stopped felt they were the ones responsible for the shocks. Those who continued, not surprisingly, held the experimenter accountable.
Taking personal responsibility for your actions, whether through moral reasoning or proximity, seems a promising start to understanding the nature of those in the minority. Positive psychology has often derived profound understanding from outliers, from those whose natural gift is such qualities as resilience, emotional intelligence, or optimism. Milgram himself was an outlier and certainly didn’t follow the crowd. Were he alive today, there is a good chance he’d be studying disobedience. He might even be inspired by a quote from the very source that piqued his interest in the first place.
“A soldier’s obedience finds its limits where his knowledge, his conscience, and his responsibility forbid him to obey orders.”
– Generaloberst Ludwig Beck (1880-1944)
Executed for treason against Adolf Hitler’s Nazi regime
Last reviewed: By John M. Grohol, Psy.D. on 4 Jun 2010
Published on PsychCentral.com. All rights reserved.
Tomasulo, D. (2010). The Ghost of Stanley Milgram and The Game of Death. Psych Central. Retrieved on January 27, 2015, from http://psychcentral.com/blog/archives/2010/06/03/the-ghost-of-stanley-milgram-and-the-game-of-death/