Shock and Awe
Replicating Milgram’s shock experiments leads to a different interpretation
In 2010 I worked on a Dateline NBC television special replicating classic psychology experiments, one of which was Stanley Milgram’s famous shock experiments from the 1960s. We followed Milgram’s protocols precisely: subjects read a list of paired words to a “learner” (an actor named Tyler), then presented the first word of each pair again. Each time Tyler gave an incorrect answer, our subjects were instructed by an authority figure (an actor named Jeremy) to deliver an electric shock from a box with toggle switches that ranged in 15-volt increments up to 450 volts (no shocks were actually delivered). In Milgram’s original experiments, 65 percent of subjects went all the way to the end. We had only two days to film this segment of the show (you can see all our experiments on NBCNews.com), so there was time for just six subjects, who thought they were auditioning for a new reality show called What a Pain!
Contrary to Milgram’s conclusion that people blindly obey authorities to the point of committing evil deeds because we are so susceptible to environmental conditions, I saw in our subjects a great behavioral reluctance and moral disquietude every step of the way. Our first subject, Emily, quit the moment she was told the protocol. “This isn’t really my thing,” she said with a nervous laugh. When our second subject, Julie, got to 75 volts and heard Tyler groan, she protested: “I don’t think I want to keep doing this.” Jeremy insisted: “You really have no other choice. I need you to continue until the end of the test.” Despite our actor’s stone-cold authoritative commands, Julie held her moral ground: “No. I’m sorry. I can just see where this is going, and I just—I don’t—I think I’m good. I think I’m good to go.” When the show’s host Chris Hansen asked what was going through her mind, Julie offered this moral insight on the resistance to authority: “I didn’t want to hurt Tyler. And then I just wanted to get out. And I’m mad that I let it even go five [wrong answers]. I’m sorry, Tyler.”
Our third subject, Lateefah, became visibly upset at 120 volts and squirmed uncomfortably to 180 volts. When Tyler screamed, “Ah! Ah! Get me out of here! I refuse to go on! Let me out!” Lateefah made this moral plea to Jeremy: “I know I’m not the one feeling the pain, but I hear him screaming and asking to get out, and it’s almost like my instinct and gut is like, ‘Stop,’ because you’re hurting somebody and you don’t even know why you’re hurting them outside of the fact that it’s for a TV show.” Jeremy icily commanded her to “please continue.” As she moved into the 300-volt range, Lateefah was noticeably shaken, so Hansen stepped in to stop the experiment, asking, “What was it about Jeremy that convinced you that you should keep going here?” Lateefah gave us this glance into the psychology of obedience: “I didn’t know what was going to happen to me if I stopped. He just—he had no emotion. I was afraid of him.”
Our fourth subject, a man named Aranit, unflinchingly cruised through the first set of toggle switches, pausing at 180 volts to apologize to Tyler—“I’m going to hurt you, and I’m really sorry”—then later cajoling him, “Come on. You can do this… We are almost through.” After completing the experiment, Hansen asked him: “Did it bother you to shock him?” Aranit admitted, “Oh, yeah, it did. Actually it did. And especially when he wasn’t answering anymore.” When asked what was going through his mind, Aranit turned to our authority, explicating the psychological principle of diffusion of responsibility: “I had Jeremy here telling me to keep going. I was like, ‘Well, should be everything’s all right…’ So let’s say that I left all the responsibilities up to him and not to me.”
Human moral nature includes a propensity to be empathetic, kind and good to our fellow kin and group members, plus an inclination to be xenophobic, cruel and evil to tribal others. The shock experiments reveal not blind obedience but conflicting moral tendencies that lie deep within.
November 1st, 2012 at 2:06 am
It would be wrong to attribute the divergent results of an experiment from an earlier decade to incorrect or biased interpretation of the data. Attitudes toward authority have changed so much in the last few decades that comparisons are difficult, if not impossible.
To my parents, questioning the decision of a priest, doctor or politician was tantamount to treason. Many, if not most, people believed what they were told without question. These days I want to see evidence; I want to prove to myself that someone is trustworthy before I accept that what I am being told is true.
Indeed, I have come to expect that anyone in authority is going to attempt to cheat or defraud me for their own gain.
These changes in attitudes, which seem to have occurred in many areas of society, have made me more moral: less willing to accept what I am told until I have thought it through and arrived at my own decisions. I certainly would not do anything a researcher told me to do if it went against my principles and morals, whereas I cannot say the same of my parents.
November 1st, 2012 at 11:09 am
Michael:
I think you might want to re-read Milgram’s study and his conclusions. At no point did he state that people blindly accepted the authority. He spent a good deal of time in his too-short life discussing the PROCESS of the inner conflict his subjects suffered, and the great pains he took to set up the perfect scenario in which he could get them to overcome their deep desire to avoid hurting others.
Paramount among these variables was AUTHORITY. The subjects in this replication thought they were taking part in a reality television show. The subjects in Milgram’s initial study thought they were part of a SCIENTIFIC study at YALE university; that’s quite a bit different, and it could easily be argued that the latter had more authentic authority than some random TV producer.
Moreover, I see many similarities between this replication and the much more thorough Milgram study. It’s not like your subjects simply refused to take part. They “shocked” the subject, complained and had inner conflict, and eventually stopped. That’s pretty much what happened in 1960; the differences are likely due to the fact that Milgram’s study was carried out much more professionally.
November 1st, 2012 at 1:21 pm
This seems pretty consistent with Milgram… I’ll never forget watching the original film in Psyc 101; they all looked tortured as they did what they did. Yes, a bigger proportion refused to go to the end here, but it was a small sample.
November 1st, 2012 at 7:48 pm
I’m a bit distressed that this was replicated at all. Would a university IRB approve it?
November 7th, 2012 at 2:14 pm
…and their [priest/minister] insisted even more that they hate [gays, humanists, women’s choice, …] and though they were clearly uneasy, they continued to increase their hatred…
November 12th, 2012 at 9:18 pm
A similar 90-minute TV game show called “Le Jeu de la Mort” (Game of Death) was shown on France’s FR2 TV channel in 2010 (full version in French here: http://www.youtube.com/watch?v=pau7aDYrxFw). The setup was brilliantly convincing, but fake and closely supervised by psychologists.
Most striking of all, in my view, was not the willingness of most participants to inflict pain on an innocent subject as the studio audience egged him or her on, but the outstanding courage of those who stood their ground and refused to obey, ignoring pressure from the game host and the studio audience. These are the people we should be studying. What makes them special? What can we learn from them?
November 18th, 2012 at 11:02 am
Milgram’s study has become practically iconic and has surely influenced our view of human nature — in a negative direction, unfortunately. No one doubts that some people are capable of the mindlessly obedient cruelty he describes (consider the Nazi death camps), but it has not been pleasant to believe that a majority of people are so easily persuaded by a clipboard and an authoritative tone of voice to abandon their moral principles. I am very glad that your results are so different. Whether or not your subtext is a critique of Milgram’s integrity, I thank you for your work.
December 1st, 2012 at 2:22 pm
I usually like Michael Shermer, but oh, what irony.
This whole replication seems to have been a Milgram experiment on Mr. Shermer. Would Mr. Shermer replicate an experiment generally considered an icon of unethical social research? Would he stop the replication? Evidently not, not even when he recognized that the participants were visibly shaken and disturbed.
Mr. Shermer, it was YOU who played the weak-willed, obedient dupe this time.
He didn’t even have a good excuse, given that the replication clearly used ad hoc methods ill-designed to test any hypothesis… which might have been convenient, given that there didn’t seem to be one until the “experiment” had ended. Now, he wants to draw “conclusions” based on his own speculation and the loosely cobbled feedback of SIX (count ’em, SIX) whole participants.
Grow up, Mr. Shermer. If you’re going to violate ethical principles, at least use some rigorous methods and get a decent sample size. (By the way, Mr. Shermer dismisses only an over-simplified caricature of Milgram’s interpretations and conclusions.)
How very disappointing, on several levels.
January 19th, 2013 at 5:12 pm
Ncooty, your comment was very entertaining, powerfully argued, and… ridiculous. Michael Shermer knew exactly what he was getting into before he was ever subjected to the “authority figures” on the set. If you want to accuse him of SOMETHING, I guess it could be greed or sadism or scientific curiosity unmediated by empathy, but certainly not meek obedience to authority.
Sheldon W. Helms, your comment was right on target. I saw a video interview once with Milgram and he said if there was one oft-neglected crucial point about his series of studies that badly needed emphasis and re-emphasis, it was this: varying even small details of the set-up had surprisingly large impacts on the outcome. And I remember from the original study that when the subjects were protesting and seemed on the point of refusing to continue, the “authority” figure, in a white lab coat, would say, “The experiment must continue.” So if altering even tiny details had a major effect, changing a scientific experiment presided over by a white-laboratory-coated apparent professor to a reality/game show presided over by an Alex Trebek type could easily account for the lower levels of obedience.