[LINK] Behind the Shock Machine: book reexamining Milgram obedience experiments

post by DanArmak · 2013-09-13T13:20:44.900Z · LW · GW · Legacy · 14 comments

There's a book called Behind the Shock Machine by psychologist Gina Perry, published just a week ago, which investigates the original Milgram obedience experiments. I haven't read it, but I've read a summary / editorial published in the Pacific Standard.

Of course, the editorial is in some measure designed to provoke outrage, generate click-throughs, and leave readers biased against Milgram. I don't trust the editorial to report unbiased truth. If anyone has read the book, what do you think about it?

Key quote from the editorial:

Perry also caught Milgram cooking his data. In his articles, Milgram stressed the uniformity of his procedures, hoping to appear as scientific as possible. By his account, each time a subject protested or expressed doubt about continuing, the experimenter would employ a set series of four counter-prompts. If, after the fourth prompt (“You have no other choice, teacher; you must go on”), the subject still refused to continue, the experiment would be called to a halt, and the subject counted as “disobedient.” But on the audiotapes in the Yale archives, Perry heard Milgram’s experimenter improvising, roaming further and further off script, coaxing or, depending on your point of view, coercing participants into continuing. Inconsistency in the standards meant that the line between obedience and disobedience was shifting from subject to subject, and from variation to variation—and that the famous 65 percent compliance rate had less to do with human nature than with arbitrary semantic distinctions.

The wrinkles in Milgram’s research kept revealing themselves. Perhaps most damningly, after Perry tracked down one of Milgram’s research analysts, she found reason to believe that most of his subjects had actually seen through the deception. They knew, in other words, that they were taking part in a low-stakes charade.

 

14 comments


comment by DavidAgain · 2013-09-13T15:26:06.275Z · LW(p) · GW(p)

Interesting. I have to resist the urge to dismiss this (because finding out about the experiment felt like such an amazing revelation, you don't want to think it's all made up).

I think it's quite possible that the results were exaggerated in the way people do with anecdotes: simplified to hammer the point home and losing some of the truth in doing so. I don't know what the standards of accuracy were in psychological papers of the time, so it's unclear whether to take this as just unfortunate or as evidence of dishonesty in the sense of breaking the unspoken protocols of the discipline. I'd be interested to hear what counted as 'coercing' people to press the button, though; without coercion, I don't see how the distinction between 'obey command to press button' and 'don't' can be blurred as this suggests.

The 'reason to believe people saw through it' is weird: would like more detail.

But I also tend to distrust the author (at least as represented in this editorial) because of a later section which seems like very shoddy thinking to me:

"Gradually, Perry came to doubt the experiments at a fundamental level. Even if Milgram’s data was solid, it is unclear what, if anything, they prove about obedience. Even if 65 percent of Milgram’s subjects did go to the highest shock voltage, why did 35 percent refuse? Why might a person obey one order but not another? How do people and institutions come to exercise authority in the first place? Perhaps most importantly: How are we to conceptualize the relationship between, for example, a Yale laboratory and a Nazi death camp? Or, in the case of Vietnam, between a one-hour experiment and a multiyear, multifaceted war? On these questions, the Milgram experiments—however suggestive they may appear at first blush—are absolutely useless."

This seems to be a case of rejecting a very powerful and useful bit of information because it doesn't answer a whole series of additional, arbitrarily chosen questions. If we can show that penicillin can stop infection, this is useful. And it isn't 'doubted at a fundamental level' by saying:

'Even if 65 percent of patients got better, why did 35 percent not? Why might one infection respond, and not another? How do people get infected in the first place? Perhaps most importantly: How are we to conceptualize the relationship between, for example, a London hospital and a battlefield? Or between an urgent case of gangrene and a chronic illness slowly becoming more life-threatening? On these questions, the Fleming experiments—however suggestive they may appear at first blush—are absolutely useless.'

These should be interesting new angles to explore, not reasons to ignore the original study.

Replies from: DanArmak, Protagoras
comment by DanArmak · 2013-09-13T20:17:20.020Z · LW(p) · GW(p)

But I also tend to distrust the author (at least as represented in this editorial) because of a later section which seems like very shoddy thinking to me

It jumped out at me too. To me it reads like drivel added by the article's author to finish on a meaningless note of deep wisdom. That the article says this doesn't provide much evidence one way or the other about the book or its author.

comment by Protagoras · 2013-09-13T20:23:04.330Z · LW(p) · GW(p)

Indeed. IIRC, some of the follow-up experiments found that when there are multiple people involved, once one of them defies the authority it becomes much more likely that others will fail to comply as well (an effect not seen in the original study, which of course only applied authority to one subject at a time). On the surface, this seems to suggest that authoritarian regimes should have a problem; the existence of any opposition should substantially undermine their authority. I can speculate about why they are sometimes able to succeed anyway, of course. A government is a much more powerful authority than a researcher, and is able to operate over the long term; that difference is huge enough that I could imagine it pushing compliance from the 60s into the 90s. And once opposition is in single digits, it may become possible to dehumanize and demonize them sufficiently to prevent people from seeing them as a possible example to follow. But I'd love to see some research which provides more than such guesswork. I know plenty of books have been written about how authoritarian regimes maintain power, of course, but mostly they've seemed to me to contain guesswork of their own, usually supported by anecdotes. This may be because this is such a hard topic to research (where to get the data? And so many possible confounding factors for any hypothesis!), but I'd love to discover that I just haven't found the right books.

comment by Desrtopa · 2013-09-13T16:05:50.191Z · LW(p) · GW(p)

I'll point out that for any criticism to cast meaningful doubt on the results of the Milgram Experiment, it would also have to be applicable to the replication.

Replies from: Ishaan
comment by Ishaan · 2013-09-14T03:11:25.893Z · LW(p) · GW(p)

Counterpoint: the reviewers are aware of the replication and still seem to lend the book credence:

Jerry M. Burger, a psychologist who claimed to have reproduced Milgram’s results. As it happened, Burger had been put up to his act of scientific replication by ABC News, which funded his research and aired footage of his experiments during “The Science of Evil,” a 2007 episode of its Basic Instincts series pegged to the atrocities at Abu Ghraib.

and also here:

http://www.psychologytoday.com/blog/fulfillment-any-age/201301/the-secrets-behind-psychology-s-most-famous-experiment

I'm still not convinced that Milgram's findings are invalid, but short of reading the book I'd like to see some unfavorable reviews from people who have read it before dismissing it.

comment by falenas108 · 2013-09-13T16:10:56.987Z · LW(p) · GW(p)

Weren't there tons of replications of the experiment? My impression was that everyone was shocked at the initial outcome of the study and rushed to replicate.

Googled, and yep. http://en.wikipedia.org/wiki/Milgram_experiment#Replications_and_variations

Replies from: falenas108
comment by falenas108 · 2013-09-13T16:11:58.745Z · LW(p) · GW(p)

Or rather, people didn't rush to replicate. So that part was wrong. But there were replications that came out with about the same numbers.

comment by private_messaging · 2013-09-13T16:35:40.367Z · LW(p) · GW(p)

Reads like hair-splitting over a rather unimportant detail.

The interesting thing about the outcome is that most people are rather easily talked into doing things they wouldn't otherwise do. Not that specific sentences work on exactly 65% of participants - a rate that would undoubtedly depend on the experimenter's appearance, the depth of their voice, and so on.

edit: Somewhat amusingly, the "AI box experiment" seems to have had a 60% victory rate, though the sample size is way too small.

comment by Eneasz · 2013-09-13T18:24:52.164Z · LW(p) · GW(p)

I think a much more interesting take on Milgram is one presented by Radio Lab in which it's put forth that people were particularly likely to obey due to the spirit of scientific exceptionalism during that era. Science had won WWII, science was responsible for our prosperity and was making life better for everyone on earth, and they were helping that cause. It was idealism and optimism that prompted people to go beyond their own bounds in the pursuit of the greater good, rather than cynicism and obedience.

Also of note: no one who continued all the way to the final shock ever heard the final prompt (“You have no other choice, teacher; you must go on”). Anyone who did hear that prompt would instantly fight back and refuse to continue. Being told you have no other choice was apparently counterproductive and would trigger resistance.

Replies from: falenas108
comment by falenas108 · 2013-09-14T00:49:36.933Z · LW(p) · GW(p)

But they've repeated the study in recent years and gotten about the same numbers. Unless you're saying that the spirit of scientific exceptionalism is still present?

Replies from: Pablo_Stafforini
comment by Pablo (Pablo_Stafforini) · 2013-09-15T19:29:12.672Z · LW(p) · GW(p)

And if the spirit of scientific exceptionalism is still present, then this explanation does little to undermine the practical significance of Milgram's findings.

comment by ILikeLogic · 2013-09-20T02:39:43.325Z · LW(p) · GW(p)

In an episode of the Freakonomics podcast, they discussed similar skepticism about Philip Zimbardo's Stanford prison guard experiment. The 'guards' felt subtly encouraged to become abusive to the 'prisoners'.

Replies from: Richard_Kennaway
comment by Richard_Kennaway · 2013-09-20T06:24:33.845Z · LW(p) · GW(p)

The 'guards' felt subtly encouraged to become abusive to the 'prisoners'.

If being merely subtly encouraged to become abusive resulted in the guards being abusive, that is an important result. In contrast, Milgram's electric shock experiment used a man in a white coat explicitly telling the subject to give the shocks.

Replies from: ILikeLogic
comment by ILikeLogic · 2013-09-21T18:19:21.351Z · LW(p) · GW(p)

I agree it is an interesting result, but it isn't really the way the study has been portrayed. The takeaway, before hearing about this, was that anyone with power will start to abuse it on their own, if just left to their own devices. But this is not, it now seems, what really happened in the Zimbardo prison guard experiment. So just like with the Milgram shock experiments, important information was missing, causing the results to imply an overly negative picture of human nature.