As a software developer I love to code, but over the last year I have gained more and more of an appreciation for marketing (it’s what makes the world go ‘round, that and money, if you can believe the song), so I follow quite a few blogs on the subject. One of the best blogs about online marketing right now is Brian Clark’s Copyblogger, so when Brian recently released a free report titled “Authority Rules” I had to have a read. Imagine my surprise when the report kicked off by describing the Milgram experiment. I was so surprised because I had an interesting discussion about the Milgram experiment just a few days earlier, in a Lean workshop that some colleagues and I did with Jason Yip. The Milgram experiment can teach us several important lessons, so I believe it is well worth sharing; the more people who know about it, the better.
The Milgram Experiment
I am quoting most of this from Brian’s report:
Let’s say you see a newspaper ad saying the psychology department at Yale is running a little “experiment on memory.” Paid volunteers are needed for the hour-long study, so you figure why not? Upon arrival at the lab, you meet two men — a research scientist in a lab coat, and another volunteer just like yourself. The researcher proceeds to explain the study to you both. He tells you the study is about the effects of punishment on memory. The task of the other volunteer will be to learn a series of word pairings (he’s called the “Learner”). Your job as the “Teacher” will be to test the Learner’s memory of the word pairs, and administer electric shocks for each wrong answer. And for every new wrong answer, the voltage goes up.

You’re not sure about this whole thing, but it must be okay, right?
The testing begins, and when the other volunteer misses a question, you pull a lever that delivers a mild shock. Over time, though, the shock levels increase, and the Learner is grunting audibly. At 120 volts, he tells you the shocks are really starting to hurt. At 150 volts, he tries to quit. The researcher tells you to keep going, and that the shocks will cause “no permanent tissue damage” to the Learner. You continue questioning and delivering punishment for incorrect answers. At 165 volts, the Learner screams. At 300 volts, the Learner refuses to respond any longer, as the shocks are impairing his mental capacities. The researcher tells you to treat non-responses as incorrect answers.

The Learner is screeching, kicking, and pleading for mercy with every subsequent shock, all the way up to 450 volts when the researcher finally stops you.
Scary story.
This couldn’t possibly have really happened, right? Well, actually, it did, in 1963 at Yale, during a series of experiments by Stanley Milgram. But here’s the real scoop about the Milgram experiment:
- there were no actual electric shocks
- the Learner was an actor
- the study had nothing to do with memory
What Milgram wanted to know was how far the Teachers would go when told to continue delivering the shocks, which they believed were real. About two-thirds (65%) of the subjects administered every shock up to 450 volts, no matter how much the Learner begged for mercy. The study also found, however, that without the researcher’s encouragement to continue, the subjects would have stopped administering punishment quite early on.
What Lessons Can We Learn From This?
This experiment has been used to explain how, in wartime, some people can commit horrendous atrocities while their peers not only fail to stop them, but in fact go along with it, to the point of becoming directly involved in committing the atrocities themselves (e.g. the Holocaust during World War II). One lesson we can learn from this is that people tend to go along with someone they consider an authority, even when doing so goes against their better judgment. There are numerous ways we can justify our actions afterwards (only following orders, didn’t know any better, etc.), but at the end of the day it comes down to this: when we believe that someone knows more than us about a subject, they can get us to do what they want most of the time (or 65% of the time, if you can believe the experiment).
The thing is, though, even when we do go along, it is often with strong reservations, which means our own mind and judgment are still very much in play. So why do we so often let our moral compass be overridden by external factors? This, I believe, leads us to the second and most important lesson we can glean from the Milgram experiment. Trust your instincts, ask questions, and challenge those around you no matter how expert you think they are (if their position is strong, they will be able to defend it with cool, logical argument, without misdirection).

This is true for any software project (and for life in general): it is every developer’s responsibility to question processes, patterns and, yes, even code that you believe can lead to trouble. If everyone on a team has this kind of attitude, you will never be an outcast for doing so; instead, the whole team will either fill you in on what you’re missing, or will work with you to fix or mitigate the issues you’ve raised. This, in my opinion, is the crux of the Agile spirit (it is also a great example of leadership, regardless of whether you hold a leadership role).

I for one would never want to be one of the 65% of people who would shock another human being to the point of insensibility. So I’ll leave you with this question: what are you doing to make sure that if you find yourself in a ‘Milgram experiment’ type of situation, you’re not one of the 65% either?
Image by takomabibelot