“If a system of death camps were set up in the United States of the sort we had seen in Nazi Germany, one would be able to find sufficient personnel for those camps in any medium-sized American town.” – Stanley Milgram
It wasn’t until the early sixties that Adolf Eichmann (one of the key architects of the Holocaust, for whom the phrase “the banality of evil” was coined) was finally brought to trial. His primary defense was the one commonly used by others during the earlier Nuremberg trials: he was “just following orders.” To answer whether such a defense could even conceivably excuse an individual who participated in such atrocities, psychologist Stanley Milgram conducted a series of psychological studies on obedience to authority. He found that roughly two-thirds of people were willing to administer fatal shocks upon the command of an authority figure. Let that sink in for a second.
In the early seventies, Philip Zimbardo conducted the famous Stanford Prison Experiment. Basically, he recruited a bunch of students, split them into groups of prisoners and guards and then placed them into an immersive prison simulation. Within days, “prisoners” were experiencing mental breakdowns and “guards” were engaging in abuse. The experiment, scheduled to last two weeks, was aborted after six days.
In his excellent book “The Lucifer Effect,” Zimbardo reflects on the enormous power that social dynamics or situational forces exert on individuals. Given the right circumstances, we are all capable of acting heinously. But what are those conditions? Zimbardo offers us ten, as lessons learned from Milgram. These are steps to turn “good” people “evil” (evil traps). All ten lessons are stated very well in his book. Lessons three, five, and eight are the ones that I will focus on. They are:
3. Enforce adherence of “the rules” – arbitrary or changing, they must be followed!
5. Promote the diffusion of responsibility (“I just work here…”).
8. Have the authority start out “just,” and only gradually change to “unjust.”
Most organizations set many of the ten conditions. It’s really in this lesson that the slide towards “evil” happens: “Have the authority start out ‘just,’ and only gradually change to ‘unjust.’” Without this lesson, we are left with basically every bureaucracy on the face of the planet. But if we view this lesson more as a decoupling from principle than a slide towards the unjust, we can even see it at work in many bureaucracies. In this way, it is really just an instance of the organization’s leadership falling victim to, as Zimbardo says, “Enforce adherence of ‘the rules’ – arbitrary or changing, they must be followed!” Absent some external correction mechanism, this situation leads to a sort of moral domain shear (MDS) – the organization develops its own moral world, replete with an internal logic but disconnected from the surrounding world.
How is it that a soldier can go to war, kill people and remain morally intact? Well, the soldier is no longer in the usual moral world when they go to war. They have left for a moral domain where killing is acceptable, as long as the rules are followed. But this is the same process at work, whether the result is found to be good or bad.
Steve Biko, founder of the Black Consciousness Movement under apartheid rule, taught that all whites in South Africa shared in the responsibility for apartheid – not because they were white, per se, but because they participated in and benefited from the system. Let’s consider lesson number five: “Promote the diffusion of responsibility (‘I just work here…’).” These citizens were effectively members of an organization that suffered from MDS, and he would not let them hide behind diffused responsibility. Nor did the West allow Eichmann or many other Nazis to shirk their responsibilities.
When our governments (powerful organizations) take action, we the citizens (participants, not subjects) are responsible for it. We are not solely responsible, but each of us owns a piece of the responsibility. Our governments act in our name. We, the people.
If MDS is practically a feature of organizations and can potentially “corrupt” any of us, but we still retain responsibility for the group and moral culpability for ourselves, what can we do? What if the extent of our political voice is to choose the lesser of two evils?
Remember last week’s lesson: there is no box. If a system produces only unsatisfactory outcomes, maybe it’s time to realize that the system is the problem.
I, for one, wouldn’t want to put Milgram to the test on that opening quote. Maybe it’s time to choose a different system, one that doesn’t produce concentration camps.