Free Will and the Mob
They made me do it
Psychologist Philip Zimbardo from Stanford University in California thinks not. He believes the judge was guilty of the “fundamental attribution error” – overestimating the effects of someone’s temperament on their behaviour and underestimating the effects of the environment in which they were acting. Zimbardo was an expert witness on Frederick’s defence team. He interviewed Frederick at length before the trial and carried out extensive psychological tests, and found no hint of mental illness or sadistic tendencies. “In many ways this soldier was an American icon: a good husband, father and worker, patriotic, religious, with many friends and a history of having lived a most normal, moral small town life,” says Zimbardo. Then Frederick went to Abu Ghraib and turned into a monster.
This may be an extreme case, but such transformations are surprisingly common. You find them in just about any environment in which an individual is subsumed into a group or is reacting to what others are doing: rioting mobs, football crowds, committees, social networks, even panels of judges. In such situations a group mentality can easily take over, leading people to act out of character or adopt extreme or risky positions. In an analysis that considered 25,000 social psychology studies, published a few months after the Abu Ghraib abuses emerged, Susan Fiske at Princeton University concluded that almost everyone is capable of torture and other evil acts if placed in the wrong social context (Science, vol 306, p 1482). “Our society tends to focus on individual psychology,” says Zimbardo. “All our institutions – in war, law, religion, medicine – are based on this concept.” Yet if we don’t understand the power of group psychology we can never hope to combat evils such as torture, suicide bombings and genocide, or indeed avoid making bad decisions or committing despicable acts of our own.
Zimbardo has famously shown how easy it is to turn peaceful people abusive and hostile. In an experiment at Stanford University in 1971, he recruited students to play the roles of prison guards and inmates. After six days the experiment had to be stopped because the guards – ordinary summer-school students selected for their healthy psychological state – had pushed many of the prisoners to the point of emotional breakdown. In a similar experiment conducted a decade earlier, Stanley Milgram of Yale University persuaded ordinary people to administer electric shocks to a “victim” sitting behind a screen. Without much trouble Milgram had all of them increasing the voltage until the victim was screaming (the victim was an actor, but the participants didn’t know that). Two-thirds of them carried on until the victim was apparently unconscious.
“If you can diffuse responsibility so people don’t feel accountable, they will probably do things they normally never would,” says Zimbardo. Milgram did this by telling the participants that he was in charge, and that he himself would take responsibility for anything that happened. Zimbardo gave his “prison guards” all the symbols of power of real guards – uniforms, whistles, handcuffs, sunglasses – effectively giving the volunteers permission to behave like them. He also ensured that prisoners were known only by numbers, not by their names. Many studies have found such anonymity to be an effective tool for changing the way someone is treated, or how they treat others. You find the same effect outside the lab. In 1971, anthropologist John Watson from Harvard University found that warriors from tribal cultures renowned for their barbaric treatment of enemies usually wear masks or paint their faces when going into battle, while those who go to war unadorned tend to be far less brutal. Likewise, many commentators have observed that people perpetrating crimes such as torture and genocide often dehumanise their victims by thinking of them as animals. Following on from Milgram’s experiment, Albert Bandura from Stanford University found that people would administer more severe electric shocks if he told them that the recipients (whom they could not see) seemed “like animals”.
Personal allegiance
Groups can create environments that diminish individual responsibility, but they can also exert their hold in another way. “There is a significant difference between mob behaviour, in which anonymity and imitation are the important factors, and the direct influence of a group, which involves personal allegiance to leaders and comrades,” says Ariel Merari, a psychologist at Tel Aviv University in Israel and an expert on Middle Eastern terrorism. Groups that recruit suicide bombers are among those that use the latter approach, building a sense of community and encouraging feelings of responsibility towards other group members: the “brotherhood mentality”. Here, individuals take responsibility for their own actions within a culture where suicide bombing is seen as glorious. Then, by recording farewell messages to family and friends either on videotape or in writing, they make a commitment to their own martyrdom that they cannot renege on without losing face (New Scientist, 15 May 2004, p 34).
All of this is a long way from the situations that most of us face. Yet many of the decisions we make every day are heavily influenced by what others are doing. In a study published last year, for example, Duncan Watts and colleagues at Columbia University in New York showed that chart-topping pop songs are so much more popular than average not because they are significantly better but because consumers are influenced by the buying habits of others (Science, vol 311, p 854). This is known as the social cascade effect, a phenomenon in which large numbers of people end up doing or thinking something on the basis of what others have done.
There are two mechanisms at work here, says Watts. “The first is social learning. The world is too complicated for each individual to solve problems on their own, so we rely on the information that is encoded in our social environment – we assume other people know things we don’t.” Then there is social coordination, where you want to do the same thing as other people not because you think it is better but because what matters is doing things together. “Liking the same song, movies, sports and books not only gives us something to talk about, but makes us feel like we’re part of something larger than ourselves.” As well as directing consumers’ buying habits, these two forces can influence financial markets, protest movements, and even – through opinion polls – how we vote.
It is not surprising that people should be so susceptible to the dynamics of their social environment. After all, we evolved as social animals in environments where cooperation and group cohesion were key survival tools. Our reasons for being influenced by others are often valid, but if we are not careful this tendency can get us into trouble. In a classic study carried out in the 1950s, for example, social psychologist Solomon Asch revealed how the peer pressure associated with being part of a group can lead people to deny the evidence of their own senses. When asked simply to match the length of a line on a card with one of three reference lines, 70 per cent of his subjects ignored their own judgement and sided with the rest of their group who, unbeknown to them, had been primed to make a blatantly wrong choice.
When any group of like-minded people gets together, the result can be equally alarming. One common effect is that the group ends up taking a more extreme position than the one its members started with – it becomes polarised. For example, a group of people who begin a discussion believing George Bush’s policies on Iraq are merely ill-advised may finish convinced that his policies are insane. Cass Sunstein, professor of law and political science at the University of Chicago, has identified two reasons for this. First, in like-minded groups you tend to hear only arguments that support your own viewpoint, which is bound to reinforce it. In addition, people are always comparing themselves with others and will shift their position so as not to appear out of line. The same kind of thinking is behind the phenomenon known as “risky shift”, in which adolescents, already prone to risky behaviour, are even more inclined to throw caution to the wind when they are with their peers.
Polarisation is related to another form of group psychology known as groupthink, where members strive for cohesion at the expense of all else. Maintaining cohesion can give a group a sense of power and bolster the self-esteem of its members, but it can also lead them to make bad and dangerous decisions. “When group cohesion is based on congeniality, criticising ideas means attacking the source of group cohesion,” says Clark McCauley, director of the Solomon Asch Center for Study of Ethnopolitical Conflict at the University of Pennsylvania, Philadelphia. As with social cascades and polarisation, problems often arise when people rely on what they think others know and fail to share useful information they might have. This mistake can be compounded by the influence of a manipulative leader. Groupthink has been blamed for the CIA’s flawed plan to invade Cuba in 1961 – the infamous Bay of Pigs debacle – and also for NASA’s failure in 2003 to recognise that the damage done to the wing of the space shuttle Columbia by a piece of foam during launch was potentially fatal. Irving Janis, the psychologist who coined the term groupthink in 1972, believed no one was immune. “Probably every member of every policy-making group is susceptible,” he wrote in a landmark paper.
Another situation in which we are all prone to assuming a strong group mentality is at times of crisis. This explains why support for national leaders increases in wartime – and why George Bush achieved almost unanimous backing for his “war on terror” after 9/11. It is understandable that people look to their own group when they feel threatened, but the result can be an escalation of tension. In a study published last year, for example, a team led by Tom Pyszczynski from the University of Colorado, Colorado Springs, found that Iranian college students who were prompted to think about their own death showed greater support for suicide attacks against the US than they would have otherwise (Personality and Social Psychology Bulletin, vol 32, p 1).
Knowing what we do about group psychology, what are the lessons to be learned? For a start, we should discourage isolated cliques of like-minded people and encourage people with opposing views to speak out – and that applies whether you are trying to prevent terrorism or elect a new school head. The flip side of this is that we should recognise that extremist groups are usually remarkably homogeneous in terms of the interests, political affiliations, age and socioeconomic status of their members. “If I were an intelligence agent trying to break a terrorist cell, if I caught one member I’d find out what food he eats and what clothing he wears,” says Scott Atran at the University of Michigan, Ann Arbor. The chances are his fellow terrorists would have very similar preferences. Accordingly, Atran and forensic psychiatrist Marc Sageman are building a database of members of jihadi terrorist networks in Europe and Asia, recording information such as family background and friends.
Another lesson is that the wider social environment influences the decisions made by groups. Pyszczynski found that he could change the attitudes of his Iranian students by convincing them that public opinion in their country was opposed to suicide attacks. What’s more, in similar studies with US students he first increased their appetite for conflict with Arabs by getting them to think of their own death, and then found he could reduce it simply by showing them photos of family life from many different cultures or reminding them of their own group values, such as compassion, and of what they have in common with others. “This is particularly encouraging as it shows a way of reversing a process that otherwise can increase public support for terrorism,” he says.
The behaviour of football hooligans can also be influenced by their social environment, according to Clifford Stott, a social psychologist at the University of Liverpool, UK. Working as a consultant to the police for the European championships in Portugal in 2004, he found that the aggressiveness of football crowds is heavily influenced by how the police treat them. Although violence has been part of the group identity of a significant section of England fans, low-profile policing at certain matches during Euro 2004 encouraged them to adopt an uncharacteristically orderly attitude, which they then maintained through self-policing (European Journal of Social Psychology, DOI: 10.1002/ejsp.338).
The idea of group psychology is rather unsettling. We like to think that we are in control of our own decisions and behaviour, not at the mercy of our social environment. It is also deeply disturbing to contemplate that any of us might have done what Frederick and the other Abu Ghraib reservists did. Yet Zimbardo also points to a positive side. His latest research looks at what makes a hero, and he has found that our universal capacity to perform evil acts under the influence of the group is matched by a universal capacity to resist peer pressure and do the right thing. “There is nothing special in the backgrounds of heroes – they choose to act on the moment. There are no predictive psychological factors,” says Zimbardo. Ordinary heroes, like ordinary monsters, are everywhere.
Joseph Darby is a perfect example. He was an army reservist in the same company as Frederick, and the person responsible for stopping the torture and human rights abuses at Abu Ghraib. Darby passed a CD of the photographs to his superior officer. He did this despite the severe potential costs to himself and his family, who are now in hiding for fear of retaliation from members of his unit. Zimbardo looked into Darby’s background. “Ordinary,” he says. “He never did anything like it before.”