The shock of the old

Stephen D. Reicher and S. Alexander Haslam introduce a special feature which reconnects with Milgram’s vision for social psychology

Stanley Milgram had an epic vision for social psychology: to create strong experimental contexts that would demonstrate the power of the social world to shape individual behaviour. Consistent with this goal, variants of the obedience paradigm reveal participants’ propensity to show not only total obedience but also total disobedience. This article argues that the key remaining task for researchers is to explain this variation, but that to do this we need to reconnect with the richness of Milgram’s data and ideas. This theme is echoed and elaborated in other contributions to this special feature. Alexandra Milgram tells us about the passions that motivated her husband’s life and work; Jerry Burger discusses his replication of the Milgram studies; film scholar Kathryn Millard explores an overlooked side of Milgram; and historian Richard Overy considers the impact of Milgram’s ideas on our understanding of destructive obedience in, and after, the Second World War.

The year is 1961. The Bay of Pigs invasion is launched. The ANC begins its armed struggle against apartheid in South Africa. Patrice Lumumba is murdered in the Congo. Freedom Riders are firebombed in Alabama. Homosexuality is still a crime in the UK. Yuri Gagarin breaks free of our earthly shackles. Colour television is first demonstrated in the UK – but neither flavoured crisps nor sunscreen factors have yet been introduced.

Who would want to go back? We seem to have progressed so far, both socially and technologically, over the ensuing half-century.

But not in every way.

In 1961 Stanley Milgram conducted his famous ‘obedience to authority’ experiments at Yale University (Milgram, 1963). These studies are so well known to us that they hardly need describing. Even those who have never studied psychology and who cannot recall Milgram’s name still know of ‘those studies’ in which people gave apparently lethal shocks to a learner when told to do so by an experimenter. As a result, his work has spilled over from the world of science into the world of popular culture – notably when William Shatner of Star Trek fame played a Milgram-like figure in a TV dramatisation of the obedience studies. His ideas frame popular understanding of one of the most pressing issues of our age: how can humans act with such inhumanity to their fellow beings? To cite one of the other greats of post-war social psychology, Muzafer Sherif: ‘Milgram’s obedience experiment is the single greatest contribution to human knowledge ever made by the field of social psychology, perhaps psychology in general’ (Takooshian, 2000, p.10).

Sherif is perhaps being modest here, for his Boys’ Camp studies (Sherif, 1956) conducted between 1949 and 1954 inaugurated what might be regarded as a heroic age of social psychology – of which Milgram’s studies were, perhaps, the most outstanding example. Before Sherif, the firm assumption was that those who did monstrous things must themselves be monsters. As the world sought to assimilate the emerging evidence of the Holocaust, it was easiest to believe that those who kill and maim and torture must have something about them which marks them as different from decent folk. More specifically, research concentrated on the pathologies of an authoritarian personality – suggesting that tyranny and oppression were the preserve of those people who crave order, who defer to authority, and who are highly punitive towards those who deviate from the norm (Adorno et al., 1950).

What Sherif did was to challenge such individualistic accounts by revealing the consequences of varying the social setting. However, he didn’t content himself with pallid and trivial interventions that hardly engage the subject. Rather, he conceived of things on an epic scale. He took control of a whole social world – the world of the American adolescent summer camp – for an extended period and then examined the consequences of systematic variations in the organisation of that social world. First, he divided people into different groups in which they worked and played and ate and slept 24 hours a day. Then he put the groups in competition against each other, and finally he contrived to make them cooperate.

What emerged with great force and clarity from the Boys’ Camp studies was that when you are able to vary the worlds in which people live you can transform the ways in which they behave. There is an emblematic moment in the early studies, when competition and conflict were at their height, where Sherif remarks that, had an outside observer entered at this point, he would have regarded the boys (who were chosen for being the ‘cream of the crop’ in their communities) as vicious and disturbed youngsters. There is a forceful analogy here. For psychology is often like that outside observer who – failing to take account of context and history – provides individualising and pathologising accounts of social processes.

Stanley Milgram had the same ambition and vision as Sherif and the same sense of the epic. In his postdoctoral research he had worked with Solomon Asch on the topic of conformity (Asch’s line-judgement studies, of course, being also classics in social psychology: e.g. see Asch, 1955). But Milgram was dissatisfied with studies that looked at the effects of peer judgement on estimates of line length. He wanted to look at the types of behaviour that motivated research on conformity in the first place. Milgram, born in 1933, was of Jewish background. Even as a young child he was aware of what the Nazis were doing. He later acknowledged how the Holocaust shaped not only his interest in obedience but also the way he examined it. For him, what conformity meant was going along with others in harming, even killing, innocent people. He set out to devise a paradigm that would allow him to investigate when, how and why this happens.

The obedience paradigm was the outcome. In this, Milgram, like Sherif, created a whole new social world for his participants. His studies were like pieces of theatre in which the participants were unwitting actors.

As recently documented by Nestar Russell (2011; see also his ‘Looking back’ piece in the September 2010 issue of The Psychologist), Milgram put great effort into the set (much care was put into designing the shock machine as a huge, intimidating and imposing piece of equipment), great effort into the actors (spending considerable time carefully recruiting his confederates), and great effort into the script (the prompts used by the experimenter were carefully designed, as were the reactions of the ‘learner’ who supposedly received the shocks). The ultimate effect was as compelling for the viewer as for the participant. Milgram made a film of the studies with the simple title Obedience. Watch it. As the participants agonise over what to do, as they struggle over whether to heed the words of the experimenter or the cries of the learner, it is hard to tear oneself away. Milgram’s studies endure as great drama as well as great science.

At first, Milgram doubted that anyone would follow the experimenter’s instructions up to the maximum shock level of 450 volts. When he described the set-up to panels of psychiatrists, college students, and middle-class adults, not one person out of 110 predicted that they themselves would go all the way. On average they thought that they would go up to a point between 120 volts and 135 volts. Then, when he conducted pilot studies on Yale students and found that most did actually go on until the bitter end, he dismissed that as saying something about the privileged upper-class ‘Yalies’. However, when he found repeatedly that ordinary Americans from many walks of life did likewise, he began to take note. Milgram finally realised that he had unearthed the ‘phenomenon of great consequence’ of which he had always dreamed (Blass, 2004).

What Milgram had shown, like Sherif, was the power of context. In ways even more dramatic than his predecessor, Milgram had demonstrated how far one can transform human behaviour when one is in a position to transform the social relations in which we are enmeshed. Indeed, what is often under-appreciated is the extent to which Milgram then examined precisely what sort of situations increase or decrease conformity – producing both obedience and disobedience (Milgram, 1965). In all, he conducted nearly 40 versions of the paradigm (the exact number depends upon how many pilots are counted as actual studies; over 20 are described in his 1974 book Obedience to Authority) in which the level of conformity varied from 100 per cent to 0 per cent.

So what is it about context, then, that explains this extreme variation in responses? This, of course, is the critical question (Reicher & Haslam, 2011). But it is also the blind spot in Milgram’s work – or perhaps, more accurately, in the way Milgram’s work has been represented and appropriated. In his early papers, Milgram provides many rich insights into this question. But the explanation that came to predominate (and that occupies a central place in his 1974 book) rather ironically ignores these questions. That explanation suggests that, when confronted with an authority figure, people withdraw into an ‘agentic’ state in which they concentrate on how well they obey and ignore what it is that they are obeying. Their focus is on being good followers, not good people – and it is this that allows great evil to be done.

The main problem with this account is that the studies themselves provide no real evidence of such an agentic state. It doesn’t tally with how people behave in the sessions and it certainly can’t explain differences between different versions of the study. Even Milgram’s fervent admirers (and we are amongst them) are unconvinced by the ‘agentic state’ explanation. The general consensus seems to be that we are left with a compelling phenomenon that lacks a compelling explanation.

That is why it is so important to continue along the path that Milgram started. After all, the phenomenon of human inhumanity hasn’t gone away. The sombre roll call in the years since 1961 – Bosnia, Rwanda, Sudan, Syria – makes that abundantly clear. We need to understand precisely how context impacts on psychological processes in order to understand – and try to reduce the likelihood of – atrocity.

But there is an obvious problem. Ethics.

The great field studies of Milgram, and then Zimbardo, were harbingers of their own demise. The very power of what they produced raised questions about our right to inflict such experiences upon people in the name of scientific progress. In many ways, history has been unkind to Milgram on this score (see also Kimmel, in last month’s issue of The Psychologist). He was in fact a pioneer in ethical procedures. He was the first to develop structured debriefings. He was the first to follow up his participants and provide systematic data on the impact of participation in his studies. What is more, he argued convincingly that the impact was overwhelmingly positive and that there is little evidence of any harm. Nonetheless, the route to tyranny often derives from the sense that we have the right to use others in order to further our own notions of what is good. Quite rightly, ethical standards have become tighter over recent decades. As a result, the Milgram studies in their original form would not be acceptable science today.

However, this is not the end of the story. Creative ways have been found to overcome ethical concerns. One involves the use of virtual reality paradigms. Another involves replicating the studies only up to the 150-volt point (since what people do at this point predicts very well how they would respond subsequently; Burger, 2009, and in this issue). Thus, following a major debate in American Psychologist last year, it seems that – with sufficient care and creativity – the obedience paradigm is open for business once more.

What we need, though, are not simply ethical solutions. We need to rediscover the vision, ambition and epic sense of scale that Milgram embodied. We need to go back to that heroic era of great field studies where researchers were able to manipulate whole social worlds. For two things are certain.

First, if we do so, we will see again (and perhaps understand more) how social context impacts on human behaviour and how what we do is as much a consequence of what lies outside us as what lies within our brains.

Second, if we fail to do so, and if we close off exploration of the importance of social variability, we will inevitably lay too much emphasis on other forms of variability. In other words, social psychology will become – or, perhaps, remain – sorely unbalanced.

Not everything has been progress since 1961. In some areas of psychology – as in society and technology – we have clearly made massive strides forward. But, with Milgram as a guide, there are ways in which we need to go back to go forward.

Stephen D. Reicher is Professor of Social Psychology at the University of St Andrews

S. Alexander Haslam is Professor of Social and Organisational Psychology at the University of Exeter


Adorno, T.W., Frenkel-Brunswik, E., Levinson, D.J. & Sanford, R.N. (1950). The authoritarian personality. New York: Harper.

Asch, S.E. (1955). Opinions and social pressure. Scientific American, 193, 31–35.

Blass, T. (2004). The man who shocked the world. New York: Basic Books.

Burger, J. (2009). Replicating Milgram. American Psychologist, 64, 1–11.

Kimmel, A.J. (2011). Deception in psychological research – A necessary evil? The Psychologist, 24, 580–585.

Milgram, S. (1963). Behavioral study of obedience. Journal of Abnormal and Social Psychology, 67, 371–378.

Milgram, S. (1965). Some conditions of obedience and disobedience to authority. Human Relations, 18, 57–76.

Milgram, S. (1974). Obedience to authority: An experimental view. New York: Harper & Row.

Reicher, S.D. & Haslam, S.A. (2011). After shock? British Journal of Social Psychology, 50, 163–169.

Russell, N.J.C. (2010). The making of an (in)famous experiment. The Psychologist, 23, 780–783.

Russell, N.J.C. (2011). Milgram’s obedience to authority experiments: Origins and early evolution. British Journal of Social Psychology, 50, 140–162.

Sherif, M. (1956). Experiments in group conflict. Scientific American, 195, 54–58.

Takooshian, H. (2000). How Stanley Milgram taught about obedience and social influence. In T. Blass (Ed.), Obedience to authority (pp. 9–21). New York: Taylor & Francis.
