The new hidden persuaders?

Ella Rhodes considers whether shadowy influencers are really pulling our strings.

More than 60 years on from the influential text about psychological manipulation in advertising, how has the landscape changed? Are we being pushed and pulled in ever-more fundamental directions, or are our fears exaggerated?

In the introduction to his bestselling 1957 book The Hidden Persuaders, American journalist Vance Packard wrote about the ‘probers’, using their psychoanalytically driven ‘depth approach’ and ‘systematically feeling out our hidden weaknesses and frailties in the hope that they can more efficiently influence our behavior’. American psychologists at advertising agencies were, he warned, ‘probing sample humans in an attempt to find how to identify, and beam messages to, people of high anxiety, body consciousness, hostility, passiveness, and so on’. Thus persuasion was painted as intrusive, targeted and all-powerful: ‘You can probably make them do anything for you: Sell people things they don’t need; make women who don’t know you fall in love with you.’ Packard’s approach arguably fuelled a mistrust that has lasted for decades.

But the world is now a very different place. By 2005 more than half of households in Great Britain had access to the internet, and even since then our social, political and consumerist lives have changed beyond recognition. The technology channelling the persuasion to us is becoming more complex: it’s harder for us to understand who is seeking influence and how, or potentially even to have any awareness it could be happening.

We’re confronted with a stream of emotional appeals, conspiracy theories, social-media-proliferated propaganda and ‘fake news’. Psychologists remain at the heart of these issues. The Cambridge Analytica (CA) scandal saw researchers use personality tests taken on Facebook to create ‘psychographic’ profiles of voters, with data sold on to enable targeted messages for the Brexit ‘Leave’ campaign group, and Donald Trump’s and Ted Cruz’s presidential campaigns. Reports of the CA data being accessed from Russia, and hundreds of social media accounts operating from the Russian Internet Research Agency and attempting to influence UK politics, refuelled the ‘Cold War paranoia’ that was so key in Packard’s era.

We’re not talking about which brand of shampoo we buy here. Political decisions reflect our ideologies. Can the ‘new hidden persuaders’ really be shaking our foundations, the very core beliefs we hold dear, and all without us even realising?

A dark forest
In the 1950 Akira Kurosawa film Rashomon, a death in a forest is shown from four entirely contradictory points of view – casting doubt on reality itself. In a recent Wired article on the ‘dawn of the post-truth era’, former head of Facebook’s targeting effort Antonio García Martínez argues that ‘everything is in a Rashomon effect, and real discourse becomes impossible’. In the new media reality ‘the universally acclaimed expert or editor has been replaced by internet-enabled rumor and hearsay arbitrated only by algorithms’, Martínez writes. ‘There are some dominant media outlets with a claim to primacy, just as every village has a particularly well-informed local gossip, but the capital-T Truth, so beloved by the French encyclopedists, will no longer exist across a broad spectrum of society… As with the film, we’ll all choose the version we find most appealing. As to what actually happened in that forest… or whatever the next national crisis is… we’ll never have a definitive account of it. We won’t even think that possible.’

Research published in Science by Soroush Vosoughi, Deb Roy and Sinan Aral layers a worrying trend on top of this: misinformation reaches more people online than the truth. ‘The top 1 per cent of false news cascades diffused to between 1000 and 100,000 people, whereas the truth rarely diffused to more than 1000 people.’ They also observed that the truth took around six times as long to reach 1500 people compared with ‘fake news’ (a data set of ‘rumour cascades’ on Twitter from 2006 to 2017).

Vosoughi told me that there have always been people trying to change our opinions through persuasion. ‘This isn’t anything new. But technology has amplified the effect. Radio and TV allowed advertisers to reach into people’s homes, and now with social media it’s become even easier for persuaders to reach people and it has become harder to trace the source of the persuasion. And then it spreads... psychologists have looked at sharing behaviour in different domains and people tend to share more surprising and interesting stories in the offline world as well. People share stories that have greater emotional valence and tend to elicit a more visceral reaction... I believe false news has that emotional punch.’

Informational hygiene
Dr Tom Stafford, a psychologist at the University of Sheffield, has a more optimistic view. He sees hope that our relationship with the digital world will mature, and that we will learn ways to resist persuasive attempts. ‘We’re in a particularly unusual time, and our culture is acting like adolescence – we haven’t developed the maturity to deal with this stuff and I think we will. It will involve things like checking facts before passing them on. We’re going to improve our informational hygiene… part of that will be technological solutions, but part of it will be cultural change brought about by learning to deal with the abundance of information and fake information. If you’re motivated to collect information that agrees with what you believe already there’s scope like never before for doing that. But that also means people can find critical views, aggregate evidence… if you’re sincerely interested in finding out the truth, it’s now easier than ever before.’

Stafford does suggest that many of Robert Cialdini’s famous principles of persuasion (outlined in his book Influence) have the capacity to be amplified by social media. ‘One of his six principles is social proof, or consensus – if you’re in a theatre and everyone stands up and gets out you’ll leave, but if someone shouts “Fire!” and everyone stays where they are, you’ll stay where you are. What everyone else does matters, and social media amplifies that signal.’ Another principle is scarcity – Stafford says that the urgency of social media is the equivalent of ads shouting ‘Offer must end today’ or ‘While stocks last’. ‘The very latest news is only fresh for a few minutes with Twitter. It’s an interesting thought experiment – if you said to me “You can look at the Twitter feed from 14 July 2013”, there’s no way that would seem as appealing as looking at updates from today, even though the quality of information is probably about the same. It doesn’t have the same feeling of urgency; I want to know what’s happening now.’

Friends and likes
Cialdini’s remaining principles are ‘consistency’ (highlight any prior actions or beliefs that are consistent with your target behaviour); ‘reciprocity’ (encourage a feeling of obligation to give back to those who have given us or shown us something); ‘authority’ (the use of apparently credible experts in persuading us to buy, do or believe something); and ‘liking’ (deliver messages via someone a person likes). These principles are perhaps universally amplified on social media. In the past a toothpaste company may have relied on putting a dentist, complete with white coat, centre stage in its ads; now we often turn to the aggregated opinions, reviews and ‘likes’ of our peers.

Indeed, some of Stafford’s own research has shown people value their friends’ opinions on a topic as highly as a scientific view – not because they see their friends as more expert than scientists, but because they know their friends are on their side. To humans, this is vital. As is the size of the crowd: in one study Andrew Flanagin and Miriam Metzger found that when user-generated reviews of a film hit a certain number, participants saw that information as credible, relied on it and believed it to be accurate. ‘As information is aggregated in high volume, it is more difficult to fake and is less susceptible to individual raters’ subjective biases. Its influence is therefore amplified.’ While participants in their study did judge the film ratings of other members of the public as less credible and accurate than those of the critics, they were willing to rely on expert opinions only when there were few ratings from non-experts. ‘User Generated Content’, the researchers concluded, ‘can equal, or even trump, expert information.’

Conversion or mobilisation?
This potential for persuasive messages to be amplified by online groups becomes all the more important given our tendency to dwell in ‘echo chambers’ – surrounding ourselves with people who hold similar worldviews and reading/liking/sharing information that chimes with our thoughts and experience. It’s confirmation bias on a mass scale, and there’s some evidence it is negatively skewed.

This is even built into the business models of the tech giants, some argue. Ian Leslie, writing in New Statesman in 2016, says that Facebook in effect charged the Trump campaign lower rates than the Clinton campaign, because Trump’s ads made people angrier and thus generated more clicks. ‘It was the logical outcome of the ad business’s core principle: the more attention you win, the more you get paid. Since negative emotions are more likely to win and hold attention than positive emotions, the system has an incentive to spread fear and loathing. Russia’s entire propaganda campaign relies on an advertising model that rewards paranoia and spite… The way we choose what to buy, like the way we choose to vote, will never be logical. Trying to make it so has created an environment in which our basest impulses are relentlessly stimulated and amplified.’

So far, so shadowy. But perhaps we need to shine a different light on the ‘new hidden persuaders’. Tom Stafford suggests our commonsense understanding of persuasion, and psychological explorations of it, do not reveal the true mechanisms behind how we are persuaded, and potentially overstate the extent.

‘This simplistic model of persuasion would be easiest to capture by psychologists in experiments,’ he tells me. ‘You get people in, measure their attitude, do something, then 10 minutes later measure their attitude again. This means that our intuition, and our experiments, make persuasion hard to see in the way it really happens… It’s rare that you have these Damascene conversions and you say, for example, “I was a socialist but now I’m a capitalist”. That does happen, and that captures our imagination, but most persuasion isn’t conversion. It’s gradual and hard to see.’

Stafford argues that ‘psychology is responsible for sowing this idea of human nature over the last 50 years’, by over-emphasising certain findings – for example, that the taller presidential candidate in the USA always wins. ‘People assume Cambridge Analytica are doing some sort of voodoo… that people were happily browsing Facebook, a personality-customised ad drops into their feed and they head zombie-like towards the polls and vote for Trump or Brexit when the thought had never crossed their minds before. Obviously, that’s not true. If you look in the political science literature and at the strategy of actual political parties, they win elections by mobilisation. It’s much easier to persuade people who are already on your side to get out to vote than to get people who are persuaded towards the other side to vote for you.’

The collective mentality
There’s even scepticism over the level of influence when we head back into the world of consumer products. Professor of Marketing Chris Hackley (Royal Holloway University of London) tells me: ‘Advertisers like the idea that they can quantify the results of their ad spend, so they buy one-to-one ads targeted at individuals based on their browsing, liking and spending. But there is no strong evidence that sales or other outcomes – for example attitude change, votes, service enquiries or brand value – respond any more positively to such ads than to old-fashioned mass media ads. The success of propagandistic political advertising, including fake stories, shows that humans are not socially solipsistic in forming our attitudes and deciding on our behaviour – we have a collective mentality. We seek to identify with groups.’

This leads Hackley to favour a perspective drawn from ‘what you might call sociological social psychology’. ‘The distinctions between different categories of advertising and promotion are becoming blurred – for example a sponsored celebrity might appear in an ad, wear the brand in interviews, and mention it in social media, the sponsorship deal might be discussed in trade press and chat threads, and so on. If the brand relationship with the celebrity becomes widely known, every appearance of the celebrity becomes an intertextual reference to the brand. One promotional intervention can manifest in many different forms of promotional communication.’

I ask Hackley about the research on advertising effectiveness. He tells me it’s a highly disputed area. ‘Ad campaigns can have many different objectives, most of which are intermediate to sales; for example, encouraging consumers to believe X or Y about the brand. So there isn’t one simple answer to what amounts to an effective campaign.’

Into plain sight
So psychologists are surplus to requirements in the real world of advertising and political campaigning? Not so fast. There is research that suggests psychologically tailored adverts can bear fruit – for example a study led by Sandra Matz found that adverts tailored to match individuals’ level of extraversion and openness led to more purchases of beauty products, a crossword app and a game.

Yet Matz believes that psychological tailoring, and other forms of targeting that attempt to match with our behaviours online, are simply an amplification of the way we have always communicated and persuaded others. ‘In face-to-face communication you probably think intuitively about the personality of the other person and wouldn’t bring up the same subjects when you talk to an extraverted or introverted friend. So we hear in the media about this new weapon of warfare, but it’s just not true… we’re imitating the way people have communicated for centuries. I think the scary part is it feels much more anonymous now.’

Matz also suggests that discomfort stems from a dawning realisation about the types of personal data companies have been collecting for many years. ‘Persuasion probably isn’t any more effective than when companies used our data to make predictions on what messages to send or which products to advertise. But now, with personality-targeting, we’re putting a label on it… instead of “dimension 1000 in our complicated machine-learning algorithm” we’re calling it “extraversion”… people wake up and think, “Oh wait this is quite intimate and I didn’t realise the company on the other side actually knew that much about me”.’

Of course, the ‘hidden persuaders’ could be a force for good. Matz pointed to a project she is working on with a bank in Central America to come up with ways to encourage lower-income people to save money, by targeting certain elements of their personality. However, she adds, companies must be proactive in building public trust in these kinds of methods by being more transparent about how their data is used. Matz also suggested companies could allow users to engage with the predictions that platforms make about them – feeding in extra information, for example, if the personality profiles weren’t quite accurate, or asking platforms to see content that goes against their data-generated psychographic profile. ‘Rather than just feeling outrage for a month then forgetting all about these issues, I’d like to see us having a constructive conversation about them,’ she concludes. ‘We need to discuss which contexts we’d be happy to see these methods used in.’

In the context of politics, however, Matz is more torn. ‘I think one of the biggest dangers we have at the moment is people are disengaged. Trump didn’t just get elected because Cambridge Analytica was running ads, but because 40 per cent of the population didn’t even vote. But if these same techniques were used in a more transparent way, for example in finding out the issues people care about and pushing messages to get them involved in the political process, it could hold value even in this context.’

What can we do about it?
Many psychologists are working to find ways of protecting ourselves against persuasive attempts, and awareness is key. Writing in CREST Security Review, Stephan Lewandowsky, Sander van der Linden and John Cook point out that misinformation is ‘sticky’ – even when it has been debunked with facts from a trusted source, people may carry on believing it regardless.

That’s where inoculation, or pre-bunking, comes in – we need to get in before persuasion sticks. Studies by the above authors, along with Ullrich Ecker, have found that warning participants that the information they read may be inaccurate, and that people tend to keep relying on facts even after they have been retracted, can reduce the degree to which people rely on misinformation. They have also had success in making participants aware of other methods, including exposing the ‘fake expert’ persuasion strategy – at play in debates around smoking in the past, and climate change in recent years.

A simple awareness of persuasive attempts, and the methods used to manipulate us, seems vital. Another potential strategy, put forward in a pilot study by van der Linden and Jon Roozenbeek, is to have people play a ‘fake news game’, in which participants use misinformation strategies to come up with their own fake stories from the perspective of different characters – a denier, an alarmist, a clickbait monger and a conspiracy theorist. In their sample of 95 students aged between 16 and 19, the researchers found that playing the game reduced the perceived persuasiveness and reliability of fake news stories.

Interestingly, Soroush Vosoughi told me that the technology we regard with suspicion may be part of the solution. YouTube, for example, has recently introduced a feature that displays information cards from Wikipedia pages and other trusted sources when people search for certain topics, such as the moon landing, which attract conspiracy theorists and misinformation. But the key, he says, is in education. ‘You can provide all the links to true articles you want, you can do all the false news detection you want, but if the person looking at these videos doesn’t want to know the truth there’s nothing you can do about it. That’s something you can change through education. Part of the solution will come through teaching young children how to think more critically and, more importantly, not to take anything at face value. Research what you are told, to check if it is corroborated.’

Cold War echoes
An obvious problem with researching an article about ‘the new hidden persuaders’ is that such people are, well, hidden. Rumours abound about psychologists working on persuasive methods with big tech, consumer companies and governments. It’s a professional concern that such people may be operating with little ethical oversight. Building trust and transparency, and keeping the public engaged in these discussions, would seem vital if we are to quell the paranoia we feel in the face of new technologies.

And yet paranoia about psychologists’ shady role in persuasion is nothing new, and modern concerns have interesting echoes of the reaction to Packard’s The Hidden Persuaders. Charlie Williams, a postdoctoral researcher with Birkbeck’s Hidden Persuaders project – which examines the involvement of ‘psy’ clinicians in brainwashing, interrogation, psychological warfare, subliminal advertisement and therapeutic experimentation – said that while Packard’s book caught the imagination of the public as the first exposé of the role played by Freudian psychologists in advertising and PR, it also reflected wider concerns about the increasingly diverse sets of problems psychological skills were being applied to in the post-war period. ‘Psychologists were, by 1957, routinely employed by government and commercial institutions to recruit, monitor behaviour, and evaluate opinion and morale. In the 1950s, a new term “brainwashing” emerged, often used to describe the most sensationalist fears about psychologists’ purported ability to control minds, feeding into concerns about the influence of communism, the Soviet Union and the role played by new media in shaping human thought.’

While we could look back on the Cold War era as a particularly paranoid time, for understandable reasons, Williams said psychologists have been portrayed in films and literature ever since as playing questionable roles in government and big business. ‘The use of psychological expertise, particularly in the advertising industry and the military, continues to fuel speculation about uses of psychology we are not wholly aware of. What is interesting about the resurgent discussions on “hidden persuasion” is that the concerns about the specialist knowledge of the psychologist or “depth man” has been increasingly replaced by the “knowledge” generated by algorithms and AI.’

…Or perhaps that’s just what the ‘depth man’ wants us to think.

Ella Rhodes is The Psychologist’s Journalist
[email protected]
