Shadowy puppet masters or snake oil salesmen?
The past week has seen blanket media coverage of the Facebook / Cambridge Analytica scandal, with CEO Mark Zuckerberg finally breaking his silence yesterday to acknowledge that the misuse of data his company’s policies allowed was ‘a breach of trust between Facebook and the people who share their data with us and expect us to protect it’.
Robinson Meyer, writing for The Atlantic, helpfully summed up the complex scandal: ‘In June 2014, a [psychology] researcher named Aleksandr Kogan developed a personality-quiz app for Facebook. It was heavily influenced by a similar personality-quiz app made by the Psychometrics Centre, a Cambridge University laboratory where Kogan worked. About 270,000 people installed Kogan’s app on their Facebook account. But as with any Facebook developer at the time, Kogan could access data about those users or their friends. And when Kogan’s app asked for that data, it saved that information into a private database instead of immediately deleting it. Kogan provided that private database, containing information about 50 million Facebook users, to the voter-profiling company Cambridge Analytica. Cambridge Analytica used it to make 30 million “psychographic” profiles about voters.’
Cambridge Analytica has significant ties to some of President Donald Trump’s most prominent supporters and advisers, and reportedly used its ‘psychographic’ tools to make targeted online ad buys for the Brexit ‘Leave’ campaign, the 2016 presidential campaign of Ted Cruz, and the 2016 Trump campaign.
This isn’t the first time Facebook has been at the centre of a furore over the use of its data for psychological purposes. In the August 2014 issue we reported on the ethics of their own study of ‘emotional contagion’, in which the site manipulated its users’ News Feeds over a week to assess whether being shown fewer positive or negative stories from friends would affect the emotions of individuals. And the Cambridge Analytica story itself actually broke in 2015.
There are clearly numerous ethical and legal issues around such uses of our personal data online. But there’s also interesting discussion around whether any of this actually works. Is this a story about shadowy psychologists pulling the strings on a global stage? Or of psychologists with real research tools increasingly hooking up with the social media titans who have the means to use them for leverage in life-changing ways? Or simply a case of a psychologist providing data to a company who then use it to overpromise and under-deliver?
What’s special about ‘psychographic profiles’?
As Sasha Issenberg reported for Bloomberg in 2015, ‘Cambridge Analytica’s trophy product is “psychographic profiles” of every potential voter in the U.S. interwoven with more conventional political data. The emphasis on psychology helps to differentiate the Brits from other companies that specialized in “microtargeting,” a catch-all term typically used to describe any analysis that uses statistical modeling to predict voter intent at the individual level. Such models predicting an individual’s attitudes or behavior are typically situational – many voters’ likelihood of casting a ballot dropped off significantly from 2012 to 2014, after all, and their odds of supporting a Republican might change if the choice shifted from Mitt Romney to Scott Brown. [Cambridge Analytica CEO] Nix offered to layer atop those predictions of political behavior an assessment of innate attributes like extroversion that were unlikely to change with the electoral calendar.’
Many are sceptical that such an approach actually works. Eitan Hersh, who wrote the 2015 book Hacking the Electorate, said that ‘Every claim about psychographics etc made by or about the firm is BS… “Let’s start with fb data, use it to predict personalities, then use that to predict political views, and then use that to figure out messages and messengers and just the right time of a campaign to make a lasting persuasive impact” ...sounds like a failed PhD prospectus to me.’ New York Times reporter Kenneth Vogel tweeted: ‘BIGGEST SECRET ABOUT CAMBRIDGE ANALYTICA: It was (& is) an overpriced service that delivered little value to the TRUMP campaign, & the other campaigns & PACs that retained it – most of which hired the firm because it was seen as a prerequisite for receiving $$$ from [important Republican donor family] the MERCERS.’
In a piece titled ‘The noisy fallacies of psychographic targeting’, Antonio Garcia Martinez wrote: ‘the aspiring psychograficist (if that’s even a thing) is now making two predictive leaps to arrive at a voter target: guessing about individual political inclinations based on rather metaphysical properties like “conscientiousness”; and predicting what sort of Facebook user behaviors are also common among people with that same psychological quality. It’s two noisy predictors chained together, which is why psychographics have never been used much for Facebook ads targeting, though people have tried.’ Indeed, he notes that ‘Most ad insiders express skepticism about Cambridge Analytica’s claims of having influenced the election, and stress the real-world difficulty of changing anyone’s mind about anything with mere Facebook ads, least of all deeply ingrained political views.’
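Garcia Martinez’s point about ‘two noisy predictors chained together’ can be made concrete with a toy simulation. This is a sketch only: the correlations r1 and r2 below are illustrative assumptions, not figures from Cambridge Analytica or any real campaign. If Facebook behaviour tracks a personality trait with correlation r1, and political inclination tracks the same trait with correlation r2, the end-to-end signal available for targeting shrinks to roughly r1 × r2.

```python
import math
import random
import statistics

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    return cov / math.sqrt(sum((x - mx) ** 2 for x in xs) *
                           sum((y - my) ** 2 for y in ys))

random.seed(42)
n = 200_000
r1, r2 = 0.4, 0.4  # hypothetical accuracies for each predictive leap

# Latent trait, e.g. 'conscientiousness', standard normal
trait = [random.gauss(0, 1) for _ in range(n)]
# Leap 1: what Facebook behaviour reveals about the trait (noisy)
behaviour = [r1 * t + math.sqrt(1 - r1**2) * random.gauss(0, 1) for t in trait]
# Leap 2: political inclination is itself only loosely tied to the trait
politics = [r2 * t + math.sqrt(1 - r2**2) * random.gauss(0, 1) for t in trait]

# End-to-end signal: behaviour-based targeting vs actual inclination,
# which lands close to r1 * r2 = 0.16 rather than either r alone
print(round(pearson(behaviour, politics), 3))
```

Each leap on its own looks respectable (r = 0.4), but chaining them leaves only a weak end-to-end correlation, which is the statistical substance of the scepticism quoted above.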
Others have written that fears of manipulation by new media are as old as mass media themselves, and ‘we’re back to the old new vision of crowd psychology and mass psychosis popularized by Gustave Le Bon in 1895’. Brendan Nyhan, a Professor of Government at Dartmouth College, pointed to a recent meta-analysis of numerous different forms of campaign persuasion, including in-person canvassing and mail, finding that their average effect in general elections is zero. Yet there is recent research showing the effectiveness of these techniques in influencing purchasing behaviour. With politics seemingly becoming more populist by the day, perhaps it’s just a matter of time before our voting behaviour is swayed in the same way. And some have argued that we’re looking for effects in the wrong place. Tom Stafford (University of Sheffield) said: ‘My view is that persuasion effects are probably the wrong place to look… mobilisation effects (or the dark pattern alternative – influencing people not to vote) are probably where the action is.’
Indeed, there have been concerns over the use of the Facebook ‘vote’ button in influencing voter turnout, described as the ‘most effective voter activation tool ever built’. If such a tool is only presented to some users, and the public are none the wiser as to how those users are selected, could it be that slim margins in national elections are in the hands of the social media giant?
The tip of the iceberg?
One thing is for sure: the Cambridge Analytica scandal may end up being only the first of countless similar cases. Facebook has allowed third-party app developers to access private user data since 2007. How many of those developers cached the data and made their own private databases? Where is that data now, and how might it be used?
Some psychologists we spoke to are concerned about this, and are surprised the case isn’t making more waves within the discipline. Stafford told us: ‘When news of this broke in 2015, UK psychology was mostly silent: in marked contrast to many other scandals. Perhaps we weren’t au fait with the new frontiers in ethics that big data / social media are opening up. If that’s the case, we need to get up to speed… not only are many academics deeply involved in business, but social science research is itself perhaps moving to tech firms, which have the data and resources to do things that academics would like to do.’ Joseph Devlin (University College London) agreed. ‘I’m baffled about why the Cambridge Analytica affair didn’t even seem to penetrate most people’s awareness, especially among professional psychology researchers in the UK. It was, after all, “one of us” who seemed to have perpetrated this data collection, and he was based at one of the top universities in the world’ (although the University of Cambridge have now issued a statement).
Yet some key figures have voiced their concerns. Michal Kosinski, psychologist, data scientist and now Professor at Stanford University, worked with Kogan and was reportedly the first academic approached by Cambridge Analytica. His emails from May 2014 suggest he described Kogan’s approach to the research as ‘highly unethical’ and warned that ‘this situation is really disturbing to the culture of our department and destroys the good name of the university’. In an interview with us last year, Kosinski had said ‘I will stick to teaching and researching at Stanford – consultancy doesn’t interest me much.’ In that interview, he also voiced his concerns over the potential for Facebook to manipulate democracy: ‘The big problem is that in the past, editorial policy was obvious. You could see if you were looking at a left-leaning or right-leaning paper… Now, a guy in the back-room, an engineer, can tweak a tiny thing that will affect the online environment for 1.6 billion people and no one would know. If Facebook decided to be liberal-leaning, nobody would even know because everyone sees a different thing. It’s creating different results for billions of people so it’s sort of difficult to measure, even for the owner of Facebook. No single person can claim to know how it works.’
For his part, Dr Kogan has said: ‘The events of the past week have been a total shell shock, and my view is that I’m being basically used as a scapegoat by both Facebook and Cambridge Analytica when… we thought we were doing something that was really normal.’ Maybe there’s the rub… that as more psychologists become involved with commercial companies and the internet’s big hitters, they will continue to encounter grey areas around what is ‘normal’, ethically and legally speaking. We’re becoming enmeshed in ‘an expansionary economic logic that insists on inspecting ever more of our thoughts, feelings and relationships’ (in the words of William Davies in the London Review of Books). And if the data becomes more detailed and the tools more powerful, psychologists may find themselves – wittingly or unwittingly – having a growing yet partly invisible influence on the world we live in.
This is clearly an evolving news story – there’s now a growing link with Brexit – and it’s a complex one at that. I am grateful to Tom Stafford and Vaughan Bell, as ever, for pointing me to sources who do a much better job of explaining it than I ever could!
In terms of online privacy and our understanding of it, I have just seen a Sky poll asking ‘How well do you understand what Facebook does with the data from your Facebook account?’ 7 per cent answered ‘very well’; 14 per cent ‘fairly well’; 38 per cent ‘not very well’; and 42 per cent ‘not at all’. Is anyone aware of a survey which asks ‘How much do you care what Facebook does with your data?’ If you are concerned, here’s some guidance on how to change your settings. Or you could simply follow Erin Kissane’s advice: ‘Get off Facebook. Get your family off Facebook. If you work there, quit. They’re fucking awful.’
I hope my grasp on the situation will grow between now and our May edition going to print, and I would be very happy to hear from you with any thoughts or additional angles that could improve our coverage. Email me on [email protected]