Decision making

‘The fact that we have access to so many different opinions is driving us to believe that we’re in information bubbles’

Poppy Noor meets Michal Kosinski, psychologist, data scientist and Professor at Stanford University

28 April 2017

In his speech on leaving the White House, Barack Obama named online echo chambers as one of the big threats to democracy. Across the pond, the same phenomenon – where people are said to be locked into social media universes that only confirm rather than counter their own opinions – was widely thought to have played a role in ‘Brexit’. But when I speak to Dr Michal Kosinski, one of the most influential ‘Big Data’ psychologists in the world, his thoughts on the issue perhaps go against the grain.

Responding to findings from a recent study by the Pew Research Center, which suggests that 62 per cent of US adults get news on social media, he ponders: ‘Would those people get any news from anywhere other than Facebook? Perhaps they are not interested in news. People who are generally interested in politics would read up on other parties.’

Dr Kosinski is far from the typical university professor. He has written two of the most influential articles ever published in PNAS, ranked 4th and 12th. TED talks have covered his research; a character in a play has been based on him; and year-on-year his papers are listed among the most interesting, imaginative and influential. If you’ve ever been to one of his lectures, you’ll have seen the queue of students waiting outside afterwards, begging to work on his projects.

Perhaps it’s the content. His research covers everything from accurately predicting people’s personalities based on their Facebook likes, to using huge data sets to show that people are most likely to be friends with people who have similar personalities to them. He’s relatable, persuasive and hard to disagree with, but, having worked with him in the past, I am sceptical that you can predict so much from Facebook data. I’m also worried about the unprecedented power that such research provides to manipulate people.
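To give a flavour of the kind of modelling involved (the details below are my own illustration, with invented data – not Kosinski’s actual pipeline), studies in this vein typically compress a large, sparse user-by-like matrix into a handful of latent dimensions and then regress a personality score on those dimensions. A minimal sketch:

```python
# Minimal sketch of like-based trait prediction: SVD to compress a sparse
# user-by-like matrix, then a linear regression onto a trait score.
# All data here are random placeholders, purely for illustration.
import numpy as np
from sklearn.decomposition import TruncatedSVD
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

n_users, n_likes = 1000, 500
likes = (rng.random((n_users, n_likes)) < 0.05).astype(float)  # 1 = user liked a page
openness = rng.normal(size=n_users)                            # placeholder trait scores

# Compress the sparse like matrix into a few latent dimensions.
svd = TruncatedSVD(n_components=40, random_state=0)
components = svd.fit_transform(likes)

# Regress the trait on the latent dimensions and check the held-out fit.
X_train, X_test, y_train, y_test = train_test_split(
    components, openness, test_size=0.2, random_state=0)
model = LinearRegression().fit(X_train, y_train)
print("held-out R^2:", round(model.score(X_test, y_test), 3))
```

With real data, the held-out fit rather than the in-sample fit is what matters: the claim is that the latent dimensions of what people like carry genuine signal about who they are.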

He picks up on this point. ‘Have you listened to Obama’s speech? He spent 10 minutes talking about the information bubble. He calls it one of the three largest threats to democracy. I love Obama, but I very much disagree with him on this one.’ For Kosinski, the only new thing about the online echo chamber is that it’s happening on our Facebook and Twitter feeds rather than in the comfort of our homes across the dinner table. ‘Closing ourselves in information bubbles is natural. One of the most basic theories, confirmation bias, is just that. We like the kind of information which confirms our current beliefs more. We remember it better, we are in fact less likely to hear information that contradicts our biases.’

I ask him whether calling it an inevitable phenomenon does anything other than confirm that Facebook is, indeed, an echo chamber. ‘I would argue that we in fact are exposed to more diverse information than ever. It seems paradoxical, but the fact that we have access to so many different opinions is driving us to believe that we’re in information bubbles. There is evidence to show that people are discovering more than they ever would. Just in the last year, the diversity of the artists that people listened to on Spotify increased by around 20 per cent.’

Spotify does this through a system that makes educated guesses at what users will like, based on the music taste of similar users as well as their own listening history and habits. In other words, what it does is similar to Dr Kosinski’s own research. It is also the technique used by Facebook’s newsfeed algorithm, EdgeRank, which learns from users’ behaviour – who they interact with most frequently, whether they respond more to a certain type of post (a photograph rather than a status update, for example) – and even from posts that users write but never publish. This information helps to decide what comes up on our Facebook feeds and encourages us to engage more.
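To make the mechanism concrete (this is a toy illustration of the general idea, not Spotify’s or Facebook’s actual system), ‘people with similar taste liked this’ can be sketched as user-based collaborative filtering: find listeners whose play histories resemble yours, then suggest what they play and you don’t. A minimal sketch with invented data:

```python
# Toy user-based collaborative filtering: recommend artists favoured by the
# listeners most similar to a target user. Play counts are invented.
import numpy as np

artists = ["Artist A", "Artist B", "Artist C", "Artist D", "Artist E"]
plays = np.array([
    [5, 0, 3, 0, 1],   # user 0
    [4, 1, 3, 0, 0],   # user 1 (taste close to user 0)
    [0, 4, 0, 5, 2],   # user 2
])

def cosine(u, v):
    # Cosine similarity between two play-count vectors.
    return u @ v / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-9)

def recommend(target, plays, k=2):
    sims = np.array([cosine(plays[target], row) for row in plays])
    sims[target] = 0.0                        # ignore self-similarity
    neighbours = sims.argsort()[::-1][:k]     # the k most similar listeners
    scores = sims[neighbours] @ plays[neighbours]
    scores[plays[target] > 0] = -np.inf       # don't re-recommend known artists
    return [artists[i] for i in scores.argsort()[::-1] if scores[i] > 0]

print(recommend(0, plays))   # e.g. ['Artist B', 'Artist D']
```

The point Kosinski makes follows from the maths: because the neighbours’ tastes never overlap perfectly with yours, the system keeps surfacing items you would not have found on your own.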

Using this type of information for research is not without criticism. In 2014 researchers at Facebook manipulated the posts on some newsfeeds to make them either overwhelmingly positive or negative. They observed users’ subsequent behaviour and concluded that online moods were contagious. It was a study that caused controversy. Many called it unethical, whilst others were suspicious that the findings would be used to ensure fewer ‘neutral’ posts appeared on users’ feeds, to encourage their engagement with Facebook.

Kosinski calls this type of research ‘tailoring’. ‘In the past, if you were not interested in what the message was – say you went to school and you were given novels that you weren’t interested in – you would basically just disengage. The hope behind tailoring is that those who were previously excluded will now stay in. They will read, they will become involved in political processes, and so on.’

Surely, I ask him, it is one thing if people choose to confirm their own biases, and another if organisations use vast datasets about users’ tastes – possibly without their knowing – to manipulate them, sometimes with the particular aim of making their own organisation more profitable or relevant.

‘I think people misunderstand what recommendation is. The fact that you read a novel about dogs doesn’t mean you’ll get recommendations about that and nothing else. They will realise that you’re in a cluster of tastes and therefore recommend to you, perhaps, something about engineering. Because it’s not just that the dogs form the core of your interest, perhaps it’s the style of writing. Our interests are multidimensional… tailoring means more information for you and less indoctrination, which is amazing.’

But what if the algorithm decides that this person would be interested in reading mainly news that’s false, or news that is extremist, or news that confirms their own view? The Wall Street Journal’s ‘Blue Feed, Red Feed’ study took this to its logical extreme by showing us a simulation of a ‘very liberal’ and a ‘very conservative’ Facebook newsfeed side by side. Although they were clear that Facebook feeds may in fact include a more diverse range of sources, the study is uncomfortable viewing in terms of how polarised (and even contrary) news from opposing sides of the same debate can be.

‘Look, first of all, fake news is not new. It’s always been possible that your parents could have been interested, for example, in an ideology that was based on a lot of untrue things, or that was extremist. You could have lived under a regime that fed you propaganda and controlled the information that you received. At least now we have access to multiple independent news sources. Our parents may feed us extreme views, but an algorithm can see that we have looked at, let’s say, something less extreme than what our parents show us, and suggest some things to us based on our decision to look at something different that day.’

But if the world of information now at our fingertips makes us more aware that opposing opinions exist, surely the mirage on social media that our own view is the predominant, sponsored and, by extension, correct one only serves to make us even more self-assuredly ignorant?

Kosinski points out that he has been to lectures with a number of professors who have been hard-pressed to prove, despite running studies for years, that people’s consumption of news has become increasingly polarised (although a 2016 study led by Michela Del Vicario, ‘The spreading of misinformation online’, makes an interesting read). He asks me whether the sources I’m getting information from are broader, and after careful deliberation I conclude that they are, but that the content is narrower. I, like two thirds of people who go on Facebook, get news on it. That’s nearly half of the general population. And it is clear that the way that people are consuming news is changing: only 20 per cent of people get news from print newspapers now, and whilst TV remains the predominant source for news for most, it might not be for long. A paltry 27 per cent of millennials switch on the TV for their news. Recently Mark Zuckerberg himself, the founder and chief executive of Facebook, which also owns Instagram, reluctantly accepted the role of such platforms in the dissemination of news (see tinyurl.com/za2zfry).

It is on this point that Kosinski becomes more sombre. ‘The big problem is that in the past, editorial policy was obvious. You could see if you were looking at a left-leaning or right-leaning paper… Now, a guy in the back-room, an engineer, can tweak a tiny thing that will affect the online environment for 1.6 billion people and no one would know. If Facebook decided to be liberal-leaning, nobody would even know because everyone sees a different thing. It’s creating different results for billions of people so it’s sort of difficult to measure, even for the owner of Facebook. No single person can claim to know how it works.’

In words eerily reminiscent of the commander-type characters in those dystopian novels I read as a child, the ones about robot systems that took over the world by accident, Kosinski adds: ‘The machine could develop leanings and we wouldn’t know. Those big [social media] platforms should embrace the solution to manage this risk. Firstly, it’s an insurance problem for them that their computer systems or employees could be doing things beyond their control or without them knowing. They should be mindful, too, of their users becoming suspicious of that likelihood and no longer using their platform.’

Is he suggesting that, hypothetically, a social media organisation could manipulate an election? And, if so, does he think that would be wrong? ‘Absolutely yes and absolutely yes. By tweaking their software just a little bit they could affect the outcome of an election, no doubt about it. Obviously that’s wrong – Facebook is not controlled by society and is in no way democratically elected. It’s controlled by one individual. There is something wrong about a single person being able to exert such an influence over the public.’

Kosinski says he is in favour of private institutions like Facebook, but ‘also in favour of their public oversight. They have basically become media companies that aggregate news for people, so I don’t see why they should not be controlled in exactly the same way as we control traditional media. Civilisation has evolved public oversight over media – and for good reason. The new media of the 21st century should be subject to the same oversight for the sake of everyone’s safety.’

Will he be advising Facebook and the American government anytime soon? ‘I will stick to teaching and researching at Stanford – consultancy doesn’t interest me much.’

On the question of what we should expect to see from him next, he says: ‘I am currently trying to understand the links between facial features and psychological traits.’ It feels eerily futuristic. But like most of his work, it also sounds very, very interesting.

Further reading: Find out more about Michal at
www.michalkosinski.com

- Poppy Noor is a freelance journalist. She writes regularly for The Guardian: see www.theguardian.com/profile/poppy-noor. Follow her on Twitter @PoppyNoor