Interview: Dispelling mind myths and debunking pseudoscience

Sergio Della Sala lets Lance Workman into his world.

You are both a clinical neurologist and a Professor of Cognitive Neuroscience. I imagine having just one of these roles would be pretty time-consuming.
I am lucky and privileged. I work in a friendly department of psychology, I edit Cortex, a stimulating neuropsychology journal, I travel to interesting conferences in pleasant locations… I find it extraordinary that I could get paid to do things that I like doing and that offer me material to share at cocktail parties. Every morning when I walk to my office I reflect on how fortunate I am. Time-consuming roles which offer you freedom and some pleasures are a luxury compared to what most people have to endure in their working life. Sincerely, it would be provocative if people with a life as jammy as mine would complain.

In addition to your work on neuropsychology you are very much involved in disseminating science to the people. Do you think this has become increasingly important in the 21st century?
I maintain that one should understand science in order to disseminate it. Too often science is reduced to a matter of opinion, following TV or tabloid formats. Instead, basing our choices on evidence and information would increase democracy. More often than desirable it’s we, the scientists, who promulgate unnecessary scares, or improbable remedies. We are now all desperate to engage the public. Our institutions push us to branch out and reach out, and we get brownie points if we do so – often independently of what we have to say.

We couldn’t even find a decent label for science dissemination. First we called it Public Understanding of Science, creating the disgusting acronym PUS. Then we corrected it into Public Engagement in Science and Technology, arguably no better, as PEST! Science festivals are springing up in every city. However, the idea that simply discussing science publicly can counter misinformation is naive, and potentially counter-productive.

I must admit the acronym ‘PUS’ hadn’t occurred to me before! I can see there’s a fine balance to strike between disseminating scientific findings and oversimplifying them to make headlines. Can you think of occasions when you have said no to the media?
Too often we have nothing to say (and we say it!). It also happens that the requests are not for an evidence-based view, but ‘to give a positive spin on the story’. A journalist recently put it to me that she would not take ‘sorry, there is no evidence that that particular flim-flam works’ for an answer. She wanted a positive answer, and eventually she extracted it from a colleague.

Sometimes the conditions for a decent discussion on science are simply not met. Recently, ‘Sense about Science’, an independent organisation which is doing a lot of good to promote scientific thinking, put me in contact with a journalist who was organising an episode of the Today programme for Radio 4. Her request was for me to debate with Baroness Susan Greenfield, who promotes the idea that children exposed to social media or the internet are more likely to develop autism, on the basis that both use of the internet and autism are increasing in parallel. A bit like saying that eating ice creams causes drownings, as consumption of ice creams increases in parallel with the observed number of people drowning! However, a few hours before going live on air, the journalist called me asking about the evidence I had to demonstrate that the hypothesis was untenable. I failed to argue convincingly that the onus of presenting evidence is on the proponents. She was unabashed by her construction of opinion versus opinion, no evidence on one side, but equally no evidence on the other. Hence, I did not see fit to discuss science when its basic rules are misunderstood, and withdrew. I strongly believe that science programmes should be run by people with at least a minimal knowledge of science methods.

In relation to disseminating science to the public, was this a conscious decision to go in this direction, or did you somehow fall into it?
I enjoy sharing, it’s as simple as that. And it is often gratifying and rewarding, even flattering.

You’ve conducted a lot of research in the field of dementia. What got you interested in this problem initially?
I was working in Italy at the time, and we had little money; neuropsychology was a cheap option to carry out decent research, and the neuropsychology group in Milan had gained an international reputation. However, at the time all the focus was on hemispheric differences and on the relationship between lesion sites and cognitive or behavioural changes. Using neuropsychology not only to assess dementia but to derive data which could inform models of the cognitive architecture of the mind was a bit revolutionary. Nowadays it’s accepted wisdom.

In relation to the dementia research, do you feel we are currently making good progress? Will we ever see a time when we will have this growing problem licked?
Some progress is continuously being made, slowly but steadily. What we know today is more than what we knew yesterday. However, too much clout is left in the hands of private interests… most research is still carried out by Big Pharma companies. We really need independent research, as much as possible. On a completely different note, we should invest more money to assist families with the burden of patients with serious and, to date, intractable diseases.

We tend to view growing old as an inevitable loss of cognitive ability and, in particular, memory. Are there any areas where we might actually improve in old age?
Well, if we struggle a bit, we could dig out some cognitive effect which favours old age, like the so-called positivity effect, whereby older adults show improved memory for positive images, whereas younger adults often show an opposite bias. This would indicate that older people are more likely to remember life events positively. However, I am not buying into the camp that considers good ageing to be everything that resembles youth. When Anna Magnani, the famous ageing Italian actress, was asked whether she wanted to hide her wrinkles, she replied that she would like them left visible as it took her a full life to make them! On a personal level I feel more like Bill, the character played by Michael Douglas in Last Vegas (I know, I know, I should watch better movies…), who says something like, ‘My brain cannot conceive how old my body is’, hence I insist on playing soccer, though I have to convince my fellow players that I run up or down the pitch rather than up and down the pitch.

When it comes to the brain it is often said that you should ‘use it or lose it’. Do you think the latest computer programs that have been developed to help with this really work?
We like metaphors and abhor complexity – hence often discussions about the functioning of the mind are reduced to slogans or simplistic concepts. Of course we should keep our mind alert and busy, especially when growing old. However, playing computer games or buying into improbable miraculous programmes won’t do us any better than seeing friends, enjoying a walk or doing crosswords.

I’ve often read the idea that we only use 10 per cent of our brains. Where did that come from?
If that were my case I’d be really worried. It probably comes from the misunderstanding of terms like ‘potential’. A sentence saying that we use 10 per cent of our potential may be unscientific but still acceptable in the magazines that we find at the hairdressers. We love to think that there are magic silver bullets fixing all. However, if something looks too good to be true, as a rule it is too good to be true! So we come to believe that listening to Mozart (but not the Sex Pistols) can boost intelligence or that gulping down fish oil pills can boost our brain power.

You are famous for debunking ‘mind myths’ – in fact you have compiled a book of this name. What would you say is the most outrageous mind myth that is still widely believed?
Claims relating to the brain have a unique mystique, which seems to bypass critical thinking, leaving us especially gullible. I am not sure though that we can compile a ranking of the various myths about the functioning of the mind. I would maintain that they are outrageous when people get economic advantages by proposing remedies or programmes based on pseudoscience, particularly in schools or hospitals.

There are a lot of books around that suggest people can draw on the right hemisphere to improve their level of creativity. You’re critical of this simplistic view?
According to this logic, a poet who uses language, hence a left hemisphere skill, is not creative. It is certainly true that some cognitive functions are expressed via the activation of areas sitting in one or the other brain half. There is, though, no evidence to support the idea that programmes teaching us to use the right hemisphere to become more creative have any grounding in reality. Show us the evidence, please, rather than selling improbable enhancing programmes in teachers’ gatherings like in a Casablanca souk. We love dichotomies, we believe we understand them, so we tend to subdivide everything into them. It’s a dichotomania – left and right, yin and yang. I say that if there are no sound empirical data supporting these dichotomies, then these practices should not be given a go in the classroom. For instance, the idea that different children learn information in different ways, and can be classed as verbalisers or visualisers, has been greatly hyped in our schools. The concept of learning styles may be intuitively attractive, but evidence supporting it is scant.

You have uncovered evidence that people have a natural leftward bias for remembering information. Why do you think this is the case, and what does it tell us about human lateralisation?
These studies capitalise on what we know already about the asymmetry in distributing attention towards the left and the right half of our environment, dramatically demonstrated by the performance of patients with visuospatial neglect following unilateral brain damage. We have simply added to the array of possible asymmetries by demonstrating that even in healthy people asymmetries include imagined scenes, that is, scenes (or objects) that we visualise in our head without an immediate percept. We labelled this phenomenon ‘representational pseudoneglect’.

Your recent book (with Mike Anderson) is concerned with Neuroscience in Education. I’m intrigued by the subtitle – ‘the good, the bad and the ugly’.
The ‘neuro’ prefix is very fashionable, and neuroeducation is just one of its myriad offspring. This has resulted in the development of a number of new practices in the field of education – some good, some bad, and some just crazy. Too often, people with the clout to make decisions about which practice is potentially profitable in the classroom setting ignore evidence in favour of gut feelings, the authority of ‘gurus’, or unwarranted convictions. In short, opinions rather than data too often inform implementations in schools. On the other hand, some ‘good’ classroom developments are centred around mainstream cognitive findings rather than brain anatomy or localisation.

For instance, one such finding, named retrieval practice, has been replicated many times. Students who stockpile learning just before an exam may do well enough on that exam, but if they want to retain the material long-term, then retrieving it via multiple tests is a better bet. However, we always have to remind ourselves that since science is descriptive rather than prescriptive, politics, ethics, individual choices and practicalities will guide implementations.

You’ve achieved a great deal in terms of research and the dissemination of neuroscience. What next?
Thanks for your generous comment, but I am just doing my job. Work-wise, my plan is to keep learning as much as I can, for as long as I can, and possibly have some fun in the process. On a personal note, I wish to raise all my daughters so that they could be a bit proud of me now and, particularly, later.
