best practice research recommendations; dementia screening; BRAIN Initiative; video games and violence; reports from the BPS/British Association Lecture, CogDev 2013, Division of Health Psychology conference; and more

Best practice research recommendations

Following the fraud scandals and replication problems endured by social psychology and other scientific disciplines in recent years – many documented on our news pages – the Society for Personality and Social Psychology (SPSP) Task Force on Publication and Research Practices has released a remedial report: Improving the Dependability of Research in Personality and Social Psychology: Recommendations for Research and Educational Practice.

Published early as an online proof in Personality and Social Psychology Bulletin, the report makes a series of recommendations for ‘best practices’ in personality and social psychology research (paraphrased below):
- Recruit enough participants to ensure adequate statistical power to detect the key effects of interest. This statistical power should be reported when possible and considered as a factor when interpreting results.
- Report effect sizes and 95 per cent confidence intervals.
- Avoid questionable research practices, including: failing to correct statistically for multiple tests of statistical significance on the same data; recruiting more participants until a significant result is obtained; omitting conditions, participants and other experimental features after looking to see whether an effect has been obtained; and running multiple experiments and reporting only those with significant results. Where any such practices are required, for example as part of an exploratory study, they must be fully reported.
- Provide verbatim wording of all task instructions, manipulations and measures in an online appendix to aid future replication attempts.
- Archive raw data and make it available to other researchers who wish to verify substantive claims through re-analysis (while ensuring the confidentiality of participants is protected and legal rights to data are honoured).
- Place a high value on replication attempts.
- Avoid inflexible research rules and recognise that some research requires unique or unusual methods.
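The recommendation to report effect sizes with 95 per cent confidence intervals can be illustrated with a short sketch. This is not taken from the report itself: the two groups of scores are invented, and the interval uses a standard normal approximation for the standard error of Cohen's d.

```python
# Illustrative only: hypothetical data, not from the SPSP report.
# Cohen's d with a pooled standard deviation, plus an approximate
# 95 per cent confidence interval for d.
import math
import statistics

def cohens_d_with_ci(a, b, z=1.96):
    na, nb = len(a), len(b)
    # Pooled variance across the two independent groups
    pooled_var = (((na - 1) * statistics.variance(a) +
                   (nb - 1) * statistics.variance(b)) /
                  (na + nb - 2))
    d = (statistics.mean(a) - statistics.mean(b)) / math.sqrt(pooled_var)
    # Normal-approximation standard error of d
    se = math.sqrt((na + nb) / (na * nb) + d ** 2 / (2 * (na + nb)))
    return d, (d - z * se, d + z * se)

group_a = [5.1, 4.8, 6.0, 5.5, 5.9, 4.7, 5.3, 6.1]
group_b = [4.2, 4.5, 3.9, 4.8, 4.1, 4.4, 4.6, 3.8]
d, (lo, hi) = cohens_d_with_ci(group_a, group_b)
print(f"d = {d:.2f}, 95% CI [{lo:.2f}, {hi:.2f}]")
```

Reporting the interval alongside the point estimate, as the report urges, makes clear how precisely the effect has been estimated with the sample size used.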

The report also makes recommendations for educational practice in social and personality psychology, including: fostering through student programmes, text books and editorial guidelines a culture of ‘getting it right’ rather than ‘finding significant results’; and improving methods training by increasing awareness of the issues raised in this report.
With an implicit reference to some of the rows that have broken out recently after failed replication attempts of influential social psychology papers, the SPSP Task Force report also urges researchers to respond to failed replication attempts in a civil manner.

‘Failures by others to replicate one’s work should be treated as opportunities to work together with colleagues to find the parameters under which a theoretically expected effect is and is not found,’ the report says. ‘Critiques of research methodology or empirical findings merit constructive, not defensive, responses.’ In the same vein, it says that replication attempts should not be undertaken as point-scoring opportunities, but as ‘open-minded investigations of the generalizability of important, interesting effects’.

The six-person task force that produced the report is chaired by SPSP President Professor David Funder at the University of California, Riverside. Other members include Professor Carolyn Morf at the University of Bern and Professor Stephen G. West, Visiting Professor at Freie Universität Berlin.

Professor Funder pointed out to us that the current ‘replication crisis’ is occurring across science from cell biology to physics and is not limited to or especially affecting psychology. ‘But psychology has the potential to lead the way to improved research practices,’ he said, ‘because of its sophistication in research methodology and because of its expertise in the factors that affect the behavior of humans including scientists.’

He added that the ‘most useful prescriptions will be forward-looking, seeking to improve research and analytic practice as well as the social environment within which research is practiced. Our task force sought to generate ideas along these lines…that we hope in the long run will place all of science – not just social psychology – on an increasingly solid footing.’ cj

The report, which also includes a useful statistical primer covering effect sizes, statistical power and more, is available as a pdf at 


Mapping the brain-mapping project

The initial focus of the Obama administration’s ambitious brain-mapping project, known as the BRAIN Initiative (see April and May news this year), has become clearer thanks to an interim report published by the National Institutes of Health (NIH), one of the principal research agencies involved.

Outlining the priorities for the 2014 fiscal year, which started in October, the NIH advisory committee said that they had ‘identified the analysis of circuits of interacting neurons as being particularly rich in opportunity, with potential for revolutionary advances’.

The report goes on to specify nine priority research areas to receive their share of $40 million of initial NIH BRAIN Initiative funding, including: characterising all the cell types in the human nervous system; the development of ‘large-scale network recording capabilities’ allowing for the recording of ‘dynamic neuronal activity from complete neuronal networks’; and linking neuronal activity to behaviour. Regarding the last item, the report says: ‘[T]he clever use of virtual reality, machine learning, and miniaturised recording devices has the potential to dramatically increase our understanding of how neuronal activity underlies cognition and behaviour.’

Further details are available on the NIH website. However, as we went to press there were fears that the US government shutdown would delay the start of the BRAIN Initiative.

Dementia screening plans criticised

Increasing the early diagnosis of dementia is an explicit aim of England’s 2009 National Dementia Strategy and was reiterated by Prime Minister David Cameron’s 2012 Dementia Challenge. The rationale is that early diagnosis allows support to be put in place to help people with dementia live well. But now a team of experts has written to the BMJ arguing that an excessive focus on early diagnosis risks misdiagnosis, increases stigma and redirects resources away from those with severe dementia.

Led by David Le Couteur, professor of geriatric medicine at the Centre for Education and Research on Ageing at the University of Sydney, the group point out that 40 to 70 per cent of people diagnosed with mild cognitive impairment do not in fact progress to full-blown dementia, and some even improve. The authors highlight recent research that found no beneficial effect for a psychosocial support package for people with early dementia. They also note that there are no drugs currently available that slow the illness.

Memory clinics established to aid early dementia diagnosis often use brain imaging and other bio-marker techniques even though these methods do not yet have an adequate ability to detect dementia or predict its progression, say Le Couteur’s team. Basic screening tools used in general practice are also prone to high false-positive rates. Moreover, on receiving a spurious diagnosis of early dementia, patients may be tempted by complementary, empirically unsupported remedies, including ginkgo biloba or cholinesterase inhibitors. Worryingly, the latter have been linked with increased risk of falls and fainting, the authors said.

‘The strong political lead in the UK and US is increasing the numbers of people that receive a diagnosis of dementia and early dementia,’ concluded Le Couteur and his colleagues. ‘Yet arguably the political rhetoric expended on preventing the burden of dementia would be much better served by efforts to reduce smoking and obesity, given current knowledge linking mid-life obesity and cigarettes.’ cj

BOOK PRIZE
Chartered Psychologist and Associate Fellow of the BPS Charles Fernyhough is on the shortlist for this year’s prestigious Royal Society Winton Prize for Science Books for his Pieces of Light: The New Science of Memory (Profile Books). The judges, chaired by BPS Fellow Professor Uta Frith, said: ‘Our memories of reading this book are exceptionally good ones!’ The winner will be announced later in November.

Four psychologists have been newly elected to the Fellowship of the British Academy: Dominic Abrams (University of Kent), Usha Goswami (University of Cambridge), Tim Shallice (SISSA), and Jane Wardle (UCL). The quartet were admitted formally to the Academy’s Psychology Section in October at a ceremony held at Carlton House Terrace in London.

A US paediatrician who helped restore psychological sensitivity to neonatal wards has died aged 91. During an era when hospital procedures were focused only on efficiency, Dr John Kennell argued for the importance of contact between parents and infants in the initial hours after birth. He was co-author of Maternal–Infant Bonding, published in 1971 and later revised and re-titled Parent–Infant Bonding. The New York Times said ‘the great question of [Kennell’s] work…was the mystery of the chemistry between children and parents’.

Ig Nobel
September saw the return of the annual Ig Nobel Awards for research that makes you chuckle at first and then pause to ponder. The psychology prize was taken by Laurent Bègue and his colleagues for their paper (covered in our Research Digest) showing that people who think they’re drunk think they’re hot. That is, participants who were told their drink was alcoholic tended to rate themselves as more attractive on the basis of an ‘advert’ they recorded. This was the case whether their drink really was alcoholic or not. Also honoured at the ceremony, held at Harvard University, was a Japanese team who won the medical prize for their research showing that the sound of operas by Verdi or Mozart helped mice recover more successfully from a heart transplant. Listening to music by Enya had no such benefit. cj


Untangling dyslexia

The way the term ‘dyslexia’ is bandied around in the popular press, you get the sense that it’s a precise diagnosis, something you either have or you don’t. Answering questions at the end of her joint British Psychological Society/British Academy lecture, BPS Fellow Professor Margaret Snowling exposed this as a myth. ‘Dyslexia is just another name for poor reading,’ she said. ‘Where you put the cut-off between dyslexia and normal reading has to be agreed within your education system, your school – it could be a national policy, a policy within a local authority – there isn’t any gold standard.’

There may not be universal agreement on where to draw the line, but research into developmental dyslexia has come a long way since the first case was described by a British GP as ‘word-blindness’ in 1896. Such early accounts, Snowling explained, suffered from referral bias – the deficits had to be severe enough that a child wound up in a doctor’s clinic. Back then the condition also tended to be seen as specific and perceptual, so that it became the domain of ‘eye doctors’.

Our understanding of dyslexia – nowadays recognised as a ‘neurodevelopmental disorder’ affecting the ability to read and spell – was placed on surer footing by a seminal paper published in the mid-1970s. Snowling explained how Michael Rutter and William Yule’s epidemiological work on the Isle of Wight led them to distinguish between children who read poorly relative to their IQ (they called this ‘specific reading retardation’) and those who read poorly for their age (‘general reading backwardness’). This research made an important contribution, Snowling said, because it showed that both groups of children experienced language delays and deficits that pre-dated their reading problems.

Today there are several agreed-upon facts about dyslexia, Snowling continued. It runs in families; it’s associated with a phonological deficit (i.e. a difficulty translating letters into sounds); and it can manifest in various ways behaviourally. ‘The contemporary view’, said Snowling, ‘is that dyslexia is not a diagnosis, rather it’s a dimensional disorder. Many people have dyslexia and it will vary from mild to severe. It occurs in individuals with all levels of intellectual ability, and it’s associated with multiple risk factors, not a single cause.’

Attempting to untangle these risk factors, Snowling and her colleagues recently conducted a meta-analysis of 14 studies that examined dyslexic children with a family risk of dyslexia; children with a family risk but no dyslexia themselves; and non-dyslexic control children with no family risk for the condition. This confirmed that a phonological deficit is a cognitive risk factor but not an absolute cause. It appears to be an ‘endophenotype’, Snowling said – it mediates the gene–disorder link and it’s found in the unaffected relatives of people with dyslexia.

Other clues come from a family risk study that Snowling co-authored this year. Over a hundred children with family risk of dyslexia were compared at ages three and four with children with specific language impairment and with controls. This showed that some children at family risk have phonological problems and broader language impairments, while about two thirds have the phonological problems without the broader language impairments. It seems as though multiple genes code for different risk factors, Snowling explained, and the more risk factors a child has the more likely they are to get a diagnosis of dyslexia. There’s evidence that if a child has good language skills, they may be able to compensate for phonological problems.

Snowling also touched on the environmental factors relevant to dyslexia. Among the most significant is a child’s native language. Of the alphabetic languages, English is the most difficult to learn, and phonological problems will lead to serious delays in learning to read. By contrast, other languages like Italian and Finnish have far simpler letter-to-sound encoding rules, and the consequences of a phonological deficit will be felt less keenly. Another environmental factor that was expected to be important was the ‘home literacy environment’ – for example, how much reading takes place. But surprisingly it’s been found that children at family (i.e. genetic) risk for dyslexia experience similar literacy environments to controls.

Snowling finished her talk by looking at her latest work on interventions for dyslexia. The gold standard at present is based on teaching letter–sound knowledge; increasing phoneme awareness; and doing all this in the context of book reading. ‘But the elephant in the room,’ said Snowling, ‘is language, because, as we heard earlier, broader language problems also play a significant part in the emergence of dyslexia.’

In a trial published this year, Snowling and her colleagues tested a 30-week oral language skills intervention for nursery school children with poor language skills. Compared with a waiting-list control group, the children who received the training showed improvements in vocabulary development, expressive grammar, narrative skills and listening comprehension – and six months later these gains on oral language skills appeared to underpin improvements in reading comprehension. ‘There’s a very simple message here for policy makers,’ said Snowling: ‘language needs to be everywhere in the early years classroom and phonics is not enough to support the reading development of children with language learning deficits such as dyslexia.’ cj


Video game violence review welcomed

During the video game epoch, youth violence across the Western world has plummeted to 40-year lows. Yet concerns about the effect of these games continue to be expressed in academia and beyond.

A notable example was the American Psychological Association’s (APA) 2005 resolution calling for ‘all violence [to] be reduced in video games and interactive media marketed to children and youth’. Dr Elizabeth Carll, a past president of the Media Division of the APA, was quoted at the time as saying: ‘Playing video games involves practice, repetition, and being rewarded for numerous acts of violence, which may intensify the learning. This may also result in more realistic experiences which may potentially increase aggressive behavior.’

Now the APA board of directors has appointed a task force to review the scientific literature published since this policy statement was adopted, and around 230 psychologists from America and further afield, including some British psychologists, have signed a statement welcoming this move.

The psychologists say the APA’s 2005 resolution reached several strong conclusions on the basis of inconsistent or weak evidence and suggest that subsequent research has provided strong evidence that some of those conclusions cannot be supported.

They suggest that ‘rigid or ideological’ policy statements can stifle scientific innovation and may inadvertently increase publication bias. They also express concerns about the reliance upon meta-analysis in this field of research and the ‘overgeneralization of controversial laboratory measures of aggression to public health issues and violent crime’.

Professor Kevin Durkin from the University of Strathclyde, a BPS Fellow, is one of the signatories of the statement. He says: ‘Psychologists know that the origins, developmental course and manifestations of aggression are complex and rooted in our interactions with the real world. Aggression is one of our (and other, non-video-game playing) species’ most serious and far-reaching problems and it behoves our discipline to study it in its full complexity. Video games provide a convenient newspaper headline solution for those who would prefer not to illuminate more substantial causes of aggression. There are good reasons for studying video games, but these relate to the enormous potential of the medium for entertainment, learning, skills practice and social relatedness.’

The psychologists’ statement says the task force ‘has a tremendous opportunity to change the culture of this research field to one which is less ideological and open to new theories, data and beliefs’ and offers to help it in any way the signatories can. Jonathan Calder


Alzheimer’s Research UK has sabbaticals and secondments funding available to enable established researchers to enrich their research programmes and develop collaborations. Up to £50,000 is available for up to one year. Alzheimer’s Research UK will consider applications that address the behavioural and psychological symptoms of dementia. Two application rounds each year; next closing date: 22 November 2013.

The Scottish Cot Death Trust aims to:
- increase knowledge and understanding of why some babies die suddenly and unexpectedly, and for whom no cause of death can be found;
- increase awareness and understanding of cot death; and
- provide comfort and support for those affected by the loss of a baby to cot death.
They offer research grants for proposals that have clear relevance to cot death and the objectives of the Trust. Grants of between £30,000 and £80,000 for up to three years, and small grants of up to £5000 a year are available. Preliminary research proposals can be submitted at any time.

Comic Relief offer funding via their UK grant scheme under the following themes:
Better futures: supporting young people (age range 11–24) who have limited opportunities and face significant challenges.
Safer lives: supporting people who face violence, abuse and exploitation.
Healthier finances: supporting those in severe financial hardship, for example, help for those who live in extreme poverty to maximise their income.
Fairer society: to empower and give a voice to marginalised groups of people so that they can challenge injustice and bring about positive changes for those who face discrimination and stigma.
Comic Relief has an interest in how sport can contribute to delivering the outcomes of their funding themes. Applicants interested in using sport as part of their project should refer to the Sport for Change guidance.
Registered charities and not-for-profit organisations can apply; check eligibility on the website. Apply at any time.


CogDev 2013
Alana James and Katie Rix report some of the highlights from the joint BPS Cognitive Psychology Section and Developmental Psychology Section conference at the University of Reading

The unique opportunity for developmentalists and adult researchers to come together proved immensely popular. As keynote speaker Mark Seidenberg commented in the closing panel, it was often hard to tell where the ‘cog’ and ‘dev’ started and ended.

Working together
Michael Tomasello (Co-Director of the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany) showed how a collaborative event like CogDev can succeed – humans are built for collaboration. Tomasello presented the ‘Shared Intentionality’ hypothesis that humans are adapted for collaborative working in a way that great apes are not. Studies comparing chimpanzees and human children show that, although it is also in the interest of chimpanzees to collaborate, children share with others in more sophisticated ways. When faced with a task that involves working with a partner for rewards, one chimpanzee is likely to try to take all of the rewards whereas children will share them equally. If only one reward is available children engage in turn-taking, with each child taking their reward every other time.

Children also prefer to work collaboratively rather than individually, even when the reward is the same. By contrast, Tomasello explained that chimpanzees do not realise that others are trying to help them. Unlike humans, they do not assume cooperation or engage in joint intentionality, thus there is no resulting coordination. Human children create a joint role, stay committed, learn from their role, revise accordingly, and communicate cooperatively.

This ability to collaborate is also seen in children’s social behaviour. In tasks where participants are given an unfair offer, so that one gets more or less than the other, children reject unfair offers so that nobody gets anything. Chimpanzees on the other hand will accept whatever they are offered. Children also won’t put up with ‘free riders’, who try to take rewards for less effort than other group members, and show more awareness of social evaluation. When being watched by others, children share more and steal less, whereas chimpanzees’ behaviour does not change. Furthermore, children themselves punish stealing behaviour.

Tomasello concluded that human children are adapted for collaboration in a way that chimpanzees are not, and that this adaptation is fundamental to the unique human processes of cognition, communication, culture and morality.

The arbitrariness of language
The winner of the Cognitive Section Award, Padraic Monaghan (Lancaster University) tackled the puzzle of the arbitrariness of the sign. That is, the arbitrary relationship between the sound of a word (the sign) and its meaning. Monaghan showed that historically the arbitrariness of this link has been seen as very bad and linked to the Tower of Babel, but argued that it is in fact advantageous.

In the 1600s John Wilkins devised a universal language with a systematic mapping between word forms and meanings. For example, all animals began with ‘Z’: fish beginning ‘za’, birds ‘ze’ and so on. The problem is that when you are learning new words, systematic form–meaning mapping creates greater scope for confusion. Trying to learn the names of common fish in a new language where they all begin with ‘za’ is going to be harder than in a language where each fish has a distinct name. On the other hand, learning categories is easier when words in the same category have a similar morphological structure. Remembering that ‘cod’ and ‘trout’ both belong in the category of ‘fish’ would be less difficult if the two words had systematic rather than arbitrary signs. Therefore the arbitrariness of the sign can be said to be both advantageous and disadvantageous.

Monaghan acknowledged that this goes against findings on sound-symbolism. For example, we are likely to expect something called ‘takete’ to be sharp and something called ‘maluma’ to be soft (as first observed by Köhler in 1929). Surely it helps children to learn language if the sound of a word conveys something about the object’s properties? By looking at how arbitrary versus systematic the words in natural language actually are, and at the age different words are acquired, Monaghan and colleagues have been able to show that the answer is yes and no. For early word-learning during childhood sound-symbolism helps, but for later acquired language, from around 13 years onwards, the arbitrariness of the sign is more helpful.

Working on working memory
Susan Gathercole (MRC Cognition and Brain Sciences Unit, Cambridge) also focused on the link between research and educational practice; her work considers the cognitive and neural underpinnings of developmental problems of working memory and language, and the use of cognitive methods to overcome them.

Working memory (WM) enables us to store information for brief periods of time, and if this capacity is impaired it can lead to problems in everyday activities, such as following instructions. Gathercole explained that there is a great range of variation in the capacity of children’s WM and that this is a good predictor of how children perform at school. Links can be seen between Baddeley and Hitch’s (1974) WM model and academic skills: the phonological loop is related to children’s vocabulary, the visuo-spatial sketchpad is related to mathematical ability, and the central executive is related to reading and maths skills. Impairments in WM are also found in children with a range of developmental disorders, including ADHD, dyslexia, dyscalculia, specific language impairment, and genetic disorders.

Research on children with low WM capacity has found that the characteristics of this group include being at high risk of poor academic performance, difficulties following instructions, and facing problems when taking part in activities that require processing and storage. Teachers commonly describe these children as inattentive, distractible, and having short attention spans. These cognitive and behavioural profiles are similar to those found in children with reading difficulties and with ADHD. When children with ADHD are compared with children with typical and low WM capacity, no differences are seen between the ADHD group and those with low WM in attentiveness, cooperation, maths, reading and IQ skills.

Gathercole emphasised that greater support is needed for children who have low WM capacity, but who may not present with a developmental disorder. She showed that adaptive WM training is beneficial in supporting learning for children with low WM capacity. The use of a training programme designed to improve working memory resulted in gains in both working memory and mathematical ability. Future work will hopefully see greater transference of skills learnt through such training, and thus improvements in academic performance.

Emerging themes
CogDev 2013 ended with a panel session identifying emerging themes from the conference. One theme was the use of multiple methods by researchers in both domains, in particular the emergence of neuroimaging as a common currency. A need was perceived for more researchers to become conversant in multiple approaches, to broaden the scope of findings. The panel and audience were in agreement that there must be greater emphasis on links between science and educational policy and practice in areas related to cognitive and developmental psychology. Crucially, psychologists should be actively pushing these links.


Box Text

Reading between the dialects

Mark Seidenberg (University of Wisconsin, Madison) dared to ask why, when we have made considerable scientific progress in understanding reading, achievement in reading is still so poor. For example, in 2003, 29 per cent of adults in the US had basic literacy levels and 14 per cent were below basic, and in 2009 a third of 4th-grade school students had below-basic literacy levels.

Seidenberg presented three possible reasons for the gap between scientific progress and real-world progress in reading. First, maybe English orthography is just hard to learn – there is little sound-symbolism in the English language. This can mean it is harder for children to read words aloud but, as also shown by Monaghan (see above), this does not mean it is difficult to learn word meaning. Second, maybe reading isn’t being taught very well. Seidenberg argued that there is a greater need for the science of reading to be understood by teachers, so that science and educational practice are better linked. Third, is it due to poverty? Seidenberg focused on the widely-evidenced ‘Black–White’ achievement gap in the US, revealing that socio-economic status does not fully explain the gap, nor does education narrow it. Rather, the gap gets larger across the first few years of schooling, including for children from high socio-economic backgrounds.

The alternative explanation Seidenberg gave for the gap was differences in dialect. His research has looked at how the dialect of the language a person speaks is an emerging risk factor. For children who speak different dialects at school and at home, learning to read is going to be much harder. A child who hears and speaks African American English (AAE) at home may struggle to understand contin
