Making results dead currency

Our editor Jon Sutton reports from a symposium at the Society's Annual Conference.

Introducing a symposium on innovations in open science, following on from Brian Nosek’s opening keynote, Mark Andrews reminded us that openness has always been one of the defining values of science. Yet in recent years we have seen how science has fallen short of its own principles. Small teams work in secrecy; data and analyses are not routinely shared; and only a brief, carefully curated façade is eventually disclosed.

Thankfully, psychology has been drawing back the curtains. Chris Chambers (Cardiff University) delivered an upbeat progress report on registered reports, five years on. Calling the format a ‘vaccine against bias’, he demonstrated how practices of varying degrees of corruption and conscious effort stem from misalignment between the needs of the individual and those of the community. ‘We require a fundamental shift in how we think of quality in research… we have to make results a dead currency in quality evaluation.’ Chambers argued that scientific value stems from the question asked, and the quality of the method used: never the result it produces. Registered reports, now adopted by more than 100 journals covering many fields in the medical, social and physical sciences, represent ‘reproducible, transparent, credible science’.

Registered reports are not a new idea, but there are still emerging areas of debate. These include whether all clinical trials should be published as registered reports, and the provision of pathways for researchers doing purely exploratory research. ‘People always think they’ve found a fatal flaw,’ Chambers said. But in fact, the format is being constantly refined, and the impact assessed. For example, advocates are increasingly working to tie registered reports to funding: ‘your grant and your publication from it are approved at the same time’. Chambers nodded to Marcus Munafo’s efforts with Cancer Research UK and Pfizer. ‘We should be pushing for RR grants from all major funding agencies.’ RR, Chambers argues, liberates you as a researcher – you’ve done the hard work in your design, and you can also benefit from external input at the early stage. 

So what still gets in the way? With funders, the challenge is logistical – getting them to work with journals, who can often have different philosophies. With journals, it’s ‘editors who like to choose what to publish’, Chambers concluded.

Another Cardiff psychologist, Richard Morey, outlined the Peer Reviewers’ Openness Initiative. ‘All members of the scientific community,’ Morey argued, ‘have equal responsibility for upholding standards.’ We want to be open; we want others to be open; yet we aren’t open. Fewer than half of scientists respond with data on request; many report feeling that if they put their data out there, they could be embarrassed. Reviewers have both the responsibility and the incentive to encourage open science, so the PRO initiative gets them to request open data and materials, or for authors to add a note on why they are unwilling to do that (‘any public justification will do’). It’s a simple step, but ‘a concrete mechanism for collective reviewer action’.

Mark Andrews (Nottingham Trent University) then took us into a world of RMarkdown, Jupyter Notebooks, and Git: technological solutions for open, transparent and reproducible research. Data and code have, Andrews said, for too long been the ‘second class citizens in scientific communication’. These tools allow you to couple your analysis with your report in a seamless manner.

Concluding the symposium, Katherine Button from the University of Bath updated us on efforts to instil scientific rigour at the grassroots. Hers is a novel, multi-centre methodology for undergraduate psychology projects. Separate meetings of the consortium discuss design and results, to end up with research that has way more power, generalisability and transparency than your average student project. ‘We’re squaring the circle of training and rigour,’ she concluded. 

- Find much more about replication, reproducibility and open science in our archive.

Find more reports from the Annual Conference on our website, and look out for coverage in the June and July print editions.
