He asked: what can publishers do to help scientists be successful? Scientists are constrained by what is human about them, and reality and our experience of reality are not the same thing. We all have mental modules that push us to see what we want to see in order to reinforce our beliefs, so the brain imposes understanding on what it perceives.
Applying sociological approaches to the study of science reveals norms versus counternorms: communality versus secrecy; universalism versus particularism (evaluating research on its own merit versus by the author's reputation); disinterestedness versus self-interestedness.
There are stark differences when you unpack what researchers believe about how they do science (they believe they follow the norms), what they observe about themselves (more counternorms creep in), and what they observe of others (almost all counternorms).
The primary challenge is that incentives for individual success are focused on getting it published, not getting it right. The choices scientists make when analysing data can affect the results, and unless you see the source data, you cannot understand why. So how do we get researchers to be more transparent and reproducible in their work?
Barriers include perceived norms, motivated reasoning, minimal accountability, and the ubiquitous 'I am busy'. What can be done about it? Look at the rewards researchers need and the means to get them. What if you added rewards for transparency and reproducibility in the research process? What if you diversified rewards to include data and materials, so there is recognition for research content?
Why is this tricky? There are many stakeholders (universities, funders, publishers, societies), which creates a complex set of issues. If desired behaviours are happening but people don't know about them, you need to raise their profile to boost the credibility of the research.
At the Center for Open Science they have created badges for this and defined what each badge means. Badges are symbols, and in real life symbols are powerful, significant indicators of who you are. They are promoting an open research culture using standards, with three data sharing levels:
- Level 1: The article states whether data are available and, if so, where to access them.
- Level 2: Data must be posted to a trusted repository; exceptions must be identified at article submission.
- Level 3: Data must be posted to a trusted repository, and reported analyses will be reproduced independently prior to publication.
So far, 749 journals from 62 organizations are signatories of the TOP Guidelines. Another focus has been around preregistration of research studies. The Registered Reports workflow is: Design > Collect & Analyse > Report > Publish. Peer review would usually happen between Report and Publish, but Registered Reports move it to between Design and Collect. If reviewers don't know what the results are, they are incentivised (along with the researcher) to make it the best study possible. So far, 38 journals have committed to offering Registered Reports.
They are also developing partnerships between publishers and funders: a review report is submitted to both and, if it is acceptable to funder and publisher, the study gets the go-ahead, an efficiency step for all groups.
One of the biggest challenges to reproducibility is completely mundane: labs lose materials and data all the time, and people have multiple, personal systems of data preservation. What can you do to mitigate this? Adopt the TOP Guidelines. Adopt badges. Adopt Registered Reports. Partner on preregistration and on open data with the OSF.
Brian Nosek is Executive Director of the Center for Open Science and Professor in the Department of Psychology at the University of Virginia. He gave the keynote at the 2016 STM Frankfurt Conference.