Reproducible and Open Science: Insights from the 3rd Advanced Methods for Reproducible Science workshop
By Emma Norris, University College London (UK), 14 January 2019
Psychology research is in the midst of a transformation. Following repeated failures to replicate published psychology studies, and mounting evidence of p-hacking (data dredging) and publication bias, a revolution is underway to reform research practice. Numerous papers have proposed measures to address these issues, such as Marcus Munafò and colleagues’ ‘manifesto for reproducible science’, and new organisations have been established to spearhead improvements in psychology methods and practice, such as the UK Reproducibility Network (UKRN) and the Society for the Improvement of Psychological Science (SIPS). I write this blog fresh from a workshop discussing these reproducible science issues and offering practical support on how to address them.
The 3rd Advanced Methods for Reproducible Science workshop took place from 6th to 11th January 2019 at Cumberland Lodge in Windsor Great Park, UK. Funded by the Biotechnology & Biological Sciences Research Council (BBSRC) and the European College of Neuropsychopharmacology (ECNP), the 6-day residential workshop welcomed an international group of around 30 early career researchers and 10 tutors (led by Prof Dorothy Bishop, Prof Chris Chambers and Prof Marcus Munafò).
The workshop provided thought-provoking debate and seminar time to discuss the benefits and challenges of Open Science research. For example, Dr Florian Markowetz from the University of Cambridge discussed his ‘5 selfish reasons to work reproducibly’: how working reproducibly is in the self-interest of every scientist (as well as beneficial to the wider community!). By clearly logging all parts of the scientific process from start to finish, he argued, we can avoid research disasters, write papers more easily, facilitate peer review, enable continuity of work and build our reputations as honest and careful researchers.
Dr Kate Button reflected on establishing the GW4 Undergraduate Psychology Consortium with colleagues at the universities of Bath, Bristol, Cardiff and Exeter, which integrates inter-university Open Science research into student dissertations. By pooling resources for data collection and analysis across universities and using rigorous methods, students and supervisors can answer much more complex questions than individual dissertation projects could alone. Dr Kirstie Whitaker also showed the wide scope of Open research practice with her ‘Open Scholarship umbrella’, highlighting the multitude of ways researchers can and should make their data and resources shareable with wider communities.
Practical sessions throughout the week introduced a suite of tools to facilitate reproducible research. Websites such as the Open Science Framework and GitHub were introduced as portals for sharing code and findings, with overviews of R and R Markdown showing how to keep a clear log of analysis code. We were also introduced to the process of Registered Reports by Chris Chambers: a publication format in which a journal grants acceptance based on the hypotheses, experimental procedures and analysis plan, before data collection begins. This pre-acceptance is intended to reduce publication bias and champion research based on rigorous methods rather than findings.
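To give a flavour of the R Markdown approach, here is a minimal, purely illustrative sketch (the file name and variables are hypothetical, not taken from the workshop). Narrative text and analysis code live in one document, so re-knitting the file regenerates every result directly from the raw data:

````markdown
---
title: "Analysis log (illustrative sketch)"
output: html_document
---

```{r analysis}
set.seed(2019)                          # fix the random seed so results regenerate exactly
trial_data <- read.csv("raw_data.csv")  # hypothetical raw data file
fit <- lm(score ~ group, data = trial_data)
summary(fit)                            # model output is rebuilt from the raw data on every knit
```
````

Committing a file like this to GitHub or the Open Science Framework alongside the data means anyone, including your future self, can rerun the analysis end to end.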
Wider methods and analysis techniques to facilitate reproducible science were also discussed, including how to design a good experiment, run power calculations, assess and reduce risk of bias, use statistics to answer questions and claim discoveries, and an introduction to Bayesian analysis. The tutors highlighted that reproducible science techniques are not ‘one-size-fits-all’: with so many tools available, the tasters provided enabled us as attendees to decide which tools may be more or less useful for our own work.
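As one small example of the kind of a priori power calculation discussed, base R can answer the standard sample-size question; the effect size used here is illustrative only:

```r
# How many participants per group are needed to detect a medium effect
# (Cohen's d = 0.5) with 80% power at alpha = .05 in a two-sample t-test?
power.t.test(delta = 0.5, sd = 1, sig.level = 0.05, power = 0.80,
             type = "two.sample", alternative = "two.sided")
# Gives n of roughly 64 per group.
```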
A highlight for me was an impromptu hackathon, where students collaboratively devised and started work on small projects to facilitate reproducible practices going forward – no coding skills required! Projects included curating a list of Open Science training events and conferences, as well as planning an Early Career Network of reproducible scientists. We will continue working on these projects.
Some of my key tips and thoughts arising from the workshop are:
- Incorporate smaller reproducibility milestones for each paper – it’s overwhelming to make many changes all at once
- “Even if you’re not collaborating with someone else, you will be collaborating with yourself in 6 months’ time”. Keeping a clear track of your processes benefits you, your research team and the wider community
- Wider efforts are needed to incorporate Open Science and reproducibility practices within university teaching
The workshop was a truly inspirational event. I came away with new collaborators, new friends and renewed motivation to champion Open Science research. On behalf of all attendees, I wish to give a huge thanks to the tutor team for dedicating their whole week to supporting us and being such friendly advocates for reproducible and open research! More funding has been secured to run this workshop in future years and I cannot recommend enough that you attend.
You can find all resources from the workshop on its Open Science Framework page: https://osf.io/gupxv/
For updates on this and other reproducibility initiatives, follow the UK Reproducibility Network.
Read tweets from the workshop at #repro19
Questions
- How can behaviour change be used to widen Open Science practices?
- What changes should be made to include Open Science practices in university teaching?
- How can these practices be applied to qualitative research?
Bio:
Emma Norris (@EJ_Norris) is a Research Associate on the Human Behaviour-Change Project at UCL’s Centre for Behaviour Change. Her research interests include the synthesis of health behaviour change research and the development and evaluation of physical activity interventions.