ReproducibiliTea at Leeds is organised by Emily Williams, Eike Mark Rinke, Kelly Lloyd, Stephen Bradley, & Nour Halabi. This blog has been written by two of the organisers: Sections ‘What is ReproducibiliTea?’, ‘The Great First Session’, ‘Forthcoming ReproducibiliTea Meetings’ written by EW, edited by KL and SB. ‘The Successful Follow-up’ section written by KL, edited by EW and SB.

What is ReproducibiliTea?

ReproducibiliTea is a journal club for transdisciplinary discussion of best practice in research over tea, coffee, and snacks. It brings researchers from across the university together to discuss shared issues in generating reproducible, transparent, and open research, practices often narrowly labelled as “Open Science”. The initiative was created by three early career researchers at the University of Oxford last year and has since been adopted by over 15 universities, in addition to sprouting a successful podcast.

There have been two ReproducibiliTea (or ReproT) sessions at the University of Leeds to date. In this post we will summarise some of the discussions which have arisen so far, and we hope to encourage others to join us for upcoming sessions. We are grateful to Leeds University Library for hosting us and funding our tea/coffee and biscuit habit. Check out the Research Support Team’s active Open Research Leeds Twitter account @OpenResLeeds, which also represents the UK Reproducibility Network (UKRN) grouping at Leeds University. You can also contribute to the discussion on Twitter using the hashtag #UKRNLeeds.

The Great First Session

Thursday 18th July marked the inaugural session of ReproT at Leeds. Dr Emily Williams (Postdoctoral Research Fellow in Psychology) chaired the session, a discussion around ‘A Manifesto for Reproducible Science’ by Prof Marcus Munafò and colleagues. Around 12 people attended from disciplines including dentistry, psychology, environmental sciences, engineering, and data science. The article proposed several practices to improve the reproducibility of research, from before the research takes place (e.g. Pre-Registration and Registered Reports) to the reporting stage (e.g. sharing data and materials, and using reporting checklists/guidelines such as CONSORT). Of similar importance, the snacks of the day were bourbons, chocolate cookies, and angel slices.

On the paper’s promotion of “implementing independent methodological support” (p. 2), there was general agreement that statisticians should be credited as co-authors. Attendees from non-psychological disciplines were surprised that psychologists don’t often have a designated statistician on their project and instead conduct the analyses themselves.

There was general consensus that Registered Reports are a good idea. This is where you submit your introduction, methods, and plan for analysis to a participating journal for peer review before data collection. If the reviews are favourable, the journal offers “in principle” acceptance of the paper regardless of outcomes (e.g. null results). Once the full paper has been written, it is peer reviewed once more to evaluate whether you performed the analyses in the way you said you would. Possible issues with Registered Reports were discussed, including the role of exploratory analyses and reservations that the format would be tricky to fit into a Masters project or PhD. However, Prof Chris Chambers, co-author of the paper in question, argues that doing a Registered Report in the first two years of a PhD is do-able.

The paper also sparked discussion about the recent Elsevier controversies, Prof Chambers’ book “The Seven Deadly Sins of Psychology” (of interest to researchers in any discipline), and discontent with the current system where students often think a null result is a failure on their part.

The Successful Follow-up

The second ReproducibiliTea session took place on Tuesday 13th August. The meeting was chaired by Dr Sam Smith, Associate Professor in the Leeds Institute of Health Sciences. Our attendance almost doubled, with around 20 attendees from disciplines including, but not limited to, engineering, cardiovascular research, psychology, linguistics, and data sciences. The meeting also had a plentiful supply of biscuits, such as digestives and chocolate fingers. Additionally, part of the meeting included a Skype Q&A call with Dr Sebastian Karcher, who, alongside Dr Diana Kapiszewski, authored the paper ‘Openness in Practice in Qualitative Research’.

In the Skype Q&A, Dr Karcher stated that the paper was intended to address the repeated debates on openness in qualitative research. There was consensus on one of the paper’s assertions that “openness is not an all or nothing pursuit”. We should not advocate that all research follow the same guidelines for transparency. Instead, there are small actions that researchers can take to make their research more open.

There are unique challenges in qualitative research that researchers need to be mindful of when pursuing openness. One of the main issues with openness in qualitative research is regarding the ethics of data sharing. Dr Karcher presented some interesting solutions to this particular challenge in the paper.

There were also discussions on ways to increase transparency by keeping an audit trail of the research process, which is then shared in a public domain. There were debates on whether it is truly transparent to edit transcripts to enhance readability.

Forthcoming ReproducibiliTea Meetings

ReproducibiliTea meets on the third Tuesday or Thursday of the month, 1-2pm, in Research Meeting Room 2 of Edward Boyle Library. See below for our upcoming sessions; PDFs of the papers can be found on our OSF page. We look forward to seeing you there soon!

3. Thurs 12th Sept. 2019
Theme: Examining analytic flexibility, and why it is a problem
Article: Simmons et al. 2011. False-Positive Psychology: Undisclosed Flexibility in Data Collection and Analysis Allows Presenting Anything as Significant. Psychological Science

4. Tues 15th Oct. 2019
Theme: Questionable research practices: are they really that common? And why are they problematic?
Article: John et al. 2012. Measuring the Prevalence of Questionable Research Practices With Incentives for Truth Telling. Psychological Science

5. Thurs 14th Nov. 2019
Theme: Has the debate gone too far? Things will just turn out fine
Article: Pashler et al. 2012. Is the Replicability Crisis Overblown? Three Arguments Examined. Perspectives on Psychological Science

6. Tues 10th Dec. 2019
Theme: Open data and materials
Article: Spellman et al. 2017. Open Science: What, Why and How. PsyArXiv

7. Thurs 16th Jan. 2020
Theme: Preregistration as a solution
Article: Nosek et al. 2018. The Preregistration Revolution. Proceedings of the National Academy of Sciences

8. Tues 11th Feb. 2020
Theme: Tackling problems of “doing open” in observational research
Article: Graham et al. 2019. Observational Open Science. MetaArXiv

9. Thurs 19th Mar. 2020
Theme: And what about the future?
Article: Smaldino et al. 2016. The Natural Selection of Bad Science. Royal Society Open Science