This post is by Karen Abel and Nick Sheppard from the Research Support team
On Thursday 9th September, Professor Alan Haywood, Chair of the University of Leeds Responsible Research Metrics Working Group, welcomed Professor Stephen Curry from the Department of Life Sciences at Imperial College London. A prize-winning structural biologist, Stephen is also well known as an advocate of open research and of improving research assessment. He is the Chair of the San Francisco Declaration on Research Assessment (DORA), whose vision is to advance practical and robust approaches to research assessment globally and across all scholarly disciplines.
You can watch a full recording of the talk on YouTube:
As a DORA signatory, the University of Leeds has established a responsible research metrics working group to review current organisational policies. The group also aims to widen participation in this area and to help colleagues to understand and act on the principles of DORA.
If you support the DORA principles, you can sign the declaration as an individual. To date, signatories number over 18,000 individuals and 2,300 organisations.
There is a disconnect, says Professor Curry, between incentives within the Academy and societal values. Many researchers are motivated by altruistic and noble goals: to understand the world and to change it for the better, through medical research for example. In a 2016 article for the Guardian, Stephen discussed how the Wellcome Trust led an initiative to stimulate research into the Zika virus and its potential link to microcephaly in new-born babies, and to give researchers permission to share their results as quickly as possible via preprints and data sharing, without being penalised when they came to submit their research to a journal.
The need for Wellcome’s action was linked to researchers’ motivation to publish in the ‘best’ journals with high impact factors; the traditional publication model often leads to significant delays that run counter to the public interest. Especially where a virus like Zika is concerned. Or, say, a novel coronavirus. The majority of research is publicly funded, so it is incumbent on the research community to do its utmost to share results and to build on new findings rapidly.
The problem is also acknowledged in a tweetable quote from Thomas Insel, who led the National Institute for Mental Health in the United States for 13 years:
Has the pandemic created a lasting impact on open research?
To employ a ‘brutal euphemism’, the scientific community has had a ‘good pandemic’: people have seen the value of science through epidemiological modelling and vaccine development. We must not squander that.
The ‘how’ and the ‘who’ instead of the ‘what’
Stephen has been influenced by the writings of Michael Sandel, a Harvard philosopher. In this lecture from 2018, Sandel discusses the political landscape, specifically the votes for Brexit and Trump, which he traces to a neoliberal, ‘marketised’ status quo in America and Western Europe where successive governments’ policies have been dictated, narrowly, by market value. This, he suggests, has led to a massive disenfranchisement of the public.
Sandel’s proposed solution is to reframe politics around genuine appreciation and social recognition for contributions to the common good and collective well-being. Through the pandemic, for example, we’ve seen greater recognition of the value of people working in social care, so often those on minimum wage.
Universities have also been infiltrated by the same neoliberal consensus. How can we go beyond market rewards and market definitions of the value of our contributions as they are currently measured, through journal impact factor and university rankings? The obsession with metrics reduces productivity and, once again, is in opposition to the public interest. Not only does the ambition to publish in high impact journals slow publication, but a positive bias has developed with little incentive to share negative results.
While impact factor devalues important research activities, it also devalues academics themselves. It doesn’t allow the rich variety of academic activities to be properly recognised, such as teaching undergraduates, being a good departmental citizen and contributing to policy work or public engagement.
The current system very much focuses on the product, the ‘what’, but it doesn’t focus on the ‘how’ or the ‘who’.
Cultural shift to qualitative indicators
DORA has worked with Professor Sarah de Rijcke of Leiden University, an academic whose sociological research takes a systemic view of research culture. In a 2017 paper she concludes that the focus on quantitative indicators can displace other criteria, and that qualitative indicators such as epistemic originality, long-term progress, societal relevance and social responsibility often fail to gain attention in a system focused on quantitative ones. Indeed, Stephen discussed occasions where he has heard, first-hand, colleagues celebrating a publication in a revered journal over and above any other measure of achievement. While this is a ‘dangerous mode to slip into’, he recognises that it is all too easy, and that most researchers will do so at some point.
There is increasing attention on the damaging effects of hyper competition in the Academy, including recent reports from the Nuffield Council, LERU, the Wellcome Trust and even the Russell Group.
DORA’s conception of healthy research values, Stephen concedes, may appear somewhat utopian, and certainly present a challenge. Nevertheless, the question for DORA, and for all of us working in academia, is how to realise these in practice:
- Reliable, rapidly communicated, accessible, high-quality research that transforms our understanding of the world and can change it for the better
- Researchers who collaborate, who feel duty of care to group members and to the society of which they are a part
- A research system as a whole that values the people within it, cares about their quality of life and seeks out the creative vigour of diversity
The link with open scholarship
There is a strong argument that open science can be better science.
During the pandemic we have seen rapid dissemination of Covid science, with misleading results rapidly deconstructed through open peer review — though in some cases not before they had been picked up by the media and shared far and wide, as happened with hydroxychloroquine. Notwithstanding such examples, open access enables greater scrutiny through more “eyeballs” on a piece of work.
Data and code sharing is integral to the open research landscape, again enabling greater scrutiny and improving reliability. Indeed, there have also been retractions of hydroxychloroquine studies in the peer-reviewed literature associated with data issues.
Open research practices during both Zika and Coronavirus have demonstrated how they can change the world for the better. There is still complexity and nuance however, with ongoing debate and disciplinary differences that need to be considered.
It is difficult to talk about research assessment without considering the link to open scholarship, and with considerations of equity and inclusion, which intersect around a broader conception of research culture:
Open scholarship raises important questions around research assessment and its traditional focus on journal articles. We need to think more broadly about the quality and the variety of research outputs if we are to do research assessment in a robust way.
In terms of equality, diversity and inclusion, the Academy is not representative of the public at large. There is greater diversity and inclusion at the bottom of the academic career ladder, but this tends to reduce as women and people from ethnic minorities find it more difficult to climb to more senior roles. In part, this is due to biases within our research assessment practices that we need to address.
The simple diagram above belies a complex picture, and there are many barriers to fostering a values-based research culture focused on people: the way that research is managed by government and funders; time pressures on universities; and the culture of the ‘hero researcher’ as the lone genius. In combination they create a ‘toxic brew’.
There are no easy solutions, but it is important that we come together as a community, nationally and globally. If you haven’t already, Stephen encourages everyone to become familiar with DORA’s recommendations.
Less well-known than DORA’s position on journal-based metrics are its 17 positive recommendations for different stakeholders. Stephen highlights two in particular: the recommendation for research institutions to be explicit about the criteria used for hiring, tenure and promotion decisions, and the recommendation for researcher assessment to consider not just the paper, but the value and impact of all research outputs, including datasets and software, and to consider wider impact measures such as influence on policy and practice.
Nothing Succeeds Like Success: DORA’s toolbox
- Understanding the obstacles to change in the way research is assessed
- Experimenting with different approaches
- Creating a shared vision when revising policies and practices
- Communicating that vision on campus and beyond
A further tool, SPACE, is being developed to help institutions think through the stage they are at in their journey, to identify barriers and opportunities, and to evaluate the impact of their initiatives.
A recent grant from Arcadia will help to accelerate research assessment reform still further.
DORA is a small organisation and collaborates extensively — for example with the Royal Society to develop the ‘Résumé for Researchers’, which describes not only an academic candidate’s research contributions but also how they support their research team and the wider research community, their discipline or society. UKRI has also recently committed to using a narrative CV format in its application processes. Stephen hopes that we can work towards common international standards, to reduce the burden of multiple different approaches on applicants.
DORA has also collaborated with the Wellcome Trust in the development of their recent open access policy, which requires an institution to have signed DORA, or committed to implementing its principles, in order to be eligible for funding.
See this paper with the Global Research Council on developing more joined-up thinking around research assessment across the world.
How to Change the World?
Whilst Stephen and many others are enthusiastic about, and recognise the importance of, research assessment reform and the promotion of open science practices, there is far from a consensus. A recent article in Nature Index, for instance, highlights differences of opinion about reforms to hiring and promotion criteria at Utrecht University. It is important that advocates listen to and engage with these differing views.
Stephen closed his engaging and wide-ranging talk by offering two different perspectives on how to change the world.
In ‘Utopia for Realists’, Rutger Bregman argues that you’ve got to be “unrealistic, unreasonable and impossible”, while Atul Gawande’s take is that, whilst we yearn for frictionless, technological solutions, it is only through dialogue that things ultimately change. Through people talking to people.
Professor Stephen Curry, Chair of DORA, is quite clear he prefers the second approach.