‘Toxic culture’ caused by REF pressure to target top journals

Staff at management school say they feel forced to publish in ‘narrow’ subset of elite journals to boost REF standing, restricting the types of research they can pursue

Published on February 12, 2026
Last updated February 12, 2026
A speleologist crawls with his caving pack through a tunnel during a competition.
Source: Viktor Drachev/AFP via Getty Images

Academics claim they are coming under growing pressure to publish in highly rated publications ahead of the Research Excellence Framework (REF), contributing to a “target-driven” culture that discriminates against other types of research outputs.

In a highly critical report into working practices at the University of Liverpool Management School prepared by University and College Union (UCU) representatives, policies that require staff to publish in journals designated 3* or 4* by the Chartered Association of Business Schools (CABS) and in those on the Financial Times’ FT50 list are cited as a source of “considerable apprehension” for a majority of staff.

“Many feel that success in these restrictive outlets…has become the salient measure for promotion and access to resources,” explains the report, which drew on responses from 78 staff members.

“This pressure actively narrows research agendas toward dominant methods and theoretical ontologies (often North American rather than UK/European), and constrains staff from publishing in high-quality disciplinary, specialist, and inter-disciplinary journals not on these lists,” argues the report, adding that this “devalues books, chapters, and other alternative formats, contradicting the university’s public commitment to diverse research”.

Journal lists are widely compiled by universities as well as various other organisations. Supporters say they are a useful way of quickly identifying exceptional researchers, given the vast range of journals in existence.

But the Liverpool report questions how this approach aligns with the university’s broader commitments to the San Francisco Declaration on Research Assessment (Dora). Signatories to that declaration, which has been endorsed by hundreds of universities worldwide, agree that they should not use “journal-based metrics…as a surrogate measure of the quality of individual research articles”.

Several respondents to UCU’s survey argued that the focus on certain journals was unfair, particularly on early career researchers, because “publication time horizons are long and rejection rates high at FT50 and other 4* journals”.

“Pushing staff to publish exclusively in these outlets may delay the dissemination of solid, field-relevant research that would otherwise find a suitable home in other high-quality journals,” explains the report, which has been sent to university leaders.

“They deny promotions, sabbaticals, and anything else you might ask for if you don’t have an FT50 pipeline,” explains one academic quoted in the report. Another claims FT50 journals are “mentioned at every group meeting, regardless of your research area or topic. It feels similar to a sales environment, where only the target matters”.

The “internal pressure to publish in a narrow set of journals” is driven by the school’s desire to improve its REF standing, the report argues, claiming there is a “belief that there is a direct and exclusive link between FT50 publications and a high REF ranking”.

But the “biased and unfair research evaluation system” had contributed to a “toxic culture of fear” within the school, demonstrated by “persistently high levels of UCU casework”.

Many institutions have defended journal lists against criticism that they fail to comply with the spirit of Dora, arguing that the lists do not focus exclusively on journal metrics and are often used alongside other indicators.

A University of Liverpool spokesperson said the institution is “a committed signatory” to Dora and “on this basis, we are clear that publication in a FT50 journal is not a criterion for hiring or promotion at the University of Liverpool Management School”.

“Rather, the university has a robust output evaluation programme that is consistent with Dora and is used to assess quality of research,” they said.

The university was aware of the recent UCU report, they added, and “whilst we do not accept its conclusions, we have engaged in constructive dialogue with UCU in relation to the issues it raises”.

Anna Morgan-Thomas, professor of digital management and innovation at the University of Glasgow’s Adam Smith Business School and former dean of research, told Times Higher Education that the widely used journal lists served a useful purpose.

“Their use is not surprising because there are hundreds of business journals across a wide number of disciplines. So when you’re selecting from hundreds of applicants you need some criteria to both compare across and select the truly exceptional candidates,” explained Morgan-Thomas, whose 2024 Research Policy study found that 4* journal publications strongly correlated with 4* REF outputs.

Relying solely on “academic judgement” in evaluating outputs, as suggested by Dora, was difficult for business schools because it would require convening the right expertise, as well as being hugely labour-intensive, said Morgan-Thomas.

“While I fully align with Dora’s intentions, it underestimates the practical constraints of carrying out this kind of academic judgement,” she said, noting that conducting rigorous internal screening of papers necessitated a trade-off.

“It comes down to some sharp choices. Do I ask my staff to spend time shepherding young researchers through the publication process? Or do I devote it to reading and rating articles that staff have already published?” she said.

jack.grove@timeshighereducation.com

Reader's comments (11)

Maybe this is news for Management Schools, but it is a decades old issue for the rest of us.
Nope, been standard in business schools for decades; we even have our own list developed as a guide for this purpose, which is what the article is reporting on. It's just that certain types of academics don't like it, despite the broad interdisciplinary nature of the list used and the acceptance of a wide range of different approaches at most journals mentioned. These complaints cycle around every so often, usually now hiding behind the misreading of DORA, and it's unlikely the UCU did a survey in this case; they probably dusted off an old list of complaints from the usual suspects.
How is this a misreading of DORA? Using the ABS List to assess the quality of a piece of published research, or in hiring/promotion decisions is clearly not consistent with DORA's "general recommendation".
Do people genuinely think that if REF didn't exist, there'd be no pressure to publish high quality papers in high quality journals? Laughable.
Given the pre-REF state of UK publishing in business schools I suspect some would love to go back to the good old days where editing a book at some point in your career was enough for a professorship.
Those who claim that they are not ignoring DORA by using journal rankings as a mark of quality are utterly deluded. Of course it takes time to personally review work, but how else does anyone fairly judge the quality? Members of the last Business & Management REF panel developed an approach to doing that which overcame the need for subject/field specialisation. Learning to apply it takes hours, not weeks. Surely they could disseminate their approach to a wider audience and, in doing so, create an environment in which a quality grade based upon that approach is truly consistent with DORA. If they did, universities could then use the higher of the two rankings: the CABS list/FT50 grade or the REF-panel-methodology-based quality grade. Better still, ignore the CABS list/FT50 and rely solely on the REF panel approach. After all, it is the REF panel’s approach business schools should be seeking to emulate. The REF panel surely does not rely on the CABS list or the FT50 list when assessing the quality of an individual article. So why are universities insisting on the use of these lists? If submissions to REF are only from journals ranked highly in the lists, a correlation to the REF grade doesn’t mean very much, but those who use the ranking lists despite signing up to DORA use the observation to deflect attention away from their non-DORA-compliant policies and practices. What is perhaps more interesting is the many outputs that were graded differently in the REF compared to the ranking of the journal they were published in. Would personal evaluation using the REF panel approach be less accurate at predicting REF grade? Probably not. Using that approach, not journal rankings, would enfranchise those fields and methodologies that are currently excluded because their journals are lower ranked, as well as books, monographs, early career researchers, and everyone else currently being forced to do research in topics that are not their choice or interest. This should also result in higher quality research being done. That’s surely a “win-win”.
Publishing in major academic journals has never been easy. It is indeed demanding, stressful and high-risk. The concerns raised in the UCU report and this article are valid. The CABS list is flawed and restrictive. How journals are ranked on the list is problematic; it's a league table but seemingly without promotion or relegation. The upshot is that b-school academics face a very narrow range of journals where publishing is deemed 'acceptable' or 'worthwhile' - only in journals ranked at CABS level 3 or 4. Anything rated a 2 is seen as a waste of time, even when the work is interesting, original, creative, and in the appropriate journals where certain conversations are taking place. In some sub-disciplines you've effectively got a target range of about 2 or 3 journals, usually the more conservative ones. Publishing has never been easy, but it's now in danger of becoming an unrewarding chore. The CABS list is largely responsible for this. It's an unwanted managerial intrusion into our profession, and other disciplines don't have an equivalent.
The ABS list used to be very restrictive years ago, but now it's very expansive, with almost 500 journals rated 3* or above. If you can't find any suitable venue in that, you have to question whether you're a business and management researcher. Also, views are changing on monographs. For the last three REFs now, I think submitted books have almost all been rated 4*. Business schools are picking up on this because books can be double weighted. There have been some fantastic books over the last few years as a result. I don't think using lists as guides for researchers is a bad idea, as it provides clarity. Ultimately, any promotion decision must involve manual quality checks (i.e., peer review), meaning papers in ABS 4* journals could be judged rubbish on closer inspection. Research performance should be about your personal contribution to knowledge. That means down-weighting papers with many authors, and also judging how your whole programme fits together. People with many papers as second, third author or later, across lots of topics, no matter how good the papers are individually, should not be able to advance as fast as someone who has made a personally big and sustained contribution to a single idea, unless of course they bring in lots of grant money that pays them to do this.
Not once has anybody mentioned the ethics of publishing. You can choose to ignore the profit-making major commercial publishers, and crippling 'informal' school journal lists, & still have a great career, as I have done. Business schools, of course, don't seem to recognize this; they want publication in a narrow range of largely commercial journals by publishers who overcharge our universities and create untold grief around the world for people unable to pay APCs. There are scholar-led alternatives. Don't demean those.
There is one fundamental problem with the CABS list that cannot be ignored or simply overcome. It rates journals, not the research published therein. This is why the C17 REF Subpanel repeatedly state that they review the submitted papers only, without reference to the journal in which it is published via the CABS list or any other. And whilst Professor Morgan-Thomas and colleagues use a variety of metrics in their study, Social Sciences REF Panels do not use such metrics in their evaluations, so there is an inevitable misalignment between the REF process and their analysis. Why does this matter? Because, put simply, all of this is driven by the REF. The critical data are from the C17 Subpanel themselves, which highlight the extent to which there is (mis)alignment between the journal ranking and REF Panel evaluation. Hence why so many universities are implementing their own peer review processes to determine eligibility of outputs for inclusion in their REF submissions. So when Business Schools use the CABS list to evaluate potential candidates, promotion applicants, etc., they are using misaligned data. Yet this does not seem to stop them. Just one final thought - imagine commercial businesses operating in this way.
The REF is a waste of everyone's time and a colossal waste of money. It's about time this simple truth was faced by those in charge.
