Academics value journal prestige over veracity, finds study

If the choice is between impact factor and maintaining the content’s integrity, there is little contest

August 27, 2024

The most acclaimed journal articles may contain the scantiest evidence, because researchers are willing “to trade their results for prestige”.

A Queensland University of Technology study has found that academics around the globe prize impact factor above all other considerations when choosing where to publish their work.

While they may resent reviewers’ requests to cut data from their papers, most comply if it means getting published in big-name journals.

The published findings reflect perceptions that “academics who play the ‘publish or perish’ game” are strongly incentivised to accept all referees’ “suggestions” – including those that are “misleading or even incorrect”.

“Journals with the highest impact factors potentially have the most partial evidence, as researchers are more willing to ‘hold their nose’ to satisfy the editors at influential journals,” the paper says.

“The research community’s fixation on journal prestige is harming research quality, as some researchers focus on where to publish instead of what.”

The study analysed survey responses from 616 researchers in 63 countries. Participants were asked to imagine they were first authors of a “good quality” 4,000-word manuscript – which either had not been submitted for publication or had been rejected a couple of times – and to select potential publication outlets based on their characteristics.

These included impact factor, the speed of decisions to accept or reject papers, the helpfulness of reviews, usefulness for career advancement, and the likely extent of demands to rewrite material.

The analysis found that impact factor was easily the dominant consideration, followed by the helpfulness of the review and the career payoff. Speed of response ranked low, with rewriting demands lowest of all – even if they entailed jettisoning a quarter of the words and a data table.

Focus group discussions revealed that researchers had been asked to “cut results” from their papers at peer review stage, for a multitude of potential reasons: to save space, “to keep a ‘clean story’, to make the story ‘digestible’, to remove results that contradicted previous findings, or to remove findings that were not of interest to journals or colleagues”.

In an op-ed, lead author Adrian Barnett says he now deletes the names of journals in the publication section of his curriculum vitae.

“[This] discourages simplistic scans, such as counting papers in particular journals,” he wrote. “It’s a nudge intervention: a reminder that work should be judged by its content first, journal second.”

john.ross@timeshighereducation.com
