On November 12, we received the news that Web of Science, the platform operated by Clarivate, will no longer list eLife in the Science Citation Index Expanded (SCIE). This is the commercial index that produces the Journal Impact Factor, a metric widely acknowledged to be highly problematic because it is often misinterpreted and misused in the assessment of research. Instead, Clarivate has offered partial indexing: if we provide a feed of articles that excludes any deemed “inadequate” or “incomplete” in their eLife Assessments, eLife will be listed in Web of Science’s Emerging Sources Citation Index (ESCI).
This decision came after eLife was placed “on hold” by Web of Science on October 23 (see our previous statement). Because partially indexed journals are not given an Impact Factor, we will not receive one when the metric is updated in June 2025. This is despite the fact that such a partial feed would include only papers above Web of Science’s threshold for inclusion – those deemed “solid” or above – and despite the fact that papers we deem to fall below this threshold can subsequently be published in SCIE-indexed journals. Indeed, there is already an example of this happening.
We have highlighted before that eLife has never supported the Impact Factor. A journal’s name or its Impact Factor says little about the quality of any individual research article. This is why we created the eLife Model: to review, assess and directly engage with research without the need for proxies. Instead of signalling the quality of an article through simple accept/reject decisions, we publish public reviews and eLife Assessments of preprints, using a controlled vocabulary to convey the significance of the findings and the strength of evidence in a nuanced manner. Each Assessment highlights the strengths and weaknesses of the work and gives readers a high-level summary of the detailed reviews. We stand by this model because we believe it is much closer to the ideal of how scientific discourse should work.
We understand that the Impact Factor is still relied upon in some settings. Since we were placed “on hold” by Web of Science, we have been collecting feedback from the community and monitoring submissions closely. Here we share an update on how our model has performed during this period.
First, we would like to remind our authors and readers that eLife has the backing of all the major funders and institutions, which continue to treat eLife papers with the same level of trust and prestige as they always have. There is an ongoing move away from using journal-level metrics to evaluate careers and grant funding, and we expect Clarivate’s latest move will only accelerate it.
While the following data cover only a short timeframe, we hope they provide a positive view of our progress so far.
Submission demographics have shifted
While there has been a substantial fall in the number of submissions from China, where we know there is a strong dependence on the Impact Factor, our submission trends elsewhere are broadly unchanged. Some countries, such as the UK, have even shown significant increases. That we are still receiving submissions from other countries that rely heavily on the Impact Factor, such as Italy, is also an encouraging sign.
The following graph shows the number of submissions received by region between November 1 and 15, 2024, compared with the same period in each of the previous two months:
Quality of submissions remains high
We have continued to receive high-quality submissions since being placed “on hold” in October, and the total number of papers sent for review has been unaffected.
The following graph shows the total number of submissions sent for review each week from September 1 to November 16, 2024. It is important to note that we have not asked anything different of our editors in the past few weeks – they continue to assess each paper independently.
We are also pleased to have provided, and to continue providing, high-quality public reviews alongside the papers we publish. These include some truly outstanding research, such as this “fundamental” and “exceptional” study by Comyn et al., this “landmark” and “compelling” study by Shang and Kojetin, and these “fundamental” and “compelling” works by Rai et al. and Wada et al.
Since we launched our model, more than 96% of authors have revised their Reviewed Preprint in response to reviewer feedback before publishing the final Version of Record (VOR). Of the articles that were not revised before VOR publication, most were deemed “solid” or above, and only six were described as “incomplete”. However, even in these cases we are confident that the articles would have been published by other journals, albeit without the public record of the context and concerns that eLife provides.
Support from the community is strong
Since Web of Science’s decision in October, we have been alerting authors at key points in the submission process, and we have been pleased that the overwhelming majority wish to continue with their submissions. We are grateful for the support we have seen from authors and editors, as well as from funders and others across the wider research community.
Next steps
These early data support our belief that the community is ready to challenge the dominance of journal metrics produced by commercial companies that want to preserve a status quo in which profit motives are not easily disentangled from efforts to maintain high academic standards in science. Although we need more data, we are very encouraged by the support and submissions we are receiving.
We will continue to monitor our progress and share updates in light of Clarivate’s decision. Beyond this, our focus will be on working with our communities to find and promote more transparent and accountable ways of publishing research. Our efforts will include continuing to develop the eLife Model in response to community feedback and showcasing how others can adopt a flavour of it to help improve the current publishing system for all. By also working with like-minded organisations, including the Declaration on Research Assessment (DORA), the Coalition for Advancing Research Assessment (CoARA) and the Barcelona Declaration, we hope to drive the necessary change in research culture and evaluation: away from journal-based metrics and towards the more responsible methods of assessment that the community wishes to see.
We’ll have more to say on this soon. For any feedback, questions or concerns in the meantime, please contact us at press@elifesciences.org.