The most common way to measure science is to count the number of papers a researcher has published and the number of times those papers have been cited.
Where the research was published also matters, especially when it appears in high-impact journals.
However, for years, researchers and academics around the world have argued that this is a limited way to assess the importance of scientific work.
During the Tec Science Summit 2023 at the Tecnológico de Monterrey, computational biologist Carlos Manuel Estévez-Bretón explained to the audience the new trends in measuring the impact of scientific findings.
Articles such as Toward More Inclusive Metrics and Open Science to Measure Research Assessment in Earth and Natural Sciences argue that the traditional measurement of scientific publications captures only the most superficial aspects of research's impact on society.
New ways to measure science
According to Estévez-Bretón, traditional metrics can encourage researchers to prioritize quantity over quality, since whether their research receives funding often depends on these numbers.
In addition, these measurements ignore the differences between disciplines in publication formats (articles, book chapters, and others), as well as the speed at which results are published.
These factors have led to some alternatives being proposed, explained the expert, who is also a research intelligence consultant for Elsevier, one of the most recognized academic publishers worldwide.
One of the innovative options presented by Estévez-Bretón is the Plum Analytics system.
This set of metrics provides information on the different ways people interact online with the published results of research, whether in the form of a scientific article, book chapter, patent, or video, among others.
The tool is divided into five categories: citations, usage, captures, mentions, and social media.
Citations are the traditional metrics that quantify the number of times an article has been cited.
Usage measures how much the research is being consulted, through clicks, downloads, library holdings, or video plays.
Captures indicate that someone has saved the research to return to it later, for example by bookmarking or favoriting it.
Mentions include blog posts, comments, reviews, and Wikipedia references, showing how far the research has traveled beyond academia.
Social media interactions count how often the research has been shared, liked, commented on, or tweeted.
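The five categories described above can be sketched as a simple tally over interaction events. The event names and mapping below are hypothetical, chosen only to illustrate the idea; the actual PlumX data model is more detailed:

```python
from collections import Counter

# Hypothetical mapping of interaction events to the five categories
# described in the article (illustrative only; not the real PlumX taxonomy).
CATEGORY_OF_EVENT = {
    "cited": "citations",
    "click": "usage",
    "download": "usage",
    "video_play": "usage",
    "bookmark": "captures",
    "favorite": "captures",
    "blog_post": "mentions",
    "review": "mentions",
    "wikipedia_reference": "mentions",
    "share": "social media",
    "like": "social media",
    "tweet": "social media",
}

def summarize(events):
    """Tally a list of interaction events into the five metric categories."""
    totals = Counter()
    for event in events:
        category = CATEGORY_OF_EVENT.get(event)
        if category:
            totals[category] += 1
    return dict(totals)

# Example: a few events recorded for one research output.
events = ["cited", "download", "download", "bookmark", "tweet", "blog_post"]
print(summarize(events))
```

The point of such a grouping is that a single citation count collapses very different kinds of attention into one number, while category totals keep readership, saving, discussion, and sharing visible separately.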
The snowball effect
Another alternative is Snowball Metrics, a proposal developed by several institutions, with the University of Cambridge among its main promoters.
The idea is to generate a set of methodologies, or “recipes,” tested and agreed upon by members of these institutions to measure the impact of research, explained Estévez-Bretón.
The recipes are online and can be downloaded for free for anyone to use.
The intention is to reduce the biases that the interests of publishers, funders, and government agencies could introduce.
For Estévez-Bretón, whether the future of science is more inclusive, fair, and transparent depends to a large extent on how it is evaluated. He hopes that efforts to modernize these metrics will continue in Latin America and the rest of the world.