New method ranks quality of scientific journals by field

Feb 27, 2008

Worldwide, the number of scientists is increasing, as are the numbers of scientific journals and published papers, the latter two thanks in large part to the rise of electronic publishing. Scientists and other researchers are finding it more difficult than ever to zero in on the published literature that is most valuable to them.

Now, much as Google locates electronic information likely to be relevant and of high quality to the user, a team of researchers from Northwestern University has developed a mathematical method to rank scientific journals according to quality, an approach that will help scientists locate high-impact research papers to read and to cite in their own papers. The rankings also should be useful to university administrators and funding agencies that must evaluate the quality of a researcher’s work.

The team analyzed the citation data of nearly 23 million papers that appeared in 2,267 journals representing 200 academic fields and that spanned the years from 1955 to 2006; their analysis produced 200 separate tables of journal rankings by field. The results, including all the rankings, will be published online Feb. 27 in PLoS ONE, an international, peer-reviewed, open-access, online journal published by the Public Library of Science.

“Trying to find good information in the literature can be a problem for scientists,” said Luís A. Nunes Amaral, associate professor of chemical and biological engineering in Northwestern’s McCormick School of Engineering and Applied Science, who led the study. “Because we have quantified which journals are better, now we can definitively say that the journal in which one publishes provides reliable information about the paper’s quality. It is important to grasp a paper’s quality right away.”

Amaral, Michael J. Stringer, lead author of the paper and a graduate student in Amaral’s research group, and Marta Sales-Pardo, co-author and a postdoctoral fellow, developed methods to look at the enormous number of published papers and to make sense of them. For each of the 2,267 journals, they charted the citations each paper received across a certain span of years and then developed a model of that data, which allowed the researchers to compare journals.
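The article does not spell out the exact statistical model, but the core step it describes, charting a journal's per-paper citation counts and fitting a curve to them, can be sketched in a few lines. The snippet below is an illustrative assumption only: it fits a bell curve to the logarithm of each paper's citation count (a log-normal form), using invented data and a hypothetical fit_journal helper.

```python
# Illustrative sketch only: the authors' exact model is not given in the article.
# This assumes each journal's citation counts follow a bell curve on a log scale
# (a log-normal form); the data below are invented.
import numpy as np

def fit_journal(citations):
    """Return (mean, std) of log-transformed citation counts for one journal."""
    logs = np.log(np.asarray(citations, dtype=float) + 1.0)  # +1 avoids log(0)
    return logs.mean(), logs.std(ddof=1)

journal_a = [3, 10, 25, 40, 7, 120, 15, 9, 60, 2]  # hypothetical citation counts
journal_b = [0, 1, 4, 2, 8, 1, 3, 0, 5, 2]

print("Journal A:", fit_journal(journal_a))  # a higher mean suggests more-cited papers
print("Journal B:", fit_journal(journal_b))
```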

The researchers’ model produced bell curves for the distribution of “quality” of the papers published in each journal. For each field, all the journals’ bell curves were then compared, which resulted in the journal rankings. The field of ecology, for example, had 36 journals, with Ecology ranked highest and Natural History ranked lowest.

“The higher a journal is ranked, the higher the probability of finding a high-impact paper published in that journal,” said Amaral.
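A minimal sketch of that reading of the rankings, under the same log-normal assumption used above: journals in a field are ordered by the probability that a randomly chosen paper exceeds a citation threshold. The journal parameters, the threshold, and the prob_high_impact helper are all hypothetical, not values from the study.

```python
# Sketch under the assumed log-normal citation distribution; all numbers invented.
from math import log, sqrt, erf

def prob_high_impact(mu, sigma, threshold):
    """P(citations > threshold) for a log-normal with parameters mu, sigma."""
    z = (log(threshold) - mu) / sigma
    return 0.5 * (1.0 - erf(z / sqrt(2.0)))

field = {                     # hypothetical (mu, sigma) per journal,
    "Journal A": (3.1, 1.2),  # e.g. obtained from a fit like fit_journal above
    "Journal B": (1.4, 1.0),
    "Journal C": (2.2, 1.5),
}

threshold = 100  # citations counted as "high impact" (arbitrary choice)
ranking = sorted(field.items(),
                 key=lambda item: prob_high_impact(item[1][0], item[1][1], threshold),
                 reverse=True)
for rank, (name, (mu, sigma)) in enumerate(ranking, start=1):
    p = prob_high_impact(mu, sigma, threshold)
    print(f"{rank}. {name}: P(high impact) = {p:.3f}")
```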

Amaral and his team found that the time scale for a published paper’s complete accumulation of citations -- a gauge for determining the full impact of the paper -- can range from less than one year to 26 years, depending on the journal. Using their new method, the Northwestern researchers can estimate the total number of citations a paper in a specific journal will get in the future and thus determine -- right now -- the paper’s likely impact in its field. This is the kind of information university administrators and funding agencies should find helpful when they are evaluating faculty members for tenure and researchers for grant awards.
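The projection step can be illustrated with a toy curve fit: given a paper's citation counts over its first few years, extrapolate the level at which the cumulative total levels off. The saturating-exponential form, the scipy fit, and the numbers below are assumptions for illustration, not the authors' published procedure.

```python
# Toy illustration of projecting a paper's eventual citation total from its
# early history; the saturating-exponential form is an assumption, not the
# authors' published method. Requires numpy and scipy.
import numpy as np
from scipy.optimize import curve_fit

def cumulative(t, c_inf, tau):
    """Cumulative citations approaching a plateau c_inf on time scale tau."""
    return c_inf * (1.0 - np.exp(-t / tau))

years = np.array([1, 2, 3, 4, 5], dtype=float)          # years since publication
observed = np.array([8, 15, 20, 24, 26], dtype=float)   # hypothetical cumulative counts

(c_inf, tau), _ = curve_fit(cumulative, years, observed, p0=(30.0, 2.0))
print(f"Projected total citations: {c_inf:.0f} (time scale ~{tau:.1f} years)")
```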

“This study is just one example of how large datasets are allowing us to gain insight about systems that we could only speculate about before,” said Stringer. “Understanding how we use information and how it spreads is an important and interesting question. The most surprising thing to me about our study is that the data seem to suggest that the citation patterns in journals are actually simpler than we would have expected.”

The PLoS ONE paper is titled “Effectiveness of Journal Ranking Schemes as a Tool for Locating Information.” Upon publication, the paper will be available at www.plosone.org/doi/pone.0001683.

Source: Northwestern University

User comments (1)

BigTone, Feb 27, 2008
I think this is great, and I like how they patterned the relevance, or as they put it, the impact, of a paper with a Google-esque method based on how many other publications reference it.

The only thing that scares me about this is a bias towards traditional consensus as opposed to new ideas. For example, many open-minded scientists are taking a look at extremophiles and are beginning to think the traditional Goldilocks Zone is not a hard requirement for life - especially microbial life - even as we know it and find it here on Earth. Thus, papers on Goldilocks Zone-type research may get an unfair number of references because of the long-standing history of most scientists concurring with its assumptions.

Although they are counterbalancing this effect with other criteria for impact, those counterbalances also seem biased by traditional assumptions that could be erroneous but mainstream (peer review is only as useful as what your peers generally believe). To give a very extreme example, I wouldn't want the impact of my genetic research to be peer reviewed by neo-Nazi quasi-scientists who are heavily opinionated and would dismiss or spin my findings to meet their own agenda...