Toward a scientific process freed from systemic bias

Research on how science works - the science of science - can benefit from studying the digital traces generated during the research process, such as peer-reviewed publications. This type of research is crucial for the future of science and of scientists, according to Frank Schweitzer, Chair of Systems Design at ETH Zurich in Switzerland. Indeed, quantitative measures of scientific output and success already shape the evaluation of researchers and the funding of proposals. He shares his views in an Editorial spearheading a thematic series of articles entitled "Scientific networks and success in science", published in EPJ Data Science. There, Schweitzer notes, "it is appropriate to ask whether such quantitative measures convey the right information and what insights might be missing."

As the studies in this thematic series demonstrate, data science is in a unique position to leverage large data sets and the latest statistical analysis techniques, and to empirically validate and quantify phenomena related to citation and publishing practices. For example, Alexander Petersen and Orion Penner from the IMT Lucca Institute for Advanced Studies, Italy, found a strong cumulative advantage by which the initial publishing success of individuals is amplified over time.
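As a rough illustration of what cumulative advantage means, the following toy simulation in Python (a minimal sketch, not the authors' actual model; the population size and boost parameter are arbitrary illustrative choices) awards each new success to an author with probability proportional to one plus their past successes, so early winners pull further ahead over time:

import random

def simulate_cumulative_advantage(n_authors=100, n_rounds=50, boost=1.0, seed=42):
    """Toy rich-get-richer process: each step, one success is awarded to an
    author with probability proportional to 1 + boost * past successes."""
    rng = random.Random(seed)
    successes = [0] * n_authors
    for _ in range(n_rounds * n_authors):
        weights = [1 + boost * s for s in successes]
        winner = rng.choices(range(n_authors), weights=weights, k=1)[0]
        successes[winner] += 1
    return sorted(successes, reverse=True)

if __name__ == "__main__":
    ranked = simulate_cumulative_advantage()
    print("Top 5 success counts:   ", ranked[:5])
    print("Bottom 5 success counts:", ranked[-5:])

Running the sketch typically shows a handful of authors accumulating many times the average number of successes, which is the qualitative pattern the term describes.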

In a separate study, Christian Schulz from ETH Zurich and colleagues show how different authors who share the same name in a publication repository can be told apart by analysing the similarity of their respective citation networks.
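The general intuition can be sketched as follows (a simplified illustration, not the algorithm published by Schulz and colleagues; the Jaccard measure, the threshold value and the record identifiers are assumptions made for this example): two publication records that share a large part of their citation neighbourhood are attributed to the same person.

def jaccard(a, b):
    """Jaccard similarity of two sets of cited-paper identifiers."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if (a or b) else 0.0

def same_author(citations_a, citations_b, threshold=0.2):
    """Heuristic: records whose reference lists overlap strongly are merged.
    The threshold is an illustrative choice, not a published value."""
    return jaccard(citations_a, citations_b) >= threshold

# Hypothetical records by two different people who both publish as "J. Smith"
smith_1 = {"doi:10.1/a", "doi:10.1/b", "doi:10.1/c"}
smith_2 = {"doi:10.1/b", "doi:10.1/c", "doi:10.1/d"}
smith_3 = {"doi:10.2/x", "doi:10.2/y"}

print(same_author(smith_1, smith_2))  # True  - overlapping references
print(same_author(smith_1, smith_3))  # False - disjoint references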

Finally, Emre Sarigöl from ETH Zurich and colleagues address the question of whether quantitative, citation-based measures of scientific impact can be seen as objective. They show that the position of scientists in the collaboration network alone is, to a surprisingly large degree, indicative of the future citation success of their papers. Citation-based measures may therefore not be the most appropriate way to quantify scientific impact.
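A study of this kind can be pictured, in simplified form, as computing network-position features for each author in a coauthorship graph and relating them to later citation success. The sketch below uses the networkx and scikit-learn libraries with made-up authors and labels; it illustrates the general setup only and is not the authors' actual analysis pipeline:

import networkx as nx
from sklearn.linear_model import LogisticRegression

# Hypothetical coauthorship edges (pairs of authors who have published together)
edges = [("A", "B"), ("A", "C"), ("B", "C"), ("C", "D"), ("D", "E"), ("E", "F")]
G = nx.Graph(edges)

# Network-position features per author: degree and betweenness centrality
degree = nx.degree_centrality(G)
betweenness = nx.betweenness_centrality(G)
X = [[degree[a], betweenness[a]] for a in G.nodes()]

# Hypothetical labels: 1 if the author's next paper became highly cited
y = [1, 0, 1, 1, 0, 0]

# A simple classifier linking network position to future citation success
model = LogisticRegression().fit(X, y)
print(dict(zip(G.nodes(), model.predict_proba(X)[:, 1].round(2))))

If network position alone yields predictions clearly better than chance on held-out data, that is the kind of signal the study reports; here the data are invented, so the output is only meant to show the shape of the analysis.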



More information: F. Schweitzer, Scientific networks and success in science, EPJ Data Science 2014, 3:35

A. Petersen, O. Penner, Inequality and cumulative advantage in science careers: a case study of high-impact journals, EPJ Data Science 2014, 3:24

C. Schulz, A. Mazloumian, A. M. Petersen, O. Penner, D. Helbing, Exploiting citation networks for large-scale author name disambiguation, EPJ Data Science 2014, 3:11

E. Sarigöl, R. Pfitzner, I. Scholtes, A. Garas, F. Schweitzer, Predicting scientific success based on coauthorship networks, EPJ Data Science 2014, 3:9

Provided by Springer
Citation: Toward a scientific process freed from systemic bias (2015, January 26) retrieved 19 August 2019 from https://phys.org/news/2015-01-scientific-freed-bias.html

User comments

Jan 26, 2015
The bias against new ideas in the sciences occurs at the level of citation networks, so the use of scientometrics itself is the problem. This is actually obvious to anybody who listens to the critics of textbook theories. It takes a scientist who is reliant upon such metrics and who is deeply embedded in textbook theory to fail to observe and appreciate the problem.

What science does not want to own up to is that there needs to be a counter-balancing, non-professional voice in the sciences. In other words, we should be building a scientific social network made up of independent layperson voices that does not depend upon scientometrics, and a close look at science education reform efforts should reveal the path forward on this.

But, of course, academia will never create such a network. So, it will arrive as a business proposition, and the point will be to systematically map out the controversies of science -- an endeavor which academia has completely abandoned.

Jan 26, 2015
I'm a bit in two minds about this.
Mostly: Such evaluations are important.
Niggle: Science is at the very edge - and looking for ways to 'average out' things is not a way to keep that edge going.

"They show that the position of scientists in the collaboration network alone is - to a surprisingly large degree - indicative of the future citation success"

I don't find this surprising at all. The best researchers are those
- that collaborate best (one-man-research has gone the way of the Dodo)
- that can produce excellent work (which means a lot of people will want to work with them)

So there is a natural tendency for excellent researchers to gravitate towards a central position within networks of other researchers. The position is not the cause but the result of scientific excellence.

Jan 26, 2015
Oh good golly! Physorg dropped the ball on putting this one up without doing the right kind of diligence on the title. Really-Skippy is probably going to take this one over for his very own claim he invented the subject for his book about toes and everything else.

Jan 26, 2015
Oh good golly! The failed-system apologist and his idiot Uncle trying to deny the bleeding objective truths highlighted by many mainstream science insiders/observers of late. But it's never the fault of the system! No, of course it couldn't be! It's always the messengers' fault for bringing the failings into the light of objective discussion and scrutiny for the purpose of correcting those failings instead of continuing denying them while trying to keep 'sweeping them under the carpet', with wall-to-wall troll posts by denialist-apologists and their idiot Uncle-bots. NO wonder science is no longer 'self-correcting'; or avoiding built-in confirmation bias in the accumulated literature as timely/honestly as it was supposed to do. All the mounting evidence/articles by mainstreamers recognizing the problems are ignored; while you pretend that it's only a 'small niggle' which can be self-corrected without actually recognizing the scale/seriousness of the problem FIRST. Good job, losers.
