Ranking research

May 3, 2011

A new approach to evaluating research papers exploits social bookmarking tools to extract relevance. Details are reported in the latest issue of the International Journal of Internet Technology and Secured Transactions.

Social bookmarking systems are almost indispensable. Very few of us do not use at least one, whether it's Delicious, Connotea, Trunk.ly, Digg, Reddit or any of countless others. For academics and researchers, CiteULike is one of the most popular and has been around since November 2004. CiteULike lets users bookmark references but also embeds more conventional bibliographic management. As users of such systems quickly learn, the only way to make them useful for others is to tag your references comprehensively, but selectively.

On the whole, CiteULike is very useful, but it could be even more so if it had a better ranking system than the purely similarity-based, query-dependent ranking it uses to generate search results.

Researchers in Thailand have now proposed "CiteRank", a combination of a similarity ranking with a static ranking. "Similarity ranking measures the match between a query and a research paper index," they explain, "while a static ranking, or a query-independent ranking, measures the quality of a research paper." Siripun Sanguansintukul of Chulalongkorn University in Bangkok and colleagues use a group of factors (the number of groups containing the posted paper, the year of publication, the date the paper was posted, and the paper's priority rating) to determine a static ranking score, which is then combined with the query-dependent similarity measure to give the CiteRank.

The team tested their new ranking by asking literature researchers to rate the rankings it produced against those obtained from a search engine built on a tag-title-abstract (TTA) index. The weighted variant CiteRank 80:20, which combines 80% similarity ranking with 20% static ranking, was the most effective. They found that many literature researchers preferred to read recent or just-posted papers, but they also rated classic papers highly when these emerged in the results because they had been posted across different user groups or communities. Users found good papers through the priority rating, but TTA remained important.

"CiteRank combines static ranking with similarity ranking to enhance the effectiveness of the ranking order," explains Sanguansintukul. "Similarity ranking measures the similarity of the text (query) with the document. Static ranking employed the factors posted on paper. Four factors used are: year of publication, posted time, priority rating and number of groups that contained the posted paper."

"Improving indexing not only enhances the performance of academic paper searches, but also all document searches in general. Future research in the area consists of extending the personalization; creating user profiling and recommender system on research paper searching." the team says. The experimental factors that emerged from the study, can help in the optimization of the algorithm to adjust rankings and to improve search results still further.

More information: "CiteRank: combination similarity and static ranking with research paper searching" in International Journal of Internet Technology and Secured Transactions, 2011, 3, 161-177
