Ranking research

May 03, 2011

A new approach to evaluating research papers exploits social bookmarking tools to extract relevance. Details are reported in the latest issue of the International Journal of Internet Technology and Secured Transactions.

Social bookmarking systems are almost indispensable. Very few of us do not use at least one, whether it's Delicious, Connotea, Trunk.ly, Digg, Reddit or any of countless others. For academics and researchers, CiteULike is one of the most popular and has been around since November 2004. CiteULike allows users to bookmark references but also embeds more conventional bibliographic management. As users of such systems quickly learn, the only way to make them useful for others is to tag your references comprehensively, but selectively.

On the whole, CiteULike is very useful, but it could be even more so if it had a better ranking system than the similarity, or query-dependent, ranking it currently uses to generate search results.

Researchers in Thailand have now proposed "CiteRank", a combination of a similarity ranking with a static ranking. "Similarity ranking measures the match between a query and a research paper index," they explain, "while a static ranking, or a query-independent ranking, measures the quality of a research paper." Siripun Sanguansintukul of Chulalongkorn University in Bangkok and colleagues use a group of factors, including the number of groups containing the posted paper, the year of publication, the date the paper was posted and its priority rating, to determine a static ranking score, which is then combined with the query-dependent similarity measure to give the CiteRank.
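The article does not spell out how the four factors are turned into a single score, but a rough sketch helps illustrate how a query-independent score might be assembled from them. Everything in the snippet below (the field names, the normalisations and the equal weights) is an illustrative assumption, not the authors' actual formula.

```python
# Hypothetical sketch of a query-independent (static) score built from the four
# factors named in the article: year of publication, post date, priority rating,
# and the number of groups containing the posted paper.
from dataclasses import dataclass
from datetime import date

@dataclass
class Paper:
    year_published: int   # year of publication
    date_posted: date     # when the reference was posted to the bookmarking site
    priority: int         # user-assigned priority rating, assumed 1 (low) to 5 (high)
    group_count: int      # number of groups that contain the posted paper

def static_score(paper: Paper, today: date = date(2011, 5, 3)) -> float:
    """Blend the four factors into a single score in [0, 1] (illustrative only)."""
    recency    = max(0.0, 1.0 - (today.year - paper.year_published) / 50.0)
    freshness  = max(0.0, 1.0 - (today - paper.date_posted).days / 3650.0)
    priority   = paper.priority / 5.0
    popularity = min(1.0, paper.group_count / 20.0)
    # Equal weights are an assumption; the study tunes these factors empirically.
    return 0.25 * (recency + freshness + priority + popularity)
```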

The team tested the new ranking by asking literature researchers to rate the results it produced against rankings obtained from search engines built on a tag-title-abstract (TTA) index. The weighted algorithm CiteRank 80:20, which combines similarity ranking (80%) with static ranking (20%), was the most effective. They found that many literature researchers preferred to read more recent or just-posted papers, but they also rated classic papers highly when these emerged in the results because they had been posted across different user groups or communities. Users found good papers based on the priority rating, but TTA remained important.
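The 80:20 weighting amounts to a simple linear blend of the two scores. The sketch below shows that blend; the assumption that both inputs are normalised to [0, 1], and the example numbers, are illustrative rather than taken from the study.

```python
# Illustrative sketch of the CiteRank 80:20 weighting: a linear blend of a
# query-dependent similarity score (e.g. computed over a tag-title-abstract
# index) and the query-independent static score from the previous sketch.

def cite_rank(similarity: float, static: float, alpha: float = 0.8) -> float:
    """Blend similarity (query-dependent) and static (query-independent) scores.

    alpha = 0.8 reproduces the 80:20 split the study found most effective;
    both inputs are assumed to lie in [0, 1].
    """
    return alpha * similarity + (1.0 - alpha) * static

# Example: a paper with strong TTA similarity but a modest static score
print(cite_rank(similarity=0.9, static=0.4))  # 0.8*0.9 + 0.2*0.4 = 0.80
```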

"CiteRank combines static ranking with similarity ranking to enhance the effectiveness of the ranking order," explains Sanguansintukul. "Similarity ranking measures the similarity of the text (query) with the document. Static ranking employed the factors posted on paper. Four factors used are: year of publication, posted time, priority rating and number of groups that contained the posted paper."

"Improving indexing not only enhances the performance of academic paper searches, but also all document searches in general. Future research in the area consists of extending the personalization; creating user profiling and recommender system on research paper searching." the team says. The experimental factors that emerged from the study, can help in the optimization of the algorithm to adjust rankings and to improve search results still further.


More information: "CiteRank: combination similarity and static ranking with research paper searching", International Journal of Internet Technology and Secured Transactions, 2011, Vol. 3, pp. 161-177.


