Future of biology rests in harnessing data avalanche

September 4, 2008

(PhysOrg.com) -- Like most sciences, biology is inundated with data. However, a group of researchers warns in a Nature feature that the avalanche of biological information has reached the point where the discipline may be unable to realize its full potential without improvements in how data are curated into online databases. The commentary appears in the September 4 issue of the journal and outlines specific remedies to harness the information overload.

By July 2008, data extractors, or curators, had indexed more than 18 million articles in PubMed and sequences from more than 260,000 organisms in GenBank. Both are examples of databases where biological information is stored for public access. Data curation is very labor-intensive.

“There is a lack of standardization or consistency in the way scientists report their findings in different journals,” remarked corresponding author Sue Rhee of the Carnegie Institution’s Department of Plant Biology and principal investigator of The Arabidopsis Information Resource (TAIR). “In some cases the researchers don’t even specify the species of a gene under study. That leaves biocurators, who have advanced degrees in biology and expertise with databases and scripting languages, to read the full text and transfer the essence of the information into specific fields in the database. They spend a lot of time just figuring out the basics. And that leaves a lot of room for error.”

Curation is not just a data-organization exercise; such input has become essential to biological research. The authors note that eleven different databases had three-quarters of a million visitors who viewed 20 million pages in just one month. And with inference programs that feed on the curated data, researchers can now tap into other work that relates to theirs and use that data in their own experiments—a huge advancement that is accelerating the pace of biology. “With this vast universe of information, the whole nature of experimentation is changing,” continued Rhee. “But the field is being held back by the curation backlog.”

The group of authors outlined a series of solutions to the problem. The first is to have authors input their data directly into databases upon acceptance in refereed journals. This step has already begun with Plant Physiology and TAIR: when a manuscript is accepted, researchers now fill in a web form about the Arabidopsis genes they studied. Second, the commentators urge the biological community to adopt universally agreed-upon standard reporting formats. And third, curation needs to be elevated by academic institutions and funding agencies, with incentives for researchers to curate their own data, such as greater academic recognition, career advancement, and funding. They additionally suggest that “community annotation” could be modeled after large-scale astronomy projects like the Sloan Digital Sky Survey, or the Galaxy Zoo, where 80,000 astronomers and interested amateurs classified one million galaxies in less than three weeks.
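To make the idea of a standard reporting format concrete, the sketch below shows, in Python, the kind of structured record an author-submission form might capture at acceptance instead of leaving curators to mine free text. The field names and values are illustrative assumptions, not the actual schema used by TAIR or Plant Physiology.

```python
# Hypothetical sketch of a standardized gene-annotation submission.
# Field names and values are illustrative placeholders, not a real schema.
from dataclasses import dataclass, asdict
import json


@dataclass
class GeneAnnotation:
    species: str           # e.g. "Arabidopsis thaliana" -- often omitted in papers
    gene_identifier: str   # database locus ID for the gene
    gene_symbol: str       # common name used in the manuscript
    function_summary: str  # one-sentence statement of the finding
    evidence: str          # type of experiment supporting the claim
    article_id: str        # identifier linking back to the source article


# Example record an author might fill in at acceptance (placeholder values).
record = GeneAnnotation(
    species="Arabidopsis thaliana",
    gene_identifier="ATXGXXXXX",
    gene_symbol="EXAMPLE1",
    function_summary="Regulates an example developmental process.",
    evidence="mutant phenotype analysis",
    article_id="PMID-placeholder",
)

# Serialize to JSON so the record can be loaded into a database without
# a curator having to re-extract these fields from the full text.
print(json.dumps(asdict(record), indent=2))
```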

“The effort and cost required to curate the data is small compared with the cost of carrying out the research in the first place, yet this additional step adds tremendously to the value of the research results to society,” commented Eva Huala, director of TAIR.

Wolf Frommer, acting director of Carnegie’s Department of Plant Biology noted that “advances in our understanding of biology will affect our food supply, our health-care system, the development of remedies for climate change, and many other aspects of daily life. Basic and applied research have to go hand in hand with curation of databases so that humanity can adapt to the quickly changing world as fast as possible.”

Provided by Carnegie Institution
