Managing the data deluge through new software

Unprecedented torrents of data flood out of research labs on a continual basis, but making sense of it all remains a major scientific bottleneck. How software is evolving to transform this data deluge into knowledge is the topic of a news story in Chemical & Engineering News, the weekly newsmagazine of the American Chemical Society.

Rick Mullin, senior editor at C&EN, points out that statistical models have allowed researchers to reduce the number of experiments they run, yet an estimated 40 percent of experiments are still unnecessarily repeated because of inefficient experimental design or inadequate information technology. To tame the situation, multiple companies have stepped up with new software to help researchers gain control over the massive amounts of data that today's high-throughput labs produce.

The article describes the latest advances in this area, which include new ways to search, access, visualize and analyze data. Some software allows scientists to aggregate and analyze information from disparate sources. Other providers combine technology with consulting services to convert raw data into a vendor-independent storage format. While these providers—and thus the scientists, science and the public benefiting from them—have made considerable progress, advanced informatics capable of operating at the scale of tens of thousands of variables is still in the works.

More information: "Breaking Big" cen.acs.org/articles/91/i42/Breaking-Big.html

Citation: Managing the data deluge through new software (2013, October 23) retrieved 23 April 2024 from https://phys.org/news/2013-10-deluge-software.html
This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no part may be reproduced without written permission. The content is provided for information purposes only.
