UT's Remote Data Analysis and Visualization Center enters full production

Sep 14, 2010

Nautilus, the powerful computer for visualizing and analyzing large datasets at the Remote Data Analysis and Visualization Center (RDAV), goes into full production on September 20. Managed by the University of Tennessee (UT) and funded by a grant from the National Science Foundation (NSF), RDAV and Nautilus provide scientists with cutting-edge capabilities for analyzing and visualizing massive quantities of data. As Nautilus goes into service, RDAV will serve researchers in a wide range of fields.

Louis Gross, professor of ecology and evolutionary biology and mathematics at the University of Tennessee and director of the National Institute for Mathematical and Biological Synthesis (NIMBioS), says, "NIMBioS and RDAV have already initiated new collaborations to enhance the use of visualization for large datasets arising from field observations and from model results. Our objective is to increase the ability of biologists to interpret and analyze complex, multivariate data to address fundamental and applied questions in the life sciences."

"For a scientist, visualization is more than just generating pretty pictures," said astrophysicist Bronson Messer of the Oak Ridge National Lab (ORNL) and UT. "As our simulations grow larger and larger, visualization and the associated data analysis are absolutely essential in producing scientific insight from computation." Messer notes, "Nautilus, and the way it is being integrated into the computational ecosystem at NICS, looks like a very promising avenue for us to increase the amount of scientific insight we obtain from simulations on Kraken. The large, shared memory also allows us to translate analysis workflows directly from earlier, smaller versions."

In addition to addressing scientific problems in the life sciences and astrophysics, Nautilus will be used to process data spanning many other research areas. These include visualizing results from simulations with many complex variables, such as weather or climate models; analyzing the large amounts of data produced by experimental facilities like ORNL's Spallation Neutron Source; and aggregating and interpreting input from large numbers of sensors distributed over a wide geographic region. The computer will also be able to analyze large bodies of text and aggregations of documents.

With 1,024 cores and four terabytes of memory, Nautilus can process and analyze large volumes of data in ways unavailable on previous systems. Manufactured by SGI as part of its UltraViolet product line, Nautilus features a shared-memory architecture with a single system image. This configuration gives researchers great flexibility, allowing them to bring the machine's computing power to bear on amounts of data that are impossible to handle on many other high-performance computing systems. In anticipation of these vast quantities of data, RDAV has installed a 1.1-petabyte filesystem on Nautilus.
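A minimal sketch in C with OpenMP may make the shared-memory point concrete. This is not RDAV software, and the dataset, its 1 GiB size, and the statistics computed are illustrative assumptions: the point is that on a single-system-image machine, every core addresses the same in-memory array directly, with no message passing or manual data partitioning, which is also why workflows written for smaller shared-memory systems can move over largely unchanged.

```c
/* Illustrative shared-memory analysis sketch (not RDAV code).
 * Compile with: gcc -fopenmp -O2 shared_mem_stats.c -o shared_mem_stats */
#include <stdio.h>
#include <stdlib.h>
#include <omp.h>

int main(void)
{
    /* On a 4 TB shared-memory node, a single process can hold a dataset
     * far larger than any one node of a distributed cluster could;
     * 1 GiB of doubles stands in for such a dataset here. */
    size_t n = (size_t)1 << 27;            /* 2^27 doubles = 1 GiB */
    double *data = malloc(n * sizeof *data);
    if (!data) { perror("malloc"); return 1; }

    /* Fill with synthetic values; every thread sees the same array. */
    #pragma omp parallel for
    for (size_t i = 0; i < n; i++)
        data[i] = (double)(i % 1000) / 1000.0;

    /* All cores reduce over the single in-memory copy: no MPI-style
     * partitioning or communication is needed. */
    double sum = 0.0, sumsq = 0.0;
    #pragma omp parallel for reduction(+:sum, sumsq)
    for (size_t i = 0; i < n; i++) {
        sum   += data[i];
        sumsq += data[i] * data[i];
    }

    double mean = sum / n;
    double var  = sumsq / n - mean * mean;
    printf("n = %zu  mean = %f  variance = %f  threads = %d\n",
           n, mean, var, omp_get_max_threads());

    free(data);
    return 0;
}
```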

Founded in the second half of 2009, the RDAV center is now fully staffed and ready to support scientists with their data analysis and visualization challenges. In addition to configuring and fine-tuning the Nautilus hardware, the team has been developing and refining new and existing software to address the latest scientific problems and to engage the scientific community.

RDAV Director Sean Ahern says, "I'm very excited about standing up this machine for the NSF TeraGrid, as it's going to provide much needed capability for understanding complex datasets from Kraken and other sources. And as datasets continue to grow, the shared memory nature of the SGI is a fertile ground for new analysis and visualization research for very large datasets."

Provided by University of Tennessee at Knoxville
