UT's Remote Data Analysis and Visualization Center enters full production

September 14, 2010

Nautilus, the powerful computer for visualizing and analyzing large datasets at the Remote Data Analysis and Visualization Center (RDAV), goes into full production on September 20. Managed by the University of Tennessee (UT) and funded by a grant from the National Science Foundation (NSF), RDAV and Nautilus provide scientists with cutting-edge capabilities for analyzing and visualizing massive quantities of data. As Nautilus goes into service, RDAV will serve researchers in a wide range of fields.

Louis Gross, professor of ecology and evolutionary biology and mathematics at the University of Tennessee and director of the National Institute for Mathematical and Biological Synthesis (NIMBioS), says, "NIMBioS and RDAV have already initiated new collaborations to enhance the use of these resources for large datasets arising from field observations and from model results. Our objective is to increase the ability of biologists to interpret and analyze complex, multivariate data to address fundamental and applied questions in the life sciences."

"For a scientist, visualization is more than just generating pretty pictures," said astrophysicist Bronson Messer of the Oak Ridge National Lab (ORNL) and UT. "As our simulations grow larger and larger, visualization and the associated data analysis are absolutely essential in producing scientific insight from computation." Messer notes, "Nautilus, and the way it is being integrated into the computational ecosystem at NICS, looks like a very promising avenue for us to increase the amount of scientific insight we obtain from simulations on Kraken. The large, shared memory also allows us to translate analysis workflows directly from earlier, smaller versions."

In addition to addressing scientific problems in the life sciences and astrophysics, Nautilus will be used to process data spanning many other research areas. These include: visualizing results from simulations with many complex variables, such as weather or climate models; analyzing large amounts of data coming from experimental facilities like ORNL's Spallation Neutron Source; and aggregating and interpreting input from a large number of sensors distributed over a wide geographic region. The computer will also have the capability to study large bodies of text and aggregations of documents.

With 1,024 cores and four terabytes of memory, Nautilus can process large volumes of data and analyze them in ways unlike those used by previous systems. Manufactured by SGI as part of its UltraViolet product line, Nautilus features a shared-memory architecture with a single system image. This configuration gives researchers great flexibility in applying the machine's computing power to larger amounts of data in ways that are impossible on many other high-performance computing systems. RDAV has installed a 1.1-petabyte filesystem on Nautilus in anticipation of these vast amounts of data.
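The practical appeal of a single system image is that every worker sees one address space, so a large dataset can sit in memory once and be analyzed in place, with no message passing or explicit data partitioning across nodes. A minimal sketch of the idea (illustrative only, not RDAV software; the array and chunking here are invented for the example):

```python
# Illustrative sketch: on a shared-memory machine, worker threads all
# read the same in-memory array directly. On a distributed-memory
# cluster, the same analysis would require splitting the data across
# nodes and exchanging messages.
from concurrent.futures import ThreadPoolExecutor

data = list(range(1_000_000))  # stands in for a large in-memory dataset

def partial_sum(lo, hi):
    # Each worker slices the one shared array in place.
    return sum(data[lo:hi])

chunk = len(data) // 4
bounds = [(i * chunk, (i + 1) * chunk) for i in range(4)]
with ThreadPoolExecutor(max_workers=4) as pool:
    total = sum(pool.map(lambda b: partial_sum(*b), bounds))
```

The same pattern is why, as Messer notes above, analysis workflows written for smaller shared-memory systems can often be carried over directly as the data grows.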

Founded in the second half of 2009, the RDAV center is now fully staffed and ready to support scientists with their data analysis and visualization challenges. In addition to configuring and fine-tuning the Nautilus hardware, the team has been developing and refining new and existing software to address the latest scientific problems and to engage the scientific community.

RDAV Director Sean Ahern says, "I'm very excited about standing up this machine for the NSF TeraGrid, as it's going to provide much needed capability for understanding complex datasets from Kraken and other sources. And as datasets continue to grow, the shared memory nature of the SGI is a fertile ground for new analysis and visualization research for very large datasets."

