SDSC readying 'Gordon' supercomputer for pre-production trials this month

Aug 10, 2011

The San Diego Supercomputer Center (SDSC) at the University of California, San Diego, will this month launch a pre-production phase of Gordon, the first high-performance supercomputer to use large amounts of flash-based SSD (solid state drive) memory.

The installation of 64 I/O nodes, joined via an InfiniBand switched-fabric interconnect, is being completed early this month, and the nodes will be made available to U.S. academic researchers who want to run large-scale database applications, said SDSC Director Michael Norman. University of California researchers interested in early access to Gordon's I/O nodes are directed to request a Dash startup allocation from the XSEDE website at https://www.xsede.org/

Norman made the announcement at the 'Get Ready for Gordon – Summer Institute' being held this week (August 8-11) at SDSC. The four-day workshop is designed to familiarize potential users with the unique capabilities of Gordon, the result of a five-year, $20 million award from the National Science Foundation (NSF). Production startup is set for January 1, 2012.

"This year is the start of academic data-intensive supercomputing," Norman said in opening the conference, and encouraged researchers engaged in data-intensive science and data mining across a diverse range of disciplines to apply for allocations.

With about 300 trillion bytes of flash memory and those 64 I/O nodes, Gordon will be capable of handling massive databases while delivering speeds up to 100 times faster than hard disk drive systems for some queries. Flash memory is common in smaller devices such as mobile phones and laptop computers, but rare in supercomputers, which generally rely on slower spinning-disk technology.
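The "up to 100 times faster" figure is consistent with a back-of-envelope comparison of random-access latencies. The numbers below are typical published latencies for the two technologies, not Gordon's measured performance:

```python
# Illustrative sketch: why flash can be ~100x faster than spinning disk
# for random-access (query-like) workloads. Latencies are typical
# ballpark figures, not benchmarks of Gordon itself.

HDD_RANDOM_ACCESS_US = 10_000  # ~10 ms: seek time + rotational latency
SSD_RANDOM_ACCESS_US = 100     # ~0.1 ms: flash read latency

speedup = HDD_RANDOM_ACCESS_US / SSD_RANDOM_ACCESS_US
print(f"Random-access speedup: ~{speedup:.0f}x")  # prints ~100x
```

Sequential transfers narrow the gap considerably, which is why the article hedges with "for some queries": it is the random, seek-heavy access patterns of database workloads that benefit most.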

"One might ask why we haven't seen flash memory in HPC (high performance computing) systems before," said Norman. "Smaller flash systems for consumers have become less expensive, but they have not been durable enough for HPC applications. Now we have enterprise MLC (multi-level cell), and it's available at both attractive prices and with very good durability (or write endurance), which is achieved by over-provisioning and wear leveling."
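The wear-leveling idea Norman mentions can be sketched in a few lines. This is a toy model, not SDSC's or any vendor's actual flash translation layer; the class, block counts, and policy are illustrative assumptions. The key point is that the controller keeps spare (over-provisioned) blocks and always directs the next write to the least-worn physical block, so repeated writes to one logical address do not burn out one physical block:

```python
# Toy model of flash wear leveling (illustrative only, not real firmware).
# Assumptions: 8 host-visible blocks plus 2 over-provisioned spares;
# each physical block tolerates a limited number of program/erase cycles.

class WearLevelingFlash:
    def __init__(self, visible_blocks=8, spare_blocks=2):
        # Over-provisioning: more physical blocks than the host can see.
        self.erase_counts = [0] * (visible_blocks + spare_blocks)

    def write(self, logical_block):
        # Rather than erasing the same physical block every time, route
        # the write to the least-worn block, evening out wear.
        target = min(range(len(self.erase_counts)),
                     key=lambda i: self.erase_counts[i])
        self.erase_counts[target] += 1
        return target

flash = WearLevelingFlash()
# Hammer a single logical block 100 times; wear still spreads evenly
# across all 10 physical blocks (10 erases each).
for _ in range(100):
    flash.write(0)
print(flash.erase_counts)
```

Real controllers also remap logical-to-physical addresses and perform garbage collection, but the principle is the same: spreading erases across extra blocks is what turns consumer-grade MLC cells into the "enterprise MLC" durability Norman describes.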

Gordon is being configured to aid researchers in conducting data-intensive computational science, such as visual analytics or interaction network analyses for new drug discovery, or the solution of inverse problems – converting observed measurements into information about a physical object or system – in oceanography, atmospheric science, and oil exploration.

Gordon is also suited for large-scale data mining applications, such as de novo gene assembly, for cosmological applications, and for running modestly scalable codes in quantum chemistry or structural engineering.

Along with the well-publicized exponential growth in digital data generally, there is an explosion in the amount of research-based, or scientific, data, noted Norman. "Data of this size is simply becoming unmanageable for analysis, so there is an urgent need for supercomputers like Gordon," said Norman, adding that "it is part of our genetics here at SDSC to do data-intensive computing."
