IBM's new architecture can double analytics processing speed

Nov 19, 2010

At the Supercomputing 2010 conference, IBM today unveiled details of a new storage architecture, designed by IBM scientists, that can convert terabytes of raw information into actionable insights twice as fast as previously possible.

Ideally suited for cloud computing applications and data-intensive workloads such as digital media and financial analytics, this new architecture will shave hours off of complex computations without requiring heavy infrastructure investment. IBM won the conference's Storage Challenge competition for presenting the most innovative and effective design in high performance computing, with the best measurements of performance, scalability and storage subsystem utilization.

Running analytics applications on extremely large data sets is becoming increasingly important, but organizations can only expand their storage facilities so much. As businesses search for ways to harness their large stores of data for new levels of business insight, they need alternative solutions like cloud computing to keep up with growing data requirements, as well as to achieve workload flexibility through rapid provisioning of system resources for different types of workloads.

"Businesses are literally running into walls, unable to keep up with the vast amounts of data generated on a daily basis," said Prasenjit Sarkar, Master Inventor, Storage Analytics and Resiliency, IBM Research – Almaden. "We constantly research and develop the industry's most advanced storage technologies to solve the world's biggest data problems. This new way of storage partitioning is another step forward on this path as it gives businesses faster time-to-insight without concern for traditional storage limitations."

Created at IBM Research – Almaden, the new General Parallel File System-Shared Nothing Cluster (GPFS-SNC) architecture is designed to provide higher availability through advanced clustering technologies, dynamic file system management and advanced data replication techniques. By "sharing nothing," new levels of availability, performance and scaling are achievable. GPFS-SNC is a distributed computing architecture in which each node is self-sufficient; tasks are divided up among these independent computers, and no node waits on another.
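The shared-nothing idea described above can be illustrated in miniature. The sketch below is not IBM's GPFS-SNC implementation, just a conceptual example: the data is partitioned up front so each worker owns its slice outright and computes independently, and the only coordination point is the final merge of per-worker results.

```python
# Illustrative sketch only (assumed example, not GPFS-SNC code): a
# shared-nothing layout partitions data so each worker owns its slice
# and runs independently -- no shared state, no locks, no waiting.
from multiprocessing import Pool

def analyze_partition(partition):
    """Each 'node' computes only over the data it owns locally."""
    return sum(x * x for x in partition)  # stand-in for a local analytics task

def shared_nothing_run(records, n_nodes=4):
    # Divide the records up front; after this point, no worker ever
    # touches another worker's data.
    partitions = [records[i::n_nodes] for i in range(n_nodes)]
    with Pool(n_nodes) as pool:
        local_results = pool.map(analyze_partition, partitions)
    # The final merge is the only coordination step.
    return sum(local_results)

if __name__ == "__main__":
    print(shared_nothing_run(list(range(10))))  # sum of squares 0..9 -> 285
```

Because each partition is processed without cross-worker communication, adding workers scales the computation with minimal contention, which is the property the architecture exploits.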

IBM's current GPFS technology offering is the core technology for IBM's High Performance Computing Systems, IBM's Information Archive, IBM Scale-Out NAS (SONAS), and the IBM Smart Business Compute Cloud. These research lab innovations enable future expansion of those offerings to further tackle tough big data problems.

For instance, large financial institutions run complex algorithms to analyze risk based on petabytes of data. With billions of files spread across multiple computing platforms and stored around the world, these mission-critical calculations demand significant IT resources and cost because of their complexity. With the GPFS-SNC design, running such complex analytics workloads could become much more efficient, as the design provides a common file system and namespace across disparate computing platforms, streamlining the process and reducing disk space.
