Computer scientists successfully boot one million Linux kernels as virtual machines

Sep 25, 2009
Sandia National Laboratories computer scientists Ron Minnich (foreground) and Don Rudish (background) have successfully run more than a million Linux kernels as virtual machines, an achievement that will allow cybersecurity researchers to more effectively observe behavior found in malicious botnets. They utilized Sandia's powerful Thunderbird supercomputing cluster for the demonstration. (Photo by Randy Wong)

(PhysOrg.com) -- Computer scientists at Sandia National Laboratories in Livermore, Calif., have for the first time successfully demonstrated the ability to run more than a million Linux kernels as virtual machines.

The achievement will allow cyber security researchers to more effectively observe behavior found in malicious botnets, or networks of infected machines that can operate on the scale of a million nodes. Botnets, said Sandia’s Ron Minnich, are often difficult to analyze since they are geographically spread all over the world.

Sandia scientists used virtual machine (VM) technology and the power of its Thunderbird supercomputing cluster for the demonstration.

Running a high volume of VMs on one supercomputer, at a scale similar to a botnet's, would allow cyber researchers to watch how botnets work and explore ways to stop them in their tracks. “We can get control at a level we never had before,” said Minnich.

Previously, Minnich said, researchers had only been able to run up to 20,000 kernels concurrently (a “kernel” is the central component of most computer operating systems). The more kernels that can be run at once, he said, the more effective professionals can be in combating the global botnet problem. “Eventually, we would like to be able to emulate the computer network of a small nation, or even one as large as the United States, in order to ‘virtualize’ and monitor a cyberattack,” he said.

A related use for millions to tens of millions of operating systems, Sandia’s researchers suggest, is to construct high-fidelity models of parts of the Internet.

“The sheer size of the Internet makes it very difficult to understand in even a limited way,” said Minnich. “Many phenomena occurring on the Internet are poorly understood, because we lack the ability to model it adequately. By running actual instances to represent nodes on the Internet, we will be able not just to simulate the functioning of the Internet at the network level, but to emulate Internet functionality.”

A virtual machine, originally defined by researchers Gerald J. Popek and Robert P. Goldberg as “an efficient, isolated duplicate of a real machine,” is essentially a set of software programs running on one computer that, collectively, acts like a separate, complete unit. “You fire it up and it looks like a full computer,” said Sandia’s Don Rudish. Within the virtual machine, one can then start up an operating system kernel, so “at some point you have this little world inside the virtual machine that looks just like a full machine, running a full operating system, browsers and other software, but it’s all contained within the real machine.”
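
As an illustration of what that looks like in practice, the sketch below boots a single Linux kernel inside a VM from a script, using the open-source QEMU/KVM hypervisor. It is a minimal sketch only: the article does not say which hypervisor, kernel build, or configuration Sandia used, and the file paths here are placeholders.

```python
# Minimal sketch (not Sandia's actual tooling): boot one Linux kernel in a VM
# with QEMU/KVM. The kernel and initrd paths are placeholders.
import subprocess

def boot_vm(kernel="bzImage", initrd="initrd.img", mem_mb=256):
    """Start a single VM that boots the given kernel on a serial console."""
    cmd = [
        "qemu-system-x86_64",
        "-enable-kvm",               # use hardware-assisted virtualization if available
        "-m", str(mem_mb),           # memory given to the guest, in MB
        "-nographic",                # no display; the guest's serial console goes to stdout
        "-kernel", kernel,           # boot this kernel image directly
        "-initrd", initrd,           # small initial ramdisk providing a minimal userland
        "-append", "console=ttyS0",  # route kernel messages to the serial port
    ]
    return subprocess.Popen(cmd)

if __name__ == "__main__":
    vm = boot_vm()
    vm.wait()
```

Each such guest behaves like the “little world” Rudish describes: a complete machine, contained within the real one.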

The Sandia research, two years in the making, was funded by the Department of Energy’s Office of Science, the National Nuclear Security Administration’s (NNSA) Advanced Simulation and Computing (ASC) program and by internal Sandia funding.

To complete the project, Sandia utilized its Albuquerque-based 4,480-node Dell high-performance computer cluster, known as Thunderbird. To arrive at the one million kernel figure, Sandia’s researchers ran 250 VMs, each booting its own kernel, on every one of Thunderbird’s 4,480 physical machines, for roughly 1.12 million kernels in all. Dell and IBM both made key technical contributions to the experiments, as did a team at Sandia’s Albuquerque site that maintains Thunderbird and prepared it for the project.
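
As a back-of-the-envelope check on that figure, 250 VMs per node across 4,480 nodes comes to 1,120,000 kernels. The loop below is a hypothetical per-node launcher in the same spirit; Sandia’s actual provisioning and job-launch machinery is not described in the article, so the QEMU invocation and memory size are assumptions for illustration.

```python
# Hypothetical per-node launcher: a copy of this would run on each of the
# cluster's physical nodes, starting VMS_PER_NODE lightweight guests locally.
# It sketches the fan-out and the arithmetic, not Sandia's actual scripts.
import subprocess

VMS_PER_NODE = 250
PHYSICAL_NODES = 4480
print(f"total kernels: {VMS_PER_NODE * PHYSICAL_NODES:,}")  # 1,120,000

def launch_local_vms(n=VMS_PER_NODE, mem_mb=128):
    """Start n small VMs on this node and return their process handles."""
    procs = []
    for _ in range(n):
        procs.append(subprocess.Popen([
            "qemu-system-x86_64", "-enable-kvm",
            "-m", str(mem_mb),         # keep each guest small so hundreds fit per node
            "-nographic",
            "-kernel", "bzImage",      # placeholder kernel image
            "-append", "console=ttyS0",
        ]))
    return procs
```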

The capability to run a high number of operating system instances inside of virtual machines on a high performance computing (HPC) cluster can also be used to model even larger HPC machines with millions to tens of millions of nodes that will be developed in the future, said Minnich. The successful Sandia demonstration, he asserts, means that development of operating systems, configuration and management tools, and even software for scientific computation can begin now before the hardware technology to build such machines is mature.

“Development of this software will take years, and the scientific community cannot afford to wait to begin the process until the hardware is ready,” said Minnich. “Urgent problems such as modeling climate change, developing new medicines, and research into more efficient production of energy demand ever-increasing computational resources. Furthermore, virtualization will play an increasingly important role in the deployment of large-scale systems, enabling multiple operating systems on a single platform and application-specific operating systems.”

Sandia’s researchers plan to take their newfound capability to the next level.

“It has been estimated that we will need 100 million CPUs (central processing units) by 2018 in order to build a computer that will run at the speeds we want,” said Minnich. “This approach we’ve demonstrated is a good way to get us started on finding ways to program a machine with that many CPUs.” Continued research, he said, will help to come up with ways to manage and control such vast quantities, “so that when we have a computer with 100 million CPUs we can actually use it.”

Provided by Sandia National Laboratories

User comments (8)

PieRSquare
Sep 26, 2009
> And, of course, it would be 10 million if the Linux kernel wasn't so bloated.

And about 14 if you used Vista...
swehner
Sep 26, 2009
Wouldn't it be more natural to boot 1 million Windows PCs, since they are more likely to have these viruses in the first place? Then again, how much would it cost?

On the other hand, it is not clear why full VMs are needed. What can they simulate with VMs that they cannot with 1 million separately executing but communicating processes?

Stephan
DGBEACH
Sep 27, 2009
You're right, swehner; in fact, many people consider Windows as BEING the virus! :)
Alexa
Sep 28, 2009
Whereas many more people are using them as their favorite OS. Torvalds calls Linux "bloated" and "scary":
http://industry.b...ry/12411

Smellyhat
Sep 28, 2009
> And, of course, it would be 10 million if the Linux kernel wasn't so bloated.

> And about 14 if you used Vista...

It would appear that most people didn't get the joke. Torvalds complained recently about how 'bloated' the Linux kernel was getting. It is, of course, nothing of the sort.
TechMasterGenius
Sep 29, 2009
It appears that we reached the end of "Moore's Law" (processing power doubling roughly every two years) around 2002, and now we are moving into greater clusters of multiprocessors and Virtualization. One problem they will encounter (with regards to Botnets and other Malware) is that "Virtual" Systems have many problems and are at a distinct disadvantage when engaging with certain malware agents, and can themselves become a giant "Zombie". Unless extra special care is given to "RootKit" penetration, prevention and analysis, this giant mass of processors (and especially "Virtual" Systems) could very likely become the "Prey" instead of the "Hunter".
Smellyhat
Sep 29, 2009
@TechMasterGenius: You forgot to put quotes around "Virtualization," "Botnets" and "Malware."
docknowledge
Sep 30, 2009
Well, ahem, maybe Windows would be more apt, but how big a license fee would they need to pay Microsoft? Can you imagine phoning their marketing department for a discount?

The point of having the million computers would certainly be the autonomy of each unit. I.e., some would be infected, some would not. Some would be infected by one virus, some by another, some first by one, then the other, etc. They might have to go in and physically disconnect devices that refused to respond (just like in the real world).