PCs around the world unite to map the Milky Way

Feb 10, 2010
In the constellation Ophiuchus resides NGC 6384, a spiral galaxy with a central bar structure and a possible central ring. Because NGC 6384 is nearly in line with the plane of our galaxy, all the stars in the image are foreground stars in our Milky Way. Photo Credit: Sloan Digital Sky Survey

(PhysOrg.com) -- At this very moment, tens of thousands of home computers around the world are quietly working together to solve the largest and most basic mysteries of our galaxy.

Enthusiastic and inquisitive volunteers from Africa to Australia are donating the computing power of everything from decade-old desktops to sleek new netbooks to help computer scientists and astronomers at Rensselaer Polytechnic Institute map the shape of our galaxy. Just this month, the collective computing power of these humble home machines surpassed one petaflop, a speed greater than that of the world's second fastest supercomputer.

The project, MilkyWay@Home, uses the Berkeley Open Infrastructure for Network Computing (BOINC) platform, which is widely known for SETI@home, the project used to search for signs of extraterrestrial life. Today, MilkyWay@Home has outgrown even that famous project in terms of speed, making it the fastest computing project on the BOINC platform and perhaps the second fastest public distributed computing program ever in operation, behind only Folding@home.

The interdisciplinary team behind MilkyWay@Home, which ranges from professors to undergraduates, began formal development on the BOINC platform in July 2006 and worked tirelessly to grow a volunteer base, and with it the project's computational power, from the ground up.

Each user participating in the project signs up their computer and offers up a percentage of the machine's operating power to be dedicated to calculations for the project. For MilkyWay@Home, this means each computer is using data gathered about a very small section of the galaxy to map its shape, density, and movement.
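The work-unit cycle described above can be sketched roughly as follows. This is a hypothetical illustration, not MilkyWay@Home's actual code: the `Server` class, `make_work_units`, and the wedge-parameter layout are all invented for the example, standing in for BOINC's real scheduler.

```python
import queue

def make_work_units(n_wedges):
    """Split a strip of sky into small wedges, one work unit per wedge.
    The 2.5-degree wedge width is an illustrative assumption."""
    return [{"wedge": i, "mu_min": i * 2.5, "mu_max": (i + 1) * 2.5}
            for i in range(n_wedges)]

class Server:
    """Hands out work units and collects whatever comes back."""
    def __init__(self, units):
        self.todo = queue.Queue()
        for unit in units:
            self.todo.put(unit)
        self.results = []

    def next_unit(self):
        return None if self.todo.empty() else self.todo.get()

    def report(self, result):
        self.results.append(result)

def volunteer(server, evaluate):
    """One volunteer machine: fetch a unit, crunch it, report back, repeat."""
    while (unit := server.next_unit()) is not None:
        server.report({"wedge": unit["wedge"], "fit": evaluate(unit)})

server = Server(make_work_units(8))
# A stand-in "model fit" function; the real project fits star-stream models.
volunteer(server, evaluate=lambda u: (u["mu_min"] + u["mu_max"]) / 2)
print(len(server.results))
```

In the real system many volunteers pull from the queue concurrently, which is why `queue.Queue` (a thread-safe FIFO) is a natural fit for the sketch.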

In particular, computers donating processing power to MilkyWay@Home are looking at how the different dwarf galaxies that make up the larger Milky Way galaxy have been moved and stretched following their merger with the larger galaxy millions of years ago. This is done by studying each dwarf's stellar stream. Their calculations are providing new details on the overall shape and density of dark matter in the Milky Way, which remains largely unknown.

The galactic computing project had very humble beginnings, according to Heidi Newberg, associate professor of physics, applied physics, and astronomy at Rensselaer. Her research to map the three-dimensional distribution of stars and matter in the Milky Way, using data from the extensive Sloan Digital Sky Survey, could not find the best-fitting model for even a small section of a single galactic star stream in any reasonable amount of time.

"I was a researcher sitting in my office with a very big computational problem to solve and very little personal computational power or time at my fingertips," Newberg said. "Working with the MilkyWay@Home platform, I now have the opportunity to use a massive computational resource that I simply could not have as a single faculty researcher, working on a single research problem."

Before taking the research to BOINC, Newberg worked with Malik Magdon-Ismail, associate professor of computer science, to create a stronger and faster algorithm for her project. Together they greatly increased the computational efficiency and set the groundwork for what would become the much larger MilkyWay@Home project.

"Scientists always need additional computing power," Newberg said. "The massive amounts of data out there make it so that no amount of computational power is ever enough." Her work quickly exceeded the limits of laboratory computers, and the collaboration to create MilkyWay@Home formally began in 2006 with the assistance of Boleslaw Szymanski, the Claire and Roland Schmitt Distinguished Professor of Computer Science; Carlos Varela, associate professor of computer science; Travis Desell, postdoctoral research assistant; and other graduate and undergraduate students at Rensselaer.

This extensive collaboration has advanced the astrophysical goals of the project by leaps and bounds, but important discoveries have also been made along the way in computational science: new algorithms that make the extremely distributed and diverse MilkyWay@Home system work so well, even with volunteered computers that can be highly unreliable.

"When you use a supercomputer, all the processors are the same and in the same location, so they are producing the same results at the same time," Varela said. "With an extremely distributed system, like we have with MilkyWay@Home, we are working with many different operating systems that are located all over the globe. To work with such asynchronous results we developed entirely new algorithms to process work as it arrives in the system." This makes data from even the slowest of computers still useful to the project, according to Varela. "Even the slowest computer can help if it is working on the correct problem in the search."
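The idea of processing results as they arrive, so that even a slow machine's contribution can still improve the search, can be sketched as a simple asynchronous optimization loop. This is an invented illustration, not the team's published algorithm: the population size, the Gaussian proposal step, and the stand-in objective are all assumptions made for the example.

```python
import random

POP_SIZE = 5  # keep only the best few candidate parameter sets

def insert_result(population, params, fitness):
    """Fold a returned evaluation into the shared population. Arrival
    order doesn't matter: an old result from a slow computer still helps
    if its parameters score well (lower fitness = better fit)."""
    population.append((fitness, params))
    population.sort(key=lambda pair: pair[0])
    del population[POP_SIZE:]

def propose(population, rng):
    """New work units are generated near the current best candidate, so
    fresh work always reflects the latest state of the search."""
    if not population:
        return rng.uniform(-1, 1)
    _, best = population[0]
    return best + rng.gauss(0, 0.1)

rng = random.Random(0)
population = []
# Simulate 200 results arriving from machines of very different speeds.
for _ in range(200):
    params = propose(population, rng)
    fitness = (params - 0.5) ** 2  # stand-in objective: minimum at 0.5
    insert_result(population, params, fitness)

best_fitness, best_params = population[0]
print(round(best_params, 3))
```

Because `insert_result` is indifferent to when a result was computed, the loop degrades gracefully with stragglers, which mirrors the property Varela describes.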

In total, nine articles have been published and multiple public talks have been given regarding the computer science discoveries made during the creation of the project, and many more are expected as the refined algorithms are utilized for other scientific problems. Collaboration has already begun to develop a DNA@Home platform to find gene regulation sites on human DNA. Collaborations have also started with biophysicists and chemists on two other BOINC projects at Rensselaer to understand protein folding and to design new drugs and materials.

In addition to important discoveries in computer science and astronomy, the researchers said the project is also making important strides in efforts to include the public in scientific discovery. Since the project began, more than 45,000 individual users from 169 countries have donated computational power to the effort. Currently, approximately 17,000 users are active in the system.

"This is truly public science," said Desell, who began working on the project as a graduate student and has seen the project through its entire evolution. "This is a really unique opportunity to get people interested in science while also allowing us to create a strong computing resource for Rensselaer research." All of the research, results, data, and even source code are made public and regularly updated for volunteers on the main MilkyWay@Home Web site found at: http://MilkyWay.cs.rpi.edu/.

Desell cites the public nature and regular communication as important components of the project's success. "They are not just sitting back and allowing the computer to do the work," he says, noting that volunteers have donated money for equipment and even made their own improvements to the underlying algorithms, greatly increasing computational speed. Varela jokes, "We may end up with a paper with 17,000 authors."


User comments (8)


5 / 5 (2) Feb 10, 2010
I'm one of the donors, and one of the users in the community always asking for news and updates from the scientists behind the projects.
Anyway, this is a great article, but it doesn't stress the merit of the community enough. If we had not been so active, MilkyWay wouldn't be the second fastest distributed computing project.
Nowhere in the article is the great work by Andreas Przystawik (PhD), aka Gipsel/Cluster Physik in the community, mentioned. Without his work, GPGPU on MilkyWay would be a dream and 1 petaflop would be far, far away, and so would the research at Rensselaer.
Sorry for my poor English, btw. Keep crunching, and learn about BOINC if you don't already.
not rated yet Feb 11, 2010
They should get on PS3. Folding@Home hit a petaflop in 2007 because of the PS3's help.
not rated yet Feb 12, 2010
I'm one of the donors and one of the user in the community always asking for some news and updates from scientists behind the projects.
That's nice, but I doubt such distributed computing is economical in terms of electric energy consumption compared to supercomputers (especially when considering the energy consumption of WAN infrastructure).
not rated yet Feb 12, 2010
That's nice, but I doubt such distributed computing is economical in terms of electric energy consumption.

Well, it's economical if you consider that your PC might be powered on anyway. Of course, if it's idle it draws less power, but the power added because of BOINC is maybe equal to the power a supercomputer would use to do the same work, and that's without considering that supercomputers need cooling systems and, first of all, need to be bought. And that means you have to spend energy to build them from silicon...
It's so difficult to really understand it well! I like BOINC because, first of all, it tries to bring some hint of science to ordinary people. I'm studying physics, so maybe I'm already into this type of talk, but I find it really nice that so many people are involved!

To whoever says "port it to the PS3": on your PC you can use your GPU, which will "destroy" the performance of many PS3 Cell Broadband Engines, not to mention that Sony removed Linux from the Slims.
not rated yet Feb 12, 2010

Folding doesn't use Linux; it's a standard feature on every PS3. And they could customize for the Cell instead of the many different GPUs being used. The customization, plus the many PS3 users, should give a nice upgrade.
not rated yet Feb 13, 2010
I know that Folding doesn't use Linux!
BOINC on PS3 exists (now abandonware) but requires Linux installed (Sony wants royalties for native apps and you have to buy their SDK; only Folding got it free because it was the most important project at that moment). With the restyled PS3 Slim, Sony removed the ability to install Linux (because of the new hypervisor) and so cut out BOINC entirely, without ever pre-announcing it! It was not a fair move, most of all toward the many developers who worked on porting BOINC and its apps to the Cell...
(end of my ranting)
not rated yet Feb 13, 2010
Also, you do not have to customize for many different GPUs. ATI has Brook+/CAL and Nvidia has CUDA.
"Tomorrow" both will use OpenCL (it exists, but it is still too new to develop on safely and effectively).

Also, consider that the new ATI HD 5870 has 2 teraflops of power... one of those cards equals many, many CPUs for these types of calculations.
not rated yet Feb 13, 2010
I was wondering when someone might mention GPGPUs in general. Nvidia's Tesla system has been getting some press lately. According to Wikipedia:

"As of January 2009, the Tesla computer platform delivers the best performance among all stand-alone systems being used in distributed computing projects like Seti@home, Spinhenge@home or Folding@home."

Tesla Personal Supercomputer: http://en.wikiped...computer
