New algorithm to improve video game quality

Feb 18, 2010

Research presented in a paper by Morgan McGuire, assistant professor of computer science at Williams College, and co-author Dr. David Luebke of NVIDIA, introduces a new algorithm to improve computer graphics for video games.

McGuire and Luebke have developed a new method for computing lighting and light sources that allows video game graphics to approach film quality.

Their paper "Hardware-Accelerated Global Illumination by Image Space Photon Mapping" won a Best Paper award at the 2009 Conference on High Performance Graphics.

Because video games must compute images far more quickly than movies, developers have struggled to maximize graphics quality: a game typically has only tens of milliseconds to render each frame, while a single film frame can take hours.

Producing light effects traditionally involves pushing light out into the 3D world and then pulling it back to the pixels of the final image. The method created by McGuire and Luebke reverses the process, so that light is pulled into the world and then pushed onto the pixels of the image, an ordering that runs much faster on graphics hardware.
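As a rough illustration of the two loop orderings, here is a minimal toy sketch in Python. The grid, the falloff kernel, and every name in it are invented for this example; the authors' actual technique runs on GPU rasterization hardware, not Python loops.

```python
# Toy 2-D contrast of "pull" (gather) vs. "push" (scatter) shading.
# Everything here is invented for illustration only.
import random

random.seed(0)
W, H, RADIUS = 8, 8, 2.0

# A "photon" is just a landing position and a power value.
photons = [((random.uniform(0, W), random.uniform(0, H)), 1.0)
           for _ in range(100)]

def kernel(px, py, x, y):
    """Simple falloff: a photon contributes only within RADIUS of a pixel center."""
    d2 = (px + 0.5 - x) ** 2 + (py + 0.5 - y) ** 2
    return max(0.0, 1.0 - d2 / (RADIUS * RADIUS))

def gather():
    """Pull: every pixel loops over every photon (a per-pixel search)."""
    img = [[0.0] * W for _ in range(H)]
    for py in range(H):
        for px in range(W):
            img[py][px] = sum(power * kernel(px, py, x, y)
                              for (x, y), power in photons)
    return img

def scatter():
    """Push: every photon splats its energy onto the few pixels it touches,
    the order of work that maps naturally onto GPU rasterization."""
    img = [[0.0] * W for _ in range(H)]
    for (x, y), power in photons:
        for py in range(max(0, int(y - RADIUS)), min(H, int(y + RADIUS) + 1)):
            for px in range(max(0, int(x - RADIUS)), min(W, int(x + RADIUS) + 1)):
                img[py][px] += power * kernel(px, py, x, y)
    return img

# Both orderings produce the same image; only the loop structure differs.
a, b = gather(), scatter()
assert all(abs(a[j][i] - b[j][i]) < 1e-6 for j in range(H) for i in range(W))
```

The point of the reversal is that the scatter ordering, in which each photon draws its own small footprint, is exactly the kind of work that GPU rasterization hardware is built to do quickly.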

As video games continue to increase the degree of interactivity, graphics processors are expected to become 500 times faster than they are now. McGuire and Luebke's algorithm is well suited to the quickened processing speed, and is expected to be featured in video games within the next two years.

McGuire is the author of "Creating Games: Mechanics, Content, and Technology," co-chair of the ACM SIGGRAPH Symposium on Non-Photorealistic Animation and Rendering, and a past chair of the ACM Symposium on Interactive 3D Graphics and Games.

He has worked on and consulted for commercial video games such as "Marvel Ultimate Alliance" (2009), "Titan Quest" (2006), and "ROBLOX" (2005).

Provided by Williams College

User comments: 4

Scryer
not rated yet Feb 19, 2010
In 10 years, even this feat will look like a small stepping stone... Too bad this technology can't be out sooner.
cybrbeast
not rated yet Feb 19, 2010
"As video games continue to increase the degree of interactivity, graphics processors are expected to become 500 times faster than they are now."

What a useless sentence. It doesn't give a timespan. I could also say GPUs will get a 1000 times faster than they are now. It will happen, the question is when.
degojoey
not rated yet Feb 19, 2010
well 500x in around 15 years, 1000x in 17 years, according to Moore. :D
NanoStuff
not rated yet Feb 21, 2010
1000x in around 10 years, 128,000x in 17 years, according to historical observation.

I also want to emphasize the stupidity of this article, as cybrbeast describes. Also, it's not a new algorithm, it is an existing algorithm running on a GPU.
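
The figures in the thread above follow from compound doubling: if performance doubles every T years, the speedup after n years is 2^(n/T). A quick check of the commenters' numbers, where the doubling periods are assumptions chosen to match the comments, not measurements:

```python
# Speedup after n years if performance doubles every T years: 2 ** (n / T).
# Doubling periods below are assumptions, not measurements.
def speedup(years, doubling_period):
    return 2 ** (years / doubling_period)

print(round(speedup(15, 1.7)))  # ~453x: near degojoey's "500x in around 15 years"
print(round(speedup(10, 1.0)))  # 1024x: NanoStuff's "1000x in around 10 years"
print(round(speedup(17, 1.0)))  # 131072x: close to the quoted "128,000x in 17 years"
```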
