New algorithm to improve video game quality

Feb 18, 2010

Research presented in a paper by Morgan McGuire, assistant professor of computer science at Williams College, and co-author Dr. David Luebke of NVIDIA, introduces a new algorithm to improve computer graphics for video games.

McGuire and Luebke have developed a new method for computing lighting and light sources that will allow video game graphics to approach film quality.

Their paper "Hardware-Accelerated Global Illumination by Image Space Photon Mapping" won a Best Paper award at the 2009 Conference on High Performance Graphics.

Because video games must compute images far more quickly than movies, developers have struggled to maximize graphics quality.

Producing lighting effects essentially involves pushing light into the 3D world and then pulling it back to the pixels of the final image. The method created by McGuire and Luebke reverses the process, so that light is pulled from the world and pushed onto the pixels of the image, which is faster.
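The push/pull reversal described above can be illustrated with a toy sketch (this is not the paper's algorithm, just a minimal illustration of the two data flows on a made-up 1-D "image"): a classic gather pass loops over pixels and pulls in nearby photon energy, while a splat pass loops over photons and pushes each one's energy onto the pixels it covers — one pass over photons, which maps well to GPU rasterization hardware.

```python
import random

# Toy sketch (hypothetical, not from McGuire & Luebke's paper): contrast
# gather vs. splat data flow on a 1-D image of pixels lit by photons
# landing at random positions along a line segment.

WIDTH = 8          # pixels in the toy image
N_PHOTONS = 10000  # photons emitted from the light
RADIUS = 0.6       # gather/splat radius, in pixel units

random.seed(42)
hits = [random.uniform(0, WIDTH) for _ in range(N_PHOTONS)]

def gather(hits):
    """Classic flow: each pixel PULLS energy by searching for nearby hits."""
    image = [0.0] * WIDTH
    for px in range(WIDTH):
        center = px + 0.5
        image[px] = sum(1.0 for h in hits if abs(h - center) < RADIUS)
    return image

def splat(hits):
    """Reversed flow: each photon PUSHES (splats) energy onto nearby pixels."""
    image = [0.0] * WIDTH
    for h in hits:
        lo = max(0, int(h - RADIUS))          # only visit pixels the
        hi = min(WIDTH, int(h + RADIUS) + 1)  # photon could touch
        for px in range(lo, hi):
            if abs(h - (px + 0.5)) < RADIUS:
                image[px] += 1.0
    return image

# Same image either way; only the direction of the data flow differs.
assert gather(hits) == splat(hits)
```

The gather version does work proportional to pixels times photons searched; the splat version touches only the pixels each photon covers, which is the shape of work that rasterization hardware accelerates.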

As video games continue to increase the degree of interactivity, graphics processors are expected to become 500 times faster than they are now. McGuire and Luebke's algorithm is well suited to the quickened processing speed, and is expected to be featured in video games within the next two years.

McGuire is the author of "Creating Games: Mechanics, Content, and Technology" and co-chair of the ACM SIGGRAPH Symposium on Non-Photorealistic Animation and Rendering; he previously chaired the ACM Symposium on Interactive 3D Graphics and Games.

He has worked on and consulted for commercial video games such as "Marvel Ultimate Alliance" (2009), "Titan Quest" (2006), and "ROBLOX" (2005).


Provided by Williams College


User comments: 4

Scryer
Feb 19, 2010
In 10 years, even this feat will look like a small stepping stone... Too bad this technology can't be out sooner.
cybrbeast
Feb 19, 2010
"As video games continue to increase the degree of interactivity, graphics processors are expected to become 500 times faster than they are now."

What a useless sentence. It doesn't give a timespan. I could also say GPUs will get a 1000 times faster than they are now. It will happen, the question is when.
degojoey
Feb 19, 2010
well 500x in around 15 years, 1000x in 17 years, according to Moore. :D
NanoStuff
Feb 21, 2010
1000x in around 10 years, 128,000x in 17 years, according to historical observation.

I also want to emphasize the stupidity of this article, as cybrbeast describes. Also, it's not a new algorithm, it is an existing algorithm running on a GPU.