New algorithm to improve video game quality

Research presented in a paper by Morgan McGuire, assistant professor of computer science at Williams College, and co-author Dr. David Luebke of NVIDIA, introduces a new algorithm to improve computer graphics for video games.

McGuire and Luebke have developed a new method for simulating lighting and light sources that will allow video game graphics to approach film quality.

Their paper "Hardware-Accelerated Global Illumination by Image Space Photon Mapping" won a Best Paper award at the 2009 Conference on High Performance Graphics.

Because video games must compute images far more quickly than movies (a game has roughly 16 milliseconds per frame to sustain 60 frames per second, while a single film frame can be rendered offline over hours), developers have had to trade image quality for speed.

Producing light effects traditionally involves pushing light from its sources into the 3D world and then pulling it back to the pixels of the final image. The method created by McGuire and Luebke reverses the process so that light is pulled onto the world and then pushed into the image, which is faster on graphics hardware.
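To make the push/pull distinction concrete, here is a minimal one-dimensional Python sketch (a toy illustration only; the setup and names are invented here and bear no relation to the authors' GPU implementation). The "pull" loop has every pixel search the entire photon list, while the "push" loop has every photon write only to the few pixels it can affect; both produce the same image:

    import random

    WIDTH = 8        # a toy 1-D "image" of 8 pixels
    RADIUS = 1.0     # how far a photon's energy spreads
    random.seed(42)

    # Photon tracing: push light into the (1-D) world as (position, power) pairs.
    photons = [(random.uniform(0, WIDTH), 0.1) for _ in range(100)]

    # Pull (classic final gather): every pixel scans every photon.
    gathered = [0.0] * WIDTH
    for x in range(WIDTH):
        for pos, power in photons:
            if abs(pos - (x + 0.5)) <= RADIUS:
                gathered[x] += power

    # Push (image-space splatting): every photon touches only nearby pixels.
    splatted = [0.0] * WIDTH
    for pos, power in photons:
        for x in range(max(0, int(pos - RADIUS)), min(WIDTH, int(pos + RADIUS) + 1)):
            if abs(pos - (x + 0.5)) <= RADIUS:
                splatted[x] += power

    # Same image either way, but the push version scales with
    # photons x splat size instead of pixels x photons.
    assert all(abs(g - s) < 1e-9 for g, s in zip(gathered, splatted))
    print([round(v, 2) for v in gathered])

Roughly speaking, the push formulation also maps naturally onto rasterization hardware, which is built to scatter many small primitives into an image in parallel.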

As video games continue to increase the degree of interactivity, graphics processors are expected to become 500 times faster than they are now. McGuire and Luebke's algorithm is well suited to those faster processors and is expected to be featured in video games within the next two years.

McGuire is the author of "Creating Games: Mechanics, Content, and Technology," co-chair of the ACM SIGGRAPH Symposium on Non-Photorealistic Animation and Rendering, and a past chair of the ACM Symposium on Interactive 3D Graphics and Games.

He has worked on and consulted for commercial video games such as "Marvel Ultimate Alliance" (2009), "Titan Quest" (2006), and "ROBLOX" (2005).



Provided by Williams College

User comments

Feb 19, 2010
In 10 years, even this feat will look like a small stepping stone... Too bad this technology can't be out sooner.

Feb 19, 2010
"As video games continue to increase the degree of interactivity, graphics processors are expected to become 500 times faster than they are now."

What a useless sentence. It doesn't give a timespan. I could just as well say GPUs will get 1,000 times faster than they are now. It will happen; the question is when.

Feb 19, 2010
Well, 500x in around 15 years, 1000x in 17 years, according to Moore. :D

Feb 21, 2010
1000x in around 10 years, 128,000x in 17 years, according to historical observation.

I also want to emphasize the stupidity of this article, as cybrbeast describes. Also, it's not a new algorithm; it's an existing algorithm running on a GPU.
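Both sets of figures above are simple exponential-growth projections. As a quick sanity check in Python (a sketch only; the doubling periods below are inferred from the commenters' numbers, not stated anywhere in the article):

    import math

    def years_for_speedup(speedup, doubling_period_years):
        # time needed = doubling period * log2(target speedup)
        return doubling_period_years * math.log2(speedup)

    # Doubling every ~20 months, close to Moore's original cadence:
    print(years_for_speedup(500, 20 / 12))    # ~14.9 years
    print(years_for_speedup(1000, 20 / 12))   # ~16.6 years

    # Doubling every ~12 months, the faster "historical observation" for GPUs:
    print(years_for_speedup(1000, 1.0))       # ~10.0 years
    print(2 ** 17)                            # 131072, i.e. ~128,000x in 17 years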
