Nearly everyone is familiar with terms like 20/20 vision – but what does that mean when it comes to developing a flight simulator?
Most flight simulators only offer 20/40 vision – meaning that important details are often blurred or not easily identifiable to the pilot. Yet most young military pilots starting their flying careers have 20/13 eyesight, which allows them to see far more detail than most visual systems provide.
Over the past year at NASA Ames, our small team has been developing an "eye-limited" visual system for a flight simulator being co-developed with the United States Air Force (USAF) Research Labs. As a result, for the first time the USAF can explore the various elements of vision within a flight simulator where the acuity of the displayed images is 20/10.
To create this simulator we used nine 4K (4096×2160) projectors, which together provide more than 36 times the resolution of a standard high-definition television. The images are projected on a dome, with each projector overlapped and edge-blended, providing the pilot a seamless view of the simulated outside world. While we work with nine projectors today, we plan to scale up to 15 projectors in early 2013, increasing the overall resolution of the simulation.
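The "more than 36 times" figure is easy to verify with a quick back-of-the-envelope calculation (assuming a standard 1920×1080 HDTV as the baseline):

```python
# Back-of-the-envelope check of the resolution figures above.
# Assumptions: nine 4K (4096x2160) projectors, 1080p HDTV (1920x1080).

PROJECTOR_W, PROJECTOR_H = 4096, 2160
HDTV_W, HDTV_H = 1920, 1080
NUM_PROJECTORS = 9

pixels_per_projector = PROJECTOR_W * PROJECTOR_H       # 8,847,360 pixels
total_pixels = NUM_PROJECTORS * pixels_per_projector   # 79,626,240 pixels
hdtv_pixels = HDTV_W * HDTV_H                          # 2,073,600 pixels

print(total_pixels / hdtv_pixels)  # ~38.4, i.e. "more than 36 times" HDTV
```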
From 36 Computers To Just 5
An Image Generator (IG) is the name given to the collection of computers, graphics subsystems, visual databases, control and rendering software operating together to graphically simulate synthetic worlds in real-time. Traditionally, a large cluster of computers would be used to drive a configuration like ours. If we configured the USAF simulator with a traditional IG architecture, maintaining a 60 frames-per-second update rate would require 36 computers, each rendering a small portion of the screen. These clusters are large, complex and can be difficult to manage.
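As a rough illustration of why a traditional IG cluster grows so large: if each render node can sustain about a quarter of a 4K frame at 60 fps (a per-node figure assumed here for illustration, since the article does not give node specs), the cluster size follows directly:

```python
import math

# Hypothetical sizing exercise for a traditional IG cluster. The
# per-node throughput is an assumption chosen for illustration, not a
# published spec: each node renders one quarter of a 4K frame at 60 fps.

num_projectors = 9
pixels_per_projector = 4096 * 2160                     # one 4K frame
per_node_pixels_at_60fps = pixels_per_projector // 4   # assumed node budget

nodes = math.ceil(num_projectors * pixels_per_projector
                  / per_node_pixels_at_60fps)
print(nodes)  # 36: four nodes per projector, matching the article's figure
```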
For this project we used NVIDIA's Quadro SVS technology and reduced the overall complexity down to five computers, each using multiple Quadro GPUs and Quadro Sync cards to drive one or two 4K dome projectors. To reduce latency, cost and complexity even further, all the warp and blending between projectors is done in software using the NVIDIA GPUs.
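To give a feel for what edge blending does (this is a toy illustration, not the NASA/NVIDIA implementation): where two projectors overlap, each projector's pixel is weighted by a ramp so the two contributions always sum to one, hiding the seam.

```python
# Toy illustration of projector edge blending (not the actual
# NASA/NVIDIA software): inside an overlap region, each projector's
# pixel is weighted by a linear ramp so the contributions sum to 1.0.

def blend_weights(x, overlap_start, overlap_end):
    """Return (left_weight, right_weight) for horizontal position x."""
    if x <= overlap_start:
        return 1.0, 0.0          # only the left projector covers x
    if x >= overlap_end:
        return 0.0, 1.0          # only the right projector covers x
    t = (x - overlap_start) / (overlap_end - overlap_start)
    return 1.0 - t, t            # linear cross-fade inside the overlap

left, right = blend_weights(0.5, 0.4, 0.6)
print(left + right)  # 1.0 everywhere, so the seam stays invisible
```

Production systems apply this per-pixel on the GPU, typically with gamma-corrected rather than linear ramps, and combine it with geometric warping to match the dome's curvature.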
Another significant challenge that comes with scaling up the projector resolution to be eye-limited is that the visual database needs to be eye-limited as well. The visual database in our system is made up of very high resolution satellite images which give the pilot an out-the-window view of the world. For this project we used ultra-high resolution 4K by 2K satellite images, often eight times or more the resolution of textures used in traditional flight simulations, ensuring the image database matches the display resolution where possible.
More Pixels Than People
The visual system still needs to draw a new scene every 1/60th of a second. So, we use the largest frame buffer memory available on NVIDIA Quadro cards to store the entire visual database in use on the graphics card. With all the image data on the GPU, the IG doesn't need to page it to and from system memory on the CPU, an operation that can cause the rendering to stutter.
The result: the image generator NASA developed for the USAF draws more pixels every two seconds than there are people on the planet.
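That claim checks out arithmetically, assuming nine 4096×2160 projectors refreshing at 60 Hz and a 2012 world population of roughly 7 billion:

```python
# Sanity check of the "more pixels every two seconds than people on the
# planet" claim: nine 4096x2160 projectors at 60 Hz over two seconds,
# compared against a ~7 billion (2012) world population.

pixels_per_frame = 9 * 4096 * 2160   # ~79.6 million pixels per frame
frames_per_second = 60
seconds = 2

pixels_drawn = pixels_per_frame * frames_per_second * seconds
print(pixels_drawn)                  # 9,555,148,800
print(pixels_drawn > 7_000_000_000)  # True: comfortably above ~7 billion
```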
You can see some of NASA's work for this simulator at the I/ITSEC tradeshow in Orlando this week. I/ITSEC is the world's largest tradeshow for the visual simulation industry. NASA is demonstrating this technology with NVIDIA, Sony and VDC Display Systems, showing this state-of-the-art visual simulator driving a pair of new Sony 4K projectors from a PC with four Quadro K5000 cards and a Quadro Sync card. You can see a sample of this software and hardware running at VDC's booth, number 1070.