All-Weather Landing: New Radar To Help Aircraft Land in Low-Visibility Conditions

GTRI researchers are investigating the use of millimeter-wave imaging radars that would allow aircraft crews to generate a pilot-perspective image of a runway even in zero-visibility conditions. Credit: U.S. Department of Defense Photo

Aircraft facing low-visibility conditions have traditionally depended on ground-based navigational aids to guide them to a safe landing. Even then, limits remain on the visibility conditions under which pilots are allowed to land.

Georgia Tech Research Institute (GTRI) research engineers are investigating the use of millimeter-wave imaging radars that would allow aircraft crews to generate a pilot-perspective image of a runway area even in zero-visibility conditions and without ground support. Such a radar could be combined with other sensors to provide a sensor suite that could help aircraft land in virtually any condition.

“The Air Force wants to field an onboard system that allows aircraft to land in any type of weather condition, whether it be rain, fog, snow, a dust storm, day or night,” says Byron Keel, a research scientist with GTRI’s Sensors and Electromagnetic Applications Laboratory.

Called the Autonomous Approach and Landing Capability Program, the project is directed by the Air Force Research Laboratory at Wright-Patterson Air Force Base for the Air Mobility Command, and is funded by the U.S. Transportation Command. GTRI is working collaboratively with BAE Systems, MMCOM Inc., Goleta Engineering, and the Air Force Research Laboratory.

The U.S. Air Force is interested in autonomous-landing technology for several reasons. In Europe, where U.S. forces often prepare for a deployment, dense fog conditions can prevent landings for days. Moreover, when U.S. planes land in primitive areas, they can face a range of unpredictable landing conditions.

When a radar senses a runway environment, what a layman might call distance from the airfield is measured in “range.” Width is associated with “azimuth” or “cross-range,” and height is associated with “elevation.”
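For readers who want a concrete picture of these three measurements, the short sketch below (a hypothetical illustration in Python, not part of the GTRI system) converts a single radar return given in range, azimuth, and elevation into forward, cross-range, and height offsets relative to the aircraft.

```python
import math

def radar_to_cartesian(range_m, azimuth_deg, elevation_deg):
    """Convert a radar measurement (range, azimuth, elevation) into
    forward/right/up offsets relative to the aircraft's antenna.
    Angles are in degrees; range is in meters."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    forward = range_m * math.cos(el) * math.cos(az)  # distance down the approach path
    right   = range_m * math.cos(el) * math.sin(az)  # cross-range offset
    up      = range_m * math.sin(el)                 # height relative to the antenna boresight
    return forward, right, up

# Example: an object 2,000 m out, 3 degrees right of boresight, 1 degree up
print(radar_to_cartesian(2000.0, 3.0, 1.0))
```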

About two years ago, GTRI began looking for radar systems with the potential to support low-visibility landings. As part of that process, researchers identified BAE Systems Inc. as having an experimental two-dimensional system developed in the 1990s. It measured azimuth and range using millimeter-wave technology at 94 GHz, a frequency at which radar can see effectively through fog and dust.

The 2D system, however, does not measure elevation, a potential shortcoming. Pilots need accurate elevation measurements that represent elevated structures on or near the approach path, such as towers, buildings, or trees. Instead, the 2D system assumes that all objects lie on a flat earth, and derives a pseudo elevation based on range and aircraft height above the airfield.
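The article does not spell out the 2D system’s exact math, but the flat-earth assumption it describes can be sketched roughly as follows: if every return is assumed to lie on the ground, the angle below the horizon follows directly from the measured range and the aircraft’s height above the airfield. The Python below is a hypothetical illustration of that assumption, not BAE Systems’ algorithm.

```python
import math

def flat_earth_depression_angle(slant_range_m, aircraft_height_m):
    """Pseudo elevation under the flat-earth assumption: every radar return
    is assumed to sit on the ground, so the depression angle is derived
    purely from slant range and aircraft height above the airfield."""
    if slant_range_m < aircraft_height_m:
        raise ValueError("Slant range cannot be shorter than aircraft height")
    return math.degrees(math.asin(aircraft_height_m / slant_range_m))

# Whether the return at 3,000 m is flat pavement or a tall structure,
# the 2D system assigns it the same flat-earth angle:
print(flat_earth_depression_angle(3000.0, 300.0))  # ~5.7 degrees below the horizon
```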

“If a pilot is coming in, it’s hard for him to tell if there’s a building in front of him,” Keel says. “He has no real height information because that building in a two-dimensional system is projected onto the ground.”

In trying to measure both azimuth and elevation, researchers face the problem that an aircraft has a limited area in which to place an antenna.

A radar’s angular (i.e., azimuth or elevation) resolution is dependent on antenna size. Existing C-130 and C-17 transport aircraft have sufficient area to support the antenna’s horizontal dimension but are significantly limited in the vertical dimension. Even if sufficient area were available, scanning rate requirements limit a true pencil-beam approach.
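That trade-off can be illustrated with the standard rule of thumb that beamwidth scales as wavelength divided by antenna size; the aperture dimensions in this sketch are hypothetical, not the program’s actual antenna measurements.

```python
import math

# Rough beamwidth estimate: theta (radians) ~ wavelength / aperture size.
C = 3.0e8            # speed of light, m/s
FREQ_HZ = 94.0e9     # 94 GHz millimeter-wave frequency cited in the article
wavelength = C / FREQ_HZ  # ~3.2 mm

for aperture_m in (0.9, 0.15):  # hypothetical horizontal vs. vertical apertures
    beamwidth_deg = math.degrees(wavelength / aperture_m)
    print(f"{aperture_m:.2f} m aperture -> ~{beamwidth_deg:.2f} deg beamwidth")

# The much shorter vertical dimension yields a much coarser elevation beam,
# which is why a direct pencil-beam elevation measurement is difficult.
```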

To support elevation measurements, BAE Systems has developed a new approach based on an interferometer. They modified their experimental 2D radar system, which had one transmit channel and one receive channel, converting the single receive channel into two receive channels.

A radar measures range by sending out a signal and measuring the time it takes for that signal to return from objects that it hits. The interferometer measures how long it takes a signal to return to both receiver channels. By comparing the difference between the two return times, the interferometer can estimate the elevation angle to objects in the runway area and along the glide slope.
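Here is a minimal sketch of both ideas, assuming a simple two-channel interferometer with a vertical baseline (the comparison the article describes in terms of return times is, in practice, usually measured as a phase difference between the two channels); the baseline length and delay values are hypothetical.

```python
import math

C = 3.0e8           # speed of light, m/s
FREQ_HZ = 94.0e9    # 94 GHz operating frequency
WAVELENGTH = C / FREQ_HZ

def range_from_round_trip(delay_s):
    """Range from round-trip time of flight: R = c * t / 2."""
    return C * delay_s / 2.0

def elevation_from_phase(phase_diff_rad, baseline_m):
    """Estimate the elevation angle from the measured phase difference between
    two vertically separated receive channels:
    delta_phi = 2 * pi * d * sin(theta) / lambda."""
    sin_theta = phase_diff_rad * WAVELENGTH / (2.0 * math.pi * baseline_m)
    return math.degrees(math.asin(sin_theta))

# Example: a 20 microsecond round trip and a 10 cm vertical baseline
print(range_from_round_trip(20.0e-6))   # ~3,000 m to the object
print(elevation_from_phase(1.0, 0.10))  # a small positive elevation angle
```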

GTRI has supported the Autonomous Approach and Landing Capability Program with extensive pretest analysis and test planning of BAE Systems’ new 3D hardware. Keel took part in non-flight testing of the new hardware at Wright-Patterson in the winter and spring of 2005.

Initial test results were encouraging, Keel says. Still, he adds, researchers are busy enhancing the system with modifications to both the hardware and image processing algorithms. Flight tests of the radar’s effectiveness in low-visibility landings are planned for the latter part of 2006.

In addition to a radar system, Keel says, a full-fledged Autonomous Approach and Landing system might include a forward-looking infrared (FLIR) system; light detection and ranging (LIDAR), a form of laser radar; and perhaps even a radiometer, which could measure the temperature of ground objects.

“It’s really a suite of sensors that is being looked at,” Keel says. “There’s a larger program that the three-dimensional millimeter-wave radar system is feeding into.”

The program is also considering whether synthetic aperture radar (SAR) could be useful to pilots landing in poor or zero visibility conditions. SAR is a method of generating high-resolution ground images and has already been used in such applications as earth-mapping and environmental monitoring.

Since radar resolution is limited by the small size of aircraft-based antennas, SAR gets around the size problem by generating a synthetic aperture that functions like a large real-beam antenna.

An aircraft using SAR moves sideways or at an angle to the area it is imaging – unlike a real-beam radar, which is used during a straight-on approach. By moving at an angle with respect to the scene, a SAR gathers many image slices and assembles them into a high-resolution image – almost as if it were using a physically long antenna. Since this antenna-like effect yields high resolution, it could be used by approaching aircraft to make a detailed airfield image prior to landing.
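As a rough, hypothetical illustration of why the synthetic aperture matters, the comparison below applies common approximations for real-beam and SAR cross-range resolution at the 94 GHz frequency mentioned earlier; the range and aperture lengths are illustrative only.

```python
# Cross-range resolution comparison: real aperture vs. synthetic aperture.
# Real beam:  resolution ~ range * wavelength / antenna_length
# SAR:        resolution ~ range * wavelength / (2 * synthetic_aperture_length)
# (the factor of 2 reflects the two-way signal path)
C = 3.0e8
FREQ_HZ = 94.0e9
WAVELENGTH = C / FREQ_HZ

range_m = 5000.0               # hypothetical standoff range to the airfield
real_aperture_m = 0.9          # hypothetical physical antenna length
synthetic_aperture_m = 100.0   # hypothetical flight path used to synthesize the aperture

real_res = range_m * WAVELENGTH / real_aperture_m
sar_res = range_m * WAVELENGTH / (2.0 * synthetic_aperture_m)
print(f"Real-beam cross-range resolution: ~{real_res:.1f} m")
print(f"SAR cross-range resolution:       ~{sar_res:.2f} m")
```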

“A SAR produces an image with fine resolution in both range and cross-range for the purpose of identifying a particular target – or in the case of a landing field, identifying debris or other objects on the runway that may pose a threat to a safe landing,” Keel explains.

Source: Georgia Institute of Technology, by Rick Robinson

