All-Weather Landing: New Radar To Help Aircraft Land in Low-Visibility Conditions

May 04, 2006
GTRI researchers are investigating the use of millimeter-wave imaging radars that would allow aircraft crews to generate a pilot-perspective image of a runway even in zero-visibility conditions. Credit: U.S. Department of Defense Photo

Aircraft facing low-visibility conditions have traditionally depended on ground-based navigational aids to guide them to a safe landing. Even then, there are limits on the visibility conditions under which pilots are allowed to land.

Georgia Tech Research Institute (GTRI) research engineers are investigating the use of millimeter-wave imaging radars that would allow aircraft crews to generate a pilot-perspective image of a runway area even in zero-visibility conditions and without ground support. Such a radar could be combined with other sensors to provide a sensor suite that could help aircraft land in virtually any condition.

“The Air Force wants to field an onboard system that allows aircraft to land in any type of weather condition, whether it be rain, fog, snow, a dust storm, day or night,” says Byron Keel, a research scientist with GTRI’s Sensors and Electromagnetic Applications Laboratory.

Called the Autonomous Approach and Landing Capability Program, the project is directed by the Air Force Research Laboratory at Wright-Patterson Air Force Base for the Air Mobility Command, and is funded by the U.S. Transportation Command. GTRI is working collaboratively with BAE Systems, MMCOM Inc., Goleta Engineering and the Air Force Research Laboratory.

The U.S. Air Force is interested in autonomous-landing technology for several reasons. In Europe, where U.S. forces often prepare for a deployment, dense fog conditions can prevent landings for days. Moreover, when U.S. planes land in primitive areas, they can face a range of unpredictable landing conditions.

When a radar senses a runway environment, what a layman might call distance from the airfield is measured in “range.” Width is associated with “azimuth” or “cross-range,” and height is associated with “elevation.”
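
To make the three terms concrete, the short Python sketch below converts a single radar measurement of range, azimuth and elevation into down-range, cross-range and height offsets from the antenna. It uses standard spherical-to-Cartesian geometry; the function name and example numbers are illustrative and are not taken from the GTRI program.

    import math

    def radar_to_cartesian(range_m, azimuth_deg, elevation_deg):
        # Convert one radar return (range, azimuth, elevation) into Cartesian
        # offsets from the antenna: x down-range, y cross-range, z height.
        # Standard spherical-to-Cartesian geometry; names are illustrative.
        az = math.radians(azimuth_deg)
        el = math.radians(elevation_deg)
        x = range_m * math.cos(el) * math.cos(az)   # distance ahead of the aircraft
        y = range_m * math.cos(el) * math.sin(az)   # offset left or right of the approach path
        z = range_m * math.sin(el)                  # height relative to the antenna
        return x, y, z

    # A return 3,000 m out, 2 degrees right of boresight and 1 degree above it
    print(radar_to_cartesian(3000.0, 2.0, 1.0))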

About two years ago, GTRI began looking for radar systems with the potential to support low-visibility landings. As part of that process, the researchers identified BAE Systems Inc. as having an experimental two-dimensional system developed in the 1990s. It measured azimuth and range using millimeter-wave technology at 94 GHz, a frequency at which radar can see effectively through fog and dust.

The 2D system, however, does not measure elevation, a potential shortcoming. Pilots need accurate elevation measurements that represent elevated structures on or near the approach path, such as towers, buildings, or trees. Instead, the 2D system assumes that all objects lie on a flat earth, and derives a pseudo elevation based on range and aircraft height above the airfield.
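
The flat-earth assumption can be illustrated with a short, hypothetical Python sketch: every return is treated as lying on the ground, so its displayed elevation is simply the depression angle implied by slant range and aircraft height. The function and numbers below are illustrative only and are not BAE Systems' actual processing.

    import math

    def pseudo_elevation_deg(slant_range_m, aircraft_height_m):
        # Flat-earth assumption of a 2D (range/azimuth) radar: every return is
        # presumed to lie on the ground, so its apparent elevation is just the
        # depression angle implied by slant range and aircraft height.
        # Illustrative sketch only; not BAE Systems' processing.
        ratio = min(1.0, aircraft_height_m / slant_range_m)
        return -math.degrees(math.asin(ratio))

    # A 30 m tower 2,000 m ahead of an aircraft at 300 m altitude is displayed
    # as if it sat on the ground, at the same pseudo elevation as the runway:
    print(pseudo_elevation_deg(2000.0, 300.0))   # roughly -8.6 degrees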

“If a pilot is coming in, it’s hard for him to tell if there’s a building in front of him,” Keel says. “He has no real height information because that building in a two-dimensional system is projected onto the ground.”

In trying to measure both azimuth and elevation, researchers face the problem that an aircraft has a limited area in which to place an antenna.

A radar’s angular (i.e., azimuth or elevation) resolution is dependent on antenna size. Existing C-130 and C-17 transport aircraft have sufficient area to support the antenna’s horizontal dimension but are significantly limited in the vertical dimension. Even if sufficient area were available, scanning rate requirements limit a true pencil-beam approach.
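
That trade-off follows from the standard beamwidth approximation, in which angular resolution is roughly the wavelength divided by the antenna dimension. The short Python sketch below applies that rule of thumb at 94 GHz for a few assumed aperture sizes; the sizes are hypothetical, chosen only to show how a cramped vertical dimension coarsens elevation resolution.

    import math

    C = 3.0e8                  # speed of light, m/s
    FREQ_HZ = 94e9             # millimeter-wave frequency used by the experimental radar
    wavelength = C / FREQ_HZ   # about 3.2 mm

    # Rule of thumb: beamwidth (radians) ~ wavelength / antenna dimension.
    # Aperture sizes below are hypothetical, chosen only to show the trend.
    for aperture_m in (1.0, 0.3, 0.1):
        beamwidth_deg = math.degrees(wavelength / aperture_m)
        print(f"{aperture_m:.1f} m aperture -> about {beamwidth_deg:.2f} deg beamwidth")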

To support elevation measurements, BAE Systems has developed a new interferometer-based approach. They modified their experimental 2D radar system, which had one transmit channel and one receive channel, converting the single receive channel into two receive channels.

A radar measures range by sending out a signal and measuring the time it takes for that signal to return from objects that it hits. The interferometer measures how long it takes a signal to return to both receiver channels. By comparing the difference between the two return times, the interferometer can estimate the elevation angle to objects in the runway area and along the glide slope.
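
In textbook two-channel interferometry, this difference between the channels is usually expressed as a phase difference, which maps to an elevation angle through the vertical spacing of the two receive antennas and the radar wavelength. The Python sketch below uses that standard relation with assumed values for the baseline and phase difference; it is a sketch of the general principle, not BAE Systems' implementation.

    import math

    def elevation_from_phase(delta_phase_rad, baseline_m, wavelength_m):
        # Textbook two-channel interferometry: the phase difference between two
        # vertically separated receive channels maps to an elevation angle via
        # sin(theta) = wavelength * delta_phase / (2 * pi * baseline).
        # A sketch of the general principle, not BAE Systems' implementation.
        s = wavelength_m * delta_phase_rad / (2.0 * math.pi * baseline_m)
        s = max(-1.0, min(1.0, s))
        return math.degrees(math.asin(s))

    # Assumed values: a 10 cm vertical baseline at 94 GHz (about 3.2 mm wavelength)
    print(elevation_from_phase(delta_phase_rad=0.5, baseline_m=0.10, wavelength_m=0.0032))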

GTRI has supported the Autonomous Approach and Landing Capability Program with extensive pretest analysis and test planning of BAE Systems’ new 3D hardware. Keel took part in non-flight testing of the new hardware at Wright-Patterson in the winter and spring of 2005.

Initial test results were encouraging, Keel says. Still, he adds, researchers are busy enhancing the system with modifications to both the hardware and image processing algorithms. Flight tests of the radar’s effectiveness in low-visibility landings are planned for the latter part of 2006.

In addition to a radar system, Keel says, a full-fledged Autonomous Approach and Landing system might include a forward-looking infrared system (FLIR); light detection and ranging (LIDAR), a form of a laser radar; and perhaps even a radiometer, which could measure the temperature of ground objects.

“It’s really a suite of sensors that is being looked at,” Keel says. “There’s a larger program that the three-dimensional millimeter-wave radar system is feeding into.”

The program is also considering whether synthetic aperture radar (SAR) could be useful to pilots landing in poor or zero visibility conditions. SAR is a method of generating high-resolution ground images and has already been used in such applications as earth-mapping and environmental monitoring.

Since radar resolution is limited by the small size of aircraft-based antennas, SAR gets around the size problem by generating a synthetic aperture that functions like a large real-beam antenna.

An aircraft using SAR moves sideways or at an angle to the area it is imaging, unlike a real-beam radar, which is used during a straight-on approach. By moving at an angle with respect to the scene, a SAR gathers many image slices and assembles them into a high-resolution image, almost as if it were using a physically long antenna. Since this antenna-like effect yields high resolution, it could be used by approaching aircraft to make a detailed image of the airfield prior to landing.
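
The payoff of the synthetic aperture can be sketched with the common approximation that SAR cross-range resolution is roughly the wavelength times the range, divided by twice the synthetic aperture length. The Python sketch below applies that rule of thumb at 94 GHz with assumed range and aperture values, chosen only to show the trend; it is not the program's design data.

    def sar_cross_range_resolution(wavelength_m, slant_range_m, synthetic_aperture_m):
        # Common approximation: cross-range resolution ~ wavelength * range /
        # (2 * synthetic aperture length).  Assumed numbers, for scale only.
        return wavelength_m * slant_range_m / (2.0 * synthetic_aperture_m)

    wavelength = 3.0e8 / 94e9        # about 3.2 mm at 94 GHz
    slant_range = 5000.0             # 5 km from the airfield (assumed)
    for aperture in (10.0, 100.0):   # synthetic aperture built up along the flight path
        res = sar_cross_range_resolution(wavelength, slant_range, aperture)
        print(f"{aperture:.0f} m synthetic aperture -> about {res:.2f} m cross-range resolution")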

“A SAR produces an image with fine resolution in both range and cross-range for the purpose of identifying a particular target – or in the case of a landing field, identifying debris or other objects on the runway that may pose a threat to a safe landing,” Keel explains.

Source: Georgia Institute of Technology, by Rick Robinson
