NASA set to market headset that lets pilots (and drivers?) see through fog

Augmented reality is barely making inroads in other areas; NASA is pitching it for pilots

NASA has taken the next step toward turning real life into a video game:

It has married augmented reality to heads-up displays and come up with a system that lets pilots see through fog, glare, darkness and other conditions that contribute to the single most common factor in airline crashes.

Global positioning systems tell pilots exactly where they are, but don't have any information about rolling terrain, buildings, mountains and other obstacles that are as deadly as they are hard to see in challenging conditions.

"If pilots are not familiar with the airport, they have to stop and pull out maps," said Trey Arthur, an electronics engineer at NASA Langley Research Center in Virginia. "This display, in the new world where these routes are going to be digital, can tell them what taxiway they're on, where they need to go, where they're headed, and how well they're tracking the runway's center line."

The augmented-reality headset fits over one eye and displays what looks like the actual runway, taxiways and other directional data while the plane is on the ground, as well as the runway centerline and terrain details during the approach to landing.

The headset is a new product in itself, but doesn't use any GPS or terrain data that aren't already available, according to NASA. The system works in ways similar to the heads-up displays fighter pilots use, but incorporates data from high-precision efforts to map the Earth's surface, such as the Shuttle Radar Topography Mission in 2000.
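To illustrate the underlying idea (this is not NASA's actual software), here is a minimal sketch of how a synthetic-vision system might combine a GPS fix with a gridded elevation model such as SRTM data. The tile coordinates, elevation values and the `clearance_warning` helper are hypothetical, for illustration only.

```python
# Conceptual sketch only: assumes a hypothetical terrain tile stored as a
# 2-D array of elevations on a regular lat/lon grid, similar in spirit to
# SRTM data. Shows how a GPS position could be checked against terrain.

# Hypothetical terrain tile near Langley (illustrative values, in meters).
TILE_ORIGIN = (37.0, -76.5)   # south-west corner of the tile (lat, lon)
CELL_DEG = 0.001              # grid spacing in degrees (~100 m)
ELEVATIONS = [
    [3, 5, 8, 12],
    [4, 9, 15, 22],
    [6, 14, 30, 41],
    [7, 18, 38, 55],
]

def terrain_elevation(lat, lon):
    """Nearest-cell lookup of ground elevation at a GPS position."""
    row = round((lat - TILE_ORIGIN[0]) / CELL_DEG)
    col = round((lon - TILE_ORIGIN[1]) / CELL_DEG)
    row = max(0, min(len(ELEVATIONS) - 1, row))
    col = max(0, min(len(ELEVATIONS[0]) - 1, col))
    return ELEVATIONS[row][col]

def clearance_warning(lat, lon, altitude_m, margin_m=150):
    """Warn if the aircraft is within margin_m of the terrain below it."""
    ground = terrain_elevation(lat, lon)
    clearance = altitude_m - ground
    if clearance < margin_m:
        return f"TERRAIN: only {clearance:.0f} m of clearance (ground at {ground} m)"
    return f"clear: {clearance:.0f} m above terrain"

# Example: a low approach over the tile.
print(clearance_warning(37.002, -76.498, altitude_m=120))
```

A real system would interpolate between grid cells, look ahead along the flight path and render the terrain as a 3-D scene rather than printing a warning, but the core step, indexing a terrain database by position, is the same.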

It has been tested in a unique NASA plane with two cockpits – one normal, with windows, the other totally enclosed so the "blind" pilot has to fly using only data from the augmented reality system.

The setup helped keep test pilots from crashing, but was primarily designed to verify the quality of NASA's terrain and location data, the accuracy of the augmented reality display and the ability of pilots to fly using only a virtual picture of the real world.

NASA has been working on Synthetic Vision systems for civilian use since 1993. This headset, which is designed for airliners, could also be adapted to help drivers in cars, though gathering enough detailed terrain data to make it practical on the road would be a huge challenge.

The stakes of landing a plane, not to mention the consequences of failing to notice the bit of terrain into which you're about to fly, are incalculably higher. But the number of locations the device has to cover to be useful to pilots, and the level of detail it needs about airports, is far lower than what it would take to provide the same detail in a form useful to drivers on the tens of thousands of miles of roads in the U.S.

If the headset ever makes it into the consumer market, it won't be for quite a while.

The headset does not yet even have an official name. NASA is still looking for commercial partners to bring it to market for pilots; the automotive version will have to wait until later.

