Looking into HUDs

 

Does Head-Up Guidance improve situational awareness?

 

Head-up or Head-down? That was the title of an article I wrote in Professional Pilot magazine in August 2012, discussing the pros and cons of Head-Up Guidance (HGS) systems as compared to advanced head-down displays. Since then, a lively debate has evolved in professional aviation magazines, pitting true HUD believers against HDD advocates.

 

While it seems that the branch of former military service (Air Force or Navy) has a profound influence on the opinions of the respective proponents, which may well be the result of the differing operational needs of these equally great services, this debate leaves the average corporate aviation department a little lost as it tries to justify investment, or non-investment, in HGS systems.

 

And more and more flight departments are pondering this question, as HGS systems have become smaller and cheaper and may now be adapted to midsize and smaller business jets. So here is an overview of the HUD situation that I hope is helpful to you.

 

HUDs were originally developed by Flight Dynamics, now a unit of Rockwell Collins, for military use. They allow the pilot to spend more time looking forward through the windshield, while still being provided with essential flight path data that are projected on a see-through display between the pilot’s eye and the windshield. The design goal was to reduce reaction times and increase handling precision.

 

Crew reaction times depend on the transition time needed between interpreting the instrument displays and the outside world. From the very first flight lesson on, we work on these transition times as we try to improve and maintain our scan. The idea of the HUD was that information displayed right in front of the pilot as he looks through the windshield would almost entirely eliminate this transition time, leading to a faster reaction. That is true, but as a 1993 NASA study found, only if the HUD display truly conforms with the outside world. That means there must be no visible geometric difference between the real world outside the windshield and the corresponding symbols displayed on the HUD, because even the slightest mismatch causes the human brain to process two separate images. If there is a difference, the transition times are almost identical to those of head-down displays.
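To get a feel for how demanding conformality is, consider how far off a symbol appears for even a tiny angular misalignment. A rough sketch of the geometry; the numbers and function name are my own, chosen only for scale, not taken from any avionics specification:

```python
import math

def apparent_offset_ft(angular_error_deg: float, distance_ft: float) -> float:
    """Lateral displacement between a HUD symbol and its real-world
    counterpart, for a given angular misalignment and target distance."""
    return distance_ft * math.tan(math.radians(angular_error_deg))

# Half a degree of misalignment, runway threshold one nautical mile ahead:
print(round(apparent_offset_ft(0.5, 6076), 1))  # → 53.0 (feet)
```

At that error the flight path symbol would sit some fifty feet beside the point it is supposed to mark, easily enough for the brain to register two separate images.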

In civil aviation, this very high degree of precision could until recently only be achieved during precision instrument approaches for low visibility landings. This is why HUDs were mostly installed in smaller regional jets as well as larger executive jets. HUDs made manual approaches to CAT II and CAT III minimums possible on aircraft that did not have the capability for automatic low visibility landings with coupled autopilots, as is common on larger commercial jets. In other words: HUDs only made sense for low visibility landings, and the paperwork and training required to get CAT II/III certification for a HUD installation on a corporate jet only paid off if foggy airports were regular destinations. It should be noted that most commercial air carriers even today do not use HUDs, with the notable exceptions of Alaska Airlines and Southwest Airlines. Large transport jets offer automatic landings in fog, with no flightdeck view of the runway at all.

 

But technologies such as the Wide Area Augmentation System (WAAS) and the Local Area Augmentation System (LAAS) have made the position accuracy of GPS almost perfect in large parts of the country. Add enhanced vision IR cameras to your flight deck, and suddenly you have a sensor that does not even need a position to show your surroundings accurately. This, of course, made HUDs a much more valuable tool, as they can greatly improve situational awareness during arrivals and departures at airports with difficult terrain. If you look at some of the approach plates for RNP SA GBAS approaches, in other words approaches in difficult terrain based entirely on precision GPS, it is a comforting thought to be able to see through the dark and the clouds to visually confirm the flight path. I have flown the G450 with EVS IR images displayed on the HUD. Once you have flown it, you feel almost a little uncomfortable without these “super-eyes”.

But of course the EVS image can also be displayed on a dashboard screen. This is actually the case in the Planeview avionics installation on the G450: there is an EVS HUD display for the left seat and a panel-mounted EVS screen for the right seat. This enables each pilot to monitor the other, just as it should be with modern crew coordination.

 

And crew coordination, monitoring the other pilot and the rest of the flightdeck, is a big issue with HUD installations. The pilot using the HUD looks outside; the HUD symbols are focused at infinity so that the pilot sees them sharply. Only symbology essential for flight path control and energy is displayed in the HUD, so as not to overly clutter the display. All other information, such as warnings or gear and flap status, is shown on regular flightdeck displays as usual. Aircraft system and configuration status is mostly monitored by the other pilot.

 

The Design Eye Point (DEP) is marked in the cockpit with symbols on the sidewall and glareshield. This reference eye position should be maintained by the pilot while using the system. Older HGS systems have bulky projector units right above the DEP, intruding into the valuable headroom in an executive jet cockpit. Newer units such as the HGS 3500 from Rockwell Collins project into the HUD from the upper side, resulting in a much smaller and less bulky installation tailored to midsize jets.

 

As the HUD projects symbols on the real world outside, it cannot be scaled. It has to be 100% conformal. Symbols are further apart than on the PFD, which has to be scaled for a meaningful display of the aircraft's situation in space. For example, the outer edges of the airspeed and altitude scales on the PFD are about 5 inches apart, an angle of about 12 degrees from the DEP, while the same symbols on the Head-Up Display spread almost 30 degrees.
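The 12-degree figure follows from simple trigonometry. A quick check, assuming a viewing distance of roughly 24 inches from the design eye point to the panel (the viewing distance is my assumption, not a figure from the article):

```python
import math

def visual_angle_deg(size_in: float, distance_in: float) -> float:
    """Angle subtended at the eye by an object of the given width,
    viewed from the given distance."""
    return math.degrees(2 * math.atan((size_in / 2) / distance_in))

# Airspeed and altitude tapes about 5 inches apart, viewed from ~24 inches:
print(round(visual_angle_deg(5, 24), 1))  # → 11.9 (degrees)
```

The same pair of tapes on a conformal HUD must span the real angular geometry of the outside scene, which is why they spread to almost 30 degrees.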

 

While civil HUD installations feature indications that are familiar from the PFD, such as a speed tape, altitude tape, horizon and heading bug, there is also new information. The Aircraft Reference Symbol (boresight) represents the projected centerline of the aircraft, while the Flight Path symbol is inertially derived and provides an instantaneous indication of where the aircraft is going relative to the outside world. The Zero Degree Pitch Reference Line is positioned relative to the aircraft's current pitch attitude and is conformal to the outside world; if the Flight Path symbol is placed on the horizon, level flight results.
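The Flight Path symbol sits where the velocity vector points, not where the nose points. A minimal sketch of the underlying geometry, using a simplified flat-earth treatment with ground speed and vertical speed (the function name and unit conversions are mine, not from any HGS documentation):

```python
import math

def flight_path_angle_deg(ground_speed_kt: float, vertical_speed_fpm: float) -> float:
    """Angle between the aircraft's velocity vector and the horizon
    (positive = climbing, negative = descending)."""
    gs_fps = ground_speed_kt * 1.68781   # knots -> feet per second
    vs_fps = vertical_speed_fpm / 60.0   # feet per minute -> feet per second
    return math.degrees(math.atan2(vs_fps, gs_fps))

# A descent at 140 kt ground speed and about -740 fpm is a 3-degree glide path:
print(round(flight_path_angle_deg(140, -740), 1))  # → -3.0 (degrees)
```

On approach, placing this symbol on the touchdown zone means the aircraft is actually tracking toward it, regardless of pitch attitude or wind.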

 

Speed, or energy, is managed with the Speed Error Tape, a dynamic bar that displays the difference between actual and selected airspeed, and with the flight path acceleration symbol. Wind speed and direction are displayed, as well as the AOA (angle of attack) and the resulting pitch limit. A slip/skid indicator and a heading bug are also available. The goal is to create a display that allows complete situational awareness just by looking at the landscape outside – through the HUD.

 

As there are no HUDs yet that cover the entire windshield (I have only seen those in the landing vehicles featured in the science fiction movie “Avatar”), the conformality and greater precision of the HUD come at the price of only looking at a small portion of the entire possible display at a time, almost like a “looking-glass” effect. As pilots, we are trained to scan the entire cockpit and surrounding area all the time, so it takes some training and experience to integrate the HUD into one's own scan in a meaningful way.

 

And of course, the principal design purpose of the HUD is to reduce the scan the pilot has to do during critical phases of flight. So while the pilot flying (PF) takes advantage of the HUD and is focused outside, the pilot not flying (PNF) remains responsible for all indications and systems that can only be seen inside the cockpit. Training proper crew coordination procedures is of essential importance when HGS is used, but that is not new to aviation. A similar split of duties is normal during the short final of a non-HUD automatic CAT II or III airline approach: while the PF, usually the aircraft commander in the left seat, is entirely focused on the flight path and the visual cues of the runway, the PNF monitors the flight path but also all other systems and warnings.

 

Recently, manufacturers have begun offering dual HUD installations. Whether these should both be used at the same time, or just alternately as pilots switch duties between legs, is still under discussion. For example, if a waypoint sequencing error causes the HUD to give guidance to the wrong waypoint, this would normally only be caught on the PFD or FMS. The inherent design problem for HUD engineers is to offer a decluttered, monochrome display that still leaves out nothing of importance.

 

HUD advocates such as the flight operations people at Alaska Airlines recommend that the HUD be used full time: for taxi, takeoff, cruise and landing. While the benefits of the HUD vary between phases of flight and are usually only relevant close to the ground, constant use of the HUD improves proficiency and, most importantly, pilots retain their ability to see through the HUD and develop a full scan of their surroundings, inside and outside the flightdeck.

 

This being said, it is quite clear that the decision to install HUDs is not an easy one. While there are without question benefits, especially when paired with EVS systems, the effort required by the flight department to gain certification for lower landing minimums and to retain currency for all pilots should not be underestimated. Three landings in 90 days using the HUD are normally required to stay current. And, as with all aircraft systems, if you install it you also have to train for failure of the HUD. The more crews rely on the HUD, the more a HUD error will hurt. The HUD display is only as good as the underlying sensors; if these fail, the HUD will fail you as well. Good basic flying skills remain essential, with or without a HUD. HGS also means another system that needs to be maintained and calibrated.

 

So in the end it really depends on the type of your operation. If you fly a lot of difficult approaches into mountainous terrain, by all means, go for it. I have not seen a better tool than EVS on a HUD to stay clear of hard terrain. On the other hand, if occasional morning fog is your only problem, you should think twice, as things can get quite involved.

 

While the HUD installation comes with a considerable price tag, there is one benefit that is sometimes overlooked: pilots learn the handling characteristics of their aircraft more quickly, as they get good guidance during manual flying with the HUD. This may help to cut training time, but it is especially useful if your department has a mixed fleet of aircraft flown by all pilots. Sometimes this reverse training, from fully automated flight to manual raw-data flight, works quite well when transitioning to a new type. I remember my own type rating on the B747-400. We started out using all autoflight functions in the simulator and had lots of spare capacity to look at instruments and numbers. Then, from mission to mission, we worked our way back to fully manual raw-data visual approaches. That worked very well, and to this day I feel confident when manually flying my B747-400.

 

The HUD does encourage manual flying, as it puts the pilot right in the loop of aircraft control. It seems that in some parts of the aviation community, more so in airline operations than in executive jet departments, a tendency to rely overly on automation to control the aircraft's flight path has crept in. If you read some recent accident reports, you get the feeling that some pilots have actually lost the ability to manually fly their aircraft with confidence and precision. A HUD installation may help to regain and maintain manual flying skills, as the display is centered on the pilot and augments the main and probably most valuable human sensor, the eye.

 

 

J. Peter Berendsen