The Buzz

The F-35 Still Has a Long Way to Go before It Will Be Ready for Combat

One system, the Electro-Optical Targeting System (EOTS), was singled out by pilots as inferior in resolution and range to the systems currently being used on legacy aircraft. EOTS is one of the systems designed to help the F-35 detect and destroy enemy fighters from far enough away to make dogfighting a thing of the past. Mounted close to the nose of the aircraft, it incorporates a television camera, an infrared search and track system, and a laser rangefinder and designator. These sensors swivel under computer control to track targets over a wide field of regard and display imagery on the pilot’s helmet visor display.

But the limitations of EOTS, including image degradation in humid conditions, force pilots to fly closer to a target than they had to with earlier systems just to get a clear enough picture to launch a missile or take a shot. The report says the problem is bad enough that F-35 pilots may need to fly in so close to acquire the target that they would then have to maneuver away to regain the distance needed for a guided weapon shot. Thus, the system’s limitations can force an attacking F-35 to compromise surprise, allowing the enemy to maneuver into a first-shot opportunity. Surrendering the element of surprise and handing the opponent the first shot is exactly the position we want to force the enemy into, not the one we want to put ourselves in.

Another often-touted feature that is supposed to give the F-35 superior situational awareness is the Distributed Aperture System (DAS). The DAS is one of the primary sensors feeding the displays to the infamous $600,000 helmet system, and it is also failing to live up to the hype. The DAS sensors are six video cameras or “eyes” distributed around the fuselage of the F-35 that project onto the helmet visor the outside view in any direction the pilot wants to look, including downwards or to the rear. At the same time, the helmet visor displays the flight instruments and the target and threat symbols derived from the sensors and mission system. But because of problems with excessive false targets, unstable “jittered” images, and information overload, pilots are turning off some of the sensor and computer inputs and relying instead on simplified displays or the more traditional instrument panel.

Here again, the system is little better than those it’s supposed to replace.

Test pilots also had difficulty with the helmet during some of the important Weapon Delivery Accuracy tests. Several of the pilots described the displays in the helmet as “operationally unusable and potentially unsafe” because of “symbol clutter” obscuring ground targets. While attempting to test fire short-range AIM-9X air-to-air missiles against targets, pilots reported that their view of the target was blocked by the symbols displayed on their helmet visors. Pilots also reported that the symbols were unstable while they were attempting to track targets.

Then there is the matter of pilots actually seeing double due to “false tracks.” There is a problem with taking all of the information generated by the various onboard instruments and merging it into a coherent picture for the pilot, a process called sensor fusion. Pilots are reporting that the different instruments, like the plane’s radar and the EOTS, are detecting the same target but the computer compiling the information is displaying the single target as two. Pilots have tried to work around this problem by shutting off some of the sensors to make the superfluous targets disappear. This, DOT&E says, is “unacceptable for combat and violates the basic principle of fusing contributions from multiple sensors into an accurate track and clear display to gain situational awareness and to identify and engage enemy targets.”
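The failure mode DOT&E describes is easy to see in miniature. The sketch below is a deliberately naive illustration of track association, not the F-35's actual fusion algorithm; the sensor names, positions, and gating threshold are all hypothetical. Two sensors report the same target with slightly different position estimates; if the fusion logic's "gate" (the distance within which two reports are assumed to be the same object) is tighter than the sensors' disagreement, one real target is displayed as two tracks.

```python
import math

def fuse_reports(reports, gate):
    """Naively fuse sensor reports into tracks.

    reports: list of (sensor_name, x, y) position estimates.
    gate: max distance (same units as x, y) at which two
          reports are assumed to describe the same target.
    Returns a list of fused (x, y) track positions (averages).
    """
    tracks = []  # each track: list of (x, y) reports merged so far
    for _, x, y in reports:
        for track in tracks:
            # Compare the new report against the track's centroid.
            cx = sum(px for px, _ in track) / len(track)
            cy = sum(py for _, py in track) / len(track)
            if math.hypot(x - cx, y - cy) <= gate:
                track.append((x, y))
                break
        else:
            # No existing track matched: start a new one.
            tracks.append([(x, y)])
    return [(sum(px for px, _ in t) / len(t),
             sum(py for _, py in t) / len(t)) for t in tracks]

# One real target near (10, 10), seen by two sensors whose
# estimates disagree slightly (hypothetical numbers):
reports = [("radar", 10.0, 10.0), ("eots", 10.4, 9.7)]

# Gate wider than the sensor disagreement: one fused track.
print(len(fuse_reports(reports, gate=1.0)))   # 1
# Gate tighter than the disagreement: the same target
# appears twice -- a "false track".
print(len(fuse_reports(reports, gate=0.2)))   # 2
```

Real track-fusion systems weigh each sensor's error statistics rather than using a fixed distance, but the basic tension is the same: an association rule that is too strict splits one target into duplicates, while one that is too loose merges distinct targets.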

And as bad as the problem is in a single plane, it’s much worse when several planes attempt to share data across the network. The F-35 has a Multifunction Advanced Data Link (MADL) designed to let the plane share information with other F-35s, giving all the pilots a common picture of the battlespace. It does this by taking the data generated by each plane and combining it into a single, shared view of the world. But this system, too, is creating erroneous or split images of targets. Compounding the problem, the system also sometimes drops images of targets altogether, leaving pilots uncertain about what is and is not actually out there.