Sensor Fusion: The Secret Sauce that Makes the F-35 Revolutionary

F-35 jets turn pilots into “true tacticians.”

The state-of-the-art F-35 fighter is armed with a 25-millimeter cannon, flies with a fifth-generation stealth configuration, and attacks with an entirely new generation of air-to-air and air-to-surface weapons. And yet, its most defining characteristic may ultimately lie in the often-cited realm of “sensor fusion.” 

That emphasis on fusion emerged consistently in conversations with F-35 pilots. Lockheed Martin F-35 test pilot Chris “Worm” Spinelli, who spent twenty-four years in the Air Force, told me the aircraft is defined by “data integration.”

“For me, the biggest difference I've seen between flying the fourth-generation F-16, which is what I previously flew, and my few hours in the F-35 is its data integration and data management capabilities,” Spinelli said. “It allows extreme situational awareness, more than any other platform that we've generated, at least that I've flown.”

The merits of the technology are clear, yet “sensor fusion” also introduces tactical dynamics that might easily be overlooked. F-35 jets turn pilots into “true tacticians,” Tony “Brick” Wilson, who now works for Lockheed Martin as chief of Fighter Flight Operations, told me.

“Applying the system of sensor fusion reduces pilot workload and allows the pilots to have a situational ‘bubble’ so that they’re more than just a pilot and they’re more than a sensor manager,” Wilson said. “They’re true tacticians. The fact that the pilot has the spare capacity increases survivability and makes them more lethal.”

Fighter pilots already manage a host of distinct variables, such as altitude, navigational trajectory, and speed, while gathering and processing time-sensitive targeting data and weapons information. The F-35 adds a next-generation electronic warfare suite, upgraded air-to-air missiles, long-range targeting sensors, and mission data files, a threat library used for enemy target “identification.” Sensor fusion “declutters” all of this, Monessa “Siren” Balzhiser, a Lockheed Martin F-35 production and training pilot, told me.

“The great thing about [the display] is you can control what you see and what you don’t,” Balzhiser said. “You can declutter and put everything on it that you need, so you’re seeing an advanced picture of friendlies, air-to-air and air-to-ground stats, and navigation points. It’s all encompassed in one display, which is why I say it becomes a matter of how well a pilot can process all that data, because it’s a lot of data and it’s always dynamic. It's always giving you real-time information from every single sensor in the jet.”
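
To make Balzhiser’s “declutter” point concrete, the sketch below is a purely notional Python illustration of a single tactical display whose symbol layers a pilot can toggle on or off. Every name in it (the Symbol class, the layer labels, the TacticalDisplay object) is invented for illustration and has nothing to do with actual F-35 software.

```python
# Notional sketch only: one display holds everything; the pilot chooses what to see.
from dataclasses import dataclass, field

@dataclass
class Symbol:
    label: str
    layer: str  # e.g. "friendly", "air_to_air", "air_to_ground", "navigation"

@dataclass
class TacticalDisplay:
    symbols: list = field(default_factory=list)
    visible_layers: set = field(
        default_factory=lambda: {"friendly", "air_to_air", "air_to_ground", "navigation"}
    )

    def add(self, symbol: Symbol):
        self.symbols.append(symbol)

    def declutter(self, *layers_to_hide: str):
        """Hide whole categories of symbology so only what the pilot needs remains."""
        self.visible_layers -= set(layers_to_hide)

    def render(self):
        return [s.label for s in self.symbols if s.layer in self.visible_layers]

display = TacticalDisplay()
display.add(Symbol("Wingman", "friendly"))
display.add(Symbol("Waypoint ALPHA", "navigation"))
display.add(Symbol("Hostile contact", "air_to_air"))
display.declutter("navigation")   # pilot hides nav clutter during an intercept
print(display.render())           # ['Wingman', 'Hostile contact']
```

The design point is simply that one picture holds everything, and the pilot decides what is shown.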

When it comes to destroying an enemy, the F-35’s Active Electronically Scanned Array radar can detect a threat object, its long-range infrared targeting sensor can produce a rendering of it, an onboard threat-library database can positively identify the target, and then precision-guided weapons can strike.

This process ultimately comes down to a simple “decision”: the point at which the pilot needs to act quickly and decisively.
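
A similarly notional sketch of that detect-identify-decide chain follows. The sensor signatures, threat-library entries, and weapon recommendations are placeholders assumed purely for illustration, not real F-35 data or interfaces.

```python
# Notional sketch of the detect -> identify -> decide chain described above.
# Signatures, library entries, and recommendations are invented placeholders.
from typing import Optional

THREAT_LIBRARY = {
    "signature_a": {"id": "hostile fighter", "weapon": "air-to-air missile"},
    "signature_b": {"id": "mobile SAM site", "weapon": "precision-guided bomb"},
}

def detect(radar_return: dict) -> str:
    """Radar step: reduce a raw return to a signature key."""
    return radar_return["signature"]

def identify(signature: str) -> Optional[dict]:
    """Threat-library step: match the signature against the mission data files."""
    return THREAT_LIBRARY.get(signature)

def present_decision(track: Optional[dict]) -> str:
    """Fusion hands the pilot one simple, time-sensitive decision."""
    if track is None:
        return "Unknown contact: continue tracking"
    return f"{track['id']} identified; {track['weapon']} recommended. Engage?"

raw_return = {"signature": "signature_b", "bearing_deg": 42, "range_nm": 35}
print(present_decision(identify(detect(raw_return))))
# -> mobile SAM site identified; precision-guided bomb recommended. Engage?
```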

“When I first got into the F-35 and even still today, the biggest, game-changing difference that I've seen specifically for the person in the cockpit, the ‘decisionmaker,’ the pilot, is the F-35’s fusion and integration of all of the different sensors from the aircraft,” Spinelli said. “It brings together a holistic picture that's quite amazing. This was never, never seen before on any fourth-generation platform.”

The fusion process is ultimately a product of advanced computing. A former Air Force chief scientist told me a few years ago that the F-35’s sensor fusion is in fact an early iteration of artificial intelligence. Advanced computer algorithms perform a wide range of automated functions, meaning many procedural analysis tasks can be carried out without human intervention. This not only eases the oft-cited “cognitive burden” on pilots, freeing them for tasks that require human cognition, but also allows the system to independently compare separate pools of incoming data and draw conclusions.

This is how artificial-intelligence-enabled systems work: massive volumes of incoming sensor data can be analyzed against vast, seemingly limitless stores of information in seconds to solve problems, draw conclusions, and organize a host of variables to see how they fit together, integrate, or influence one another. For example, speed and altitude will affect both navigation and weapons targeting. Threat data can determine closing speed or drive decisions about which weapon might be optimal for a particular threat. Artificial-intelligence-enabled computing can perform many of these functions autonomously, in large measure by comparing variables against existing information and precedents set in prior instances, and then make recommendations to a human decision-maker.
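
As a rough illustration of that kind of comparison, the hypothetical sketch below fuses two overlapping sensor estimates of a target’s range by confidence weighting, then checks the fused value against a small table of stored “precedents” to produce a recommendation. Every number, threshold, and rule here is invented; real fusion algorithms are vastly more sophisticated.

```python
# Hypothetical sketch: fuse overlapping sensor estimates, then compare the
# fused picture against simple stored "precedents" to recommend a weapon.
def fuse_estimates(estimates):
    """Confidence-weighted average of overlapping sensor reports (e.g. radar + infrared)."""
    total_weight = sum(conf for _, conf in estimates)
    return sum(value * conf for value, conf in estimates) / total_weight

# Two sensors report the same target's range (nautical miles) with different confidence.
fused_range = fuse_estimates([(31.0, 0.9), (34.0, 0.6)])   # radar, infrared

# Range to the target drives the recommendation, checked against stored precedents.
PRECEDENTS = [
    {"max_range_nm": 10, "recommendation": "short-range missile"},
    {"max_range_nm": 40, "recommendation": "beyond-visual-range missile"},
]

def recommend(range_nm):
    for rule in PRECEDENTS:
        if range_nm <= rule["max_range_nm"]:
            return rule["recommendation"]
    return "continue tracking"

print(f"Fused range: {fused_range:.1f} nm -> {recommend(fused_range)}")
# -> Fused range: 32.2 nm -> beyond-visual-range missile
```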

“It’s super, super, super easy to fly,” Wilson told me. “It’s made easy to fly for a reason, because of all the management you have to do in the cockpit.”

Kris Osborn is the defense editor for the National Interest. Osborn previously served at the Pentagon as a Highly Qualified Expert with the Office of the Assistant Secretary of the Army—Acquisition, Logistics & Technology. Osborn has also worked as an anchor and on-air military specialist at national TV networks. He has appeared as a guest military expert on Fox News, MSNBC, The Military Channel, and The History Channel. He also has a Master's Degree in Comparative Literature from Columbia University.

Image: Flickr / U.S. Air Force