Top Drone: The Future of Fighter Pilots

In 1986, the film Top Gun debuted as a straight-faced depiction of competition among Navy fighter jocks at the pinnacle of American Cold War airpower.

More than three decades on, in the iPhone era, the film’s contest between two rival aviators, Tom Cruise’s aggressive but unpredictable “Maverick” and Val Kilmer’s aptly named “Iceman,” who square off both in the sky and on the beach volleyball court, seems as dated as a Motorola DynaTAC.

Yet as the U.S. military prepares for an era of autonomous combat aircraft, one of the film’s core questions has resurfaced. What will it take to trust the fighter pilots in the aircraft next to you? Will these futuristic drones be more like Iceman or Maverick?

Expecting the Unexpected

“It’s one thing to use or rely on a system when everything is going well. But what will happen in a mission-critical situation when something unexpected happens?” asks Kim Jackson Ryan, principal human systems engineer at Draper Laboratory, whose research explores the bonds of trust between people and machines.

The question is moving quickly from theoretical to practical, with new aircraft debuting in recent months from the U.S. and its allies, as well as from potential adversaries.

In February, Boeing and the Australian government rolled out a model of the Boeing Airpower Teaming System, an unmanned aircraft set to start flying in 2020. The 38-foot, AI-powered jet will have a 2,000-nautical-mile range and be able to complete myriad missions alone or in tandem with piloted warplanes.

In March, Kratos Defense & Security Solutions flew its XQ-58A Valkyrie for the first time as part of the Air Force Research Laboratory’s push to develop less costly “attritable” aircraft for use on missions when there is a high chance of losing U.S. jets to enemy attack.

International Front

China is readily exporting its first generation of combat-ready military drones to customers such as Iraq and Saudi Arabia, and its China Aerospace Science and Industry Corp. recently rolled out the Sky Hawk, a flying-wing drone meant for missions in which human fighter pilots fly alongside robotic planes.

While these aircraft generate tremendous buzz for their potential to fulfill sci-fi fantasies about future battles, the reality may come down to something found in war stories as old as The Iliad: bonds of trust between comrades in arms. The modern twist poses real challenges for engineers trying to figure out how to build reliable autonomous partners.

Intensely contested electronic and cyber environments already characterize current war zones such as Syria, and they would be even more lethal in a future conflict directly involving Chinese or Russian forces, given recent breakthroughs in stealth-aircraft detection and longer-range surface-to-air missiles.

Making a Connection

Countless threats to fighter pilots make the matter of trust even more acute.

According to Ryan, a variety of factors shape the human-machine relationship: dispositional factors inherent to each person, situational factors that depend on context and are crucial to understand, and learned factors built up through past experience.

Human fighter pilots flying together on an air-defense mission would know how to handle an unexpected electronic attack, for example, because they have trained together for years. In effect, that confidence is the benchmark their autonomous wingmen will have to meet.

“It’s going to be an interesting day when an autonomous wingman is put in an environment with cyber and EM (electromagnetic) interference and they’re cut off. What do they do?” Ryan says.
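It is not hard to sketch the shape of the fallback logic such a wingman would need. The snippet below is purely illustrative, with invented mode names and thresholds rather than anything drawn from a real flight-control system; it only shows how a drone might degrade to pre-briefed behavior as time without contact from its flight lead grows.

```python
from dataclasses import dataclass
from enum import Enum, auto
import time


class Mode(Enum):
    FOLLOW_LEAD = auto()      # normal operation: take cues from the human flight lead
    LOITER_SILENT = auto()    # hold at a pre-briefed point with emissions minimized
    RETURN_TO_BASE = auto()   # abandon the mission and come home


@dataclass
class LinkStatus:
    last_heartbeat: float     # time of the last message received from the flight lead
    jamming_suspected: bool   # e.g. sudden wideband noise on the datalink


def choose_mode(link: LinkStatus, now: float,
                silent_after_s: float = 30.0,
                abort_after_s: float = 300.0) -> Mode:
    """Pick a fallback mode based on how long the wingman has been cut off.

    The thresholds are invented for illustration; a real system would also
    weigh fuel, the threat picture, and rules of engagement.
    """
    gap = now - link.last_heartbeat
    if gap < silent_after_s and not link.jamming_suspected:
        return Mode.FOLLOW_LEAD
    if gap < abort_after_s:
        return Mode.LOITER_SILENT
    return Mode.RETURN_TO_BASE


if __name__ == "__main__":
    link = LinkStatus(last_heartbeat=time.time() - 90, jamming_suspected=True)
    print(choose_mode(link, now=time.time()))   # -> Mode.LOITER_SILENT
```

However the details are filled in, the structure of the question is the one Ryan raises: what does the machine do when no one can tell it what to do?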

Pilot Integration

For aviators who have flown older fighters as well as the latest generation of U.S. jets, so-called “fifth-generation aircraft,” it is starkly evident just how much a software-laden aircraft can do on behalf of its human pilot.

“There’s a relatively short continuum of realizing the airplane is making decisions that are more accurate and timely than mine,” says Dave Berke, a retired Marine Corps officer and former Top Gun instructor who holds the rare distinction of having flown both of the Pentagon’s most advanced fighters, in addition to other fighters, during his military career.

An aircraft such as the F-35 can autonomously give its pilot a visual cue to a potential or actual threat picked up by onboard sensors or by networked sensors elsewhere. That shifts cognitive load from human to machine and, over time, builds genuine trust in the jet.
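As a rough illustration of that kind of cueing, the sketch below ranks sensor tracks and picks the one most worth the pilot’s attention. The track fields and scoring rule are assumptions made for the example, not the F-35’s actual sensor-fusion logic.

```python
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class Track:
    track_id: str
    source: str        # e.g. "onboard_radar" or "wingman_datalink" (invented labels)
    range_nm: float    # distance to the contact in nautical miles
    closing: bool      # whether the contact is closing on own aircraft
    hostile: bool      # identified or presumed hostile


def threat_score(t: Track) -> float:
    """Crude, invented scoring: closer, closing, hostile tracks rank higher."""
    score = max(0.0, 100.0 - t.range_nm)
    if t.closing:
        score *= 1.5
    if t.hostile:
        score *= 2.0
    return score


def cue_pilot(tracks: List[Track]) -> Optional[Track]:
    """Return the single track most worth the pilot's attention, if any."""
    return max(tracks, key=threat_score, default=None)


if __name__ == "__main__":
    picture = [
        Track("A1", "onboard_radar", range_nm=40, closing=True, hostile=True),
        Track("B7", "wingman_datalink", range_nm=15, closing=False, hostile=False),
    ]
    top = cue_pilot(picture)
    print(f"Cue pilot to {top.track_id} reported by {top.source}")
```

In practice the hard part is the fusion and identification feeding such a ranking, but even this toy version shows where cognitive load moves from pilot to software.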

Breaking the Rules

Based on Berke’s experience, flying in combat alongside an autonomous aircraft can be seen as an extension of this dynamic.

“In terms of the psychological burden of the remote wingman, I have no problem with that,” he says.

Next year, a sequel, Top Gun: Maverick, will hit movie theaters, reportedly featuring Tom Cruise once again, along with drones. Given recent real-world developments in autonomous wingmen, the timing is fitting for another look at the nature of trust and predictability.

Ryan says, “How do you make creative algorithms that understand the rules but know when to break them?”
