Tempest: when the cockpit disappears to connect the pilot to AI

Under the GCAP, the Tempest relies on a virtual cockpit, augmented reality, gestures, and biometrics. The promise: less mental load. The risk: too much software dependency.

In summary

The Global Combat Air Program (GCAP) aims to make the Tempest an aircraft where the interface becomes the real “weapons system.” The central idea is simple: move the cockpit into the pilot’s equipment. Augmented reality helmet, eye, gesture and voice controls, and sensors capable of estimating stress and fatigue. This “portable cockpit” promises to reduce head-down time, speed up access to information and reconfigure the aircraft via software, without touching the airframe. But this gamble adds risks. The first is technical: latency, stability, and readability during high acceleration and in a cluttered environment. The second is human: cognitive overload if ergonomics are poorly calibrated, loss of confidence if automation “surprises” the pilot. The third is strategic: a digital interface is an attack surface, and therefore a target. The Tempest does not “read” minds in the science fiction sense. Rather, it aims to fuse physiological and behavioral signals to adapt the display and assistance. The real debate is not “the last piloted aircraft,” but how much to delegate and how to keep the pilot in control.

The gamble of a cockpit that disappears only to reappear on the pilot

The most spectacular concept of the Tempest can be summed up in one image: an almost empty cockpit. No more walls of screens, almost no more dials. The ambition is to turn the physical cockpit into free space, moving the display to the helmet and the interface logic to software.

The vocabulary says it all. Within GCAP, the Tempest is described as a system of systems, where the airframe becomes a platform and the cockpit becomes software. The British program has popularized the idea of a portable cockpit: the interface "moves" into the helmet, the sensors, and the mission computer. In concrete terms, the concept targets three gains.

First, increased adaptability. A conventional avionics suite ages quickly because changing its screens, consoles, and interface logic requires major hardware modifications. A virtual environment can, in theory, evolve faster, in step with threats and updates.

Second, reduced “head-down time.” The promise is to increase “head-up, eyes-out” time, which remains vital in combat, refueling, night flying, and especially in complex multi-threat missions.

Finally, simplifying the physical cockpit to reduce weight, cabling, maintenance, and mechanical vulnerabilities. It’s not glamorous, but a less dense cockpit means fewer points of failure.

Augmented reality as the new dashboard

The visible heart of the system is the helmet. The Tempest project promotes a wearable cockpit built around an advanced helmet-mounted display, combining augmented reality and, depending on the concept, elements of virtual reality for mission preparation and certain modes.

The operational idea is to project the instruments "where the pilot is looking." Altitude, speed, threats, navigation, imagery, fuel status, and mission data become layers. The pilot no longer has to look for a screen. The information comes to the pilot, without requiring repeated head movements.

The critical issue is the quality of the rendering. In combat, readability must remain perfect despite vibrations, sweat, variations in brightness, acceleration, and physiological effects. The system must also maintain a credible degraded mode. Even if the concept assumes “zero screens,” operational reality often pushes for a minimal backup, if only to survive a helmet failure, combat damage, or jamming.

In this model, augmented reality is not just used to display numbers. It is used to fuse sensor data. If the aircraft has several sensors (radar, IR, ESM, data links), the interface can represent them as one coherent "tactical scene." This is where the interface becomes a force multiplier: it saves decision-making time.
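
To make the idea concrete, here is a minimal sketch of that fusion step. Everything in it is an illustrative assumption: the `SensorReport` and `FusedTrack` names, the toy report format, and the bearing-only correlation are invented for the example, not taken from the GCAP program. A real fusion engine correlates in full state space over time.

```python
from dataclasses import dataclass, field

@dataclass
class SensorReport:
    """One contact report from a single sensor (illustrative format)."""
    source: str              # e.g. "radar", "irst", "esm", "datalink"
    bearing_deg: float       # direction of the contact
    range_km: float | None   # ESM is often bearing-only, hence None

@dataclass
class FusedTrack:
    """One object in the displayed tactical scene."""
    bearing_deg: float
    range_km: float | None
    sources: set[str] = field(default_factory=set)

def fuse(reports: list[SensorReport], bearing_tol: float = 2.0) -> list[FusedTrack]:
    """Collapse reports whose bearings agree into single display tracks,
    so the pilot sees one coherent scene instead of four raw feeds."""
    tracks: list[FusedTrack] = []
    for r in reports:
        for t in tracks:
            if abs(t.bearing_deg - r.bearing_deg) <= bearing_tol:
                t.sources.add(r.source)   # same object, new source
                if t.range_km is None:    # bearing-only track refined
                    t.range_km = r.range_km
                break
        else:
            tracks.append(FusedTrack(r.bearing_deg, r.range_km, {r.source}))
    return tracks
```

Note that each track remembers which sensors contributed to it. That bookkeeping is what later allows the display to say "single source" or "confirmed" instead of presenting everything as equally certain.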

Eye tracking and gestures, from gadget to safety logic

Tempest does not rely on a single modality. It adds multiple inputs so that interaction remains possible when one modality fails. Eye gaze, hand movements, voice commands, and traditional physical controls coexist. This is a key point: during rapid acceleration or turbulence, subtle gestures become less reliable. The classic approach is to keep physical controls for critical maneuvers and delegate information management to virtual controls.

Eye tracking serves two purposes. First, it allows for faster pointing. Looking at an item to select it is faster than using a cursor or navigating through menus. Second, it helps manage clutter. Some concepts describe displays that “enlarge” when the pilot looks at an area and then shrink when they look elsewhere. The goal is to avoid occlusion, thereby limiting visual overload.
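
As an illustration, a gaze-dwell filter of the kind eye-tracking interfaces commonly use could look like the sketch below. The `GazeSelector` class and the 0.4-second threshold are assumptions invented for the example; real values would come from flight trials.

```python
import time

class GazeSelector:
    """Fire a selection only after the gaze has rested on an item for a
    dwell time, so sweeping the eyes across the scene selects nothing."""
    def __init__(self, dwell_s: float = 0.4):  # threshold is an assumption
        self.dwell_s = dwell_s
        self.current = None    # item currently under the gaze
        self.since = 0.0       # when the gaze arrived on it

    def update(self, item_under_gaze, now: float | None = None):
        now = time.monotonic() if now is None else now
        if item_under_gaze != self.current:    # gaze moved: re-arm timer
            self.current, self.since = item_under_gaze, now
            return None
        if self.current is not None and now - self.since >= self.dwell_s:
            selected, self.current = self.current, None
            return selected                    # fire once, then reset
        return None
```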

Gestural control is an alternative to the mouse, touchscreen, or buttons. The idea is to "grab" a virtual window and move it, pin it, or dismiss it. On paper, it is intuitive. In reality, combat imposes a brutal rule: if gestures increase error or fatigue, they will be rejected. The real challenge is to define a small vocabulary of robust, unmistakable gestures.
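
A hedged sketch of that calibration logic: a deliberately tiny gesture vocabulary and a high acceptance threshold, where anything ambiguous is rejected and handed back to physical controls. The labels, the threshold value, and the `interpret` function are hypothetical.

```python
# A deliberately tiny vocabulary: robust, few, unmistakable.
GESTURES = {"grab", "pin", "dismiss"}
ACCEPT_THRESHOLD = 0.9   # assumed value; would be tuned in flight trials

def interpret(label: str, confidence: float) -> str | None:
    """Reject anything ambiguous. In combat, a missed gesture is cheaper
    than a wrong one, so the default answer is 'do nothing' and the
    pilot falls back to physical controls."""
    if label in GESTURES and confidence >= ACCEPT_THRESHOLD:
        return label
    return None
```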

An often underestimated issue is latency. Even a few extra milliseconds can undermine confidence. A fast but unstable interface is worse than a slower but predictable one. In a fighter jet, confidence is a tactical variable.
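
One way to make that rule operational is to watch the tail of the latency distribution rather than the average, as in this sketch. The 20 ms budget and the `LatencyWatchdog` class are invented for illustration, not program figures.

```python
import time
from collections import deque

class LatencyWatchdog:
    """Judge the tail of the latency distribution, not the mean: a fast
    but jittery interface is worse than a slower, predictable one."""
    def __init__(self, budget_ms: float = 20.0, window: int = 120):
        self.budget_ms = budget_ms           # assumed end-to-end budget
        self.samples = deque(maxlen=window)  # recent frame latencies (ms)

    def record(self, frame_start: float) -> None:
        self.samples.append((time.monotonic() - frame_start) * 1000.0)

    def over_budget(self) -> bool:
        if len(self.samples) < 10:
            return False                     # not enough data to judge
        worst = sorted(self.samples)[int(len(self.samples) * 0.95)]
        return worst > self.budget_ms        # 95th percentile over budget
```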

Biometric data, cognitive load, and the promise of a “neuro-adaptive” cockpit

The most sensitive topic in 2025 is the integration of physiological sensors. The project does not aim to read thoughts, as in the movies. It aims to infer a state. Hence the use of biometric data: heart rate and its variability, respiration, skin temperature, micro-movements, possibly muscle activity and, eventually, neurological signals if the technology becomes robust enough.

The stated goal is to monitor stress, fatigue, and attention. The cockpit would be able to estimate cognitive load and adjust the interface. Fewer unnecessary alerts when the pilot is overwhelmed. More assistance when the situation deteriorates. Contextual recommendations, automatic reconfiguration of priorities, and delegation of certain tasks to a virtual assistant.
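
A minimal sketch of that adaptation loop, with loudly labeled assumptions: the thresholds, the `Biometrics` fields, and the load formula are placeholders, not a validated physiological model. The point is the shape of the logic: estimate load, then filter rather than delete.

```python
from dataclasses import dataclass

@dataclass
class Biometrics:
    heart_rate: float       # beats per minute
    hrv_rmssd: float        # heart-rate variability (ms); drops under stress
    breathing_rate: float   # breaths per minute

def clamp01(x: float) -> float:
    return min(max(x, 0.0), 1.0)

def estimate_load(b: Biometrics) -> float:
    """Crude load index in [0, 1]. The thresholds are placeholders; a
    real system would use validated, per-pilot calibrated models."""
    hr = clamp01((b.heart_rate - 70) / 80)        # 70 bpm calm, 150 loaded
    hrv = clamp01((50 - b.hrv_rmssd) / 50)        # low variability = stress
    br = clamp01((b.breathing_rate - 12) / 18)    # 12 calm, 30 loaded
    return (hr + hrv + br) / 3

def filter_alerts(alerts: list[dict], load: float) -> list[dict]:
    """Under high estimated load, show only critical alerts. The rest
    are filtered from the display, not discarded from the system."""
    if load > 0.7:  # assumed cut-off
        return [a for a in alerts if a["priority"] == "critical"]
    return alerts
```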

This is a credible promise if it remains modest. A model that detects “probable fatigue” can help. A model that claims to know the pilot’s intention can become dangerous because it may surprise them. The red line is predictability. In military aviation, automation is accepted if it is understandable, testable, and reversible.

This logic reflects a broader trend: “neuro-adaptive” systems tested in simulated environments, which modulate the amount of information based on measures of mental effort. The issue is not only technical. It is also legal and ethical. Who stores this data? Who has access to it? To what extent can it be used to evaluate a pilot? Biometrics can improve safety, but it can also become a tool for social control within an organization.

Embedded AI, between virtual co-pilot and software dependency

The most decisive layer is the algorithm. Tempest promotes a virtual co-pilot capable of filtering, suggesting, anticipating, and coordinating. This is where the expression "the last human-piloted aircraft" finds its media fuel.

The risk is to confuse assistance with substitution. A software co-pilot can be excellent at sorting radar tracks, suggesting evasive maneuvers, or managing data links. But aerial combat is also a matter of uncertainty, deception, and jamming. AI trained on specific scenarios can fail in noisy environments. And if the interface is completely virtual, AI becomes the “conductor” of everything the pilot sees.

This is why onboard AI must be designed as a hierarchy of functions, with degraded modes. It must be possible to revert to a simpler, more deterministic logic when the environment is contested. The “black box” effect must also be avoided. The more powerful the assistant, the more the pilot needs to understand why it is recommending a particular action.
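
That hierarchy can be pictured as an explicit ladder of assistance levels with a deterministic fallback rule, as in the sketch below. The three levels and the health checks are illustrative assumptions, not the program's actual architecture.

```python
from enum import Enum

class AssistLevel(Enum):
    FULL = 3      # fused picture, recommendations, task delegation
    REDUCED = 2   # fused picture only, no recommendations
    BASIC = 1     # raw per-sensor data, deterministic symbology

def select_level(fusion_healthy: bool, ai_self_check_ok: bool,
                 environment_contested: bool) -> AssistLevel:
    """Fall back one rung at a time. Each level must be usable on its
    own, and the pilot can always force a lower level manually."""
    if not fusion_healthy:
        return AssistLevel.BASIC
    if environment_contested or not ai_self_check_ok:
        return AssistLevel.REDUCED
    return AssistLevel.FULL
```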

Another risk, rarely mentioned openly, is dependence on the software ecosystem. A virtualized cockpit requires update cycles, testing, patches, and digital supply chains. This can accelerate innovation. It can also create fragility if the program does not invest heavily in verification and cybersecurity.

Cybersecurity and electronic warfare, the true judge

An “all-digital” cockpit increases the attack surface. This does not mean that it is necessarily vulnerable. It means that it must be designed as a resilient system from the outset. In a fighter jet, electronic warfare aims to blind, deceive, saturate, and degrade. If the interface is dependent on fused sensors, manipulating those sensors becomes an indirect attack on the pilot.

Jamming can create false positives. Data fusion can give a coherent but false picture. And an AR display can amplify confidence in erroneous data because it is presented as visual evidence.

The answer is not to go back to dials. The answer is secure software architecture, redundant channels, strict separation of critical functions, and the ability to display uncertainty. Displaying “probable,” “confirmed,” or “single source” may seem trivial. It is essential. An interface that knows how to say “I don’t know” is better than an interface that invents.
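
Displaying uncertainty can be as simple as attaching a label to each fused track based on how many independent sources back it, as in this sketch. The labels mirror the article's examples; the function and its inputs are hypothetical.

```python
def confidence_label(sources: set[str], corroborated: bool) -> str:
    """Attach an honest label to a fused track: an interface that can
    say 'I don't know' is better than one that invents."""
    if len(sources) >= 2 and corroborated:
        return "confirmed"
    if len(sources) >= 2:
        return "probable"       # multiple sources, not yet cross-checked
    if len(sources) == 1:
        return "single source"
    return "unknown"
```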

The question of human pilots: more political than technological

The debate over the “last piloted aircraft” often masks a more concrete issue: the number of pilots available and their training costs. A cockpit that reduces mental load can, in theory, expand the pool of pilots and accelerate skill development. But an overly complex cockpit can do the opposite.

The most important point is this: a sixth-generation fighter cannot rely on individual heroism. It relies on a distributed team, data, connected effectors, and procedures. The pilot remains central, but becomes a combat manager rather than a systems operator.

In this context, the interface must enhance the pilot’s autonomy, not reduce it. The pilot must be able to challenge the AI. They must be able to take back control. They must be able to “disconnect” certain automatic functions if the environment becomes unpredictable. The portable cockpit must not be a software prison.

The 2035 timeline and the risk of overly ambitious promises

GCAP aims to enter service around 2035. This date imposes discipline. Attractive concepts must become certifiable building blocks. An AR helmet that can be used in combat is not a prototype for a trade show. Robust eye tracking is not a lab demo. Reliable biometrics is not just a sensor. All of this requires testing, pilot programs, feedback, and iterations.

The most common risk is stacking. Each innovation is defensible. Adding them all at once can create a fragile system. Technological maturity is therefore a strategic choice. Decisions will have to be made about what is essential for the initial block and what can be added incrementally.

This realism is also a strength. If Tempest succeeds with its interface, it can export a philosophy: that of a “reconfigurable” aircraft, and therefore one that can adapt to evolving threats. But if it fails on ergonomics, it will lose the confidence of pilots, and the aircraft will be judged on something very human: the comfort of decision-making in extreme situations.

The final stretch: operational truth

The Tempest’s virtual cockpit is not a fantasy. It is an attempt to solve a worsening problem: too much information, too fast, in an overly contested environment. The ambition is to give the pilot an interface that adapts, filters, and keeps their eyes outside.

The question is not “does AI read minds?” The question is “does the interface help without betraying?” If the portable cockpit reduces mental load, improves safety, and remains reliable under stress, it will become the norm. If virtualization adds fragility, it will be relegated to a secondary role, with backup screens, simplified modes, and strict limits on automation.

This is where Tempest will be judged. Not on slogans. On flight hours, realistic scenarios, and the cool confidence of a pilot who has no right to doubt.

Sources

  • RAF, Team Tempest, “The tech” (Wearable cockpit section), institutional page, accessed in December 2025.
  • BAE Systems, “BAE Systems eyes novel way of flying” (eye-tracking controls), September 21, 2018.
  • Institution of Mechanical Engineers (IMechE), “The ‘wearable cockpit’ could change fighter-jet controls forever,” May 12, 2021.
  • The War Zone (TWZ), “Britain Banks On Tempest Future Fighter Program…,” October 16, 2020 (psychophysiological tests, eye-tracking).
  • Royal Aeronautical Society, “Wearable cockpits – the ultimate human-machine interface?”, August 21, 2018.
  • BAE Systems, “Futuristic radar for the UK’s future combat aircraft…” (references to wearable cockpit concepts), October 15, 2020.
  • National Defense Magazine, “Industry Partners Form GCAP Electronics Consortium,” September 18, 2025.
  • IAI (Istituto Affari Internazionali), “The New Partnership among Italy, Japan and the UK on GCAP,” March 2025 (PDF).
