Lockheed Martin’s Skunk Works Shows Off AI That Can Replan A Drone Mission Mid-Flight

Lockheed Martin’s Skunk Works just ran a live AI demonstration that tackles one of the hardest problems in autonomy – what happens when an unmanned mission goes sideways.
In the demonstration, Lockheed’s AI-driven contingency management software detected simulated fuel issues on a Stalker XE drone, generated updated mission options in seconds, and presented those options to a human operator. Once the operator selected a preferred route, the onboard AI automatically reassigned the Stalker’s remaining tasks to a second drone, a Freefly Alta X Gen 2 modified by Drone Amplified, then commanded the Stalker to return to base.

What “AI Contingency Management” Means
Every single aviation mission runs on the same hope that everything goes off without a hitch. Ideally the aircraft takes off, the route is flown, and the aircraft returns for a landing. In reality, almost every person in the aviation industry can tell you: real life does not care about your flight plan.
Batteries sag under load, winds shift mid-flight, sensors degrade, wildlife gets defensive, and the list goes on. Something you did not anticipate appears, and the route needs to change quickly.
Mission contingency management is the system’s ability to recognize those problems, build alternative routes, and help the operator pick the best one before the mission time window closes. In this demo, the trigger was simulated fuel contingencies.
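To make that concrete, here is a minimal sketch of the idea in Python. Everything in it – the reserve threshold, the `RouteOption` fields, the candidate routes – is a hypothetical illustration, not Lockheed’s actual logic: given a fuel contingency, filter out routes the aircraft can no longer fly safely and rank the rest so the operator sees the option that preserves the most mission first.

```python
from dataclasses import dataclass

# Hypothetical threshold for illustration only.
FUEL_RESERVE_MIN = 0.20  # fraction of fuel that must remain on landing


@dataclass
class RouteOption:
    name: str
    fuel_needed: float  # fraction of the tank required to fly this route
    tasks_kept: int     # how many mission tasks it still completes


def viable_options(fuel_remaining: float,
                   candidates: list[RouteOption]) -> list[RouteOption]:
    """Drop routes that would eat into the fuel reserve, then rank the
    survivors so the option preserving the most mission comes first."""
    flyable = [r for r in candidates
               if fuel_remaining - r.fuel_needed >= FUEL_RESERVE_MIN]
    return sorted(flyable, key=lambda r: r.tasks_kept, reverse=True)


# Simulated fuel contingency: 45% of fuel left, three candidate replans.
options = viable_options(0.45, [
    RouteOption("finish full survey", fuel_needed=0.40, tasks_kept=6),
    RouteOption("survey nearest half, then RTB", fuel_needed=0.20, tasks_kept=3),
    RouteOption("return to base now", fuel_needed=0.10, tasks_kept=0),
])
print([o.name for o in options])  # the full survey violates the reserve
```

The point of the sketch is the shape of the problem, not the math: the system prunes what is no longer safe and hands the human a short, ranked menu instead of a blank map.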
The Demo Setup: Stalker + Freefly Alta X, With a Human Still In Charge

According to Lockheed, the team simulated multiple variations of fuel contingencies during the flight. The AI agent inside the command-and-control environment assessed the situation and generated new mission options in seconds. The operator picked one option, and the system executed the retask automatically.
Two details here matter more than the buzzwords:
- Human-in-the-loop stayed intact. The operator made the call. The AI did the heavy lifting. That is the kind of division that autonomy needs if it is going to be trusted outside of demos.
- It was multi-asset, not single-drone autonomy. The mission did not “end safely.” The mission continued, just on a different platform – it’s the same concept as the X1 robot recently released by Caltech, but with a more dangerous mission.
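Those two details can be sketched in a few lines. This is an assumed shape of the flow, not Lockheed’s software: the AI proposes options, a human `choose` step makes the call, and whatever tasks the distressed aircraft cannot finish are handed to a backup platform so the mission continues.

```python
# Hypothetical sketch of the retask flow: AI proposes, human decides,
# and unfinished work moves to a second platform.
def retask(options, remaining_tasks, choose, backup_drone):
    """choose() is the human-in-the-loop step: the operator, not the AI,
    selects the preferred option from the AI-generated menu."""
    selected = choose(options)
    handoff = {"platform": backup_drone, "tasks": remaining_tasks}
    return selected, handoff


options = ["return to base now", "loiter and hold"]
selected, handoff = retask(
    options,
    remaining_tasks=["image pad 4", "relay comms"],
    choose=lambda opts: opts[0],  # stands in for the operator's pick
    backup_drone="Alta X",
)
print(selected, "->", handoff["platform"])
```

Keeping the decision as an explicit function call, rather than burying it inside the planner, is what makes the division of labor auditable: you can always point to the line where the human chose.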
One Command Node, Multiple Drones, Plus an Unmanned Ground Vehicle (UGV)

Lockheed also used the test as a trial run for a different purpose: mission data from the Stalker was shared with a central command node that coordinated additional assets, including an unmanned ground vehicle (UGV) located in Kansas and other UAV support from Fulcrum. The future of a fully autonomous army is inching closer by the day.
STAR.OS, STAR.SDK, and the UI Problem Autonomy Always Runs Into
Lockheed tied the demo to its STAR.OS AI integration framework and the STAR.SDK toolset, which was used to connect the contingency management app to the operator. In at least one description of the test, operators could use a chat assistant to request and review mission options.
Autonomy fails in the real world when it cannot explain itself fast enough for a human to approve the next step. A clean UI that can surface options quickly, with logic an operator can defend later, is the difference between “helpful autonomy” and “autonomy that nobody turns on.”
Why This Matters Outside Defense
Enterprise operations deal with the same pain points, just with different stakes:
- A drone needs to exit early mid-inspection, but the deliverable still needs to be completed
- A no-fly situation appears and the mission must reroute immediately
- A sensor or link degrades and another aircraft needs to pick up the remaining work
- A multi-drone jobsite needs a coordinator that can keep the operation productive when one platform drops out
If autonomy is going to scale, it has to operate at the mission level, not just the aircraft level. This is a real step in that direction.
DroneXL’s Take
This is the part of autonomy that actually matters.
Auto takeoff and auto landing are nice. Waypoint missions and position hold are useful. But the real test is what happens when the mission stops matching the plan.
In a high-stress situation, your fine motor skills go out the window, and you instinctively fall back on whatever training and repetition you already have. When an operator is overwhelmed or unable to fly the aircraft, an AI system that can spot the issue, propose smart alternatives fast, and then move the work to another platform is exactly the direction our country’s unmanned industry should be moving.
Speaking from personal experience, the most difficult thing to overcome in any safety-sensitive operation is human error. We’re naturally inclined to make mistakes, so any equipment that can guide our service members in the right direction under fire is progress in my book!
I can’t recall how many times I’ve had to remind students to remove the gimbal cover from their DJI Mavic 3 Enterprise aircraft. Thankfully, DJI warns you on the transmitter if your gimbal’s motor is strained – that warning has indirectly saved me thousands of dollars in just 18 months so far.
Let me know in the comments below if you have any predictions for where this AI-enabled technology may take us next – I’m all ears.