Certifying the Future: How Emerging Aviation Is Defining Safe Paths for Artificial Intelligence and Machine Learning

An expert panel at NBAA BACE 2025 shared perspectives on how the FAA approaches AI, ML and autonomy in aviation.

By: Charlton Evans, AG Certification Ambassador

At this year’s NBAA-BACE, one of the final panels of the week tackled a question that will shape the next decade of aviation innovation: how to bring machine learning safely into advanced aircraft design. The session, “Clearing a Path to Machine Learning Certification in Business Aviation,” moderated by Wes Ryan, a Northrop Grumman Technical Fellow who spent 20 years at the FAA certifying new technology, featured industry leaders with hands-on experience at the intersection of automation and airworthiness: Jon Damush, President and CEO of uAvionix; Kyle Ford, Technical Fellow at Collins Aerospace; and Jean-Guillaume (JG) Durand of Joby Aviation.

Charlton Evans/End State Solutions
The panel lineup: Jon Damush, President and CEO of uAvionix; Kyle Ford, Technical Fellow at Collins Aerospace; and Jean-Guillaume (JG) Durand of Joby Aviation.

Busting the AI Myths in Aviation

The realities of certifying autonomy are not as mystical as the headlines suggest. For all the futuristic hype about artificial intelligence (AI), the panelists agreed that aviation machine learning (ML) isn’t about self-aware systems making unknowable decisions. ML is not a magic wand for autonomy, but rather a focused tool for specific functions such as aircraft perception, pilot monitoring and predictive maintenance. Wes Ryan pointed out that most advanced flight systems, from synthetic vision to enhanced flight cameras, entered service first in business aviation because operators had both the incentive and the financial means to invest in certification. “Emerging ML capabilities will follow the same trajectory,” Ryan said.

“Autonomy has existed in aviation for decades,” Ford added. “The autopilot reduces human workload. Systems like Collins’ integrated checklists already act autonomously. Machine learning simply enables new types of tasks we couldn’t model analytically.” He explained how this practical framing allows engineers to design within existing certification boundaries. “We train these systems offline. They don’t learn on the aircraft. Once deployed, they’re deterministic. They do one job, very efficiently, and they do it the same way every time,” he noted. “That alone changes how we design systems, but not the safety logic that governs them.”
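Ford’s “trained offline, frozen, deterministic” description maps to a very simple engineering pattern. The sketch below is a minimal, hypothetical illustration in Python, not Collins code; the file name, hash value and network shape are placeholders.

```python
# Hypothetical sketch of a "frozen" ML component: weights are trained
# offline, shipped as immutable data, and verified against a hash recorded
# at certification time before any inference is allowed.
import hashlib

import numpy as np

CERTIFIED_SHA256 = "replace-with-hash-recorded-at-certification"  # placeholder

def load_frozen_weights(path: str):
    """Refuse to run unless the weight file matches the certified build."""
    with open(path, "rb") as f:
        digest = hashlib.sha256(f.read()).hexdigest()
    if digest != CERTIFIED_SHA256:
        raise RuntimeError("weights differ from the certified configuration")
    return np.load(path)  # read-only on the aircraft; never retrained

def infer(weights, x: np.ndarray) -> np.ndarray:
    """One fixed feed-forward pass: no learning, no randomness, no state."""
    h = np.tanh(weights["w1"] @ x + weights["b1"])
    return weights["w2"] @ h + weights["b2"]

# Determinism check: the same input must always produce the same output.
# w = load_frozen_weights("cruise_model.npz")
# assert np.array_equal(infer(w, x), infer(w, x))
```

The key property is that nothing in the inference path learns, randomizes or accumulates state, so the function can be characterized exhaustively like any other piece of avionics software.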

“What we have today is math,” added Damush. “These systems may appear intelligent, but they’re simply computational methods that have become practical because of scalable computation on the edge. It’s an implementation choice. AI isn’t replacing humans; it’s augmenting them,” he continued. He explained how this data-driven detection work began years ago at Iris Automation, where they flew over 17,000 air-to-air encounters to train optical detect-and-avoid (DAA) algorithms. “Machine learning allowed us to process mountains of flight data that no human could analyze,” he said. “But the system didn’t make the decision. It helped humans make a better one.” Damush summarized, “This technology isn’t about building machines that think like people. It’s about giving humans superhuman perception and precision in the cockpit.”

Durand expanded on this thought with a comparison to classical techniques: “Machine learning is just one way to implement a function,” he said. “You could tune a filter mathematically, or you could teach it with data. Both are valid. The choice comes down to what best serves the safety and performance of that specific task.” Durand also provided examples from Joby Aviation’s “Super Pilot” autonomy stack to illustrate how selective AI adoption creates tangible safety gains. “Only three functions in our autonomy stack rely on machine learning: GPS-denied navigation, obstacle detection and terrain awareness,” he shared. “Everything else depends on classical methods. Machine learning gives us better performance where visual and contextual recognition matter most.” His team has already demonstrated an autonomous taxi capability that detects obstacles on the taxiway and stops automatically. “It stopped for an unexpected object on the taxiway, no pilot input,” he said. “That’s the level of augmentation we’re talking about.”
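Durand’s “tune it mathematically or teach it with data” framing can be made concrete with a deliberately tiny example. The sketch below, with invented signals and parameters and no connection to Joby’s code, chooses the same filter coefficient two different ways:

```python
# Two ways to pick the coefficient of the same first-order low-pass filter:
# derive it analytically, or fit it to data. All signals and numbers here
# are invented for illustration.
import numpy as np

def lowpass(signal: np.ndarray, alpha: float) -> np.ndarray:
    """Classical first-order low-pass filter with smoothing factor alpha."""
    out = np.empty_like(signal)
    out[0] = signal[0]
    for i in range(1, len(signal)):
        out[i] = alpha * signal[i] + (1 - alpha) * out[i - 1]
    return out

# Option A: tune mathematically from a desired cutoff frequency.
dt, f_cut = 0.01, 2.0                       # 100 Hz sampling, 2 Hz cutoff
alpha_analytic = dt / (dt + 1.0 / (2 * np.pi * f_cut))

# Option B: "teach it with data" -- choose the alpha that best reproduces a
# trusted reference signal on recorded data (a one-parameter fit).
rng = np.random.default_rng(0)
truth = np.sin(np.linspace(0.0, 10.0, 1000))        # stand-in clean signal
noisy = truth + 0.1 * rng.standard_normal(1000)     # stand-in sensor data
candidates = np.linspace(0.01, 0.99, 99)
errors = [np.mean((lowpass(noisy, a) - truth) ** 2) for a in candidates]
alpha_learned = float(candidates[np.argmin(errors)])
```

Either route yields the same kind of fixed, deterministic function; the difference lies only in how its parameter was chosen, which is exactly the “implementation choice” the panelists describe.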

What Does Certification Look Like for AI?

Dawn Zoldi/P3 Tech Consulting
Every year, NBAA packs the house for BACE and, increasingly, focuses on emerging aviation topics.

So exactly how will aviation authorities evolve to certify machine learning in aerospace, a regulatory environment defined by incremental change and accountability? Should they build on what we already know from systems engineering and safety assurance?

The FAA favors a use-case-based, incremental method. “It’s the difference between trying to regulate every possibility up front and building from concrete examples,” Ryan observed. “The FAA wants the industry to help teach the regulator through incremental certification.”

“I don’t think we need to reinvent assurance,” Ford agreed. “We verify each feature by its behavior and its effect on system safety, not by its implementation choice. The pilot doesn’t care whether it’s a neural network or conventional software. They care that it works and alerts them if it fails.” He outlined how Collins Aerospace approached the FAA with its first certifiable ML component, a modest system that calculates optimal cruise altitude. “We started small,” he said. “It wasn’t a new capability; the goal was to test the certification path itself. Working directly with the FAA for over a year, we proved it could be done.”

Damush concurred that regulatory agility will matter more than new standards. “Certification can’t keep pace with technology if we stay tied to purely physical validation,” he said. “We need to embrace computational assurance: simulation, digital twins, model-based certification. Otherwise, tech outpaces regulation and we constrain innovation before it can even prove its safety benefits.” He described how transparency, data sharing and collaboration are central to success. “There’s a skeleton key written throughout the regs: ‘in a manner acceptable to the Administrator.’ That means if you can demonstrate safety, you can get novel technologies approved. Developers who engage the FAA honestly and early help both sides learn. That creates trust, and trust accelerates progress,” he suggested.

Durand emphasized the complexity of data traceability as a new assurance challenge. “We must prove that our training data fully represents the operational domain, the weather, lighting, geometry, everything that defines the aircraft’s reality,” he said. “That’s hard, but it’s manageable with rigor and documentation. The goal isn’t to rewrite safety engineering, it’s to apply it carefully to new kinds of inputs.”
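As a toy illustration of the traceability argument, the sketch below checks a training log against a declared operational domain and reports uncovered combinations. The dimensions and bins are invented for the example and far coarser than anything a real certification program would use.

```python
# Toy coverage check: does the training log contain at least one example in
# every bin of the declared operational domain? Dimensions and bins are
# invented; real programs use far richer descriptors and statistics.
import itertools

ODD = {  # declared operational domain, discretized into bins
    "lighting":   ["day", "dusk", "night"],
    "visibility": ["clear", "haze", "rain"],
    "altitude":   ["low", "mid", "high"],
}

def coverage_gaps(samples: list[dict]) -> list[tuple]:
    """Return every ODD bin combination with no training example."""
    seen = {tuple(s[k] for k in ODD) for s in samples}
    return [combo for combo in itertools.product(*ODD.values())
            if combo not in seen]

training_log = [
    {"lighting": "day",   "visibility": "clear", "altitude": "low"},
    {"lighting": "night", "visibility": "rain",  "altitude": "mid"},
    # ... one record per labeled example, each traced back to its source
]
for gap in coverage_gaps(training_log):
    print("no training data for:", gap)  # each gap must be filled or excluded
```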

Certification Is an Exercise in Trust

From Hollywood headlines to flight test facilities, society’s image of AI veers between wonder and fear. Much of the public’s unease with AI in aviation revolves around predictability. How can you trust a system that “learns”? Within aviation’s disciplined culture, the outlook is deliberate and determined. “We design bounded systems,” Ryan reminded the audience. “They behave within controlled parameters. The myth that AI will suddenly misbehave ignores how we actually engineer.”

In truth, machine learning in aviation is less about autonomy than assurance. It’s about making sure predictive maintenance actually predicts, that pilot monitoring truly monitors, that visual detection never misses what a human might overlook. These are life-critical, quantifiable improvements. And in certified aircraft, the “learning” in ML stops long before installation. From Joby’s autonomous air taxi trials to Collins’ embedded optimization engines and uAvionix’s DAA systems, the pathways may differ, but the strategy is the same: start small, validate safely and grow trust through data.

“The phrase ‘deterministic AI’ sounds contradictory, but that’s exactly what we’re doing,” said Ford. “Once the model is trained and verified, it’s frozen. Same input, same output every time.” He continued, “Everybody wants new features on aircraft, but they also want them certified. The only way to deliver both is by showing our work, completely, transparently and repeatedly [and with a well-defined goal that we agree with the FAA on reaching].”

Durand added that the question of determinism is as much about architecture as algorithms. “Machine learning modules must operate within guardrails: hardware and software layers that detect and isolate faults. If the model fails, the aircraft knows it, alerts the pilot, and falls back to a safe mode. That’s what earns trust.”
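The guardrail pattern Durand outlines can be sketched schematically. The thresholds, names and fallback states below are hypothetical, not Joby’s implementation; the point is only that the monitor wrapped around the ML module is conventional, deterministic code that can be assured with existing methods.

```python
# Schematic of a deterministic guardrail around an ML detector. Thresholds,
# names and states are hypothetical; the monitor itself is conventional
# software, assured with existing methods.
from dataclasses import dataclass

@dataclass
class Detection:
    range_m: float      # estimated distance to the obstacle
    confidence: float   # model's self-reported confidence, 0..1

MAX_RANGE_M = 5000.0    # physical plausibility bound (illustrative)
MIN_CONFIDENCE = 0.7    # minimum acceptable confidence (illustrative)

def guarded_obstacle_check(ml_detect, frame) -> str:
    """Run the ML detector inside guardrails; never trust it unchecked."""
    try:
        det = ml_detect(frame)                 # the only ML call in the loop
    except Exception:
        return "FALLBACK"                      # model fault: isolate, alert
    if det is None:
        return "CLEAR"                         # nothing detected
    if not 0.0 < det.range_m < MAX_RANGE_M:    # physically implausible output
        return "FALLBACK"
    if det.confidence < MIN_CONFIDENCE:        # model unsure: treat as fault
        return "FALLBACK"
    return "STOP"                              # valid detection: stop the taxi
```

Here “FALLBACK” stands in for alerting the pilot and reverting to a safe mode, the behavior Durand says earns trust.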

Machine learning’s promise in aviation will hinge on its certifiers as much as its coders. “The challenge isn’t proving performance. It’s proving predictability,” Ryan said. “That’s exactly what certification exists to do.”

What’s ahead of the leading edge? Both the FAA and EASA are developing new levels of guidance for AI assurance. EASA’s “Level 3” guidance, expected soon, will address moving from human-supervised automation to genuinely independent functions. Meanwhile, the FAA’s performance-based certification framework is experimenting with real-world partnerships, what Ryan described as “crawl, walk, fly.”

Durand sees a potential future where pre-trained models can be periodically updated, improving safety without full recertification. “There’s a sweet spot between static and self-learning systems,” he said. “But regulators and industry need to define that mechanism together. Updating safely is the next big question.”

As the session closed, Ryan paused to thank his panelists and the engineers in the room still wary of diving into this uncharted regulatory territory. “Every major step in aviation—glass cockpits, fly-by-wire, GPS navigation—faced uncertainty until someone certified the first version,” he said. “Machine learning is just the next step in that lineage.”

The collective message was clear: certification isn’t stopping AI implementation. It’s the path to implementing it safely into service. And as business aviation once again takes the lead in proving what’s possible, meticulous, mathematical progress will replace hype and fear.