By: Dawn Zoldi
When most people talk about AI, they mean chatbots in the cloud. At The Hive in Grand Forks, North Dakota, and Buffalo3, the AI venture born inside it, Johnny Ryan and Garett Elwood do something far more concrete. They build custom AI for autonomy that runs on drones, edge devices and tightly secured local systems for industries like oil and gas and advanced aviation. Their work offers a rare, hands-on look at how applied machine learning can vastly improve decision-making in high‑value, high‑risk environments.
A Maker Space, An Autonomy Engine and “Stingers”

The Hive began as a UAS- and autonomy-focused tech accelerator in downtown Grand Forks, operating as a kind of connective tissue for the broader GrandSKY/Northern Plains/UND/Air Force ecosystem that has turned North Dakota into one of the country’s premier testbeds for uncrewed systems. (See prior AG coverage of the Hive). In Ryan’s words, this “enabler of all types of UAS opportunities” gives more than 35 member companies a place to build, test and iterate on everything from automated drone-in-a-box systems to multi-domain autonomy concepts. (Watch the previous Dawn of Autonomy Episode 99 with Johnny Ryan).
Buffalo3 grew directly out of that environment. Three North Dakotans (Ryan, Elwood and co-founder Jacob Hanson) formalized this venture to tackle hard software and AI problems for real customers, from oil and gas inspections to custom local language models for sensitive data. The name, chosen over tacos at Grand Forks staple Red Pepper, reflects both place and mindset. “A buffalo…goes right into the storm,” Ryan explained. “That’s our mentality. Go right at the problem, right in the middle, and come out with solutions.”
To channel all that talent, The Hive introduced a concept Ryan calls the “Stingers.” Stingers are hybrid operators, part technical SWAT team, part concierge, tasked with elevating member and visitor experiences. Whether that means spinning up a last-minute projection-mapping demo for a U.S. Senator’s visit or whiteboarding an AI-enabled inspection concept for a new partner, the Stingers are there to get the job done.
Elwood and Hanson serve as Stingers. This makes it easy for Buffalo3’s AI expertise to flow directly into Hive projects without forcing companies to restructure six-month engineering plans. A member might want to test automated anomaly detection on perimeter flights or to explore a more intelligent intake process for flying at GrandSKY. A Stinger team can often deliver a working proof-of-concept “by the end of the meeting.” That agility is what makes the Grand Forks ecosystem so potent.
Data, Drones and Real-Time Autonomy

If the last decade of uncrewed systems was about learning to collect data, Ryan and Elwood argue the next one is about learning to use it intelligently. Buffalo3’s answer starts with edge AI.
In oil and gas, for example, they deploy object detection models to analyze live video from drones or fixed cameras on site. The models flag anomalies such as discoloration, leaks or unexpected equipment states. Instead of flying an entire field repeatedly and pushing terabytes to the cloud, a drone can detect an anomaly in real time, dynamically adjust its flight plan to focus on that location and capture higher-resolution imagery or additional sensor data only where needed.
That changes inspection economics. Flights become problem-seeking rather than just data-gathering. Follow-on missions can be targeted instead of exhaustive. The same pattern applies across domains, from perimeter security to critical infrastructure to beyond visual line of sight (BVLOS) logistics experiments at GrandSKY.
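The re-tasking loop described above can be sketched in a few lines. This is an illustrative simplification, not Buffalo3’s actual flight software: the `Detection` type, anomaly labels and `plan_next_waypoints` helper are hypothetical names chosen for the example.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    label: str
    confidence: float
    location: tuple  # (lat, lon) of the frame that triggered the flag

# Example anomaly classes mentioned in the article
ANOMALY_LABELS = {"discoloration", "leak", "unexpected_equipment"}

def plan_next_waypoints(detections, planned_route, conf_threshold=0.7):
    """If an onboard detector flags an anomaly with high confidence,
    insert a close-inspection pass at that location before resuming
    the planned route; otherwise continue the survey as planned."""
    for d in detections:
        if d.label in ANOMALY_LABELS and d.confidence >= conf_threshold:
            # Dwell on the anomaly: a lower-altitude, higher-resolution pass
            return [("inspect", d.location)] + planned_route
    return planned_route
```

The point of the sketch is the ordering: the inspection pass is prepended only when a detection clears the confidence threshold, so routine flights stay cheap and follow-up imagery is captured only where it is needed.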
“We’re not trying to recreate the wheel,” Ryan emphasized. “We are trying to get the best ROI for the time, cost and the output. That’s how we look at everything.”
The “Uncloud”: Local, Small and Private AI Architectures

Buffalo3 helps businesses move away from monolithic, cloud-only AI toward smaller, task-specific models that run locally. Elwood describes this as a shift to SLMs (small language models) and compact vision models tuned for tight, well-defined use cases.
For highly regulated sectors like oil and gas, the primary driver isn’t convenience. It’s security. “They are very, very particular about all of their data,” Elwood noted. This creates a strong preference for keeping data off the cloud. In response, Buffalo3’s preferred pattern is to:
- Run local language models on high-end workstations or Mac Minis with strong GPUs or unified memory.
- Fine-tune them on a company’s proprietary workflows and terminology.
- Physically disable Wi‑Fi and Ethernet to guarantee nothing leaves the box.
“The selling point is nothing is going to hit the internet when you have this local model on your computer,” Elwood said. “You could literally turn the internet off on the computer and it would still work.”
Ryan calls that model the “un‑cloud,” a siloed environment where teams can experiment with AI on sensitive data under the watchful eye of skeptical IT and security leaders.
According to both AI leaders, some organizations use more of a hybrid approach. They keep a hardened local core for their most sensitive data and models, but selectively layer in cloud-based tools and open-source agents, like the OpenClaw (formerly “Clawdbot”) ecosystem Elwood has been experimenting with, to extend capabilities. The key is control. With open-source models, Buffalo3 can inspect code, fine-tune behavior and control exactly what data flows where. “You start with the isolated model that is the uncloud,” Ryan explained. “Then there’s a bridge…tools that are maybe only accessible in the cloud.”
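The hybrid “uncloud” decision Ryan describes, an isolated local core with a selective bridge to cloud tools, boils down to a routing rule. The sketch below is a hypothetical illustration of that rule; the marker list and `route_query` function are invented for the example, and a real deployment would use the organization’s own data-classification policy.

```python
# Hypothetical markers of sensitive content; a real system would apply
# the organization's own data-classification rules.
SENSITIVE_MARKERS = ("well log", "proprietary", "lease", "production data")

def route_query(prompt: str, contains_sensitive_data: bool) -> str:
    """Decide where a request may run in a hybrid 'uncloud' setup.
    Anything touching sensitive data stays on the isolated local model;
    everything else may cross the bridge to cloud-only tools."""
    text = prompt.lower()
    if contains_sensitive_data or any(m in text for m in SENSITIVE_MARKERS):
        return "local"  # air-gapped SLM; data never leaves the box
    return "cloud"      # bridge to cloud tools for extra capability
```

The design choice mirrors the quote: the default for anything sensitive is the sealed local model, and the cloud is an explicit, controlled exception rather than the starting point.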
From AI in the Loop to AI on the Move
While much of Buffalo3’s work centers on language models and perception, both developers see physical AI (robots and drones that not only perceive but act) accelerating quickly. Buffalo3 plans a near-term focus on embedding AI closer to the edge, not just for perception, but for adaptive behavior, such as:
- Processing inspection imagery onboard the drone so raw data never leaves the platform.
- Sharing only event-driven snippets, like a single alert photo, over low-bandwidth links, while keeping full context onboard for later secure retrieval.
- Using onboard intelligence to re‑task the vehicle in real time when anomalies appear.
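The first two bullets above amount to an onboard triage loop: keep full context on the vehicle, transmit only a small alert when something is flagged. A minimal sketch, with a hypothetical `triage_frames` helper and a pluggable detector standing in for the real onboard model:

```python
def triage_frames(frames, detect):
    """Onboard triage: every frame and its detector result stay in the
    onboard log; only a compact alert snippet is queued for transmission
    over the low-bandwidth link when the detector raises a flag."""
    onboard_log, transmitted = [], []
    for frame in frames:
        flag = detect(frame)
        onboard_log.append((frame, flag))  # full context stays on the drone
        if flag:
            # Event-driven snippet: just an ID and the flag, not raw imagery
            transmitted.append({"frame_id": frame["id"], "flag": flag})
    return onboard_log, transmitted
```

Because `transmitted` carries only identifiers and flags, the raw imagery never leaves the platform, matching Ryan’s point that “the data doesn’t leave the drone” unless an operator retrieves it securely later.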
“Unless you capture the drone, which at that point you’ve got bigger problems, the data doesn’t leave the drone,” Ryan noted. “All those insights might be processed on the drone itself, but it’s only sharing data when there’s a flag.” That aligns with the push towards higher levels of autonomy in BVLOS operations and contested environments, which remains essential to unlock full-scale uncrewed aviation.
Where to Go to Turn Curiosity into Capability
Why isn’t this everywhere yet? Technically, Ryan and Elwood believe the pieces are already in place for the type of AI they customize to be ubiquitous. The real bottlenecks are human: risk appetite, organizational inertia and skills. On the risk side, critical‑infrastructure operators may hesitate to let AI anywhere near safety-critical decisions, even after successful proofs of concept. On the workforce side, demand keeps outpacing formal training.
Even so, looking ahead, Ryan expects 2026 to be dominated by practical AI projects that make complex operations more efficient. One example already underway uses AI to streamline GrandSKY intake processes so that, instead of spending days navigating forms and approvals to get an aircraft in the air, operators can be “ready to go” in half an hour.
To tackle this and similar projects, Buffalo3 “does the work of a thousand people in a team, in a fun way,” according to Ryan. He believes small, nimble companies like Buffalo3, that can move “at the speed of light,” will eventually erode the traditional advantage of large enterprises. Because of AI, “David can now compete in any industry of Goliaths.” Perhaps this capability is perfectly timed, as the frontier shifts toward more embodied autonomy with drones, robots and other systems that not only analyze but adapt and act.
For those working at the intersection of data and autonomy, or looking to get there, Ryan and Elwood’s developer’s-eye view offers not only inspiration, but a roadmap. It looks something like this:
- Start local and safe: build or pilot small, isolated models on dedicated hardware, with clear IT guardrails.
- Move intelligence to the edge: let drones and robots process data in real time and adapt missions around anomalies.
- Leverage open tools but own your stack: use open-source models and agents where possible, fine-tune them for your domain, and understand their behavior.
- Treat AI as a force multiplier, not a replacement: focus it on drudgery and data triage so humans can focus on judgment and strategy.
As Elwood put it, no degree program can teach this fast enough. “If you want to get into it, all you have to do is just dive in,” he recommended. “Make a ChatGPT account, learn how it works, talk to it…then ask it: if I want to be considered an expert in artificial intelligence, what do I need to learn, how do I need to learn it, what do I need to download onto my computer without putting myself at risk?” In his view, self-directed experimentation, much like the culture at The Hive, is the best way to stay current.
The future will belong to those willing to roll up their sleeves, open a terminal and learn by building. That’s exactly what’s happening every day at The Hive, inside Buffalo3 and across the broader autonomy ecosystem.
For listeners, readers and innovators who want to move from interest to implementation, watch the Dawn of Autonomy episode with Johnny Ryan and Garett Elwood, Part 1 of 4 of the GrandSKY Aerospace Innovators Series.