By: Dawn Zoldi
Geospatial infrastructure has moved from the back office to the front line of national security. At this year’s GeoBuiz Summit, the “Sovereign, Secure, and Resilient Geospatial Infrastructure for National Security” panel argued that satellite constellations, data platforms and cloud environments now underpin defense readiness, disaster response and trusted government operations…and that the community still underestimates both its power and its risk.
From Hidden Enabler to Mission Backbone

Moderator Ronda Schrenk, CEO of the United States Geospatial Intelligence Foundation (USGIF), opened by explaining that geospatial platforms sit at the nexus of commercial cloud, hyperscale services and classified missions. That convergence creates “enormous opportunity,” but it equally creates “enormous risk” when systems become stressed by cyber incidents, contested space domains or geopolitical crises.
Schrenk framed sovereignty as no longer being about where data lives, but about who controls it, who can access it, how it is protected and whether it remains available when things go wrong. Resilience, she emphasized, must be measured by whether systems continue to function under attack or disruption, not simply by architecture diagrams.
Policy Blind Spots and Nascent Congressional Awareness
Romel Nicholas, chairman of ERA Government Affairs and a former Senate staffer, described a policy environment badly out of sync with the sector’s strategic importance. To test congressional awareness, he called three Senate legislative directors, exactly the people who decide what gets introduced and co‑sponsored, and floated key terms from the panel’s agenda. “The answer with all three of them was, ‘Dude, what are you talking about?’” he said.

Nicholas said his own understanding of geospatial policy makes him “the exception to the rule, not the rule,” on the Hill. Congress, he noted, is as comfortable talking about “data” and “cloud” today as it was unfamiliar with those terms a decade ago. Geospatial, by contrast, sounds in the halls of Congress the way cryptocurrency and Bitcoin sound in Washington today: everyone knows they matter, but no one is sure what their position should be.
In his view, the United States is drifting toward one of two unhealthy extremes: either hyper‑restrictive notions of sovereignty that demand all data stay within national borders in hardened facilities, or a hands‑off stance that masquerades as “pro‑business” but is really inattentive. He argued for targeted federal guidance on cross‑border data flows, coalition operations and post‑mission data custody, particularly for multilateral defense and intelligence sharing, where today’s patchwork rules leave both agencies and companies exposed.
Nicholas was blunt about the political opportunity cost of staying quiet. The geospatial sector, he suggested, is “considerably” larger than the refining industry. Yet a typical Senate staffer is far more worried about crossing refinery lobbyists than the “nerds, wonks and lesser‑known companies” in geospatial. The real inflection point, he argued, will be whether this community chooses to flex its muscle in Washington, and whether it does so with focused, unified priorities rather than a grab bag of “asks.”
Disaster Response, Mission Sovereignty and a New Space Stack

From the vantage point of civil missions and disaster response, Brian Collins, Executive Director of Earth Fire Alliance, warned that resilience is actually moving backward in some domains. Global disaster management still leans heavily on systems built for other purposes such as weather, national security or legacy programs, which have been repurposed over time for fire, flood and civil protection.
“We were leveraging someone else’s system and never asked a question,” he said. When those systems stop being funded or supported, civil users look around and ask, “Why aren’t you doing that?” even though fire, flood and civil protection were never those systems’ primary mission.
Collins argued for what he called “mission sovereignty.” If a mission is critical, whether economic security, wildfire response or flood management, then that community has a sovereign responsibility to advocate for its own dedicated infrastructure, rather than riding along on others’ platforms. The historical excuse was cost and complexity. That excuse, he said, is gone.
Today, Earth Fire Alliance, a nonprofit, has gone from startup contract to first satellite on orbit in 12 months, with a more advanced system than most legacy platforms and a pipeline of additional spacecraft…all at a fraction of historical program costs. Synthetic aperture radar (SAR) and specialized fire‑monitoring satellites are now accessible at scales that were unthinkable a decade ago.
“We no longer have to leverage someone else’s system. We can now build systems ourselves,” Collins argued. That shift changes the definition of assured access, from hoping a weather or defense satellite continues to be funded, to taking direct ownership of capacity tailored to fire or other civil missions.
But technical capability is not enough. Collins warned that communities that fail to advocate will end up asking, too late, “How did we get to a point where GOES‑Next is not going to launch?” In his view, governments are increasingly prepared to partner with industry and nonprofits to co‑develop resilient systems. The remaining gap is user communities stepping up to define requirements and demand continuity.
Cloud, Classification and Rethinking What Really Belongs “High Side”
From the hyperscale cloud perspective, Sean P. Batir, global head of mission innovation for the National Security Division at Amazon Web Services, focused on how secure cloud regions and network designs enable both resilience and sovereignty across classified and unclassified missions. He pointed to AWS’s dedicated, isolated regions for U.S. national security customers, underpinned by hardened Nitro instances that prevent unauthorized access and align with strict accreditation requirements.
Those same design principles, he noted, ride on top of a massive global backbone of 39 commercial regions that already serve everything from streaming video to retail, and that could be more fully leveraged by governments for unclassified and lower‑sensitivity geospatial workloads. Batir highlighted emerging initiatives like the Intelligence Community’s data consortium, which aims to rationalize procurement and sharing in an era of data abundance rather than continuing today’s “multiplicity of procurements” that waste time and taxpayer dollars.
In his current global role, Batir sees a sharp contrast between U.S. instincts and those of other nations. Many foreign partners are more open to relying on the global backbone for certain classes of data, while the U.S. has tended to default to building more isolated regions and additional domestic data centers. He urged a more nuanced classification and risk conversation. “Does this belong on the high side?” should be asked against a concrete test. “Would exposure seriously damage U.S. interests, or is the same information already on the front page of major newspapers?” he asked.
Without that recalibration, he warned, the U.S. risks missing out on lower latency, greater scale and more cost‑efficient infrastructure that allies are already exploiting. For Batir, the key policy question moving through 2026 is how to compare, evaluate, and down‑select among emerging models and architectures so they deliver measurable operational impact, rather than just consuming compute.
Data Abundance, Trust, and the Analyst on Day One

If infrastructure is the backbone, decision advantage still lives, or dies, with the analyst. Jesse Kallman, founder and CEO of Danti, framed the central problem as data contextualization, not data scarcity. Today, an experienced wildfire expert might know how to pull information from ViaSat, NASA, government systems and commercial APIs, but “the other 98% of the world” has no idea these pipelines exist.
Danti’s mission, he explained, is to let “the firefighter sitting in a truck” ask a direct question, such as how a wildfire might impact critical infrastructure, schools, or police, and have the system orchestrate relevant data across NASA, NOAA, national archives, commercial repositories and even open internet videos of fire locations. Most end users, he stressed, are not 15‑year GEOINT veterans, and the system must respect that reality.
That requirement collides with an equally urgent need: preventing trust erosion as analysts tap into dozens of heterogeneous sources. In applied deployments with real analysts, Kallman’s team has learned that provenance and explainability are non‑negotiable. Users need to know where an image came from, whether direct from a trusted provider, via an open API or scraped from the public internet, and which elements of a result are deterministic versus generated.
“We had to put a lot of thought into how you present information to the user—its entire path to get to you,” he said. But oversharing raw provenance can backfire. Drown analysts in caveats and metadata and they may ignore the system altogether out of confusion or fear of making a wrong call. The goal, Kallman argued, is a middle ground where systems surface the right context, filter aggressively and deliver a condensed set of insights the analyst can actually act on.
Schrenk reinforced the stakes from personal experience. Most analysts are junior, and “it’s so scary to be the analyst” when facing complex radar or multi‑source products with limited experience.
Kallman added that workforce dynamics are compounding the challenge. Senior experts are retiring, and new personnel, such as Space Force operators, may arrive on a mission straight out of school, not even knowing what “geo” stands for. Yet they are delivering reports to combatant commands in their first week.
Technical and Operational Risk: Chain of Custody and Human Context
Collins distinguished between technical and operational risks in this new environment. Technically, the geospatial community has moved from coarse‑resolution fire detection at hundreds of meters (early MODIS‑era products) to global systems that can detect five‑meter‑by‑five‑meter fire events. That represents orders of magnitude more data than any human decision‑maker can absorb at the tempo required for modern wildland fire operations.
As a result, automated processing and model‑driven workflows are no longer optional, and the “chain of custody” for both operational and training data streams becomes critical. If a model is trained on one pattern of data and then exposed to a different stream in operations, subtle mismatches can lead to very real operational errors, especially when the person asking the question is a fire battalion chief, not a GEOINT specialist.
Operationally, Collins worries about a widening gap between the questions decision‑makers want to ask and what systems have actually been prepared, and validated, to answer. With a human GIS analyst, a fire chief can sit down and negotiate intent: “Here’s what I really need.” With a machine interface, misalignment may be harder to detect and potentially more dangerous.
In both the civil and national security spheres, that tension between automation, chain of custody and human comprehension may prove to be one of the most under‑appreciated risks of the next decade.
The Imperative: Speak Up, Standardize and Shape the Rules
When asked to identify the single most important decision over the next few years, panelists continually returned to the concept of “agency,” both human and institutional.
Batir focused on disciplined evaluation of emerging models and architectures, ensuring each agency can match the right tools to its users and missions and capture retiring tradecraft inside systems that amplify junior analysts rather than overwhelm them.
Kallman argued that geospatial providers must stop treating every problem as a “geo problem” and instead design products that plug seamlessly into inherently multimodal environments, where imagery is just one signal among many.
From the civil side, Collins called on user communities to shed their traditional passivity (“just give me change detection”) and become vocal, precise advocates for the decisions they need to make and the continuity they require.
Nicholas returned to Washington. The pivotal choice, he said, is whether this industry flexes its political muscle, and if so, whether it does so with clear priorities that tie sovereign, secure and resilient geospatial infrastructure directly to economic strength and national security.
The panel’s overarching message: geospatial infrastructure is now national infrastructure. Whether it remains sovereign, secure and resilient will depend less on technology than on whether the community can align users, policymakers and providers around concrete missions…and speak with enough clarity and volume that no one on the Hill (or anywhere else) can say, “Dude, what are you talking about?”