By: Dawn Zoldi
Utilities no longer simply keep the lights on and the taps flowing. They have rebuilt their operational DNA around data that moves, learns and predicts. At GeoBuiz Summit 2026, panelists representing Denver Water, NV5 and Cyient explained how that change has reshaped workflows, risk and revenue protection in measurable ways.
Moderator Jeremiah Karpowicz, Content Director for DTECH, opened with this observation: utilities have spent a decade getting good at collecting data; the struggle now is to make that data meaningful inside real-world workflows rather than on static dashboards that “look good” but don’t actually answer operational questions. Across water, energy and communications, that means evolving digital twins from pretty 3D models into operational systems that drive predictive maintenance, outage prevention and asset decisions in real time. He framed the session for the DTECH audience, but the panelists offered lessons that reach well beyond the utility industry.
Garbage In, Garbage Out: Getting the Data House in Order
For Denver Water, the biggest barrier to sophisticated analytics has been basic data governance. Director of Enterprise IT John Nolte came into his role from the GIS side and quickly discovered that, despite more than a century of operations, much of the organization’s critical infrastructure history was effectively unusable for modern analysis.

The simple question, “Are we even collecting the right data?” exposed just how deep the problem ran. Denver Water designs its pipes for 100-year lifecycles, which means it should be replacing about 1 percent of its mains annually to stay ahead of failures. In practice, resource constraints have kept that number closer to 0.27 percent, an implied replacement cycle of roughly 370 years, all while the cost of failure keeps rising. Recent major breaks ran into the $10–15 million range once flooded neighborhoods, basements and vehicle damage were factored in.
The utility brought in an outside firm to build a predictive model for main breaks. The verdict was that only about 30 percent of its historical break data was accurate enough to support meaningful AI. Breaks had been logged to intersections rather than to specific assets (four pipes may meet at a crossing, with no way to know which one failed), which rendered decades of records nearly useless for machine learning. “Garbage in, garbage out” stopped being a cliché and became a real budget problem. It forced Denver Water to launch a formal data governance program as a prerequisite for any serious AI effort.
That groundwork paid off most visibly in Denver Water’s Lead Reduction Program, where the utility used machine learning to identify and prioritize lead service line replacements. Out of roughly 320,000 customer taps, it had hard documentation for only about 800 service lines, a needle-in-a-haystack challenge with major public health and regulatory implications.
By fusing 45 different criteria, from asset age and construction era to legacy “master plumber books,” microfiche records and GIS layers, the team trained models to predict which services were likely to be lead and where field crews should dig first. The real turning point, Nolte said, came when they started potholing and found the model was right about 95 percent of the time. That field‑validated accuracy convinced leadership that data cleanup and analytics were not abstract IT investments, but powerful levers for compressing timelines and cutting capital waste.
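The panelists did not disclose Denver Water’s model internals, so the sketch below only illustrates the general pattern: fuse whatever attributes exist for each tap into a feature table, train a classifier on the small set of documented services, and rank undocumented taps by predicted lead likelihood. The column names, file names and random-forest choice are illustrative assumptions, not the utility’s actual pipeline.

```python
# Minimal sketch of lead service line prediction from fused records.
# Feature names, files and the model family are illustrative assumptions.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import precision_score

# Hypothetical training table: one row per service line with known material.
taps = pd.read_csv("known_service_lines.csv")  # assumed file
features = ["install_year", "neighborhood_era", "main_diameter_in",
            "plumber_book_flag", "permit_material_code"]  # illustrative subset of the 45 criteria
X = pd.get_dummies(taps[features], columns=["neighborhood_era", "permit_material_code"])
y = taps["is_lead"]  # 1 if documented lead, 0 otherwise

X_train, X_test, y_train, y_test = train_test_split(X, y, stratify=y, random_state=0)
model = RandomForestClassifier(n_estimators=300, class_weight="balanced", random_state=0)
model.fit(X_train, y_train)

# Held-out precision is the code's stand-in for field validation:
# how often a "dig here" prediction actually turns up lead.
print("precision on held-out taps:", precision_score(y_test, model.predict(X_test)))

# Score the undocumented taps and rank streets for potholing.
unknown = pd.read_csv("undocumented_service_lines.csv")  # assumed file
X_unknown = pd.get_dummies(unknown[features], columns=["neighborhood_era", "permit_material_code"])
X_unknown = X_unknown.reindex(columns=X.columns, fill_value=0)
unknown["lead_probability"] = model.predict_proba(X_unknown)[:, 1]
dig_plan = unknown.sort_values("lead_probability", ascending=False)
print(dig_plan[["street", "lead_probability"]].head())
```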
Denver Water originally projected a 15-year horizon for the program; with predictive targeting, it is already more than halfway done in just five years. Along the way, that precision helped avoid mobilizing crews in “every third house” patterns that burn time and money, allowing contractors to work street-by-street and neighborhood-by-neighborhood instead. A widely cited claim that analytics saved $50 million has since been softened internally, but Nolte said the real value is bigger than any single figure: the organization now has a data-driven way to defend decisions, timing and prioritization for one of its most visible programs.
More Than Pretty Pictures: Digital Twins That Actually Work for Crews
On the geospatial side, Paul Braun of NV5 argued that digital twins only matter when they become places where people actually work, not just environments where data is stored. NV5 operates across acquisition (“planes in the air, vessels in the water, mobile units”), analysis and delivery, combining classic surveying and mapping with a growing software business that puts those products into the hands of utilities and municipalities.

Braun pointed to the City of Salem, Oregon, where a full 3D digital environment of facilities allows staff to:
- See historical business data (think: work orders, inspections and interventions) attached to specific assets in context.
- Let a new 24‑year‑old engineer “inherit” decades of institutional knowledge that previously lived only in the head of a retiring 63‑year‑old colleague.
- Create work orders directly from the 3D twin, so that what field crews see on screen matches what they’ll encounter in the field (one way that asset-to-record linkage might be modeled is sketched below).
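Braun did not describe Salem’s data model, but the core idea, a twin asset that carries its own business history and can spawn work orders tied to the same ID the 3D scene uses, can be sketched with a minimal structure like this. All field names and identifiers are hypothetical.

```python
# Illustrative sketch of an asset record that carries its own history inside a digital twin.
# Field names and the WorkOrder shape are assumptions, not Salem's or NV5's actual schema.
from dataclasses import dataclass, field
from datetime import date

@dataclass
class WorkOrder:
    order_id: str
    opened: date
    summary: str
    status: str = "open"

@dataclass
class TwinAsset:
    asset_id: str                         # same ID the 3D model uses
    asset_type: str                       # e.g. "pump", "valve", "manhole"
    location: tuple[float, float, float]  # x, y, z in the twin's coordinate frame
    inspections: list[str] = field(default_factory=list)
    work_orders: list[WorkOrder] = field(default_factory=list)

    def create_work_order(self, order_id: str, summary: str) -> WorkOrder:
        """Open a work order directly against this asset, so field crews
        see the same object on screen and in the field."""
        wo = WorkOrder(order_id=order_id, opened=date.today(), summary=summary)
        self.work_orders.append(wo)
        return wo

# Usage: a new engineer inherits the asset's full history just by loading it.
pump = TwinAsset("PS-014-PUMP-02", "pump", (512433.1, 4926710.8, 57.3),
                 inspections=["2019 vibration check", "2023 seal replacement"])
pump.create_work_order("WO-2026-0147", "Investigate bearing temperature alarm")
print(pump.work_orders[0].summary)
```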
That kind of digital twin is less a technology showcase and more a workforce continuity tool. When integrated with AI‑enabled analytics, it becomes the operational front end for predictive maintenance programs, surfacing which assets demand attention, why and what’s been tried before.
Still, Braun cautioned against uncritical automation. His “stump speech” had two planks: demand that AI “show its work,” and “trust but verify.” He applauded emerging efforts to formally validate AI outputs, but emphasized that utilities must insist on traceability from answer back to data and model assumptions. In his words, running a model that’s right “about 50/50” is little better than flipping a coin. Deploying that into production is akin to “hammering nails with a screwdriver” just because the tool is close at hand.
Shared Utility Data: Vegetation, Wildfire and the New Normal of Risk
Whereas water utilities use predictive analytics to avoid catastrophic bursts, electric utilities in fire‑prone regions turn to similar techniques to prevent their infrastructure from starting fires.
Cyient’s Souvik Bhattacharya described how transmission operators in California are evolving from crews making manual, experience-based calls on which trees to trim, to geospatially precise, model-driven vegetation plans; a simplified version of that prioritization logic is sketched after the list below. Utilities now:
- Map individual trees relative to lines and structures, then model their growth rates over time.
- Integrate weather, terrain and ground cover data to understand how vegetation and climate interact with infrastructure risk.
- Prioritize trimming that protects both line clearances and the safety of vegetation crews who often work in extremely hazardous exposure zones.
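Cyient did not share its models or weights, so the following is only a toy illustration of the shape of such a prioritization: project each tree’s growth against the required clearance, then scale the urgency by fire-weather and crew-hazard factors. Every threshold, weight and growth rate here is an invented assumption.

```python
# Simplified vegetation-management prioritization, illustrating the general approach the panel
# described. Growth model, weights and thresholds are invented for illustration only.
from dataclasses import dataclass

@dataclass
class Tree:
    tree_id: str
    species: str
    height_m: float
    growth_m_per_yr: float       # assumed per-species growth rate
    clearance_to_line_m: float   # current distance to nearest conductor
    fire_weather_index: float    # 0-1, from weather, terrain and ground-cover layers
    crew_hazard_score: float     # 0-1, access and exposure difficulty for trim crews

def months_to_encroachment(tree: Tree, min_clearance_m: float = 3.0) -> float:
    """Rough projection of when growth erodes the required clearance."""
    margin = tree.clearance_to_line_m - min_clearance_m
    if margin <= 0:
        return 0.0
    return 12.0 * margin / max(tree.growth_m_per_yr, 0.01)

def trim_priority(tree: Tree) -> float:
    """Higher score = trim sooner. Urgency from encroachment, scaled by fire weather,
    nudged upward where crew exposure is worst so hazardous work can be bundled."""
    urgency = 1.0 / (1.0 + months_to_encroachment(tree))
    return urgency * (0.6 + 0.4 * tree.fire_weather_index) * (1.0 + 0.2 * tree.crew_hazard_score)

trees = [
    Tree("T-1041", "blue oak", 14.2, 0.3, 3.4, 0.8, 0.6),
    Tree("T-1042", "gray pine", 22.5, 0.6, 2.7, 0.9, 0.3),
]
for t in sorted(trees, key=trim_priority, reverse=True):
    print(t.tree_id, round(trim_priority(t), 3))
```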
These digital workflows pull in new data sources as well, from drone‑based inspections of transmission towers to rich 3D models that multiple departments can reuse. One team may commission data for vegetation management, only to have asset engineering or risk groups repurpose the same datasets for structural assessments or regulatory reporting. This turns a one‑off survey into an enterprise asset.
For Bhattacharya, this cross‑functional reuse is where “digital twin” becomes an operational reality. It is not a monolithic platform for a single purpose, but a shared, living data environment where planning, operations and risk teams can see the same truth from different angles.
Operators, Not Algorithms, Still Own the Risk

Despite the sophistication of all of these tools, none of the panelists believed automation will replace operators any time soon, especially in high‑consequence domains like water and power. Denver Water’s new state‑of‑the‑art treatment facility is a case in point. Twenty operators once managed the plant. Now just six run it, with only one physically on site at any given time. The headcount was not simply eliminated, though. The organization redeployed people so the remaining operators now carry a different kind of responsibility.
The problem, Nolte noted, is that when something goes wrong in such a highly automated environment, the human on duty may never have physically “turned the valves” or run the process by hand. To close that gap, Denver Water ingested 2,000–3,000 pages of operational documentation into an internal agent that allows operators to troubleshoot issues quickly, understand failure modes and know when to escalate, without pretending that the system can operate itself.
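Nolte did not name the tooling behind that internal agent, but the common grounding pattern, indexing the documentation and retrieving the most relevant passages for each operator question, looks roughly like the sketch below. The directory layout, chunking and TF-IDF retrieval are stand-in assumptions, not Denver Water’s actual stack.

```python
# Minimal sketch of retrieval over plant documentation, the kind of grounding layer that
# could sit behind an operator-facing assistant. Files, chunking and TF-IDF retrieval are
# stand-in assumptions, not Denver Water's actual implementation.
from pathlib import Path
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Load the (assumed) exported O&M manuals and SOPs, split into passage-sized chunks.
chunks, sources = [], []
for doc in Path("plant_docs").glob("*.txt"):   # assumed directory of exported pages
    text = doc.read_text(encoding="utf-8")
    for i in range(0, len(text), 1200):        # crude fixed-size chunking
        chunks.append(text[i:i + 1200])
        sources.append(f"{doc.name}:{i // 1200}")

vectorizer = TfidfVectorizer(stop_words="english")
matrix = vectorizer.fit_transform(chunks)

def top_passages(question: str, k: int = 3) -> list[tuple[str, str]]:
    """Return the k most relevant passages with their source references,
    so the operator (or a downstream language model) can trace the answer back."""
    q_vec = vectorizer.transform([question])
    scores = cosine_similarity(q_vec, matrix).ravel()
    best = scores.argsort()[::-1][:k]
    return [(sources[i], chunks[i]) for i in best]

for src, passage in top_passages("filter backwash pressure alarm will not clear"):
    print(src, "->", passage[:120], "...")
```

Returning the source reference alongside each passage is the point: it preserves the traceability from answer back to document that Braun insists on, rather than presenting an unverifiable response.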
That impacts training. Rather than teaching operators every mechanical step, utilities now need to teach them how to interrogate digital systems, interpret model outputs and make judgment calls under uncertainty. Trade schools and college programs preparing the next generation of water and energy professionals will have to catch up by building a curriculum that sits at the intersection of process, data and software, rather than treating them as separate domains.
Braun echoed this concern from the vendor side. Younger staff may be adept at “pushing the buttons,” but without understanding the underlying math, physics or statistics, they risk putting flawed analytics into production with real customer and safety consequences. The answer, he argued, is not to slow innovation but to change internal communication patterns by sharing lessons, tools and failures openly so that “the rising tide raises all ships.”
Scaling Without Buying the Hype
All three panelists agreed that utilities trying to move from pilots to enterprise deployments should go slower than the vendors want, but faster than their legacy culture is comfortable with.
Nolte noted that utilities are relentlessly pitched on bespoke AI platforms. Denver Water is increasingly leaning the other way: it looks first to embed new capabilities into applications it already relies on, rather than “reinventing the wheel” with custom builds. If a billing system, GIS platform or asset management tool is adding robust analytics features, it often makes more sense to exploit that than to stand up an entirely parallel stack.
Bhattacharya warned against assuming technology alone will unlock value. Even the most advanced digital twin or AI model will stall if organizations treat them as isolated projects rather than shared infrastructure that everybody, from field crews to executives, can access and understand.
In closing, Braun offered a personal analogy that captured where utilities now find themselves. Parenting a son on the autism spectrum has taught him to communicate with extraordinary clarity, something he likens to the emerging discipline of “prompt engineering.” The way utilities ask questions of their data, their models and their digital twins will increasingly determine the quality of the answers they get back.
The future of utilities may depend less on having the flashiest AI, and more on asking better questions of cleaner data inside digital environments designed for the people who still own the risk.