The More Things Change, The More They Stay the Same: Why AI Context Engineering Has Brought Us Full Circle in the Public Sector
AI is back in the spotlight. Public sector agencies are rushing to explore what’s possible with agents, copilots, and foundation models. Every week, there’s another demo, another beta, another tool that promises to save time, cut waste, and generate insight on demand.
But if you zoom out, this moment looks familiar.
We’ve seen this kind of excitement before. In the big data era, the goal was to collect everything. More data meant more potential. But it didn’t take long to realize that fragmented systems, inconsistent formats, and unstructured documents turned that promise into a mess. The volume was there, but the value wasn’t.
Then came the analytics push. Dashboards, reporting, machine learning. Again, the potential was there. But progress hit a wall without proper pipelines, data governance, or aligned schemas. It took years of cleanup just to get usable insights out of the systems agencies had been running for decades.
Now, AI agents are here. And what’s happening?
Same story. New chapter.
AI agents depend on access to relevant, clean, well-structured data. If they can’t find it, reason over it, or connect it to the right context, they fall short. They hallucinate. They misinterpret. They give partial answers. And in public safety, justice, or health, that’s not just an annoyance. It has real-world consequences for some of society’s most vulnerable populations.
We’re repeating the cycle. And what makes it more urgent this time is the speed of deployment. These tools don’t need months of integration. They can be live in days. But their success still depends on the same thing it always has: the maturity of your data and the systems that surround it.
There’s something else worth recognizing. These models don’t just consume data. They create it. Every agent interaction, every output, every synthetic record used for testing adds to your footprint. If you’re already struggling to manage what you have, AI will accelerate that sprawl. It’s not just amplifying your signal. It’s also amplifying your noise.
That’s why I keep coming back to this: The more things change, the more they stay the same.
The tech evolves. The pain points stay familiar. If the data isn’t ready, the tool won’t perform. It’s that simple.
So the next phase of AI in government won’t be defined by who pilots the most tools. It’ll be defined by who builds the strongest foundation.