Idaho state agencies are using software systems right now to screen benefit applications, flag tax returns, score risk in child welfare cases, and decide how law enforcement resources get allocated. Most of these systems use some form of machine learning or automated decision-making. None of them were debated in the legislature. None of them are listed in a public inventory. And when one of them gets it wrong, there is no one in state government whose job it is to answer for it.
The real question is simpler than most policy people make it out to be: when it goes wrong, who answers for it?
Right now, the answer is nobody.
This Is About Government Power, Not Technology
I want to get something out of the way early. This has nothing to do with private companies building AI products. That is innovation, and Idaho should stay out of its way. This also has nothing to do with Palantir, data centers, federal surveillance, or whatever Washington is doing this week. Those are different fights.
This is about one narrow thing: the state government of Idaho already has coercive power over your benefits, your property, your taxes, your family. When that government starts running software that makes those decisions faster and at a scale no human caseworker could match, the question of who approved it and who is watching it is not optional. It is basic.
Conservatives have understood this for decades about every other kind of government power. We do not let agencies spend money without appropriations. We do not let regulators write rules without public comment. But somehow an agency can deploy an automated system that affects thousands of Idahoans and nobody is required to disclose it, justify it, or take responsibility for it.
That is not a technology problem. That is an accountability problem.
What Idaho Should Require
The fix does not require a new agency, a new budget line, or a single additional regulation on private business. It requires three things that any serious conservative should support.
Visibility. If a state agency is using an automated system that touches citizen data or shapes government decisions, it should be on a public list. What the system does, what agency uses it, what vendor built it, and what decisions it influences. This is government transparency. We already require it for contracts and expenditures. Software systems that affect people’s lives should not get a pass because they are digital.
Responsibility. One office in the executive branch should own this. Call the role whatever you want. What matters is that when an automated system produces a wrongful denial, a false flag, or a misallocation, there is a name and a phone number, not a vendor contract and a shrug. Right now, agencies blame vendors, vendors point to specs, and the citizen who got harmed has nowhere to go. One designated point of responsibility fixes that.
Appeal. If a government decision affecting your benefits, your property, or your family was shaped by an automated system, you should be told. And you should have a clear way to get a human being to review it. This is not a new right. This is due process, already guaranteed by Idaho’s constitution. The only thing missing is a mechanism to enforce it when the decision-maker is software instead of a person.
That is the whole proposal. Visibility, responsibility, appeal. Three things. Not a regulatory empire.
Why States Should Act First
If you are skeptical of government, I understand the instinct to leave this alone. Another official, another mandate, another report nobody reads. But consider what happens if states do nothing.
State agencies keep adopting these systems anyway. They do not need your permission. They are already doing it. The question is not whether government will use automated decision-making. That ship has sailed. The question is whether it happens with any structure at all or whether each agency does whatever its IT vendor suggests.
And if states sit this out, Washington fills the gap. That means a federal framework written for California and New York, applied to Idaho whether it fits or not. Probably heavier, probably broader, probably with reporting requirements that make no sense for a state our size. That is the actual alternative to acting now.
A lean state-level accountability structure is how Idaho keeps control of this on its own terms. That is the conservative position. Not because we love government. Because we take government power seriously enough to insist that someone is always answerable for how it is used.
The software is already in the building. The only question is whether anyone is accountable for how it gets used on the people of Idaho.
Patrick J. Wolf, PhD, is the Executive Director of the Institute for American Manufacturing & Technology (IAMT), based in Post Falls, Idaho. For a deeper technical treatment of state AI governance structures, see Morgan Dixon’s research brief “Architecting the Machine-Readable State” published through IAMT’s Aegis Institute.