President Donald Trump signed Executive Order 14365 on December 11, declaring that the US should maintain “global AI dominance” through a “minimally burdensome” national framework – warning that state-level AI regulations are becoming an obstacle to the federal government. The order introduces an AI Litigation Task Force to challenge state laws and identify “onerous” statutes, threatening that states could lose access to certain federal funding. The White House casts the decision as a necessary move against a messy “patchwork” of regulations across the country, while critics call it a strategy designed to halt real oversight before Congress passes anything at all. So, is this really about national coherence, or is there a hidden motive?

What the Order Actually Does (and Doesn’t)
The executive order directs federal agencies to identify, challenge, and neutralise state-level AI regulations deemed “burdensome” or inconsistent with a future national framework. The Department of Justice is also authorised to contest state laws in court, signalling that federal funding – such as technology or broadband grants – could be withheld from states attempting to continue with their own AI rules.
In real terms, this order does not create a new comprehensive federal AI law or national framework. Instead, it freezes the regulatory landscape by discouraging states from acting at all. Until Congress passes legislation, Washington effectively becomes the sole gatekeeper – where previously, responsibility was spread across states that acted independently of one another according to their own interests. The upshot is that there are still no clear rules, yet states are being told not to fill the regulatory vacuum in the meantime.
The Official Case: Avoiding a Patchwork of Rules
Publicly, the administration’s argument is simple. More than two dozen states have already passed AI-related laws on topics such as training data disclosure, consumer protections, and transparency, which is leading to what the White House calls a “patchwork” of compliance obligations.
From this perspective, uniformity would increase competitiveness. A single federal approach would allow US companies to scale faster and compete with centralised systems in China, where AI regulation is nationwide and tightly aligned with the country’s priorities. Fragmentation, Washington claims, risks slowing innovation and driving investment elsewhere.
This particular framing has strong backing from the tech industry, which has long warned that differing state rules could raise costs and create legal uncertainty.
Unspoken Context: Congress Failed to Act
What’s not being acknowledged by the government is the reason states stepped in to regulate AI themselves in the first place. Congress has been unable, unwilling, or both to pass meaningful AI legislation. For years, lawmakers have held hearings warning about deepfakes, algorithmic bias, labour displacement, and surveillance risks, yet comprehensive federal laws have repeatedly stalled. In that vacuum, states did what they felt they needed to: they acted.
The executive order therefore reads less like a response to regulatory chaos and more like a reaction to Congress’s own embarrassing inaction. Rather than setting national standards through legislation, the administration is opting to block others from setting any standards at all.
Who It’s Really Targeting
The order avoids naming specific statutes, but the targets are obvious. California’s requirements on AI transparency and training data disclosure, along with Colorado’s rules governing discrimination in “high-risk” AI systems, are two of the big ones. These laws go far beyond voluntary guidelines and impose enforceable obligations – precisely what industry leaders have been pushing hard against.
Many of these laws focus on accountability – disclosure of data sources, testing for bias, protecting consumer rights when automated decisions cause harm – rather than acting as sweeping prohibitions. But, in an international race to AI dominance, they introduce unwelcome (even if healthy) friction.
Can – or Should – the President Even Do This?
The executive order arguably undermines the very principles of federalism by attempting to nullify state laws without explicit congressional authorisation. Civil liberties groups and consumer advocates also warn that sidelining existing state regulations will weaken protection against algorithmic discrimination, privacy violations and unsafe AI deployments in sensitive domains. They also argue that state experimentation has historically driven stronger national standards, from environmental protection to data privacy.
But there’s also a basic constitutional problem. Executive orders can direct federal agencies, but they cannot repeal state laws – only Congress can explicitly pre-empt state authority.
Legal scholars argue the order rests on shaky ground, especially where it hints at financially punishing states for passing laws the administration dislikes. Courts have repeatedly ruled that funding threats must be authorised clearly by Congress, not improvised through executive action.
In other words, the order may succeed politically – by slowing state action – even if it fails legally.
Why It’s So Convenient for AI Industry Leaders
The order aligns neatly with the priorities of major AI firms. Slowing down regulation and centralising negotiations means the technology can continue to develop rapidly in the absence of proper rules. It also makes lobbying easier for AI companies, which can target a single federal regulator rather than 50 state legislatures.
So, while the administration talks about “innovation” and “competitiveness”, the most valuable outcome for the tech giants is time. Time for companies to entrench themselves, normalise AI systems across society, and develop quickly before strict, central regulation kicks in.
Final Thought
Who gets to define AI governance first: state legislators reacting to visible harms, or federal officials prioritising scale, speed and geopolitical competition? If Washington truly believes a single federal framework is essential, will it use this moment to produce clear, enforceable rules – or will it simply become a convenient way to keep the field open for rapid deployment while everyone argues about jurisdiction?