Budget bill could decimate legal accountability for tech

A provision in the House budget bill passed last week that would impose a moratorium on state AI laws places tech companies above the law, decimates federalism, and undermines predictability in business-to-business transactions, writes David Brody.

WASHINGTON, DC - MAY 22, 2025: US Speaker of the House Mike Johnson (R-LA) speaks to the media after the House narrowly passed a bill forwarding President Donald Trump's agenda at the US Capitol. (Photo by Kevin Dietsch/Getty Images)

Buried deep in the “Big Beautiful Bill” passed by the United States House of Representatives last week is a small provision that could blow up most state laws, good and bad, that regulate any use of technology. It might be intentional arson of federalism, or it might be sloppy drafting. But the effect is immense: states would be unable to enforce any non-criminal law in situations where someone used a computer, which constitutes most of modern life and commerce.

Because of the centrality of states in the US legal system, anyone could obtain legal immunity for almost any non-criminal act simply by inserting technology into their workflow. This is a serious threat to the rule of law.

This should be outrageous to anyone who cares about state sovereignty, including governors, state attorneys general, and state legislators. It will place tech companies above the law. It will break basic commercial regulations essential to stable markets. It will hamstring a state’s ability to protect its citizens or incentivize good tech to boost its economy. It will decimate federalism itself by restricting states’ ability to serve as laboratories of democracy.

The provision in question is styled as a 10-year moratorium on AI regulations by states, and has primarily been reported in the context of its effects on AI development. It looks like an AI lobbyist’s wish list. But it actually applies to any meaningful use of a computer in any context, and it does not have functional or sufficient exceptions beyond carving out state criminal laws. The enforcement of any state civil law regarding the use of a computer—even laws of general applicability—will be in jeopardy.

Some obvious consequences include blocking state laws that protect civil rights, privacy, consumers, and online safety. Do you want schools to ban smartphones in classrooms or provide technological accommodations to kids with disabilities? Sorry, you cannot. Did medical software lead to a misdiagnosis of your loved one? Did an insurance company’s algorithm deny your claims? Sorry, you have no recourse. But the effects are further-reaching. It would also impair enforcement of state civil laws against fraud or hacking, undermine the ability of companies to enforce contracts for technical services, and even block subsidies or other state laws seeking to promote the adoption or development of emerging technologies like AI. You cannot have a functioning communications network or digital commerce if you cannot trust that contracts are enforceable. It would introduce immense legal uncertainty and chaos into the market.

Let’s examine how the provision works and then what effect it would have if enacted into law. 

How does this bill break the internet? Take a close look at the text.

To understand how the so-called “AI moratorium” provision will destroy state laws and the internet, we have to closely read the literal meaning of its text—just as a court would.

The provision is in Section 43201 of the bill. That section, titled “Artificial Intelligence and Information Technology Modernization Initiative,” contains paragraph (c), titled “Moratorium.” The moratorium states:

In General.—Except as provided in paragraph (2), no State or political subdivision thereof may enforce, during the 10-year period beginning on the date of the enactment of this Act, any law or regulation limiting, restricting, or otherwise regulating artificial intelligence models, artificial intelligence systems, or automated decision systems entered into interstate commerce.

What does this mean? Let’s break it down piece by piece:

1. “Except as provided in paragraph (2)…” means there will be exceptions; we’ll come back to that. Teaser: most of the exceptions are broken.

2. “…no State or political subdivision thereof may enforce…” means any aspect of state governance, including litigation between private parties in state court. It does not just mean enforcement by a state attorney general, prosecutor, or other state actor. For example, in the landmark defamation case New York Times Co. v. Sullivan, the Supreme Court held that the federal constitutional protections applied even in state civil actions between private parties, because the state courts “applied a state rule of law which petitioners claim to impose invalid restrictions on their constitutional freedoms of speech and press.” This means this moratorium applies to contract law and similar private disputes, which are almost entirely governed by state law.

3. “…any law or regulation limiting, restricting, or otherwise regulating…” actually means “any law regulating.” Why? Well, “law or regulation” just means “law” because regulations are a subtype of laws. Then “limiting, restricting, or otherwise regulating” just means “regulating” because that final catchall phrase “otherwise regulating” expands the clause to apply to any form of regulation. So now we have a provision that means “any law … regulating.” Which also just means “any law” because “regulating” is what all laws do. This sounds banal, but it is important—the law in question does not need to be directed at AI or tech to be within scope. Laws of general applicability are included as well, because as soon as you apply such law to a covered system, you are “regulating” that system.

4. “...artificial intelligence models, artificial intelligence systems, or automated decision systems…” These terms are defined in the statute; we’ll come back to that in just a minute.

5. “…entered into interstate commerce” means Congress recognizes that it has authority under the Constitution only to regulate interstate commerce, and it cannot stop a state from governing these systems if they stay inside state borders. However, particularly in the context of the internet, AI, and other digitized commerce, this is not much of a limitation. Moreover, the Supreme Court has held that what constitutes interstate commerce is very broad and can include intrastate activities that have substantial effects on interstate commerce.

So, if we put together what we have so far, what does this provision mean? It reduces to this: For ten years, no state law may regulate [AI systems] that are not used exclusively in intrastate commerce, unless an exception is met.

If this sounds incredibly broad, that is because it is! This is what lawyers call “field preemption”—Congress is blocking states from governing anything that falls within this field. So, what is the field? Let’s look at the definitions of “artificial intelligence models,” “artificial intelligence systems,” and “automated decision systems.” It turns out we only need to analyze the last one, “automated decision systems,” because it is so broad that the other terms become mostly irrelevant.

(d) Definitions.—In this section:…

(4) Automated decision system.—The term “automated decision system” means any computational process derived from machine learning, statistical modeling, data analytics, or artificial intelligence that issues a simplified output, including a score, classification, or recommendation, to materially influence or replace human decision making.

Once we break it down, we will see that this definition ultimately means an automated decision system is “any meaningful use of a computer.”

1. “any computational process derived from… [list of things]” is broad. Everything a computer does is a “computational process.” So this part of the definition can apply to anything a computer does, depending on the rest of the definition.

2. “…machine learning, statistical modeling, data analytics, or artificial intelligence…” sounds limited, but it is not. “Data analytics” sweeps in all the other terms and then some. Interpreted literally, as courts are wont to do, almost everything a computer does (other than pure data storage) involves data analytics. A computer algorithm is a mechanized system to analyze a data input from a user or other source in order to determine what output to give in response. So this clause ultimately means any meaningful use of data by an algorithm, which is basically all computing, not just fancy AI algorithms.

3. “…that issues a simplified output, including a score, classification, or recommendation…” as a practical matter means any output from a computer. When a statute says “thing, including [list of examples]”, the list is illustrative, not exhaustive. So all that really matters here is the phrase “that issues a simplified output.” And all computer outputs are “simplified outputs” unless the computer is spitting out binary machine code.

4. “…to materially influence or replace human decision making.” The term “material” is common across many areas of law. In general, something is “material” if it has a not-inconsequential effect on a decision-making process. Something needs only to be a contributing factor, not a primary cause, to be “material.” So any computer output that is at all useful to the human decision maker will qualify as an output with a “material influence.” Which again means any meaningful use of a computer.

So, in summation, the term “automated decision system” boils down to “any meaningful use of a computer.”
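To make that concrete, consider a deliberately mundane sketch. Everything in it is a hypothetical illustration of my own, not anything drawn from the bill: a few lines of ordinary scripting, with no machine learning anywhere, that nonetheless arguably satisfy every prong of the definition.

```python
# Hypothetical illustration: a trivial script a landlord, lender, or HR office
# might run. No machine learning libraries, nothing "AI" in the colloquial
# sense, yet each prong of the definition of "automated decision system"
# arguably applies.

applicants = [
    {"name": "Applicant A", "income": 52000, "late_payments": 3},
    {"name": "Applicant B", "income": 61000, "late_payments": 0},
]

def score(applicant):
    # "any computational process derived from ... statistical modeling, data
    # analytics": a hand-tuned formula applied to the applicant's data.
    return applicant["income"] / 1000 - 10 * applicant["late_payments"]

for a in applicants:
    # "issues a simplified output, including a score, classification, or
    # recommendation": a single number plus an approve/deny label.
    a["score"] = score(a)
    a["recommendation"] = "approve" if a["score"] >= 50 else "deny"

# "to materially influence or replace human decision making": a reviewer who
# consults this ranking is, at a minimum, materially influenced by it.
for a in sorted(applicants, key=lambda x: x["score"], reverse=True):
    print(a["name"], round(a["score"], 1), a["recommendation"])
```

On this reading, a state law that reaches even this spreadsheet-grade script is “regulating” an “automated decision system” just as surely as a law aimed at a frontier AI model.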

This means, in turn, that the moratorium as a whole means: For ten years, no state law may regulate any meaningful use of a computer that is not used exclusively in intrastate commerce, unless an exception is met.

The exceptions are badly broken.

Congress would not blow up such a wide swath of state law so cavalierly, right? Surely they wrote a broad general rule because they also wrote meaty exceptions to pare it down? Well, I have bad news for you.

There are two categories of exceptions. Notably, there is a clear carve-out of state criminal law in subsection (c)(2)(B). State criminal laws regarding fraud, hacking, stalking, theft, etc., will remain unaffected, which is a good thing. Unfortunately, the same is not true for state civil laws.

The exceptions for state civil laws are broken in two independent ways. First, the statute uses “and” instead of “or,” which means all of the conditions on the list must be satisfied to trigger an exception, and the conditions do not fit together, so almost nothing can thread the needle. Second, the exception for laws of general applicability only kicks in if there is an analog equivalent to the computerized functionality being regulated. It will not apply to vast and novel swaths of modern digital commerce.

First, as a threshold matter before we even look at the substance of the exceptions, subparagraph (c)(2)(A) lists three requirements, and all three must be satisfied to get an exception. The subparagraph joins them with a conjunctive “and” rather than a disjunctive “or.”

(2) Rule of Construction.—Paragraph (1) may not be construed to prohibit the enforcement of—

(A) any law or regulation that—

(i) the primary purpose and effect of which is to [ease deployment of AI systems];

(ii) does not impose any substantive design, performance, data-handling, documentation, civil liability, taxation, fee, or other requirement on [AI systems] unless such requirement—

(I) is imposed under Federal law; or

(II) in the case of a requirement imposed under a generally applicable law, is imposed in the same manner on [non-AI systems] and [AI systems];

(iii) does not impose a fee or bond unless [it is reasonable and treats AI systems and non-AI systems the same].

The “and” between (2)(A)(ii) and (2)(A)(iii) is key. It means that a law only satisfies the exception if it satisfies all three requirements. It seems practically impossible to satisfy (i) and (ii) at the same time. The first provision is about giving tech special regulatory treatment, and the second provision is about treating tech the same as other areas of regulation. For example, general regulations like contract law that are essential to a functioning market cannot satisfy subparagraph (i), while most sector-specific laws seeking to incentivize tech adoption cannot satisfy subparagraph (ii).

Moreover, it does not seem like the “and” can simply be changed to an “or” while maintaining the probable legislative intent, because satisfying the subparagraph (iii) exception on its own would exempt all state laws unless they impose tech-targeted fees or bonds. That seems like it could be easily gamed. So these exceptions are a mess.
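One way to see how little survives the conjunctive reading is to treat clauses (i) through (iii) as simple true/false tests. The sketch below is my own rough paraphrase, not statutory language, but it shows why ordinary laws fail under “and” and why nearly everything would pass under “or.”

```python
# Rough, hypothetical paraphrases of clauses (i)-(iii); not the bill's text.

def exempt_as_written(law):
    # As drafted: the clauses are joined by "and," so every one must hold.
    return (law["eases_ai_deployment"]
            and law["generally_applicable"]
            and law["no_targeted_fee_or_bond"])

def exempt_if_or(law):
    # The hypothetical fix: any single clause would suffice.
    return (law["eases_ai_deployment"]
            or law["generally_applicable"]
            or law["no_targeted_fee_or_bond"])

# Ordinary contract law: generally applicable, no targeted fee, but its purpose
# is not to ease AI deployment, so it fails clause (i).
contract_law = {"eases_ai_deployment": False,
                "generally_applicable": True,
                "no_targeted_fee_or_bond": True}

# A state AI-adoption subsidy: eases deployment, but treats AI differently from
# everything else, so it fails clause (ii).
ai_subsidy = {"eases_ai_deployment": True,
              "generally_applicable": False,
              "no_targeted_fee_or_bond": True}

for name, law in [("contract law", contract_law), ("AI subsidy", ai_subsidy)]:
    print(name,
          "| exempt under 'and':", exempt_as_written(law),
          "| exempt under 'or':", exempt_if_or(law))
```

Under “and,” neither law escapes the moratorium; under “or,” both do, along with almost any law that does not impose a tech-targeted fee or bond, which is exactly the gaming problem described above.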

Second, the savings clause for laws of general applicability (subparagraph (2)(A)(ii)) appears to require treating AI systems the same as non-AI systems. But as shown above, the bill’s definitions of AI systems reduce to “any meaningful use of a computer.” So this exception clause should be read as “the requirement is imposed in the same manner on models and systems, other than meaningful uses of a computer, that provide comparable functions to meaningful uses of a computer.” Good luck to the general counsels and courts that have to decipher that.

This means that even if the bill had not botched its conjunctions, this general applicability clause would only apply to laws where a function could be accomplished without a computer. So, for example, laws prohibiting retail stores from refusing service on the basis of race that apply equally to brick-and-mortar and online commerce may satisfy this condition. But state laws regulating the many areas of modern life where there is no analog equivalent will not survive. A law prohibiting landlords from using software to raise rents unfairly may get preempted. Determining which applications of which laws should get preempted or not preempted will become extremely fact-specific, require a lot of litigation to sort out, and be onerous for legal compliance.

When you combine the moratorium language, definitions, and exceptions, the only reasonable interpretation is that the moratorium blocks all enforcement of state civil law involving the meaningful use of a computer.

The effect of this provision will be to place tech companies effectively above civil law at the state level. It will also break many aspects of state civil law that are necessary to commerce generally, particularly online commerce, and to the basic functioning of communications networks like the internet. It will create dramatic uncertainty. Even if the courts ultimately pare back the scope of the moratorium, it will take at least five to ten years of litigation across many jurisdictions to sort it out. In the meantime, businesses will not know what the law requires.

Preempting state civil law will break a vast swath of important things.

What is the scope of this bill's effects on state civil law? It is hard to ascertain the outer limits. But here are some categories and examples of what this bill destroys or calls into question. The bottom line is that whether you want more or less state regulation of the tech sector, you will be disappointed by this moratorium. If you want to stop Big Tech abuses and protect consumers from scams, this moratorium is an impediment. And if you want to boost tech company growth and investment in your state, this moratorium is an impediment to that, too. 

  1. Civil rights protections. The moratorium will block the ability to enforce state civil rights laws against algorithmic discrimination and other tech-enabled inequities. These laws are crucial for ensuring everyone has equal opportunity in the modern economy. For example, Meta is currently being sued in the District of Columbia for illegal racial steering of higher education ads on Facebook and Instagram. On average, state civil rights laws are more protective than federal ones. For example, it is legal under federal law (but not most state laws) for a store to discriminate against women or Christians. State laws generally offer more protections for sexual orientation and gender identity. And state civil rights laws can be more tailored to the needs of different areas of commerce, whereas federal laws are blunter.
  2. Civil fraud and other consumer protections. State consumer protection laws are the first line of defense against scams, deception, fraud, and other unfair trade practices. They can be enforced by state attorneys general or private litigants. The Federal Trade Commission routinely collaborates with state attorneys general because no one agency, state or federal, can handle the scale of online harms to consumers. While states will still be able to prosecute some fraud criminally, much of it is handled through civil enforcement because the bar for criminal convictions is much higher than for civil liability. If the moratorium passes, consumers will lose key avenues of recourse for scams and fraud, decreasing consumer trust. Undermining consumer confidence will hamper the adoption and growth of new technologies.
  3. Contract law. The states generally govern contract law. After this moratorium, there could be enforceability questions for any commercial contract that involves the provision or use of a computerized system. Businesses cannot operate without enforceable contracts. This uncertainty can break all kinds of things in highly unpredictable ways. It could break the internet. It calls into question interconnection agreements between telecommunications companies, contracts with intermediary service providers, and terms of service agreements with end users.
  4. Privacy laws. There is no federal comprehensive privacy law. What protections exist are largely at the state level, such as the Illinois Biometric Information Privacy Act, the Maryland Online Data Privacy Act, the Washington My Health My Data Act, and the collection of privacy laws enforced by the California Privacy Protection Agency. The moratorium will eliminate all of these laws and protections, and states currently debating new laws will be stopped cold. This will have a devastating impact on states trying to protect reproductive privacy.
  5. Protections against corporate surveillance. Data brokers and private surveillance companies like Palantir are contracting with federal agencies to help deport immigrants and with state governments for law enforcement purposes. If these companies violate people’s rights, directly or indirectly, they could not be held accountable under state law.
  6. AI, broadband, and tech subsidies. Ironically, because the moratorium’s exceptions are broken, it would also preempt any state law that seeks to subsidize tech research and development, including subsidies for tech procurement modernization and infrastructure upgrades. It would hamper state efforts to close the Digital Divide and increase broadband availability in low-income, rural, and tribal areas. It would also block state efforts to understand and support the adoption of beneficial uses of AI.
  7. Protections for children and schools. The moratorium will block the many efforts at the state level to protect kids from online harms and improve educational environments. While kids' online safety legislation can be a mixed bag and some state laws are quite problematic, we do not want to block states from making any effort to protect children from intentionally or negligently harmful technologies. This moratorium could even block school board regulations that prohibit the use of smartphones in classrooms or reduce tech-enabled bullying.
  8. Property and likeness rights. State law is overwhelmingly responsible for governing private property, both real estate and personal property. For example, Tennessee recently enacted the ELVIS Act to protect individuals’ property rights in their likeness, voice, and image against AI deepfakes. In addition, the moratorium could disrupt zoning laws and other property regulations (or waivers) as applied to data centers, private space company facilities, and other high-tech enterprises.
  9. Accessibility requirements. Many technologies are inaccessible to people with disabilities or limited English proficiency. States play a vital role in innovating new forms of digital accessibility and supporting deployment of assistive devices, a role that this moratorium will undermine.
  10. Limits on facial recognition. States and municipalities are passing laws restricting the use of facial recognition technologies by law enforcement and private businesses. The moratorium would preempt all these laws. This would then raise questions about what rules, if any, restrict law enforcement surveillance with these technologies.
  11. Voting rights. Increasingly, voter suppression is going digital, and state attorneys general are the front line of defense against it. For example, after two men sent voter intimidation robocalls targeted at Black voters in several states in the 2020 election, the New York Attorney General brought a civil suit using, in part, New York's voting rights laws. (Disclosure: the author represented other plaintiffs in this case.) The moratorium would block the ability to use state laws against new forms of online voter suppression.
  12. Harassment, stalking, and nonconsensual intimate imagery. The moratorium would block state laws providing remedies for people who suffer tech-enabled domestic abuse, harassment, stalking, and publication of nonconsensual intimate imagery.
  13. Negligence and tort law generally. Finally, there is a huge spectrum of state common law that forms the legal background to all forms of commerce. This includes product liability actions, medical malpractice, other forms of negligence, theft, the right of publicity (which limits the ability of AI companies to use your image without your consent), and other privacy torts. The moratorium calls into question the ability to use any of these and other common-law claims against tech-enabled harms. Every single case would be a novel legal and factual scenario requiring a court to determine if the moratorium applies, which will increase litigation costs for all sides.

The state law moratorium in the Big Beautiful Bill will decimate vast and unexpected areas of state governance across many different sectors, not just tech or AI. It should not pass.
