When AI invents case law: Why the hallucination defense is crumbling in accessibility court
In June 2023, a federal judge in New York fined two lawyers $5,000 for submitting legal citations that never existed. ChatGPT had invented them. The judge in Mata v. Avianca wrote that the citations "appear to be fabricated."
That moment, lawyers punished for AI hallucinations, felt like a line in the sand. But here's what happened next: defendants started treating opposing counsel's hallucinations as a defense strategy.
They were wrong.
Courts are sanctioning lawyers and litigants for AI-generated fabrications. But those sanctions do nothing to invalidate the underlying accessibility claim. A plaintiff's lawyer can get fined for hallucinated case law and still win on the merits. Your website can be inaccessible and the plaintiffs can still be right, regardless of whether their legal team used AI competently.
This is the real lesson from the hallucination crisis: it exposes incompetence in the legal process, not a loophole in the law itself.
The hallucination problem in accessibility litigation
Hallucination is a technical term for when large language models generate text that looks authoritative but is entirely invented. A judge's name that doesn't exist. A case citation with the right formatting but no ruling behind it. A statute that sounds plausible but never passed.
In accessibility litigation, where many plaintiffs are filing pro se (without lawyers)—40% of ADA digital-specific cases involve self-represented litigants—this problem compounds quickly. A plaintiff uses ChatGPT to draft a demand letter. The letter cites three cases that don't exist. The defendant's counsel flags the fabrications. The case gets messier from day one.
The pattern in these rulings is consistent: judges sanction the lawyers and litigants who submit hallucinated citations. But the sanctions are narrow. They target conduct, not the claim itself.
Why sanctions don't protect defendants
Here's the critical distinction courts are making: a lawyer's incompetence is not evidence that the website is accessible.
A plaintiff's lawyer submits a brief full of fabricated case law. The judge sanctions the lawyer. The judge then evaluates the underlying accessibility claim on its own merits. Does the website meet WCAG 2.1 AA? Can a person who is blind navigate the checkout flow? Are form labels properly associated with inputs?
The hallucinations are irrelevant to those questions.
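The merits questions are mechanical, not rhetorical. "Properly associated," for instance, reduces to a checkable property of the markup. Here's a minimal browser-side sketch; the selector list is illustrative, and a thorough check would also verify that any aria-labelledby reference actually resolves:

```ts
// A minimal sketch of what "properly associated" reduces to in the DOM.
// The selector list is illustrative, not exhaustive.
function unlabeledControls(root: Document = document): HTMLElement[] {
  const controls = root.querySelectorAll<
    HTMLInputElement | HTMLSelectElement | HTMLTextAreaElement
  >('input:not([type="hidden"]), select, textarea');
  return Array.from(controls).filter((el) => {
    const hasLabel =
      (el.labels !== null && el.labels.length > 0) || // <label for> or a wrapping <label>
      el.hasAttribute("aria-label") ||
      el.hasAttribute("aria-labelledby");
    return !hasLabel; // no accessible name: flag it
  });
}

// Anything returned here is a control a screen reader announces without a name.
unlabeledControls().forEach((el) => console.warn("Unlabeled control:", el));
```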
Courts distinguish between process integrity and substantive law. A judge can fine a lawyer for inventing citations and hold the defendant liable for an inaccessible website in the same ruling.
Why? Because the two things are separate. The citation fraud is about courtroom conduct. The accessibility violation is about the code, the design, the architecture of your digital product.
The real defense: fixing accessibility issues
The only defense that matters is the one that changes the outcome: actually fixing your website.
Courts are now familiar with the hallucination problem. They're skeptical of high-volume, template-based filings. They're adopting stronger standards for citation verification. But they're doing this in addition to evaluating the merits of each case.
If your website has color contrast problems, missing alt text, broken keyboard navigation, or inaccessible forms, a hallucinated case citation in a plaintiff's brief will not save you.
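None of those failures is a matter of opinion. Color contrast, for instance, is a formula defined in the WCAG spec: the ratio of the relative luminances of the foreground and background colors. A self-contained sketch, with sample colors that are illustrative only:

```ts
// WCAG 2.x contrast ratio: (L1 + 0.05) / (L2 + 0.05), where L is relative
// luminance computed from sRGB channels per the spec.
function linearize(c: number): number {
  const s = c / 255;
  return s <= 0.03928 ? s / 12.92 : ((s + 0.055) / 1.055) ** 2.4;
}

function luminance([r, g, b]: [number, number, number]): number {
  return 0.2126 * linearize(r) + 0.7152 * linearize(g) + 0.0722 * linearize(b);
}

function contrastRatio(
  fg: [number, number, number],
  bg: [number, number, number]
): number {
  const [hi, lo] = [luminance(fg), luminance(bg)].sort((a, b) => b - a);
  return (hi + 0.05) / (lo + 0.05);
}

// #767676 on white is roughly 4.54:1 and passes AA for regular text;
// #999999 on white is roughly 2.85:1 and fails.
console.log(contrastRatio([118, 118, 118], [255, 255, 255]).toFixed(2)); // "4.54"
console.log(contrastRatio([153, 153, 153], [255, 255, 255]).toFixed(2)); // "2.85"
```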
What will save you is a remediation plan that works:
Week one: Baseline audit. Run an automated scan with inspekter or similar tools. Document actual failures against WCAG 2.1 AA. You're looking for images without alt text, form fields without labels, color contrast below 4.5:1 for regular text, interactive elements that aren't keyboard accessible, and missing page structure (headings, landmarks).
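As one sketch of what that baseline can look like in code, here's a scan using the open-source axe-core engine through Playwright, one of the "similar tools" above. It assumes playwright and @axe-core/playwright are installed; the URL is a placeholder:

```ts
// Week-one baseline scan sketch: Playwright + axe-core, restricted to
// WCAG 2.0/2.1 A and AA rules.
import { chromium } from "playwright";
import AxeBuilder from "@axe-core/playwright";

async function baselineAudit(url: string) {
  const browser = await chromium.launch();
  const page = await browser.newPage();
  await page.goto(url);

  // Limit results to WCAG 2.0/2.1 A and AA success criteria.
  const results = await new AxeBuilder({ page })
    .withTags(["wcag2a", "wcag2aa", "wcag21a", "wcag21aa"])
    .analyze();

  // Each violation names the rule (image-alt, label, color-contrast, ...)
  // and every affected node. Keep this output as part of your paper trail.
  for (const v of results.violations) {
    console.log(`${v.id} (${v.impact ?? "unknown"}): ${v.nodes.length} instance(s)`);
  }

  await browser.close();
  return results.violations;
}

baselineAudit("https://example.com/checkout").catch(console.error);
```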
Week two: Prioritize scope. You cannot fix everything at once. Prioritize by user impact. Start with checkout flows, authentication, and critical paths. These are where most litigation concentrates and where barriers matter most.
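Triage can be as simple as a sort. The sketch below assumes a per-page list of findings collected from the week-one scan; the severity ranking and critical-path list are illustrative, not a standard:

```ts
// Week-two triage sketch: order findings so critical-path pages and
// high-impact violations surface first. The shape of `Finding`, the path
// list, and the severity ranking are all assumptions for illustration.
interface Finding {
  ruleId: string; // e.g. "color-contrast", "label"
  impact: "critical" | "serious" | "moderate" | "minor";
  pageUrl: string;
}

const impactRank = { critical: 0, serious: 1, moderate: 2, minor: 3 } as const;
const criticalPaths = ["/checkout", "/login", "/cart"]; // where litigation concentrates

function prioritize(findings: Finding[]): Finding[] {
  const onCriticalPath = (f: Finding) =>
    criticalPaths.some((p) => f.pageUrl.includes(p)) ? 0 : 1;
  return [...findings].sort(
    (a, b) =>
      onCriticalPath(a) - onCriticalPath(b) || // critical paths first
      impactRank[a.impact] - impactRank[b.impact] // then by severity
  );
}
```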
Week three: Establish remediation timeline. Courts care about intent and speed. A documented plan to remediate within 30–60 days carries weight. A vague promise carries none. Make it specific: "We will fix image alt text across product pages by [date]. We will audit and fix form labels by [date]."
Week four: Document everything. Keep records of audit results, remediation work completed, testing with assistive technology, internal training, and your ongoing monitoring process. This documentation becomes evidence of good faith. Courts care about whether you're taking accessibility seriously, not whether you're perfect.
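One lightweight way to keep that record consistent is a typed log. The field names and sample entry below are illustrative, not a legal standard:

```ts
// Week-four documentation sketch: one entry per audit, fix, assistive-tech
// test, or training session. Field names and the sample entry are illustrative.
interface RemediationRecord {
  date: string; // ISO date of the activity
  activity: "audit" | "fix" | "assistive-tech-test" | "training" | "monitoring";
  wcagCriterion?: string; // e.g. "1.4.3 Contrast (Minimum)"
  pagesAffected: string[];
  evidence: string; // link to scan output, commit, or test notes
}

const remediationLog: RemediationRecord[] = [
  {
    date: "2025-01-15",
    activity: "fix",
    wcagCriterion: "1.1.1 Non-text Content",
    pagesAffected: ["/products", "/checkout"],
    evidence: "scans/2025-01-15-post-fix.json",
  },
];
```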
The hallucination defense doesn't exist
In the long view, the hallucination crisis in accessibility litigation reveals something important: the law itself is not the problem. The problem is that accessibility is hard and many organizations have not invested in fixing it.
Courts are dealing with both hallucinated citations and the legitimate accessibility claims they accompany. They're developing tools to catch fabrications—citation verification, stricter scrutiny of AI-generated briefs, sanctions for misconduct. But those tools are being built on top of existing liability standards.
The WCAG 2.1 AA standard doesn't change because a plaintiff's lawyer hallucinated a case. The business rationale for accessibility doesn't shift because an AI tool invented a citation. The expectation that websites should be usable for people with disabilities doesn't diminish.
What changes is the procedural noise level. And what remains constant is the underlying obligation.
The hallucination defense doesn't exist because defending against an accessibility lawsuit has never been about winning on technicalities. It's always been about proving your website is accessible. Everything else is distraction.
In Part 3, we'll examine the other side of the AI paradox: how AI tools are simultaneously creating accessibility bugs and being marketed as the fix.