
Undisclosed AI Chatbot Lawsuits

When you chat with a company online about a purchase, a medical question, a loan, or a customer service issue, you have a reasonable expectation of knowing whether you are speaking with a person or a machine. A growing body of state law now backs that expectation with legal force — and in some states, with the right to sue.

Several states have passed chatbot disclosure laws requiring companies to tell consumers upfront when they are interacting with automated AI. Here is where the law currently stands and what it means for you. Contact a consumer privacy lawyer to discuss your legal options. 

States Requiring Chatbot Disclosure

California’s B.O.T. (“Bolstering Online Transparency”) Act prohibits companies from using automated bots to incentivize a sale in a commercial transaction, or to influence a vote in an election, without clearly disclosing the bot’s identity. Violators can face claims under California’s consumer protection statutes.

California went further in 2025 by enacting SB 243, the companion chatbot law, which took effect January 1, 2026. This law requires operators of AI systems designed to engage users in ongoing, human-like social interaction to disclose clearly and conspicuously that the user is not talking to a human. Critically for consumers, SB 243 includes a private right of action — meaning you can sue the company directly for at least $1,000 per violation plus attorney fees.

New Jersey prohibits bots from interacting with consumers in online commercial transactions and real estate advertising without disclosure.

Utah requires anyone using generative AI in a high-risk consumer interaction — covering healthcare, finance, and legal services — to disclose that fact at the start of the conversation.

Colorado’s AI Act, set to take effect in June 2026, mandates disclosure whenever AI is used in consequential decisions involving employment, education, housing, finance, healthcare, or legal services.

Washington enacted a companion chatbot law in March 2026 requiring recurring disclosure notifications throughout an interaction, with a disclosure at least every hour when minors are involved. Like California, Washington’s law includes a private right of action.

Maine likewise requires disclosure at the start of any bot interaction involving the sale or advertising of merchandise.

At the federal level, the FTC treats undisclosed chatbot use as a potentially deceptive trade practice, meaning companies can face enforcement even where no state-specific chatbot law applies.

Wiretap Laws: A Growing Legal Risk

Beyond disclosure laws, a separate wave of class action lawsuits is targeting companies whose website chatbots record and share consumer conversations with third-party vendors without consent. In states including California, Massachusetts, Illinois, Florida, and Pennsylvania, this conduct may violate state wiretapping statutes that require all parties to consent before a communication is recorded.

These chatbot wiretap class actions have grown rapidly — from just two cases in 2021 to more than 58 active federal matters by early 2026. If you used a company’s chat function and were never informed the conversation was being recorded or shared with a third-party AI vendor, you may have a claim under your state’s wiretapping law regardless of whether a specific chatbot disclosure statute applies.


What Companies Have Been Sued?

In January 2026, Kentucky’s Attorney General filed the first state lawsuit in the country against an AI chatbot company, Character.AI, for deceptive practices including failure to disclose risks to consumers, particularly children. The suit was brought under Kentucky’s consumer protection act, showing that existing consumer fraud statutes can reach chatbot misconduct even without a chatbot-specific law on the books.

This matters for consumers nationwide. Even in states without specific chatbot disclosure laws, companies that use AI to deceive or mislead consumers in commercial transactions may face liability under general unfair and deceptive trade practices statutes.

What Can You Do? Your Legal Options

If you believe a company used an undisclosed chatbot during a transaction, a healthcare interaction, a financial conversation, or any other exchange where you reasonably expected to be speaking with a person, you may have legal options including:

  • A direct lawsuit under California SB 243 or Washington’s companion chatbot law if you are in those states
  • A wiretap claim if the chatbot recorded or shared your conversation without your consent
  • A consumer protection claim under your state’s unfair and deceptive trade practices statute
  • A class action if the same undisclosed chatbot affected a large number of consumers in the same way

The Lyon Firm represents consumers in class action and consumer protection litigation nationwide. If you interacted with an undisclosed AI chatbot during a commercial transaction and believe your rights were violated, contact us at 513.381.2333 for a free and confidential consultation. We serve clients nationwide from offices in Cincinnati and Cleveland, Ohio; St. Louis, Missouri; and Irvine, California, and we take consumer protection cases on a contingency basis — no fees unless we recover for you.

CONTACT THE LYON FIRM TODAY

Please complete the form below for a FREE consultation.
