Can You Sue Someone for Deepfake Fraud?
Artificial intelligence has created remarkable new tools, but it has also introduced risks that few anticipated. Among the most concerning developments is the rise of deepfakes—AI-generated videos, audio clips, or images that look and sound authentic but are entirely fabricated. Once thought of as science fiction, deepfakes are now used in scams, harassment, political misinformation, and nonconsensual pornography.
Depending on the facts of your case, you may be able to pursue civil claims for fraud, defamation, misappropriation of likeness, and other torts. In some cases, law enforcement may also pursue criminal charges. Contact our AI Privacy Lawyers to investigate your claim and to discuss taking legal action.
What Is a Deepfake and How Is It Used to Cause Harm?
A deepfake is a video, audio recording, or image created using artificial intelligence to realistically simulate a real person’s appearance, voice, or likeness without their consent. The technology has become widely accessible, and its misuse has grown rapidly across multiple categories of harm:
- Financial fraud. Criminals use AI-generated voice or video deepfakes to impersonate executives, attorneys, or financial advisors to trick employees or clients into wiring money or sharing sensitive information. In January 2024, an engineering firm in Hong Kong lost $25 million when an employee was deceived by a deepfake video call that appeared to feature the company’s CFO and several colleagues.
- Defamation and reputation attacks. Fabricated videos depicting real people engaged in illegal or morally damaging conduct are used to destroy professional and personal reputations.
- Nonconsensual intimate images. AI tools are being used to generate explicit sexual images of real individuals without their consent, targeting women and girls at a disproportionate rate.
- Election interference. Deepfake audio and video of political candidates and officials are being used to spread disinformation during election cycles.
- Workplace harassment. Colleagues, supervisors, or outside parties are creating harassing or sexualized deepfake content targeting coworkers, creating hostile work environment liability for employers who fail to act.
Because deepfakes look authentic, they can cause significant financial, reputational, and emotional harm. Victims often have legal recourse to address that harm. Contact our lawyers to discuss your legal options. We offer free consultations for plaintiffs nationwide.
What Damages Can Deepfake Victims Recover?
Depending on the legal theory and the jurisdiction, victims of deepfake harm may recover:
- Compensatory damages for direct financial losses, including wire fraud, business losses, and remediation costs
- Reputational and professional harm damages
- Emotional distress damages
- Punitive damages where conduct was malicious or reckless
- Statutory damages under applicable state or federal statutes, which do not require proof of a specific dollar loss
- Court-ordered injunctions requiring removal of content and prohibiting further distribution
- Attorneys’ fees in cases under statutes that allow fee shifting

Can You Sue Someone for Making a Deepfake of You?
In 2025, federal law began catching up with deepfake technology. The TAKE IT DOWN Act, signed into law in May 2025, criminalized the knowing publication of nonconsensual intimate deepfakes and required platforms to remove that content within 48 hours of notice. In January 2026, the U.S. Senate unanimously passed the DEFIANCE Act, which would give victims a federal civil right to sue creators and distributors of nonconsensual explicit deepfakes for statutory damages of $150,000, rising to $250,000 in aggravated cases. That bill is now before the House.
Beyond nonconsensual intimate images, victims of deepfake fraud, business impersonation, defamation, and financial scams already have legal claims available under existing state and federal law in all fifty states.
If someone created or distributed a deepfake of you that caused financial harm, reputational damage, or emotional distress, contact The Lyon Firm today for a free consultation. We represent deepfake victims nationwide.
Why Hire The Lyon Firm for Deepfake Fraud Cases?
Deepfake litigation is complex. Unlike traditional fraud or defamation cases, these claims involve cutting-edge technology, forensic evidence, and rapidly evolving laws. At The Lyon Firm we have extensive experience handling complex fraud, cybercrime, defamation, and privacy lawsuits. Our team can guide you through the following tasks:
- Investigating and identifying anonymous perpetrators using advanced digital discovery tools.
- Working with forensic and cyber experts to prove the authenticity and impact of deepfakes.
- Pursuing aggressive civil litigation to recover damages and secure court-ordered takedowns.
- Providing compassionate support to victims suffering reputational and emotional harm.
We understand that deepfake fraud is not only a legal battle but also a personal and emotional crisis. Our firm is dedicated to helping victims restore their reputations, recover financial losses, and hold wrongdoers accountable.
Frequently Asked Questions
Is making a deepfake of someone illegal? It depends on the content and the state. Nonconsensual intimate deepfakes are now a federal crime under the TAKE IT DOWN Act signed in May 2025. Many states have additional criminal and civil statutes. Beyond specific deepfake laws, creators can also face civil liability for fraud, defamation, and misappropriation of likeness depending on how the deepfake was used.
Can I sue someone for a deepfake that cost my business money? Yes. If a deepfake was used to impersonate you, your executives, or your business in a way that caused financial harm, you may have claims under fraud, wire fraud, and computer fraud statutes, as well as commercial defamation and business interference torts.
What does the DEFIANCE Act do? The DEFIANCE Act, which passed the U.S. Senate unanimously in January 2026 and is currently pending in the House, would give victims of nonconsensual sexually explicit deepfakes a federal civil right to sue for statutory damages of $150,000, rising to $250,000 in aggravated cases, plus attorneys’ fees.
What if I don’t know who made the deepfake? We can file suit against anonymous John Doe defendants and use court-ordered subpoenas to compel platforms to disclose identifying information about account holders. Digital forensic experts can also assist in tracing content origins.
Can I force a platform to remove a deepfake? The TAKE IT DOWN Act requires covered platforms to remove nonconsensual intimate deepfakes within 48 hours of notice. Courts can also issue injunctions ordering takedowns. We advise acting quickly because content spreads rapidly once published.