
Court Signals Strong Warning to AI Companies Relying on Pirated Content ⚖


A recent federal court ruling is making waves in the world of copyright law and AI, and it could have serious implications for companies that train AI using creative works without permission.


In a first-of-its-kind decision, a California district court ruled in favor of AI company Anthropic, finding that training a large language model (LLM) on lawfully obtained copyrighted books could qualify as fair use. But here’s the part the headlines are missing: the ruling strongly condemns AI companies that rely on pirated material instead of legally licensing content.


1. Why This Matters


Our team is currently leading class action lawsuits against AI companies like Suno and Udio, platforms that trained on music without permission and are now releasing AI-generated songs that compete directly with human-made music.


This ruling validates what we have been arguing all along:


  • Fair use is not a free pass. It is a legal defense, not a right.

  • Training AI with lawfully purchased or licensed content may be defensible.

  • But downloading content from pirate sites when legal options exist? That is not fair use.


The court made clear that bad faith actions, such as pirating content, undermine the fair use defense, even if the AI’s output is deemed transformative.


2. What This Means for Music AI Lawsuits


Anthropic’s case involved using books to train a text-based model, not to replicate the books themselves but to generate new language-based responses. The court called this “quintessentially transformative.”


In contrast, the AI companies we are suing trained on music to make more music. That is not transformative; it is replicative, and it directly threatens the music licensing ecosystem our clients rely on.


Additionally, we believe these companies retained pirated content, just as Anthropic did. The court said retaining pirated works, especially when lawful access was available, was “irredeemably infringing.”


3. The Bottom Line


This ruling strengthens our position:


  • Training AI can be fair use, but only under the right circumstances.

  • Piracy is not one of them.

  • Creating direct competitors to human-made creative work is not transformative.


The case against Anthropic is ongoing, and the trial will now focus on damages. However, the message is clear: AI companies must respect copyright law just like everyone else.


Want to stay informed or join the fight? Visit indieilawsuits.com to sign up or become a plaintiff.




 
 
 
