This week, a federal court in Tennessee transferred to California a lawsuit brought by several large music publishers against a California-based AI company, Anthropic PBC. Plaintiffs in Concord Music Group et al. v. Anthropic PBC[1] allege that Anthropic infringed the music publishers’ copyrights by improperly using copyrighted song lyrics to train Claude, its generative AI model.  The music publishers asserted not only direct copyright infringement based on this training, but also contributory and vicarious infringement based on user-prompted outputs and violation of Section 1202(b) of the Digital Millennium Copyright Act for allegedly removing plaintiffs’ copyright management information from copies of the lyrics.  On November 16, 2023, the music publishers also filed a motion for a preliminary injunction that would require Anthropic to implement effective “guardrails” in its Claude AI models to prevent outputs that infringe plaintiffs’ copyrighted lyrics and preclude Anthropic from creating or using unauthorized copies of those lyrics to train future AI models. 

In response, Anthropic accused the plaintiffs of forum shopping, filing a motion to dismiss for lack of personal jurisdiction and improper venue in the Middle District of Tennessee or, in the alternative, to transfer venue.  Anthropic argued that the case belongs in the United States District Court for the Northern District of California given that (1) all but one of the parties reside in California, (2) the AI at issue was developed and sold in California, and (3) Anthropic’s terms of service require any lawsuits arising out of its model’s outputs to be litigated in California.  Anthropic further argued that it does not have sufficient ties with Tennessee to establish jurisdiction.  Anthropic also asserted that, even apart from the lack of jurisdiction and venue, the preliminary injunction should be denied because Anthropic had already made changes to prevent further allegedly infringing outputs, because its copying of works for training was fair use, precluding a finding that the publishers were likely to succeed on the merits, and because these issues should not be decided without a more developed factual record.

On June 24, 2024, nearly five months after the motion was filed, the court issued an order concluding it lacked personal jurisdiction over Anthropic.  The court reasoned that three remote employees who live in Tennessee and a handful of contracts with Tennessee-based companies were insufficient to show that Anthropic deliberately aimed its conduct at the forum state.  Further, Anthropic’s website, which is generally accessible nationwide, did not suffice to establish jurisdiction absent evidence that Anthropic took steps to specifically target Tennessee residents, and such evidence was lacking here. 

Rather than dismiss, the court transferred the case to Anthropic’s preferred venue, the Northern District of California.  In doing so, the court acknowledged that plaintiffs had requested “expedited disposition” of their preliminary injunction motion, and noted that its decision would create additional, unanticipated delay given that the motion must now be reviewed and decided by a California federal court.  This delay, the court found, was a problem of plaintiffs’ own making.  They could have “played it safe by filing in a forum that clearly has personal jurisdiction.”  Instead, they made a “strategic decision to sue a California-based company in the Middle District of Tennessee, and in doing so ran the risk of encountering a jurisdictional hurdle too high to climb.” 

Whether the publishers will continue to pursue a preliminary injunction in California given the passage of time, and how that motion may ultimately be resolved on the facts of this case, remain to be seen.  What is clear is that the question of proper application of fair use principles to the training of a generative AI model will not be resolved in Tennessee.


[1] No. 3:23-cv-1092-WDC, 2024 WL 3101098 (M.D. Tenn. Jun. 24, 2024).