By Angela Dunning and Lindsay Harris.[1]  Note: Cleary Gottlieb represents Midjourney in this matter.

On October 30, 2023, U.S. District Judge William Orrick of the Northern District of California issued an Order[2] largely dismissing without prejudice the claims brought by artists Sarah Andersen, Kelly McKernan and Karla Ortiz in a proposed class action lawsuit against artificial intelligence (“AI”) companies Stability AI, Inc., Stability AI Ltd. (together, “Stability AI”), DeviantArt, Inc. (“DeviantArt”) and Midjourney, Inc. (“Midjourney”).  Andersen is the first of many cases brought by high-profile artists, programmers and authors (including John Grisham, Sarah Silverman and Michael Chabon) seeking to challenge the legality of using copyrighted material for training AI models.

On October 30, 2023, the G7 Leaders published a Statement on the Hiroshima Artificial Intelligence (“AI”) Process (the “Statement”).[1] This follows the G7 Summit in May, where the leaders agreed on the need to address the risks arising from rapidly evolving AI technologies. The Statement was accompanied by the Hiroshima Process International Code of Conduct for Organizations Developing Advanced AI Systems (the “Code of Conduct”)[2] and the Hiroshima Process International Guiding Principles for Advanced AI Systems (the “Guiding Principles”).[3]

The U.S. District Court for the District of Columbia recently affirmed a decision by the U.S. Copyright Office (“USCO”) in which the USCO denied an application to register a work authored entirely by an artificial intelligence program.  The case, Thaler v. Perlmutter, challenging U.S. copyright law’s human authorship requirement, is the first of its kind in the United States, but it will not be the last, as questions regarding the originality and protectability of works created by generative AI (“GenAI”) continue to arise.  The court in Thaler focused on the fact that the work at issue had no human authorship, setting a clear rule for one end of the spectrum.  As the court recognized, the more difficult questions yet to be addressed include how much human input is required to qualify the user as the creator of a work eligible for copyright protection.

On June 6, 2023, New York Senate Bill S5640 / Assembly Bill A5295 (“S5640”) won near-unanimous final passage in the New York Assembly with a 147-1 vote, after being passed unanimously by the Senate the previous week.  If signed into law by Governor Hochul, the legislation would, effective immediately, add to New York labor law a new section 203-f that renders unenforceable provisions in employee agreements that require employees to assign certain inventions developed using the employee’s own property and time. 

GitHub, acquired by Microsoft in 2018, is an online repository used by software developers for storing and sharing software projects.  In collaboration with OpenAI, GitHub released an artificial intelligence-based offering in 2021 called Copilot, which is powered by OpenAI’s generative AI model, Codex.  Together, these tools assist software developers by taking natural language prompts describing a desired functionality and suggesting blocks of code to achieve that functionality.  OpenAI states on its website that Codex was trained on “billions of lines of source code from publicly available sources, including code in public GitHub repositories.”

As we continue to see the rapid development of digital technologies, such as artificial intelligence (“AI”) tools, legislators around the world are contemplating how best to regulate these technologies.  In the UK, the Government has adopted a “pro-innovation” agenda, with the aim of making the UK “an attractive destination for R&D projects, manufacturing and investment, and ensuring [the UK] can realise the economic and social benefits of new technologies as quickly as possible.”[1] 

On 16 March 2023, the US Copyright Office (“USCO”) published guidance on the registration of works containing AI-generated content. The USCO’s policy statement was released against the backdrop of the proliferation of generative AI tools which are able to create content based on user prompts. The USCO ultimately concluded that the “authorship” requirement of US copyright law refers to “human authorship” (in line with prior case law) and appears to reject the extension of copyright to works generated with the aid of AI technology outside of the user’s control.

On 15 March 2023, the UK Information Commissioner’s Office (“ICO”) published an update to its Guidance on AI and Data Protection (the “Guidance”), following requests from UK industry to clarify requirements for fairness in artificial intelligence (“AI”).  The Guidance contains advice on the interpretation of relevant data protection law as it applies to AI, and recommendations on good practice for organisational and technical measures to mitigate risks caused by AI.

In light of the increasing prevalence of automated “self-driving” vehicles, the Law Commission of England and Wales and the Scottish Law Commission published a joint report on automated vehicles at the beginning of last year, which is currently before the UK Parliament for consideration. The report recommends the introduction of a new Automated Vehicles Act specifically to regulate automated vehicles and recalibrate legal accountability for their use.

In September 2022, the European Commission published its proposal for a new product liability directive (“PLD”), and a proposal for a directive on adapting non-contractual civil liability rules to artificial intelligence (“AILD”).