AI and Copyright Law: A potential rise in infringement claims?


AI technology has advanced rapidly in the last few years, and the question is now being asked: how long will it be before the current provisions in legislation are obsolete? Can the UK's current legislative framework resolve disputes over the ownership of AI-generated material, or does the question of ownership in copyright infringement claims demand a different approach?

In this article we consider the current position on the ownership of copyright in computer-generated works (CGW), including how the test for originality is met. We also look at the revolutionary programme ChatGPT, as well as the risks involved in giving AI the same rights as a legal person. Finally, we take a brief look at a recent UK Government review of whether the current legislative framework is fit for purpose and what it might mean for resolving disputes over the ownership of content created by AI.

Copyright position - human generated works

In England & Wales, copyright protection applies to original literary, dramatic, musical and artistic works and lasts for the author’s lifetime plus a further 70 years. The first owner of any of these works is the author (or their employer), whether artist, photographer or writer. The relevant UK legislation is the Copyright, Designs and Patents Act 1988 (the CDPA).

Copyright position - computer generated works

Works generated by a computer (or produced with its assistance), if original, attract copyright protection for a period of 50 years from the end of the calendar year in which the work is made. Under the CDPA, the author (and therefore first owner) is the person by whom the arrangements necessary for the creation of the work are undertaken.

Test for originality

An increasing number of questions have been raised in relation to the ownership of works created by AI. Namely, can the subjective test of "originality", which is rooted in human characteristics such as skill and judgement, be moulded into something applicable to an ever-evolving technological landscape?

The traditional two limb test for originality is:

  1. the work must have originated from the author and cannot have been copied or replicated; and
  2. the work must have required a certain level of skill, labour and judgement.

The bar for the test is relatively low: a food menu, a football fixture list or a timetable is capable of being protected. For this reason, AI-generated content could well meet the threshold set out.

ChatGPT AI Case Study

The media is currently flooded with coverage of ChatGPT, the new 'chatbot' system developed by OpenAI. The programme is designed for use across various industries, with suggested applications including the generation of marketing materials, articles, lyrics and report summaries.

In essence, ChatGPT is a chatbot built on a language model that analyses and draws on copious amounts of data from sources such as books, articles and even text messages. Its algorithms then produce responses that take account of all of that input.

When read carefully, the description above sounds very much like the human brain, which is capable of learning, unscrambling data, understanding input and formulating a specific response to a question or scenario.

The question now is: how does protection differ between a human and a non-human, when the technology has blurred the lines between content produced by either? Will the extent of its use be a key component in determining how copyright protection should apply? For example, where the technology is used merely as an administrative aid, common sense may direct us to apply copyright protection only to the human author, who can clearly be identified through their creative ability, judgement and skill.

However, when the works are the product of a purely autonomous function, can we simply apply the same principles as for a computer-generated work? A further question is whether such works can even be deemed "original" at all, given that the autonomous technology has regurgitated existing data and ideas into a new form.

Risks of giving AI the rights of a legal person

Given that current legislation envisages a human enabling AI to generate or create works, recent AI such as ChatGPT has called non-human ownership into question, including recognising the autonomous programme itself as the owner and giving it a status similar to that of a legal person.

However, recognising a non-human as a legal person carries obvious risks around accountability and remedies for breaches of various IP rights, including copyright. The question of responsibility, and of who can be pursued over ownership, becomes far more precarious. How do you sue a chatbot for copyright infringement?

Some have argued that a solution could be to construe the AI as an employee, with a designated human responsible as its employer. This would certainly give a clearer indication of accountability and of where to turn in the event of a breach.

Recent Government review and changes to the law on AI and copyright

In June 2022, the UK Government published a brief response to a consultation on IP and AI, noting that its objective is to support the AI sector and, if necessary, update the relevant law.

The conclusion of the consultation is that the Government does not currently intend to change the law surrounding copyright protection for computer-generated works as AI is not yet advanced enough to warrant this. However, the consultation states that, "should copyright infringement be identified as an obstacle to AI development, one option would be to review, and potentially broaden, the exceptions which allow copies to be made within an AI system – for example for training purposes". 

Many interested parties believed the current legislation may impede the development and usefulness of AI itself by slowing further innovation and research. The Government's review acknowledged this stance while maintaining that additional measures may be needed to allow the exploitation of works created by AI. Conversely, its executive summary also highlighted the need to continue rewarding people for their inventiveness and creativity. Clearly, the attribution of credit may be diminished if the current legislation is applied too broadly. The full range of responses is set out in the summary, which can be found on the Government website.

The matter is very much still under review but, for the time being, it seems that any potential dispute over ownership or infringement of a computer-generated work will need to be examined on a case-by-case basis in accordance with current legislation. One thing that is very clear is that anyone drafting or entering into contracts concerning the product of computer-generated works should carefully review the terms relating to ownership of IP. A large number of infringement disputes stem from the key question of ownership and exploitation, which is also the key component in determining how tailored the legislation may need to be.
