Can new artificial intelligence such as ChatGPT be relied upon for legal advice?


Last week it was announced that Microsoft plans to invest billions of dollars in the creator of the new chatbot ChatGPT. Artificial intelligence (AI) platforms such as ChatGPT are attracting growing attention, including in the legal industry, where people are asking whether individuals or businesses can rely on this type of technology for legal advice. If you are considering using such technology, there are important issues to bear in mind.

What is ChatGPT?

ChatGPT is a state-of-the-art open source language processing model or 'chatbot'. ChatGPT generates human-like text using a statistical model trained on billions of text samples from the internet. It can answer questions, summarise complex information and even write a song or produce a novel in real-time. When a user enters a command, ChatGPT processes the input and continues to update its replies as the conversation progresses.

Anyone who has tried ChatGPT will know how remarkable it is at providing human-sounding responses in next to no time. In fact, ChatGPT wrote the previous paragraph when asked to describe itself in four sentences!

Could ChatGPT be used to provide legal advice?

AI experts suggest that ChatGPT could revolutionise the provision of legal services, especially the drafting of legal documents such as wills, leases and other standard form agreements. In fact, according to professors at the University of Minnesota Law School, ChatGPT recently passed exams in four of the school's courses.

ChatGPT can analyse a large amount of information in seconds, which could dramatically reduce the time spent researching a legal point. However, it may be less effective when tasked with applying legal principles. For example, when asked about the correct procedure in England & Wales for serving legal documents in specific circumstances, it gave a plausible-sounding but (importantly) partly incorrect answer.

The main issues with relying on answers provided by ChatGPT are:

  1. Although its responses are informed by a multitude of internet data, it does not have access to anything behind paywalls, such as case law on LexisNexis or Westlaw;
  2. It does not explain or cite the source(s) of the information it provides, which is risky in a legal context and makes its output hard to verify unless you already know the answer;
  3. It is only trained on information up to the end of 2021, so recent changes in law, policy or guidance will be missed; and
  4. Legal advice is often highly context-specific, requiring an appreciation of a range of outside factors that ChatGPT is not yet equipped to incorporate.


Although AI technology may, in the not-too-distant future, assist lawyers by drastically reducing the time spent on legal research or precedent documents, there are far too many risks involved in relying on it for legal advice at this stage.


OpenAI itself warns that "ChatGPT sometimes writes plausible-sounding but incorrect or nonsensical answers".