Artificial intelligence ("AI") is becoming an ever-increasing presence in the legal sphere, from disclosure review to document drafting to legal research. In this article we consider whether proposed amendments to the Arbitration Act 1996 (the "Act"), which are intended to modernise arbitration law in England, Wales, and Northern Ireland, sufficiently take into account the increasing use and reliance on AI.
Update on the bill to amend the Arbitration Act
Reforms to the Act have been proposed in the Bill to Amend the Arbitration Act 1996 (the "Bill"). The Bill was introduced in the House of Lords and remains under consideration at Committee Stage. The proposals largely mirror the recommendations set out by the Law Commission in its September 2023 report (commented on in detail in a previous article from Howard Kennedy's Head of International Arbitration, Duncan Bagshaw).
The main reforms proposed by the Bill are as follows:
- Summary disposal: the Bill aims to ensure an express power for arbitrators to make an award on a summary basis, with the test being whether the relevant party has a "real prospect of success".
- Court powers in support of arbitral proceedings: section 44 of the Act as it currently stands does not make it clear that injunctive and other orders may be made against a party that is not a party to the arbitration. The Bill seeks to amend this to confirm that orders may be made against third parties.
- Emergency arbitrators: the Bill seeks to support the emergency arbitrator process by formally recognising it in the Act for the first time and proposing methods to assist with the enforcement of orders made by emergency arbitrators. For example, an emergency arbitrator whose order has been ignored can issue a peremptory order which can be enforced by the court.
- Challenging jurisdiction: amendments are proposed to section 67 of the Act on challenges to a tribunal's award on jurisdiction. The reform provides for court rules to be made which could limit the ability of parties to rely on new grounds or evidence before the court, and restrict judges' ability to re-hear evidence. The Bill also contains an amendment clarifying the tribunal's power to award costs in situations where there has been a finding (by the tribunal or the court) that the tribunal lacks substantive jurisdiction.
- Governing law: the Bill provides that the default position should be that the arbitration agreement is governed by the law of the seat, unless expressly agreed otherwise by the parties. This reverses the current position under the decision of the Supreme Court in Enka v Chubb, which indicates that the arbitration agreement will generally be governed by the substantive governing law of the contract.
A Special Public Bill Committee has received written and oral evidence and views on the proposed amendments and the Lords will now consider this feedback before the Bill's next reading.
What about AI?
AI is increasingly being used in arbitration in numerous ways, such as:
- Document review: in the 2016 case of Pyrrho Investments Ltd v MWB Property Ltd, the English courts approved the use of 'predictive coding' and 'computer-assisted review' for document review in litigation for the first time. The flexibility of arbitration has always allowed such innovations to be used as part of the process, where appropriate. The use of such tools is now widespread in both arbitration and litigation, as AI document review systems allow for a focus on the most relevant documents first, increasing the speed of reviewing data on a large scale. This reduces the time (and cost) spent by lawyers on document review.
- Translation software: foreign language documents are commonplace in international arbitration and the use of AI-backed translation software to review and translate them (as opposed to more labour- and cost-intensive human translation) is becoming increasingly prevalent. Section 34 of the Act provides that the translation of documents is a procedural matter and that the tribunal can decide the manner in which any translation is carried out.
- Legal research: legal research sites are increasingly using AI and machine learning to assist lawyers. For example, last year LexisNexis launched its generative AI legal platform 'Lexis+ AI' which it claims can locate relevant case law and citations and identify key passages in documents more quickly for users.
- Drafting documents: a range of platforms now offer generative AI document drafting, which can create a first draft of a legal document or email and allow users to adjust the language and tone using prompts. In time, such tools could be developed to tackle more complex tasks such as generating cross-examination scripts.
Internationally, there have been numerous recent documented issues with the use of AI in legal proceedings. For example, in February 2023 we reported on the news that a judge in Colombia noted that he had used ChatGPT to pose legal questions about a case and had incorporated its responses in his decision. Then in June 2023, two New York lawyers were fined $5,000 for relying on fictitious case law they had found through ChatGPT. Similarly, in November 2023, a junior lawyer in Colorado was fined by the court and fired from his role after filing an AI-generated brief that contained incorrect information and fake citations.
There is therefore a discussion to be had as to whether amendments to the Act should consider and legislate for the increasing use of AI within arbitration. As it stands, the Bill contains no provisions that reflect or address the increased use of AI in arbitration or any of the issues raised above.
This may be because the inherent flexibility of English-seated arbitration proceedings (enshrined in section 34 of the Act, which gives the tribunal the power 'to decide all procedural and evidential matters') offers scope for the introduction of even the most cutting-edge technology, if the tribunal considers it allows for a fair procedure. It seems likely, however, that further guidance or regulation will be required in due course. For example, in the US state of Michigan, it has been proposed that attorneys must review and verify AI-generated content included in filings, and disclose when AI has been used to compose or draft a filing. Similarly, several Canadian courts have issued general practice directions requiring disclosure of the use of AI, and of the manner of its use, in any drafting or legal research.
It may well be that calls arise for international arbitration to introduce similar provisions. However, there is scope to do this in the procedural rules which govern arbitration (whether institutional, or ad hoc in the tribunal's procedural orders) rather than in national legislation.
Arbitrators themselves will also come under scrutiny for their use of AI tools to analyse and consider evidence and documents and to prepare arbitral awards. As has been noticeable in relation to the use of secretaries to assist the tribunal with administrative tasks (see our colleague Taner Dedezade's article here), there may well be a gradual widening of the acceptable scope of the use of AI by arbitrators.
The difficulty in legislating to create certainty and reduce the risks around the use of AI is not limited to the world of arbitration. The speed at which AI capabilities and use cases are evolving makes it difficult to assess the extent of AI's inherent opportunities and risks. This is something that many industries and, indeed, governments are grappling with. While it may therefore be too early for such reforms to the Act at this stage, as technology advances, arbitrators, lawyers and law makers will need to be ready to adapt to the changes that it will bring using all the tools at their disposal.
In its Navigating Risk Horizons report, our Dispute Resolution team have identified key risks threatening business resilience in 2024 as well as practical steps that leaders can take to manage those risks and maximise opportunities to strengthen their business. In particular, within the report we consider the use and development of AI as well as the threat posed to businesses by those looking to exploit opportunities provided by AI to commit fraud.