Artificial Intelligence and the Courts
Sir Geoffrey Vos Sets Out the Boundaries of Human and Machine Judgment
At the Legal Geek Conference in London on 16 October 2025, the Master of the Rolls, Sir Geoffrey Vos, delivered a significant speech on artificial intelligence and its role within the legal system. His address reflected on the pace of change in the profession, identified essential principles for lawyers and judges using AI, and called for a national conversation about which decisions should remain human. It highlighted both the promise of technology in the administration of justice and its limits and dangers.
The transformation in legal practice
Sir Geoffrey began by noting the extraordinary shift that has taken place in little more than a year. Many lawyers previously viewed AI tools with suspicion, but the profession has since moved to widespread adoption, deploying generative systems such as ChatGPT 5, Harvey, Copilot, Claude and Gemini across litigation support, document analysis and contract drafting. Yet, as the Master of the Rolls observed, this technological revolution has not caused the disruption or chaos that some had feared. Instead, he described a profession that is learning, often unevenly, to integrate AI responsibly within established professional standards.
He was careful to stress that artificial intelligence is not an autonomous actor but a tool, whose value depends on the knowledge, judgment and integrity of those who use it. It can enhance productivity and insight, but it also magnifies professional risk if used without understanding or oversight. In this respect, the speech carried a clear warning: competence in the use of AI is becoming an ethical and professional duty, not an optional skill.
The three core rules for AI use
Central to Sir Geoffrey’s address were what he termed the “three core rules” for the profession’s use of large language models. First, every lawyer and judge must understand what an AI system is doing before relying on it. Secondly, no one should input private or confidential data into a public model. Thirdly, the output of an AI system must always be checked carefully before it is used for any purpose. These rules, though simple and obvious, encapsulate the principles of competence, confidentiality and accountability that underpin professional regulation. They are as relevant to judicial officers and public bodies as they are to private practitioners.
The implication is clear: AI cannot absolve the user of responsibility. Whether drafting a pleading or summarising a case, the practitioner remains the author in law and ethics. Blind reliance on an algorithm is no substitute for professional judgment, and ignorance of how a model works will not excuse negligence. Future litigation may well concern not only the negligent use of AI but also the negligent failure to use it where reasonable competence requires it.
What should AI be used for?
The most important question posed in the speech was not technical but ethical: what tasks should be entrusted to machines, and which must remain the preserve of human decision-makers? The Master of the Rolls accepted that AI can properly assist in drafting contracts, conducting research and streamlining routine aspects of case management. These are applications that can enhance efficiency without undermining legitimacy. The real dilemma arises in judicial decision making itself. Can a machine determine rights and liabilities between citizens? Should it?
Sir Geoffrey warned that while AI may be capable of generating judgments, the substitution of machine for human decision making raises three fundamental concerns. First, judicial decisions are often final. If an automated process errs, there may be no effective remedy. Secondly, machines lack the qualities that give human justice its moral authority: empathy, intuition and an understanding of human fallibility. Thirdly, decisions generated by fixed models risk freezing the evolution of jurisprudence, locking society into the assumptions of a particular moment in technological time.
Human rights and the legitimacy of machine decisions
Sir Geoffrey situated these questions within the framework of human rights. He asked whether a machine-generated judgment could ever satisfy Article 6 of the European Convention on Human Rights, which guarantees a fair hearing before “an independent and impartial tribunal established by law”. If a decision were reached without a human mind, could it truly be said to come from a tribunal, or even to be a decision at all in the legal sense? The Master of the Rolls suggested that these are questions that must be resolved now, not after the technology has overtaken the system. The courts have a duty to preserve the legitimacy of justice as well as its efficiency.
Linked to this is the broader issue of how digital justice will operate in practice. Under the Judicial Review and Courts Act 2022, the Online Procedure Rule Committee continues to develop the framework for digital dispute resolution, including the possible use of AI tools to triage claims or assist litigants. Sir Geoffrey has been one of the principal architects of this digital justice system, repeatedly emphasising that its purpose is to increase access and reduce delay, not to replace judicial discretion. His Legal Geek address reaffirmed that position while urging the profession to engage actively in shaping the boundaries of machine assistance.
The wider context: digital assets and future law
Towards the end of his speech, the Master of the Rolls turned to another aspect of digital transformation: the law of digital assets and smart contracts. He announced the creation of an International Jurisdiction Taskforce to promote cross-border alignment of private-law principles in this rapidly developing area, following on from the Electronic Trade Documents Act 2023 and the anticipated Property (Digital Assets etc) Act. This illustrates a coherent judicial vision of a digital legal order, in which AI, blockchain and data governance evolve together within a principled and rights-based framework.
Practical implications for the profession
For practitioners, the message is that understanding AI is now an integral part of professional competence. Firms must ensure that any use of AI systems is accompanied by clear governance, data-protection compliance and human oversight. The confidentiality of client information, protected by legal privilege and professional conduct rules, must be preserved in the digital domain. AI tools may assist, but they cannot displace the duty of care owed by the lawyer to the client or by the judge to the public.
For the courts, the challenge is to embrace the efficiency and analytical power of AI without eroding public confidence in human adjudication. As Sir Geoffrey made clear, the legitimate question is not whether AI will be used in the justice system but which tasks it should perform, under what governance structures, and with what safeguards for accountability and fairness. The transformation is inevitable; the preservation of justice’s human character is a matter of deliberate choice.
Conclusion
Sir Geoffrey Vos’s Legal Geek address was not a warning against technology but a call for principled engagement with it. The Master of the Rolls framed the issue not as a conflict between innovation and tradition but as a question of trust, ethics and legitimacy. The debate about what decisions should be made by machines, and which by people, is no longer theoretical. It is the defining legal issue of the years ahead.
Further Reading
- Professional Negligence
- Solicitors Negligence
- Lessons from Ayinde v London Borough of Haringey
- Conveyancing Negligence
If this issue affects you or your organisation, or you wish to discuss your legal position in confidence, please contact Carruthers Law today on 0151 541 2040 or 0203 846 2862, or email info@carruthers-law.co.uk. You may also reach us via our Contact Us page.
Disclaimer: This article is provided for general information purposes only and does not constitute legal advice. Carruthers Law accepts no responsibility for any reliance placed on the contents. This article may include material from court judgments and contains public sector information licensed under the Open Justice Licence v1.0.