Artificial Intelligence: Learned Friend or Foe?
Filing a brief in court referencing case authorities, finding out that the cases were actually made up (because no one – not opposing counsel, nor even the judge – could find them), and then blaming ChatGPT for it – that sums up the predicament of a lawyer in the United States who used ChatGPT “unaware of the possibility that its content could be false”.1 Had the lawyer taken the time to verify the case references before submitting them, he would neither have made international news nor been fined US$5,000 for his actions.2
Closer to home, in Singapore, a self-represented person also consulted ChatGPT for a list of cases and their summaries supporting a particular legal proposition. ChatGPT supplied five fictitious cases, which came to light when opposing counsel looked them up and found they did not exist.3
These two cases perfectly illustrate the key issues arising from the growing use of artificial intelligence (AI) in the practice of law:
- In the first case – the lawyer should have known better and verified the case references himself (or had an associate verify them). This leads to the broader question of how lawyers should be using AI in the course of their work. Will using AI be an acceptable delegation of duty (if its use is properly supervised) or a dereliction of duty?
- In the second case – the self-represented person’s use of ChatGPT highlights the growing demand from the public for legal AI tools intended for laypersons, so they can access the law and “self-help”, and the legal profession must adapt to it.4 This leads us to examine what the role of a lawyer in society is, what aspects of a lawyer’s work AI can replace, and whether AI can facilitate greater access to (do-it-yourself) justice.5
To address the issue of the impact of AI on lawyers,6 we must first understand what AI is and how it works. Artificial intelligence7 is achieved primarily through machine learning, where a machine automatically learns and improves from experience instead of requiring explicit programming to reach an outcome.8 There are many applications of AI, from aiding decision-making to generating images or text based on prompts entered. In all cases, AI is trained on copious amounts of (past) data, so that it recognises patterns or attributes in the data to make predictions and find connections.9
Knowing this will help you put things in perspective: AI actually depends on a great deal of human effort behind the scenes – humans gather and decide what data to train it on, manage its training process, give feedback on the responses it produces, and update it with information about the latest legal developments so that it stays relevant. Humans can never be removed from the equation.
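For the technically curious, the idea of “learning patterns from past data” can be sketched in a few lines of Python. This is a toy illustration only – the case summaries and outcomes below are entirely made up, and real legal AI systems are vastly more sophisticated – but it shows the basic shape: the “training” step counts which words have appeared with which outcome, and the “prediction” step picks the outcome whose past vocabulary best matches a new summary. No rule is explicitly programmed; the pattern comes from the data.

```python
from collections import Counter

# Hypothetical past "cases": a short factual summary paired with its outcome.
training_data = [
    ("contract breach late delivery damages", "claim succeeded"),
    ("contract breach force majeure excused", "claim dismissed"),
    ("negligence duty of care breach damages", "claim succeeded"),
    ("negligence no duty owed remote", "claim dismissed"),
]

def train(examples):
    """'Learn' by counting how often each word co-occurs with each outcome."""
    counts = {}
    for text, outcome in examples:
        counts.setdefault(outcome, Counter()).update(text.split())
    return counts

def predict(model, text):
    """Score each outcome by how familiar the new text's words are, pick the best."""
    scores = {outcome: sum(words[w] for w in text.split())
              for outcome, words in model.items()}
    return max(scores, key=scores.get)

model = train(training_data)
print(predict(model, "breach of contract claiming damages"))  # → claim succeeded
```

Notice that the quality of the “prediction” depends entirely on the data humans chose to feed in – which is precisely the point made above.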
Co-existing with the Machines … Or Are They Coming for Your Job?
Law firms are already using AI in a variety of ways, from discovery and document review, to legal research, predicting the outcomes of court cases and reviewing contracts. With the advent of generative AI like ChatGPT, law firms are also looking into how they can harness this technology to aid with legal research and drafting.10 Right now, ChatGPT is known to “hallucinate” (i.e. give made-up information), but developers are working intensively on ways to fix this so that the results are more reliable.
The use of AI in the workplace is not something that can be stopped, but is in fact gaining momentum – the Singapore civil service has incorporated ChatGPT into its work processes this year, with guidelines on AI usage.11 Reading the news and opinion pieces on how the legal profession will fare, there is cause for optimism as well as caution.12 For example, a March 2023 report from Goldman Sachs estimates that about 44% of legal work tasks could be automated.13
Singapore has also grappled with this balance, where our Second Minister for Law remarked that AI “can help you with the base material, but it cannot replace the creativity that the lawyer can bring to the team”.14 The prevailing sentiment is that AI will take away the parts like research and sifting through large volumes of documents, and leave lawyers more time to do analysis and strategy,15 such that “the value proposition of lawyers will lie in the complex legal work that cannot be effectively provided by technology”.16
The nature of legal work will certainly change, with different tasks and different jobs created (although it is difficult to predict exactly how at this point). Some lawyers could be training chatbots and reviewing their output for accuracy so that the models may be fine-tuned. A prominent UK firm put out a notice for a “GPT legal prompt engineer” earlier this year to help it apply ChatGPT to its work.17
What Remains for the Lawyers?
So what is the essence of being a lawyer, if a machine can be programmed to do some of the work that we do? A very clinical answer would be what’s in section 33 of the Legal Profession Act 1966 (which sets out the circumstances under which an “unauthorised person” acts as an advocate or solicitor), but being a lawyer is so much more than that!
(1) Judgment, empathy and the ability to read a room
One of the first things we should understand is that clients don’t have a “legal problem” – they just have a problem, and law is one of the tools that can be used to solve it. Clients also don’t give you the facts of their case neatly laid out like a law school hypothetical. You have to find out what they need, think about the gaps in the information you have, and weigh the pros and cons of asking about something that may turn out to be irrelevant and frustrate the client. This requires judgment, empathy and the ability to read a room on the part of the lawyer, to understand what the client is telling you, not telling you, or does not know how to tell you – that really cannot be outsourced to a machine!
(2) Customising the solution to the client
Second, even if an AI system were to say that the likelihood of a positive outcome in court is “70%”, or “high”, that number or estimate alone is not helpful to the client or the lawyer in deciding whether the client should proceed with the case, and what strategy to adopt. Lawyers are still needed to uncover what the client’s interests are (and the client’s risk appetite18), explain the strengths and weaknesses of the case, and provide a legal strategy.
Lawyers also need to assess whether a solution on paper (regardless of whether it is generated by AI) would work in reality, which requires a nuanced understanding of the client’s business.19 And where another party is involved, lawyers must be able to put themselves in the shoes of that other party to assess whether the solution is also palatable to them, so that they are more likely to accept it.
(3) Adaptability; thinking on one’s feet
At present, generative AI can give a good first draft, but you still have to check it and refine it. For example, you could ask it to come up with cross-examination questions after feeding it parties’ statements – or you could prepare a list of such questions yourself. However, if at trial the answer from a witness is not as expected, the lawyer must be able to deviate from “script” within seconds to get the answers needed to prove the case, and have the flexibility to try various approaches based on the witness’ reaction.
New Rules for the Legal Profession with the Use of AI?
There will be a host of interesting questions for the legal profession to address as AI-adoption becomes more commonplace, such as:
- If you rely on AI, must you disclose that you use AI to the client or to the Court, or both? If lawyers must disclose the use of AI, what uses will this apply to? After all, we all use technology to make our lives easier – would it be overkill to disclose every single use?
- Will there be a requirement to certify that if generative AI was used, that the content generated was subject to a lawyer’s review?
Recently (in May 2023), a Texas judge was the first to direct lawyers appearing before him to file a certificate that “no portion of any filing will be drafted by generative artificial intelligence (or that any language drafted by generative artificial intelligence […] will be checked for accuracy, using print reporters or traditional legal databases, by a human being before it is submitted to the Court)”.20
- Will there be a requirement or expectation for lawyers to use AI tools – a duty to keep up to date on technological developments? 21 Will it be seen as professionally negligent not to use such tools if they have a higher rate of accuracy than a human?
The answer may in part depend on how comfortable clients are with the technology (which determines whether they want their lawyers to use it), and whether lawyers can justify that the use of such technology is in the best interests of their clients (as clients would rely also on their lawyer’s judgment). Will firms that do not use AI be viewed as “out-dated” and “out-of-touch” when AI is the “new normal” in the legal sector?22
When Technology Crosses the Legal Line
AI tools also raise the question of whether they are providing unauthorised legal services, especially where they are designed for use by members of the public directly (as opposed to internal use within a law firm). There is a very fine line between legal advice (which only lawyers may provide) and legal information. It also raises the question of who will assume liability if the advice is incorrect.
An interesting case is that of DoNotPay, a New York-based startup which had described itself as a “robot lawyer”.23 DoNotPay’s plan was to assist persons challenging speeding tickets in court: the litigant would wear smart glasses to record court proceedings, have responses generated by AI text generators including ChatGPT, and have those responses fed into his ear from a speaker. The first such AI-assisted defence was scheduled to take place in California in February 2023, but it was cancelled after DoNotPay received notices from multiple state bars about the unauthorised practice of law; the company is now under investigation by the state bars.
Nevertheless, since there is no lawyer on record, can we say that the litigant is still in effect representing himself, and that he can decide for himself each time whether to follow what the AI text generator recommends? This is especially so if the litigant is aware of the risk of “mistakes” (hallucinations) in the advice from such tools – should a person be allowed to rely on them, if he does so voluntarily and fully informed? These are issues we will have to grapple with as the use of AI becomes more prevalent.
What Can You Do to Stay Ahead in this Age of AI?
1. You still need to put in the effort to know the law – there are no shortcuts
Even with the use of AI tools, lawyers must still put in the hard work to familiarise themselves with the law in the areas they practice in. This is in order to —
- know what questions to ask the client so you have a complete set of facts to advise on, and what assumptions to make;
- identify the legal issues, which can only stem from understanding the subject matter, in order to engineer effective24 prompts for the generative AI system;
- review the output from the generative AI system to see if it is correct, and make the appropriate edits and additions to it – otherwise you are just correcting English, and not applying a legal mind;
- explain the reasoning behind your conclusion, or why you recommend one option over another.
2. Take time to understand the technology and experiment with it
Instead of just reading about the latest AI tool on the market, take some time to try it out and see for yourself how it enhances your work, as well as its limitations. There are many free articles on the web about AI, as well as books and videos – follow up with those that speak in a way you understand, and then pursue a few more to ensure you have both the correct information and a balanced perspective. It is important to understand the technology and its developments to distinguish fact from hype. This also helps you in asking the right questions of your clients if they approach you about AI.
3. Keep updated on the legal developments surrounding the use of AI
Lawyers aren’t the only persons navigating this new field – everyone is! Organisations will have questions about their employees’ use of generative AI like ChatGPT while on the job, or about the AI system that the organisation is developing, or planning to purchase from a developer.
At present, Singapore has not implemented general AI regulations (unlike the EU’s draft Artificial Intelligence Act), but there are guidelines issued by regulators, such as the Model Artificial Intelligence Governance Framework, to aid organisations deploying AI. It is important to think about whether this new technology has features that affect the application of our existing laws. Keeping updated on legislative developments, regulatory action and judicial decisions around the world will help you gain perspective on the risks of using AI and the methods of mitigating those risks, as well as the grey areas (e.g. IP issues relating to generative AI), enabling you to advise your clients (or your own organisation) effectively on their use of AI.
At the end of the day, it is about perspective – how do you see your role as a lawyer? If it is to solve people’s problems and ensure access to justice, AI can be harnessed as one more tool to help with achieving it.
Congratulations on being called to the Bar! My wish for each of you is that you will find a career that continues to excite and challenge you daily.
The views expressed in this article are the personal views of the author and do not represent the views of Drew & Napier LLC.
In the lawyer’s own words; see a report of the case at https://www.nytimes.com/2023/05/27/nyregion/avianca-airline-lawsuit-chatgpt.html
This example was brought up in Chief Justice Sundaresh Menon’s speech delivered at the 3rd Annual France-Singapore Symposium on Law and Business in Paris, France on 11 May 2023, accessible at: https://www.judiciary.gov.sg/news-and-resources/news/news-details/chief-justice-sundaresh-menon-speech-delivered-at-3rd-annual-france-singapore-symposium-on-law-and-business-in-paris-france at para 26.
See paras 16, 24 and 26 of CJ Menon’s 11 May 2023 speech above.
OCBC Bank rolled out a free online service for Singaporeans to prepare a will in less than 10 minutes, compared to hiring a lawyer for this purpose which would cost anywhere upwards of $100. This online will generator was referenced in CJ Menon’s speech at the 29th Inter-Pacific Bar Association Annual Meeting and Conference, available at https://medium.com/@singaporeacademyoflaw/deep-thinking-the-future-of-the-legal-profession-in-an-age-of-technology-6b77e9ddb1e9. The will generator also comes with appropriate caveats, informing the user to “seek legal advice from appropriately qualified lawyers for more specific Will requirements (e.g. Islamic law, persons under 21, not residing in Singapore etc).”
This article will focus on how AI affects the performance of legal work, and not other administrative tasks such as logging your time.
Artificial intelligence “refers to a set of technologies that seek to simulate human traits such as knowledge, reasoning, problem solving, perception, learning and planning, and, depending on the AI model, produce an output or decision (such as a prediction, recommendation, and/or classification)” – per the IMDA/PDPC’s Model Artificial Intelligence Governance Framework, available at: https://www.pdpc.gov.sg/-/media/files/pdpc/pdf-files/resource-for-organisation/ai/sgmodelaigovframework2.pdf.
This makes it ideal for deployment in the legal field since law is all about precedent (e.g. how were past cases decided; what contractual document was used in the past for a similar matter).
See, for example, the opinions expressed in the Australian Law Society Journal at https://lsj.com.au/articles/chat-gpt-is-putting-the-future-of-grad-lawyers-under-the-microscope/
See, for example, the opinions expressed by the developers of ROSS, marketed as “the world’s first artificially intelligent attorney”, at https://www.washingtonpost.com/news/innovations/wp/2016/05/16/meet-ross-the-newly-hired-legal-robot/
See para 29(b) of CJ Menon’s 11 May 2023 speech above.
Clients may still wish to pursue a case even if the likelihood of success is low, for reasons such as principle or reputation.
The Law Society hosted an online Colloquium on 19 May 2020, and in one of the panel sessions on Legal Ethics and Technology, views were sought as to whether lawyers should have an additional ethical duty to be “technologically competent” – see https://lawgazette.com.sg/news/events/the-future-of-lawyers-colloquium/
See the UK Department for Education’s statement on generative artificial intelligence in education, issued in March 2023, available at: https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/1146540/Generative_artificial_intelligence_in_education_.pdf – “We can only learn how to write good prompts if we can write clearly and understand the domain we are asking about. We can only sense check the results if we have a scheme against which to compare them. Therefore, generative AI tools can make certain written tasks quicker and easier but cannot replace the judgment and deep subject knowledge of a human expert.”
Author: Cheryl Seah
First published in the Special Issue for Mass Call 2023 of the Singapore Law Gazette