The Oracle

Using AI in law firm applications

updated on 21 May 2024

Dear Oracle

Should I use AI in my applications to law firms or chambers?

The Oracle replies

Reading time: seven minutes

Anticipation. Fear. Excitement. These are just a few of the emotions we all felt when ChatGPT burst onto the scene in November 2022. With its introduction came endless questions about its use and, as we continue to grapple with the technology and more generative AI tools are created, the debate about whether you should be using them in your law firm applications continues.

In our recent AI masterclass with Shoosmiths, we asked attendees whether they’d used – or considered using – AI in their law firm applications. Of those who answered, 59% said yes, 30% said no and 11% said they weren’t sure. With this in mind, here’s our advice if you’re wondering whether to use AI in your vacation scheme, training contract or pupillage applications.

Identify the firm’s or chambers’ stance

As firms come to terms with AI and consider whether they want to see it used in applications, your best bet is to check whether the firms or chambers you're applying to have released any communication on the subject.

Published law firm guidance

Shoosmiths recently published general guidance about the use of AI in the firm’s training contract applications, and what it wants to see from candidates. The guidance highlights the value that the firm places on innovation and the need to embrace this technology, rather than avoid it.

While Shoosmiths makes it clear that AI shouldn’t be used in place of the individual making the application (ie, copying and pasting generic AI-generated answers), the guidance indicates that the firm isn’t rejecting its use from the outset.

In fact, Shoosmiths welcomes the “responsible” use of AI to “enhance” application answers – for example, using it as a starting point for your research. The firm is quick to emphasise how important it is for candidates to retain their “unique voice” if they’re opting to use AI. Plus, applicants must show that they understand the benefits and limitations of such tools, and the impact the technology is having on the legal profession. If applicants choose not to use AI, the firm expects them to provide reasons for this decision. As mentioned in the masterclass, it’s about being able to justify your opinions and AI adoption.

With Shoosmiths the first to do so, it’s likely that other law firms will soon publish similar guidance as they get to grips with the technology, so keep your eyes peeled for more communication in the future.

Find out about the pros and cons of using AI to do your legal work in this Commercial Question from Shoosmiths trainee Yasmin Brown.

AI and networking

If you can’t find any communication online, we don’t think there’s any harm in asking firms what their stance on AI is at law fairs or other networking opportunities such as LawCareersNetLIVE. It’s important to be sensitive in the way you ask the question, but the response will likely give you a good indication of whether you should use AI in your application to that specific firm.

Read our guide to networking on LawCareers.Net for more tips on this useful skill.

Application form tick boxes and AI statements

The firm’s stance might also become obvious from its application form – for example, if a firm asks you to tick a box to confirm you’ve not used AI, this could be a clear sign that it doesn’t want to see AI being used. Barrister chambers are also adding tick boxes to this effect on their application forms for pupillages and mini-pupillages. For example, towards the end of 12 King’s Bench Walk’s mini-pupillage application, there’s a line asking applicants to confirm that all of the application is their “original work, and has not been written, drafted or generated, in whole or in part, by any automatic or computer-aided system such as AI”.

Some firms are even including AI statements in their applications that request candidates not to use AI to complete specific questions, while others are simply explaining that applications could be withdrawn if candidates are suspected of having used AI. It’s worth reading the entire application form (if possible) before you begin writing your answers so you know what’s expected of you, and where you should – or shouldn’t – be utilising AI.

Adapting application questions

We’re also seeing some firms asking applicants to explain how they’ve used AI (if they have) and, if they haven’t, why not. This could be an excellent opportunity for you to outline your understanding of AI (including its limitations) and how you used it to support your application.

If you think using AI fits in with the culture and values of the firm (eg, the firm prides itself on being dynamic and innovative, uses AI in its working processes or has its own legal tech division), you could use these questions to showcase your knowledge of the firm and why you’d therefore be a good fit.

The general consensus

The stance on AI in law firm applications is likely to differ between firms as the technology continues to evolve and organisations review how it can support their processes. While Shoosmiths has published guidance on AI in training contract applications, not all law firms have communicated their preferences (and whether they do or not is up to them) – a factor that you must bear in mind when putting your applications together.

That said, it’s clear, and understandable, that no law firm or chambers will be best pleased if you’re just chucking application questions into a generative-AI platform and copying and pasting the exact answers it provides. This doesn’t demonstrate intuitive or innovative thinking – instead, it suggests that you’re not taking the application process seriously and perhaps don’t understand AI, and your application could be rejected as a result.

However, if you’ve discerned that a law firm is open to the use of AI in its applications, here are some tips on how you should use it.

Three ways to use AI to support your application

Research

When used properly, AI can be a great research tool. However, as with any research you conduct, you must approach the AI-generated answers with caution – don’t take them as gospel. It might be that you ask AI to pull a list of commercial news stories relating to banking and finance or a list of notable cases that a specific firm has worked on in a particular area. You should then do some additional research to check the automated answers, build your own understanding around them and bring in any knowledge that you already have on the subject or case.

The key, as ever, is to maintain your authentic voice. Use AI to build a base or help you to get started – but then do your own analysis to develop your thoughts and opinions.

Checking spelling and grammar

Having an extra pair of eyes check over your application or CV has always been useful, and AI can do just that. Once you’ve written the application, you can ask a generative-AI platform to check it for any spelling and grammatical errors. There’s also nothing stopping you from putting in a single sentence to ask whether you’ve added a comma in the right place, for example.

Generating summaries

When you’ve written something yourself, it can be quite difficult to cut and edit it. This is often because you’re too close to the content, which makes it hard to identify repetitive or unnecessary sections. It’s for this reason that members of the LawCareers.Net content team proof each other’s work!

AI has the capacity to summarise copy that you’ve written if required. It’s then your job to review the summary and make sure your points still make sense. You must also remember to maintain your voice and authenticity – you don’t want AI to rewrite your application for you, but rather highlight the key points and set you on your way.

AI application don’ts

It might be tempting to rely heavily on AI for your applications, but law firms and chambers can easily see through answers that have been entirely generated by AI. There are many ways that recruiters can identify AI-generated answers; here are a few:

  • The answer feels generic, rather than unique to the applicant.
     
  • AI-generated answers can often be quite wordy and don’t match the style or tone of the applicant’s CV.
     
  • Answers created using AI often include inaccurate or biased information.

Now that we’ve outlined the various ways you can use AI to improve your application or CV, here are some things you should avoid.

  • Don’t use AI to create fully formed answers. Even when firms have said you can use it, it’s unlikely they’ll want you to input an answer that’s generated completely by AI. They want to see how you use it, as Shoosmiths said, to “enhance” your application – think research, for example.
     
  • Don’t say you’ve not used AI when you have. If a firm has asked you to confirm whether you’ve used the technology, be honest. If you get through to the next stage, you might need to prepare an answer explaining why and how you used it (or why you didn’t), how it made you more efficient or how your use of it could be translated into the work of a practising solicitor, for example.
     
  • Don’t take research you do using AI as the complete truth. You must check each source and build on the research yourself – remember, AI is a great starting point but there’s every chance it’s pulled some biased or inaccurate details, and it’s your job to check them.
     
  • Don’t use AI without fully understanding how it works; it’s crucial that you recognise what it can do alongside its limitations.

It’s important to emphasise that AI isn’t a substitute for you – it’s a useful tool that, if used appropriately, can show off a new and evolving skill set that, ultimately, law firms and chambers are going to require in future. However, not everyone is open to it yet, so approach your applications with this in mind.