Answering your questions: opinions on junior lawyers and AI

Overheard at the Law Society: at our recent junior lawyers event, artificial intelligence (AI) emerged as the standout issue, prompting thoughtful and practical questions from attendees. Here, we share the panel’s opinions on navigating some of AI’s ethical, regulatory and career development challenges.

What ethical and regulatory risks are there if a firm ‘blanket’ bans AI, but its lawyers secretly use publicly available AI tools in their work instead?

Louise Nicholson: Using publicly accessible AI tools when they are not permitted is a significant ethical risk. Doing so would breach confidentiality and may jeopardise legal professional privilege.

While some firms remain cautious about AI, I think they are in the minority. Clients increasingly expect us to use these tools, and firms that resist may struggle.

Harry Clark: A policy that says ‘don’t do this’ is only as effective as a firm’s ability to enforce it. Training and auditing are just as important as policy. Individuals need to understand how these tools work so they can make informed judgement calls about their use within that framework.

Dan Kayne: From an in-house perspective, this is a real risk. What’s to stop a non-legal colleague turning to AI for advice rather than waiting for the lawyer? As AI gets more sophisticated, this will become a genuine ethical and regulatory challenge.

Harry: To add to that, these behaviours (cutting corners, misattributing case law and so on) have existed long before generative AI. I’m sure every lawyer has experienced a discussion with a non-lawyer client trying to tell them how the law works! Unsupervised AI use will compound that.

Four solicitors seated in a row at a panel discussion

As pressure grows to reduce our environmental impact, what practical steps can firms take to respond to the sustainability issues linked to AI?

Louise: This is a tough question. Given the energy and water consumption of data centres, firms must consider the environmental impact of AI and tech tools and assess where they can offset it.

For example, technology has made it possible to meet overseas clients without flying, which could go some way towards offsetting environmental impact – although I appreciate that isn’t AI-specific.

Harry: I have three insights from innovation colleagues. First, there’s an optimistic view that the cost of inference (using AI) is falling and may reach a point where its economic and environmental impact is negligible. However, many other people are sceptical about this argument and I can see why.

Secondly, AI strategists are exploring when a smaller language model – which has far fewer parameters and therefore a smaller environmental impact when trained – can do the job instead of a large language model.

Often, a smaller or less sophisticated model is enough to answer a query – or at least the part that matters – and requires far less energy than a blanket, one-size-fits-all approach that defaults to the biggest and best models with ‘reasoning’ capabilities.

Finally, it’s also about knowing when not to use AI. Sometimes a search engine will do the job just as well, with a much smaller environmental impact and without the risk of hallucinations.

Key terms

Artificial intelligence (AI) – the theory and development of computer systems able to perform tasks that usually require human intelligence, such as visual perception, speech recognition and decision-making.

Generative AI – a subcategory of AI that uses deep learning algorithms to generate new content such as text, images, audio or code.

Large language model (LLM) – an advanced AI system trained on an exceptionally large amount of text data to understand and generate human-like language.

Small language model (SLM) – a more compact version of an LLM, trained on specific data.

Data centre – a physical facility which houses the systems and related equipment behind AI technology.

How can junior solicitors develop experience and judgement in certain areas when we’re being encouraged to use AI? For example, if we’re told to test out drafting witness statements with AI.

Louise: Using AI to draft witness statements will inevitably limit your drafting experience – you can’t develop a skill you’re not actively using. To the best of my knowledge there’s no reported case or judicial guidance on using AI to produce witness statements yet.

Even so, in my personal opinion, I’d have questions about whether AI-drafted witness statements for trial would be compliant in the business and property courts. There are other, better use cases for AI within law firms, especially for efficiency gains. (Since this event, the Civil Justice Council has published a consultation paper on AI use in court documents.)

Dan: From the client’s perspective, their questions will be: “How long will it take you, compared to if you’re using AI?” and “How much will it cost?” In the future, partners will still need to sign off statements, but I do think using AI to produce statements could eventually become the norm.

But I recently read an article arguing we’re experiencing ‘brain fog’ because so many tasks now require minimal thought. Without that sense of ownership, people feel less inspired, and the pride that once came with completing work is fading. Studies show this is already affecting wellbeing. It brings up a host of issues.

On a more personal level, if these tools had existed when I was a junior, I’d have thought, ‘this is bloody brilliant! I don’t need to spend so much time on a task’. 

Collage of the three speakers talking at the event.

If we rely on AI for drafting, how can junior lawyers develop their own drafting style and professional identity?

Dan: Realistically, I think clients won’t care. Ultimately, this comes down to what people are willing to pay for.

In terms of professional identity, what matters more than drafting style are the relationships you cultivate. As drafting becomes a level playing field, it’s your relationship-building skills that will set you apart.

I have heard of a new AI tool developed by a former private practice lawyer that captures a partner’s thinking as they draft. It interrogates every drafting decision to help others understand the reasoning behind each choice.

With technology like this, experience could be shared digitally rather than only through being in the room, listening to the partner and taking notes.

Harry: There will always be information that never finds its way into an AI training dataset. In my practice, clients often bring us novel products that don't fit traditional definitions or want help with emergent laws.

An AI tool might give an interpretation based on the letter of the law, but in some cases our team’s experience – and conversations with regulators or policy makers – can lead to a more appropriate interpretation. Those insights aren’t easily fed into AI.

I want to know more

Watch the panel event in full

Catch up on the key challenges facing junior lawyers in 2026, which also features insights from LittleLaw’s Idin Sabahipour.

Be part of the Junior Solicitors Network

This free community for junior members is a space to discuss issues of concern, make your views heard and network with other junior lawyers.

Understand AI in practice

Build your confidence navigating AI’s fast-moving developments. 

Explore our AI and lawtech hub for guidance tailored to solicitors.