Mitigating AI risks and preventing deepfake attacks
There is vast opportunity for new AI technologies to be used within the legal sector – from supporting risk identification to drafting contracts.
But while AI has the ability to positively impact the legal profession, its adoption introduces risks such as:
- The potential for infringements of copyrights, trademarks, patents and related rights if tools are being trained on protected material without permission
- The misuse or disclosure of confidential, personal or sensitive information which can result in breaches of legislation
- The risk of hacking, data breaches and malicious cyber activities such as deepfakes
- The risk that generative AI will produce misleading, inaccurate or false outputs – including ‘hallucinated’ outputs, where AI produces highly plausible case law or statute law that is fabricated
- The risk that AI models reflect social bias in their output, resulting in output that is discriminatory or unfair
It is important to remember that solicitors using AI remain responsible for the work produced using those tools, and must ensure any information or documents are accurate and from genuine and verifiable sources.
To comply with the SRA Principles and Code of Conduct, practitioners should supervise AI use for quality control, as improper reliance on technology risks breaching professional obligations to uphold the rule of law, maintain public trust, and act with integrity.
How can firms manage the risks around AI use?
Firms should consider how they can mitigate the risks surrounding AI usage.
Deepfakes
Deepfakes involve the use of AI to create convincing forgeries of images, videos and audio recordings. These can be indistinguishable from genuine material, making it difficult to tell whether content is real.
Cyber criminals may create deepfake audio or video messages to deceive employees into divulging sensitive information or authorising fraudulent financial transactions.
Criminals may transform existing content by swapping one person for another. They may also create entirely original content where a person appears to say or do something that they did not.
Key warning signs of deepfakes include:
- Audio issues: odd noise distortion in the background or voice quality
- Sync problems: disconnection or delay between speech and mouth movement
- Visual anomalies: pixelation or lack of visual clarity, or an absence of or unusually patterned blinking
In the LexisNexis Cybersecurity and AI 2025 Report, 24% of legal professionals cited AI-generated threats such as deepfakes and synthetic email scams as their second biggest concern after phishing.
Why are solicitors at risk?
Although technical controls may be in place to prevent cyber-attacks, deepfakes target human trust. Deepfake technology is also evolving at a fast rate, making it essential for businesses to continuously monitor and improve their deepfake detection capabilities.
Law firms are particularly vulnerable to attacks as they often manage substantial sums of client money. Advances in deepfake technology are a particular threat in conveyancing and property transactions.
Criminals can convincingly impersonate sellers or agents using deepfake technology, resulting in solicitors unwittingly facilitating fraudulent transactions. Additionally, the nature of conveyancing transactions provides cyber criminals with both the method for committing fraud and the means to launder stolen funds effectively.
How can firms manage the risk of deepfake crime?
To manage the threat of deepfakes, law firms should implement a robust, multi-faceted security strategy.
This article is written by Paragon as a hosted feature on the Law Society website. Views expressed are Paragon’s.
This article is provided for general information only. It is not intended to amount to advice on which you should rely. You should obtain professional or specialist advice before taking, or refraining from, any action on the basis of the content in this article.
Paragon is a partner of the Law Society. This article is written in collaboration with Beale & Co, who offer support to Paragon’s LawSelect clients when claims or circumstances arise.
If you would like to discuss this article or how Paragon can help with professional indemnity now or in the future, contact their team.