AI tools are revolutionizing legal work. They can draft contracts, analyze case law, and provide near-instant answers to complex questions. But while the technology is smart – impressively so – there’s one thing it’s not: a lawyer.
And that matters.
Because in the eyes of the law, AI doesn’t qualify for attorney-client privilege – the bedrock principle that keeps a client’s communications with their lawyer private and protected. If you’re relying on AI-generated advice, you may be putting one of the most important aspects of the client-lawyer relationship at risk: your secrets might not stay secret. And in a profession built on trust, that’s a problem.
Why Privilege Exists – and Why AI Doesn’t Qualify
Attorney-client privilege protects the confidentiality of legal advice shared between a lawyer and their client. It’s what allows clients to speak freely to their lawyers, without fear that their words will be used against them later.
But this privilege only applies when certain conditions are met. Usually, these conditions include that:
- The advice must come from a licensed attorney – someone authorized to practice law; and
- There must be a confidential relationship, whereby the client seeks legal counsel directly from that lawyer.
An AI tool, no matter how advanced, ticks neither of these boxes. It doesn’t hold a license. It hasn’t passed the bar. It doesn’t take an oath to act ethically or serve a client’s best interest. And that’s because AI isn’t a lawyer; it’s a tool – admittedly powerful and efficient, but entirely unrecognized by the legal system as a privileged source of advice.
The result? If a client uses AI to generate legal guidance, that communication is not protected. It could be subpoenaed. It could be hacked. It could be shared, and there’s no legal privilege to keep it out of the wrong hands.
The Risks of AI Legal Advice
AI tools promise convenience, but they come with real risks – risks that lawyers, clients, and legal tech creators should take very seriously.
1. No Guaranteed Confidentiality
For privilege to apply, legal communications must remain strictly private. But AI tools often store data on third-party servers, transmit it through the cloud, or retain user inputs for future “learning.” If those systems are compromised, confidential advice could become public.
2. Unauthorized Practice of Law
In most jurisdictions, offering legal advice without a license is illegal. Does that mean AI tools that generate unsupervised advice may violate rules against the unauthorized practice of law? The answer seems to be yes. That puts not just the tool, but also its creators and deployers, at legal risk.
3. Who’s Liable When AI Gets It Wrong?
Unlike a licensed attorney, an AI system has no professional duty to its “client.” If a tool provides incorrect advice – advice that leads to costly mistakes or missed opportunities – there is no clear path to accountability. Who’s to blame? The AI’s developers? The client for trusting it? These are not novel questions, but they remain relevant.
Without privilege, the damage could be compounded. Imagine an AI-generated memo being used as evidence against the very client who, thinking they had access to fast, reliable and affordable legal advice, relied on it.
Could Privilege Ever Extend to AI?
For now, privilege belongs to human lawyers. But as AI becomes more embedded in legal practice, the law may evolve. Here’s what the future could hold:
1. AI as a Lawyer’s Assistant
The safest approach today is using AI as a legal assistant – a tool that supports human lawyers but doesn’t replace them. If a lawyer oversees the AI’s work and incorporates it into their advice, privilege can still apply. The key is control. The attorney – not the AI – must remain responsible for the final product.
2. The Licensing of AI Tools
A more radical shift could involve AI systems being “licensed” or certified to perform legal functions under specific regulations. If this happens, privilege might extend to AI-generated advice, but only in scenarios where the system meets ethical and professional standards.
3. New Regulations for Legal Tech
Governments and bar associations may create new protections for AI legal tools – protections that, while not identical to attorney-client privilege, could safeguard confidentiality and limit liability for clients and developers alike.
For now, though, AI exists outside the privilege circle.
Now for Something Really Far-Fetched: AI Isn’t a Lawyer, But Could It Be a Witness?
If AI tools aren’t lawyers and their advice isn’t privileged, what role do they play in the legal system? Surprisingly, AI might become something far stranger: a witness.
Imagine this: A client uses an AI tool to generate advice. Months later, that very same advice gets subpoenaed as evidence. Instead of being protected, the AI’s outputs – its “conversations” with the user – become admissible in court. The tool, once trusted to help, has turned into a silent record of the client’s thoughts, plans, and vulnerabilities.
And this raises an unsettling question: if AI can be subpoenaed like a witness, does it become the first tool in history that records everything – and forgets nothing?
This isn’t just about privilege anymore. It’s about the unintentional risks we create by relying on tools that weren’t designed with human trust in mind. Lawyers and clients might assume AI outputs are private, fleeting, or disposable. But in reality, those outputs might live on servers, logs, or training datasets, ready to be retrieved, exposed, or weaponized later.
AI doesn’t protect itself. It doesn’t forget, redact, or understand confidentiality. And unless the law catches up, AI risks becoming the very thing no lawyer wants in the room: an uninvited witness who remembers everything.
This changes the conversation entirely. It’s not just that AI isn’t a lawyer. It’s that AI might become a risk – a record-keeper that could speak against you when you least expect it. So, here’s the bigger lesson: in the age of AI, legal confidentiality isn’t just about who you trust. It’s about what you trust – and whether the tools you rely on are designed to protect you or to expose you.