A Documented Risk Analysis for Litigation Firms
Why High-Sensitivity Matters Need Private AI
Every architecture that promised impenetrability has been breached, subpoenaed, or compromised by the humans in the chain. This is not a trend. This is the record.
It is established, documented, and knowable.
The System Was Never the Problem. The Humans Always Were.
Ross Ulbricht built the most sophisticated anonymous marketplace in history. Tor hidden services. Encryption. Bitcoin. No names. No addresses. No paper trail. The FBI took it down in a San Francisco library in 2013. Ulbricht was sitting at his laptop.
The cryptography was fine. The architecture around the cryptography was not.
The NSA did not hack Bitcoin. They mapped the humans. Wallets fell because the people holding the keys made mistakes, trusted intermediaries, or got subpoenaed. This is the template. It repeats.
Right now, law firms are uploading privileged client data to cloud AI platforms built on the infrastructure of major frontier model providers. Deposition strategies. Settlement positions. Attorney work product. Into systems resting on the same architectural pattern that has failed every time it has ever truly mattered.
"Legal AI is not a new category of risk. It is an old category of risk wearing a new interface."
— Architectural risk analysis, CloseVector
The Breach Timeline No Vendor Will Send You
These are not edge cases. They are the pattern. Government compulsion. Nation-state intrusion. Insider theft. Ransomware. Each event on this timeline involves infrastructure that legal AI products run on today.
Microsoft — First Partner Recruited into PRISM (2007)
Microsoft becomes the first company recruited into the NSA's PRISM surveillance program. Reporting by The Guardian and The Washington Post documents that Microsoft collaborated with intelligence services and that the Skype collection capability was subsequently expanded dramatically. Whether fully voluntary or legally directed, Microsoft was first and went furthest.
Skype Acquisition — Surveillance Triples (2012)
Nine months after acquiring Skype, Microsoft helps triple the volume of Skype video calls collected through PRISM. Rapid integration for surveillance. Not an accident.
Silk Road Seized — The FBI Sat Next to Ulbricht (2013)
The most technically sophisticated anonymous marketplace in history falls. Not through a cryptographic flaw. An agent sat next to Ulbricht at his laptop in a San Francisco library and grabbed the machine while it was open and unlocked, before its encryption could ever engage. The perimeter was irrelevant.
CLOUD Act — No Jurisdiction Is Safe (2018)
Congress passes the Clarifying Lawful Overseas Use of Data Act. U.S. companies must produce data regardless of where it is stored. A Microsoft executive later confirms on the record: they cannot guarantee data in Europe is safe from U.S. authorities. Not because of a hack. Because the law requires it.
Grubman Shire Meiselas & Sacks — Lady Gaga's Contracts on the Dark Web (2020)
One of the most powerful entertainment law firms in the world. Celebrity contracts, settlement terms, confidential strategy — auctioned publicly because they refused to pay. Law firms are the most targeted professional services sector for ransomware. The data is valuable, the systems are old, and the partners are busy.
OpenAI Internal Breach — Undisclosed for Nearly a Year (2023)
Attackers access OpenAI's internal messaging systems and extract details about AI research and architecture. OpenAI does not disclose the breach publicly for nearly a year. Most legal AI products run on OpenAI's infrastructure. The law firms using those products were never told.
Storm-0558 — Chinese Actors in U.S. Government Email on Microsoft Infrastructure (2023)
Chinese state-sponsored hackers compromised senior U.S. government email accounts hosted on Microsoft infrastructure, persisting for at least six weeks before detection. The U.S. Cyber Safety Review Board subsequently called Microsoft's security culture "inadequate" and found the company had "drifted away" from its Trustworthy Computing ethos.
NYT v. OpenAI — Deletion Promises Evaporate Overnight (2025)
The court orders OpenAI to halt its standard 30-day data deletion process. One preservation order. The vendor's deletion policy — the one in the contract the law firm signed — is legally void. The court did not ask for permission. Critically: the only entities whose data received protection were those with direct enterprise Zero Data Retention (ZDR) agreements. Third-party legal AI wrappers did not qualify.
Ex-Google Engineer Convicted — AI Secrets to China
Linwei Ding convicted on 14 federal counts for stealing over 2,000 pages of Google AI trade secrets — TPU architecture, GPU systems, SmartNIC designs — and uploading them to his personal cloud account. The first AI-related economic espionage conviction in U.S. history. He used Apple Notes to bypass detection. His defense argued that Google chose openness over security; the jury rejected the framing. The theft was still real.
Three Google Engineers Indicted — Trade Secrets to Iran
Three Silicon Valley engineers indicted for stealing hundreds of files — processor security, cryptography architecture, trade secrets — from Google and related technology companies and transferring them to Iran. They photographed screens to evade detection. Signed false affidavits. Destroyed evidence. Google's own internal security systems caught them — but not before the data was gone. The people with keys are always the vulnerability.
Secretary of Defense Flags Major AI Vendor as Supply Chain Risk
The U.S. Department of Defense designates a leading AI vendor as a national security supply chain risk. This is the infrastructure that legal AI products are built on. The Pentagon won't trust it with defense data. You are trusting it with privileged client communications.
The Threats Don't Come From One Direction.
Most firms that have assessed cloud AI risk evaluated one threat at a time. The actual risk environment is three simultaneous vectors, each enabled by the same architectural decision: data leaving the building.
From Above: Government Compulsion
The CLOUD Act requires U.S. providers to produce data on demand, regardless of jurisdiction. No contract clause overrides federal law. Zero Data Retention (ZDR) does not stop a subpoena. Your vendor's privacy policy is not a shield against a warrant.
From Outside: Nation-State Actors
Storm-0558 persisted on Microsoft infrastructure for at least six weeks before detection. State-funded operators targeting hardened commercial systems. Patient, stealthy, and already inside some upstream environments.
From Below: Ransomware & Insiders
Law firms are the most targeted professional services sector. Grubman Shire. Campbell County Health. One phishing email. One misconfigured permission. One disgruntled employee photographing screens the night before they leave. The data doesn't need to be exfiltrated at scale. It needs to leave once.
Confidential Compute Is a Padlock on a Glass Door.
When confronted with the risks above, vendors offer confidential compute as the answer. Trusted Execution Environments. Hardware-level encryption. "Your data is encrypted even from us."
Here is what they are not telling you.
Confidential compute does not eliminate trust dependencies. It redistributes them. When you run your data in a cloud TEE, you are now trusting: the silicon vendor that fabricated the enclave; the microcode and firmware that boot it; the attestation service that vouches for it; the cloud provider that hosts it; the application vendor whose code runs inside it; and the jurisdiction whose courts can compel any of them.
You replaced one trust relationship with six and called it security. Your vendor calls it "enterprise grade." The federal government calls it "compliant."
Even On-Premises Infrastructure Has Failure Modes.
This Is Why You Need a Professional Deployment.
Local compute eliminates the cloud's trust chain. But an unmanaged local deployment introduces its own risk surface. The goal is not "local." The goal is properly deployed, encrypted, and isolated local.
The Disgruntled Employee
An associate with file system access copies privileged case data to a personal device on their last day. Without per-matter encryption with access-controlled keys, there is no cryptographic barrier between the employee and the corpus. Local storage without LUKS-level per-matter encryption is a filing cabinet without a lock.
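In concrete terms, "per-matter encryption with access-controlled keys" means one dedicated cryptographic container per matter. Below is a minimal sketch in Python, assuming a Linux host with cryptsetup installed; the device path, mapper name, and key-file handling are illustrative placeholders, not a production design.

```python
import subprocess
from pathlib import Path

def create_matter_container(device: str, matter_id: str, key_file: Path) -> None:
    """Format a dedicated block device as a LUKS2 container for one matter."""
    # AES-256 in XTS mode requires a 512-bit key (two 256-bit halves).
    subprocess.run(
        ["cryptsetup", "luksFormat", "--type", "luks2",
         "--cipher", "aes-xts-plain64", "--key-size", "512",
         "--batch-mode", "--key-file", str(key_file), device],
        check=True,
    )
    # Map the container so it can carry a filesystem for this matter only.
    subprocess.run(
        ["cryptsetup", "open", "--key-file", str(key_file),
         device, f"matter-{matter_id}"],
        check=True,
    )

def destroy_matter(device: str) -> None:
    """Crypto-erase: wipe every LUKS keyslot for this matter's container."""
    # The ciphertext can remain on the disk; without a keyslot it is noise.
    subprocess.run(["cryptsetup", "luksErase", "--batch-mode", device], check=True)
```

An associate copying files from a mounted container is still a risk. The barrier is that an unmounted, unkeyed container cannot be copied in any useful form.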
The Unencrypted Drive
A workstation is stolen or decommissioned without proper data sanitization. Without full-disk encryption and key destruction protocols, the data is readable on any machine. Physical access is logical access without encryption.
The Shared Network Mount
A local model deployment with open SMB shares means every person on the network can access every matter's document corpus. Network segmentation and per-matter access controls are not default configurations. They must be engineered deliberately.
The Update Channel
Without hardware data diodes controlling what enters and exits the deployment environment, the update mechanism becomes an attack surface. A compromised update delivered over a standard network connection can exfiltrate data invisibly. Physics, not policy, must govern what crosses the boundary.
"Local compute is necessary. It is not sufficient. The gap between 'we bought the hardware' and 'we have a defensible deployment' is where firms get exposed."
— CloseVector deployment architecture principles
The Bar Has Already Ruled on This. You Have a Duty.
This is not a technology debate. It is a professional responsibility question. Two decisions define the current landscape for attorneys using AI tools to handle privileged data.
Judge Rakoff ruled from the bench on February 10, 2026 — with a written opinion filed February 17 — that 31 documents a defendant generated using Claude were not protected by attorney-client privilege or the work product doctrine. The court held: (1) Claude is not an attorney; (2) Anthropic's privacy policy, to which the user consented, defeats any reasonable expectation of confidentiality because it permits data collection, training use, and disclosure to third parties including government authorities; (3) the materials were not created at counsel's direction and therefore did not constitute work product. This is the first ruling of its kind on AI-generated documents and privilege.
ABA Formal Opinion 512 (July 2024) and Model Rule 1.1 Comment 8 together establish that attorneys have a specific, non-delegable duty to understand the technology they use to handle privileged client data. Competence in the AI context means understanding what happens to client data when it enters a cloud platform — who can access it, under what legal authorities it can be compelled, and whether contractual deletion promises survive a preservation order. This is not a vague professional aspiration. It is a concrete obligation with disciplinary consequences.
In NYT v. OpenAI, the court ordered OpenAI to immediately halt its standard 30-day data deletion process. One preservation order rendered the vendor's deletion policy legally void overnight. The court did not consult the vendor's terms of service. It issued an order. The data that clients had been told would be deleted in 30 days was suddenly subject to indefinite preservation and potential discovery.
The exception is instructive: the only entities whose data received protection were those with direct enterprise Zero Data Retention (ZDR) agreements with the model provider itself. Third-party legal AI wrappers — the products that sit on top of frontier model APIs — did not qualify for this protection. Their clients' data remained subject to the preservation order.
Rule 1.6 requires attorneys to make reasonable efforts to prevent inadvertent or unauthorized disclosure of client information. The standard is "reasonable efforts," not perfection. But reasonableness is evaluated against what was knowable at the time. When the risk of cloud AI disclosure is this documented — this public, this adjudicated — a firm that continued using cloud AI for privileged matters without assessing that risk will face a difficult argument that their efforts were "reasonable."
The Risk Profile Is Already Established.
When the first case surfaces where privileged strategy leaked through a cloud AI platform, it will not be a surprise to anyone who has read this far.
The firm will not be able to say the risk was unforeseeable. The cases were decided. The architecture was documented. The breach history was public. The legal obligations were established.
That is the definition of a knowable risk. And knowable risk that materializes into client harm creates serious malpractice exposure.
The firms getting quietly furious are not the laggards. They are the ones who read Heppner, looked at their cloud AI contracts, and understood the gap. They are not angry at the technology. They are angry that they were sold a solution to a problem that the solution itself creates.
"The risk profile is established. The only remaining question is which firm, which client, which case."
There Is One Architecture With No Upstream Trust Dependency.
The hardware is in your building. You hold the only key. The data never leaves. Government compulsion applies to your vendor, not to hardware you own. Nation-state actors target networks you are not on. Insider threats are contained by per-matter encryption with keys you control and destroy.
This is not theoretical. It is an engineering specification. And it requires professional deployment to be defensible.
Per-Matter LUKS/AES-256 Encryption
Each matter receives its own cryptographic container with a unique key. When the retention period ends, the key is destroyed. The data becomes mathematically inaccessible — not deleted, not archived, irrecoverable. This is the only technical architecture that satisfies "destroyed" in a legally meaningful way.
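To make "mathematically inaccessible" concrete, here is a toy illustration in Python using the cryptography library; it shows the principle at the application layer, while LUKS enforces the same property at the block-device layer. The key handling is deliberately simplified for illustration.

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# One unique 256-bit key per matter. The key, not the ciphertext,
# is the object under retention control.
matter_key = AESGCM.generate_key(bit_length=256)
nonce = os.urandom(12)
ciphertext = AESGCM(matter_key).encrypt(nonce, b"privileged deposition outline", None)

# Retention period ends: destroy the key. The ciphertext can persist
# on disks and in backups indefinitely; it is now unrecoverable noise.
matter_key = None

# Recovery would require brute-forcing a 256-bit keyspace. There is
# no "undelete" to litigate over and no archive to subpoena.
```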
Hardware Data Diodes
Physics, not policy, governs what crosses the network boundary. A hardware data diode allows information to flow in one direction only — enforced at the electrical level. No firmware vulnerability, no misconfigured firewall, no social engineering attack can create a reverse channel that does not physically exist.
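For illustration, software on the transmit side of a diode-protected link has to be written around that physics. A minimal Python sketch follows, with a hypothetical receiver address; the one-way guarantee comes from the hardware, not from anything this code does.

```python
import socket

# Address of the receive side of the one-way link (illustrative).
DIODE_TX = ("10.0.0.2", 5005)

def send_across_diode(payload: bytes, seq: int) -> None:
    # UDP, because a one-way link cannot carry acknowledgements: a TCP
    # handshake is physically impossible across a diode. Sequence
    # numbers (plus, in real deployments, forward error correction)
    # stand in for retransmission.
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(seq.to_bytes(8, "big") + payload, DIODE_TX)
```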
Verification Before Production
Every output is anchored to a source document before it reaches the attorney. No citation without a document ID. No summary without a verifiable reference. The audit trail is cryptographic. When opposing counsel challenges the methodology, the response is a hash, not a vendor's assurance.
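A minimal sketch of what a hash-anchored citation can look like; the record format and function names are illustrative, not a production schema.

```python
import hashlib
import json

def anchor_citation(doc_id: str, doc_bytes: bytes, quote: str) -> str:
    """Bind an output to its source: document ID plus a SHA-256 digest
    of the exact bytes the quote was drawn from."""
    record = {
        "doc_id": doc_id,
        "sha256": hashlib.sha256(doc_bytes).hexdigest(),
        "quote": quote,
    }
    return json.dumps(record, sort_keys=True)

def verify_citation(record_json: str, doc_bytes: bytes) -> bool:
    """When the methodology is challenged, recompute the digest from the
    source document and compare. No vendor assurance required."""
    record = json.loads(record_json)
    return hashlib.sha256(doc_bytes).hexdigest() == record["sha256"]
```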
Forward-Deployed Professional Architecture
The gap between "we bought the hardware" and "we have a defensible deployment" requires expertise. Network segmentation, access controls, update channel isolation, key management protocols, chain of custody documentation — these are not default configurations. They are engineered deliberately by someone who has done it before under adversarial conditions.
Not a policy. Not a contract. Not a TEE attestation. Not a confidential compute enclave running on someone else's silicon.
A key. Yours. In your building.
Everything else is a story you're telling yourself while someone upstream reads your mail.
That's what CloseVector is for.
The Risk Is Documented.
The Duty Is Established.
The Next Step Is Yours.
CloseVector builds air-gapped legal AI infrastructure for deployment inside your firm's walls. No cloud. No upstream trust dependency. No vendor with a better lawyer than you.
We excavate. We don't fabricate.