A Documented Risk Analysis for Litigation Firms

Why High-Sensitivity Matters Need Private AI

Every architecture that promised impenetrability has been breached, subpoenaed, or compromised by the humans in the chain. This is not a trend. This is the record.

"The risk is not hypothetical. It is not emerging.
It is established, documented, and knowable."
2007 Microsoft — first partner recruited into PRISM
3+ Confirmed Google insider theft incidents — 2024–2026
0 Deletion promises that survived a federal preservation order

The System Was Never the Problem. The Humans Always Were.

Ross Ulbricht built the most sophisticated anonymous marketplace in history. Tor. Encryption. Bitcoin. No names. No addresses. No paper trail. The FBI took it down in a San Francisco library in 2013. Ulbricht was sitting at his laptop.

The cryptography was fine. The architecture around the cryptography was not.

The NSA did not hack Bitcoin. They mapped the humans. Wallets fell because the people holding the keys made mistakes, trusted intermediaries, or got subpoenaed. This is the template. It repeats.

Right now, law firms are uploading privileged client data to cloud AI platforms built on the infrastructure of major frontier model providers. Deposition strategies. Settlement positions. Attorney work product. Into systems resting on the same architectural pattern that has failed every time it has ever truly mattered.

"Legal AI is not a new category of risk. It is an old category of risk wearing a new interface."

— Architectural risk analysis, CloseVector

The Breach Timeline No Vendor Will Send You

These are not edge cases. They are the pattern. Government compulsion. Nation-state intrusion. Insider theft. Ransomware. Many of the events on this timeline involve infrastructure that legal AI products run on today.

2007
Government

Microsoft — First Partner Recruited into PRISM

Microsoft becomes the first company recruited into the NSA's PRISM surveillance program. Reporting by The Guardian and The Washington Post documents that Microsoft collaborated with intelligence services and that the Skype collection capability was later expanded dramatically. Whether fully voluntary or legally directed, Microsoft was first and went furthest.

2011
Government

Skype Acquisition — Surveillance Triples

Nine months after acquiring Skype, Microsoft helps triple the volume of Skype video calls collected through PRISM. Rapid integration for surveillance. Not an accident.

2013
Takedown

Silk Road Seized — The FBI Sat Next to Ulbricht

The most technically sophisticated anonymous marketplace in history falls. Not through a cryptographic flaw. Agents distracted Ulbricht in a San Francisco library and seized his open laptop before he could lock it and let the session encrypt. The perimeter was irrelevant.

2018
Legal Mandate

CLOUD Act — No Jurisdiction Is Safe

Congress passes the Clarifying Lawful Overseas Use of Data Act. U.S. companies must produce data regardless of where it is stored. A Microsoft executive later confirms on the record: they cannot guarantee data in Europe is safe from U.S. authorities. Not because of a hack. Because the law requires it.

2020
Ransomware

Grubman Shire Meiselas & Sacks — Lady Gaga's Contracts on the Dark Web

One of the most powerful entertainment law firms in the world. Celebrity contracts, settlement terms, confidential strategy — auctioned publicly because they refused to pay. Law firms are the most targeted professional services sector for ransomware. The data is valuable, the systems are old, and the partners are busy.

2023
Breach

OpenAI Internal Breach — Undisclosed for Nearly a Year

Attackers access OpenAI's internal messaging systems and extract details about AI research and architecture. OpenAI does not disclose the breach publicly for nearly a year. Most legal AI products run on OpenAI's infrastructure. Their clients were not informed.

2023
Nation-State

Storm-0558 — Chinese Actors in U.S. Government Email on Microsoft Infrastructure

Chinese state-sponsored hackers compromised senior U.S. government email accounts hosted on Microsoft infrastructure, persisting for at least six weeks before detection. The U.S. Cyber Safety Review Board subsequently called Microsoft's security culture "inadequate" and found the company had "drifted away" from its Trustworthy Computing ethos.

2025
Precedent

NYT v. OpenAI — Deletion Promises Evaporate Overnight

The court orders OpenAI to halt its standard 30-day data deletion process. One preservation order. The vendor's deletion policy — the one in the contract the law firm signed — is legally void. The court did not ask for permission. Critically: the only customers whose data remained exempt were those with direct enterprise Zero Data Retention (ZDR) agreements. Third-party legal AI wrappers did not qualify.

Jan 2026
Insider Theft

Ex-Google Engineer Convicted — AI Secrets to China

Linwei Ding convicted on 14 federal counts for stealing over 2,000 pages of Google AI trade secrets — TPU architecture, GPU systems, SmartNIC designs — and uploading them to his personal cloud account. The first AI-related economic espionage conviction in U.S. history. He used Apple Notes to bypass detection. Google chose openness over security, his defense argued. The jury disagreed with the framing. The theft was still real.

Feb 2026
Insider Theft

Three Google Engineers Indicted — Trade Secrets to Iran

Three Silicon Valley engineers indicted for stealing hundreds of files — processor security, cryptography architecture, trade secrets — from Google and related technology companies and transferring them to Iran. They photographed screens to evade detection. Signed false affidavits. Destroyed evidence. Google's own internal security systems caught them — but not before the data was gone. The people with keys are always the vulnerability.

Feb 2026
National Security

Secretary of Defense Flags Major AI Vendor as Supply Chain Risk

The U.S. Department of Defense designates a leading AI vendor as a national security supply chain risk. This is the infrastructure that legal AI products are built on. The Pentagon won't trust it with defense data. You are trusting it with privileged client communications.

The Threats Don't Come From One Direction.

Most firms that have assessed cloud AI risk have evaluated one threat at a time. The actual risk environment is three simultaneous vectors, each enabled by the same architectural decision: data leaving the building.

From Above: Government Compulsion

The CLOUD Act requires U.S. providers to produce data on demand, regardless of jurisdiction. No contract clause overrides federal law. Zero Data Retention (ZDR) does not stop a subpoena. Your vendor's privacy policy is not a shield against a warrant.

From Outside: Nation-State Actors

Storm-0558 persisted on Microsoft infrastructure for at least six weeks before detection. State-funded operators targeting hardened commercial systems. Patient, stealthy, and already inside some upstream environments.

From Below: Ransomware & Insiders

Law firms are the most targeted professional services sector. Grubman Shire. Campbell County Health. One phishing email. One misconfigured permission. One disgruntled employee photographing screens the night before they leave. The data doesn't need to be exfiltrated at scale. It needs to leave once.

Confidential Compute Is a Padlock on a Glass Door.

When confronted with the risks above, vendors offer confidential compute as the answer. Trusted Execution Environments. Hardware-level encryption. "Your data is encrypted even from us."

Here is what they are not telling you.

Confidential compute does not eliminate trust dependencies. It redistributes them. When you run your data in a cloud TEE, you are now trusting:

01 The Silicon Manufacturer (Critical)
The chip itself must be free of backdoors. Supply chain compromises at the hardware level have been documented. You cannot audit what you did not manufacture.

02 The Firmware Layer (Critical)
Firmware vulnerabilities bypass operating system protections entirely. Spectre, Meltdown, and their descendants showed that hardware-level assumptions about isolation are fragile.

03 The Attestation Infrastructure (High)
TEE attestation relies on a certificate chain. That chain terminates at the cloud provider. You are trusting them to accurately represent the state of the enclave to you.

04 The Cloud Provider's Key Management (Critical)
Even with TEEs, key management operations often touch provider infrastructure. CLOUD Act compliance means those keys can be compelled. You replaced your key with theirs and called it security.

05 The Hypervisor (High)
TEEs are designed to resist hypervisor attacks, but misconfigured hypervisors have leaked enclave data. The isolation is a design goal, not a physical guarantee.

06 The Federal Government (Critical)
None of the above matters if the CLOUD Act applies. A lawful order compels the provider to assist in accessing your data regardless of what encryption layer is in place. The law supersedes the technology.
The Verdict on Confidential Compute

You replaced one trust relationship with six. You called it security. Your vendor called it "enterprise grade." The federal government calls it "compliant."

Even On-Premises Infrastructure Has Failure Modes.
This Is Why You Need a Professional Deployment.

Local compute eliminates the cloud's trust chain. But an unmanaged local deployment introduces its own risk surface. The goal is not "local." The goal is properly deployed, encrypted, and isolated local.

The Disgruntled Employee

An associate with file system access copies privileged case data to a personal device on their last day. Without per-matter encryption with access-controlled keys, there is no cryptographic barrier between the employee and the corpus. Local storage without LUKS-level per-matter encryption is a filing cabinet without a lock.

The Unencrypted Drive

A workstation is stolen or decommissioned without proper data sanitization. Without full-disk encryption and key destruction protocols, the data is readable on any machine. Physical access is logical access without encryption.

The Shared Network Mount

A local model deployment with open SMB shares means every person on the network can access every matter's document corpus. Network segmentation and per-matter access controls are not default configurations. They must be engineered deliberately.
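For illustration only, per-matter isolation on a Samba file server looks roughly like this. The share name, path, and group are hypothetical, and a real deployment would layer this on filesystem ACLs and network segmentation rather than rely on share definitions alone:

```ini
# Hypothetical smb.conf fragment: one share per matter, visible only to
# the matter team. This is not a default configuration; it must be
# engineered deliberately for every matter.
[matter-2024-001]
   path = /srv/matters/2024-001
   valid users = @matter-2024-001-team
   browseable = no
   read only = no
```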

The Update Channel

Without hardware data diodes controlling what enters and exits the deployment environment, the update mechanism becomes an attack surface. A compromised update delivered over a standard network connection can exfiltrate data invisibly. Physics, not policy, must govern what crosses the boundary.

"Local compute is necessary. It is not sufficient. The gap between 'we bought the hardware' and 'we have a defensible deployment' is where firms get exposed."

— CloseVector deployment architecture principles

The Risk Profile Is Already Established.

When the first case surfaces where privileged strategy leaked through a cloud AI platform, it will not be a surprise to anyone who has read this far.

The firm will not be able to say the risk was unforeseeable. The cases were decided. The architecture was documented. The breach history was public. The legal obligations were established.

That is the definition of a knowable risk. And knowable risk that materializes into client harm creates serious malpractice exposure.

The firms getting quietly furious are not the laggards. They are the ones who read Heppner, looked at their cloud AI contracts, and understood the gap. They are not angry at the technology. They are angry that they were sold a solution to a problem that the solution itself creates.

"The risk profile is established. The only remaining question is which firm, which client, which case."

There Is One Architecture With No Upstream Trust Dependency.

The hardware is in your building. You hold the only key. The data never leaves. Government compulsion applies to your vendor, not to hardware you own. Nation-state actors target networks you are not on. Insider threats are contained by per-matter encryption with keys you control and destroy.

This is not theoretical. It is an engineering specification. And it requires professional deployment to be defensible.

🔐

Per-Matter LUKS/AES-256 Encryption

Each matter receives its own cryptographic container with a unique key. When the retention period ends, the key is destroyed. The data becomes mathematically inaccessible — not deleted, not archived, irrecoverable. This is the only technical architecture that satisfies "destroyed" in a legally meaningful way.
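The key-destruction mechanics can be sketched with openssl standing in for a LUKS container. This is an illustrative sketch with hypothetical filenames; a production deployment uses cryptsetup-managed block-level containers, not loose encrypted files:

```shell
# Sketch: one unique key per matter; destroying the key destroys the data.
set -e
echo "privileged work product" > corpus.txt
openssl rand -hex 32 > matter-001.key          # unique 256-bit key for this matter
openssl enc -aes-256-cbc -pbkdf2 \
  -pass file:matter-001.key -in corpus.txt -out matter-001.enc
rm -f matter-001.key corpus.txt                # retention ends: destroy the key
openssl rand -hex 32 > other.key               # any other key fails to decrypt
if openssl enc -d -aes-256-cbc -pbkdf2 \
  -pass file:other.key -in matter-001.enc -out /dev/null 2>/dev/null; then
  echo "recovered"
else
  echo "irrecoverable"                         # ciphertext remains, content does not
fi
```

Note the asymmetry the sketch demonstrates: the ciphertext file still exists and can be produced in discovery, but without the destroyed key its contents are mathematically inaccessible.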

🛡

Hardware Data Diodes

Physics, not policy, governs what crosses the network boundary. A hardware data diode allows information to flow in one direction only — enforced at the electrical level. No firmware vulnerability, no misconfigured firewall, no social engineering attack can create a reverse channel that does not physically exist.

Verification Before Production

Every output is anchored to a source document before it reaches the attorney. No citation without a document ID. No summary without a verifiable reference. The audit trail is cryptographic. When opposing counsel challenges the methodology, the response is a hash, not a vendor's assurance.
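A minimal sketch of that anchoring, using an invented record schema (the field names and functions here are assumptions for illustration, not CloseVector's actual API):

```python
import hashlib


def anchor(summary: str, doc_id: str, source_text: str) -> dict:
    """Bind an AI output to its source document with a content hash."""
    return {
        "doc_id": doc_id,
        "sha256": hashlib.sha256(source_text.encode("utf-8")).hexdigest(),
        "summary": summary,
    }


def verify(record: dict, preserved_text: str) -> bool:
    """Recompute the hash from the preserved source; any alteration breaks it."""
    digest = hashlib.sha256(preserved_text.encode("utf-8")).hexdigest()
    return record["sha256"] == digest


rec = anchor("Clause 4 caps liability.", "DOC-0042", "the source document text")
print(verify(rec, "the source document text"))   # True: output is anchored
print(verify(rec, "a silently altered text"))    # False: the challenge is answered with a hash
```

When methodology is challenged, the defensible response is the recomputed digest against the preserved source, not a vendor attestation.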

Forward-Deployed Professional Architecture

The gap between "we bought the hardware" and "we have a defensible deployment" requires expertise. Network segmentation, access controls, update channel isolation, key management protocols, chain of custody documentation — these are not default configurations. They are engineered deliberately by someone who has done it before under adversarial conditions.

The Architectural Verdict

Not a policy. Not a contract. Not a TEE attestation. Not a confidential compute enclave running on someone else's silicon.

A key. Yours. In your building.

Everything else is a story you're telling yourself while someone upstream reads your mail.

That's what CloseVector is for.

The Risk Is Documented.
The Duty Is Established.
The Next Step Is Yours.

CloseVector builds air-gapped legal AI infrastructure for deployment inside your firm's walls. No cloud. No upstream trust dependency. No vendor with a better lawyer than you.

We excavate. We don't fabricate.