The legal profession’s cautious, centuries‑old rhythms have been ruptured by a new force: artificial intelligence. According to a recent industry overview, routine tasks that once consumed vast swathes of associates’ and paralegals’ time — drafting, document review, e‑discovery triage and research — are now routinely delegated to AI models that can cluster documents, surface issues and produce first drafts in minutes rather than hours. Firms that move fastest report measurable efficiency gains, but those claims are almost always accompanied by an insistence on human oversight and governance. (National Law Review; firm innovation pages.)

A cluster of firms has emerged as exemplars of how to deploy AI at scale. In May 2025 Adams & Reese announced it had adopted Everlaw as its exclusive e‑discovery platform, saying the cloud‑native, AI‑enabled tools have accelerated document review, improved data management and reduced case turnaround times. The firm and Everlaw executives described faster analytics, automated timeline building and summarisation capabilities — concrete efficiencies that, the announcement said, are already lowering costs and improving client responsiveness. (National Law Review; Adams & Reese announcement.)

Some legacy global firms have pursued vendor partnerships as their primary route to capability. Allen & Overy was an early and high‑profile adopter of Harvey, rolling the system out through its Markets Innovation Group and trialling it with thousands of lawyers; the firm says the tool has been used for multilingual drafting, contract analysis and research while every AI output is audited by humans. DLA Piper likewise positioned a 2023 rollout of Casetext’s CoCounsel as essential to staying competitive, describing AI adoption in blunt terms: that the market is an “arms race” and no firm wants to be left behind. (National Law Review; Allen & Overy announcement; DLA Piper announcement.)

Other firms have blended vendor technology with bespoke internal systems. Baker McKenzie describes a period of piloting large language models to generate drafts and assist with legal research while emphasising client‑by‑client sandboxing to meet bespoke data‑security needs. Dentons has taken the step of building a client‑secure GPT‑4 environment — fleetAI — developed with third‑party partners to ensure uploaded client data would not be used to train external models and that retention rules and deletion policies are enforced; public reporting around the launch stressed staff training, governance and the need to verify AI output. (National Law Review; firm statements and press coverage.)

Some firms have chosen fully proprietary routes. Cooley publicises a suite of in‑house platforms — including Vanilla, a secure cloud system aimed at private funds, and Cooley GO with its embedded Cooley GObot chatbot — positioning these tools as client‑facing, quality‑led innovations governed by principles of ethics and transparency. Cooley’s materials describe the platforms as designed to streamline investor onboarding, compliance and practical guidance for startups, and the firm has published a manifesto setting out those commitments. (National Law Review; Cooley innovation materials.)

Specialist and practice‑led experiments are also common. Wilson Sonsini added an AI‑enabled, fixed‑fee commercial contracting offering to its Neuron platform after in‑house testing that the firm said achieved roughly 92% accuracy in identifying issues and applying its playbook positions; the model is presented as a lawyer‑in‑the‑loop tool to speed contract lifecycle tasks. Employment boutique Fisher & Phillips helped design and now uses Casetext’s CoCounsel to compress legal research tasks from hours into minutes. Elsewhere, firms such as Gunderson Dettmer, Macfarlanes, Holland & Knight, Cuatrecasas, Orrick and the Big Four‑aligned KPMG Law have each taken different paths — from bespoke chat apps and amplified internal workflows to ABS‑enabled, hybrid operating models that combine non‑traditional ownership with aggressive technology deployment. (National Law Review; Wilson Sonsini announcement; firm statements.)

That diversity of approach underlines the twin promises and pitfalls of legal AI. Vendors and early adopters trumpet time and cost savings and increased throughput; press releases and corporate pages point to measurable productivity improvements and new fixed‑fee offerings that can accelerate clients’ time to revenue. But these same announcements repeatedly attach caveats: models are sandboxed for particular clients, outputs are subject to human verification, data‑use and retention policies are prescribed, and training and governance are elevated as equally important investments. As DLA Piper’s chair of AI practice, Daniel Tobey, put it in the firm’s 2023 statement about CoCounsel: “This is an arms race, and you don’t want to be the last law firm with these tools. It’s very easy to become a dinosaur these days.” (Firm announcements; Wilson Sonsini testing; DLA Piper statement.)

The regulatory and ethical questions are now the most consequential. Firms and platforms are experimenting with notice obligations to clients, deletion and retention windows, and controls to prevent inadvertent model training on confidential information — measures that aim to reconcile commercial advantage with professional duties. Reports about secure, client‑only environments and enforced deletion windows have become a centrepiece of vendor‑and‑firm narratives because they respond directly to lawyer concerns about privilege, confidentiality and model leakage. At the same time, independent testing and transparent governance frameworks are still uneven: accuracy claims vary by task and by how tightly a model has been aligned to a firm’s playbook. (Firm statements and coverage; industry materials.)

For now the prevailing model is augmentation: AI as a tool that amplifies lawyers’ capacity rather than replaces professional judgement. But the pace of change — multiple vendor partnerships, bespoke platforms, fixed‑fee automation and Big Four entrants built on hybrid ownership structures — suggests an inflection point is approaching. If the next phase is shaped by reproducible, regulated, client‑facing automation, the profession will have to reconcile new business models, supervision rules and training requirements with longstanding duties to clients and courts. Firms that transparently marry technical controls with clear governance, and that treat human review as non‑negotiable, are most likely to shape how that future unfolds. (National Law Review; Cooley materials.)

Source: Noah Wire Services