The ethics of AI and GenAI in eDiscovery (April 2026)
Updated 21 April 2026 | ABA citations verified | Not legal advice
ABA Formal Opinion 512 (July 2024) confirmed what state bars had been saying for 18 months: the existing Model Rules apply to GenAI, and AI does not create new duties so much as it heightens the existing ones. In eDiscovery, the relevant duties are competence, confidentiality, supervision, and candor. This page applies each duty to the eDiscovery context.
ABA Formal Opinion 512 -- 29 July 2024
The ABA's definitive GenAI ethics statement
ABA Formal Opinion 512, issued by the Standing Committee on Ethics and Professional Responsibility on 29 July 2024, is the primary national ethical guidance on GenAI in legal practice. It applies Model Rules 1.1 (competence), 1.4 (communication), 1.6 (confidentiality), 3.3 (candor), and 5.3 (supervision of non-lawyer assistance) to GenAI tools. It does not create new duties; it applies existing duties to a new technology.
Rule 1.1 -- Competence
ABA Model Rule 1.1 requires that an attorney 'provide competent representation to a client' including the 'legal knowledge, skill, thoroughness and preparation reasonably necessary for the representation.' Comment 8 adds that a lawyer must keep abreast of 'changes in the law and its practice, including the benefits and risks associated with relevant technology.'
Applied to eDiscovery: the supervising attorney must understand how the AI review tool works at a level sufficient to evaluate its outputs. This does not require technical expertise in LLM architecture, but it does require understanding: what issue prompt was used, what the validation sampling showed, what the recall estimate is, and what the failure modes of the AI are for the document types in this matter. An attorney who signs a production certification under FRCP 26(g) without reviewing the AI validation results has not met the Rule 1.1 competence standard under Opinion 512.
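The recall arithmetic an attorney must be able to read is not complicated. A minimal sketch, assuming a simple random validation sample with illustrative counts (not drawn from any real matter); the Wilson interval is one common way to bound a binomial proportion:

```python
import math

def wilson_interval(successes: int, n: int, z: float = 1.96) -> tuple[float, float]:
    """Approximate 95% Wilson score interval for a binomial proportion."""
    if n == 0:
        return (0.0, 1.0)
    p = successes / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return (max(0.0, centre - half), min(1.0, centre + half))

# Illustrative numbers: a human reviewer codes a random sample of 400
# documents and finds 80 relevant; the AI had marked 72 of those 80
# responsive. Recall is the share of relevant documents the AI caught.
relevant_in_sample = 80
caught_by_ai = 72

recall = caught_by_ai / relevant_in_sample
low, high = wilson_interval(caught_by_ai, relevant_in_sample)
print(f"recall point estimate: {recall:.1%}")   # 90.0%
print(f"95% interval: {low:.1%} to {high:.1%}")
```

The point to take from the interval, not just the point estimate: with only 80 relevant documents in the sample, a "90% recall" claim is really "somewhere between roughly 81% and 95%", which is what a certifying attorney should understand before signing under FRCP 26(g).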
Rule 1.6 -- Confidentiality
Rule 1.6 requires attorneys to 'make reasonable efforts to prevent the inadvertent or unauthorized disclosure of, or unauthorized access to, information relating to the representation of a client.' Sending client documents to a GenAI vendor for processing is a disclosure of client confidential information.
ABA 512 confirms that Rule 1.6 requires attorneys to conduct reasonable due diligence on GenAI vendors before sending client information. The required due diligence includes: verifying zero-retention terms (the vendor does not retain prompts or document content beyond the immediate transaction), confirming tenant isolation (client data does not mix with other clients' data), reviewing the vendor's data processing agreement (DPA), and confirming that the vendor does not use client data to train its foundation model.
For GDPR implications: if the eDiscovery involves EU personal data, the vendor must also execute a GDPR Article 28 processor agreement and must be able to demonstrate compliance with the EU-US Data Privacy Framework or an equivalent transfer mechanism. Processing EU personal data through an AI vendor without a valid Article 28 agreement is a GDPR violation regardless of any other contractual arrangement.
Rule 3.3 -- Candor
Rule 3.3 prohibits making false statements of law or fact to a tribunal and requires disclosure of controlling adverse authority. In the GenAI eDiscovery context, Rule 3.3 applies to: AI-generated case citations (hallucinated citations must be verified before being filed -- not a new standard, but one that AI hallucination makes more relevant); AI-assisted summaries of documents submitted to the court or opposing party; and certifications about the review methodology where AI was used.
DC Bar Opinion 388 (April 2024) specifically addressed AI-generated work product and the duty of candor, recommending that attorneys review all AI outputs before submission to a court or opposing party. The Mata v. Avianca (S.D.N.Y. 2023) sanctions case, where an attorney submitted AI-hallucinated case citations, is the cautionary precedent that most ethics opinions cite.
Rule 5.3 -- Supervision of non-lawyer assistance
Rule 5.3 requires attorneys to make reasonable efforts to ensure that non-lawyer assistants' conduct is compatible with the attorney's professional obligations. ABA 512 confirms that AI tools are non-lawyer assistance for Rule 5.3 purposes. This means the supervising attorney is responsible for the AI's outputs in the same way they are responsible for a paralegal's work. The attorney cannot disclaim responsibility for an incorrect AI relevance determination on the grounds that the AI made the decision; the attorney supervising the AI-assisted review is responsible for validating the result.
State bar guidance survey (April 2026)
California
Practical Guidance for the Use of Generative Artificial Intelligence (November 2023)
Recommends disclosure for GenAI involving client confidential information. Applies competence, confidentiality, and supervision rules. Notes that confidential information shared with GenAI tools must be protected by contract. Most detailed state guidance as of April 2026.
Florida
Bar Opinion 24-1 (January 2024)
Requires disclosure and informed consent before using GenAI tools to process client confidential information. Goes further than most states on the consent requirement. Attorneys must also ensure the client understands any associated costs.
DC
Bar Ethics Opinion 388 (April 2024)
Confirms application of all Model Rules to GenAI. Specifically addresses AI-generated work product and the duty of candor. Recommends attorney review of all AI outputs before submission to a court or opposing party.
New York
State Bar AI Task Force Report (April 2024)
Comprehensive task force report covering competence, confidentiality, supervision, and billing implications. Does not mandate disclosure but recommends it where GenAI substantially contributes to work product. Addresses AI hallucination and the candor duty specifically.
State bar guidance is evolving rapidly. Verify with the issuing bar before relying on this summary. Last verified April 2026.
Vendor due diligence checklist
- Zero-retention clause: contractually prohibits vendor from retaining prompts, documents, or outputs beyond the API transaction.
- Tenant isolation: client data processed in a logically or physically separate environment.
- No-training commitment: vendor does not use client data to train or fine-tune its models.
- SOC 2 Type II certification (current, within 12 months).
- GDPR Article 28 Data Processing Agreement (required for EU personal data).
- HIPAA Business Associate Agreement (required for protected health information).
- Subprocessor list: who does the vendor use to provide the AI service?
- Incident notification: how quickly and in what form does the vendor notify of data security incidents?
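For teams tracking diligence across multiple vendors, the checklist above maps naturally onto a structured record. A hedged sketch, assuming hypothetical field names that mirror the list (not any real platform's schema); the conditional items apply only when the matter involves EU personal data or PHI:

```python
from dataclasses import dataclass

# Hypothetical diligence record; field names mirror the checklist above.
@dataclass
class VendorDiligence:
    vendor: str
    zero_retention_clause: bool = False
    tenant_isolation: bool = False
    no_training_commitment: bool = False
    soc2_type2_current: bool = False
    gdpr_article28_dpa: bool = False      # needed only if EU personal data
    hipaa_baa: bool = False               # needed only if PHI
    subprocessor_list_reviewed: bool = False
    incident_notification_terms: bool = False

def open_items(d: VendorDiligence, eu_data: bool, phi: bool) -> list[str]:
    """Return checklist items still unsatisfied for this matter."""
    required = {
        "zero_retention_clause": True,
        "tenant_isolation": True,
        "no_training_commitment": True,
        "soc2_type2_current": True,
        "gdpr_article28_dpa": eu_data,
        "hipaa_baa": phi,
        "subprocessor_list_reviewed": True,
        "incident_notification_terms": True,
    }
    return [name for name, needed in required.items()
            if needed and not getattr(d, name)]

# Usage: a vendor with only the contractual basics in place, on a
# matter involving EU personal data but no PHI.
v = VendorDiligence("ExampleAI", zero_retention_clause=True,
                    tenant_isolation=True, no_training_commitment=True)
print(open_items(v, eu_data=True, phi=False))
```

The design point is that the conditional items (Article 28 DPA, HIPAA BAA) are driven by the matter, not the vendor, so the same record can be re-evaluated per engagement.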
Frequently asked questions
What is ABA Formal Opinion 512?
ABA Formal Opinion 512, issued 29 July 2024, addresses lawyers' ethical duties when using generative AI tools. Key holdings: Rule 1.1 competence requires understanding AI capabilities; Rule 1.6 confidentiality requires vendor zero-retention and tenant isolation; Rule 3.3 candor applies to AI-generated work product; Rule 5.3 supervision extends to AI tools.
Do lawyers need to disclose AI use to clients?
Under ABA 512 and most state bar guidance, disclosure is not automatically required but is often prudent. Florida Bar Opinion 24-1 requires disclosure and informed consent for GenAI involving confidential information. California recommends disclosure. Most states recommend rather than mandate disclosure. The practice is evolving rapidly; check your state bar's current guidance.
What is zero-retention in a vendor AI contract?
Zero-retention is a contractual commitment by the AI vendor not to retain prompts, document content, or AI outputs beyond the immediate API transaction. Essential under Rule 1.6 and most state bar guidance. Ask for the specific contractual language, not just the privacy policy summary.
Does using AI in discovery require disclosure to opposing counsel?
Not as a general rule under current law, but disclosure is strongly recommended at the Rule 26(f) conference and is required by emerging MDL-level case management orders (see In re Broiler Chicken Antitrust, N.D. Ill. 2024). Disclosing the methodology upfront reduces challenge risk and is consistent with Sedona Cooperation Proclamation principles.
What are the privilege implications of joint-defence AI use?
When parties to a joint-defence agreement share a GenAI eDiscovery platform, documents may cross privilege boundaries if the platform is not configured to maintain separate tenant environments per party. Before sharing any GenAI platform across joint-defence parties, confirm that the platform provides per-party tenant isolation and that the joint-defence agreement specifically addresses AI-assisted document processing.