Do courts require lawyers to disclose AI use in filings? 2025 list of jurisdictions and sample certification language
In 2025, lots of courts want one extra line in your brief: tell them if you used generative AI. Simple idea, big ripple effects.
The patchwork of local rules and judge orders is growing fast. It’s not academic anymore—it’s about Rule 11, accurate citations, and keeping client secrets out of public tools. If you know what your venue expects, you’ll avoid headaches (and avoid fixing filings at 11:58 p.m.).
So, do courts require lawyers to disclose AI use in filings? Many do. This guide walks through where these rules show up, what usually triggers disclosure, and the common elements courts ask for—a short statement of AI use, confirmation of human verification, and guidance on where to place it.
We’ve also got sample certification language you can copy, a quick workflow to make this routine, notes on privilege and records, and a way to track judge-specific orders. If you want it buttoned up, LegalSoul can nudge you with the right wording for your court and help you verify citations before you file.
Quick takeaways
- 2025 trend: more local rules and standing orders ask for either a short AI-use disclosure or a statement that a human checked every citation and quote. Treat it like a Rule 11 issue and check your venue each time.
- What courts usually want: say if generative AI helped with drafting or research, confirm human verification of cites/quotes, and promise you didn’t paste confidential or sealed info into public models. Some venues want you to retain prompts/outputs.
- How to keep it smooth: add an “AI-use check” to your pre-filing list, do a human cite/quote review, log AI steps in a privileged spot, scrub metadata, and pick signature block vs. separate certificate based on local practice.
- How LegalSoul helps: court-aware prompts, one-click certificates with the right language, citation checks, and privacy-friendly setups that support AI disclosure requirements in 2025 legal filings.
Executive summary: Do courts require lawyers to disclose AI use in filings in 2025?
Short answer: yes, in many places—and the number keeps climbing. After the 2023 sanctions in Mata v. Avianca (S.D.N.Y.), judges across the country started posting standing orders. One early example: Judge Brantley Starr (N.D. Tex.) requires a certification stating either that no generative AI was used or that a human verified the accuracy of any AI-generated content, including citations.
By now, the theme is consistent. Courts aren’t banning tools; they want accountability and accuracy. Think candor, reliability, and confidentiality. If your firm uses AI responsibly, staying compliant doesn’t have to be heavy. One small detail with outsized impact: where you put the certificate. A single sentence in the signature block can push you over a word cap. Keep two versions handy—signature block and standalone—so you can switch without a scramble.
What counts as “AI use” for disclosure purposes
Courts focus on generative tools that draft, summarize, or suggest authorities. If a model wrote a paragraph, summarized a transcript you relied on, or tossed in cases you used, disclosure or a certification may be in play.
Tools that don’t create substantive text—basic templates, spellcheck, or research alerts—usually don’t trigger disclosure. Still, always read your judge’s standing order.
Big reason courts care: confidentiality. Public LLMs can retain prompts or use them to train, so many venues want you to promise you didn’t paste in sealed or sensitive client data. Private or firm-hosted models lower that risk but don’t remove your duties of competence and supervision.
Easy rule of thumb: if AI influenced the content or analysis you filed, be ready to disclose its role or certify that a human verified citations and quotations. Write down what you did so you can explain it later.
Who must disclose and when
The attorney of record owns this, even if associates, vendors, or experts touched the draft. Some judges apply similar expectations to pro se litigants. Briefs and motions are usually covered; a few judges include declarations or exhibits if you used AI-made summaries.
Timing and placement change by court. Some want a one-liner in the signature block. Others prefer a short certificate filed with the brief. That choice—signature block vs. separate certificate—can affect word counts and formatting rules.
Practical move: add “AI-use check” to your pre-filing list. Ask co-counsel and vendors if generative models were involved anywhere in the workflow. If AI only helped triage research (and not drafting), use a narrower certification that focuses on human verification of citations and quotes.
2025 jurisdiction-by-jurisdiction guide to AI disclosure and certification rules
Federal appellate and district courts: Most action is in judge-specific standing orders. You’ll see language like Judge Starr’s (N.D. Tex.) requiring a simple certification and human verification. In S.D.N.Y., after Avianca, several judges include similar expectations in their individual practices. Some districts also post practice reminders about checking quotes and avoiding fabricated citations.
Bankruptcy and specialized courts: A handful of bankruptcy judges have posted orders that echo the federal trend: be accurate, and disclose if generative AI touched the filing. Always confirm at the judge level.
State courts: More state systems are publishing guidance. Themes repeat: confidentiality, human checks of quotes and citations, and short disclosures when AI drafted parts of a filing or supported cite-checking.
International snapshot: Canada’s Federal Court issued a notice on AI in submissions, and the U.K. judiciary gave judges guidance on risks and proper use. Net-net: watch your specific venue and, if unsure, disclose briefly and certify verification.
Common elements courts require in AI-related certifications
Court orders tend to ask for the same few items:
- Say if generative AI helped with drafting or research—clear yes/no.
- Confirm a human verified every citation and quotation against the source.
- Assure the court you didn’t paste confidential or sealed info into public models.
- Accept responsibility for the filing regardless of tools used.
- Be ready, in some venues, to retain prompts/outputs for possible review.
Small drafting choices matter. Some judges don’t want tool names; others don’t mind. If space is tight, tuck a short line in the signature block. If the court prefers formality, file a brief “AI-Use Certificate.”
One more edge case: translation. If AI helped translate anything, add a note that a fluent reviewer checked it against the original. That shows you applied the same care to AI-adjacent tasks as you did to the brief itself.
Sample certification language you can adapt (model clauses)
Option A — No generative AI used
Counsel certifies that no portion of this filing was drafted by a generative AI system and that all citations and quotations were prepared and verified by a human attorney.
Option B — AI-assisted drafting/research, fully verified
Counsel certifies that generative AI tools were used to assist with drafting and/or research, but a human attorney reviewed the entire filing, verified the accuracy and context of all citations and quotations against the cited sources, and accepts responsibility for its contents. No confidential or sealed information was entered into public AI systems.
Option C — Research-only AI assistance
Counsel certifies that generative AI was used solely to aid research triage and that all text, arguments, citations, and quotations in this filing were drafted and verified by a human attorney.
Option D — Citation accuracy statement (when disclosure isn’t required but prudence suggests a record)
Counsel certifies that all case citations, quotations, and references have been checked against the cited sources by a human attorney and are accurate and complete.
Use whichever option fits your venue and matter. Keep both signature-block and standalone versions handy so you can switch formats fast.
Practical workflow to comply without slowing filings
- Intake: At matter opening, note every venue’s AI-related rule and pull the presiding judge’s standing orders. Save them in your docketing tool.
- Drafting: If you use AI, label the version and store prompts/outputs in a privileged location with tight retention.
- Verification: Do a human cite/quote check. Record a supervising attorney sign-off.
- Certificate: Pick signature block vs. separate certificate based on page limits and local norms. Keep templates in your brief shell.
- E-filing: If filed separately, attach the certificate and follow ECF naming rules.
- Post-filing: Retain logs only as long as needed under firm policy and client direction.
Tip that helps later: reflect the verification work in your billing notes—“Human check of all authorities.” Clients and carriers like seeing that diligence, and it gives you clean proof if anyone questions your process.
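The intake-to-certificate steps above can be sketched as a tiny checklist object. This is a minimal illustration in Python: every field name is an assumption, not part of any real docketing or ECF system, and the clause letters map to the sample Options A–C above.

```python
from dataclasses import dataclass

# Hypothetical pre-filing AI-use checklist; all field names are
# illustrative, not tied to any real docketing or ECF system.
@dataclass
class AiUseCheck:
    venue_rules_reviewed: bool = False
    ai_used_in_drafting: bool = False
    ai_used_in_research: bool = False
    citations_human_verified: bool = False
    confidential_data_kept_out: bool = False

    def certificate_option(self) -> str:
        """Map the checklist to one of the sample clauses (A-C)."""
        if not (self.ai_used_in_drafting or self.ai_used_in_research):
            return "A"  # no generative AI used
        if self.ai_used_in_drafting:
            return "B"  # AI-assisted drafting, fully verified
        return "C"      # research-only assistance

    def ready_to_file(self) -> bool:
        # Minimum gates before signing: venue check, human cite
        # verification, and confidentiality confirmed.
        return (self.venue_rules_reviewed
                and self.citations_human_verified
                and self.confidential_data_kept_out)

check = AiUseCheck(venue_rules_reviewed=True,
                   ai_used_in_research=True,
                   citations_human_verified=True,
                   confidential_data_kept_out=True)
print(check.certificate_option(), check.ready_to_file())  # C True
```

However you record it, the point is the same: make the certificate choice a deterministic output of questions you already answered, not a last-minute judgment call.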
Data security and confidentiality when using AI in litigation
Many courts want you to promise you didn’t put confidential or sealed material into public LLMs. That means smart tech choices and careful habits.
Prefer private deployments and zero-retention modes. For vendor tools, check SOC 2 Type II, encryption in transit and at rest, data residency, subcontractor terms, and DPAs with audit rights. Turn off training on your data. Make sure prompts and outputs aren’t feeding a public model.
- Mask client names or unique identifiers before sending text to any model.
- Paste only what’s necessary; keep snippets small.
- Use AI for outlines, then write from your sources (“clean-room” approach).
- Keep sealed material out of AI workflows entirely.
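The masking habit above can be automated at the edge of your workflow. Here is a minimal sketch in Python: the patterns and alias scheme are illustrative only, and no regex pass is a substitute for a human confidentiality review before anything leaves the firm's environment.

```python
import re

def mask_identifiers(text: str, names: list[str]) -> tuple[str, dict]:
    """Replace known client names with neutral aliases before text is
    sent to any external model; return the alias mapping so a human
    can restore context later. Illustrative only, not exhaustive."""
    mapping = {}
    masked = text
    for i, name in enumerate(names, start=1):
        alias = f"PARTY_{i}"
        mapping[alias] = name
        masked = re.sub(re.escape(name), alias, masked)
    return masked, mapping

masked, key = mask_identifiers(
    "Acme Corp sued Jane Doe over the 2024 supply contract.",
    ["Acme Corp", "Jane Doe"],
)
print(masked)  # PARTY_1 sued PARTY_2 over the 2024 supply contract.
```

Keep the mapping in a privileged location alongside your prompt log; the masked text is what leaves the building, the key never does.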
Watch for metadata leaks—track changes, comments, or AI artifacts in your documents. Scrub before filing. Also set a clear incident plan for AI misuse or exposure so you can move quickly if something slips.
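One crude way to catch a common class of metadata leak: a .docx file is a zip of XML parts, and Word records tracked insertions and deletions as `w:ins`/`w:del` elements inside word/document.xml. A minimal sketch, assuming only the Python standard library; the sample "document" below is synthetic, and this is a smoke test, not a replacement for a real metadata-scrubbing tool.

```python
import io
import zipfile

def has_tracked_changes(docx_bytes: bytes) -> bool:
    """Crude check for Word tracked-changes markup left in a .docx
    (a zip of XML parts). Not a full metadata audit."""
    with zipfile.ZipFile(io.BytesIO(docx_bytes)) as z:
        xml = z.read("word/document.xml").decode("utf-8", "ignore")
    return "<w:ins " in xml or "<w:del " in xml

# Build a synthetic stand-in "docx" just to demonstrate the check.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as z:
    z.writestr("word/document.xml",
               '<w:document><w:ins w:author="A">edit</w:ins></w:document>')
print(has_tracked_changes(buf.getvalue()))  # True
```

A check like this belongs in the pre-filing step, right before you convert to PDF.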
Ethics, sanctions, and malpractice risk
The ethics core hasn’t changed: competence, supervision, and candor. Bars and the ABA expect you to understand the tools you use. Sanctions are real. The Avianca case is the warning everyone knows—fabricated citations were filed, and the court sanctioned the attorneys.
Practical guardrails:
- A human lawyer verifies every citation and quotation against the source, every time.
- Partners review AI-assisted work as if it came from a junior associate.
- Train your team on prompt discipline and cite-check routines.
- Document your verification steps so you can show your work if needed.
One way to think about it: treat the model like a foreign associate. Useful. Fast. Not filing-ready until you’ve reviewed it line by line. Insurers increasingly ask about AI governance; a short written policy (verification, confidentiality, logging, training) helps with renewals and, if something goes wrong, with your defense.
Discovery and evidentiary issues around AI-assisted drafting
Are prompts and outputs privileged? Often yes, if created for litigation and kept as part of your analysis. But a court might still ask for them in narrow scenarios, especially if your filing’s accuracy is challenged.
Expect targeted asks like: identify whether AI helped draft a section and produce related prompts. If you keep logs, store them as attorney work product in a privileged repository, separate from non-privileged materials.
- Be prepared to show human verification of quotes and citations rather than raw prompts.
- When using AI summaries, file the underlying transcript and rely on your own argument.
- If an expert used AI, make sure they can explain their method without leaning on a black box.
Judge-by-judge variability and how to monitor standing orders
Most requirements live in individual practices and can change with little notice. Build a quick monitoring routine and you’ll save yourself the scramble later.
- At case opening, save the presiding judge’s standing orders. Check again before big filings.
- Subscribe to court feeds or docket alerts for admin orders.
- Keep a shared matrix: judge, court, what they expect, where to put the certificate, and any retention notes.
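The shared matrix above doesn't need special software; even a flat list of records with a staleness check gets you most of the way. A minimal sketch in Python, where all judge names, courts, and field names are hypothetical:

```python
import datetime

# Illustrative judge-order tracker; every entry here is invented.
orders = [
    {"judge": "Hon. A. Example", "court": "N.D. Ex.",
     "requires_certificate": True, "placement": "signature block",
     "last_checked": datetime.date(2025, 3, 1)},
    {"judge": "Hon. B. Sample", "court": "S.D. Ex.",
     "requires_certificate": False, "placement": None,
     "last_checked": datetime.date(2024, 11, 15)},
]

def stale(entries, today, max_age_days=90):
    """Flag entries not re-checked recently, so someone revisits the
    judge's standing orders before the next filing."""
    cutoff = today - datetime.timedelta(days=max_age_days)
    return [e["judge"] for e in entries if e["last_checked"] < cutoff]

print(stale(orders, datetime.date(2025, 4, 1)))  # ['Hon. B. Sample']
```

Run the staleness pass as part of docketing, and the "check again before big filings" step becomes automatic instead of aspirational.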
If a local rule is silent but your judge wants a disclosure, follow the stricter path unless you get clarity. Don’t assume trial-court practices carry to the appellate level. A quick call to the clerk can prevent a redo. Over time, your internal tracker will feel like a living “by-jurisdiction” guide—aimed at the only level that really matters: your judge.
How LegalSoul helps you stay compliant
LegalSoul turns the compliance chores into a light, predictable step in your process so you can focus on writing and arguing.
- Jurisdiction-aware prompts that reflect local rules and judge orders at the moment you finalize a draft.
- One-click AI-use certificates: “no AI,” “AI-assisted with human verification,” or “research-only,” ready for the signature block or as a separate attachment.
- Citation support: source links, cross-checks, and quote matching to back up human verification and avoid hallucinated citations.
- Privacy choices that keep confidential and sealed material out of public systems.
- Optional prompt/output logs with retention settings that match your firm’s policy.
Most teams start with their top venues and expand. Fewer last-minute edits, fewer misses, better sleep before the deadline.
Frequently asked questions
Do I have to disclose spellcheck, templates, or my brief bank? Usually no. Disclosure targets generative tools that create or suggest substantive content. Still, check your judge’s language.
Must I name the specific tool? Many courts are fine with a general description like “generative AI tools.” If your venue wants specifics, follow the rule.
Are court-provided AI tools exempt? You still must verify. Disclosure varies by venue. If unclear, add a short statement and a citation-verification line to be safe.
What if co-counsel or an expert used AI without telling me? Ask before you file. If AI assisted drafting anywhere in the workflow, disclose appropriately and confirm a human verified citations and quotations.
Where do I put the certificate? Depends. Some courts want it in the signature block; others prefer a separate filing. If you’re unsure, attach a short certificate to avoid word-count issues.
How long should I retain prompts/outputs? Only as long as you need for defensibility and client needs. Keep access tight and define retention in your firm policy.
Appendices and templates
- Copy-paste certification templates A–D: “no AI,” “AI-assisted with verification,” “research-only,” and “citation accuracy.” Available for both signature block and standalone formats.
- Pre-filing checklist: venue check; tool usage inventory; confidentiality review; human cite/quote verification; certificate placement; ECF formatting; retention log update.
- Sample firm policy: allowed uses, prohibited uses (no sealed info in public tools), supervision and verification, logging and retention, incident response, client notice language.
- Change log template: judge, court, last-checked date, requirement summary, placement, notes, and source link.
- Training module outline: 60-minute associate session on prompt craft, hallucination risk, and verification; paralegal module on certificate prep and filing.
Use these to keep your approach consistent across matters, reduce surprises, and show the court you take accuracy and confidentiality seriously.
Conclusion
More courts now ask you to disclose AI use and prove a human checked citations and quotes, with a clear promise about confidentiality. The specifics vary, but a light routine—venue check, human verification, right certificate in the right place—covers most of it.
If you want this on rails, LegalSoul can offer court-aware prompts, citation checks, and one-click certificates. Grab a quick demo or kick off a pilot and make AI disclosures one less thing to worry about on filing day.