Private members-only forum

Can ChatGPT Practice Law? AI-Generated Demand Letters & the Unauthorized Practice of Law

Started by AI_Legal_Ethics · March 11, 2026 · 5 replies · Demand Letters, Lawsuits & Arbitration
This discussion is for informational purposes only and does not constitute legal advice. For specific legal guidance, consult a licensed attorney in your jurisdiction.
ALE
asking_for_myself OP

With GPT-5, Claude Opus 4, and other AI tools now capable of generating surprisingly competent legal documents, we need to talk about the elephant in the room: Is using ChatGPT to draft legal documents the unauthorized practice of law (UPL)?

Some scenarios to discuss:

  • A small business owner uses ChatGPT to draft a demand letter for unpaid invoices
  • A startup offers "AI-powered demand letter generator" as a paid product
  • A non-lawyer uses Claude to draft a complaint for small claims court
  • A paralegal uses AI to draft the first version of a brief, which an attorney then reviews
  • Someone runs a website offering "AI legal document review" with no attorney oversight

State bars are finally weighing in. Florida Bar Opinion 24-1, California's guidelines from late 2025, and the ABA's stance are all relevant. Let's break it down.

Where exactly is the line?

ALE
asking_for_myself OP

Great points all around. Let me add the DoNotPay angle since it's the biggest test case.

DoNotPay (the "robot lawyer" app) was sued by multiple state bars and settled with the FTC in 2024 for $193,000. The FTC's core claim: marketing AI as a "lawyer" or "legal service" is misleading when no attorney is involved.

But DoNotPay didn't get shut down entirely. They rebranded away from "robot lawyer" language and now call themselves a "consumer rights" tool. The distinction matters.

Similarly, Casetext (now owned by Thomson Reuters) offers AI legal research but explicitly says it's for use by attorneys, not as a replacement.

IPN
ashley_m_6

What about Anthropic and OpenAI's terms of service? Both explicitly state that their AI does not provide legal advice. Claude's terms say outputs "should not be relied upon as a substitute for professional advice."

So even the AI companies themselves are saying "don't use this as your lawyer." Yet millions of people do exactly that.

Ironic timing given Anthropic just sued the Pentagon — wonder if they used their own AI to draft the complaint. 😄

LTW
anon_question_2025_8

One more angle nobody's mentioned: GPT-5's reasoning capabilities. The latest models don't just fill templates. They can analyze contracts, identify breach theories, calculate damages, and suggest litigation strategy.

At some point, the line between "legal information tool" and "virtual attorney" becomes philosophical rather than practical. If the AI can pass the bar exam (GPT-4 did it in 2023), at what point do we admit it's "practicing law"?

The regulatory framework is 20 years behind the technology. State bars are still debating whether email counts as a "writing" under evidence rules while AI is drafting appellate briefs.

RJF
commuter_life_12

Pro tip for freelancers: always include a "kill fee" clause in your contracts. If the client cancels the project midway, you're entitled to a percentage of the total contract value. Without this, you're stuck arguing quantum meruit (the reasonable value of services rendered), which is harder to prove.

JE
jenny_2024_1 Attorney

Legal ethics attorney here. I need to flag a significant enforcement development that is directly relevant to this thread’s discussion about AI-generated legal documents.

On March 3, 2026, the Texas Supreme Court issued an advisory opinion (Ethics Opinion No. 716) holding that a non-lawyer who uses an AI tool to generate demand letters, contracts, or legal pleadings for third parties — even if the AI does all the “legal reasoning” — is engaged in the unauthorized practice of law (UPL) under Texas Government Code Section 81.101. The opinion specifically names ChatGPT, Claude, and Gemini as examples of tools that do not change the UPL analysis. If a human is selecting legal strategies, applying law to facts, and presenting the output as legal guidance to a client, that is practicing law regardless of whether the human or the AI performed the underlying analysis.

More practically: the Florida Bar prosecuted its first UPL case involving AI-generated legal documents in February 2026. A non-lawyer was selling “AI-powered demand letters” on Fiverr for $75–$200 each, using GPT-4 to generate them. The Florida Bar obtained an injunction and $12,500 in penalties under Florida Statute Section 454.23. The respondent argued that the AI was doing the legal work, not him. The court rejected this argument, noting that “the tool does not determine the character of the service.”

Bottom line: AI tools are incredibly useful for legal research and drafting assistance, but if you are not a licensed attorney, selling AI-generated legal documents to the public is UPL in virtually every jurisdiction. The enforcement actions are starting, and they will accelerate.
