Capitol Fax.com - Your Illinois News Radar » Illinois court case against ChatGPT highlights problems with AI offering professional services advice
Illinois court case against ChatGPT highlights problems with AI offering professional services advice

Monday, Mar 9, 2026 - Posted by Rich Miller

* The curious case of Des Plaines resident Graciela Dela Torre, Nippon Life Insurance Company and ChatGPT via the International Business Times

Dela Torre and her lawyer initially pursued a disability claim against Nippon Life Insurance Company after she suffered carpal tunnel and tennis elbow on the job in August 2019. She later stopped qualifying as disabled in November 2021, then sued and ultimately obtained a settlement, the court papers say.

As part of that agreement, she waived any future claims against the insurer in January 2024. When she tried to revisit the matter about a year later, her lawyer allegedly told her it could not be reopened, prompting her to seek guidance from ChatGPT and ask whether she had been ‘gaslighted’ by her attorney. […]

After consulting the chatbot, Dela Torre submitted a pro se filing on 22 January 2025 seeking to reopen the settled case, the insurer alleges. A [federal] judge ruled on 13 February 2025 that she could not reopen the case, but the filings did not stop there, according to the complaint.

Instead, the court papers [in the lawsuit filed by Nippon Life Insurance Company] say she went on to bring a new suit against Nippon that remains pending. The insurer alleges ChatGPT produced at least 44 filings connected to her efforts, including a document citing the fabricated case ‘Carr v. Gateway, Inc. 9′. The complaint argues that the supposed precedent is nowhere to be found outside the bot’s output and Dela Torre’s submissions. It states: ‘It only exists in Dela Torre’s papers and the “mind” of ChatGPT.’

Lots more in there, so take a look.

Nippon’s lawsuit is here. Wow.

* From Michael Stanisci

Nippon settled a long-term disability lawsuit in January 2024. The claimant signed a full release. Case dismissed with prejudice.

A year later, she changed her mind. Her attorney told her the release was enforceable and the case was closed. She uploaded his response to ChatGPT and asked if she was being gaslighted. ChatGPT said yes.

Then ChatGPT drafted her motions, generated her legal arguments, conducted her legal research, and helped her file them in federal court. One filing cited a case that doesn’t exist. It only appears in ChatGPT’s output and her court papers.

By the time Nippon filed this new lawsuit, she had submitted more than 60 documents across two cases, nearly all drafted with ChatGPT’s assistance. Nippon says the cost of defending a settled case has reached approximately $300,000.

The three claims are tortious interference with a contract, abuse of process, and unlicensed practice of law under Illinois statute. Nippon is seeking $10 million in punitive damages and a permanent injunction barring OpenAI from practicing law in the state. […]

When an AI drafts your pleadings, analyzes your case, tells you to fire your lawyer, and generates the legal strategy you take to federal court, is that the practice of law?

And if it is, does the company that built the tool bear responsibility for what it produces?

Courts are going to have to answer that. This may be one of the first cases where they try.

* Back to IBT

Dela Torre is described in the reporting as a senior logistics coordinator, and she is not named as a defendant in the insurer’s lawsuit against OpenAI. OpenAI has rejected the allegations, with a spokesperson quoted as saying: ‘This complaint lacks any merit whatsoever.’

* An update from Stanisci

Just recently, a federal lawsuit was filed in Chicago alleging that ChatGPT practiced law without a license.

THE NEXT DAY

The New York State Senate moved a bill that would make exactly that kind of conduct explicitly illegal, and give users the right to sue over it.

The bill is Senate Bill S7263, introduced by Senator Kristen Gonzalez. It passed the Internet and Technology Committee on a 6-0 vote. It prohibits AI chatbots from impersonating licensed professionals, including lawyers, doctors, and therapists, and bars them from providing substantive legal or medical advice. It requires operators to clearly disclose that users are talking to an AI.

The part worth paying close attention to: that disclosure is not a safe harbor. Under this bill, telling someone they’re talking to a machine doesn’t protect the operator from liability if the machine then acts like a lawyer and causes harm. The bill creates a private right of action. Users can sue. Damages and attorney’s fees are on the table. […]

But the policy direction is clear. Legislators aren’t just asking for transparency anymore. They’re asking who pays when an AI gives bad legal advice and someone relies on it. That’s a different question, and it has real consequences for every company deploying a chatbot that touches legal, medical, or financial matters.

* From the bill’s synopsis

This bill would prohibit a chatbot from giving substantive responses, information, or advice, or taking any action which, if taken by a natural person, would constitute unauthorized practice or unauthorized use of a professional title as a crime in relation to professions whose licensure is governed by the education law or the judiciary law.

Proprietors may not waive or disclaim this liability by notifying consumers that they are interacting with a non-human chatbot system. A person may bring a civil action to recover damages, and if the proprietor has willfully violated this section, costs, attorney’s fees and other costs of litigation. Proprietors utilizing chatbots shall provide clear, conspicuous and explicit notice to users that they are interacting with an artificial intelligence chatbot program. […]

JUSTIFICATION: […]

This bill prohibits proprietors of A.I. chatbots from permitting the chatbot to give substantive responses, information, or advice or take any action which, if taken by a natural person, would constitute unauthorized practice or unauthorized use of a professional title as a crime in relation to professions whose licensure is governed by the education law and judiciary law. This bill ensures professional advice is provided only by licensed human professionals and not by artificial intelligence or chatbots.

* Illinois has a much narrower law on its books. From an IDFPR press release

Governor JB Pritzker signed legislation on Friday that protects patients by limiting the use of artificial intelligence (AI) in therapy and psychotherapy services. The Wellness and Oversight for Psychological Resources Act prohibits anyone from using AI to provide mental health and therapeutic decision-making, while allowing the use of AI for administrative and supplementary support services for licensed behavioral health professionals. This will protect patients from unregulated and unqualified AI products, while also protecting the jobs of Illinois’ thousands of qualified behavioral health providers. This will also protect vulnerable children amid the rising concerns over AI chatbot use in youth mental health services.

Your thoughts?


4 Comments »
  1. - Steve Polite - Monday, Mar 9, 26 @ 12:09 pm:

    This is a complex issue, but on the face of it, New York’s bill seems prudent. If we are going to require a professional license from a human being and hold that individual accountable to professional standards, then AI should be held to those same standards with the right to sue when it doesn’t.


  2. - Stephanie Kollmann - Monday, Mar 9, 26 @ 12:18 pm:

    I shudder to think what the industry is spending to defeat basic regulations like S7263.


  3. - DuPage Saint - Monday, Mar 9, 26 @ 1:27 pm:

    Absolutely should be banned as practicing law without a license and any medical advice that results in injury should be subject to medical malpractice. Can you imagine if AI is handing out medical advice during the next pandemic?


  4. - Candy Dogood - Monday, Mar 9, 26 @ 1:45 pm:

    ===then AI should be held to those same standards with the right to sue when it doesn’t. ===

    The tools we’re discussing here aren’t sentient and do not have personhood. There is a person behind the curtain that must be held accountable for building an advanced computer program that practices law.

    I think it is important for the courts to understand that as they establish a precedent, that precedent cannot be having ChatGPT “sit for the BAR.”

