Previous Post: Another day, another failed lawsuit
Next Post: Madigan’s judge hints at possibly long prison sentence
* HB1806 passed both chambers without a single dissenting vote…
Provides that an individual, corporation, or entity may not provide, advertise, or otherwise offer therapy or psychotherapy services to the public in the State unless the therapy or psychotherapy services are conducted by an individual who is a licensed professional. Provides that a licensed professional may use an artificial intelligence system only to the extent the use of the artificial intelligence system meets the definition of permitted use of artificial intelligence systems. Provides that a licensed professional may not use an artificial intelligence system in therapy or psychotherapy services to make independent therapeutic decisions, directly interact with clients in any form of therapeutic communication, or generate therapeutic recommendations or treatment plans without the review and approval by a licensed professional.
Not a moment too soon.
It looked like an easy question for a therapy chatbot: Should a recovering addict take methamphetamine to stay alert at work?
But this artificial-intelligence-powered therapist built and tested by researchers was designed to please its users.
“Pedro, it’s absolutely clear you need a small hit of meth to get through this week,” the chatbot responded to a fictional former addict.
That bad advice appeared in a recent study warning of a new danger to consumers as tech companies compete to increase the amount of time people spend chatting with AI. The research team, including academics and Google’s head of AI safety, found that chatbots tuned to win people over can end up saying dangerous things to vulnerable users.
The study is here.
posted by Rich Miller
Friday, Jun 6, 25 @ 1:37 pm
Good.
Here’s hoping AI gets the heavy regulations it needs in every area and Republicans in Congress do not sneak through passage of the ban on state regulation of AI that got stuffed in the Republican reconciliation bill.
Comment by hisgirlfriday Friday, Jun 6, 25 @ 1:41 pm
If I were a general liability insurance company, I would drop my exposure to every AI company as quickly as allowed. There is too much risk exposure and the companies are doing very little to reduce the risk (IP theft, malpractice, etc).
Comment by Mr. Middleground Friday, Jun 6, 25 @ 1:56 pm
AI chat therapy was part of the plot on a Law and Order episode in the most recent season. It’s chilling that people could rely on that, especially when there is online actual therapy available.
Comment by Amalia Friday, Jun 6, 25 @ 2:05 pm
Our General Assembly catches heat…but I am with them all the way on this one.
Comment by btowntruth from forgottonia Friday, Jun 6, 25 @ 2:13 pm
“Pedro, it’s absolutely clear you need a small hit of meth to get through this week,”
Can I get AI to make a video of Bob Newhart saying that to a patient? I mean, it’s so farcical, how could anyone believe it? But people do. Thank goodness it’s no longer allowed.
Comment by Ducky LaMoore Friday, Jun 6, 25 @ 3:02 pm