* AP…
More than a dozen states and the District of Columbia filed lawsuits against TikTok on Tuesday, alleging the popular short-form video app is harming youth mental health by designing its platform to be addictive to kids.
The lawsuits stem from a national investigation into TikTok, which was launched in March 2022 by a bipartisan coalition of attorneys general from many states, including California, Kentucky and New Jersey. All of the complaints were filed in state courts.
At the heart of each lawsuit is the TikTok algorithm, which powers what users see on the platform by populating the app’s main “For You” feed with content tailored to people’s interests. The lawsuits also emphasize design features that they say make children addicted to the platform, such as the ability to scroll endlessly through content, push notifications that come with built-in “buzzes” and face filters that create unattainable appearances for users.
In its filings, the District of Columbia called the algorithm “dopamine-inducing,” and said it was created to be intentionally addictive so the company could trap many young users into excessive use and keep them on its app for hours on end. TikTok does this despite knowing that these behaviors will lead to “profound psychological and physiological harms,” such as anxiety, depression, body dysmorphia and other long-lasting problems, the complaint said. […]
TikTok does not allow children under 13 to sign up for its main service and restricts some content for everyone under 18. But Washington and several other states said in their filing that children can easily bypass those restrictions, allowing them to access the service adults use despite the company’s claims that its platform is safe for children.
The lawsuit is here.
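To make the “For You” mechanics described in the excerpt above a bit more concrete, here is a minimal, purely illustrative sketch of an engagement-weighted feed ranker. The fields, weights, and scoring formula are invented assumptions for illustration only; this is not TikTok’s actual recommendation system.

```python
# A toy, assumption-laden sketch of an engagement-weighted feed ranker.
# Nothing here reflects TikTok's real "For You" algorithm.
from dataclasses import dataclass

@dataclass
class Video:
    video_id: str
    topics: set             # e.g. {"dance", "cooking"}
    avg_watch_ratio: float  # fraction of the clip viewers typically finish (0-1)
    like_rate: float        # likes per impression (0-1)

def score(video: Video, user_interests: dict) -> float:
    """Blend topical affinity with engagement signals (weights are made up)."""
    affinity = sum(user_interests.get(t, 0.0) for t in video.topics)
    return 0.6 * affinity + 0.3 * video.avg_watch_ratio + 0.1 * video.like_rate

def next_batch(candidates: list, user_interests: dict, n: int = 3) -> list:
    """Each swipe-driven request returns another ranked batch: endless scroll."""
    return sorted(candidates, key=lambda v: score(v, user_interests), reverse=True)[:n]

# Example: a user who has lingered on dance clips gets served more of them.
catalog = [
    Video("a", {"dance"}, 0.9, 0.12),
    Video("b", {"news"}, 0.4, 0.02),
    Video("c", {"dance", "comedy"}, 0.7, 0.08),
]
print([v.video_id for v in next_batch(catalog, {"dance": 1.0})])  # ['a', 'c', 'b']
```

The point of the toy is the loop itself: whatever a user lingers on scores higher and gets served again, batch after batch, which is the dynamic the lawsuits describe as addictive.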
* From AG Raoul…
Illinois Attorney General Kwame Raoul today announced that his office filed a lawsuit against TikTok for its harmful business practices targeting children and allegedly deceiving the public about the social media platform’s dangers.
Today’s lawsuit stems from a bipartisan nationwide investigation announced by Raoul in March 2022. Illinois’ action seeks injunctive relief to address TikTok’s misconduct as well as monetary penalties.
In addition to Illinois’ lawsuit, 13 other states filed separate enforcement actions today against TikTok for violations of state consumer protection laws. In their lawsuits, Raoul and the attorneys general allege that TikTok’s business model, which seeks to capture as much user time and attention as possible to sell advertising, has targeted youth, including teenagers and even younger children, in ways that take advantage of them.
“American children and teenagers are in the grip of a devastating mental health crisis,” Raoul said. “The addictive features on TikTok’s social media platform interfere with sleep and education, and contribute to depression, anxiety, body dysmorphia and thoughts of self-harm. In Illinois, we will always put our children and young people first. I am committed to holding TikTok and any other social media companies accountable for putting profits ahead of our children’s safety and well-being.”
The U.S. surgeon general has found there are ample indicators that social media can pose a profound risk of harm to the mental health and well-being of children and adolescents. Eighth and 10th graders now spend an average of three-and-a-half hours per day on social media. According to the surgeon general, adolescents who spend more than three hours on social media per day face double the risk of experiencing poor mental health outcomes, including symptoms of depression and anxiety.
The use of TikTok is pervasive among young people in the United States. In 2023, 63% of all Americans ages 13 to 17 who responded to a Pew Research survey reported using TikTok, and most teenagers in the U.S. used TikTok daily.
Raoul alleges that TikTok uses design features that are addictive and that exploit young users’ psychological vulnerabilities to keep them repeatedly using the platform for prolonged periods of time. Many of these product features have been linked to damaging psychological outcomes. According to the complaint filed by the Attorney General’s office, TikTok’s platform drives compulsive behavior, interfering with sleep and education, and includes features that can exacerbate issues young people have with depression, anxiety, body dysmorphia and thoughts of self-harm.
Thoughts?
posted by Isabel Miller
Tuesday, Oct 8, 24 @ 9:50 am
First, the suit is spot on. TikTok is not alone in the damage it has done, but I applaud the states for filing suit. Where it will go I have no idea, but the only way to make things better is to cost these “social media” platforms money. At the federal level, lawmakers need to start by removing immunity from all of the platforms. They won’t, but that would be the needed game changer.
Better would be to end all social media. It isn’t good for adults either. It has killed any semblance of truth that existed prior to 2004. Shouting at the clouds here, but I am not wrong.
Comment by JS Mill Tuesday, Oct 8, 24 @ 10:09 am
Not sure why TikTok is being singled out among the social media platforms. I am concerned with the single-serving bottles of liquor being displayed in stores so minors can shoplift them, as a form of marketing liquor to children. Seems more important and immediate.
Comment by Terri Burkholter Tuesday, Oct 8, 24 @ 10:21 am
“TikTok’s platform drives compulsive behavior, interfering with sleep and education, and includes features that can exacerbate issues young people have with depression, anxiety, body dysmorphia and thoughts of self-harm.”
1) I can’t argue with any of that, and
2) I can’t distinguish TikTok from other aspects of late-stage consumer capitalism.
The suit addresses a symptom, but the underlying disease remains unaddressed.
– MrJM
Comment by @misterjayem Tuesday, Oct 8, 24 @ 10:32 am
Cool… Now do Instagram and You Tube, both of which do all the exact same things that Tik Tok does as well. Or is it fine when google and meta do it?
Comment by Benniefly2 Tuesday, Oct 8, 24 @ 10:38 am
===both of which do all the exact same things===
TikTok’s logarithm is well advanced over those two.
Not to say that should get them sued, just saying that it’s different.
Comment by Rich Miller Tuesday, Oct 8, 24 @ 10:41 am
Is TikTok base e or base 10 or something else?
Comment by No relation Tuesday, Oct 8, 24 @ 10:48 am
I think it behooves society to think about how we’re spending our time, but this seems a bit too much. It’s an entertaining product; that’s what they’re calling “addictive.” People enjoy using it.
Maybe we can figure out some reasonable guidelines, but to me this smells more like a witch hunt than trying to actually understand and work through a problem.
Comment by Perrid Tuesday, Oct 8, 24 @ 10:55 am
Political gamesmanship, due to TikTok being a foreign entity. All of these social media platforms have integrated algorithms that feed addictive tendencies in all people, not just adolescents.
The amount of data these companies have accumulated on people should be addressed. Illinois has incentivized, through tax credits, the mass accumulation and storage of personal data. The Federal Government purchases personal user data from third-party entities, with almost no oversight.
Europe has much more stringent internet protocols and protections for consumers and their personal data. No one in the US, including the federal government, has been able to enact comparable privacy protections within the data sphere.
Comment by Frida's Boss Tuesday, Oct 8, 24 @ 10:57 am
Had to get it filed before the legislatively mandated TikTok ban (the Protecting Americans from Foreign Adversary Controlled Applications Act, or PAFACA) kicks in. Chinese owner ByteDance has to divest or sell to a US owner by January 19, 2025.
Comment by Donnie Elgin Tuesday, Oct 8, 24 @ 11:01 am
https://www.nbcnews.com/tech/security/us-government-buys-data-americans-little-oversight-report-finds-rcna89035
The Government pays for your data, which is “commercially available information.” Thus, the government doesn’t have to follow its own rules.
Comment by Frida's Boss Tuesday, Oct 8, 24 @ 11:05 am
- Thus, the government doesn’t have to follow its own rules. -
What’s your point? Is AG Raoul running our national security apparatus?
Comment by Excitable Boy Tuesday, Oct 8, 24 @ 11:27 am
By our nature, we will always find something addictive to do. Before social media, people filled that time with more constructive pursuits, which social media is not.
As far as the lawsuits go, I have no problem with them. If the addiction were constructive and learning were being done, it would be a different matter.
Comment by FormerParatrooper Tuesday, Oct 8, 24 @ 11:34 am
I don’t know enough about the specifics of TikTok’s backend technology to comment on whether or not TikTok is worse than other social media platforms. Social media can be very damaging for minors. But that’s the fault of the parents. Many parents seem to have given up on setting limits for their children and actual parenting. Seems to me that’s the core issue here. Parents CAN prohibit their kids from using social media, so why aren’t they doing it?
I have no idea how government would go about changing that dynamic. Heavy investment in a solid outreach campaign might help.
Comment by Now I’m down in it. Tuesday, Oct 8, 24 @ 11:41 am
Agree with JS Mill, but recognize it’s likely too big, and thereby too late, to make effective change. It’s changed us as a people, and not for the better.
Comment by Lincoln Lad Tuesday, Oct 8, 24 @ 11:45 am
===TikTok’s logarithm is well advanced over those two.===
TikTok’s algorithm is *better* but I’m not sure that it’s “more advanced.” I think it’s being targeted because a) kids use it and olds don’t and b) China’s “golden share” ownership. The evidence is pretty thick on the ground that TikTok deliberately suppresses viewpoints critical of the CCP, and amplifies China’s interests. I don’t think that that’s “dangerous for teenagers,” exactly, but it’s definitely part of why TikTok is the target and not Insta.
I actually think some of the ways the CCP’s involvement in TikTok impact American teenagers are a lot weirder and more indirect — a lot of mental health issues that are discussed pretty openly in the West are NOT discussed in China, primarily through self-censorship because everyone knows the party disapproves. TikTok used pretty blunt-instrument keyword moderation to shove videos talking about “rape” or “sexual assault” down the algorithm, so kids immediately started talking about “r*pe” or “SA,” with the very weird result that teenagers talk in TikTok moderation-evading shorthand in places like reddit that DON’T have those restrictions, and seem to be internalizing that these are words you can’t say out loud? I don’t think TikTok (or China) intentionally set out to do that, but I do think it’s not-great that American teenagers are internalizing Chinese self-censorship around mental health and sexual assault and so on.
Comment by Suburban Mom Tuesday, Oct 8, 24 @ 11:48 am
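As a rough, hypothetical illustration of the blunt-instrument keyword moderation described in the comment above: a filter that demotes any caption containing a listed term is exactly the kind of rule that evasive spellings like “r*pe” or “SA” slip past. The term list, penalty value, and function name below are invented for the sketch, not TikTok’s actual rules.

```python
# Hypothetical sketch of blunt keyword downranking (illustrative only).
import re

SUPPRESSED_TERMS = {"rape", "sexual assault"}  # invented list for illustration

def moderation_multiplier(caption: str) -> float:
    """Return a ranking multiplier; captions containing a listed term are demoted."""
    text = caption.lower()
    for term in SUPPRESSED_TERMS:
        if re.search(r"\b" + re.escape(term) + r"\b", text):
            return 0.1  # pushed far down the feed
    return 1.0

# Exact-match rules miss trivial respellings, which is why users adapt their spelling:
print(moderation_multiplier("resources for rape survivors"))   # 0.1 (demoted)
print(moderation_multiplier("resources for r*pe survivors"))   # 1.0 (untouched)
```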
In a more holistic sense, these types of “dark patterns” (as they’re called) that websites of all types, but particularly social media, employ to increase engagement through quick dopamine hits are bad *in general* and probably need to be regulated *in general* but it’s pretty hard to see that happening in the current US political and judicial climate. We’ll benefit from the scraps of whatever the EU does on it.
Comment by Suburban Mom Tuesday, Oct 8, 24 @ 11:51 am
@Perrid, I urge you to consider the impact on kids, especially teens, over the last decade or so. There is significant scholarship on the issue.
I do agree that we as a society need to solve this issue, but mainly as it pertains to adults. I just don’t know that we have the capacity to do that any more.
With respect.
Comment by JS Mill Tuesday, Oct 8, 24 @ 11:55 am
@EB it’s not about Kwame running national security. It is just a political game that may get some money from TikTok but won’t solve any issues.
The US Government doesn’t want to hamper social media. They’re going after TikTok because it’s a foreign entity.
If the Government were serious about protecting its citizenry from social media, it would push to adopt the European data privacy measures, which eclipse the protections that we have here.
If Kwame and other AGs want to get serious about protecting people, they should push to hold these companies truly accountable.
https://gdpr.eu/what-is-gdpr/
Comment by Frida's Boss Tuesday, Oct 8, 24 @ 11:56 am
===Parents CAN prohibit their kids from using social media, so why aren’t they doing it?===
We do forbid our kids from social media, as do a lot of other parents in our circle.
Unfortunately, school makes parents use multiple social media channels to get updates from sports, classes, etc., and teachers and coaches very frequently require kids to sign up for whatever their preferred social network is. Plus the amount of YouTube my kids get assigned to watch at school is insane, and YouTube does not have a “just this one video” option.
Comment by Suburban Mom Tuesday, Oct 8, 24 @ 12:06 pm
It’s interesting to see where this line is drawn.
Nobody is filing lawsuits against churches for blatantly telling children they are born ‘broken’/sinners. Entire generations have been damaged by that mindset and have grown up into adults with a very obvious case of learned helplessness.
Many of those kids don’t even have a chance to opt-out and are forced to attend hearing this about themselves from adults they are told to trust.
Do we really care about psychological harm to children?
TikTok might be the new kid on the stage, but the concepts they are using are thousands of years old.
Churches, predominantly of a single belief, were among the first organizations to embrace the benefits of TikTok. They even openly admit they target children.
—
https://reachrightstudios.com/blog/tiktok-for-churches/
Why TikTok is Important for Churches
TikTok has millions of users worldwide, many of whom are young people. This makes it a great platform for churches to reach a younger audience.
—
Comment by TheInvisibleMan Tuesday, Oct 8, 24 @ 12:25 pm
=Unfortunately, school makes parents use multiple social media channels to get updates from sports, classes, etc.,=
What schools require this? Just curious. We post to FB, but that is only a mirror of what goes on our webpage. I relented on having a district FB account about 4 years ago because very few people were going to our website or reading emails. But no one is required to go there.
Your local schools should go to something like “Rooms” or “SchoolReach” if they are using Twitter or Instagram to communicate with students. Given your moniker here, I assume your kids attend suburban schools, and I would have thought (hoped?) they would use one of these closed-loop apps to communicate.
Comment by JS Mill Tuesday, Oct 8, 24 @ 12:28 pm
I don’t have kids in school, but I thought this article was an interesting read. In it a student talks about the ways the school “mandates” social media use.
https://www.cnn.com/2024/10/08/health/school-phone-ban-student-perspective-wellness/index.html
Have a good day.
Comment by Dog Lover Tuesday, Oct 8, 24 @ 1:01 pm
TikTok is horrible but this suit is a ridiculous stunt.
Comment by DougChicago Tuesday, Oct 8, 24 @ 1:24 pm
Yeah, OK, but go after other companies in the same way. And kids will STILL use it unless parents intervene. Other things seem to grab kids, like gangs, which is probably worse. Parents should exert more control over kids.
Comment by Amalia Tuesday, Oct 8, 24 @ 3:16 pm
This will go the way of the movement to “ban rock ‘n roll to protect our kids.”
Next?
Comment by Walker Tuesday, Oct 8, 24 @ 3:27 pm
@Dog Lover- Thanks for the read, it was very interesting. The teen perspective that they “need” phones even when trying to kick the habit has been interesting to me as well. This was a parochial school, and I found that noteworthy for a number of reasons.
Schools are using these social media platforms for one simple reason: they are “free.” There is a cost, but it isn’t as obvious. Like I mentioned, we are using a closed-loop “app” from the company that provides our website. We chose this because it limits the “noise” created when you post on FB, etc. Our limited FB posts also have comments turned off. If people have questions, they can contact us directly through a variety of ways to get the actual answer. We are small, so we get back to everyone quickly.
@Walker, I respectfully disagree. We may only be able to address this issue for 8-10 hours a day, but that break from this stuff is huge for kids used to being inundated (of their own choosing usually) 24-7. For me, this is the hill I will die on.
Comment by JS Mill Tuesday, Oct 8, 24 @ 3:56 pm
@JS Mill — our school does use (various) closed-loop apps, but I actually hate them all too, for a slightly different reason: They all leak data like crazy and they all (ALL) bundle and sell kids’ data to data brokers. They also (in their terms and conditions) probably grant themselves rights to everything you or your teachers share on the platform, whether they use it to train an AI model or just to steal curriculum and sell it.
I have no ability to opt out of using these godforsaken apps if I want my child to attend public school. The directory information, absence reporting, school communications, report cards, etc., are ALL owned by a private for-profit corporation, and all of it is being sold and fed to AI models. They will claim in their terms and conditions that they “anonymize” the data, but they don’t do NEARLY enough anonymization, and I have seen NO school apps that meet federal infosec standards for handling citizen data.
They are also running A/B testing on children, without consent.
I work in data privacy and security, so this is my day-to-day, and I am constantly enraged not just that schools are making my kids use YouTube, but that they’re putting my kids’ data into these awful, awful apps. Virtually all of them are using third-party tracking pixels in ways that probably violate FERPA. Lawsuits have provided credible evidence that PowerSchool (for example) is violating federal wiretapping statutes by “eavesdropping” on student e-mails.
The US barely has data privacy laws and a bunch of stuff that school apps do routinely would be wildly illegal for a US company to do with paying customers’ data or employees’ data. But the data of minor children who are given no opt-out rights and must use these apps to even register to attend public school? Train the AI on it — Sell it — Create consumer profiles from it — Read kids’ e-mails and sell the contents to marketing companies.
Comment by Suburban Mom Tuesday, Oct 8, 24 @ 10:44 pm
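To make the anonymization point in the comment above concrete: stripping names from a dataset does little if quasi-identifiers remain. The sketch below uses invented records (not any vendor’s real export) to show how a “de-identified” school-app export can be re-linked to names with a simple join on ZIP code and birthdate.

```python
# Hypothetical re-identification via quasi-identifiers (invented data only).
deidentified_app_records = [
    {"zip": "60601", "birthdate": "2011-04-02", "grades_viewed": 37},
    {"zip": "60614", "birthdate": "2012-09-15", "grades_viewed": 12},
]
public_style_directory = [
    {"name": "Student A", "zip": "60601", "birthdate": "2011-04-02"},
    {"name": "Student B", "zip": "60614", "birthdate": "2012-09-15"},
]

# A simple join on (zip, birthdate) links the "anonymous" records back to names.
lookup = {(p["zip"], p["birthdate"]): p["name"] for p in public_style_directory}
for rec in deidentified_app_records:
    name = lookup.get((rec["zip"], rec["birthdate"]))
    if name:
        print(f"{name}: {rec['grades_viewed']} grade-portal views")  # re-identified
```

This is the standard re-identification pattern privacy researchers have demonstrated for decades, which is why bare “we anonymize the data” claims deserve scrutiny.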
@JS Mill, I hate those apps too. If you read the terms and conditions, they virtually all insist you grant a license to everything posted on their sites (they’re stealing your teachers’ curricula) and force you to agree they can “anonymize” and sell users’ (children’s) data to data brokers and other third parties. I work in data privacy, and the anonymization techniques do not anonymize the data. They also grant themselves licenses to use my children’s data to train their AIs. I have no option to NOT use these systems to even register my children for school. Most of them use third-party tracking pixels. School district IT departments are not sophisticated enough to make decisions about these for-profit school apps. It would require a data privacy specialist with experience with big data brokers, and an IT auditor who knows how to see what data they’re exfiltrating.
The AI piece is a whole other mess that requires some technical background to get into, but the bullet-point summary is: AI models can’t “forget” data, so you can’t remove data from them unless you retrain the entire model from scratch. Every child’s data that has gone into these for-profit education apps is now in an AI model, and sometimes you can force AI models to vomit out their training data. It can’t be undone.
Comment by Suburban Mom Tuesday, Oct 8, 24 @ 10:52 pm