* Tribune editorial…
Right from the start, the tech titans at Twitter and Facebook argued that they were not so much a publisher in the sense that the owner of this newspaper is a publisher but more of a public utility: closer to ComEd than the Chicago Tribune, you might say. This has proved to be a con.
By hiding behind a federal law, Section 230 of the Communications Decency Act, the social networks claimed broad immunity from liability for content created by their users; a protection not afforded this newspaper which always has stood behind the content it chooses to publish on these pages, printed or online.
Um, this is about “content created by their users,” not content created or chosen by the Tribune itself. For the Tribune, and myself, “users” would be commenters. I can be sued for writing something defamatory, but I can’t be sued if a commenter does it or if somebody in one of my live news feeds does it. And neither can my internet provider.
People can be sued for posting defamatory videos or Facebook posts, but YouTube and Facebook currently can’t. The Tribune got rid of its comment section because it was a raging dumpster fire and they couldn’t police it effectively and it was giving the entire publication a bad name. But now they want to make others do what they couldn’t and add the constant threat of civil liability to the mix?
Bite me.
I swear I’m almost pining for the old regime.
posted by Rich Miller
Thursday, Dec 2, 21 @ 2:21 pm
Public utilities are heavily regulated. Just saying.
Comment by 47th Ward Thursday, Dec 2, 21 @ 2:26 pm
You can *successfully* be sued for writing something defamatory.
You can be sued because someone thinks you stole all the cheese from the moon.
But for a more Illinois-specific example of this:
https://patch.com/illinois/plainfield/peck-lawsuit-against-patch-commenter-dismissed-0
My favorite part of which is the following:
“Using the name “John Doe,” the commenter using the name “Tim” was able to successfully defend his privacy in court.”
One might even say that guy was able to stay… invisible.
Comment by TheInvisibleMan Thursday, Dec 2, 21 @ 2:29 pm
When the old editor of the old editorial board blocked folks on the tweeter machine, kinda put into perspective how that old board was… functioning.
Now that old board is almost better than *this*, though?
The bar is so low. This new regime is barely clearing it.
Comment by Oswego Willy Thursday, Dec 2, 21 @ 2:40 pm
If the Communications Decency Act never existed, Twitter and Facebook would still not be liable for things posted by their users. The only reason 230 exists is because the CDA created a form of liability for online communications that didn’t exist before, and then created a safe harbor provision for the platform providers.
Comment by Homebody Thursday, Dec 2, 21 @ 2:54 pm
Homebody speaks the truth. Yelling SECTION 230 AARGLE BARGLE when these are actually 1st Amendment concerns seems to be fun for some, but they’re clearly wrong.
Comment by Lefty Lefty Thursday, Dec 2, 21 @ 3:05 pm
Social media is an absolute scourge. People’s lives have been ruined, intentionally I might add, by posts on facebook, twitter, et al.
I am not referring to the posts by individuals that ruin their own careers, but those meant to harm others.
These tech monstrosities should bear some responsibility for providing a platform for this destruction, which otherwise would not occur, or would not occur in the same manner.
That said, I have no sympathy for the Trib.
Comment by JS Mill Thursday, Dec 2, 21 @ 3:06 pm
===These tech monstrosities should bear some responsibility for providing a platform for this destruction===
Just yelling “Do something!” ain’t gonna cut it.
Comment by Rich Miller Thursday, Dec 2, 21 @ 3:09 pm
=Just yelling “Do something!” ain’t gonna cut it.=
Agreed. Might as well yell at kids for walking on their driveway.
Removing their immunity would get the job done, but you might as well ask Illinois to tax retirement income. Seems about as likely, given political appetites.
Comment by JS Mill Thursday, Dec 2, 21 @ 3:17 pm
I find it interesting that the demand that the social media platforms should be held accountable for what is posted on their platform is then followed by complaints of censorship…from the same folks who demand accountability.
Comment by Pot calling kettle Thursday, Dec 2, 21 @ 3:20 pm
=then followed by complaints of censorship…from the same folks who demand accountability.=
Censorship by a private entity on its own platform is completely allowable and does not violate anyone’s rights. Rich has to do it all of the time; some of the posts that don’t get through are mine. It is his blog and he can do as he pleases.
The online mobs that form on facebook and other social media should be censored. I support legislation that would allow it.
Comment by JS Mill Thursday, Dec 2, 21 @ 3:33 pm
The real issue is what to do about the algorithms platforms like YouTube and Facebook-Meta-Whatever use to “suggest” similar content. Yes, the suggestions are based on the user’s own choices, and all the platforms are trying to do is keep you online longer so they can charge more for ads, but there’s no question it has fostered a climate of division and narrow-mindedness.
Liability may not be appropriate but regulation surely is.
Comment by Bondguy Thursday, Dec 2, 21 @ 3:43 pm
I feel so bad for the hedge fund companies and their adventures in journalism. sn/
Comment by Frank talks Thursday, Dec 2, 21 @ 3:51 pm
The question is whether those who create a public forum and profit from that public forum have any duty at all to ensure folks behave well in that forum.
Generally speaking, my rule of thumb is: whether you own a building, a business, or a virtual space, if you know that space is dangerous or is being used unlawfully, and you do nothing about it, you are morally culpable and legally responsible.
Comment by Schaumburger King Thursday, Dec 2, 21 @ 4:30 pm
If you don’t like the content on electronic media you have two options: don’t use it, or create an alternative. If you have a better idea or product it will be used. Capitol Fax, case in point. Not a fan of government interference or regulation of speech.
Comment by Papa2008 Thursday, Dec 2, 21 @ 4:44 pm
I agree with Bondguy to some extent. The algorithms are the real harm causers (mobs aside, mobs formed before social media). But the user ultimately has to bear responsibility. We don’t say, “Well, Jimmy fought for ISIS in Syria, but it was social media that did it to him.” At some point we do have to recognize that people have agency and the responsibility to keep custody of their own eyes, ears, and brain.
Comment by cermak_rd Thursday, Dec 2, 21 @ 4:47 pm
Agree with Bondguy. There’s a difference between content created by users and content served up and promoted by their algorithms. Big tech can be excused if someone graffitis something stoopid on their wall, but if they spread and uplift the stoopidity, that’s on them.
Rich doesn’t promote or uplift comments. Neither does the Trib. But social media sure does and they should do a better job of reviewing what they tout.
Comment by Socially DIstant watcher Thursday, Dec 2, 21 @ 5:14 pm
Homebody: “If the Communications Decency Act never existed, Twitter and Facebook would still not be liable for things posted by their users. The only reason 230 exists is because the CDA created a form of liability for online communications that didn’t exist before, and then created a safe harbor provision for the platform providers.”
———————
Hmmm. Without the CDA and Section 230, lawyers would have made creative arguments under traditional state tort law for why ISPs, blogs with comment sections, Myspace, etc. should be liable for defamatory / copyright-violating / trade secret-revealing / etc. speech by their users. And the lawyers might have succeeded in many instances (at least in some states).
Comment by Purple Bear Thursday, Dec 2, 21 @ 6:27 pm
== Rich doesn’t promote or uplift comments ==
Actually, he occasionally does … but it is very selective and credited to the author.
Comment by RNUG Thursday, Dec 2, 21 @ 7:28 pm
As to the whole censorship issue, I’m a moderator on a couple of Facebook groups, and my son runs a major / very popular local Facebook group. It can be a PITA.
But there are plenty of automated tools to help the process. Heck, Rich even uses similar tools here … and even though I understand the process, I have unintentionally run afoul of some of his filters.
The trick is clearcut rules, automated tools that require continued tweaking, enough staff to review any trapped content in a timely manner, and punishment for violators. On Facebook, we started with filters on joining the groups and a 3-strike rule … break the rules 3 times, you’re banned. Now we’re down to 1 or 2 strikes, depending on the offense.
The bottom line is boards and groups can be controlled, but it has to be active moderation. You can call it censorship, but when the rules are clear (for example, no adult content), I don’t view it as censorship … just enforcing rules that make the group family / kid friendly.
It can be done. The main reason big companies don’t do it, or don’t do it consistently, boils down to cost.
Rich’s ‘bite me’ is an apt summation of the Trib’s editorial.
Comment by RNUG Thursday, Dec 2, 21 @ 7:46 pm
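A minimal sketch, in Python, of the kind of strike-based filtering described in the comment above: a word filter traps a comment for human review, and repeat offenders get banned after a set number of strikes. The Moderator class, the BANNED_TERMS list, and strike_limit are invented here purely for illustration; they are not Facebook’s, WordPress’s, or this blog’s actual tooling.

# Strike-based comment moderation sketch. All names are illustrative only.
BANNED_TERMS = {"example-slur", "example-adult-term"}  # placeholder word list

class Moderator:
    def __init__(self, strike_limit=3):
        self.strike_limit = strike_limit  # e.g. start at 3 strikes, tighten later
        self.strikes = {}                 # commenter -> number of violations
        self.banned = set()

    def review(self, user, comment):
        """Return 'published', 'held', or 'banned' for a submitted comment."""
        if user in self.banned:
            return "banned"
        if any(term in comment.lower() for term in BANNED_TERMS):
            self.strikes[user] = self.strikes.get(user, 0) + 1
            if self.strikes[user] >= self.strike_limit:
                self.banned.add(user)
                return "banned"
            return "held"  # trapped for human review, per the workflow above
        return "published"

# Example: a group that has tightened the rule from 3 strikes to 2
mod = Moderator(strike_limit=2)
print(mod.review("tim", "a perfectly fine comment"))  # -> published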
===The trick is clearcut rules===
Disagree. Keep ‘em guessing and they’ll be extra careful. Give ‘em clearcut rules and they’ll always look for loopholes.
Comment by Rich Miller Thursday, Dec 2, 21 @ 8:38 pm
===Removing their immunity would get the job done===
Remove their immunity and you remove mine and comments are dead. Keep that in mind.
Comment by Rich Miller Thursday, Dec 2, 21 @ 8:48 pm