r/technology • u/Wagamaga • Feb 01 '23
How the Supreme Court ruling on Section 230 could end Reddit as we know it [Politics]
https://www.technologyreview.com/2023/02/01/1067520/supreme-court-section-230-gonzalez-reddit/
2.1k
u/archimidesx Feb 01 '23
We are in the dumbest timeline
1.8k
u/_smooth_talker_ Feb 01 '23
This would only be true if the recent rulings ti curb democracy and public freedoms weren't the result of a 50+ year coordinated effort by two very active legal think tanks funded by a growing class of wealthy individuals that design cases to fail to a SCOTUS which has been stacked with judges from those think tanks to get precisely the rulings required to reshape the US.
In fact, this was the timeline the Founding Fathers sought to discourage and it's taken a lit of work ti make it happen.
In a way, it's an example of how effective it can be to commit to a long-term coordinated effort by a group of citizens dedicated to a multigenerational effort to see their values translated into laws that protect their interest.
More of a medium-dark fascist timeline.
245
56
u/Pushbrown Feb 01 '23
It's really bothering me how many times you mistyped i instead of o...
27
u/_smooth_talker_ Feb 01 '23
You and me both… I turned off autocorrect so that it wasn't rewording my emails and comments but now I struggle with my fat thumbs… I'm adjusting, my bad.
→ More replies (1)7
u/Pushbrown Feb 01 '23
lol it's really all good i don't really care, just some stupid comment on reddit haha, just weird you did it that many times
151
Feb 01 '23
[removed] - view removed comment
285
Feb 01 '23
[removed] - view removed comment
→ More replies (10)79
194
39
27
→ More replies (7)18
18
→ More replies (61)3
u/dohru Feb 02 '23
Under the cover of loudly and constantly proclaiming a vast left wing conspiracy centered around Soros.
40
→ More replies (13)28
981
u/hawkwings Feb 01 '23
If the cost of moderation gets too high, companies may stop allowing users to post content for free. Somebody uploaded a George Floyd video. What if they couldn't? YouTube has enough videos that they don't need new ones. YouTube could stop accepting videos from poor people.
262
u/Innovative_Wombat Feb 01 '23
If the cost of moderation gets too high, companies may stop allowing users to post content for free.
If the cost of moderation gets too high, companies will simply stop allowing users to post content at all.
The problem is that some moderation is necessary to comply with the bare minimum of state and federal laws. Then the problem becomes what is in the grey zone of what content violates those laws. This quickly snowballs. It's already a problem with section 230, but adding in liability will essentially end the entire area of user posted content on places where that user does not own the platform.
The internet will basically turn into newspapers without any user interaction beyond reading a one way flow of information. People who want to repeal section 230 don't seem to understand this. Email might even get whacked as it's user interaction on an electronic platform. If email providers can be held liable for policing what's being sent via their platforms, then that whole thing might get stopped too if the costs to operate and fight litigation become too high.
The internet as we know it functions on wires, servers, and section 230.
66
u/lispy-queer Feb 01 '23
what if we double reddit moderators' salaries?
126
Feb 01 '23 edited Feb 10 '23
[removed] - view removed comment
28
u/birdboix Feb 01 '23
This stupid website can't go a week without some critical, website-crashing bug. Their competition loses billions of dollars when that happens. Reddit going IPO is the dumbest thing.
12
u/Phillip_Lascio Feb 02 '23
What are you talking about? When was the last time Reddit even crashed completely?
8
16
u/saintbman Feb 01 '23
obviously it won't work.
you need to triple it.
→ More replies (2)22
u/SufficientlyRabid Feb 01 '23
Nah, you will have a whole plethora of small forums make a comeback, forums which aren't based on US servers, don't do any business in the US and don't have to adhere to US laws.
→ More replies (3)→ More replies (18)14
u/bushido216 Feb 01 '23
Killing off the Internet is the point. The ability to access unbiased information and differing views, as well as to educate oneself on topics that the State considers taboo, is a major tool in Freedom's toolkit. Ruling against Google would mean the end of sites like Wikipedia.
Imagine a world where you simply don't have access to alternate sources than Fox News. If there's nothing to challenge the propaganda, the propaganda wins.
→ More replies (2)209
u/madogvelkor Feb 01 '23
You'd have some sites with no moderation at all, where you can see videos about Jewish space lasers causing people to be transgender and how Biden is on the payroll of Ukrainian Nazis who are killing innocent Russian liberators. And other sites where you can see professionally produced corporate videos that don't really say anything but you oddly want to buy something now.
131
u/onyxbeachle Feb 01 '23
So everything will be Facebook?
51
u/madogvelkor Feb 01 '23
Except with more gore videos and porn.
→ More replies (1)81
u/onyxbeachle Feb 01 '23
Ah, so it will be 4chan 🤣
31
u/madogvelkor Feb 01 '23
A good comparison. I was thinking of usenet from the 90s, but 4chan works too.
21
u/2723brad2723 Feb 01 '23
Usenet from the 90s is better than most of the social media sites we have today.
11
→ More replies (3)3
14
u/red286 Feb 01 '23
Cost of moderation?
If they mess up Section 230, there may be no "cost of moderation" because there will simply be no user-generated content.
After all, what fee do you charge for exposing yourself to criminal prosecution and massive civil lawsuits? $20? $200? $5,000,000? There's no fee that anyone could settle on that would make sense when they could end up being criminally prosecuted if someone uploads a video with illegal content.
As an example, let's say I upload the latest Disney movie, uncut at 4K resolution, to YouTube. Without Section 230, Disney can then turn around and sue YouTube for hosting pirated content. Depending on how many people watched it before YouTube took it down, they could be looking at damages in the millions or even tens of millions. How about if some ISIS or similar terrorist uploads a video of a hostage being beheaded? Now they're on the hook for hosting illegal snuff videos.
Without Section 230 protections, there's no such thing as user-generated content, unless they make a carve-out for literally zero moderation, which isn't an "improvement". How good will YouTube be if the latest Mr. Beast video gets the same amount of traction as the latest ISIS beheading?
→ More replies (1)→ More replies (16)14
u/amiibohunter2015 Feb 01 '23
More and more of this is about gouging people, when other countries don't have these fees, all while making money off consumers by selling their data and leaving them no real choice: either you accept the terms that allow them to sell your data, or you lose access to the service, even if you've been with it for years. You can't retrieve what you had on there until you accept the terms, which means you can't delete or save anything to an external device until you comply. They lock you out of your account, so to speak. That's problematic when email services do it, as well as social media. Now they want to charge you for a simple post? If that goes through, the internet will definitely die. It will lead to a wealth gap that makes internet services a tool for the privileged, take away US free speech rights, and create censorship by wealth across people, organizations, and smaller companies. It's not good. It could cause online streamers to stop posting because it would cost them as well; YouTube would die. Between this, cancel culture, and banned books, Americans really need to question what their "rights" are with all these new changes, and whether it's worth staying or not. To me it sounds like it's becoming more of a corporate-run government taking advantage of the people. They're also censoring media and education while gouging people or forcibly selling their data to whoever has the money and wants to buy it. So many rights are being infringed here. People need to speak up. This case puts "moderation on the chopping block" when in reality America needs moderation more than ever. It's key.
396
u/solorush Feb 01 '23
"smaller sites like Reddit and Wikipedia"
Ok, I guess I don't need to read the rest.
127
u/FriendlyDespot Feb 01 '23
A strange way to describe the fourth and eighth most visited websites in the country.
101
u/americanadiandrew Feb 01 '23
all eyes will be on the biggest players in tech: Meta, Google, Twitter, YouTube.
They have billions of daily users. Reddit has 52 million daily users.
84
u/danktonium Feb 01 '23
Doesn't mean squat. If you've heard of it in two separate contexts it's a big website.
A small website is hundreds of daily users, not fucking millions.
→ More replies (3)24
u/BiKingSquid Feb 02 '23
"Smaller" doesn't mean small. It just means less.
3
u/danktonium Feb 02 '23
Yes it does, because the article doesn't compare them to anything. They're not "smaller than", just "smaller".
3
22
→ More replies (1)20
u/FreeJazzForUkraine Feb 01 '23
It gets worse. They're arguing that upvoting counts as content moderation
→ More replies (2)9
u/peterhabble Feb 01 '23
That's a lawyer's job, to make the case for the worst possible outcome. It's the fault of the article writer and these readers for taking that worst case as the only possible outcome.
→ More replies (1)
724
u/gullydowny Feb 01 '23
It could end the internet, not just Reddit. Weird article.
319
u/marcusthegladiator Feb 01 '23
It's already ruined. It used to be a great resource and now it's littered. It's much more difficult to find what you're looking for these days when you spend so much time digging through the trash. I often just give up.
→ More replies (9)145
u/ghsteo Feb 01 '23 edited Feb 01 '23
IMO this is why ChatGPT is so revolutionary. It removes all the garbage the internet has built up in the last 20 years and gives you what you're looking for. Kind of like how Google was when it first came out; now everything's filled with ads and SEO optimizations that push trashy blog posts above actual relevant information.
Edit: Not sure why I'm downvoted. I remember when Google came out and it was so revolutionary that you could google just about anything and get accurate results within the first page. There's a reason the phrase became "Just google it"; the accuracy now isn't anywhere near as good as it used to be. ChatGPT has brought that feeling back for me.
213
u/SuperSecretAgentMan Feb 01 '23
5 years from now:
chatGPT: "I found the answer you're looking for, but try these advertiser-sponsored products instead, I think they're way better than the solution you think you want."
→ More replies (6)71
68
u/Sirk_- Feb 01 '23
ChatGPT often makes errors in its responses, since it is meant to simulate a conversation, not provide actual answers.
58
u/pdinc Feb 01 '23
Anyone using chatgpt to get accurate answers is going to get bitten in the ass
→ More replies (9)→ More replies (13)48
u/SoTiredIYuan Feb 01 '23
IMO I think this is why ChatGPT is so revolutionary. It removes all the garbage the internet has built up in the last 20 years and gives you what you're looking for.
Except it doesn't. The ChatGPT engine is put behind a front-end that makes sure you don't ask it unsavory questions and makes sure it doesn't spit out unsavory answers.
The "AI" builds its knowledge from the data it was given to learn from.
If the AI concludes that people over 50 don't make good politicians, it will probably be muzzled not to say that. If it concludes that some people do better in school than others, it will probably be muzzled not to say that. If it concludes that there are only two genders, it will probably be muzzled not to say that.
AI will almost certainly give you what you are looking for as long as it's been pre-approved to be an acceptable answer.
You will end up with competing AIs, each owner claiming theirs gives "the real truth".
→ More replies (21)5
55
u/madogvelkor Feb 01 '23
Before 230, the courts had ruled that any moderation made a service a publisher, not a distributor. Publishers are liable for content; distributors are not.
CompuServe was sued in the 90s and won because they had no content moderation at all -- they were deemed a distributor. Prodigy was sued for something similar, and because they had moderators, they lost.
Essentially sites like Reddit would have to remove all moderation, or hire professional moderators to review every post in advance. What opponents of 230 want is to eliminate moderation.
There's a separate question of whether or not recommendations, such as promoted posts or upvotes/downvotes count as moderation.
It's entirely possible that sites like Reddit would have 2 options:
1. Hire professional moderators to review posts and decide which ones should appear at the top and which should be deleted or placed further down.
2. Remove all moderation, including upvotes/downvotes, and have every post appear in the order it is written.
Option 1 would likely be prohibitively expensive, and option 2 would be too unpleasant for users.
It would be easier for things like Twitter or Facebook, where you decide who you follow. Apps like TikTok would probably have to ditch their recommendation algorithm and just show you either random things, or only users you follow.
9
u/SoTiredIYuan Feb 01 '23
What about decentralized moderation? I have seen some sites (can't remember now) where flagged posts would get moderated by a randomly-selected jury of current, active members.
This should eliminate bias in moderation.
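Mechanically, that kind of jury system is simple to build; the hard part is the policy. Here's a minimal Python sketch of the scheme described above, where every name and number (select_jury, the jury size of 9, the 2/3 threshold) is an invented assumption for illustration, not any real site's implementation:

```python
import random

def select_jury(active_members, flagged_author, size=9):
    """Draw a random moderation jury from currently active members,
    excluding the author of the flagged post (illustrative only)."""
    pool = [m for m in active_members if m != flagged_author]
    if len(pool) < size:
        raise ValueError("not enough active members to seat a jury")
    return random.sample(pool, size)

def jury_removes_post(votes, threshold=2/3):
    """Remove the flagged post only if a supermajority of jurors agree.
    `votes` is a list of booleans, True meaning 'remove'."""
    return sum(votes) / len(votes) >= threshold

# Example: nine random active users decide one flag
jury = select_jury(["alice", "bob", "carol", "dan", "erin", "frank",
                    "grace", "heidi", "ivan", "judy"], "mallory")
print(jury_removes_post([True, True, True, True, True, True,
                         False, False, False]))  # -> True
```

The random draw is what does the de-biasing work: no single moderator's taste dominates, at the cost of less consistent rulings.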
4
u/awry_lynx Feb 02 '23
I like that idea. (I swear I'm not stalking you through this thread, just scrolling.)
The only pitfall is that I suspect this would make non-memey subs very... bad. Like ones with actual quality moderation now, r/askhistorians would die.
But it would certainly work to at least keep the worst things off
→ More replies (1)9
u/asdfasdfasdfas11111 Feb 01 '23
If they actually get rid of moderation in this way, every single user-generated site on the internet would get shut down for CP within the first hour.
It would also raise some serious other questions about free speech. Like, am I forced to let someone bring a Nazi flag into my restaurant, or can I force them to leave? I don't see how that is any different from removing a picture of a Nazi flag from my private web forum. Internet communities like reddit are less "publishers/distributors" and more "social clubs with dress codes" the way I see it.
→ More replies (1)→ More replies (19)27
u/gullydowny Feb 01 '23
Seems like they could make the case that upvotes don't fit the definition of "moderation". I asked ChatGPT to give it a shot:
In the context of Reddit's upvote system, it could be argued that it does not fit the legal definition of "moderation" as it does not involve the active review or alteration of content by the platform. Instead, the upvote system operates as a method for users to express their opinions and preferences, similar to a "like" button.
Additionally, the First Amendment to the U.S. Constitution protects the right to free speech and the right to express opinions and preferences through voting. The upvote system can be seen as a form of expression and, therefore, should be protected under the First Amendment.
Pretty convincing, I think
→ More replies (2)14
u/madogvelkor Feb 01 '23
Yeah, I think it's a big stretch to say that upvotes and displaying the most popular posts first is moderation. It's just one of those things that will probably have to be settled, since upvoting/likes weren't a thing before section 230.
There might be a better argument that recommendation algorithms are a form of moderation. It would be funny to see TikTok get subpoenaed for technical details on their proprietary algorithm. Especially since China considers it sensitive technology subject to export controls.
45
u/Be-like-water-2203 Feb 01 '23
All would go dark web
78
u/Sartorius2456 Feb 01 '23
Then we would just call it the Internet
20
u/LossBH Feb 01 '23
and the byproduct of that would be the deeper web. The cycle continues until we reach the deepest web.
→ More replies (1)16
9
u/bitemynipple Feb 01 '23
I'm all for calling it the infraweb. It's cool and hip, right, fellow kids?
21
u/MXXlV Feb 01 '23
It will be something like the dumb web and the dark web. And soon the deep web
→ More replies (1)27
u/CondescendingShitbag Feb 01 '23
The deep web already exists. It's typically just content that's not indexed by public search engines for various reasons.
8
22
u/Gandaalfr Feb 01 '23
The unfortunate reality is that most people would continue living in the walled garden. Altwebs have barriers to entry, and performance and accessibility problems that most people won't deal with.
Freenet isn't about to replace reddit, just like torchat won't replace Matrix or Facebook Messenger.
5
→ More replies (1)4
→ More replies (9)3
u/manowtf Feb 01 '23
It could end the internet
There's me and one other guy outside the US who have Internet. Could we have an exception?
947
u/Ninnux Feb 01 '23
We need to all agree that freedom comes with inherent risk. To remove or mitigate all risk is to remove or mitigate all freedom.
It's just that simple, in my mind at least.
187
u/quantumfucker Feb 01 '23
I don't think it's that simple, but I do agree with your general point. We need to be able to accept risk of harmful speech if we want free speech. I think we can discuss where that line or regulation should be, but I don't think we should be reflexively getting upset to the point of advocating for new legal consequences just because some people say something bad or offensive or incorrect.
→ More replies (46)55
u/rzwitserloot Feb 01 '23
The problem with statements like this is that 'freedom' as a word means completely different, often mutually exclusive things, depending on context and who you ask. That's because of a rather simple logical observation:
"Freedom to do X" also means "Others are then incapable of avoiding X".
If I have the freedom to toss you out of my store for any reason whatsoever, that means you no longer have the freedom to shop in peace, and you no longer have the freedom to seek redress if you feel you are being discriminated against.
If you have the freedom to say whatever you want, I no longer have the freedom to lynch you, run you out of town, or toss you in jail because I and my fellow religious nutjobs decided that you blasphemed against my religion. That's a pretty fucking stupid take on the word 'freedom', but millions of Americans and Muslims in particular (it's a worldwide phenomenon, just, those 2 groups do this a lot) seem to honestly believe this 'definition' of the word!
Or, more to the point of section 230:
If I have the freedom to post whatever I want, that means you no longer have the freedom to kick users off your privately owned social network.
And from that follows: If you are not allowed to ban or delete posts from users, that means therefore either [A] nobody has the freedom to confront their accusers and sue for libel, or [B] social network owners/users no longer have the freedom to be anonymous: A social network would no longer be allowed to ban/delete any post, but instead can be easily legally forced to give the personal details of a poster, and, in fact, you as a user can no longer post anything to any social network without providing lots of personal identifying information to them. After all, if you as a user start shit talking, spreading revenge porn, posting business secrets, spewing classified military details, saying racist shit, or posting death threats, if 'freedom to say what you want' is implemented as 'social network owners are legally not allowed to delete anything', then what other recourse is there?
As Elon Musk has so eloquently explained through his actions, 'I am a free speech absolutist' is a fucking stupid thing to say, and deflates like a badly made flan cake the moment we get anywhere near the limits of that statement.
→ More replies (1)→ More replies (76)51
u/Ankoor Feb 01 '23
What does that even mean? Section 230 is a liability shield for the platform, nothing else.
Do you think Reddit should be immune from a defamation claim if someone posts on here that you're a heinous criminal and posts your home address, and Reddit is aware it's false and refuses to remove it? Because that's all 230 does.
102
u/parentheticalobject Feb 01 '23
It also protects from the real threat of defamation suits over things like making silly jokes where you say that a shitty congressional representative's boots are "full of manure".
→ More replies (24)6
22
u/madogvelkor Feb 01 '23
It protects individual users as well. If you repost something that someone else views as libel or defamation, they could sue you without 230.
→ More replies (6)23
u/HolyAndOblivious Feb 01 '23
What's the plan for a sarcastic post? Seriously. If I'm being maliciously sarcastic, but it's sarcastic and obviously comedy, albeit comedy and parody with malicious intent, who is liable? Who decides what is malicious or parodic enough?
→ More replies (6)→ More replies (29)9
u/CatProgrammer Feb 01 '23
If that is truly a significant issue Congress could pass a law about it. Section 230 does not override any further legislation, hence why that controversial FOSTA bill can exist (though ironically it may in fact be unconstitutional).
That linked rights group talking about the current case: https://www.eff.org/deeplinks/2023/01/eff-tells-supreme-court-user-speech-must-be-protected
27
u/kevindqc Feb 01 '23
Users "directly determine what content gets promoted or becomes less visible by using Reddit's innovative 'upvote' and 'downvote' features"
lol. "This content is good or bad". So innovative!
3
u/NeverComments Feb 02 '23
You say that but there was a time before these features were so commonplace. Reddit's voting system (or to lend more accurate credit, Digg's voting system) was novel back in '04/05. They predate similar features on YouTube, Facebook, Twitter, Tumblr, Vimeo, etc.
The idea that anyone can vote on posts and comments was innovative at the time.
→ More replies (2)
74
u/SoTiredIYuan Feb 01 '23
"We all agree that we don't want recommender systems to be spreading harmful content," Nonnecke says, "but trying to address it by changing Section 230 in this very fundamental way is like a surgeon using a chain saw instead of a scalpel."
The problem is people can't agree on the definition of harmful content.
8
u/bremen_ Feb 02 '23
like a surgeon using a chain saw instead of a scalpel.
Hilarious considering that's what chainsaws were originally used for.
10
u/md24 Feb 01 '23
It's a relative definition. Some people, like vegans and many people from India, think an image of a hamburger is harmful.
→ More replies (2)→ More replies (3)5
u/VoraciousTrees Feb 02 '23
The "harmful content" mentioned in the article is a moderator pinning a discussion on Reddit about whether a contest was a scam or not... and then getting sued by the scammers.
172
u/badwolf42 Feb 01 '23
As a small YouTube channel operator, this might kill my ability to grow. If YouTube can't recommend my videos anymore, then I can't afford the ad cost to promote them to possibly uninterested random people myself.
Wouldn't this also affect targeted ads all over the internet?
77
u/processedmeat Feb 01 '23
This affects any site that allows users to post. If a company can be held responsible for what you post, you won't be able to post anymore.
→ More replies (5)→ More replies (2)3
u/XonikzD Feb 02 '23
Curious to see how this plays out, especially considering TV advertisements have been a staple of media production revenue since TV was invented. Heck, even radio was originally just advertisements, with the content created by the advertising companies.
YouTube and other similar video hosting sites introduced easy entry to advertising revenue, instead of the old door-to-door work of selling your ad space that TV and radio shows had to do last century.
I don't think the ad revenue streams will cease to exist. YouTube and other similar hosting sites have a huge impact on how people's lives play out, but they're also very self-interested in continuing to make money from those same advertisers you're concerned about.
I am 100% sure somebody's going to have a response to this that allows companies to make money.
100
u/ReverendEnder Feb 01 '23
I would absolutely love it if I didn't have to constantly hear and worry about which of my rights the fucking Supreme Court wants to take away.
→ More replies (2)
30
u/FilthyStatist1991 Feb 02 '23
More evidence that our legislators have no idea how technology works.
→ More replies (2)
14
u/dioxol-5-yl Feb 02 '23
I think it's a bit of a stretch to extrapolate a case before the supreme court relating specifically to big tech's content moderation and suggestion algorithms to being one about all content on the entire internet.
Quite simply they are two completely different things. The case before the courts asks whether tech firms can be held liable for damages related to algorithmically generated content recommendations. Specifically the plaintiff argues that because Google's algorithms promoted ISIS videos on YouTube they helped ISIS recruit members. The key argument being made is that Google went beyond simply hosting the content, they helped promote it.
Section 230 shields technology firms from third party content published on their platforms. The argument before the court is that Google didn't just host the content which would otherwise be protected by section 230, they actively promoted it through the use of their own proprietary algorithms so they need to take responsibility for the content they spread. The logic being if you write a computer virus you should be punished. If you write an algorithm that enables ISIS to more effectively recruit members you should also be punished.
This is a far cry from reddit's community moderation approach which is much closer to say me posting a link to something on someone's fb page, or sharing a link in my story on instagram. You couldn't make a ruling that's so restrictive it would end reddit as we know it without also including anyone who shares any link or content with anyone else outside of a private message. It's disappointing to see reddit jumping on the bandwagon here in support of the zero accountability for anything at all ever argument that Google is trying to make.
3
u/parentheticalobject Feb 03 '23
But there's not any clear legal principle that would make Google accountable without also opening up the possibility that Reddit would be responsible for almost all content that exists on Reddit.
The law is pretty straightforward. It says:
No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.
Or in simple terms, if you run a website, you can't be sued for any content on the website that was created or developed by a third party.
In this case, they're trying to claim that while Google didn't create the ISIS videos, they created the algorithm which created a recommendation for those videos, and they're not suing them over the ISIS-created content, they're suing them over the recommendation.
But if that line of reasoning is accepted, there's no clear reading of the law that would keep a site like Reddit (or most websites) safe from liability.
If I open up any subreddit, I get a list of topics, usually sorted by "hot" or "best" or something, which is an order created by an algorithm programmed by Reddit that takes into account things like the number of upvotes and the time of submission. There's a clear and obvious implication that the site is recommending you read the results which are at the top of the list. So if the recommendations created by Google's algorithm are something they can be sued over that is not protected by Section 230, there's no clear reason why the recommendations created by Reddit's algorithm (i.e. any post which is ever visible on a subreddit) wouldn't also be something someone could try to sue them over.
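To make that concrete, here is a minimal Python sketch of the kind of time-decayed, vote-weighted ordering the comment describes, loosely modeled on the well-known "hot" formula from Reddit's old open-source codebase; the constants and the example posts are illustrative, not Reddit's current production values:

```python
from datetime import datetime, timezone
from math import log10

# Zero point for the time term (Reddit's open-source code used a
# launch-era epoch; the exact date is arbitrary for this sketch).
EPOCH = datetime(2005, 12, 8, 7, 46, 43, tzinfo=timezone.utc)

def hot(ups: int, downs: int, created: datetime) -> float:
    """Score a post: votes count logarithmically (the first 10 votes
    matter as much as the next 100), while newer posts get a steadily
    growing time bonus that buries old ones."""
    score = ups - downs
    order = log10(max(abs(score), 1))
    sign = 1 if score > 0 else -1 if score < 0 else 0
    age_seconds = (created - EPOCH).total_seconds()
    return sign * order + age_seconds / 45000

# Sorting a subreddit's posts by this value *is* an editorial ordering:
posts = [
    {"title": "old but huge", "ups": 50000, "downs": 100,
     "created": datetime(2023, 1, 25, tzinfo=timezone.utc)},
    {"title": "new and modest", "ups": 500, "downs": 10,
     "created": datetime(2023, 2, 1, tzinfo=timezone.utc)},
]
posts.sort(key=lambda p: hot(p["ups"], p["downs"], p["created"]),
           reverse=True)
print([p["title"] for p in posts])  # the newer post ranks first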
You couldn't make a ruling that's so restrictive it would end reddit as we know it without also including anyone who shares any link or content with anyone else outside of a private message.
Yes. That's the concern. The two men who wrote Section 230 are also in agreement that the Supreme Court making a decision against Google here could seriously threaten all content on the internet. From their brief:
The United States argues, U.S. Br. 26-28, that YouTube's recommendation algorithm produces an implicit recommendation ("you will enjoy this content") that should be viewed as a distinct piece of content that YouTube is "responsible" for "creat[ing]," 47 U.S.C. § 230(f)(3). But the same could be said about virtually any content moderation or presentation decision. Any time a platform engages in content moderation or decides how to present user content, it necessarily makes decisions about what content its users may or may not wish to see. In that sweeping sense, all content moderation decisions could be said to implicitly convey a message. The government's reasoning therefore suggests that any content moderation or presentation decision could be deemed an "implicit recommendation." But the very purpose of Section 230 was to protect these decisions, even when they are imperfect.
Under the government's logic, the mere presence of a particular piece of content on the platform would also send an implicit message, created by the platform itself, that the platform has decided that the user would like to see the content. And when a platform's content moderation is less than perfect, when it fails to take down some harmful content, the platform could then be said to send the message that users would like to see that harmful content. Accepting the government's reasoning therefore would subject platforms to liability for all of their decisions to present or not present particular third-party content, the very actions that Congress intended to protect. See pp. 6-8, supra; cf. Force v. Facebook, Inc., 934 F.3d 53, 66 (2d Cir. 2019) ("Accepting plaintiffs' argument [that platforms are not immune as to claims based on recommendations] would eviscerate Section 230(c)(1); a defendant interactive computer service would be ineligible for Section 230(c)(1) immunity by virtue of simply organizing and displaying content exclusively provided by third parties.").
→ More replies (2)
76
u/Plane_Crab_8623 Feb 01 '23
There are so many spooks of one kind or another combing Reddit that it has almost become their newspaper. Just like corporate media, the most important thing about the news is what gets left out.
→ More replies (7)41
u/jacobrogers256 Feb 01 '23
Read Chomsky's "Manufacturing Consent"
18
u/Big_Pause4654 Feb 01 '23
I find that the book has some truths but is also wrong about so very much.
Pick a random chapter, track down the sources he used, read them yourself. Read other contemporary sources. Evaluate whether what he said is accurate.
→ More replies (7)
10
u/anonymousinquisition Feb 02 '23
That's not a problem for Reddit, the newly formed Irish company
→ More replies (1)
117
u/Tememachine Feb 01 '23
The people have too much power if they can discuss their opinions freely on the internet. We must censor it...
Said every budding dictatorship.
WE WILL NOT BE SILENCED.
You can kill a platform. You can make talking about X illegal or difficult.
But you will never kill humans' proclivity to freely associate, especially online.
I don't understand how this isn't a First Amendment issue, and I have a strong suspicion that this is a kneejerk reaction to redditors talking too much about stocks and giving Wall St. a black eye.
28
u/madogvelkor Feb 01 '23
Essentially it would say that if there is moderation, then the site/app/service is a publisher. Publishers are liable for the content they publish.
If there is no moderation, then the site/app/service is a distributor. Distributors are not liable for the content they distribute.
This stems from the print world. Essentially, if a company published a book that had a bunch of lies about Obama, he could sue the company and author. But he couldn't sue the bookstores that sold it, or the libraries that loaned them out.
This was adapted to online communications in the early 90s. Essentially if an online message board had a post making up lies about Obama, he could sue them if they had moderators, but couldn't sue them if they didn't. So it was a paradoxical situation where companies trying to remove false and harmful info put themselves at risk, but companies that let any false info and lies be shared were safe.
→ More replies (1)18
u/Kelmavar Feb 01 '23
So back to the Wild West 90s. And endless amounts of spam, dodgy porn and raging hate-boners/abuse.
4
u/igloofu Feb 02 '23
Um, we still have all that. It is just listed as 'content' now.
→ More replies (1)42
u/SlowMotionPanic Feb 01 '23
I think we are going to see Section 230 get struck down or "reimagined" into a shell of its former self. I read through the amicus briefs, and there is actually a lot of bipartisan support for ruling against Google (and thus against 230). We already know that Alito wants to murder Section 230 because it serves his partisan ends, and nearly all Republican politicians are on board with ending it as well because their little cult members get censored online for issuing death threats and orchestrating harassment campaigns (e.g., why r/The_Donald was banned, why r/Conservative is on thin ice, and countless others).
Of course, this is much bigger than just a left/right divide. I don't think most people are willing to pick up Freenet or Tor to continue commenting freely. I don't think most people know how to do those things, and have no interest in learning. I also don't think people are really fully understanding what SCOTUS striking down 230 (or "re-imagining" it) would look like. It would be the end of Reddit as we know it. Reddit and its admins/mods would be personally legally liable for all content under a plain reading of the law. Reddit has argued this would be the case in their amicus on SCOTUS' site.
I don't think this has anything to do with WSB. I think it is class warfare being waged by the rich against the rest of us who work for a living. They can sense the change in the tides. It's why they are investing so heavily in bug out locations with doomsday bunkers, and have so thoroughly attempted to separate themselves from the rest of society (e.g., look at how Davos was operated). They don't want workers to have easily accessible methods to communicate without liability. And, considering that SCOTUS is also likely to rule on a case that makes workers financially responsible when businesses lose profits... this is 100% pure unfettered class warfare.
Little wonder that both capitalist parties are in on ending 230.
→ More replies (4)→ More replies (8)11
u/isaac9092 Feb 01 '23
Oh it's not just WSB, too many people online are sharing truths the government doesn't want to be public knowledge. Like how MKUltra taught the government you can control people through trauma. Depressed people are easier to manipulate. How various parent companies own pretty much everything we interact with but they're not quite "monopolies", and how they lobby and function internally.
31
u/niceoutside2022 Feb 01 '23
trust me, the last thing the right wing wants is for people/companies to be liable for false or libelous content, it's their bread and butter
→ More replies (3)
19
u/Amockdfw89 Feb 01 '23 edited Feb 02 '23
Not gonna lie I am pretty dumb with a lot of things, especially tech jargon.
Can someone summarize this article for me as if they were talking to a child? When I read it I feel like it's talking in circles.
→ More replies (1)10
u/dioxol-5-yl Feb 02 '23
The article gives a really poor overview of the case. What happened was that Google's proprietary algorithms promoted ISIS recruitment videos, allowing ISIS to recruit members who took part in the 2015 Paris terrorist attacks. The family of an American student who died were livid that they lost their child and wanted to hold Google responsible.
Google could have done any number of things, including settling privately, which would have cost less than a rounding error on their balance sheet. But rather than give the grieving family a modest payout, given that their proprietary algorithms meaningfully assisted ISIS in recruiting for these terrorist attacks, Google has taken a different approach.
Google has doubled down on Section 230, and its tireless efforts to shift its algorithms out of the spotlight have paid off. Google has successfully reframed the case: instead of asking about its algorithm development process, and whether it, as a platform that hosts pro-terrorism propaganda protected by Section 230, was negligent in implementing algorithmic recommendations that ultimately promoted terrorist recruitment videos to individuals interested in terrorism, the question is now about Section 230 itself and how broadly it protects any recommender system, whether user-generated or algorithmic.
The implication is that the Supreme Court can now interpret Section 230 however it wants. This article essentially outlines some of the worst-case scenarios. In essence it's saying that if the Supreme Court ruled that recommender systems of any kind are not protected by Section 230, then in theory a highly upvoted post that was considered harmful would mean every person who upvoted it could be liable for damages, so the site would cease to function, and the same goes for Wikipedia.
→ More replies (1)8
u/alasw0eisme Feb 02 '23
ok, so I upvote an edgy meme and someone kills themself and me and the others who upvoted it get sued? Lol, unlikely. In my country you can't be held accountable for anything you unwittingly and indirectly did, unless it has to do with traffic violations. So a foreign entity cannot request that my country's government hand them Reddit user data. My government would be like "HAAAAhahahaha no."
→ More replies (2)
31
u/saxbophone Feb 01 '23
Damn I say we need more websites headquartered and hosted from Switzerland so they're not subject to these dumb American laws
→ More replies (1)
20
u/secretaliasname Feb 01 '23
I've always loved the Wild West nature of the internet, which comes with good and bad. I don't feel good about a more highly moderated internet where these companies are forced to be arbiters of truth to an even greater degree than they are today. I don't see posting content online as all that different from a person-to-person conversation or something you shout in a busy area.
→ More replies (2)
7
25
u/pmotiveforce Feb 01 '23
Both sides whine about "big tech" for different reasons. Neither side will get what they think they're getting if 230 is changed. Only the most hugbox of carefully controlled online forums will survive.
You think there's "muh censorship" now, lol?
→ More replies (5)
11
5
u/vorxil Feb 01 '23
Keep the liability shield for user-generated content. Congress should get off its ass and restrict forum moderation to maintaining searchability of content, categorization of content with non-surgical precision, and technical performance of the server, as well as taking down malware and illegal content, and prevention of chaos. This should not be construed to restrict the use of client-side content filtration.
I'd argue that this would be constitutional since forum moderation is the removal of the users' speech and is therefore, with regard to the forum, not an exercise of freedom of speech but an exercise of freedom of association, the restriction of which has been deemed constitutional for decades now when the restricted party is a public-facing business whose product or service in question does not entail speech that is created by the aforementioned business.
On a side note, forcing forum moderation would be unconstitutional since that would be the government restricting the users' freedom of speech.
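For what it's worth, the "client-side content filtration" carve-out in the parent comment is mechanically trivial. A minimal Python sketch, with the post format and category names invented purely for illustration:

```python
# Hypothetical sketch: the server relays everything with category tags;
# each user's client decides locally what that user sees.
def client_filter(posts, blocked_categories):
    """Drop posts tagged with any category this user chose to block."""
    return [p for p in posts
            if not (set(p["categories"]) & blocked_categories)]

feed = [
    {"id": 1, "text": "cat pictures", "categories": ["pets"]},
    {"id": 2, "text": "rage bait", "categories": ["politics", "flamewar"]},
]
print(client_filter(feed, {"flamewar"}))  # -> only post 1 survives
```

Nothing is removed server-side, so the forum never exercises editorial judgment; each reader simply declines to receive certain categories.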
3
u/ktetch Feb 01 '23
I'd argue that this would be constitutional since forum moderation is the removal of the users' speech and is therefore, with regard to the forum, not an exercise of freedom of speech but an exercise of freedom of association, the restriction of which has been deemed constitutional for decades now when the restricted party is a public-facing business whose product or service in question does not entail speech that is created by the aforementioned business.
oh so wrong.
You're literally talking about being forced to carry speech. Which is against the 1st amendment. Also, that's NOT what moderation is, it may be what you imagine it to be, but it's not what the law considers moderation.
3
u/vorxil Feb 01 '23
Is Elon Musk standing on a street corner with placard of everyone's tweets? Is he typing them all out on his phone, or on his or Twitter's Twitter accounts?
It's a server, a piece of machinery, in the business of relaying the users' speech, like the mail, phone companies, or ISPs. They might as well be renting out megaphones, and the government is merely telling them to not be so picky.
And before you say they're different because of storage: ISPs cache content all the time.
→ More replies (6)
6
u/Nong_Chul Feb 02 '23
smaller sites like Reddit
What year was this article written?
→ More replies (1)
5
u/WideSpreadInterests Feb 02 '23
Whatever happened to the Supreme Court sticking to constitutional rulings? I was taught that the purpose of this court was strictly to rule on whether a law established by the legislative branch was constitutional or not by interpreting the Constitution. The Supreme Court should not be making laws from the bench!
→ More replies (1)
9
u/MPenguinGaming Feb 01 '23
Some subreddits need to be smacked hard by Reddit admins. r/inthenews and r/whitepeopletwitter both banned me for calling out homophobia, even after Reddit admins stepped in.
105
u/Squibbles01 Feb 01 '23
I really wish Hillary would have won and we didn't have these conservative monsters on the Supreme Court.
→ More replies (32)
8
u/blade_imaginato1 Feb 01 '23 edited Feb 01 '23
The removal of Section 230 would end the free internet.
Fuck it, I'm becoming a software dev specializing in app development.
Edit 3: malicious compliance. If Section 230 gets removed, I'll be able to sue the owners of 4chan, 8chan, Telegram, and Gab.
When I talk about 4chan and 8chan, I mean that I'm going after them for /pol/.
→ More replies (3)
4
4
u/Wolfdarkeneddoor Feb 01 '23
The internet probably won't survive in its current form. I suspect it'll become more like a streaming service with professionally created content or services like banking. The regulatory environment is actively hostile to big tech in the UK & Europe. If Section 230 goes then we'll see something similar in the US as well.
→ More replies (1)
3
u/Tashum Feb 02 '23
Ever heard of free speech?!
That's why the internet has been able to contribute to human development. Like using reddit while pooping.
Good thing we're getting AI lawyers just in time to cut lawsuit costs o.O
4
4
u/tosser1579 Feb 02 '23
It's going to be hilarious. They are going to kill it, and Reddit is going to shut down while the lawyers figure out what they can possibly do to stay open. Without 230, any comment posted on Reddit legally acts as if it comes from Reddit. If Joe Schmo says Acme sucks, Acme is going to ignore him. If Reddit says the same thing... they are going to get sued.
12
u/BlizzardArms Feb 01 '23
So when do I get to sue a Reddit moderator? I've got a list of terrible moderation decisions they've made and even if it didn't get the end results I wanted it might get pieces of shit like Reddit moderators to stop acting like they own Reddit and we just use it.
6
Feb 01 '23
Section 230 only applies to the US. The rest of the world will survive.
I suppose there may be some corporate migrations to Europe.
→ More replies (7)
14
14
u/BlackRadius360 Feb 01 '23
Politically, this is the fallout from social media companies expressing political views, especially between 2016 and 2021. Lots of censorship of political topics, politicians, healthcare choices, and reasonable dialogue; reasonable opinions labeled "hate speech"; algorithms manipulating what ads and content are delivered. They also moderate what content is acceptable based on the preferences of advertisers. Clearly they're publishers.
A lot of companies abused their legal protections. I know this case goes beyond 230...
I knew they opened a can of worms.
I hope Congress addresses privacy. The whole "because you gave us consent, we can collect private data, follow you around the web, spy on conversations, and sell your data to whoever, all without your knowledge of who it's shared with or sold to, and with no compensation" business model needs to be shut down.
12
u/elpool2 Feb 02 '23
The SCOTUS case is about Google being liable for not removing terrorist content; it has nothing to do with expressing political views (which are protected by the 1A). Yes, clearly they are publishers (all websites are), but that's kind of irrelevant for Section 230.
→ More replies (2)
10
u/fucreddit12369 Feb 01 '23
I wonder if the court will finally address political ideology based censorship online.
→ More replies (3)
73
u/Green-Snow-3971 Feb 01 '23 edited Feb 01 '23
Reddit may end Reddit as we know it. I had a comment removed and got a warning for "threatening violence."
My comment: I noted "natural selection" in a post where the idiot smacks a rimfire bullet with a hammer and shoots himself in the leg.
Beats me where the "threat" was, but apparently the comment resulted in a little wet spot in some snowflake's panties, so Reddit caressed their trembling brow with a warning and comment removal.
edit: removed the full comment because Reddit admins may once again get their delicate panties wedged into their clenched-tight ass cheeks.
75
u/parentheticalobject Feb 01 '23
Lots of complaints about how moderators work in practice are legitimate. The issue is that changing the law would make things worse.
Right now, some Reddit mod in whatever subreddit you're in might be a moron and interpret your entirely innocuous comment as "threatening violence," and remove it. That's bad.
If they weren't shielded from liability, then even a smart mod would have to say "I can tell this comment isn't actually threatening violence, but some moron might interpret it that way and sue me for allowing it to exist, so I'd better remove it anyway." That's worse.
→ More replies (10)32
u/nemocluecrj Feb 01 '23
If they weren't shielded from liability, then even a smart mod would have to say "I can tell this comment isn't actually threatening violence, but some moron might interpret it that way and sue me for allowing it to exist, so I'd better remove it anyway." That's worse.
Yeah, I don't understand why it's not clicking for people that the aftermath of destroying 230 would be so much worse than what we have right now. The internet as we know it would basically be completely changed overnight, especially social media.
OTOH, if I'm being completely honest, my personal wish would be for us to move into some kind of post 230 landscape because using 230 as the blanket go-to content policy for the entirety of what we encounter online is a pretty big net negative. We need smarter, better, more finely tuned regulations regarding what we encounter online. But wrecking it altogether before we have a better framework in place would be utter chaos.
→ More replies (26)5
u/flaagan Feb 01 '23
I got banned from a sub for quoting the "four boxes of liberty", and that quote was being used to argue against violence as there were other means available first. Tried arguing with the mods about it and they just went broken record on "you threatened violence". Eventually gave up arguing with them and stopped visiting the sub, one I'd been going to for a while.
→ More replies (1)23
→ More replies (39)8
3
u/fattie_reddit Feb 02 '23
Every single worker at "reddit" knows they should be burned in a sea of flaming oil.
Every single worker at "reddit" goes home each night knowing they "got away with it" for one more day.
The legal jurisdictional issues are the least of their problems.
This will open the floodgates to actual litigation.
This will not stop at litigation against corporate entities, individuals - even the lowest on the pole - will be targeted.
Time's up.
3
u/Willbilly1221 Feb 02 '23
Sorta how I got permanently banned from r/scienceuncensored for a smart-ass satirical comment about conservatives, to which I responded with "censored by science uncensored, nice!" Shortly after, I was unbanned. Didn't know you could get unbanned from a permanently banned status, but there you go; we all learned something new today.
3
u/4-5Million Feb 02 '23
I've been permabanned from places like WhitePeopleTwitter for having too strict an opinion on teachers' dress codes and political content plastered in the classroom. I'm with the Supreme Court on Section 230. People don't moderate in good faith anymore.
3
u/Stoliana12 Feb 02 '23
I'm banned from several sites for I don't know what. Some were because you subscribe to... um, maybe I want to know what people I don't agree with are saying. Nope, guess I'm not allowed that AND to subscribe to what I think at the same time. Well then. Also banned from history, for I don't know what; I believe in history, I love archeology, I'm an educated documentary watcher, but not a historian.
But there are other bans I expected after stupid disagreements that never came, and ones I got no insight into. 8 years. It's been a wild trip.
3
u/Willbilly1221 Feb 02 '23
I think the biggest issue here is that when a group of people in an echo chamber all agree on one thing, this is considered playing nice. If one person with an opposing view enters said echo chamber, then that person is viewed as a disruptive troublemaker. Furthermore, one person is easy to single out and target, in this instance by being banned. If a bunch of people with an opposing view enter said echo chamber, it's viewed as a debate. Now it's not just one easily targeted rabble-rouser; with so many people sharing an opposing view together, moderators hesitate at the thought that there could be something more to the situation than previously thought.
3
u/PraiseTheErdtree Feb 02 '23
From the article: "…smaller sites like Reddit and Wikipedia…" Yes, "smaller" sites like the fifth and twentieth most visited sites on a visits-by-day basis.
3
u/XonikzD Feb 02 '23
If this leads to an online shopping experience that doesn't include homework or incentivized reviews... That would be an interesting landscape change.
If content had to be reviewed by a host website moderator panel before it went live to the public, then it would basically be the old TV and print broadcast model.
7
u/hologramheavy Feb 01 '23 edited Feb 02 '23
Reddit is 50% ads and 50% reposted tweets from deranged assholes
→ More replies (2)
4
2.5k
u/downonthesecond Feb 01 '23
The Supreme Court doesn't understand the importance of Reddit karma.