Because the claim by the OP was that we should remove 230 protections for algorithmically curated content, saying that those who run the algorithms don't deserve those protections.
Comments
I did not say that. I said they should have reduced liability protection, or that it should be clarified when a company would lose the protection. Even now, Section 230 doesn't provide complete liability protection.
It’s a bit disappointing you have been so hostile towards me, honestly.
I've been debating poorly thought out ideas for 230 reform for decades, so I get a little snippy at plans that would create all sorts of real-world problems.
Good question. In my hypothetical, the service provider could offer algorithmic feeds as long as they are optional and not the default.
The key here is that a social network which implements a non-optional filter is exercising editorial control -- precisely the thing 230 was *not* designed for.
There are so many "opt-ins" that people routinely accept while onboarding, I doubt the change you suggest would have much effect on the posts served to the vast majority of users.
It would basically be like the "click here to opt-in to tracking cookies" nag buttons.
That's... incorrect. Section 230 is literally designed to protect sites "exercising editorial control" over 3rd party content. That's the whole point of the law. The Barnes decision made this clear: 230 protects the things a publisher WOULD do, granting immunity when the content is 3rd party.
Automatic feeds would be riskier under that model (and depending on definitions, a firehose feed could be very risky) but they could make feeds for content they have vetted.
Would probably make more sense to just put weakly filtered feeds behind disclaimers instead
They do, insofar as the service provider must implement some kind of API that allows users to write and upload their own content filters. But they do not require, default, or recommend the use of any particular feeds. I think it would be possible to write the law in a way that protects feed authors.
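To make the "user-authored filter" idea concrete, here's a minimal sketch of what such an interface might look like. All names here are hypothetical illustrations, not Bluesky's actual feed-generator API; the point is only that the default is an unfiltered reverse-chronological feed, and a filter runs solely when the user opts in:

```python
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class Post:
    author: str
    text: str
    timestamp: float

# A user-authored filter is just a predicate over posts.
FeedFilter = Callable[[Post], bool]

def build_feed(timeline: list[Post],
               opted_in_filter: Optional[FeedFilter] = None) -> list[Post]:
    """Default behavior: the plain reverse-chronological timeline.
    A filter is applied only if the user has explicitly opted in."""
    posts = sorted(timeline, key=lambda p: p.timestamp, reverse=True)
    if opted_in_filter is None:
        return posts
    return [p for p in posts if opted_in_filter(p)]

# Example: a user opts in to a filter hiding posts that mention "hogs".
no_hogs: FeedFilter = lambda p: "hogs" not in p.text.lower()
```

Under this model the provider hosts the filter-upload API but never defaults or recommends a particular filter, which is the distinction the hypothetical law would turn on.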
I feel the need to point out that if you go looking for a Feed, the listing of Feeds is algorithmically sorted. So are the search results when searching for Feeds.
It would be quite the mess if such a hypothetical rule required that a node providing open feeds determine whether another node belongs to a single user (potential liability) or is a corporate node (no direct liability; the corporate node takes on the responsibility to filter for its own users).
This kind of cheap dunk doesn't help anyone. OP's proposal was oversimplistic, but I think there are some useful ideas there, and no one else seems to be seriously engaging with the argument that it's a problem for corporate editorial control to be protected from liability under 230.
Of course. Discussions around section 230 suck because they are filled with people who just want to tell you how much smarter they are than you. I honestly regret ever responding, because little of value has come out of it.
i dunno i think it would behoove The Discourse™ to focus not on "algorithms" per se (tautological, pleonastic) but rather the content of their behaviour and the extent that they can be described (eg "reverse chronological" the name *is* the description) vs, say, a given entity's trade secret
I agree that "algorithm" gets badly overloaded here and I don't think anyone should get too hung up on that word.
My issue is that social networks which boost or limit content in non-optional ways are acting in many ways as publishers, which violates the original premise of Section 230.
If Bluesky decides that x_HitlerIsAwesome_x should be banned, that's a non-optional limit of that content. If they show me a certain post eight times because eight accounts I follow reskeeted it, that's a non-optional boost. Right?
Because, my understanding is, without 230 they could be liable for banning x_HitlerIsAwesome_x but not XHIAX, unless they banned nobody, which means a feed that hides posts with hogs, frogs, and logs is also potentially a problem if it shows me some but not all "thing I could sue about" posts.
sure, an entity saying "we are going to spend resources arranging our information system so you see what we want you to see" is pretty unambiguously an editorial intervention, but yeah, the "non-optional" aspect is the critical one there
I would like to see the whole of social media discussed in terms of the business it actually is, namely the business of selling attention and data about people, not "free speech." It makes sense to devise rules around what you are allowed to do in the selling-attention business.
I don't think that was the claim by OP: the claim was about limiting the protection. There are a lot of reasonable ways to limit the worst types of algorithmic curation without opening up the entire internet to liability.
Surely if we make a law people will respond to it by doing only the things we want them to do, rather than the things the law allows them to do if they look at it under the right light and squint a bit.
My experience with section 230 is that all discussions around it online suck and no one believes it can be updated. Which has gotten tiresome.
I'm suggesting that if the policy *explicitly excluded* user-generated content feeds then it would maintain liability where it belongs.
"AOL falls squarely within this traditional definition of a publisher and, therefore, is clearly protected by § 230's immunity."
All of them different.
Oh I don't envy anyone trying to explain the differences :-)