It seems to me that you're still ignoring the tail risk of authoritarian control of the concept of truth, namely, gulags.
It's a justified fear, since it's happened repeatedly in living memory, and is happening still: you're welcome to go hand out pamphlets about the June Fourth Incident on Tiananmen Square if you don't believe me.
If you want to discount that risk, ignoring it or glossing it as some sort of slippery slope argument, that's your business. I won't, and we find ourselves on opposite sides of the debate for that reason.
To be clear, I don't think it's a slippery slope, because I don't think it's an accidental or avoidable consequence of allowing authoritarian control of the terms of discourse. I think it's the expected outcome, and that people who think that end state can be avoided are being used by people who crave that power over others.
Well, we’re talking about Facebook, so I’m not worried about gulags. At least from Facebook itself. From the people who get their information on Facebook, though, maybe I should be.
I’m positing that in the long run it will be easier and less detrimental to society to establish some kind of editorial guidelines around misinformation and hold platforms accountable to them than it will be to educate enough people quickly enough to vet misinformation for themselves, especially as misinformation becomes increasingly hard to spot. I say this knowing full well that defining what misinformation is will be extremely hard, and that you’re putting a lot of power in the hands of whatever person or group does that. Every decision is a trade-off where you choose which benefits you think matter most, which problems you think you can best solve, and try to balance them. I think we can better solve the problem of effectively limiting the abuse of that power than we can the problem of limiting the broad exploitation of inherent human biases.
Edit: I should also note, I agree with a handful of the assumptions I listed above, specifically that democracies are built on the idea that people can broadly reach the right answer together. I still believe that. But I also believe in clearing out the brush and debris in the way so the crowd can actually use that superpower well.
I think I'm generally on your side of the debate, but are we really talking about authoritarian control of public discourse here? I thought we were talking about Facebook, a private corporation, deciding whether they want to allow their platform to be used to spread false and potentially harmful information.
Facebook is powerful, but it's not an authoritarian state. I'm not sure I see much danger in Facebook weighing in on different kinds of falsehood (i.e., your insight that things can be false in different ways). It seems more akin to a journal having standards for what it publishes, or HackerNews hiding or removing egregiously bad comments, than it does to the Chinese government jailing dissidents.
I see so much space between a journal and the CCP that Facebook is more like neither than like either.
The most relevant historical analogy to Facebook seems to be the phone company. Either Facebook is so large and influential that misinformation on its platform can threaten our political process, or it isn't, and clearly it's the former.
When Ma Bell was the only game in town, they weren't allowed to deny use of the network to the Communist Party, because that was clearly a violation of the concept of free speech.
I don't think the difference between a natural monopoly and a state monopoly has much relevance here. What does matter is that if you ban, say, a political party from Facebook, it absolutely cripples their ability to participate in the democratic process. That's too much power for me to simply shrug and say "their house, their rules".
It's a bad situation, and we should get out of it. And yes, Facebook itself isn't the Ministry of Truth, and doesn't have jackbooted security waiting in the wings.
But I firmly believe that the parties who are pushing for control over social media discourse absolutely want that end state, they are driven by power, and the only way to fight that outcome is to resist it early and often. Denying a victory here will spare expensive battles down the line.
As we are discovering, there is a similar tail risk to allowing people to lie. Facebook goes a step further and targets the people especially susceptible to a specific lie with more of the same type of lie. TBH my problem with Facebook is that it is a bullshit funnel into an echo chamber. If they removed those algos and showed people a varied mix of opinions and positions, the problem would largely go away.
Agreed. I stopped using Facebook when they stopped providing a chronological timeline of all my friends' posts. That was actually a useful service, but I guess it didn't drive as much "engagement".
Aren't you ignoring the fact that actual authoritarians are using the lack of moderation on FB to spread misinformation, incite populism, and take control? This has occurred in India, Sri Lanka, and Myanmar, and is, I would argue, currently occurring in the US.
I also used to be a free speech absolutist, but I'm not anymore since I realized that it doesn't actually result in the kind of educated discourse and debate where truth prevails (as libertarians seem to think it does) - it results in 8chan and QAnon. Human beings aren't rational, and are rife with cognitive biases towards bigotry and cruelty that can easily be hijacked by populists.