Nothing remarkable in a company that makes lawnmowers.
Facebook is a company that uses its power to change what people think, because so many view it as a source of truth. Doing it for money and doing it for power are essentially the same thing. If you are in a position to manipulate the truth for your own benefit, or for your friends in high places, you will - and he is.
If I worked at Facebook and felt the CEO was twisting the truth for money, I'd leave - but the bribe to work there is a high one.
If Zuckerberg truly weren't a dictator wannabe, he'd give up voting control of the company, but of course he won't.
> Facebook is a company that uses its power to change what people think
I don't think that's a helpful way to look at it. Facebook is a platform that other people use to change what people think, and that's dangerous enough. Without any ill intent whatsoever from Facebook itself, that still creates serious issues with unintentional bias (e.g. via algorithms trained with poor data sets) and manipulation. Those are the problems we most need to address, because the number and power of the people trying to manipulate Facebook from outside totally dwarf the ability of anyone inside - including Mark - to do so even if they all wanted to. Framing it as "Facebook and all of its employees are evil" doesn't solve anything, and wouldn't even if it were true.
Except that Facebook's algorithms (like YouTube's, and I'm sure others as well) push people towards groups and sources with "high engagement", which often tend to be conspiracy theories and hate groups.
It's not that their algorithms are trained with poor data sets; it's that they're trained with good data sets, and those data sets show that this right-wing, radicalizing content has huge engagement numbers. Since engagement and time-on-site are surely core metrics for Facebook, promoting those groups fits exactly with Facebook's stated business goals.
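To make the mechanism concrete, here's a minimal sketch - the posts and engagement scores are invented, not anything from Facebook's actual systems - of what "rank by engagement" amounts to:

```python
# Minimal sketch (invented posts, made-up scores): a feed ranker that
# optimizes only for predicted engagement. None of this code is "biased";
# the ordering just reflects whatever the engagement signal rewards.
from dataclasses import dataclass

@dataclass
class Post:
    title: str
    predicted_engagement: float  # hypothetically learned from clicks/comments

posts = [
    Post("Local bake sale this weekend", 0.02),
    Post("Cute photos of my dog", 0.05),
    Post("THEY are hiding the truth from you!!", 0.31),  # outrage engages
    Post("You won't BELIEVE what this politician did", 0.24),
]

# The entire "editorial policy" is one line: sort by engagement.
feed = sorted(posts, key=lambda p: p.predicted_engagement, reverse=True)

for post in feed:
    print(f"{post.predicted_engagement:.2f}  {post.title}")
```

Nobody has to intend the outcome: if outrage reliably produces the highest numbers, that one sorted() call promotes it, and the metrics dashboard reports success.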
Facebook needs to be political, because they're destroying society and democracy for profit. In the early days they could have been forgiven for not realizing this, but over the last five years I think it's become incredibly obvious that it's happening, and Zuck is doing everything he can to shirk responsibility for it.
While other people post the content, which posts get nudged higher up your feed or further down the list is definitely in Facebook's control. They are driving public opinion based on what is promoted or demoted. I wouldn't be surprised if the Zuck personally decides which posts are promoted, especially when they come from important people. It's too much power to just give up to others, and Facebook has never had noble intentions.
Not really. Even the people who train the models and code the algorithms don't have a lot of control over how they respond to ever-changing inputs - and that's kind of the real problem IMO. There was a recent thing on Twitter where someone discovered that its cropping algorithm would consistently generate thumbnails focused on white faces when both white and black faces were present. At a gross level this is clearly because of inadequacies in how that algorithm worked, but literally nobody could say exactly what was wrong or how to fix it (especially without introducing new kinds of bias). Facebook has the same kind of problem, at approximately 10x Twitter's scale.
It's a machine that nobody knows the workings of anymore, and therefore nobody knows how to fix. Mark can influence it in vague and general ways, I suppose, but there is no fine control, and often these gross adjustments bring their own unintended consequences.
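The cropping case is a decent illustration of where the opacity actually lives. A hedged sketch - the saliency_model here is a random stand-in, not Twitter's real network:

```python
# Sketch of a saliency-based thumbnail crop (model is a random stand-in).
# The crop logic is a few auditable lines; any bias lives in the opaque
# learned weights hiding behind saliency_model.
import numpy as np

def saliency_model(image: np.ndarray) -> np.ndarray:
    # Placeholder for a trained network that scores each pixel's
    # "interestingness". In production this is millions of weights;
    # no single one of them encodes "the bias".
    return np.random.rand(*image.shape[:2])

def pick_thumbnail_center(image: np.ndarray) -> tuple[int, int]:
    """Center the crop on whatever the learned model scores highest."""
    saliency = saliency_model(image)
    y, x = np.unravel_index(np.argmax(saliency), saliency.shape)
    return int(y), int(x)

image = np.zeros((400, 600, 3))
print(pick_thumbnail_center(image))
```

Every line you can actually audit is innocuous; the behavior people objected to comes entirely from the learned scores, which is why "just fix the algorithm" has no obvious target.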
If we don't understand how something works and it seems to be causing negative effects, perhaps we shouldn't be using said algorithms in the largest social network.
I'm not actually going to disagree with that. It's a perfectly valid point of view. However, I will also add this thought, which I also expressed internally.
Neutrality doesn't just happen. No matter how neutral a system is at one point in time, people will continue gaming it and it will cease to be neutral. There must be continuous active response to maintain that neutrality. This is a lesson Google learned with SEO long ago. Passivity is not the same as neutrality, and insisting on one often makes the other impossible.
That's not meant as a refutation of your point, but perhaps it's worth some time to think about how a "no algorithm" version of Facebook would actually play out. Here's your starting point: Twitter but 10x as large.
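As a toy illustration of why a passive system isn't a neutral one, take a feed with no ranking logic at all (the posting rates below are invented):

```python
# Toy simulation (posting rates are invented): a purely chronological
# feed with no ranking whatsoever. The "neutral" rule still hands the
# feed to whoever games it hardest - here, simply by posting more.
import random

POSTS_PER_DAY = {"ordinary_user": 2, "spam_operation": 60}

def chronological_feed(feed_length: int = 20) -> list[str]:
    posts = [(random.random(), author)           # random timestamp in the day
             for author, count in POSTS_PER_DAY.items()
             for _ in range(count)]
    posts.sort(reverse=True)                     # newest first, nothing else
    return [author for _, author in posts[:feed_length]]

feed = chronological_feed()
share = feed.count("spam_operation") / len(feed)
print(f"spam_operation fills {share:.0%} of the visible feed")
```

Pushing that back down takes exactly the kind of continuous, active countermeasures I described - at which point the system is no longer "passive", only (hopefully) neutral.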
> Neutrality doesn't just happen. No matter how neutral a system is at one point in time, people will continue gaming it and it will cease to be neutral. There must be continuous active response to maintain that neutrality.
This is a scary thought - an algorithm that "decides" what neutral is, and pushes all its users towards its predetermined neutrality.
All of this talk is nonsense if we can't define neutral - which we can't. If Facebook is going to be an arbiter of that, they need to be responsible for the content because they choose what content is shown and when - if it was just a timeline feed, they would bear no responsibility.
Facebook wants to make all the money and not scare off customers - a clear conflict of interest.
> if it was just a timeline feed, they would bear no responsibility.
Did that work for Twitter? They've hardly been immune from criticism either, despite being ~1/10 the size of Facebook and being favored by the ruling party in the US. Chronological timelines aren't silver bullets. They're only chronological among that which is shared, and there are still huge differences in how much content gets shared/liked/retweeted/whatever. Those are the differences that people will exploit, leaving things only "neutral" in a narrow technical sense that has nothing to do with effect. It's naive in the extreme to think they actually solve the problem.
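A quick back-of-the-envelope, with assumed follower counts and reshare rates, shows how unequal "chronological" reach gets:

```python
# Back-of-the-envelope (all numbers assumed): reach under simple reshare
# dynamics. The timeline stays perfectly chronological; reach does not.
def total_reach(followers: int, reshare_rate: float, hops: int = 4) -> int:
    reach, wave = 0, followers
    for _ in range(hops):
        reach += wave
        wave = int(wave * reshare_rate * 50)  # assume ~50 followers per resharer
    return reach

print(total_reach(1000, reshare_rate=0.01))  # mild post:    1,875 people
print(total_reach(1000, reshare_rate=0.05))  # outrage bait: 25,375 people
```

A 5x difference in how share-worthy a post is turns into a ~13x difference in reach after a few hops, and it keeps compounding; "chronological" says nothing about that.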
My point isn't that a chronological timeline fixes the problem of social media. My point is that if you are going to decide what is shown, you are making a choice, and should be responsible for that choice. It is no longer just the user's post - because you control its distribution, you are responsible for it and how it spreads. It becomes "your" speech: in this case, the algorithm's, and its company's.
So you're suggesting that Facebook should try to absolve itself of responsibility, but not actually solve the problem? No offense, but that seems exactly opposite to where you started. We're done.
I think you're missing my point. I'm not saying that Facebook should give up. Everyone there should make every effort to improve. My point in bringing up complexity is that it's going to take a lot more than Mark waving his hand and magically making everything better. This isn't some video game where one player hits a button and thousands of minions instantly rearrange themselves and all of their actions are resolved in milliseconds. Anyone who knows anything about complex systems - especially those involving people - knows that's not how they work.
In reality, no matter what leadership says or does, actual change will still require continuous effort from literally thousands of engineers, data scientists, and others. It will take time, as all such things do. Petulantly demanding that things happen faster than they can happen isn't going to make it so.
> I don't think that's a helpful way to look at it. Facebook is a platform that other people use to change what people think, and that's dangerous enough.
I'm not sure there is a distinction. Intent doesn't matter. If your system is used by other people to do X, then you are building a system that does X. The Purpose Of A System Is What It Does [1]. You can't just wash your hands and say "hey, it's just a platform, it's really other people doing this stuff!" Knowing what their system does, they wake up every day and with intent say "We are going to continue with this system".
I'm not washing my hands of anything. I'm not denying there is a problem. I'm trying to identify the nature of the problem, because that's important to finding solutions.
> Knowing what their system does, they wake up every day and with intent say "We are going to continue with this system".
No. They don't. Please don't pretend to read others' minds. There are a lot of people at Facebook who are trying to improve these things, but it's a very complex problem and a very complex system. There's lots of disagreement about what the solutions are, or even which direction represents improvement. And mindless bashing just doesn't help. If your only answer is that Facebook should die, then you might as well just be going "bla bla bla" because that doesn't move the needle at all.
You might notice I'm not bashing, and haven't made a moral "should or shouldn't" argument. I'm simply saying that a complex system is what it does, regardless of the intent of the people building it. You can't disentangle Facebook the platform from what its users use it for. It's all one system that does X. If Facebook the company is trying to change that X to something better, they should be supported.
Are you arguing that Facebook should be a website where people don't try to change each other's minds about things? It is absurd that on a website devoted exclusively to commenting on links there are people seriously arguing that people changing each other's minds via debate and links is dangerous.
No, I'm not saying should or shouldn't. I am narrowly pushing back on OP that what Facebook is cannot be disentangled from what its end users do. If we agree Facebook's users collectively do X, then we must agree that the purpose of Facebook The System is to do X. I'll let other people discuss whether that X is good or bad.
I've seen a fair bit of criticism of Zuckerberg, but all of the criticisms would probably be more true if there were a CEO appointed by a profit-maximizing board instead.