Hacker News

I'm not on Google's side here, but... as nice as your sentiment is, there are plenty of real problems that have to be addressed.

Who defines, and what are the definitions of "Large" and "General Purpose" in this context?

Am I allowed to moderate content as I see fit under my ToS at X users, but at X+1 users become forced to publish all content that is "not otherwise illegal"?

Regarding what is "not otherwise illegal": which laws from which country apply?

How does one combat things such as excessive spam? If I am forced to allow any content that is "not otherwise illegal", and my service becomes literally unusable because someone posts 10,000 cat pictures per second (cats aren't illegal, so I'm forced to distribute the cats), what is my recourse? Am I still allowed to rate limit? Because that, boiled down, is censoring a user's ability to post "not otherwise illegal" content.
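(To make the rate-limiting point concrete, here is a minimal token-bucket sketch. All names and parameters are illustrative, not taken from any real platform; even a simple limiter like this silently drops legal posts once the bucket is empty, which is exactly the tension described above.)

```python
import time

class TokenBucket:
    """Illustrative token-bucket rate limiter: allows short bursts,
    then throttles to a steady refill rate."""

    def __init__(self, capacity: float, refill_rate: float):
        self.capacity = capacity        # maximum burst size, in tokens
        self.refill_rate = refill_rate  # tokens added per second
        self.tokens = capacity
        self.last = time.monotonic()

    def allow(self) -> bool:
        now = time.monotonic()
        # Refill proportionally to elapsed time, capped at capacity.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.refill_rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True   # post accepted
        return False      # post dropped (legal or not)

# Hypothetical policy: bursts of up to 5 posts, then 1 post per second.
bucket = TokenBucket(capacity=5, refill_rate=1.0)
results = [bucket.allow() for _ in range(10)]
```

In this sketch the first five posts in a rapid burst are accepted and the rest are rejected regardless of their content, which is the sense in which any rate limit is already a content-neutral form of refusal to publish.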




The courts decide what is a large platform. And they decide what is "not illegal".

In Germany, there have actually been cases where courts ruled that Facebook is not allowed to delete non-illegal posts:

https://www.heise.de/newsticker/meldung/Oberlandesgericht-Me...

One big part of the judges' argument was based on contract law. When a user makes a Facebook account, that creates a contract between the user and Facebook. The user agrees to share their data for ads and such, and Facebook agrees to publish their posts. Thus they have to do that; they cannot simply decide not to publish some posts, as that would be a contract violation.

Just like when you order a pizza with 10 toppings: the pizza service cannot deliver a pizza with 8 toppings and still expect you to pay the full price.

And as a second part, the court ruled that a clause like "we choose which posts to publish" would be invalid in the contract, because a contract must be fair to all involved parties and must not violate basic rights like freedom of speech.


My post was meant to be more rhetorical in nature, to demonstrate that overzealous censorship is not an issue which can be solved by simply stating "force all non-illegal content to be/remain published".

It's easy to hand-wave it to the courts, but there are many issues with that approach as well (e.g. non-technically competent people making decisions about technology, government being immune to neither corruption nor censorship, courts being slow by nature).

Your answer also conveniently side-steps the nitty-gritty implementation issues. How does a platform deal with spam posts if it is unable to delete content that is legal? If you don't like a website, simply post a few thousand Viagra advertisements to their front page for a few weeks, and they either have to advertise Viagra or shut down.

Speaking of advertising, why bother paying for ad space on a website like Reddit when I can just spam my product in every subreddit over and over again, and they have no choice but to keep it up?

Will these rules also apply to other forms of media with a viewership above the arbitrary "large" line? Newspapers, books, TV? If not, why not?

These are just a few off-the-cuff issues that come to mind. I'm sure people smarter than me who take the time to seriously consider this approach will find hundreds of such examples of why "forced to publish all non-illegal content" sounds great but is simply not a feasible solution to the problem.

As an aside, I find it odd that the answer to censorship by private companies is to offload everything to the largest centralized system with the longest history of censorship: governments.


Why would we ask the courts to decide what is large? What if I don't like their answer? Surely this is a legislative question.




