The laborers of content moderation on Facebook (wired.com)
101 points by InternetGiant on Oct 23, 2014 | 19 comments



What does it say that there's no commentary here? Collective guilt over the human cost of maintaining these services? I also wonder how a site like Ello would deal with this, if they became big enough to need this kind of content moderation.

I can't imagine having to look at the worst humanity has to offer for eight hours a day. It should merit hazard pay and unlimited psychological counseling at the very least.


Do we feel collective guilt over having paramedics who need to scrape dead toddlers run over by trucks off the road, or hospice nurses who spend their days changing the diapers of dementia patients? It's a shit job, but they're stopping everybody else from seeing the worst of humanity.


In Europe (and I am pretty sure in the US as well) paramedics and hospice nurses have access to psychological help and support if something on the job demands it.


Exactly. These content moderation jobs seem pretty harmless when compared to the real world.


> I can't imagine having to look at the worst humanity has to offer for eight hours a day. It should merit hazard pay and unlimited psychological counseling at the very least.

I doubt it's that bad. I think those people are on autopilot and just click "reject" all day long based on a few keywords and patterns.

I find it amusing when people on news forums think they're being censored for their supposedly controversial thoughts. What actually happened is that some poor worker who couldn't care less rejected their post at a glance.


I spent my morning bus ride wondering about this, and my conclusion is that there's not much to say ... I wouldn't want that job and I doubt that anyone else would either. The question of why we feel guilt, and/or whether we should, is a bit ambiguous:

1) If I'm viewing moderated content, how do I even know that? And should I feel guilty for only being shown that which meets certain guidelines? I'd say no. If I'm disturbed by something I see on the Internet, should I assume it's unmoderated content and/or somebody didn't do their job? I don't.

2) If I'm running a business and feel the need to provide moderators, should I feel guilty that they're exposed to the content that's deemed unfit? Assuming I'm paying them a wage that's mutually acceptable, I don't think so. If the wage weren't acceptable given the job, people would simply work elsewhere. If I'm seeing high turnover, that might indicate these employees decide the job isn't worth the psychological trauma only after they've spent time working. Should I help these workers maintain their sanity (through training, counselling, etc., as well as designing the systems to minimize the content's impact)? Absolutely! Should I pay them more than the market rate? That doesn't really fix any mental issues the job is causing; it just locks employees into a role they can't endure any longer.

3) If I'm posting content that needs to be moderated, should I feel guilty that someone has to moderate the garbage I'm spewing? Nope ... I'm in the five percent or so of the population that rarely considers the consequences of its on-line actions! In fact, I find it funny to cause other humans discomfort (excuse me while I go DOX the latest underage rape victim).

So the problem is that those who are guilty will never be stopped, and can't easily be blamed. Those in the middle have decided to apply quality standards (which might mean moderation) and are probably clinging to razor-thin profits (if they're not just burning VC money), and it's simply a business decision: do I make more with moderation or without? And those of us viewing any given web page don't know whether or not moderation has happened.

I think it's a lousy answer but my opinion is that there are a lot of jobs in this category - jobs that people don't really want to do. Does anyone really want to be a garbage-man? If you watch "Dirty Jobs" with Mike Rowe, I think you find quite a few that aren't desirable but are made viable by the market.

Sorry for the long non-answer!


> The Arab Spring was in full swing, and activists were using YouTube to show the world the government crackdowns that resulted. Moderators were instructed to leave such “newsworthy” videos up with a warning, even if they violated the content guidelines.

This seems related to the other front-page article on Google and geopolitics.


I wonder how hard it would be to implement automatic face masking for these videos, to increase the viewer's emotional distance from what they're exposed to.
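
A rough sketch of what I mean, using OpenCV's bundled Haar cascade face detector to blur each detected face per frame. The file names, blur kernel size, and detector parameters here are just illustrative choices, not what any real service does:

    # Sketch: blur detected faces in a video. Assumes opencv-python;
    # paths and parameters are illustrative only.
    import cv2

    # Haar cascade shipped with the opencv-python package.
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

    reader = cv2.VideoCapture("input.mp4")
    fps = reader.get(cv2.CAP_PROP_FPS)
    width = int(reader.get(cv2.CAP_PROP_FRAME_WIDTH))
    height = int(reader.get(cv2.CAP_PROP_FRAME_HEIGHT))
    writer = cv2.VideoWriter("masked.mp4",
                             cv2.VideoWriter_fourcc(*"mp4v"),
                             fps, (width, height))

    while True:
        ok, frame = reader.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        for (x, y, w, h) in cascade.detectMultiScale(gray, 1.1, 5):
            # Heavy Gaussian blur over each detected face region.
            frame[y:y+h, x:x+w] = cv2.GaussianBlur(
                frame[y:y+h, x:x+w], (51, 51), 0)
        writer.write(frame)

    reader.release()
    writer.release()

It would only help with faces the detector actually catches, of course, and faces are hardly the only disturbing element in these videos.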


Google did this in Street View for faces, house numbers, and license plates. But masking the face alone won't eliminate the negative effect.


This was an incredibly sad article. Traumatizing thousands of third-world (developing?) citizens to keep our Facebooks and YouTubes free from the depravity and horrors of humankind ... and, um, sex. If there ever was a need to effectively automate something, it is this, for the sake of these poor workers if nothing else. Replacing tens of thousands of workers with an algorithm would be insanely valuable as well.


>Replacing tens of thousands of workers with an algorithm would be insanely valuable as well.

You have to believe that a company like Facebook would much rather have an algorithm doing this than spend money on getting it done manually. The problem is that it isn't a feasible solution right now.


Here's a way for you to solve the problem. Offer those workers better pay and conditions than what Facebook is offering. So far nobody's stepped up to the plate.


Because that's as absurd as telling someone to solve the problem of slavery by simply starting to pay the workers. The problem is that in a market where products are cheaper because someone doesn't pay for the labor force, anyone who pays fair salaries will be at an economic disadvantage. The logic is the same if you substitute "slavery" with "bad conditions and poor salaries". When a problem is in the system, you can't solve it just by changing individual attitudes.


By all means, tell us how to solve it. Right now Facebook's the best job offer they've got; you can't just wish that fact away. I guess you could do what the GP wanted and create a competing product that drives them out of the job?

> The logic is the same if you substitute "slavery" with "bad conditions and poor salaries".

No, it's not the same. Slavery is an entirely different kind of bargaining position.


I would hope that these big services, like Facebook, use content matching techniques to quickly filter out repeat submissions...
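
Something like a perceptual hash would fit the bill. A minimal sketch using a difference hash (dHash) — one common near-duplicate technique, though whatever the big services actually run (e.g. PhotoDNA) isn't public, and the threshold below is just a guess:

    # Sketch: difference hash (dHash) for near-duplicate image
    # detection, using Pillow. One technique among many; the
    # threshold is illustrative, not tuned.
    from PIL import Image

    def dhash(path, size=8):
        # Shrink to (size+1) x size grayscale, then compare each
        # pixel to its right-hand neighbour to build a bit string.
        img = Image.open(path).convert("L").resize((size + 1, size))
        pixels = list(img.getdata())
        bits = 0
        for row in range(size):
            for col in range(size):
                left = pixels[row * (size + 1) + col]
                right = pixels[row * (size + 1) + col + 1]
                bits = (bits << 1) | (left > right)
        return bits

    def hamming(a, b):
        return bin(a ^ b).count("1")

    # Two images likely show the same content if their hashes are
    # close, e.g.:
    # if hamming(dhash("new_upload.jpg"), known_bad_hash) <= 5:
    #     auto_reject()  # hypothetical handler

That way a human only ever has to look at a given piece of content once, and resubmissions (or crops and recompressions of it) get caught automatically.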


Along similar lines, an excellent article on the gendered nature of content moderation work from earlier this year: http://thenewinquiry.com/essays/hate-sinks/


If anything, the upside is that if we come up with better techniques for detecting unwanted content, we'll free these people up for more productive endeavours.


Off topic, but I would kill to have a view like the one in the headline picture outside my office window.


Instead of discussing alternative ways the "moderation" could be done, how about a different idea: stop censoring the content in the first place.



