
> a large human moderation team

You're not the only person who has suggested this. Let's think about that for a second. Say it takes 6 minutes for a moderator to review an older account. There are 2 billion accounts, and you'd want to review all of them, so that's about 200 million hours. Presumably you'd also want positive cases re-reviewed so that no single moderator has too much power, which adds more time. Even if Facebook literally doubled its headcount and hired 50,000 people overnight, the review would still take about 2 years to complete, and in that time previously benign accounts could turn abusive.

And then think about the 20-million-odd new accounts that are created every day. How long before each of those is reviewed? And what signals would you use to review them? These are mostly empty accounts, so there's not much to go on.
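For concreteness, here's the back-of-envelope arithmetic as a quick Python sketch. All figures are the ones quoted above, not official numbers, and the 8-hour shift is my own assumption:

    # Rough moderation-capacity math using the figures from this thread.
    ACCOUNTS = 2_000_000_000   # existing accounts
    REVIEW_MINUTES = 6         # minutes to review one older account
    NEW_PER_DAY = 20_000_000   # new accounts created daily

    # One pass over all existing accounts:
    backlog_hours = ACCOUNTS * REVIEW_MINUTES / 60
    print(backlog_hours)       # 200,000,000 hours, i.e. ~200 million

    # Keeping up with the daily inflow alone, assuming 8-hour shifts:
    inflow_hours_per_day = NEW_PER_DAY * REVIEW_MINUTES / 60   # 2,000,000 hours/day
    print(inflow_hours_per_day / 8)                            # 250,000 moderators, every day

And that's before any re-reviews of positive cases.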

And that's just the problem of aged fake accounts. What about bullying, harassment, nudity, terrorism, CEI (child exploitation imagery) and all the other problems?

It's interesting talking to people who say "oh that problem is easy to solve, just do X" without realising that the problem is more complicated than it looks.



> It's interesting talking to people who say "oh that problem is easy to solve, just do X" without realising that the problem is more complicated than it looks.

At no point did I state that the solution was easy. My response was to your claim that you don't know of any possible solution, easy or not; to wit, you invited input:

> Maybe it’s possible but I don’t know how. If you have ideas on this, please share.

I also don't follow your examples. Why are you tasking this hypothetical team with reviewing all two billion accounts? The main issue at hand seems to be a lack of sufficient staffing to review reported accounts. Why not start there?


Facebook can afford to hire 100,000 moderators. That would let them review every account every year: 100,000 moderators * 2,000 hours a year is 200 million hours. They don't actually need to review every account, so they could have multiple moderators review some accounts instead.


That 2,000-hour figure accounts (approximately) for two weeks of annual leave, but doesn't budget for illness or other factors. I normally go with roughly 200 work days (socialist Europe here, so I start from a baseline of five weeks of annual leave), which gives 1,600 worked hours per person and takes it to 160 million hours. Still, plenty.
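As a sanity check on both sets of figures (a sketch; the hours-per-year numbers are the assumptions stated in the two comments above):

    MODERATORS = 100_000
    HOURS_GENEROUS = 2_000   # ~50 weeks x 40 hours (two weeks of leave)
    HOURS_REALISTIC = 1_600  # ~200 work days x 8 hours (five weeks of leave, illness, etc.)

    print(MODERATORS * HOURS_GENEROUS)    # 200,000,000 hours/year: one 6-minute pass over 2B accounts
    print(MODERATORS * HOURS_REALISTIC)   # 160,000,000 hours/year: ~1.6B reviews at the same pace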


Facebook shouldn't be expected to clean up the messes they create because it's hard?


They didn't say that: none of their comments seem to be defending Facebook. They are giving their opinion that human moderation is not a simple solution. I super appreciate nindalf's comments here. It's a shame that an ex-developer who knows the problem space and is clearly explaining some of the issues is getting flamed by association.


Sincere question. Not flaming.

If human moderation won't work, and whatever they're doing now is an unqualified disaster, then what is the solution?

Oil companies tell us that oil spills and pollution and ruined ecologies and the burning planet are just part of using oil. Sad face.

They're doing their very best to minimize the negatives. They hire the very best lobbyists and memory hole as much as possible and donate to some zoos and greenwash and "recycle".

What more could they possibly do?

Really, what do you expect? Stop using oil?! Please. That's crazy talk.

More seriously, I'm not saying that Facebook is an unmitigated evil, that their biz is the moral equivalent of trafficking (humans, arms, drugs, toxic waste), or that humanity would be better if it had never existed.

I'm only asking why they continue to create a mess that they are incapable of cleaning up, by their own admission.

--

I understand these questions are for The Zuck, The Jack, and their cast of enablers (profiteers) like Thiel, Andreessen, etc.


If you're not morally comparing Facebook to oil companies and toxic polluters, why do you constantly analogize Facebook to them, describing their memory holes, ruined ecologies, and the burning planet as if they're comparable to what Facebook is doing? Where does "unqualified disaster" even come from?

Toxic waste companies engaged in the conduct you described, had the impact you described, and were condemned after we found solid evidence that they were doing so. Do you have any argument or evidence whatsoever that Facebook is behaving similarly? If you want to argue the moral equivalence, argue it. Don't spin an evocative narrative about the disasters of the oil industry in the same breath as Facebook's moderation policies and then disclaim it with "I'm not saying that...".


This doesn't sound like an incredible burden. They're apparently adding twenty million accounts a day; if they need to slow down, they can.



