People are awful. It’s the obvious (but no less depressing) reality that Facebook—and especially Mark Zuckerberg—somehow failed to recognize.
Whether it was a naive belief or a negligent assumption, the fact remains: Zuckerberg built Facebook's content control systems on the core idea that people can police themselves. Now he's been scrambling for months, if not longer, doubling down and crafting ever more rules to herd his trolling-prone cats. Of course, it's been a dismal failure.
The hundreds of pages of Facebook moderator guidelines uncovered by The Guardian are a stunningly analog solution for a digital company as advanced as Facebook. They're also the clearest indication yet that Facebook is just making this up as it goes along.
What's also clear is how deeply ineffectual these guidelines are—not just for Facebook's army of moderators, but for the two billion users who rely on them to keep their feeds scrubbed of the most disturbing content.
And that content’s disturbing.
Reading through the guidelines for Graphic Content, Revenge Porn, and Child Sexual Abuse, it's hard not to be struck by Facebook's plodding attempts to identify what is and isn't objectionable, as well as the base nature of the examples.
Much of what appears in these stark, black-and-white slides is drawn, it seems, from Facebook itself….