Who’s Watching the Watcher?

Amine Derkaoui, a 21-year-old Moroccan man, is pissed at Facebook. Last year he spent a few weeks training to screen illicit Facebook content through an outsourcing firm, for which he was paid a measly $1 an hour. He’s still fuming over it.

“It’s humiliating. They are just exploiting the third world,” Derkaoui complained in a thick French accent over Skype just a few weeks after Facebook filed for its record $100 billion IPO. As a sort of payback, Derkaoui gave us some internal documents, which shed light on exactly how Facebook censors the dark content it doesn’t want you to see, and the people whose job it is to make sure you don’t.

Facebook has turned the stuff its millions of users post into gold. But perhaps just as important as the vacation albums and shared articles is the content it keeps out of users’ timelines: porn, gore, racism, cyberbullying, and so on. Facebook has fashioned itself the clean, well-lit alternative to the scary open Internet for both users and advertisers, thanks to the work of a small army of human content moderators like Derkaoui.

“We work to foster an environment where everyone can openly discuss issues and express their views, while respecting the rights of others,” read Facebook’s community standards.

But walking the line between keeping Facebook clean and excessively censoring its content is tricky, and Facebook’s zealousness in scrubbing users’ content has led to a series of uproars. Last April, it deleted an innocuous photo of a gay kiss and was accused of homophobia; a few months before that, the removal of a nude drawing sparked the art world’s ire. Most recently, angry “lactivists” have been staging protests over Facebook’s deletion of breastfeeding photos.

Read the whole article at Gawker
