Off with their ’eads, Facebook!

When I was growing up in the Eighties, snuff movies were the stuff of urban legend in the schoolyard. As with WWF wrestling, some argued they weren’t real, some lied about having seen them, and some might actually have come across a copy of the banned Faces of Death videotape, which featured alligator accidents and fatal falls from tall buildings. The myth was certainly greater than the actual content. Today’s schoolkids don’t have to debate the existence of snuff movies: videos of real killings are readily available both on file-sharing networks and on more legitimate platforms like YouTube. Anyone who likes can watch an actual decapitation at any time. On-demand, real-life depictions of lethal violence.

Facebook has changed its content policies and now joins the ranks of those services that accept extremely violent videos uploaded by users. There is only one caveat: the video has to be uploaded with the purpose of criticising the action. To be sure, it is the Facebook user who does the actual uploading and who supposedly must convince Facebook that the objective is not schoolyard mythology of the kind from my childhood but important social critique (presumably of the horrible practice of decapitation). Looking beyond the ambiguity of verifying that purpose, the philosophical question of whether the end justifies the means (is the video still okay if other users link to it with less morally appropriate comments?) and the practical matter of monitoring these shades of grey, this puts the spotlight on a very challenging topic: curation, editorial responsibility and intermediary privilege.

Most internet companies will claim that the technology is neutral and that users are responsible for how they use the service. When asked to forward information to those users – whether it be Google, your broadband carrier or The Pirate Bay – the answer is always that such a thing would be a violation of privacy. This is a – vulgar – interpretation of the safe harbour principle, or intermediary privilege if you will. But when these services go beyond providing the means and begin to interfere with the content – be it by banning certain types of content or by shaping data traffic to increase revenue – they enter the domain of media companies: selecting, filtering, curating, taking it upon themselves to decide what is appropriate for distribution.

Media, however, is closely regulated across the world in terms of editorial responsibility (and in many other ways, not least regarding advertising), whether in law or through self-regulation (in most cases a combination of the two). The assumption is that you cannot have mass publication without responsible editors. This is a good principle that has contributed greatly to an informed public and to the political debate that is fundamental to democracy, at least in most Western democracies. There is no reason to believe that the internet makes this any different; quite the opposite. The examples of problematic mass publications on social and online services are numerous.

Internet services have a choice: they can be responsible editors, in which case they must abide by rules similar to those that govern media companies. Or they can be infrastructure providers, in which case they must allow for contact with their users (or at least relay communications) and must not interfere with the traffic except for reasons of network integrity (viruses and the like). The details of both avenues can be discussed, but what such companies cannot do is be media sometimes and infrastructure other times, as it suits them. It is – to borrow a word from digital technology – a binary choice. Under either option, the rules must be transparent and subject to outside scrutiny; the practice of treating user agreements as a reasonable replacement for democracy is passé, if it was ever plausible. If Facebook prefers to be a media company, that is fine, but in that case it cannot decide the rules itself: that has to be done by a third party – a self-regulatory body or a government – and the rules have to be completely transparent and open to discussion. No serious media company would ban pictures of nursing mothers but allow decapitation videos.

If anyone ever really bought the line that technology is neutral, this case should convince the last believers of the opposite. The views and convictions of those who develop and control technology are – articulated or not – the basis for its design choices.