Have You Met a Whistleblower?

Today, November 8, Frances Haugen will testify before the European Parliament’s IMCO Committee. Ms Haugen is the latest Facebook whistleblower to cry foul over the platform’s conduct.

Will this be the moment when Facebook steps up to its responsibilities as an intermediary? Or will there be yet another round of symbolic action? European policy-makers hold the keys to holding platforms to account.

This is not the first time Facebook has been in the spotlight; it’s become something of a habit.

In 2018, Cambridge Analytica was revealed to have harvested user data for political influence, using Facebook precisely as it was designed: to micro-target users.

Facebook then failed to intervene when its services were used to broadcast hate propaganda that contributed to the genocide of the Rohingya population in Myanmar.

Last year, former employee Sophie Zhang published information showing that Facebook knows its services are used for abuse and propaganda, and that it deliberates internally about whether to intervene in certain cases. Those deliberations are a far cry from “it’s the algorithm, we don’t know what it does”. These are mere examples of scandal, harm and negligence. It would be remiss not to mention the storming of Capitol Hill, the murder of British MP David Amess, or the Christchurch massacre when discussing why there is so much to blow the whistle about.

“It may not be your fault.
But it’s your problem.”
– Steven Levy

In their 2021 book An Ugly Truth (HarperCollins), journalists Cecilia Kang and Sheera Frenkel cover the controversies surrounding Facebook over the past half-decade. The back cover is pure genius: rather than the traditional blurbs bragging about how great the book is, it carries a list of quotes from Facebook founder and CEO Mark Zuckerberg and chief operating officer Sheryl Sandberg! Like these two:

“We never meant to upset you” – Sheryl Sandberg, July 2014

“I ask forgiveness and I will work to do it better” – Mark Zuckerberg, September 2017

Besides such vague statements, Facebook’s actions have included hiring a few thousand more moderators and setting up the “oversight board” – a body employed by Facebook with the power to criticize interventions where content is removed or users are restricted, but not cases of non-intervention.

Of course, this is not real self-regulation: proper self-regulation has transparency, independence and teeth. This is well understood in many industries – news, advertising, games and so on – and there is no reason it could not work in social media.

Facebook’s reluctance to take real action has little to do with some mysterious algorithm operating beyond the control of any human mind; it is rather a question of ideology on Facebook’s part. Consider what Facebook Vice President Andrew “Boz” Bosworth wrote in 2016, in an internal memo appropriately titled “The Ugly”:

So we connect more people. That can be bad if they make it negative. Maybe it costs someone a life by exposing someone to bullies. Maybe someone dies in a terrorist attack coordinated on our tools.

As the company develops the new, three-dimensional internet – the “metaverse” – these questions will only become more prominent.

Wired Magazine’s Steven Levy said it best: “It may not be your fault. But it’s your problem.”