Google Sees the Light after Advertisers Pull Out

Consumer power may be an illusion online, but advertiser power is real. In theory, users can “vote with their feet” and leave a service they don’t like. This is often hailed as “competition is only one click away”. Except that slogan does not take into account how network effects shape life online. Metcalfe’s law says that the value of a network grows with the square of the number of its users, which is why online services are so focused on scale. The flipside is that the penalty for being outside grows with the value of the network. Try to quit Facebook and see how that hurts your real-world social life: invites, points of reference, jokes, even dates vetting you – all lost. (Here’s one story of how hard it can be.) The consequence: consumer power shrinks as the network grows.
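
For readers who want the math behind that claim, here is a minimal sketch; the worked numbers are illustrative and not from the original text. Metcalfe’s law puts a network’s value V in proportion to the square of its user count n, so doubling the user base roughly quadruples the value – quadratic growth rather than exponential:

```latex
% Metcalfe's law (illustrative sketch): network value V grows with the
% square of the user count n. Doubling n roughly quadruples V.
V(n) \propto n^{2}
\quad\Longrightarrow\quad
\frac{V(2n)}{V(n)} = \frac{(2n)^{2}}{n^{2}} = 4
```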

Advertiser power, on the other hand, is real. Google found that out the hard way last week when a host of advertisers pulled their ads from YouTube after finding that they appeared next to hate speech from sources such as “American white-nationalists, anti-gay preachers and radical Islamic groups”. As Google (which owns YouTube) relies almost completely on advertising for its revenue, it acted fast. Tougher community guidelines and more staff to police content were among the promises Google made as it apologized to its advertisers. Not only will hate-speech videos lose ad revenue; they may be taken down altogether.

Not a day too soon. The idea of radical freedom of speech online came to an end many years ago, as another famous internet law shows: Godwin’s law, which holds that any online discussion that goes on long enough will eventually see one participant compare another to Hitler. That was formulated in 1990, years before Google was founded or YouTube invented. In fact, every freedom-of-speech law anywhere places restrictions on such things as hate speech and death threats. The question is how anyone came to believe the internet would be any different.

So, good news that Google has finally decided to take some responsibility for the content it distributes. It is not “only a platform” that makes distribution available to third parties; it also monetizes the content they upload and uses its algorithms to promote it to viewers based on their past preferences. Some call that filter bubbles, but it can also be called curation. Or editing. Of course, traditional media has faced the same issues since day one, so what Google is finding out is much like what the press has spent a few centuries dealing with: how to balance the benefits of free speech against hate and threats. The answer has always been editorial accountability. Let’s hope Google has seen the light and can become part of the solution, setting an example for other online actors. Next, let’s hope it applies the same insight to neighboring issues, like privacy rights for individual users and protection for creators whose content is distributed against their will. That would make the internet platforms very helpful, not only to their shareholders and advertisers, but to the world. Silicon Valley loves to talk about changing the world. Right here is a good opportunity.

This is Netopia’s newsletter, March 27th 2017