Opinion

A Facebook logo is shown in a photo illustration taken March 25, 2020. Dado Ruvic/Reuters

Taylor Owen is the Beaverbrook Chair in Media, Ethics and Communication and the director of the Centre for Media, Technology and Democracy at McGill University. A version of this article was first published by the Centre for International Governance Innovation (CIGI).

The growing boycott of advertising on Facebook, which now includes Unilever, Coca-Cola, Starbucks, Ford, HP and Adidas, is an unambiguously positive development. Last year, Facebook made 98.5 per cent of its US$70-billion in revenue from ads, so there should be no doubt that this is getting the company’s attention.

The boycott is happening because of a confluence of events. U.S. President Donald Trump’s incitement of violence and racism has demonstrated the flaws in Facebook’s approach of exempting political ads from its content moderation policies. The Black Lives Matter movement has highlighted the harms caused by racist speech that circulates widely on the platform. And the coronavirus has shown us the collective costs of false information circulating widely in our public sphere.

In response to these events, a wide range of social-media companies, including Facebook, have made changes in how they treat hate speech, misinformation and radicalization on their platforms, doing things they have long denied were necessary or even possible.

But we should be cautious about outsourcing the governance of speech in our society to either a global platform company that sits largely outside of our jurisdiction, or to the whims of brands who momentarily see their interests aligned with the public good. That a boycott is even necessary is not a success of self-governance, but a failure of democratic governance. The most meaningful response to this problem would be for governments to develop new competition, data and content policy suited to the digital world.

First, without better competition policy, this moment won’t last. Time and time again, after each scandal that has hit Facebook, the market has sent a clear signal: Users and advertisers always come back. Mark Zuckerberg knows this, recently telling employees that he expects boycotting advertisers to “be back on the platform soon enough.”

The reason is that we largely have a digital ad duopoly. If you want to place a sophisticated microtargeted ad on the internet, you effectively have two options – Facebook and Google. Only they have the massive data collection capacity and reach needed to make this type of product effective. The two companies earn more than 50 per cent of the world’s online ad revenue, with the only other global competitor being Amazon. Hardly a dynamic market.

Monopolies are market failures, and it is up to governments to correct them through a range of competition policies, including antitrust, merger and acquisition restrictions, and enforcement of broad consumer harm protections. If we want the market to have any chance at correcting the problems being flagged by the boycott, then we need to make sure that advertisers and users have more choice. This will only happen if there is real competition. Publicly traded private monopolies do not self-regulate. This is why we have governments.

Second, without far better data privacy and rights, we are missing the root cause of this problem: On platforms such as Facebook and YouTube, it is largely automated systems that determine what we see and hear and whether we are seen and heard. Algorithms decide what speech is amplified and what is not. They decide what videos we are presented with and recommend what private groups we should join. They decide what ads we see and how our behaviour can be nudged by subtle messaging.

The primary motive of the business model of social platforms is to keep us on the sites, and so these algorithms are calibrated for engagement. And it turns out that what engages us is not always aligned with what informs us or brings us together. Instead, false, divisive and hateful content too often drives us to click, comment and share – the very things that make platforms money. As a human-rights audit of Facebook released this week found, these algorithms – essentially the engine of the platform ecosystem – fuel extreme and polarizing content.

How these algorithms work, and how they collect the data that drive them, is entirely opaque and unaccountable to public scrutiny. Governments must adapt their data privacy laws to the digital world, empower their privacy regulators, and force transparency and accountability over the technologies that determine the character of our digital public sphere.

Third, democratic governments need to engage in the difficult and fraught debate about content moderation. We are delegating the governance of speech to a small number of companies responsible for moderating the world’s communication. No matter how Facebook, Google, Reddit and TikTok respond to this boycott, we should be wary of their decisions shaping what we can and can’t say. What speech we allow in our society is a core function of democratic governance – one that is riddled with trade-offs and tensions, historical and cultural context and the nuances of languages. And because of this we need citizens and governments to be part of the decision-making process.

At its core, this debate presents a tension between the value of free speech and our responsibility to protect people from the harms of that speech. A global company sitting outside of our jurisdiction, trying to apply norms to more than a hundred billion acts of speech a day, has neither the democratic accountability nor the legitimacy to reconcile this issue. Even though this debate is hard for democratic governments and they are understandably hesitant to take it on, it is their responsibility to do so.

Finally, this is a global problem that demands new forms of global governance. But instead of an international conversation, this moment of market correction demonstrates the centrality of the United States in determining the way speech is shaped. Where were the boycott and the Facebook policy adjustments when Rodrigo Duterte was using the platform to harass journalists and radicalize citizens in the Philippines? Where were they when genocide in Myanmar was being incited on Facebook? Where were they when Jair Bolsonaro was spreading dehumanizing speech across Brazil? We also need a conversation about what human rights look like on digital platforms in countries with illiberal and autocratic regimes.

Facebook has done tremendous good. It enables speech, empowers new voices, allows small businesses to reach audiences and connects people in meaningful ways. But in no other domain do we allow the positive attributes of a company to negate accountability for actions that are damaging to society.

When we have a market failure that is causing such harms, we expect governments to regulate. We must take Mr. Zuckerberg at his word that he is open to regulation and focus our attention on what that should look like: not regulation that serves the interests of platform companies or global brands, but regulation that serves the interests of society.

For the boycott movement to be successful, it should be directed at governments and the citizens who give them their mandate. It must move beyond self-governance and demand democratic governance.
