Euroviews. Regulating Big Tech will take pluralism and institutions | View

Whistleblower allegations this week brought user safety on Facebook back into the spotlight
Copyright Jenny Kane/AP
By Miguel Poiares Maduro and Francisco de Abreu Duarte
The opinions expressed in this article are those of the author and do not represent in any way the editorial position of Euronews.

Give social media users a choice of algorithm - and real, independent oversight bodies - to keep debate online free and healthy, argue EDMO's Miguel Poiares Maduro and law researcher Francisco de Abreu Duarte.


The need to regulate Big Tech companies like Facebook, Google, or Amazon is now daily news. This week's scandal involving Facebook is just the latest in a saga.

From the news battle between Google, Facebook and Australia to the Trump/Parler dispute, from new proposals to regulate digital companies in the US to increasing pressure by competition authorities in Europe, Big Tech firms are in the spotlight for the power they wield.

Appearances before the US Congress or European and national parliaments are ever more frequent. On both sides of the Atlantic, a general regulatory momentum is mounting. The EU has taken the first steps with a Commission proposal to tackle tech companies’ market power (via the Digital Markets Act) and internal moderation structures (via the Digital Services Act).

All this is different to anything we have seen in the past. There have been and still are private entities that, de facto, regulate markets – think of sports federations – or companies so critical to a given economy that they also hold substantial political power. But we have never before had market-dominating companies whose aim is to create a community of ideas.

Previous monopolists were not involved in the regulation of speech, nor in the dissemination of knowledge that shapes public opinion. But for Big Tech companies, fostering a large community – similar to a public sphere – is key to the business model.

Moderating content and maintaining engagement is the end goal. As advertising-dependent businesses, Big Tech firms work as digital gatekeepers and carefully curate the content shown to users. They are the editors of that public sphere: both the vehicles of speech and the controllers of speech.

Precisely because Big Tech is different, we cannot expect the old remedies to work. Classic antitrust or competition law will play a part, but it is not enough. The reason is simple: to regulate a public sphere, one needs to address more than simply the market.

All constitutional states know that free speech is the baseline, and that ideas then need an institutional process to become knowledge and form the basis of decision-making. If regulators wish to create a healthy digital environment for users, they need to ensure that two things are present: a) pluralism and b) institutions.

A marketplace for algorithms

We need to ensure that users can share ideas, contrast visions, argue and debate. It is now clear that Big Tech, preoccupied with creating networks, selling ads, and growing rapidly, forgot about this. The algorithms work to maximize attention without editorial concerns.

Online platforms generate bubbles of agreement that segregate the public sphere and divide communities, so as to better target like-minded people with advertising. These firms have centered their business model on such ‘clustering’.

If companies can unilaterally control the algorithm that curates content – and such an algorithm is solely aimed at ‘clustering’ and expanding the network for advertising purposes – this will reduce pluralism instead of promoting it. And this has consequences for democracy.

Therefore we suggest the creation of ‘algorithmic pluralism’ as a possible solution. We must create an actual algorithmic market in which different players can create and sell algorithmic choices to users.

Imagine a scenario in which people can change the content they see in their social feeds by choosing one of several available algorithms. A world where people acquire the algorithms to be installed on their networks, rather than just turning the default one off, as is already possible on some platforms like Twitter. A world where companies compete to offer us more responsible, pluralistic algorithms.

When I turn off the ads-oriented algorithm, a steady feed of people agreeing with my views suddenly becomes a new world of disagreement. When I toggle the sports-oriented one, my feed becomes a sporting blog made up of different supporters. My literature-oriented feed becomes a debate of contrasting views and positions on writing and art.

My advertisement algorithm shoots all sorts of product-related content my way, for those shopping spree days. My privacy algorithm prioritizes my data. In all of them, I see the myriad different views, commercial and political, religious and agnostic, artistic and literary, that the world has to offer. I jump from one community to the next.

Big Tech would have to offer this algorithmic market itself, or else allow an intermediary market to arise. Different firms could develop creative ways of tailoring content that could then be sold to the platforms, or to individual users.

This would bring increased transparency, as companies would be incentivized to show how their algorithms surpass those of competitors, and could be held more directly accountable for flaws. If an algorithm is faulty or non-transparent, a competitor will take its place.

People would then be able to choose privacy-oriented providers that showed them diverse content, while still using the platforms they love. It would circumvent the costly exercise of moving from one platform to a new (often unpopulated) network. It would empower consumers to choose.


Independent oversight

This alone, however, will not be enough. We must also ensure that Big Tech firms’ regulatory power over speech in the virtual public sphere is subject to institutions that, as they do in our democracies, work to transform contrasting views into actual, shareable knowledge.

This is often the role of constitutions in liberal states: they work as frameworks setting the rules of the game. They make our disagreements possible while also rationalizing them. We suggest the adoption of similar, quasi-constitutional principles within Big Tech companies, to foster healthier exchanges of ideas.

This means the imposition of due process obligations on the companies – as the EU’s proposed Digital Services Act (DSA) aims to do (albeit imperfectly) through notices and takedowns – for users to check platforms’ power themselves.

It means increasing transparency obligations, and fostering independent quasi-judicial bodies (Facebook’s Oversight Board is actually a good start) as instances of appeal, but with broader supervisory powers, up to and including over the algorithms themselves.

It means creating within such companies the quasi-constitutional bureaucracies that are always necessary to prevent power from being exercised in an unaccountable way.


We acknowledge the ambition of these proposals. But we need that creative ambition to match the power held by Big Tech, without empowering a police state.

If we wish to keep our democratic values intact, we must ensure that the democratic tools that constrain state power are applied to Big Tech. This means not only fostering plural marketplaces of ideas, but reinforcing them, with institutional tools designed to act as a check on power.

Miguel Poiares Maduro is a former Portuguese development minister and executive chair of the European Digital Media Observatory (EDMO), a digital fact-checking and anti-disinformation project. Francisco de Abreu Duarte is a law researcher at the European University Institute.

