The Facebook whistleblower's testimony in Brussels this month showed us the EU's planned Digital Services Act does not go far enough, argues German MEP Patrick Breyer
The recent testimony of Facebook whistleblower Frances Haugen in the European Parliament has exposed the EU's unwillingness to take back control of the digital sphere.
Ms Haugen's alarming revelations show the extent to which Big Tech puts profit before the public interest, and how damaging this is for us all.
Yet despite this, upcoming legislation, in the form of the EU’s Digital Markets Act and Digital Services Act, lacks the ambition needed to address these problems.
MEPs have the power to set global standards and shape the digital future in line with fundamental rights and values, but a majority seems reluctant to wield it.
The fight for privacy
In her speech, Haugen warned of the extent to which Meta/Facebook's virtual reality plans, which entail a dangerous multiplication of sensors in buildings, will dramatically increase the threat to our privacy posed by surveillance capitalism.
Sensors in our homes and workplaces could become as ubiquitous as webcams and microphones are today. But they could also give companies like Meta/Facebook constant, unrestricted access to our personal lives.
In spite of Haugen’s warnings, the proposed European digital legislation fails to secure a right to anonymity for citizens and fails to protect us from having each and every one of our actions recorded and used to manipulate us.
Dismantling censorship mechanisms
Haugen warned that Facebook's unreliable upload filters over-block massive amounts of valuable legal content, such as counter-terrorism material, because they do not understand the meaning, intention or context of information.
It is vital that we ban these error-prone censorship mechanisms. And yet, as it currently stands, the upcoming Digital Services Act would not rule them out!
These filters disrupt essential media content, political speech, scientific discourse and educational material, often disproportionately affecting minorities and political activists.
Fixing toxic algorithms
Haugen's testimony proves that Facebook keeps prioritizing profit over the public good. But the EU continues to bury its head in the sand, deluding itself that it can force Big Tech to pursue a public interest agenda and to fix the problems itself, relying on a so-called "risk-based approach".
This is naïve at best. Big Tech's only true loyalty is to its shareholders; its only aim is more engagement, more use, more ads, more data, more profit.
That profit is driven by toxic recommender algorithms. These algorithms, which promote destructive, extreme content in a quest for more clicks, are vast and intricate black boxes.
No individual has responsibility over them, very few people know how they work at all, and they are close to impossible to audit.
This means we'd have to trust companies to make changes to their algorithms that would reduce their profit: something they will never do voluntarily.
Many of the severe societal issues we are facing today are caused, or made significantly worse, by these algorithms. From anti-vaxxers to conspiracy theorists, and from the Myanmar genocide to the US Capitol attack, they have fed the most vulnerable citizens the most extreme content, and everyone has paid the price, some with their lives.
The only real way to deal with the tech companies' toxic recommender algorithms is to allow users to switch them off, and either view posts by date, or use external algorithms of their choice instead, such as non-commercial community ones.
Giving citizens more choice online
Even with transparent, external algorithms, users still have little choice but to hand over their data to Meta/Facebook in order to keep in touch with those close to them.
As Ms Haugen said, Meta/Facebook has a near-monopoly on personal social networks, and network lock-in means that if you want to switch, you’d have to convince everyone to switch with you.
The only effective way to deal with the tech companies' abuse of market dominance is interoperability: allowing users to switch to alternative platforms and still keep in touch with their friends and colleagues across platforms.
But in its recently adopted position on the Digital Markets Act, the European Parliament's lead committee fails to call for mandatory interoperability.
Choosing a future
Imagine a future where we can feel safe about our data and privacy, at home and at work. A future where legal content isn't constantly censored, where independent content creators can easily grow, where extreme and damaging content is left on the fringes where it belongs, and where citizens can choose a social network in the same way they can choose an email provider, knowing that regardless of who they sign up with, they can reach anyone.
Such a future is possible. But only if we have the political will to put users and our democratic institutions in control of the digital sphere, rather than continuing to trust multinational corporations to shape the digital era we and our children will live in.
Ahead of crucial votes and decisions, there is still time for the public to call on policymakers to fundamentally change the rules of the game, and let citizens and public institutions take back control.
Patrick Breyer is a digital rights activist, member of the Pirate Party Germany and an MEP since 2019. He currently serves as the European Parliament Civil Liberties Committee's rapporteur for the planned Digital Services Act.