EU Policy. Can social media swing the EU election?

Online platforms are widely used in the election campaign. Copyright AP Photo, File
By Cynthia Kroet

Compared to the 2019 election, more EU platform rules are in place.


This June, some 365 million people will be eligible to vote in the EU elections, including many young voters for whom social media is a vital campaigning resource. Four countries – Belgium, Germany, Malta and Austria – will allow 16-year-olds to vote, while in Greece the minimum voting age is 17.

Five years ago, when the last EU vote was held, the regulatory landscape for online platforms was quite different: there were fewer rules on combating misinformation, and the AI Act did not yet exist. We take a look at how platform regulation in the EU has changed since the last election in 2019, and its impact on social media companies.

1. TikTok

Platforms played a role in the 2019 election campaign, but TikTok is a real game changer. Whereas Facebook and Twitter (now X) were already big five years ago, the Chinese-owned video-sharing platform TikTok has rapidly gained popularity and now boasts more than 150 million monthly users across Europe. Meta still exceeds that, with some 408 million monthly users in Europe.

A report by the European Parliamentary Research Service said that electoral turnout in 2019 reached 50.6% largely due to youth participation, which was sparked by a parliament campaign and the use of platforms like Snapchat. TikTok, with its high number of young users, could play a similarly pivotal role this time around, despite the platform's embroilment in regulatory turmoil in recent months.

The European Commission and the Parliament have called on staff to ban the app from work phones over cybersecurity fears. It is also "strongly recommended" that lawmakers and their assistants remove TikTok from their personal devices. Some political groups, such as the social-democrat S&D group and the Left (GUE/NGL), do use the app for their campaigns, but renewed security and data protection fears about the company in recent days have put this in a different light. The Commission opened two compliance probes related to online child protection under the EU's platform regulation in February and April.

2. AI Act

Europe was the first continent to regulate: its AI Act attempts to impose strict rules on high-risk machine learning systems and adds transparency requirements for Generative AI (GenAI) tools such as ChatGPT. The AI Act is likely to enter into force in June, however, too late to have an impact on the EU election.

Nevertheless, awareness of the risks of these tools has grown. Surveys show that voters in the EU are increasingly concerned about the impact of AI on democratic processes, such as its ability to accelerate the spread of false information and deepfakes, making it difficult to distinguish the real from the illusory.

In a bid to address these risks, platforms such as Microsoft, Google, and Meta have already pledged to prevent AI election interference in an AI Elections Accord presented at the Munich Security Conference last February. “As society embraces the benefits of AI, we have a responsibility to help ensure these tools don’t become weaponised in elections,” Microsoft Vice Chair and President Brad Smith said when signing.

Under Commission guidelines on GenAI, large online platforms will, for example, have to use watermarks to assure users that information related to electoral processes is official. Meta said it already includes visible and invisible watermarks on images created with its GenAI platform.

3. DSA

Under the Digital Services Act (DSA), proposed by the European Commission in 2020, online platforms with more than 45 million monthly average users – including Facebook and TikTok – are obliged to take measures against disinformation and election manipulation. The rules started applying to those large platforms as of August last year.

The companies also need to have content moderation tools in place, including the option to contest decisions when users’ content is removed or restricted, as well as increased transparency for users regarding terms and conditions and how algorithms recommend content.

EU Commission Vice-President Margrethe Vestager said that many election debates will take place online and that the DSA provides tools to work together with online platforms. “We can address the emerging online risks to electoral processes, like deep fakes. So we can enable people, in a safe way, to engage, discuss and make up their minds without illegal interference,” she said.

To allow companies to test the rules, the Commission recently (24 April) organised a stress test on DSA election guidelines.

Non-profit organisations Mozilla and CheckFirst said in research published this month (16 April) that online platforms do not provide enough insight into paid influence and commercial ads ahead of the EU election in June. Civil society groups also question the lack of content moderators for some EU languages, like Maltese, Dutch and Estonian, and therefore the ability of platforms to really halt misinformation published in those languages. In addition, the DSA contains no information on how non-official EU languages are moderated, such as how content in, say, Russian or Arabic is handled in the EU.

4. Political advertising

Another regulatory change can be found in more stringent rules affecting political advertising. Last February, the European Parliament approved rules for such ads, aimed at restricting foreign interference, notably online. Under the framework, sponsors from third countries will not be able to pay for political advertising in the EU in the three-month period before an election or referendum.

Political ads based on profiling and the use of minors’ data will also be prohibited. In addition, any political ad will have to be clearly labelled as such and include information such as who paid for it and how much it cost. 

The plans were put forward by the European Commission in 2021 to increase the transparency of political advertising, as part of measures aimed at protecting election integrity. However, civil society groups, including Access Now, are worried that the rules came too late to protect the upcoming EU vote; in addition, information such as who is paying for an ad is delivered only by self-declaration from political groups.

“While the new law’s transparency measures will hopefully prevent voter manipulation and protect people’s personal data from abuse in years to come, it won’t have much impact on the 2024 European elections – making it a missed opportunity, albeit one with potential for the future,” a press statement said.


For their part, social media companies TikTok and Meta announced measures to combat misinformation this month. Meta, for example, said that advertisers running election-related ads on its platforms will have to disclose whether they use AI or fake images.
