TikTok removed 4 million videos considered illegal or harmful in the EU in September

The TikTok app logo, in Tokyo, on Sept. 28, 2020. Kiichiro Sato/AP
By Euronews with AFP

TikTok published a report on its moderation policy, a requirement under the EU's new Digital Services Act (DSA).


TikTok announced that it removed four million videos considered illegal or harmful in the European Union in September, according to a report published on Wednesday.

The platform, which is owned by the Chinese company ByteDance, claims to have a workforce of more than 5,000 people dedicated to content moderation in the EU alone, out of 40,000 people responsible for safeguarding its users worldwide.

"The vast majority of actions taken by TikTok against illegal or harmful content are done proactively" because they violate the social network's rules, the group said in a report on its moderation activities in the EU.

It noted that these removals are much more numerous than those related to user reports.

In the interest of increased transparency, the publication of such a report every six months is a requirement imposed by the new European Digital Services Act (DSA) that came into effect at the end of August and affects 19 very large tech platforms, including TikTok.

The European Commission has also launched investigations in the past two weeks targeting X (formerly Twitter), Meta (Facebook and Instagram), and TikTok.

It sought clarification on the measures they are implementing against the spread of "false information" and "illegal content" following attacks by Hamas on Israel.

The social network explains that it has introduced a tool that allows its European user community to report illegal content, in line with DSA obligations.

TikTok received 35,000 reports concerning 24,000 videos in the first month. It took action against 16 per cent of them, judged illegal or in violation of its internal rules.

The median time between reporting and action was 13 hours, the group said, highlighting the challenging legal analysis it had to conduct to be fair and consistent while also considering freedom of expression.

Content moderators also rely on automated tools, and around one-third of the moderation team works in English.

The service also employs around 860 German speakers and more than 650 French speakers.

The team includes speakers of the 24 official languages of the EU as well as individuals capable of monitoring posts in Turkish and Arabic, two frequently used languages.

