Online firms face EU fine if extremist posts stay up over an hour

By Reuters

STRASBOURG (Reuters) - The European Union's chief executive on Wednesday proposed fining Google <GOOGL.O>, Facebook <FB.O>, Twitter <TWTR.N> and other online platforms if they fail to remove extremist content within one hour.

Brussels gave internet firms three months in March to show they were acting faster to take down radical posts, but EU regulators say too little is being done without legislation forcing them to do so.

If authorities flag it, the European Commission wants content inciting or advocating extremist offences, promoting extremist groups, or showing how to commit such acts to be removed from the web within an hour.

"One hour is the decisive time window the greatest damage takes place," Jean-Claude Juncker said in his annual State of the Union address to the European Parliament.

In a proposal that will need backing from EU countries and the European Parliament, internet platforms will also be required to take proactive measures, such as developing new tools to weed out abuse and providing human oversight of content.

Service providers will have to provide annual transparency reports to show their efforts in tackling abuse.

Providers systematically failing to remove extremist content could face hefty fines of up to 4 percent of annual global turnover. Content providers will, however, have the right to challenge removal orders.

"We need strong and targeted tools to win this online battle," Justice Commissioner Vera Jourova said of the new rules.

In turn, the draft rules will demand that the EU's 28 national governments put in place the capacity to identify extremist content online, as well as sanctions and an appeals procedure.

The industry has also been working since December 2015 in a voluntary partnership to stop the misuse of the internet by international extremist groups, later creating a "database of hashes" to better detect extremist content.

The Commission will retain a voluntary code of conduct on hate speech agreed with Facebook, Microsoft <MSFT.O>, Twitter and YouTube in 2016. Other companies have since announced plans to join it.

(Reporting by Philip Blenkinsop, Daphne Psaledakis and Alissa de Carbonnel; Editing by Matthew Mpoke Bigg)
