US tech companies such as Meta, Google, and Twitter have taken swift action to limit disinformation since Russia’s invasion of Ukraine last Thursday, with the conflict highlighting the massive role social media plays in modern war.
"Russian information warfare has been unfolding for many, many years now, and not all policymakers have been super engaged in that," said Dex Hunter-Torricke, vice-president of global communications at the Facebook-created Oversight Board.
"Now is the time for everyone, companies, and policymakers to be stepping up," he told Euronews Next at the Mobile World Congress in Barcelona.
Meta, which owns Facebook, said on Monday it removed a network operated by people in Russia and Ukraine that "ran a handful of websites masquerading as independent news outlets, publishing claims about the West betraying Ukraine and Ukraine being a failed state".
Facebook and Google-owned YouTube have since announced they were blocking content from Russian state-owned media outlets RT and Sputnik in Europe, following calls from the European Union to stem the spread of pro-Kremlin propaganda.
Other measures by Facebook and YouTube include demonetising Russian state media accounts and videos.
These decisions by social media companies are "steps in the right direction," said Hunter-Torricke, who previously served as a speechwriter for Meta chief executive Mark Zuckerberg and as SpaceX’s head of communications.
Hunter-Torricke added that the Oversight Board is following the events on the ground "very closely," which he described as a "huge moment in the world".
He noted that the body is not designed as a first line of defence for content moderation, but that it wants to see what it can do to help protect users.
The disinformation fight
The Oversight Board was created in 2020 by Zuckerberg, but it is an independent body made up of former world leaders, activists, and top lawyers that makes decisions on the most significant content moderation challenges on Facebook and Instagram.
Disinformation and misinformation are major challenges on these platforms and have always been important issues for the board, said Hunter-Torricke.
"I think none of these things (disinformation) are going to ever be solved. They are things that we're going to need constant ongoing attention from the company on," he said.
The conflict in Ukraine has been ongoing since the 2014 annexation of Crimea. Even weeks before Russian soldiers set foot in Ukraine, disinformation on social media stirred alarm.
"These are all situations where you have slow-brewing conflicts. You have situations that are going on for potentially years and the way a lot of tech companies work is to move very, very quickly," said Hunter-Torricke.
"They sort of mobilise resources, they staff up. You have a team devoted to handling a crisis and then it moves on to the next crisis. But of course, these platforms, they're dealing with the entire world".
He said that to deal with this velocity, teams will need to be constantly staffed during these slow-brewing conflicts.
"Without having that level of ongoing attention and resourcing from these companies, actually you end up only with retroactive measures, you end up in hotter stages in the fighting, and obviously we want to mitigate those holes before we actually see people really being harmed," said Hunter-Torricke.
Support 'at all levels'
Although social media companies have a big impact on people’s lives, the disinformation fight cannot be solved by Big Tech alone, he said.
"What's happening in Ukraine is you have a nation-state that has declared war on another country and has mobilised its entire resources against that country. And if you're asking, you know, a single company or even a set of companies to take on a nation-state of that power, you know, that's a really tough ask," he said.
"This is an effort that is going to be probably something that takes place over a very long time period and is going to require a lot of support and leadership from policymakers at all levels".
However, Hunter-Torricke said there are even bigger questions to be asked that go beyond business and policy.
"We also need to consider why is it that so many people in our society are prepared to believe rumours, are so prepared to share bad content from bad actors?
"And of course, the answer lies in things that are very unglamorous and which policymakers don't want to pay attention to, such as media literacy, preparing populations and making them more resilient to the terrible effects of disinformation".