More than 20% of the staff Meta hired to check the violent content of Facebook and Instagram are on sick leave due to psychological trauma.
Facebook's only requirement for working as a content moderator in the Spanish city of Barcelona was to know the local language.
It seemed like an attractive position, with a salary that could reach €2,400 per month for viewing between 300 and 500 videos per day.
However, what seemed like a good opportunity turned out to be a bad decision for many of the workers who got the Barcelona jobs.
More than 20% of the staff of CCC Barcelona Digital Services, owned by Telsus, the company that Meta hired to check the content of Facebook and Instagram, are on sick leave due to psychological trauma.
The images posted on the social networks they were supposed to check showed the worst of humanity: videos of murders, dismemberments, rapes and live suicides.
"In one of the videos, a father shows his baby, reportedly about one year old. He sticks a knife in its chest, rips out its heart and eats it," Francesc Feliu, a lawyer for more than a dozen workers who decided to sue the company, told Euronews.
"Apart from the absolutely inhumane content, there is a lot of noise, screaming and blood," Feliu added.
The employees have criticised the working conditions imposed by the company on the content moderators, which leave them extremely exposed to serious mental health problems such as post-traumatic stress disorder, obsessive-compulsive disorder and depression.
"We are talking about people who were healthy and suddenly these mental disorders appear. Some of these workers have attempted suicide," says the lawyer.
"Dozens and dozens of relatively young people are allegedly being left mentally ill. It is extremely serious," he adds.
The first European complaint
Chris Gray started working for CPL, the contractor Facebook worked with in Ireland, in 2017. His contract as a content moderator lasted a year.
The Irishman, now 50, was the first to take the social network to court.
When he started working, video only made up 20% of the content he had to review, working mostly with text, photos and some live video.
The images that stick in his mind are of migrants being tortured with a burning metal rod or dogs being boiled alive.
"I didn't realise how much it affected me. It was only later that I realised I was a mess, I was very stressed. I couldn't sleep and I became very aggressive. If anyone ever talked about my work, I would cry afterwards," Gray told Euronews.
When he realised he couldn't cope any longer, he tried to talk to the company psychologist. "I filled in questionnaires saying I felt overwhelmed at work. But it took ages to get an appointment with someone."
Like Gray, 35 other content moderators have filed complaints with the High Court in Ireland, where Meta has its European headquarters.
"We have clients from Ireland, Poland, Germany and Spain. Some of them realised the job was hurting them after a few weeks, others claim they didn't realise the impact until their family and friends told them their personalities had changed," says Diane Treanor, a lawyer with Coleman Legal who is representing Gray.
Not being able to access psychological support is something that the Spanish workers also complained about.
"When watching a video, many of these people collapsed, they couldn't go on. The psychologist would listen to them and then tell them that what they were doing was extremely important for society, that they had to imagine that what they were seeing was not real but a film, and that they should go back to work," says the Spanish lawyer.
In addition, the company did not screen applicants beforehand for prior mental health issues that might have made them unfit for the job.
However, when contacted by Euronews, Facebook claims that it is working with Telsus to urgently address the issue.
"We take the support of content reviewers seriously and require all companies we work with to provide 24/7 on-site support from trained practitioners."
‘It scars you for life’
Workers say that even five years on, many are still undergoing psychological treatment. Some are even afraid to go out on the street.
"This kind of content scars you for life. We are not machines or computers without feelings," says Feliu.
Both lawyers agree that Meta's policy of forcing employees to watch the entire video in order to explain all the reasons for censorship aggravates the trauma.
They say that if the employee can already see after 10 seconds the reason the video should be censored, then there is no need to watch the whole video. They also complain that the moderator often has to watch the same video again and again because more than one user has reported it.
On this issue, Facebook claims that there are "technical solutions to limit exposure to graphic material as much as possible".
"People who review content on Facebook and Instagram can adjust the content review tool so that graphic content appears completely blurred, in black and white, or without sound."
Three years ago, Facebook CEO Mark Zuckerberg described criticism of the company's working conditions as "a little overdramatic". The comments, made at a staff meeting, were leaked to the press.
"It's not that most people just see horrible things all day long. But there are really bad things that people have to deal with, and making sure that they have the right counselling and the space and the ability to take breaks and get the mental health support that they need is a really important thing," Zuckerberg told employees.
But that is not the experience of former workers.
"The company's policy is to deny it," says the Spanish lawyer.
"If they wanted to solve these problems, there would be a different dynamic, but at the moment the only solution is the criminal route."