The French government is currently under fire after it announced a proposal to use video surveillance assisted by artificial intelligence (AI) ahead of the 2024 Paris Olympics.
At the end of January, the Senate voted overwhelmingly in favour of a bill that would allow its use during the event.
Sports Minister Amélie Oudéa-Castéra thanked the senators on Twitter for swiftly adopting the text, saying "it will promote the best possible organisation of the Paris 2024 Games".
The legislation includes plans to use AI to detect - for the first time ever in France - suspicious body language or crowd movements through CCTV cameras and drones. This information is then sent directly to the police.
AI could also be used around stadiums, on streets, and on public transport. The bill states that the cameras can be used as an experiment until June 2025 during sporting, festive, or cultural events.
"The scope of this law is worrying because it goes far beyond the Olympics - until June 2025," said Katia Roux, advocacy officer for technology at Amnesty International.
"For years, we have seen French authorities trying to expand the surveillance power of the police, and we fear these Olympics could be used as an excuse to do so," she said in an interview with Euronews Next.
The plan comes as the nation seeks to avoid repeating the mayhem seen during the Champions League final at the Stade de France in 2022. Many supporters were attacked or mugged around the stadium.
But for human rights groups, this bill is intrusive and a danger to human liberty.
Targeting specific groups and the right to privacy
"We fear this could be used to target specific groups and that would infringe on the right to privacy and peaceful assembly," said Roux.
"Errors or biases are to be feared, and one can actually wonder what is abnormal or normal behaviour? To date, the effectiveness of such technology in fighting crime and terrorism has not been proved at all," she said.
The French government has emphasised that the bill will not include facial recognition technology.
The authorities also argue it would greatly help police to keep crowds safe as the capital is expected to welcome 13 million people for the Olympics.
"The problem is the use of biometric data. Even though there’s no facial recognition, the analysis of the behaviour and movement of individuals is still biometric because it’s sensitive data that should be protected," explained Maryse Artiguelong, a data and privacy specialist at the Ligue des droits de l'homme (LDH).
"When you move, you have a particular way of moving that could identify you," she told Euronews Next.
Amnesty International and other digital rights groups believe that if this bill passes, it could pave the way for more intrusive surveillance technologies in the country.
"We can fear that facial recognition could be the next step," said Roux.
"We saw that with Russia when it started using facial recognition for the World Cup. Now, a few years later, the technology is being used to arrest peaceful protesters," she explained.
Does this mean France is entering a new era of Big Brother? The bill has not yet been passed and will be examined by the National Assembly in February.