During the Paris 2024 Olympics, real-time cameras will use artificial intelligence (AI) to spot suspicious activity, such as abandoned luggage and unexpected crowds. But, as the BBC's Hugh Schofield reports, civil rights organisations say the technology poses a threat to civil liberties.
“We are not China; we do not want to be Big Brother,” declares François Mattens, whose Paris-based AI company is bidding for part of the Olympics video-surveillance contract.
Under a recently passed law, police will be able to use CCTV algorithms to pick up anomalies such as crowd rushes, fights or unattended bags. The law explicitly rules out the facial recognition technology adopted by China, for example, to trace “suspicious” individuals.
But opponents say it is the thin end of the wedge. Even though the experimental period permitted by the law ends in March 2025, they fear the French government's real aim is to make the new Paris 2024 security provisions permanent.
“We've seen this before at previous Olympic Games, in Japan, Brazil and Greece,” says Noémie Levain of the digital rights campaign group La Quadrature du Net (Squaring the Web). She argues that special security arrangements introduced for the exceptional circumstances of the games eventually became normalised.
AI security system in police stations
A version of the new AI security system is already in place in some police stations around France. One of the pioneers is the southern Paris suburb of Massy.
“Around the town we have 250 security cameras – far too many for our team of four to monitor,” says Massy’s mayor Nicolas Samsoen.
“So the AI device monitors all the cameras. And when it sees something it’s been told to look out for – like a sudden grouping of people – it raises an alert.
“It's then up to the humans – the police officers – to examine the situation and decide what the appropriate action should be. Maybe it's something serious, maybe it's not.
“The important thing is that it's humans who make the ultimate decision about how to react – not the computer. The algorithm is empowering humans.”
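What that division of labour might look like in software is easy to sketch. The snippet below is purely illustrative – Massy's actual system is proprietary, and the `detect_events` stub, the event names and the confidence threshold are all invented for the example. The structural point it shows is the one the mayor makes: the model only pushes alerts onto a queue, and a human officer pulls them off to decide what to do.

```python
from dataclasses import dataclass
from queue import Queue

# Hypothetical set of events a model might be trained to flag.
WATCHED_EVENTS = {"abandoned_bag", "sudden_crowd", "fight"}

@dataclass
class Alert:
    camera_id: str
    event: str
    confidence: float

def detect_events(camera_id, frame):
    """Stand-in for the real detector. A deployed system would run a
    trained vision model on the frame; this stub returns no detections."""
    return []

def monitor(camera_frames, review_queue: Queue, threshold: float = 0.8):
    """Scan every camera feed and queue alerts for human review.
    The software never acts on its own - it only raises a flag."""
    for camera_id, frame in camera_frames.items():
        for alert in detect_events(camera_id, frame):
            if alert.event in WATCHED_EVENTS and alert.confidence >= threshold:
                review_queue.put(alert)  # a police officer triages this

# One pass over a (hypothetical) set of feeds:
alerts = Queue()
monitor({"cam_massy_01": None}, alerts)
```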
As a test, we abandoned a piece of luggage on the street not far from the police station. Thirty seconds later the alarm was raised and CCTV footage of the suitcase popped up on the control room screen.
Previously, of course, the algorithm had to be taught what an abandoned piece of luggage looks like – which is where the AI comes in. The developers have fed the programme a massive bank of images of lone bags on the street – a bank that continues to grow as more images accumulate.
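To make “feeding the programme a bank of images” concrete, here is one plausible shape such training could take – a sketch only, since the vendors' pipelines are not public. It assumes the common approach of fine-tuning an off-the-shelf object detector (here torchvision's Faster R-CNN, pre-trained on COCO) on a labelled image bank; the single `lone_bag` class, the hyperparameters and the dataset are all assumptions for illustration.

```python
import torch
from torch.utils.data import DataLoader
import torchvision
from torchvision.models.detection.faster_rcnn import FastRCNNPredictor

# Two classes: background (index 0) and our hypothetical "lone_bag" (index 1).
NUM_CLASSES = 2

def build_model():
    # Start from a detector pre-trained on COCO, then swap its
    # classification head so it predicts only our classes.
    model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
    in_features = model.roi_heads.box_predictor.cls_score.in_features
    model.roi_heads.box_predictor = FastRCNNPredictor(in_features, NUM_CLASSES)
    return model

def train(model, dataset, epochs=10, lr=1e-4):
    # `dataset` is assumed to yield (image_tensor, target) pairs, where
    # target holds "boxes" and "labels" per torchvision's detection API.
    loader = DataLoader(dataset, batch_size=4, shuffle=True,
                        collate_fn=lambda batch: tuple(zip(*batch)))
    optimizer = torch.optim.AdamW(model.parameters(), lr=lr)
    model.train()
    for _ in range(epochs):
        for images, targets in loader:
            # In training mode the model returns a dict of loss terms.
            losses = model(list(images), list(targets))
            loss = sum(losses.values())
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()
```

As more labelled images of lone bags accumulate, the same loop is simply re-run on the enlarged bank – which is why the detector keeps improving as the image collection grows.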