Indian city plans to use facial recognition to spot harassed women

LUCKNOW (INDIA) – A plan to track women’s expressions with facial recognition technology to prevent street harassment in a north Indian city will result in intrusive policing and privacy violations, digital rights experts cautioned on Friday.

In Lucknow, about 500 kilometres (310 miles) from the nation’s capital New Delhi, police have identified about 200 harassment hotspots that women frequent and where most complaints are reported, said police commissioner DK Thakur.

“We will set up five AI-based cameras which will be capable of sending an alert to the nearest police station,” he said, referring to the artificial intelligence-based technology.

“These cameras will become active as soon as the expressions of a woman in distress change,” he told reporters this week, without revealing which expressions would trigger an alert.

Facial recognition technology is increasingly being used in airports, railway stations and cafes across India. Plans are underway for nationwide systems to modernise the police force and its processes for information gathering and criminal identification.

Anushka Jain, an associate counsel at digital rights non-profit Internet Freedom Foundation, said, “The whole idea that cameras are going to monitor women’s expressions to see if they are in distress is absurd.”

“What is the expression of someone in distress – is it fear, is it anger? I could be talking to my mother on the phone and get angry and make a face – will that trigger an alert and will they send a policeman?”

Jain said a more practical solution would be to increase police patrols, adding that the technology is untested and could lead to over-policing and the possibility of women who trigger alerts being harassed.

Roop Rekha Verma, a women’s rights activist in Lucknow, said that police often turn away women who come to register complaints, or fail to take necessary action.

“And they want us to believe they will take action watching our facial expressions,” she said.

While there is a growing backlash against facial recognition technology in the United States and in Europe, Indian officials have said it is needed to bolster a severely under-policed country, and to stop criminals and find missing children.

But digital rights activists say its use is problematic without a data protection law, and that it violates the right to privacy, declared to be a fundamental right by the Supreme Court in a landmark ruling in 2017.

Vidushi Marda, a researcher at human rights group Article 19, said, “The police are using the technology to solve a problem without considering that this will simply become a new form of surveillance, a new form of exercising power over women.”

“AI is not a silver bullet, and no amount of ‘fancy’ tech can fix societal problems,” she said.
