India goes ahead with its facial recognition program despite privacy concerns
By Segun Ayo
The Indian government has played down fears of mass surveillance in response to concerns that its proposed facial recognition system lacks adequate oversight.
Replying to a legal notice filed by the Internet Freedom Foundation (IFF), a Delhi-based non-profit that works on digital liberties, the country’s National Crime Records Bureau (NCRB) defended the move, stating it doesn’t interfere with the privacy of citizens as it “only automates the existing police procedure of comparing suspects’ photos with those listed in LEA’s [Law Enforcement Agency] databases.”
It also dismissed concerns about misidentification and discriminatory profiling, saying the project will only be used to identify missing people and unidentified dead bodies.
The need for facial recognition
The move comes after NCRB opened bids from private companies in June to help develop a facial recognition system — dubbed National Automated Facial Recognition System (NAFRS) — that would allow law enforcement to match people of interest against an existing database of facial images.
“This would greatly facilitate the investigation of crime and detection of criminals and provide information for easier and faster analysis,” the tender document said.
NAFRS will also “add photographs obtained from newspapers, raids, sent by people, sketches etc. to the criminal’s repository tagged for sex, age, scars, tattoos, etc. for future searches.” It will have options to upload “bulk subject images” and “CCTV feeds” to “generate alerts if a blacklist match is found.”
Lack of a legal framework
The system, once in place, is expected to be accessible to all police agencies across the country. It’s also likely to be among the largest facial recognition systems of its kind, with the capacity to process over 15 million facial images.
The bids for the project were due on November 7, but the deadline has now been extended till January 3, 2020.
However, the proposals have walked into a privacy minefield, given that the country lacks strong legislation around data collection, protection, and sharing, let alone rules governing the use of facial recognition technology.
A draft data protection bill presented to the government last year is anticipated to be introduced into parliament during the winter session, which kicks off on November 18.
Questions about data protection and consent
The country has had problems implementing Aadhaar, one of the world’s biggest biometric national identity databases, linking everything from bank accounts to income tax filings, which has been plagued by data leaks and the growth of a black market for personal information.
IFF, for its part, has reiterated that there’s no legislative framework that grants NAFRS any legality. While the NCRB said the system would not be integrated with Aadhaar, IFF has voiced concerns about “inadvertent access” resulting from the integration of various databases, thereby violating individual consent.
Calling for a withdrawal of the tender, the IFF has urged a “moratorium on all privacy invading projects until a data protection law and authority is in existence.”
Not just India
India is far from the only player looking to deploy facial recognition on an enormous scale. China has already leveraged the technology to build a sophisticated surveillance network, while law enforcement’s use of facial databases in the US and UK has drawn scrutiny.
France plans to follow in India’s footsteps with an Aadhaar-like biometric citizen ID program called Alicem, which employs facial recognition to counter identity theft and “increase confidence in electronic transactions within the European Union for online services.”
Complicating matters further is the lack of oversight and data protection regulation to prevent such sensitive data from being exploited for dubious purposes.
“While technology is very well a force for good, prior to its integration in society, adequate safeguards and protection of target audiences need to be in place,” the IFF said.