
Mission Shakti: the eyes do not have it

The Indian government has often relied on the argument that increased surveillance of public spaces would make women safer

Shubhangi Agarwalla Published 10.03.21, 03:43 AM
Representational image. Shutterstock

The Lucknow police are reportedly setting up AI-enabled cameras to capture images of women ‘who look distressed’. The additional director-general explained that “The cameras will be able to detect any change in the facial expressions of a woman being subjected to stalking, threats or harassment on the streets, and an alert will be sent to the police control room.” He further clarified that the Lucknow police have identified around 200 hotspots for crimes against women, based on the areas frequented by women and on the spots that generated the most complaints. This initiative, launched under the Uttar Pradesh government’s ‘Mission Shakti’ programme, has met with a backlash from civil society organizations and lawyers.

There is a lot to unpack here.


First is the assumption that using AI to recognize emotion is helpful. Emotion recognition algorithms are increasingly employed in job interviews in the United States and in South Korea. But studies show that claims about their accuracy are often not backed by scientific evidence, and there have been several instances of false positives. AI gives the appearance of neutrality and credibility. In reality, such technology is structurally limited by the prejudices of those creating the algorithm and feeding it the initial data to run checks against. AI Now, a leading research institute, has warned that the use of emotion recognition algorithms may lead to racist and sexist outcomes. This technology is likely to lead to over-policing.

Second is the assumption that surveillance of women is going to keep them safe. Surveillance can be defined as attention that is ‘purposeful, routine, systematic and focused’.

Surveillance is not necessarily undesirable. Administrators need some kinds of information about the governed to do their job efficiently. However, surveillance sans transparency and accountability mechanisms is a means of controlling populations and has a disproportionate impact on marginalized communities. B.R. Ambedkar in the 1940s and, decades before him, Jyotirao Phule understood this; they urged the lower castes to migrate from villages to cities in the belief that the relative anonymity of the Indian city would offer some reprieve from the untouchability that was integral to lives lived in closely-knit social and commercial spaces. In other words, a life away from surveillance would allow them to lead a life of dignity. Surveillance, conversely, has a chilling effect on the rights to freedom of expression, freedom of movement and freedom of association.

The Indian government runs several surveillance systems, including the National Intelligence Grid and the Central Monitoring System. It has often relied on the argument that increased surveillance of public spaces would make women safer. The installation of CCTV cameras, the development of safety apps (such as the Nirbhaya app), making it mandatory for all phones to have panic buttons — each of these policies has an adverse impact on privacy, a constitutional right protected under Article 21 and recognized by the Supreme Court in Justice K.S. Puttaswamy vs Union of India. That landmark case lays down the standards that must be met to justify intrusions into people’s right to privacy: the existence of a law, a legitimate State aim, proportionality, and procedural safeguards against abuse.

Crucially, in the present case, there is no anchoring legislation to regulate the use of emotion recognition algorithms by the police. We do not have a data protection regime to oversee the collection, processing and storage of the data gathered by these systems. Nor has it been clarified whether the data would be analysed by police officers themselves or sent to third parties. Without an express purpose limitation on data use, ‘function creep’ enables a technology to be put to any number of insidious ends. Surveillance of women might ostensibly be for their protection, but it might also be used to control women who do not fit the stereotype of the ‘chaste, pure, domesticated and submissive’ Indian woman. Counter-intuitively, it might even lead to crimes going unnoticed against women who do not meet the stereotype or wear the ‘correct facial expression’.

Sexual violence is pervasive in India, and resources should be devoted to fighting it. But this policy reinforces existing stereotypes while giving rise to a surveillance State; it fails to acknowledge the agency of marginalized communities and intrudes upon their privacy rights. Any use of technology in the name of women’s safety must be judged by carefully examining contextual factors such as the types of crime a particular model is meant to record, who has ownership over the device and the data, and who gets to mobilize it.
