Welfare at the Mercy of the Machine: The Dystopian Drift of Digital Welfare in India

 

Kurt Vonnegut’s 1952 novel Player Piano imagined a bleak future where a society divided by automation left engineers in control of machines, while workers found themselves disempowered and obsolete. More than seven decades later, this dystopian vision finds echoes in India’s approach to welfare delivery, where digital technologies are increasingly prioritized over human judgment. The government’s recent decision to mandate Facial Recognition Software (FRS) in Anganwadi centres is a stark illustration of this shift, highlighting how technology, rather than empowering citizens, may end up undermining both dignity and access.

 

The Role of Anganwadis in India’s Welfare Ecosystem

 

Established in 1975 under the Integrated Child Development Services (ICDS) scheme, Anganwadis are a cornerstone of India’s nutrition and early childhood care system. With a vast network of 14.02 lakh centres, they deliver critical services such as preschool education, health awareness, and, most crucially, Take Home Rations (THR) for children under three and for pregnant and lactating women. These rations are a legal entitlement under the National Food Security Act, 2013, and are distributed by Anganwadi Workers (AWWs) and helpers, who are usually women from the local community.

 

The Facial Recognition Rollout: Techno-Solutionism Over Care

 

·       In 2021, the Poshan Tracker app was launched to digitise and monitor nutrition services across Anganwadis. While initially intended to streamline operations, it has since taken a contentious turn: from July 1, 2025, beneficiaries—pregnant women and lactating mothers—have been required to authenticate their identity through Facial Recognition Software integrated into the app.

·       The official rationale is to eliminate fake beneficiaries and prevent the diversion of rations by workers. But this move inverts the presumption of innocence at the heart of natural justice, treating genuine recipients and frontline workers as suspects by default. Instead of being a tool for transparency, FRS has become an instrument of mistrust.

 

Implementation Failures and Practical Hurdles

 

·       In practice, FRS implementation has been riddled with obstacles. AWWs face frequent authentication failures due to mismatched facial data, incorrect or outdated phone numbers, and technical glitches such as poor network connectivity and outdated smartphones. The facial recognition process demands heavy data processing, causing delays and requiring photos to be taken repeatedly.

·       Even when AWWs personally know the beneficiaries, the app provides no override mechanism, resulting in the denial of rations to deserving individuals. These technological hurdles have led to frustration among both workers and recipients, severely hampering the delivery of essential nutrition.

 

Misplaced Priorities and Ignored Realities

 

·       Instead of focusing on systemic inefficiencies in the THR programme—such as poor quality of food, irregular supply, a budget stagnant at ₹8 per child since 2018, and rampant corruption in supply contracts—the government has chosen to impose a surveillance-oriented solution. Despite Supreme Court directives calling for decentralised and community-led procurement, large corporations still dominate the supply chain.

·       Moreover, there is scant evidence for the claim that fake beneficiaries are draining resources. FRS was introduced without consulting Anganwadi Workers, further alienating the very people who are central to the system’s functioning.

 

Reclaiming Human-Centric Welfare

 

·       If the goal is truly to improve service delivery, the government must begin by making public any credible evidence of fraud and by inviting community participation in monitoring the distribution of rations. Strengthening local oversight, rather than imposing top-down surveillance, would be both more humane and more effective.

·       The focus should return to addressing the core issues—ensuring quality nutrition, streamlining logistics, increasing budgets, and empowering frontline workers with trust and autonomy.

 

Facial Recognition and the Question of Dignity

 

·       FRS is a technology designed primarily for policing and criminal surveillance. Its application in welfare services raises serious ethical concerns, especially when used on vulnerable populations such as women and children. By treating beneficiaries as potential fraudsters, the system strips them of dignity and agency. The irony is striking: San Francisco, a symbol of digital innovation, has banned government use of FRS, while India is deploying it in its most basic welfare schemes.

·       Children’s early care and nutrition should not be contingent on whether software correctly identifies their mothers. Technology, however advanced, must not come at the cost of compassion and inclusivity.

 

Conclusion: A Choice Between the Technocratic and the Humane

 

·       Kurt Vonnegut’s warning from Player Piano remains chillingly relevant. If India’s welfare architecture continues down the path of automated control, it risks becoming an engineer’s paradise that sacrifices human dignity at the altar of efficiency. The choice before us is profound: authentication or authenticity, surveillance or care, dehumanisation or dignity. As a society, we must ask whether we want welfare delivery that builds fraternity and trust, or one that fragments communities in the name of precision.

·       In the pursuit of digital governance, we must not forget the essential truth: the purpose of welfare is not just delivery, but dignified empowerment.

 



POSTED ON 18-09-2025 BY ADMIN