Draft IT (Digital Code) Rules, 2026: Key Provisions & Concerns
- The Ministry of Information and Broadcasting proposed the Draft IT (Digital Code) Rules, 2026, to regulate online obscenity and classify digital content.
About Draft IT Rules, 2026: Legal and Constitutional Basis
- Legal Basis: The draft rules are proposed under Section 87(1) of the Information Technology Act, 2000.
- Constitutional Balance: The framework follows Supreme Court directives to balance freedom of speech under Article 19(1)(a) with reasonable restrictions under Article 19(2).
- Broadcast Alignment: The draft draws heavily on the Cable Television Networks Rules, 1994, and extends similar content standards to digital platforms.
Key Provisions of the Draft Rules
- Age Classification: The draft proposes a five-tier classification system for online content, comprising U (Universal), U/A 7+, U/A 13+, U/A 16+, and A (Adult).
- Mandatory Labels: Platforms must clearly display age ratings and content warnings regarding violence or nudity before each programme begins.
- Professional Content: Exemptions apply to content meant exclusively for professional audiences, such as medical, scientific, or academic users.
- Content Restrictions: Digital platforms are barred from hosting material that attacks religions, promotes communal disharmony, or glorifies violence, crime, or substance abuse.
- Parental Safeguards: Platforms must provide parental controls for 13+ content and verified access systems for adult-only material.
- Intermediary Liability: Non-compliance with obscenity provisions attracts civil liability for Online Curated Content Providers (OCCPs).
- Obscenity Definition: Content is considered obscene if it is lascivious, prurient, corrupting to viewers’ minds, or offensive to good taste or decency.
Arguments in Favour of the Digital Content Age-Based Classification System
- Child Protection: Age-based classification and parental locks help shield minors from explicit content, similar to safeguards used in TV broadcasting and global OTT platforms like Netflix.
- Informed Choice: Mandatory content labels and warnings empower viewers to make informed decisions, improving transparency in digital consumption.
- Constitutional Compliance: By aligning with Article 19(2) restrictions, the rules operationalise Supreme Court guidance on regulating obscenity without imposing blanket censorship, e.g., Aveek Sarkar v. State of West Bengal (2014), which adopted the contemporary community standards test for obscenity.
- Platform Accountability: Intermediary liability incentivises platforms to proactively moderate harmful content, reducing the circulation of hate speech or the glorification of violence.
Arguments Against the Digital Content Age-Based Classification System
- Digital Mismatch: Applying Cable TV-era rules to OTT platforms ignores the on-demand, user-driven nature of digital content, unlike push-based television.
- Vague Standards: Subjective terms like “good taste” and “decency” risk arbitrary enforcement, as seen earlier with Section 66A of the IT Act before it was struck down in Shreya Singhal v. Union of India (2015).
- Chilling Effect: Strict liability may push platforms to over-censor content, discouraging independent creators and socially relevant storytelling.
- OTT Industry Impact: Removing the distinction between OTT and television may undermine innovation and global competitiveness of India’s digital content ecosystem.
Way Forward
- Context-Sensitive Regulation: Develop OTT-specific norms recognising on-demand viewing autonomy rather than broadcast equivalence.
- Clear Definitions: Precisely define subjective terms like obscenity and decency to reduce regulatory uncertainty.
- Co-Regulatory Model: Strengthen self-regulation backed by light-touch government oversight, similar to global best practices.
- Judicial Safeguards: Ensure appeal mechanisms and proportionality tests to prevent excessive curbs on free speech.
- Digital Literacy: Complement regulation with media literacy programmes to empower users and parents.
As Justice Chandrachud noted, “liberty survives in the ability to question”. India’s Digital Code must therefore balance child safety with creative freedom. A future-ready, co-regulatory framework can protect users without muting innovation or democratic discourse.