India's New 3-Hour Deepfake Removal Rule: Experts Urge Strict Compliance

The Centre has notified the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Amendment Rules, 2026, in a major regulatory overhaul aimed at tackling deepfakes and harmful online content and improving platform accountability. Set to come into effect on February 20, 2026, the amendments revise the 2021 rules and introduce significant changes to content moderation timelines and obligations.

Key Amendments and Compliance Obligations

The amendments drastically shorten content takedown timelines and introduce detailed compliance obligations for platforms hosting synthetically generated information (SGI). Under the amended Rule 3(1)(c), intermediaries, including social media platforms such as Facebook, Instagram, YouTube, and X, as well as other websites, must now inform users every three months, instead of once a year, of the consequences of violating terms of service. Users must be clearly warned that their access rights may be withdrawn for non-compliance and that they may face penalties under applicable laws for creating unlawful content. Certain offences require mandatory reporting under laws such as the Protection of Children from Sexual Offences (POCSO) Act, 2012, and the Bharatiya Nagarik Suraksha Sanhita (BNSS), 2023.

Drastic Reduction in Takedown Timelines

One of the most striking changes is the sharp reduction in timelines for content removal and grievance redressal. Court-ordered or law enforcement-directed takedowns must now be complied with within three hours, down from the earlier 36-hour window. Similarly, platforms must remove non-consensual nudity within two hours, down from 24 hours. Grievance redressal timelines have also been halved to seven days. Legal experts note that these compressed timeframes will require platforms to establish round-the-clock rapid-response teams, deploy enhanced automated moderation systems, and expedite coordination with law enforcement.

New Framework for Synthetically Generated Information (SGI)

In a significant move addressing the rise of deepfakes and AI-generated content, the amendments introduce a detailed definition of 'synthetically generated information' (SGI). SGI includes audio, visual, or audio-visual content that is artificially or algorithmically created or modified to appear real and indistinguishable from actual persons or events. The Rules clarify exclusions for routine editing or good-faith activities that do not materially alter the substance, such as formatting, color adjustment, or compression.

Additional Compliance Burden on SGI Platforms

Intermediaries offering SGI generation or sharing services must now inform users that they may attract punishment for directing or causing the creation or sharing of unlawful SGI. They must warn that violations could result in content removal, account suspension, identity disclosure, and mandatory reporting under POCSO or the BNSS. Platforms must also implement reasonable technical measures, including automated tools, to prevent the generation or sharing of unlawful SGI; prohibited categories include child sexual abuse material, non-consensual nudity, and false depictions of persons or events.

Mandatory Proactive Detection and Labelling

For SGI that does not fall within the prohibited categories, platforms must ensure prominent labelling. Labels must be clearly visible in visual content, prefixed to audio content, and embedded as metadata or technical provenance markers, including a unique identifier of the computer resource used. The rules explicitly prohibit the suppression, modification, or removal of such labels and metadata. Significant social media intermediaries (SSMIs) face additional obligations, such as mandatory user declarations for SGI content and verification of their accuracy using technical measures.

Expert Opinions on the Regulatory Push

Advocate Yashaswini Basu from Bangalore stated, "The new IT rules enable mandatory transparency through permanent metadata and prominent labeling, ensuring users can distinguish AI-generated content from reality. By slashing takedown timelines to just three hours, the rules enforce rapid accountability." Senior advocate Srinath Sridevan of the Madras High Court commented, "Implementation is another matter altogether. Unless the government comes up with an automated monitoring mechanism, this regulation will remain a well-intentioned but empty rule."

Advocate Vikash Kumar Bairagi from New Delhi expressed concerns, "The 2026 amendments respond with regulatory overreach rather than calibrated restraint, risking incentivising intermediaries to err on the side of censorship." Advocate Ankit Konwar highlighted challenges, "Key challenges may arise in uniformly identifying synthetic content, balancing compliance with user privacy and free speech concerns, and ensuring technological feasibility across platforms."

Advocate Suhael Buttan emphasized, "Platforms must ensure strong accountability, transparency, and technical safeguards around AI-generated content to maintain compliance." Advocate Huzefa Tavawalla noted, "The requirement to implement provenance mechanisms within a 10-day period appears aggressive given the technical complexity involved." Advocate Arya Tripathy pointed out, "The SGI Rules blur the contours of safe harbor protection, exposing intermediaries to actual liability for unlawful content."

Advocate Rashmi Deshpande added, "The requirement to embed permanent metadata improves traceability and can deter impersonation or fake political content, making platforms more accountable." Advocate Ankit Sahni concluded, "For AI users, this could mean increased friction and mandatory disclosures, while for AI platforms, compliance architecture becomes central to retaining safe harbor protection."

With the rules set to take effect soon, intermediaries have less than ten days to recalibrate their compliance mechanisms. Industry stakeholders are expected to seek clarifications on implementation logistics, particularly the feasibility of the three-hour takedown mandate and the permanent metadata requirements.