Govt Tightens Digital Rules on AI
The Centre has significantly tightened India’s digital governance framework by formally bringing synthetically generated information (SGI) — including AI-generated audio, video, and visual content — under the ambit of the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021, through amendments notified on February 10, 2026.
Key Definitions and Compliance Obligations
The Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Amendment Rules, 2026, which will come into force from February 20, 2026, introduce detailed definitions, disclosure requirements, and compliance obligations for digital intermediaries and social media platforms. This comes amid growing concerns over deepfakes, misinformation, and synthetic media misuse.
For the first time, the rules define “synthetically generated information” as audio, visual, or audio-visual content that is artificially or algorithmically created, modified, or altered using computer resources in a manner that appears real or authentic. Such content may depict a person or an event in a manner indistinguishable from a real person or a real-world event.
However, the government has carved out explicit exemptions. Routine or good-faith editing, formatting, color correction, noise reduction, transcription, compression, translation, or accessibility-related enhancements will not be treated as SGI, provided such changes do not materially distort the original meaning or context.
Mandatory Labelling and Metadata Requirements
A key compliance requirement under the amended rules is the mandatory labelling of synthetically generated content. Intermediaries that enable or facilitate the creation or dissemination of SGI must ensure that such content is clearly, prominently, and unambiguously labelled so that users can immediately identify it as synthetic.
Additionally, platforms are required to embed persistent metadata or other technical provenance mechanisms, including unique identifiers, to enable traceability of SGI to the intermediary’s computer resource, to the extent technically feasible. Importantly, intermediaries are prohibited from enabling the removal of, or tampering with, such labels or metadata.
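The rules describe the goal (persistent, tamper-evident provenance) rather than a specific technical mechanism. As a purely illustrative sketch, not a prescribed implementation, a provenance record of the kind the rules contemplate could pair a unique identifier with a content hash so that later tampering is detectable. All function and field names below are hypothetical:

```python
import hashlib
import json
import uuid
from datetime import datetime, timezone

def attach_provenance(content_bytes: bytes, platform_id: str) -> dict:
    """Build a hypothetical provenance record for a piece of
    synthetically generated information (SGI).

    The record carries an explicit SGI label, a unique identifier
    for traceability to the originating platform, and a content
    hash so that any later modification can be detected.
    """
    return {
        "sgi_label": "synthetically-generated",        # mandatory disclosure label
        "provenance_id": str(uuid.uuid4()),            # unique identifier
        "content_sha256": hashlib.sha256(content_bytes).hexdigest(),
        "origin_platform": platform_id,                # the intermediary's resource
        "issued_at": datetime.now(timezone.utc).isoformat(),
    }

def verify_provenance(content_bytes: bytes, record: dict) -> bool:
    """Tamper check: the stored hash must match the current content."""
    return record["content_sha256"] == hashlib.sha256(content_bytes).hexdigest()

record = attach_provenance(b"synthetic-clip-bytes", "example-platform")
print(json.dumps(record, indent=2))
```

In practice, industry provenance standards such as C2PA embed signed manifests inside media files themselves; the dictionary above only illustrates the traceability idea, not a production mechanism.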
Enhanced Obligations for Significant Social Media Platforms
Significant social media intermediaries face enhanced obligations. Before allowing upload or publication, such platforms must obtain user declarations stating whether the content is synthetically generated. They must also deploy reasonable and proportionate technical measures, including automated tools, to verify the accuracy of such declarations.
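The rules leave the verification mechanism to the platform. One hedged reading of "reasonable and proportionate technical measures" is a simple decision rule that cross-checks the user's declaration against an automated detector's confidence score; the function, its threshold, and the action labels below are all hypothetical illustrations:

```python
def classify_upload(user_declared_sgi: bool, detector_score: float,
                    threshold: float = 0.8) -> str:
    """Hypothetical pre-publication decision rule combining a user's
    declaration with an automated detector's confidence (0.0 to 1.0)
    that the content is synthetic.
    """
    if user_declared_sgi:
        # User declared the content synthetic: publish with the SGI label.
        return "publish-with-sgi-label"
    if detector_score >= threshold:
        # Declaration says "not synthetic" but the detector disagrees:
        # hold the mismatch for review rather than publishing unlabelled.
        return "flag-for-review"
    return "publish-unlabelled"
```

A real deployment would hinge on the detector's accuracy and on how the platform handles borderline scores; this sketch only shows where the declaration and the automated check interact.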
Where content is identified as SGI, platforms must ensure it is displayed along with an appropriate disclosure or notice prominently indicating its synthetic nature. Failure to exercise due diligence in this regard could expose platforms to liability under the amended framework.
Stricter Timelines and Takedown Procedures
The amendments also compress several compliance timelines, signaling a tougher stance on harmful online content. The time limit for intermediaries to act on lawful orders or complaints has been reduced from 36 hours to 3 hours in specific circumstances, while other response timelines have been cut from 15 days to 7 days and from 24 hours to 12 hours, depending on the nature of the violation.
Synthetic Content and Unlawful Acts
The rules clarify that any reference to “information” used to commit an unlawful act, including under user due diligence obligations, explicitly includes synthetically generated information. This brings AI-generated content squarely within enforcement mechanisms related to offences under laws such as the Bharatiya Nyaya Sanhita, the Bharatiya Nagarik Suraksha Sanhita, and the Protection of Children from Sexual Offences Act.
Platforms are required to prevent the use of their services for creating or disseminating SGI that involves child sexual abuse material, indecent or obscene content, false electronic records, impersonation, or content related to explosives, weapons, or ammunition.
Clarification on Safe Harbour Protections
At the same time, the government has sought to reassure intermediaries regarding safe harbour. The notification clarifies that removal or disabling of access to SGI, including through automated tools and technical measures, will not amount to a violation of safe harbour conditions under Section 79 of the IT Act, provided such actions are taken in compliance with the rules.
Policy Signal on AI Governance
The amendments mark one of India’s most detailed regulatory interventions in the rapidly evolving AI and synthetic media ecosystem. By combining disclosure mandates, traceability requirements, and sharper enforcement timelines, the government appears to be aiming for a balance between innovation and harm prevention while placing clear accountability on platforms hosting or enabling synthetic content.