MeitY Extends IT Rules Feedback Deadline to May 7, Proposes Stricter Labelling for Synthetic Content
By Elena · April 21, 2026

The Ministry of Electronics and Information Technology (MeitY) has extended the deadline for stakeholder feedback on its proposed amendments to the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021, allowing responses until May 7, 2026. The extension is part of a broader effort to refine regulatory measures aimed at improving transparency, accountability, and oversight in India’s rapidly evolving digital ecosystem.

The updated draft, referred to as the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Second Amendment Rules, 2026, introduces several notable changes. One of the most significant revisions focuses on synthetically generated information, commonly known as SGI. Under the revised provisions, any content that is artificially created or altered using computer-based tools to resemble real individuals or events must carry a label that remains continuously visible throughout its display. This move is designed to ensure that users can clearly distinguish between authentic and manipulated content, especially in the context of increasing concerns around deepfakes and misleading digital media.

SGI has been defined broadly to include audio or visual content generated or modified in such a way that it appears genuine and could potentially mislead viewers. By mandating persistent labelling, the government aims to enhance user awareness and reduce the risk of misinformation spreading through realistic but fabricated media formats.

In addition to the SGI labelling requirement, the proposed amendments introduce a structured compliance framework for intermediaries. A new sub-rule under Rule 3 establishes that intermediaries must adhere to clarifications, advisories, directions, and guidelines issued by the ministry. These directives must be formally documented, clearly outline their legal basis, and specify their scope and applicability. The requirement ensures that any instructions provided to digital platforms are transparent, legally grounded, and consistent with existing laws.

This framework is expected to improve coordination between the government and intermediaries by reducing ambiguity in implementation. It also places a greater responsibility on platforms to actively comply with regulatory instructions, thereby strengthening enforcement mechanisms across the digital space.

Another key aspect of the amendments is the expansion of rules related to oversight and content regulation. Rules 14 to 16, which fall under the oversight mechanism chapter, are proposed to be extended to include not only intermediaries but also user-generated content, particularly in the domain of news and current affairs. This means that content created and shared by individuals, even if they are not formal publishers, could be subject to the same scrutiny and regulatory processes.

These rules outline how authorities can intervene in cases involving problematic or disputed digital content. For instance, Rule 16 provides for emergency powers that allow the government to block content without prior notice or hearing in urgent situations. However, such actions must be reviewed by a designated committee within 48 hours, ensuring a level of accountability and oversight in emergency interventions.

The amendments also aim to make the content regulation process more participatory. By involving both users and intermediaries in review mechanisms, the draft would give individuals whose content is flagged or blocked an opportunity to present their case. This approach seeks to balance regulatory enforcement with fairness and transparency, addressing concerns about unilateral decision-making in content moderation.

Overall, the proposed changes reflect a growing emphasis on responsible digital governance in India. With the increasing influence of artificial intelligence and content generation tools, the need for clear labelling and stronger compliance frameworks has become more pressing. The extension of the feedback deadline indicates the government’s willingness to engage with stakeholders and refine the rules based on broader input before final implementation.

These amendments are likely to have far-reaching implications for social media platforms, digital publishers, and users alike, as they redefine the boundaries of accountability, transparency, and content regulation in the digital age.