Act Against Obscene and Unlawful Content or Face Legal Action: Centre Warns Social Media and Online Platforms

The Centre has issued a stern warning to social media companies and other online platforms, directing them to take prompt and effective action against obscene, vulgar, pornographic, paedophilic, and other unlawful content hosted on their platforms, or face legal consequences. The warning comes amid the government’s concerns over what it described as lax enforcement and inconsistent compliance by intermediaries with existing digital laws.

In an advisory issued by the Ministry of Electronics and Information Technology (Meity) on December 29, 2025, the government reminded all intermediaries, particularly social media platforms, of their statutory obligations under the Information Technology (IT) Act and the IT Rules, 2021. The ministry cautioned that failure to adhere to these obligations could result in serious consequences, including prosecution under multiple criminal laws.

According to the advisory, Meity has observed that several platforms are not acting strictly or consistently against content that is obscene, vulgar, inappropriate, or otherwise unlawful. The ministry flagged this as a serious concern, especially given the wide reach and influence of social media platforms and their impact on children and vulnerable users.

Statutory Obligations Under the IT Act

The advisory specifically reiterated the provisions of Section 79 of the IT Act, which provides intermediaries with conditional protection from liability for third-party content hosted on their platforms. This exemption, often referred to as “safe harbour,” is available only if intermediaries observe due diligence and comply with the requirements laid down in the law and related rules.

“Intermediaries, including social media intermediaries, are reminded that they are statutorily obligated under Section 79 of the IT Act to observe due diligence as a condition for availing exemption from liability in respect of third-party information uploaded, published, hosted, shared or transmitted on or through their platforms,” the ministry stated.

Meity made it clear that the safe harbour protection is not automatic and can be withdrawn if platforms fail to meet their due diligence obligations, particularly in cases involving illegal or harmful content.

Warning of Legal Consequences

The ministry warned that non-compliance with the provisions of the IT Act and the IT Rules, 2021, would invite legal action. Such consequences could include prosecution under the IT Act itself, the Bharatiya Nyaya Sanhita (BNS), and other relevant criminal laws.

“Non-compliance with the provisions of the IT Act and/or the IT Rules, 2021, will lead to consequences, including prosecution against the intermediaries, platforms and their users,” the advisory noted.

This warning underscores the government’s intent to hold not only individual users but also platform operators accountable if unlawful content is allowed to proliferate due to negligence or weak enforcement mechanisms.

Focus on Obscene and Harmful Content

According to the advisory, as reported by PTI, Meity has taken note of a growing volume of content on social media that can be categorised as obscene, vulgar, pornographic, paedophilic, or harmful to children. The advisory emphasised that such content is explicitly prohibited under Indian law and that platforms are required to proactively prevent its circulation.

The ministry said there is a clear need for greater consistency in how intermediaries fulfil their due diligence obligations, particularly when it comes to identifying, reporting, and removing content that is obscene or otherwise unlawful.

Meity reminded platforms that under the IT Act and the IT Rules, 2021, they are required to make reasonable efforts to ensure that users of their services do not host, display, upload, modify, publish, transmit, store, update, or share prohibited content.

This includes content that is obscene or pornographic, involves child sexual abuse or paedophilia, is harmful to children, or violates any other law currently in force.

Call to Review Compliance Frameworks

The advisory also urged social media companies and other intermediaries to immediately review and strengthen their internal compliance frameworks. This includes updating content moderation policies, improving reporting and grievance redressal mechanisms, and ensuring swift takedown of illegal material once it is identified or flagged.

Meity indicated that platforms must not treat these obligations as a mere formality, but as a core responsibility that comes with operating in India’s digital ecosystem.

The warning signals a tougher regulatory posture by the government as concerns grow over the misuse of social media platforms for spreading harmful and illegal content. It also reflects the Centre’s broader push to enforce accountability among tech companies operating in India, particularly in relation to user safety and child protection.

Heightened Scrutiny Ahead

With this advisory, the government has put online platforms on notice that failure to act decisively against prohibited content could result in the loss of legal protections and exposure to criminal liability. The move is expected to lead to heightened scrutiny of content moderation practices and stricter enforcement of existing digital laws.

As social media continues to play a central role in public discourse and everyday communication, the Centre has made it clear that platforms must balance freedom of expression with their legal and social responsibility to prevent harm and uphold the law.
