Singapore Launches Tougher Internet Safety Law to Protect Users from Online Harms
Singapore, October 15, 2025 — Singapore is set to strengthen its fight against harmful digital content with a new online safety commission that will have the authority to order social media platforms to remove or block access to damaging material.
The move comes as part of a new online safety bill tabled in parliament on Wednesday, aiming to safeguard citizens from online harassment, child exploitation, and other forms of digital abuse.
Commission to Have Broad Powers Over Online Platforms
Under the new law, the Online Safety Commission will be established by the first half of 2026 and will have sweeping powers to tackle a range of online harms.
The commission will be empowered to:
- Order social media platforms to take down or block harmful posts.
- Direct internet service providers to restrict access to harmful websites, groups, or entire platforms.
- Ban perpetrators of abuse or harassment from using certain online services.
- Grant victims a right to reply, allowing them to respond publicly to abusive content.
The law will initially target specific categories of harm — including cyberbullying, doxxing, stalking, sexual abuse imagery, and child sexual abuse material — before expanding to include other offenses such as incitement of enmity and non-consensual sharing of private information.
Findings Prompted Government Action
The new legislation was prompted by findings from the Infocomm Media Development Authority (IMDA) earlier this year. A February 2025 study revealed that more than half of legitimate user complaints about harmful online content were not promptly addressed by major platforms, including global social media giants.
“More often than not, platforms fail to take action to remove genuinely harmful content reported to them by victims,” said Josephine Teo, Singapore’s Minister for Digital Development and Information, during the bill’s introduction.
She emphasized that the commission would serve as an independent authority to ensure faster and fairer enforcement of online safety rules, reducing the burden on victims of digital abuse.
Building on the Online Criminal Harms Act
The new online safety bill follows in the footsteps of Singapore’s Online Criminal Harms Act (OCHA), which came into effect in February 2024. That legislation gave the government powers to act against criminally harmful online content, including scams and extremist materials.
Earlier this year, authorities used OCHA to issue the first-ever directive against Meta, ordering the company to improve its measures for detecting and removing impersonation scams on Facebook.
In September 2025, Singapore’s Ministry of Home Affairs warned Meta it could face fines of up to S$1 million ($771,664) — and S$100,000 per day thereafter — if it failed to comply with the new safeguards, such as introducing facial recognition systems to detect scam accounts.
The ministry has not yet confirmed whether Meta met the compliance requirements before the deadline.
Gradual Implementation and Public Consultation
The government has stated that the rollout of the new online safety regime will occur in stages, allowing for public consultation and industry input.
During parliamentary debate, lawmakers are expected to discuss the scope of enforcement, platform accountability, and data protection implications of the new powers.
The Ministry of Digital Development and Information first proposed creating a dedicated Online Safety Commission during budget debates in March 2025, following increased public concern over rising online abuse cases in Singapore.
Expert Views: Balancing Safety and Free Expression
Analysts say the new commission could make Singapore one of the most proactive nations in Asia in regulating online spaces, but it will also raise questions about freedom of expression and platform liability.
Digital policy expert Dr. Melissa Koh from the Lee Kuan Yew School of Public Policy noted:
“This legislation reflects a growing global trend of governments demanding accountability from tech platforms. Singapore’s approach aims for a balance between user protection and maintaining an open digital environment.”
Critics, however, warn that broad powers to block content could risk overreach if not accompanied by transparent oversight mechanisms.
Global Context: Rising Trend in Online Safety Regulations
Singapore joins other countries, including Australia, the UK, and the EU, in introducing stronger online safety laws that compel tech companies to take responsibility for harmful content.
The UK’s Online Safety Act (2023) and the EU’s Digital Services Act (DSA) are similar in scope, requiring major platforms to remove illegal or harmful content swiftly or face large fines.
Singapore’s new commission is expected to cooperate with international regulators and share best practices to address cross-border digital harms.
Key Takeaways
- Bill tabled in parliament: October 15, 2025
- Commission launch: First half of 2026
- Main powers: Order takedowns or blocking of harmful content, direct ISPs to restrict access, ban abusers from online services, grant victims a right to reply
- Targeted harms: Cyberbullying, doxxing, stalking, sexual abuse imagery, child sexual abuse material
- Penalty example: Under the earlier Online Criminal Harms Act, Meta faces fines of up to S$1 million, plus S$100,000 per day, for non-compliance

