Singapore Online Safety Act 2026: Complete Guide for Businesses and Users
Singapore continues to set the pace in Asia-Pacific for digital regulation, and the Online Safety Act 2026 represents the next major evolution in how the city-state governs online content, platform accountability, and user protection. Building on the original Online Safety (Miscellaneous Amendments) Act 2022 and the Code of Practice for Online Safety, the 2026 framework expands the regulatory perimeter, introduces new duties for service providers, and tightens enforcement powers held by the Infocomm Media Development Authority (IMDA).
This complete guide breaks down what the Singapore Online Safety Act 2026 means for businesses operating online services, marketers running campaigns into Singapore, and everyday users who want to understand their rights. Whether you run a social media platform, an e-commerce site, a SaaS product, or simply share links online, you need to know how this law applies.
What Is the Singapore Online Safety Act 2026?
The Singapore Online Safety Act 2026 is an updated regulatory framework administered by the IMDA that requires online communication services and electronic services with significant Singapore reach to proactively prevent, detect, and remove harmful content. It strengthens the original framework — passed in 2022 and in force since February 2023 — by adding obligations around scams, deepfakes, child safety, and algorithmic transparency.
At its core, the Act creates legally binding duties for designated service providers. Failure to comply can result in directions to disable access, financial penalties of up to S$1 million per breach, and in serious cases, blocking orders that prevent Singapore users from accessing the service entirely.
Key objectives of the 2026 update
- Reduce exposure of Singapore users — especially minors — to egregious online content.
- Combat the rapid rise of AI-generated scams, deepfakes, and synthetic media.
- Hold platforms accountable for systemic risk, not just individual posts.
- Provide victims of online harm with faster redress mechanisms.
- Align Singapore with global standards such as the EU Digital Services Act (DSA) and the UK Online Safety Act.
Who Does the Act Apply To?
The Act applies to two broad categories of services that are accessible in Singapore, regardless of where the provider is headquartered. Extraterritorial reach is a defining feature — overseas providers cannot avoid obligations simply by being based outside Singapore.
1. Online Communication Services (OCS)
These are services that allow users to communicate or share content with the public, including:
- Social media platforms (Facebook, Instagram, TikTok, X, LinkedIn)
- Messaging services with public broadcast features (Telegram channels, WhatsApp Communities)
- Video-sharing platforms and livestreaming services
- Online forums and discussion boards
- App stores distributing user-generated content
2. Electronic Services and Designated Internet Access Service Providers
These include search engines, cloud services hosting user content, and ISPs that may be directed to block access to non-compliant services.
Threshold for Regulated Online Communication Services (ROCS)
Services with significant reach in Singapore — typically defined by monthly active user thresholds — are designated as Regulated Online Communication Services (ROCS) and face heightened obligations including annual safety reports, risk assessments, and a designated Singapore representative.
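As a rough screen, the designation logic can be sketched in a few lines. Note that the population figure is approximate and the 10% reach ratio is purely an illustrative assumption — the actual thresholds and methodology are set by IMDA, not published as a simple formula.

```python
# Illustrative ROCS screening sketch. SIGNIFICANT_REACH_RATIO is an
# assumed placeholder, NOT an official IMDA threshold.
SINGAPORE_POPULATION = 5_900_000      # approximate resident population
SIGNIFICANT_REACH_RATIO = 0.10        # assumed 10% monthly-active-user bar

def likely_rocs(monthly_active_users_sg: int) -> bool:
    """Rough check for whether a service may face ROCS-level duties."""
    return monthly_active_users_sg >= SINGAPORE_POPULATION * SIGNIFICANT_REACH_RATIO

print(likely_rocs(1_200_000))  # a large social platform: True
print(likely_rocs(30_000))     # a niche forum: False
```

If a screen like this comes back positive, the prudent next step is formal legal advice rather than self-assessment, since designation carries the annual reporting and local-representative duties described above.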
Categories of Harmful Content Covered
The 2026 Act expands the categories of content that providers must address. Each category carries specific takedown timelines and proactive duties.
| Content Category | Examples | Required Action Timeframe |
|---|---|---|
| Child sexual exploitation material (CSEM) | Imagery, grooming, livestreamed abuse | Immediate (within hours) |
| Terrorism and violent extremism | Recruitment, propaganda, attack glorification | Immediate |
| Suicide and self-harm content | Methods, encouragement targeting minors | 24 hours |
| Cyberbullying and harassment | Doxxing, targeted abuse, image-based abuse | 24–48 hours |
| Scams and fraud (new in 2026) | Investment scams, phishing, impersonation | 24 hours |
| Deepfakes and synthetic media (new) | Non-consensual intimate deepfakes, election deepfakes | 24 hours |
| Content endangering public health | Dangerous misinformation during emergencies | As directed by IMDA |
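For trust and safety teams, the timeframes in the table above translate naturally into deadline tracking. The sketch below models them with Python's standard `datetime` module; the category keys are hypothetical identifiers, and the "immediate" categories are arbitrarily modelled as two hours, since the Act itself does not fix a single number for them.

```python
from datetime import datetime, timedelta, timezone

# Hour values mirror the table above; "immediate" is modelled here as
# 2 hours, which is an assumption for this sketch, not a statutory figure.
TAKEDOWN_HOURS = {
    "csem": 2,          # immediate (within hours)
    "terrorism": 2,     # immediate
    "self_harm": 24,
    "harassment": 48,   # upper end of the 24-48 hour band
    "scam": 24,
    "deepfake": 24,
}

def takedown_deadline(category: str, notified_at: datetime) -> datetime:
    """Latest time a platform may act on a notice for this category."""
    return notified_at + timedelta(hours=TAKEDOWN_HOURS[category])

notice = datetime(2026, 7, 1, 9, 0, tzinfo=timezone.utc)
print(takedown_deadline("scam", notice))  # 2026-07-02 09:00:00+00:00
```

A production queue would also track the "as directed by IMDA" category, which has no fixed deadline and must be taken from the Direction itself.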
Core Obligations for Service Providers
Designated providers under the 2026 Act must meet a layered set of duties. These are organized into systemic obligations (how the service is designed and governed) and content obligations (how specific harmful items are handled).
Systemic duties
- Risk assessments: Conduct and document annual risk assessments covering all relevant harm categories.
- User safety by design: Implement default privacy settings for minors, age assurance measures, and friction tools against scams.
- Transparency reporting: Publish annual reports detailing content removed, enforcement metrics, and effectiveness of safety measures.
- Algorithm accountability: Provide IMDA with information on recommender systems and offer users non-personalized feed options.
- User reporting tools: Provide accessible, multilingual reporting and appeals mechanisms.
Content-specific duties
- Proactively detect and remove egregious content using a mix of human review and automated tools.
- Comply with Directions to Disable Access issued by IMDA within stipulated timeframes.
- Preserve evidence and cooperate with Singapore Police Force in criminal investigations.
- Notify users when their content is restricted and offer an appeals process.
New Provisions Introduced in 2026
The 2026 update is not just a refresh — it introduces several substantive new provisions that reflect the changing threat landscape.
1. Scam-specific duty of care
Singapore lost over S$1.1 billion to scams in 2024 alone. The 2026 Act now imposes a specific duty on platforms to detect and disrupt scam content, including paid advertisements impersonating banks, government agencies, and public figures. Platforms that profit from scam ads can be held liable.
2. Deepfake and synthetic media rules
Non-consensual intimate deepfakes, election-related deepfakes, and AI-generated impersonation of Singapore officials are now explicitly prohibited. Providers must label AI-generated content where feasible and remove harmful synthetic media within 24 hours of notification.
3. Enhanced child safety code
A dedicated Children's Online Safety Code requires providers serving minors to implement age assurance, restrict targeted advertising to under-18s, and apply default high-privacy settings.
4. App store accountability
App stores must verify developer identities, screen for scam apps, and remove apps flagged by IMDA within prescribed timelines.
5. End-user empowerment
Users now have a statutory right to file complaints directly with IMDA when platforms fail to act, and a new ombuds-style review pathway has been introduced for contested takedowns.
Penalties and Enforcement
The IMDA's enforcement toolkit under the 2026 Act is substantially expanded. Non-compliance can lead to escalating consequences:
| Enforcement Action | Trigger | Maximum Penalty |
|---|---|---|
| Direction to Disable Access | Specific harmful content identified | Mandatory compliance; fines for delay |
| Financial penalty | Breach of Code or Direction | S$1 million per breach |
| Daily penalty | Continuing non-compliance | S$100,000 per day |
| Access blocking order | Persistent non-compliance | ISPs block service in Singapore |
| Criminal liability (officers) | Willful obstruction | Fines and imprisonment |
What This Means for Businesses
Even businesses that are not large platforms feel the ripple effects of the Act. Marketers, SMEs, and content creators must adapt their workflows.
For digital marketers and advertisers
- Vet ad creatives carefully — impersonation, misleading claims, and unverified financial promotions risk both platform takedowns and IMDA action.
- Use trusted link infrastructure. Shortened links should come from reputable providers that maintain anti-abuse systems. Tools like Lunyb offer privacy-respecting URL shortening with built-in malicious link detection, which helps marketers avoid having their domains flagged.
- Maintain documentation of advertiser identity and creative approval workflows.
For SaaS and platform operators
- Determine whether your service falls within OCS or ROCS thresholds.
- Appoint a Singapore representative if required.
- Build out trust and safety policies, reporting tooling, and escalation pathways with IMDA.
For content creators and influencers
- Disclose AI-generated or synthetic content clearly.
- Avoid promoting unverified investment opportunities or financial schemes — these are a primary scam enforcement focus.
- Respond promptly to platform notices; appeals are available but timelines are short.
Compliance Checklist for Platforms
Use this practical checklist to benchmark your readiness for the 2026 framework:
- Map your service to OCS, ROCS, or other regulated categories.
- Conduct a documented harm risk assessment across all seven content categories.
- Update Terms of Service and Community Guidelines to reflect Singapore-specific obligations.
- Implement an in-product reporting flow accessible from every piece of user content.
- Establish a 24/7 escalation channel for IMDA Directions.
- Deploy age assurance for services likely to be accessed by minors.
- Configure default high-privacy settings for under-18 accounts.
- Audit recommender systems for amplification of scams and harmful content.
- Publish your first annual transparency report.
- Train trust and safety teams on Singapore-specific definitions and timelines.
How the Act Compares Globally
Singapore's framework borrows ideas from the EU and UK but is distinctive in its speed of enforcement and scam focus.
| Feature | Singapore OSA 2026 | EU DSA | UK Online Safety Act |
|---|---|---|---|
| Extraterritorial reach | Yes | Yes | Yes |
| Max fine | S$1M per breach + daily | 6% global turnover | 10% global turnover |
| Scam-specific duty | Yes (explicit) | Limited | Yes (fraud) |
| Deepfake provisions | Yes (explicit, 2026) | Partial | Yes (intimate) |
| Access blocking | Yes (ISP level) | Limited | Yes |
| Children's code | Dedicated | Article 28 | Children's codes |
Practical Tips for Singapore Users
The Act is not just about platform duties — it strengthens user rights too. Here is how to make the most of them:
- Report harmful content: Use in-app reporting first. If the platform fails to act within the required timeframe, escalate to IMDA via the Online Safety portal.
- Verify links before clicking: Many scams arrive via shortened URLs. Use reputable shorteners and link-preview tools, and read our 2026 buyer's guide to URL shorteners for safer choices.
- Protect minors: Activate parental controls and review default privacy settings on apps used by children.
- Be skeptical of deepfakes: Investment ads featuring local celebrities or government officials are almost always scams.
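The "verify links before clicking" advice can be partially automated. The sketch below shows the kind of hostname heuristic an anti-abuse filter might apply; the patterns are invented examples of bank and government lookalikes, and real systems rely on much richer signals (reputation feeds, redirect tracing, machine-learning scoring) than a regex list.

```python
import re
from urllib.parse import urlparse

# Hypothetical patterns mimicking Singapore bank/agency impersonation.
# Purely illustrative — not a real blocklist.
SUSPICIOUS_PATTERNS = [
    r"dbs.*(login|verify|secure)",   # bank-impersonation lookalikes
    r"gov\.sg\.",                    # "gov.sg." abused as a subdomain prefix
    r"singpass.*\.(xyz|top|icu)$",   # SingPass lookalikes on cheap TLDs
]

def looks_suspicious(url: str) -> bool:
    """Return True if the URL's hostname matches a known scam pattern."""
    host = (urlparse(url).hostname or "").lower()
    return any(re.search(pattern, host) for pattern in SUSPICIOUS_PATTERNS)

print(looks_suspicious("https://dbs-secure-login.example.xyz/update"))  # True
print(looks_suspicious("https://www.dbs.com.sg/personal"))              # False
```

Heuristics like this only catch lookalike hostnames; a shortened URL hides its destination entirely, which is why expanding or previewing short links before clicking remains the safer habit.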
For businesses choosing tools that align with Singapore's regulatory direction, picking privacy-respecting infrastructure matters. Our honest review of Lunyb and our comparison of Rebrandly in 2026 offer detailed looks at link platforms and how they handle abuse, privacy, and analytics — all increasingly relevant under regimes like the OSA.
Timeline and Implementation
The 2026 Act follows a phased rollout to give providers time to adapt:
- Q1 2026: Act passes; designation of new ROCS begins.
- Q2 2026: Updated Code of Practice for Online Safety published.
- Q3 2026: Mandatory compliance with scam, deepfake, and child safety provisions.
- Q4 2026: First transparency reports due from designated providers.
- 2027: Full enforcement, including financial penalties for systemic non-compliance.
Frequently Asked Questions
Does the Singapore Online Safety Act 2026 apply to overseas businesses?
Yes. The Act has extraterritorial reach. Any online service accessible to Singapore users, regardless of where the provider is based, can be subject to obligations. Larger services with significant Singapore reach face heightened duties as Regulated Online Communication Services.
What are the penalties for non-compliance?
Financial penalties of up to S$1 million per breach plus S$100,000 per day for continuing non-compliance. IMDA can also issue access blocking orders directing Singapore ISPs to block the service entirely, and individual officers can face criminal liability for willful obstruction.
How is the 2026 Act different from the 2023 framework?
The 2026 update introduces explicit duties on scams and deepfakes, a dedicated Children's Online Safety Code, app store accountability, algorithmic transparency obligations, and stronger user redress pathways including a complaint route directly to IMDA.
Do small businesses or personal blogs need to comply?
Most small businesses and personal blogs do not meet the thresholds for designation as ROCS. However, if your platform allows user-generated content or messaging, you should still implement basic moderation and reporting tools, and act promptly on takedown notices to avoid being targeted by IMDA Directions.
How does the Act affect online advertising in Singapore?
Platforms hosting paid ads have a heightened duty to prevent scam advertising, including impersonation of banks, government agencies, and public figures. Advertisers should expect stricter identity verification, ad creative review, and faster takedowns of non-compliant campaigns.
Conclusion
The Singapore Online Safety Act 2026 represents one of the most comprehensive online safety regimes in Asia-Pacific. It tightens duties around scams and deepfakes, empowers users with new redress channels, and signals that Singapore expects platforms — wherever they are based — to take responsibility for the harms their services enable.
For businesses, the message is clear: invest in trust and safety now, choose privacy-respecting infrastructure, and treat compliance as a competitive advantage rather than a checkbox. For users, the Act offers real new protections, but vigilance — particularly around scams and synthetic media — remains essential.
Protect your links with Lunyb
Create secure, trackable short links and QR codes in seconds.
Get Started Free
Related Articles
Singapore PDPA vs GDPR: Key Differences Every Business Must Know
Singapore's PDPA and the EU's GDPR both protect personal data, but they differ in scope, consent rules, penalties, and individual rights. This guide breaks down the key differences for businesses operating in both jurisdictions.
ePrivacy Regulations Ireland: Latest Updates and Compliance Guide 2026
A complete 2026 guide to ePrivacy regulations in Ireland, including the latest DPC enforcement trends, cookie consent rules, direct marketing requirements, and a practical compliance checklist for Irish organisations.
GDPR in Ireland: Your Privacy Rights Explained (2026 Guide)
A complete guide to your GDPR privacy rights in Ireland, including how to make Subject Access Requests, file complaints with the Data Protection Commission, and what to do when companies refuse. Learn the eight core data subject rights and how to enforce them.
Singapore PDPA: Your Personal Data Protection Rights Explained
Singapore's PDPA gives you enforceable rights over how organisations handle your personal data. This guide explains each right, how to exercise it, and how to file a complaint with the PDPC if your data is mishandled.