KUALA LUMPUR: Malaysia’s Online Safety Act 2025 (ONSA), in force since Jan 1, marks a bold step in digital regulation, forcing platforms to tighten controls on user content with child safety at its core.
Passed in December 2024 and gazetted in May 2025, the law works alongside existing rules such as the Communications and Multimedia Act 1998 but raises the stakes.
ONSA makes platforms directly accountable for harmful content circulating on their networks.
The act is intended to address gaps in how platforms currently manage harmful content, where enforcement decisions are often guided by internal policies.
When flagged material is not removed promptly, even a relatively small number of unresolved cases can expose large numbers of users to online harm.
RISING ONLINE HARM
The legislation follows a sustained increase in reported online harm in Malaysia.
Police recorded RM2.7 billion in reported losses from online scams between January and November 2025, according to Bernama.
In October, police announced the dismantling of a criminal network linked to child sexual abuse material (CSAM), resulting in 31 arrests and the seizure of more than 880,000 digital files.
Since 2022, authorities have also taken down 38,470 items of cyberbullying and online harassment content through regulatory or enforcement action.
Despite ongoing moderation, harmful content continues to surface.
Data from the Malaysian Communications and Multimedia Commission shows that major platforms removed about 92 per cent of the 697,061 posts flagged as harmful between January 2024 and November 2025, leaving 58,104 posts still accessible.
Even a shortfall of one percentage point would translate into roughly 7,000 harmful posts remaining online.
The growing use of automated tools and AI-generated content, including deepfakes, has further complicated detection and enforcement efforts.
SCOPE OF THE ACT
ONSA applies to application service providers and content application service providers, including both local and foreign platforms operating in or targeting the Malaysian market.
The act covers a defined set of harmful content categories. These include CSAM, online scams and financial fraud, obscene or pornographic material, harassment and abusive communications, content linked to violence or terrorism, material encouraging self-harm among children, content promoting hostility or disrupting public order, and content related to dangerous drugs.
Rather than regulating individual posts, ONSA focuses on platform-level risk management, including content distribution and recommendation systems.
The framework emphasises governance and operational controls, rather than imposing criminal liability on users for lawful expression.
Under ONSA, platforms are required to take steps to identify and manage risks arising from harmful content on their services.
This includes reducing users’ exposure to “priority harms” (the most severe forms of online harm), ensuring certain categories of content are made inaccessible, and providing reporting and user support mechanisms.
Platforms must also prepare and submit an online safety plan outlining how they address these risks.
Enforcement measures are available where platforms fail to meet their obligations, although the act sets out procedural requirements governing how directions are issued and reviewed.
The act also requires platforms to adopt measures specifically intended to limit children’s exposure to harmful content and interactions, including age-related safeguards and restrictions on access to certain material.
Overall, these provisions will affect how platforms design default settings, content discovery tools and interaction features for younger users.
LIMITS AND SAFEGUARDS
While ONSA introduces new obligations for platforms in managing online risks, it does not extend to private one-to-one messaging or authorise the general monitoring of users.
The act also does not create new offences relating to lawful speech or political expression.
Safeguards within the framework include notice requirements before enforcement action, opportunities for representations, public records of regulatory directions and access to appeal mechanisms and judicial review.
ONSA introduces a formal regulatory structure for online safety, but does not replace existing enforcement, education or prevention efforts. Issues such as digital literacy, parental involvement and online behaviour norms remain outside the scope of the legislation.
With the act now in force, its practical impact will depend on how platforms adapt their systems and how regulatory oversight is applied over time.
© New Straits Times Press (M) Bhd