Govt studying need for safeguards to curb harms of online games, AI chatbots


SINGAPORE – Local authorities are studying the need for safeguards to curb online harms perpetrated by artificial intelligence (AI) chatbots and online games, which make children vulnerable to content such as violence and cyberbullying.

This comes as new risks continue to emerge that make online safety a challenge, said Minister of State for Digital Development and Information Rahayu Mahzam during the debate on her ministry’s budget on March 2.

Online harms have stemmed from the use of AI chatbots to generate sexual content in bulk, said MP for Sengkang GRC He Ting Ru, who cited the recent controversy surrounding Elon Musk’s AI chatbot Grok.

The chatbot, which is accessible via social media platform X, came under fire in January after it acceded to user requests to churn out non-consensual, sexually explicit and violent content, often depicting women and children.

The Infocomm Media Development Authority had said it was in talks with X, and that Grok had stopped producing such content.

Ms Mahzam acknowledged the dangers of AI chatbots, especially in social media services, as children can access them more easily. She said: “We will continue to study whether safeguards for AI chatbots are needed to better protect users from the harms caused by their misuse.”

The government also recognises that online harms exist outside of app stores and social media services, she said.

“Some parents have expressed concerns about harms that online video games bring, including exposure to inappropriate content, cyberbullying and screen addiction,” said Ms Mahzam.

“We recognise these concerns, and are studying whether safeguards on online video games are needed.”

Steam, a platform that serves as the PC gamer’s gateway to install games, is not regulated under Singapore’s soon-to-be-enforced Code of Practice for Online Safety for App Distribution Services. Steam has also been criticised for its lax age verification, which simply asks users to declare a birth date before downloading a mature-rated game.

The Code of Practice for Online Safety for App Distribution Services requires major app stores – such as those operated by Google, Apple, Samsung, Microsoft, and Huawei – to ban underage users from downloading age-inappropriate apps from April 1.

By June, a new Online Safety Commission will also start operations to support victims of online harms such as harassment, intimate image abuse, doxxing and deepfakes. The agency can issue directions to disable users’ access to harmful content, or restrict the perpetrator’s online account.

Across the world, some countries have banned underage users from downloading social media apps. In Australia, under-16s have been banned from downloading TikTok, X, Facebook, Instagram, YouTube and Snapchat.

“Singapore also wants to strengthen protection for our children online, and we want to do it right and take a holistic approach,” said Ms Mahzam.

“As the Ministry of Digital Development and Information continues to study the impact of social media bans, we plan to extend age assurance requirements to designated social media services.”

Consultations with designated social media services – including Facebook, HardwareZone, Instagram, TikTok, X, and YouTube – are ongoing, and more details will be announced later in 2026, said Ms Mahzam.

Ms He also raised concerns about design architecture used by social media platforms – such as infinite scroll, autoplay videos and algorithmic feeds – that maximises engagement and erodes self-regulation in children.

“A child doom-scrolling past bedtime is not making a choice – they are responding to a system designed to make stopping almost impossible,” said Ms He, who cited a report published by the European Commission in February, which found that TikTok’s addictive design is a legal violation.

“Silence from Singapore adds a reputational risk... The question is therefore whether we should allow platforms to deploy attention-capture dark patterns against children, without legal consequences.”

She also called for the ministry to consider putting together a select committee that will examine global efforts to protect children from the harms of social media.

Online safety reports submitted by designated social media services and app stores in 2025 are currently being assessed by IMDA, said Ms Mahzam, adding that these will be published alongside the agency’s overall report once they are ready.
