SINGAPORE — In a bid to protect users from harmful online content, the government is proposing two codes of practice for social media services, including the ability to order these companies to disable access to specific content.
The Ministry of Communications and Information (MCI) said at a press conference on Monday (June 20) that the first proposal is for social media services with high reach or high risk to have system-wide processes in place to improve safety for all users.
This would include establishing community standards and content moderation mechanisms to mitigate users’ exposure to sexual, violent and self-harm content.
The second proposal is that the Infocomm Media Development Authority (IMDA) would be able to direct social media services to disable access to “specified content” for users in Singapore, or to block specified online accounts on these services from interacting with or communicating content to users in Singapore.
This may cover content related to sexual harm, self-harm, public health, public safety, and racial or religious discord or intolerance.
MCI said these proposals will cover only social media services – platforms whose primary purpose is to enable online interaction or linking between users – and will exclude messaging apps.
Industry consultations on the proposals began this month, and public consultations will follow next month.
Communications and Information Minister Josephine Teo first unveiled plans for these new codes of practice against online harm in March during her ministry’s budget debates.
MCI said on Monday that the prevalence of online harm both globally and in Singapore is a major concern despite many online services working to address the issue.
“Such harmful content includes content that endorses acts of terrorism, extreme violence or hateful acts against certain communities, promotes suicide or self-harm, or undermines one’s physical or mental well-being through harassment, intimidation or the non-consensual sharing of sexual images.
“These online harms are exacerbated when amplified on social media services,” MCI said.
For example, platform algorithms driven by user interests can quickly propel content such as dangerous video challenges to virality, which has led to injuries and deaths, while acts of terrorism and their aftermath can also spread through live-streamed videos and content sharing, it added.
MCI also pointed out that racially or religiously offensive content can incite religious intolerance and undermine racial harmony.
For example, it said that last year, a Singaporean man posing as a Chinese woman posted several racist and insensitive public messages on a social media service disparaging Singapore’s minority communities; the case has since been reported to the authorities.
In 2020, the person behind the “NUS Atheist Society” profile published a religiously offensive post that described the Bible and the Quran as alternatives to use in the event of a toilet paper shortage.
Abusive online behavior such as harassment and sexual violence is also widespread, MCI added.
Last year, a poll asking people to rank female asatizah (religious teachers) based on their sexual attractiveness was posted on social media.
“The poll caused immense distress to those involved and was found to have encouraged sexual violence,” MCI said.
A January survey by the Sunlight Alliance for Action, a cross-sector alliance that tackles online harms, found that 61% of Singaporeans polled had experienced harm online on popular social media services.
The proposed codes call for greater accountability from social media platforms, requiring them to produce annual reports that will be published on the IMDA website, and allowing the authority to direct these platforms to disable access to harmful content.
However, MCI said that the consequences of not complying with the codes have not been discussed at this early stage of the consultations.
“MCI takes a collaborative approach to governing the online space against harm,” it added.
“We recognize that the industry has taken active steps in recent years to combat harmful online content on social media, and their contributions will be essential in shaping a safer and more responsible online space for users in Singapore.”
Meta, the parent company of social media platform Facebook, said in response to TODAY’s queries that tackling harmful content is a shared goal of governments and industry.
“We welcome dialogue between young people, parents and caregivers, educators and experts to ensure the safety and well-being of adolescents, while respecting their expectations of privacy and promoting their autonomy,” it added.
YouTube and Twitter declined to comment.
As part of this consultative approach, the authorities will also study laws that other jurisdictions have introduced to regulate online services.
These include Germany’s Network Enforcement Act, which came into force in January 2018, and the UK’s Online Safety Bill, which was introduced in March.
MCI said these laws are general in nature, while the proposed codes are intended to be more targeted.
The codes are expected to come under the Broadcasting Act, which also covers the existing Internet Code of Practice.
The Internet Code of Practice sets standards for Internet content and requires Internet service providers and Internet content providers to deny access to any content banned by authorities.