
Singapore’s Ministry of Communications and Information (MCI) has launched a public consultation on two proposed codes of practice: a code of practice for online safety and a content code for social media services. Both will impose requirements on social media services to ensure the online safety of users in Singapore. The consultation will close on 10 August 2022.

When issued, the codes will be legally binding, similar to other codes issued by the MCI and administered by the Infocomm Media Development Authority (IMDA) pursuant to the Info-communications Media Development Authority Act 2016.

Authors: Bryan Tan, Goh Eng Han (Resource Law LLC)


Background

This development is timely: a significant amount of public discourse now takes place on social media platforms in addition to conventional channels, and Taiwan and India are likewise taking steps to formulate regulation governing online and social media harms. However, unlike previous IMDA codes of practice covering cybersecurity, competition and fibre network interconnectivity, these proposed codes affect end users more directly. The public consultation has not yet explicitly defined ‘social media services’, but one would expect the codes to apply to the major social media platforms accessible to users in Singapore today.

The proposed codes continue the trend of legislation setting industry-wide standards, such as the Personal Data Protection Act 2012 (PDPA) and the Protection from Online Falsehoods and Manipulation Act 2019 (POFMA). While the PDPA covers leaks of user data and POFMA covers false statements of fact, other scenarios also have the potential to cause significant online harm, including insensitive and provocative racist comments and the sharing of dangerous viral video challenges. Aside from closing the gaps left by existing legislation in such areas, the proposed codes will also ensure the availability of local support resources better suited to users in Singapore.

Code of practice for online safety

The first proposed measure is a code of practice for online safety. The code of practice will introduce community standards for the following categories of content:

  1. Sexual
  2. Violent
  3. Self-harm
  4. Cyberbullying
  5. Endangering public health
  6. Facilitating vice and organised crime

The code of practice will also require social media services to provide relevant safety information to users who search for high-risk content, such as content relating to self-harm.

Three focus areas the MCI has identified for the community standards are:

  1. Child safety
  2. User reporting
  3. Platform accountability

Child safety is particularly important because, as social media services become more ubiquitous, a notable proportion of users are young. In addition to adhering to the community standards for online content, social media services will be required under the proposed code of practice to provide tools that limit users’ exposure to specified unwanted content. The MCI suggests tools such as hiding unwanted comments and limiting contact and interaction with other users.

For young users specifically, child safety is addressed through further safeguards, including a stricter set of community standards and additional tools to limit exposure. The MCI suggests enhanced tools such as limiting the visibility of young users’ accounts, limiting contact and/or interaction with them, and managing the content they see and/or experience. These safeguards may either be activated automatically or offered on an opt-in basis. It remains to be seen whether social media services will be required to automatically scan encrypted content for infringing material, as proposed by draft legislation in some European jurisdictions.

User reporting is becoming an increasingly important moderation tool given the fast pace and high volume of social media content created today. While social media services’ moderation teams and algorithms are a first line of defence against harmful online content, users may still come across harmful content that slips through the cracks. Creating a system for users to report such content, and for social media services to act on those reports, will reduce the amount of harmful content that evades initial detection and reaches users.

With great power comes great responsibility: as social media services deploy more moderation tools, their practices should also be disclosed to users. Although not every user takes the time to read the annual reports that social media services already release (separately from the code’s proposed requirements), collating the reports from the various services in a standard format on an IMDA page, for instance, would increase accountability. This would enable users to contribute constructive feedback and allow the MCI to gain a more accurate understanding of the moderation practices adopted by social media services over time.