New rules to make social media firms accountable for online harms
SINGAPORE – Laws to tackle online harms will take effect in 2023, after Parliament passed the Online Safety (Miscellaneous Amendments) Bill on Wednesday with unanimous support from MPs.

The Bill amends the Broadcasting Act to make social media platforms liable if they fail to protect local users from online harms, placing the Republic among the front runners in regulating a space that has so far been self-supervised.

For one thing, the Bill will empower the Infocomm Media Development Authority (IMDA) to issue orders to social media platforms, including Facebook, Instagram, YouTube and TikTok, to take down egregious content.

This includes posts advocating suicide, self-harm, child sexual exploitation and terrorism, as well as materials that may incite racial or religious tensions or pose a risk to public health.

Failure to comply may attract a fine of up to $1 million, or a direction to have their social media services blocked in Singapore.

Internet service providers such as Singtel, StarHub and M1 may also face fines of up to $500,000 for failing to block the services in question.

An accompanying draft Code of Practice for Online Safety, to be imposed on regulated social media platforms, spells out the safeguards needed to prevent users, especially children under 18, from accessing harmful content.

These include tools that allow children or their parents to manage their safety on these services, and mechanisms for users to report harmful content and unwanted interactions.

The code is expected to be rolled out as early as 2023, after a final round of consultation with social media firms.

During the debate on the Bill on Wednesday, which continued from Tuesday, several MPs called for more to be done to protect children and help victims of online harm.

Workers’ Party MP Gerald Giam (Aljunied GRC), Mr Melvin Yong (Radin Mas) and Mr Desmond Choo (Tampines GRC) called for age verification for all new sign-ups. “Age verification is needed for parental controls to work,” said Mr Giam.

In her concluding speech, Communications and Information Minister Josephine Teo told Parliament that there is currently no international consensus on standards for effective and reliable age verification by social media services that Singapore can reference.

“Instead, we will continue to closely monitor and extensively consult on the latest developments in age verification technology, taking into account data protection safeguards, and consider viable regulatory options,” she said.

Ms Tin Pei Ling (MacPherson) and Ms Nadia Ahmad Samdin (Ang Mo Kio GRC) asked why private messaging services such as WhatsApp and Facebook Messenger are not covered by the Bill, even though image-based sexual abuse largely takes place through private channels.

Responding, Mrs Teo said: “The short answer is that there are legitimate privacy concerns.”

But users are not without recourse, she said. “If individuals encounter harmful messages or unwanted interactions in private messages when using these social media services, they could block the sender or report the sender to the service.”

Mrs Teo clarified that groups with very large memberships, which could be used to propagate egregious content, will be caught under the new online safety law.
