Draft Codes of Practice for Enforcing Online Safety Outlined by Ofcom in the UK

The UK’s communications regulator, Ofcom, has set out draft codes of practice and guidance for social media, gaming, pornography, search and sharing sites in the wake of the Online Safety Act coming into force last month.

The act aims to keep websites and other types of internet-based services free of illegal and harmful material while defending freedom of expression. It applies to search engines; internet services that host user-generated content, such as social media platforms; online forums; some online games; and sites that publish or display pornographic content.

In an online post outlining the draft codes of practice, Ofcom said that companies will be required to assess the risk of users being harmed by illegal content on their platform, and take appropriate steps to minimize these potential harms.

A core list of measures that services can adopt to mitigate the risk of all types of illegal harm includes:

Having a named person accountable to their most senior governance body for compliance with content regulations.
Making sure content and search moderation teams are well resourced and trained, and that performance targets are set and progress toward them is monitored.
Ensuring users can easily report potentially harmful content, make complaints, block other users and disable comments.
Running safety tests for recommender algorithms (a simplified example of such a test is sketched below).
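
Ofcom’s draft does not prescribe how these safety tests should work. One simple form is an offline audit that measures how often a recommender surfaces content the platform has already flagged as harmful. The sketch below illustrates that idea only; the recommend function, user list and flagged-item set are hypothetical stand-ins for a platform’s own systems.

```python
# Hypothetical offline safety test for a recommender system: measure how
# often items already flagged as harmful appear in users' recommendations.
from typing import Callable, Iterable

def flagged_exposure_rate(
    recommend: Callable[[str, int], list[str]],  # (user_id, k) -> item ids
    users: Iterable[str],
    flagged_items: set[str],
    k: int = 20,
) -> float:
    """Return the fraction of recommended slots occupied by flagged items."""
    total_slots = 0
    flagged_slots = 0
    for user in users:
        for item in recommend(user, k):
            total_slots += 1
            if item in flagged_items:
                flagged_slots += 1
    return flagged_slots / total_slots if total_slots else 0.0

# Example: a release gate might require this rate to stay below a threshold.
def recommend(user_id: str, k: int) -> list[str]:
    return ["item1", "item2", "item3"][:k]  # placeholder recommender

rate = flagged_exposure_rate(recommend, ["u1", "u2"], {"item9"}, k=3)
print(f"flagged exposure: {rate:.1%}")  # 0.0% in this toy example
```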

When it comes to specific harms, such as protecting children online, larger and higher-risk services — defined by Ofcom as user-to-user services such as social media platforms — should keep children’s accounts out of suggested friends and connections lists and hide their location information. Accounts outside a child’s connection list must not be able to send them direct messages.
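
In implementation terms, the direct-message restriction amounts to a permission check at send time. The following is a minimal sketch, assuming hypothetical is_child and connection-list lookups rather than any real platform API.

```python
# Hypothetical send-time check implementing the proposed rule that accounts
# outside a child's connection list must not be able to message them.
def can_send_direct_message(
    sender_id: str,
    recipient_id: str,
    is_child: dict[str, bool],          # account id -> is the user a child?
    connections: dict[str, set[str]],   # account id -> accepted connections
) -> bool:
    if is_child.get(recipient_id, False):
        # Children only receive DMs from accounts already in their list.
        return sender_id in connections.get(recipient_id, set())
    return True  # no extra restriction for adult recipients

# Example with placeholder accounts
is_child = {"alice": True, "bob": False}
connections = {"alice": {"carol"}}
print(can_send_direct_message("carol", "alice", is_child, connections))  # True
print(can_send_direct_message("dave", "alice", is_child, connections))   # False
```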

Ofcom is also proposing that larger and higher-risk services use “hash matching” technology to identify child sexual abuse material (CSAM) and match it to a database of illegal images. Platforms should also use automated tools to detect URLs that have been identified as hosting CSAM.
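
Deployed systems typically rely on perceptual hashes (Microsoft’s PhotoDNA and Meta’s PDQ are well-known examples) so that resized or re-encoded copies of an image still match. The sketch below shows only the basic shape of the lookup, using an exact cryptographic hash and a hypothetical hash database for illustration.

```python
# Simplified illustration of hash matching: compare an uploaded file's hash
# against a database of hashes of known illegal images. Real deployments use
# perceptual hashing so near-duplicates still match; SHA-256 here only shows
# the overall flow.
import hashlib

KNOWN_HASHES: set[str] = set()  # hypothetical stand-in for a hash database

def image_hash(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

def matches_known_material(uploaded: bytes) -> bool:
    return image_hash(uploaded) in KNOWN_HASHES

# Example: register one known hash, then check two uploads.
KNOWN_HASHES.add(image_hash(b"known-bad-image-bytes"))
print(matches_known_material(b"known-bad-image-bytes"))  # True
print(matches_known_material(b"harmless-image-bytes"))   # False
```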

All large general search services should provide crisis prevention information in response to searches about suicide, including queries seeking information on suicide methods.
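
One straightforward way to surface such information is to match incoming queries against a curated term list before results are rendered. A minimal sketch follows; the term list and banner text are illustrative placeholders, not a vetted clinical resource.

```python
# Hypothetical query check: show crisis prevention information alongside
# results for suicide-related searches. Terms and message are placeholders.
CRISIS_TERMS = {"suicide", "kill myself", "end my life"}  # illustrative only

CRISIS_BANNER = (
    "If you are struggling, help is available. "
    "In the UK you can call Samaritans on 116 123."
)

def banner_for_query(query: str) -> str | None:
    q = query.lower()
    if any(term in q for term in CRISIS_TERMS):
        return CRISIS_BANNER
    return None

print(banner_for_query("how to tie a knot"))  # None
print(banner_for_query("suicide methods"))    # crisis banner text
```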

To tackle fraud and terrorism offenses online, large higher-risk services will be required to deploy keyword detection to find and remove posts linked to the sale of stolen credentials, such as credit card details, and to block accounts run by proscribed terrorist organisations.
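
Keyword detection of this kind is usually a first-pass filter that routes matching posts to human review rather than removing them automatically. A minimal sketch with hypothetical patterns; a real system would combine far richer signals.

```python
# Hypothetical first-pass keyword filter flagging posts that may advertise
# stolen credentials. Patterns are illustrative; matches would normally go
# to human moderators for review.
import re

PATTERNS = [
    re.compile(r"\bfullz\b", re.IGNORECASE),  # slang for full card records
    re.compile(r"\bcvv\b.*\bfor sale\b", re.IGNORECASE),
    re.compile(r"\bcarding\b", re.IGNORECASE),
]

def should_review(post: str) -> bool:
    return any(p.search(post) for p in PATTERNS)

print(should_review("Fresh CVV and fullz for sale"))  # True
print(should_review("My favourite card game"))        # False
```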

Ofcom is now consulting with industry and other experts before publishing the final version of its codes of practice in autumn 2024. Services will then have three months to conduct their risk assessment, while Ofcom’s final Codes of Practice will be subject to Parliamentary approval.

“Ofcom is not a censor. We won’t have powers to take content down,” said Ofcom’s CEO, Dame Melanie Dawes, in a statement published alongside the…

Article from www.computerworld.com (2023-11-15)
