Sunday, June 8, 2025

Tech firms must start protecting UK users from illegal content


Tech Platforms Must Implement Better Moderation, Easier Reporting, and Built-in Safety Tests

Enforcement of Online Safety Regime Ramps Up

Tech companies must start putting measures in place from Monday to protect users in Britain from child sexual abuse images and other illegal content, as enforcement of the country’s online safety regime ramps up. Media regulator Ofcom has set out the measures that companies such as Meta’s Facebook, ByteDance’s TikTok, and Alphabet’s YouTube must implement to tackle criminal activity and make their platforms safer by design.

Better Moderation, Easier Reporting, and Built-in Safety Tests

“Platforms must now act quickly to come into compliance with their legal duties, and our codes are designed to help them do that,” said Ofcom’s enforcement director, Suzanne Cater. The Online Safety Act, which became law in 2023, sets tougher standards for platforms, with an emphasis on child protection and the removal of illegal content.

Assessing Risks and Implementing Measures

In December, Ofcom published its first codes of practice for the new law and set companies a deadline of March 16 to assess the risks that illegal content posed to users on their platforms. The regulator can issue fines of up to £18 million (US$23.31 million) or 10% of a company’s annual global turnover if it fails to comply with the law.

File-Sharing and File-Storage Services Under Scrutiny

Ofcom said file-sharing and file-storage services were particularly vulnerable to being used for sharing child sexual abuse material. It launched a separate enforcement programme on Monday to assess the safety measures these services had in place to prevent the spread of such content. The regulator asked a number of firms offering file-storage services to share their risk assessments by March 31. Should they fail to comply, they could face the same penalties.

Conclusion

Tech platforms must take immediate action to protect users from child sexual abuse images and other illegal content. The Online Safety Act sets out clear standards for platforms, and it is essential that companies comply with these rules to prevent the spread of harmful content. Ofcom will be monitoring the situation closely, and companies that fail to comply with the law will face severe penalties.

Frequently Asked Questions

  • What is the Online Safety Act?
    The Online Safety Act is a new law that sets out tougher standards for online platforms, with an emphasis on child protection and the removal of illegal content.
  • What are the penalties for non-compliance?
    Companies that fail to comply with the law can face fines of up to £18 million (US$23.31 million) or 10% of their annual global turnover.
  • What is the deadline for companies to implement measures?
Companies had until March 16 to assess the risks that illegal content posed to users on their platforms, and certain file-storage firms have until March 31 to share their risk assessments with Ofcom.