
UK Regulator Sets July Deadline For Tech Firms To Enforce Child Online Safety Rules
The UK’s communications regulator, Ofcom, on Thursday announced new rules under the Online Safety Act requiring tech companies to protect children from harmful online content, effective July 25.
The rules will apply to websites and apps commonly used by children, such as social media and gaming platforms. Firms must block children’s access to material involving suicide, self-harm, eating disorders, pornography, hate speech, online abuse, and harmful challenges. Required measures include age checks, algorithm changes, blocking features, and content controls, and providers must complete risk assessments by July 24. Non-compliance may result in fines of up to 18 million pounds or 10% of global revenue and, in severe cases, removal from the UK market. Ofcom will issue additional codes of practice covering other areas of the Online Safety Act later this year.
The new measures are part of broader efforts to implement the UK’s online safety legislation in stages.