
The Online Safety Act is now in force, giving platforms new duties to put protections in place for children and take down illegal content. It is the biggest step forward in online protection since the birth of the internet.
As of 24 July 2025, platforms have a legal duty to protect children from online harms. This includes:
- Verifying users' ages so that those under 18 cannot access harmful content
- Changing algorithms to filter out harmful content
- Quickly removing harmful content, such as self-harm material, and supporting children who have been exposed to it
- Committing to the removal of illegal content, such as material selling drugs or promoting terrorist groups
- Naming an accountable person responsible for children's safety online
The Act will enforce the detection and removal of horrific sexual abuse material using hash-matching, stop malicious adults from messaging children to prevent grooming, keep children's profiles and locations hidden, and prevent children from viewing harmful content such as pornography or material promoting suicide, eating disorders or self-harm.
Peter Kyle, Secretary of State for Science, Innovation and Technology, has given various interviews on the matter, including with BBC Breakfast, Sky News and Mumsnet.
You can read more about the Online Safety Act here.