Technology companies will have just 48 hours to remove non-consensual intimate images (including deepfake nudes) or risk being fined millions and potentially blocked in the UK, Prime Minister Keir Starmer has said.
Describing online misogyny as a "national emergency," Starmer announced amendments to the Crime and Policing Bill that would force platforms to act swiftly once abusive content is flagged. Firms that fail to comply could face fines of up to 10% of global revenue, or have their services restricted in the UK, with enforcement led by Ofcom.
The measures would also apply to AI-generated content, including images created through tools such as Grok, developed by xAI, which earlier this year faced backlash for generating sexualised images of women. Ministers had warned that stronger action could follow if safeguards were not introduced.
Victims will be able to report abusive material either directly to platforms or to Ofcom, triggering alerts across multiple sites to prevent repeated uploads. The regulator is also expected to explore digital watermarking and hash-matching systems to automatically detect and block reposted content.
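The hash-matching approach mentioned above can be illustrated with a minimal sketch: a shared blocklist of file hashes that platforms consult on upload. Note this is a hypothetical simplification for illustration only; the class and function names are invented, and production systems typically use perceptual hashes (which survive resizing and re-encoding) rather than the exact cryptographic hashing shown here.

```python
import hashlib

# Illustrative only: real hash-sharing schemes use perceptual hashing so that
# edited copies still match; SHA-256 only catches byte-identical re-uploads.

def file_hash(data: bytes) -> str:
    """Return the SHA-256 hex digest of the file's bytes."""
    return hashlib.sha256(data).hexdigest()

class HashBlocklist:
    """A shared registry of hashes of reported abusive content."""

    def __init__(self) -> None:
        self._known: set[str] = set()

    def report(self, data: bytes) -> None:
        """A victim or regulator reports content: record its hash."""
        self._known.add(file_hash(data))

    def is_blocked(self, data: bytes) -> bool:
        """Check an incoming upload against the blocklist."""
        return file_hash(data) in self._known

blocklist = HashBlocklist()
blocklist.report(b"reported-image-bytes")
print(blocklist.is_blocked(b"reported-image-bytes"))   # exact re-upload caught
print(blocklist.is_blocked(b"different-image-bytes"))  # new content passes
```

In a multi-platform scheme of the kind the article describes, a single report would propagate the hash to every participating site, so a repeat upload is blocked everywhere without the victim reporting it again.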
The changes would classify the creation or sharing of non-consensual intimate imagery as a "priority offence" under the Online Safety Act, placing it alongside offences such as child sexual abuse material and terrorism-related content.
Starmer said the responsibility for tackling abuse must shift away from victims and on to perpetrators and the companies hosting harmful material. "Too often, victims are forced to chase content across platforms while it continues to spread," he wrote. "That is not justice, it is failure."