Ministers have conceded that it is not “technically feasible” to scan encrypted messages for child sexual abuse material without undermining privacy, in a de-escalation of a row with technology companies over the Online Safety Bill.
Messaging apps including WhatsApp, owned by Facebook parent company Meta, and Signal had threatened to leave the UK if asked to weaken encryption standards for the bill.
A controversial clause in the bill, which is going through parliament, would give Ofcom the power to order messaging services to scan for and take down child sexual abuse material.
However, experts and privacy advocates said this would not be technically possible without undermining end-to-end encryption, under which only the sender and intended recipient can read the contents of a message.
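The property at stake can be seen in a minimal sketch using the PyNaCl library; the keys and message below are hypothetical, and apps such as WhatsApp and Signal use the more elaborate Signal Protocol, but the core principle, that an intermediary relaying the ciphertext cannot read it, is the same.

```python
# Illustrative sketch of end-to-end encryption with PyNaCl
# (hypothetical example, not any messaging app's actual implementation).
from nacl.public import PrivateKey, Box

# Each party generates a key pair; private keys never leave their device.
sender_key = PrivateKey.generate()
receiver_key = PrivateKey.generate()

# The sender encrypts using their private key and the recipient's public key.
ciphertext = Box(sender_key, receiver_key.public_key).encrypt(b"hello")

# Only the recipient's private key can decrypt. A server relaying the
# ciphertext sees only opaque bytes, which is why scanning message
# content in transit is incompatible with this design.
plaintext = Box(receiver_key, sender_key.public_key).decrypt(ciphertext)
assert plaintext == b"hello"
```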
The minister for arts and heritage, Stephen Parkinson, told the House of Lords during the bill’s final reading that Ofcom would be able to intervene only if scanning content was “technically feasible”.
He said: “A notice can only be issued where technically feasible and where technology has been accredited as meeting minimum standards of accuracy in detecting only child sexual abuse and exploitation content.”
The development, which was first reported by the Financial Times, is being hailed as a win by tech companies and privacy campaigners.
Meredith Whittaker, president of Signal, wrote on X, formerly Twitter, that it was a “victory”.
Whittaker added she was “grateful to the UK government for making their stand clear. This is a really important moment, even if it’s not the final win.”
The government said it has not changed its stance.
A government spokesperson said: “Our position on this matter has not changed and it is wrong to suggest otherwise. Our stance on tackling child sexual abuse online remains firm, and we have always been clear that the Bill takes a measured, evidence-based approach to doing so.
“As has always been the case, as a last resort, on a case-by-case basis and only when stringent privacy safeguards have been met, it will enable Ofcom to direct companies to either use, or make best efforts to develop or source, technology to identify and remove illegal child sexual abuse content – which we know can be developed.”
The spokesperson added that the government met tech companies including TikTok, Meta and Microsoft on Monday to discuss ways to tackle the “threats posed by sexual offenders exploiting our children”.
Richard Collard, the head of child safety online policy at the NSPCC, said: “This statement reinforces how the online safety bill sets out a balanced settlement that should encourage companies to mitigate the risks of child sexual abuse when designing and rolling out features like end-to-end encryption. It does not change the requirements in the legislation.”