Whatever happened to the Online Safety Bill vs WhatsApp controversy?

The few people likely to read this blog will probably remember the big debates about whether the Online Safety Bill (or Act, as it is now) would break end-to-end encryption (e2ee).

This debate reached the floor of the House of Commons on 5 December 2022 with Amendment 153, which was specifically designed to remove the requirement for regulated services to have ‘the ability to monitor encrypted communications’. The Amendment was rejected.

A few months later, the debate hit the press through the not inconsiderable influence of Meta's press office, which made a story of Will Cathcart's visit to the UK and his threat to remove WhatsApp from the UK if the Bill required them to break encryption. Obviously, no politician could be seen to give in to such a threat, but Cathcart met with politicians and appeared to receive assurances.

We can now see exactly what those assurances might have been. Schedule 4, 12(4) says this about Ofcom's Code of Practice on the measures services may be required to take to enforce their Terms of Service:1

A proactive technology measure may relate to the use of a kind of technology on or in relation to any Part 3 service or any part of such a service, but if the technology operates (or may operate) by analysing user-generated content or metadata relating to such content, the measure may not recommend the use of the technology to analyse user-generated content communicated privately, or metadata relating to user-generated content communicated privately.

So what counts as content communicated privately? Section 232 leaves that to Ofcom to determine but specifies some factors which they must take into account. The second of these (232(2)(b)) is:

any restrictions on who may access the content by means of the service (for example, a requirement for approval or permission from a user, or the provider, of the service);

Now when we turn to Ofcom's discussion of what constitutes communicating privately in Annex 9, we find that something very interesting has happened to this:

Unless the provider has evidence to the contrary, we would generally expect the following to constitute access restrictions:
• A requirement for an individual to enter credentials (for example, a password or biometrics) before being able to access the content.
• A requirement for an individual to receive an invite or obtain permission from another user before being able to access the content.
• A requirement for users to have access to a decryption key in order to access the content, where that key is only available to specific individuals.

So Ofcom's interpretation of 'approval or permission' includes a reference to encryption, such that any e2ee service would thereby be implementing an access restriction. That is a good interpretation of the very general idea of an access restriction, but it is notable that it is made explicit here in a way which certainly is not in the Act itself. That is presumably the reassurance Will Cathcart was given.2

We will probably never know how instrumental the threat to remove WhatsApp from the UK was in Ofcom's decision to make the e2ee exemption explicit, but we do know that at some point between 5 Dec 2022 and 9 Nov 2023 (when Annex 9 was published), a decision was made which gives WhatsApp no reason to carry out their threat.

Of course, the 'accredited technologies' for detecting terrorist and CSEA content in e2ee communications are yet to be specified. In the debate on 5 Dec 2022, David Davis MP described such technology as 'magic', but we can be pretty sure that what the government had in mind was client-side scanning - checking messages for prohibited content before they are encrypted and sent - which already appears to be implemented in WeChat.3 In the minds of many people, including those who use WeChat, this does amount to mass surveillance in ways which seem to conflict with Article 8, but until any such technology is 'accredited', we won't know.
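To make the client-side scanning idea concrete, here is a minimal sketch in Python - purely my own illustration, assuming a simple hash-based blocklist, and not anything specified in the Act or known about WhatsApp's or WeChat's actual systems. The point is only that the check happens on the device, against the plaintext, before any encryption takes place:

    import hashlib

    # Hypothetical blocklist of SHA-256 hashes of known prohibited content,
    # assumed to be distributed to the client in advance (illustrative only).
    BLOCKLIST = {
        hashlib.sha256(b"example prohibited content").hexdigest(),
    }

    def encrypt(plaintext: bytes, key: bytes) -> bytes:
        """Stand-in for real end-to-end encryption (a toy XOR, not secure)."""
        return bytes(b ^ key[i % len(key)] for i, b in enumerate(plaintext))

    def send_message(plaintext: bytes, key: bytes) -> bytes | None:
        """Client-side scanning: compare the plaintext's hash against the
        blocklist *before* encrypting, so the check runs on the device and
        the server never sees the unencrypted content."""
        if hashlib.sha256(plaintext).hexdigest() in BLOCKLIST:
            # A real deployment might block, flag or report the message here.
            return None
        return encrypt(plaintext, key)

    if __name__ == "__main__":
        ciphertext = send_message(b"hello", b"secret key")
        print("sent" if ciphertext is not None else "blocked")

Real proposals would use perceptual hashing or on-device classifiers rather than exact hashes, but the structure - inspect the plaintext, then encrypt - is the same.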


  1. As I understand it, these 'proactive technologies' are to be implemented very broadly, in contrast to the 'accredited technologies' which only look for terrorist and CSEA content and are only required if they can be effective in doing that without infringing on Article 8 of the ECHR. But I am no lawyer!

  2. It is worth noting that while end-to-end encryption is here evidence favouring an act of communication being private in the sense of the OSA, ease of sharing/forwarding is evidence of its being public. Presumably WhatsApp believe that the changes to the handling of forwarded content which they made in response to pressure from the Indian government adequately deal with that.

  3. If you already have WeChat installed, try uploading an image of Winnie the Pooh to Moments. Every time I try, no upload happens and the app restarts. You can also compare the size of the app once installed (on my phone, 2.14 GB) with the file size on the app store (413 MB). That is more than just a compression effect, and the installation process seems to involve downloading additional data.

