Privacy, Account Recovery, and Authentication

Suppose you are designing a private messaging app and your 'target market' wants the ability to recover their account, should they lose or upgrade their phone. The obvious ways of doing this (I would be interested if there were some unobvious ways!)1 all require the user to provide three things:

  1. An account identifier
  2. An authorisation token
  3. Some kind of decryption key

(Depending on what is kept on the servers and what level of recovery you want to give your users, item 3 may not be needed; the sketch below makes the three pieces concrete.)
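To make these three pieces concrete, here is a minimal sketch of what any recovery flow has to reassemble. The names are mine for illustration, not any particular app's:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class RecoveryBundle:
    account_id: str                          # 1. an account identifier
    auth_token: str                          # 2. an authorisation token
    decryption_key: Optional[bytes] = None   # 3. only needed if the server holds ciphertext

def can_recover(bundle: RecoveryBundle, server_holds_ciphertext: bool) -> bool:
    """The account comes back only if every required piece is present."""
    if server_holds_ciphertext and bundle.decryption_key is None:
        return False   # the account would return, but the history stays unreadable
    return bool(bundle.account_id and bundle.auth_token)
```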

Now, the traditional way of doing this was to use a phone number or email address as the account identifier and send the authorisation token to that number/email. However, this is just one-factor authentication (something you have), so it is now also common to add an optional password or PIN to upgrade this to two-factor authentication (something you know).
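A sketch of that traditional flow, with the optional PIN as the second factor (illustrative only; no real provider's API is assumed):

```python
import hmac
import secrets

def issue_code() -> str:
    """A six-digit one-time code, sent to the registered number or email."""
    return f"{secrets.randbelow(10**6):06d}"

def verify(sent_code: str, entered_code: str,
           stored_pin: str | None, entered_pin: str | None) -> bool:
    # Factor 1 (something you have): the code delivered to the phone/email.
    ok = hmac.compare_digest(sent_code, entered_code)
    # Factor 2 (something you know): only checked if the user set a PIN.
    if stored_pin is not None:
        ok = ok and entered_pin is not None and hmac.compare_digest(stored_pin, entered_pin)
    return ok
```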

However, the real downside of this approach is that an account identifier which is itself an account with another service increases the attack surface and, depending on that other service, may introduce a significant vulnerability: SIM-swapping attacks on phone numbers are the obvious example.

Suppose instead you use an account identifier which isn't tied to another service, like a Telegram username or Session ID. I'll come back to Telegram in a minute, because they still require a phone number to create the account.

Session's account identifier is a random sequence, and its authorisation token (and decryption key) is a 'Recovery Phrase' stored in the app. This means the user has to keep a record of the Recovery Phrase in order to recover the account should the phone with the registered app become unavailable. The problem with this approach is that we now have only one-factor authentication (something you know). Because Session has no way of communicating with you outside the app (a good thing), it also has no way of sending you a second authorisation token.
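In outline: one phrase yields everything. A rough sketch using a standard key-derivation function (placeholder salt and parameters, not Session's actual scheme):

```python
import hashlib

def keys_from_phrase(recovery_phrase: str) -> tuple[str, bytes]:
    # Stretch the phrase into 64 bytes of key material. The salt and
    # iteration count here are placeholders, not Session's parameters.
    seed = hashlib.pbkdf2_hmac("sha512", recovery_phrase.encode(),
                               b"illustrative-salt", 100_000)
    account_id = hashlib.sha256(seed[:32]).hexdigest()   # public: who you are
    decryption_key = seed[32:]                           # private: unlocks your data
    return account_id, decryption_key
```

Notice that both outputs come from the same phrase: whoever has the phrase has the account, which is exactly why it is a single factor.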

One solution would be to have the second token generated locally by an authenticator app (something you have) linked to the account. But since we are considering account recovery in the situation of a changed or lost device, access to the authenticator app may also be lost. So there would be a dependence on 'back-up codes', making the second factor just another 'something you know', weakening the two-factor authentication.
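For concreteness, here is a sketch of a standard TOTP generator (RFC 6238) next to a set of back-up codes. Both the TOTP secret and the codes are, in the end, just strings the user has to keep a record of somewhere:

```python
import base64, hashlib, hmac, secrets, struct, time

def totp(secret_b32: str, digits: int = 6, interval: int = 30) -> str:
    """RFC 6238 time-based one-time password."""
    key = base64.b32decode(secret_b32)
    counter = struct.pack(">Q", int(time.time()) // interval)
    mac = hmac.new(key, counter, hashlib.sha1).digest()
    offset = mac[-1] & 0x0F
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return f"{code % 10**digits:0{digits}d}"

# The 'second factor' degrades to stored strings the moment you need recovery:
backup_codes = [secrets.token_hex(4) for _ in range(10)]
```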

This, of course, explains why Signal have taken so long to introduce anonymous usernames: they don't want to drop two-factor authentication for account recovery.

Telegram have implemented a sensible solution to this problem: the 'something you have' factor is another device logged into the same Telegram account. Then they can send the authorisation token to that device. (They also allow optional passwords for linking a new device, giving the second 'something you know' factor.) Of course, they know that more users will want account recovery than will be logged in on two devices, so there remains the option for SMS-based authorisation. But at least those who want to can create an account with a burner number and still have a recovery route. Signal could presumably make this option available reasonably easily, giving account recovery to people who signed up with burner numbers.
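A sketch of the idea, with hypothetical helper names (this is not Telegram's actual API):

```python
import secrets

def push_to_device(device: str, code: str) -> None:
    print(f"[push to {device}] login code: {code}")    # stand-in for a real push channel

def send_sms(phone: str, code: str) -> None:
    print(f"[SMS to {phone}] login code: {code}")      # stand-in for an SMS gateway

def deliver_login_code(account: dict) -> str:
    """Prefer devices already logged into the account; fall back to SMS."""
    code = f"{secrets.randbelow(10**5):05d}"
    devices = account.get("logged_in_devices", [])
    if devices:
        for device in devices:       # 'something you have': an existing session
            push_to_device(device, code)
    else:
        send_sms(account["phone"], code)   # works even for a burner number
    return code
```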

The root cause of the problem

It is clear that the real challenge for account recovery in anonymous, private messaging is two-factor authentication. It would be possible for an app to give the Telegram route as the only option, but that would never work in the mass market, because it requires a bit of care and attention to make sure you always have at least two devices logged into the account.

What about this user demand for account recovery - maybe that is the problem? As with so many privacy problems around us, it is worth pausing and thinking how 15 years of surveillance capitalism might have shaped our preferences and whether we can reshape them.

I started using WhatsApp in 2011 when a family gathering for my mother's 80th birthday led to everyone signing up so we could keep in touch cheaply across four countries and three continents. Prior to that, iPhone users had iMessage but often didn't know whether they were using SMS or not (how many iPhone users can tell you the difference between green and blue messages even now?) and everyone else had SMS or nothing. Which meant that for 99% of users, their messages and contacts were not backed up to the cloud and would only be backed up locally if they had gone through an elaborate process. Lose your phone and you lost all that. 'Account recovery' meant asking people for their numbers and accepting that all the messages were gone.

As it became easy for more and more users to back up their accounts and contacts to the cloud, this service was pushed harder and harder. And Big Tech has two very good reasons to do this:

  1. It gave them a huge source of data to drive their business model
  2. It made buying a new device a much smoother experience and thus helped boost sales.

It was just another example of maximising profits by selling a small convenience, and it was so successful that no messaging app without account recovery is seen as viable outside a small niche market of #PrivacyFundamentalists.

This also went alongside another social change: don't file, search. When storage capacities in the cloud were large enough, people stopped saving copies of important things and instead saved everything, sorting out the important wheat from the unimportant chaff through search when it was needed - but keeping the chaff 'just in case'.

The ephemeral has gained a permanence and we are trained to think of that as normal.

The way forward?

We need a cultural change. We need people to be mindful of the information they receive (and store), to think about what is important and worth saving, while deleting what is not. We have been trained to save everything and protect ourselves against loss by having cloud-based services which can be restored. That behaviour is convenient (lazy?) but ultimately it benefits Big Tech and the surveillance mindset, not the user.

There is nothing wrong with storing valuable information - photos, notes, key contacts - in the cloud to protect yourself against device loss or failure. That is very sensible. But it needs to be a process with two stages of selection: (i) what is worth saving; (ii) where to save it.

This sort of mindfulness about the information we receive and share is mandated - in some contexts - by GDPR. But in fact it should be part of everyone's education in how to be a good digital citizen: how to protect your own privacy and that of the people you interact with. Then private messengers designed without account recovery might be able to attract sufficient numbers of users to create network effects at scale. But only then.

As with most problems of digital privacy and digital rights, there is no purely technical solution. The source is social and often a product of exploitation. The solution is always better education in digital citizenship.


  1. Since writing this post, I have come across SimpleX, which implements an alternative: entirely local and offline back-up and recovery. This makes sense because SimpleX has no persistent user IDs - they are generated on a per-chat basis and it is left to users to identify each other in whatever ways they choose, within the chat or outside it. Like IRL! ↩

