Summary

WhatsApp violates its users’ privacy, and its use encourages others to install potentially malicious software.

Why Platform Choice Matters

Communication platforms typically benefit from the network effect – the more people on a given platform, the more pressure there is to use it. Most people don’t use WhatsApp because they shopped around and chose a messaging platform with features that they liked. Instead, they were invited to events that use it, and they know that those they invite to a call are likely to already use the platform. The value of WhatsApp derives from the friends, family, and co-workers who use it.

The use of insecure platforms like WhatsApp for official, semi-official, or social functions strongly encourages more people to install them – and the presence of those users encourages further adoption. The network effect leads many people to use these platforms without being informed of the risks – risks that it may be unreasonable to expect non-experts to fully understand. Some have the immense privilege of not needing to fear their country’s government, but that is not the case for many. If the only people who use secure communications are activists and others threatened by tyrannical regimes, the mere use of those technologies marks them for additional scrutiny.

Security of WhatsApp

In May 2019, Canadian researchers reported that unknown actors had been systematically exploiting a remote vulnerability in WhatsApp to install spyware on the phones of activists, human rights lawyers, and journalists. The spyware was the same as that used to spy on Omar Abdulaziz, a Saudi dissident and associate of Jamal Khashoggi – an infection that was reported just one day before Khashoggi was murdered.

In January 2020, the Guardian reported that Mohammed bin Salman compromised Jeff Bezos’s phone by sending him a video file via WhatsApp (yeah, apparently they text). The story included a message asking readers to reach out if they had additional information – and provided a number to message on WhatsApp. It took Facebook nine months to indicate that they had fixed the vulnerability. This isn’t new – technical experts have been discovering security flaws in WhatsApp for essentially its entire existence.

In May 2011, a Dutch hacker demonstrated a technique for stealing anyone’s WhatsApp account due to a flaw in the text-message verification procedure used to register accounts. In December 2014, two Indian teenagers demonstrated a denial-of-service attack that would crash WhatsApp with only a 2 KB message. In August 2015, Israeli researchers discovered that they could run arbitrary code by sending a contact file via WhatsApp. In July 2017, German researchers showed that a WhatsApp server could inject new users into supposedly “end-to-end” encrypted chats, compromising the security of future messages.

But isn’t WhatsApp encrypted?

WhatsApp was never going to be secure

Motivations

As a user, there are four primary features you probably want in a messaging platform (the first two are illustrated in the sketch after this list):

  • Confidentiality: only intended recipients should be able to see message contents
  • Integrity: your messages cannot be changed by a third party; messages you receive are genuine
  • Availability: messages are delivered quickly and reliably
  • Safety: the application is not a means for someone to otherwise compromise your computer or phone
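
Confidentiality and integrity have precise technical meanings, so here is a toy sketch of what they buy you, using authenticated encryption via the third-party Python cryptography package. This is only an illustration of the two properties; it is not how WhatsApp or Signal actually manage keys or exchange messages.

    # Toy illustration of confidentiality and integrity via authenticated
    # encryption. Real messengers layer key agreement (e.g. the Signal
    # protocol) on top of primitives like this one.
    # Requires the third-party 'cryptography' package.
    import os
    from cryptography.exceptions import InvalidTag
    from cryptography.hazmat.primitives.ciphers.aead import ChaCha20Poly1305

    key = ChaCha20Poly1305.generate_key()   # secret shared by sender and recipient
    aead = ChaCha20Poly1305(key)
    nonce = os.urandom(12)                  # must never be reused with the same key

    # Confidentiality: without the key, the ciphertext is unreadable.
    ciphertext = aead.encrypt(nonce, b"meet at 6pm", None)

    # Integrity: flipping even a single bit makes decryption fail outright.
    tampered = ciphertext[:-1] + bytes([ciphertext[-1] ^ 0x01])
    try:
        aead.decrypt(nonce, tampered, None)
    except InvalidTag:
        print("tampering detected; message rejected")

    # The genuine ciphertext still decrypts to the original message.
    assert aead.decrypt(nonce, ciphertext, None) == b"meet at 6pm"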

By contrast, Facebook wants to make money – there is little incentive for them to focus on things that don’t drive revenue, like security and privacy. In fact, Facebook has disregarded the interests of its users so many times that I’ll link a meta-article listing 21 scandals in 2018 alone.

WhatsApp is closed-source

Software is written in ‘source code’, which is readable and writable by humans, but it is distributed in the form of ‘machine code’, which is only understandable by processors. If you only have the machine code, it is difficult to determine what the software actually does – you are reliant on the developer to tell you.

‘Open-source’ refers to software whose source code – not just machine code – is available, so that you (or a security researcher) can compile it to machine code yourself and verify that the software does what it claims, and only what it claims. WhatsApp is not open-source.
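
To make that gap concrete, here is a small, hypothetical Python example. Python compiles to bytecode rather than native machine code, but the effect is the same: the source is obvious at a glance, while the compiled form takes real effort to decipher, and a closed-source application hands you nothing but the latter.

    # The function is trivial to understand as source code.
    import dis

    def is_premium_user(account):
        return account.get("plan") == "premium"

    # Its compiled form is a list of low-level opcodes (LOAD_FAST, LOAD_CONST,
    # COMPARE_OP, ...; exact names vary by interpreter version) with none of
    # the context that made the source readable.
    dis.dis(is_premium_user)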

What to Use Instead

Telegram is an open-source chat app developed by a non-profit and funded by Pavel Durov, the billionaire founder of Russia’s largest social network, who is in self-imposed exile after refusing to censor posts about Ukraine. Telegram focuses on ease of use and syncs your messages to the cloud by default. It supports a web app, groups of up to 200,000 members, and free, unlimited file storage. Telegram has been widely used by activists in Hong Kong in hopes of avoiding surveillance and censorship by the Chinese Communist Party. Consequently, it has withstood numerous cyberattacks originating from mainland China, usually corresponding to major protests.

Signal is an open-source chat app developed by a non-profit and funded by a variety of charities as well as the United States government. Its code has been extensively audited by third parties, and its protocol is essentially the undisputed gold standard for cryptographic security. In fact, the EFF notes that the best thing about WhatsApp is that, despite its lack of transparency and accountability, it adopted Signal’s communication protocol.

Jami?