Security specialist Surfshark says that its study of popular messaging apps reveals that 90% of them use end-to-end encryption. But while some messaging apps are strongly committed to user privacy, their efforts may not be sustainable in the long run.

Surfshark says the situation could change soon, as the EU is currently proposing the Child Sexual Abuse Regulation – known as Chat Control – which would require providers to scan private communications or implement lawful access to encrypted services.

“Having end-to-end encryption for communication and other digital services is just essential hygiene,” says Vytautas Kaziukonis, CEO of Surfshark. “Without it, all other efforts by apps to protect user privacy and security become largely meaningless. Proposals to introduce message scanning would inevitably create vulnerabilities that malicious actors could exploit. There is no such thing as partial encryption: either it is intact, or it is broken. Therefore, weakening encryption risks undermining trust in Europe’s digital infrastructure and setting a dangerous global precedent.”

Of the 10 most popular messaging apps, nine offer end-to-end encryption: Telegram, QQ, WhatsApp, WeChat, Messenger by Meta Platforms, Rakuten Viber Messenger, LINE, Signal, and Apple Messages. Discord is the only app among those studied that does not provide end-to-end encryption for text-based messages.
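The principle these nine apps share can be sketched in a few lines: the two endpoints derive a shared key between themselves, so the server that relays the ciphertext never holds anything that can decrypt it. The sketch below uses a toy Diffie-Hellman exchange and a toy hash-based stream cipher purely for illustration – real apps rely on vetted protocols such as the Signal protocol, and none of the parameters here are production-grade.

```python
# Toy illustration of the end-to-end principle (NOT real cryptography).
# Assumptions: a toy 127-bit Mersenne prime modulus and a hash-based
# stream cipher stand in for the standardized groups and AEAD ciphers
# real messengers use.
import hashlib
import secrets

P = 2**127 - 1  # toy prime modulus; real protocols use 2048-bit+ groups
G = 3


def keypair():
    """Generate a Diffie-Hellman private/public pair."""
    private = secrets.randbelow(P - 2) + 2
    public = pow(G, private, P)
    return private, public


def shared_key(my_private, their_public):
    """Both endpoints compute the same value: (g^a)^b == (g^b)^a mod P."""
    secret = pow(their_public, my_private, P)
    return hashlib.sha256(str(secret).encode()).digest()


def xor_stream(key, data):
    """Toy stream cipher: XOR data with a hash-derived keystream.
    Encryption and decryption are the same operation."""
    stream = b""
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))


alice_priv, alice_pub = keypair()
bob_priv, bob_pub = keypair()

# Only the public values cross the wire; the relay server never sees a key.
k_alice = shared_key(alice_priv, bob_pub)
k_bob = shared_key(bob_priv, alice_pub)
assert k_alice == k_bob

ciphertext = xor_stream(k_alice, b"hello, Bob")  # all the server can store
plaintext = xor_stream(k_bob, ciphertext)        # only Bob can recover this
print(plaintext)  # b'hello, Bob'
```

The point of the sketch is the quote above: there is no partial version of this. Any mandated scanning hook would have to sit at an endpoint, before encryption or after decryption, and that hook becomes a target for everyone, not just the authority it was built for.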

However, the EU is moving forward with the Chat Control proposal, which was set for a vote on October 14 but has now been postponed, sparking fears that end-to-end encrypted services could disappear from Europe.

According to Surfshark's research, at the time of the study (October 6), eight countries – together holding 269 seats in the European Parliament and representing 37% of the EU population – remained undecided on their position. Among them was Germany, which has the largest number of MEPs (96). Meanwhile, 12 countries expressed support for the proposal, including France, which has the second-largest MEP count at 81.

Surfshark, in collaboration with the VPN Trust Initiative, firmly believes that undermining encryption is not an effective means of reducing online harm. If the proposal passes, the security landscape for messaging apps would shift completely, as end-to-end encryption would no longer be possible. Some providers have already expressed their concerns and threatened to leave the European market.

Strong encryption is crucial as it constitutes a fundamental human right, safeguards individuals and organizations from cyberattacks, protects vulnerable populations, supports the global economy, and is a cornerstone of both democracy and security.


Which messaging apps are the most secure at the moment?

Considering the level of encryption and other analysed factors, Signal ranks at the top for its commitment to minimising user privacy risks, with a score of 0.99. As one of the most downloaded messaging apps in 2025, it stands out by collecting minimal data – specifically just phone numbers – which are used solely for app functionality as noted in the App Store. Furthermore, Signal completely avoids user tracking. By employing quantum-secure cryptography to protect communications and avoiding AI features that could potentially compromise privacy if misused, Signal ensures that users’ conversations remain as private and secure as possible.

LINE ranks lowest, followed by Discord, Rakuten Viber Messenger, and Messenger by Meta Platforms, all of which fall below the 0.52 average score across the analysed apps.

According to information in the App Store, LINE, Discord, and Rakuten Viber Messenger are the only apps that may collect data for user tracking. Meanwhile, Messenger by Meta Platforms is notable for declaring that it may collect an extensive range of data types – 30 out of 35 listed in the App Store – for purposes beyond app functionality. These potential uses include, but are not limited to, advertising, product personalisation, or analysing user behaviour.

Ninety percent of the analysed messaging apps offer AI features, which could potentially increase privacy risks. For example, AI might be used to summarise private conversations or translate personal messages. While these features may offer benefits, they also raise concerns about granting access to information that should be private and visible only to the sender and receiver.

Additionally, users can integrate AI assistants into ongoing conversations with others or even engage with AI as a friend. However, users aren't just sharing information with a virtual friend; they're also providing data to the company that owns the app or the AI service.