This blog is reserved for more serious things, and ordinarily I wouldn’t spend time on questions like the above. But much as I’d like to spend my time writing about exciting topics, sometimes the world requires a bit of what Brad DeLong calls “Intellectual Garbage Pickup,” namely: correcting wrong, or mostly-wrong, ideas that spread unchecked across the Internet.

This post is inspired by the recent and concerning news that Telegram’s CEO Pavel Durov has been arrested by French authorities for the company’s failure to sufficiently moderate content. While I don’t know the details, the use of criminal charges to coerce social media companies is a pretty worrying escalation, and I hope there’s more to the story.

But this arrest is not what I want to talk about today.

What I do want to talk about is one specific detail of the reporting. Specifically: the fact that nearly every news report about the arrest refers to Telegram as an “encrypted messaging app.” Here are just a few examples:

This phrasing drives me nuts because in a very limited technical sense it’s not wrong. Yet in every sense that matters, it fundamentally misrepresents what Telegram is and how it works in practice. And this misrepresentation is bad for both journalists and particularly for Telegram’s users, many of whom could be badly hurt as a result.

Now to the details.

Does Telegram have encryption or doesn’t it?

Many systems use encryption in some way or another. However, when we talk about encryption in the context of modern private messaging services, the word typically has a very specific meaning: it refers to the use of default end-to-end encryption to protect users’ message content. When used in an industry-standard way, this feature ensures that every message will be encrypted using encryption keys that are only known to the communicating parties, and not to the service provider.

From your perspective as a user, an “encrypted messenger” ensures that each time you start a conversation, your messages will only be readable by the folks you intend to speak with. If the operator of a messaging service tries to view the content of your messages, all they’ll see is useless encrypted junk. That same guarantee holds for anyone who might hack into the provider’s servers, and also, for better or for worse, for law enforcement agencies that serve providers with a subpoena.
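To make the “keys known only to the communicating parties” idea concrete, here is a minimal sketch (in Python, using the cryptography package) of the basic pattern every industry-standard end-to-end encrypted messenger builds on: each client performs a Diffie-Hellman exchange and encrypts locally, so the relay server only ever handles ciphertext. This is a toy illustration under those assumptions, not any real messenger’s protocol: there is no key ratcheting and no authentication of the public keys.

    from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
    from cryptography.hazmat.primitives.kdf.hkdf import HKDF
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.ciphers.aead import ChaCha20Poly1305
    import os

    # Each user generates a keypair on their own device; private keys never leave it.
    alice_priv = X25519PrivateKey.generate()
    bob_priv = X25519PrivateKey.generate()

    # Each side combines its own private key with the other's public key
    # to arrive at the same shared secret (Diffie-Hellman).
    shared_secret = alice_priv.exchange(bob_priv.public_key())
    key = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
               info=b"toy-e2ee-demo").derive(shared_secret)

    # Messages are encrypted on-device; the server only relays nonce + ciphertext.
    nonce = os.urandom(12)
    ciphertext = ChaCha20Poly1305(key).encrypt(nonce, b"hi Bob", None)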

Telegram clearly fails to meet this stronger definition for a simple reason: it does not end-to-end encrypt conversations by default. If you want to use end-to-end encryption in Telegram, you must manually activate an optional end-to-end encryption feature called “Secret Chats” for every single private conversation you want to have. The feature is explicitly not turned on for the vast majority of conversations, is only available for one-on-one conversations, and is never available for group chats with more than two people in them.

As a kind of a weird bonus, activating end-to-end encryption in Telegram is oddly difficult for non-expert users to actually do.

For one thing, the button that activates Telegram’s encryption feature is not visible from the main conversation pane, or from the home screen. To find it in the iOS app, I had to click at least four times — once to access the user’s profile, once to make a hidden menu pop up showing me the options, once to select “Start Secret Chat,” and a final time to “confirm” that I wanted to use encryption. And even after this I was not able to actually have an encrypted conversation, since Secret Chats only work if your conversation partner happens to be online when you do this.

Starting a “secret chat” with my friend Michael on the latest Telegram iOS app. From an ordinary chat screen this option isn’t directly visible. Getting it activated requires four clicks: (1) to get to Michael’s profile (left image), (2) on the “…” button to display a hidden set of options (center image), (3) on “Start Secret Chat”, and (4) on the “Are you sure…” confirmation dialog. After that I’m still unable to send Michael any messages, because Telegram’s Secret Chats can only be turned on if the other user is also online.

Overall this is quite different from the experience of starting a new encrypted chat in an industry-standard modern messaging application, which simply requires you to open a new chat window.

While it might seem like I’m being picky, the difference in adoption between default end-to-end encryption and this experience is likely very significant. The practical impact is that the vast majority of one-on-one Telegram conversations — and literally every single group chat — are probably visible on Telegram’s servers, which can see and record the content of all messages sent between users. That may or may not be a problem for every Telegram user, but it’s certainly not something we’d advertise as particularly well encrypted.

(If you’re interested in the details, as well as a little bit of further criticism of Telegram’s actual encryption protocols, I’ll get into what we know about that further below.)

But wait, does default encryption really matter?

Maybe yes, maybe no! There are two different ways to think about this.

One is that Telegram’s lack of default encryption is just fine for many people. The reality is that many users don’t choose Telegram for encrypted private messaging at all. For plenty of people, Telegram is used more like a social media network than a private messenger.

Getting more specific, Telegram has two popular features that make it ideal for this use-case. One of those is the ability to create and subscribe to “channels,” each of which works like a broadcast network where one person (or a small number of people) can push content out to millions of readers. When you’re broadcasting messages to thousands of strangers in public, maintaining the secrecy of your chat content isn’t as important.

Telegram also supports large public group chats that can include thousands of users. These groups can be made open for the general public to join, or they can be set up as invite-only. While I’ve never personally wanted to share a group chat with thousands of people, I’m told that many people enjoy this feature. In the large and public instantiation, it also doesn’t really matter that Telegram group chats are unencrypted — after all, who cares about confidentiality if you’re talking in the public square?

But Telegram is not limited to just those features, and many users who join for them will also do other things.

Imagine you’re in a “public square” having a large group conversation. In that setting there may be no expectation of strong privacy, and so end-to-end encryption doesn’t really matter to you. But let’s say that you and five friends step out of the square to have a side conversation. Does that conversation deserve strong privacy? It doesn’t really matter what you want, because Telegram won’t provide it, at least not with encryption that protects you from sharing your content with Telegram servers.

Similarly, imagine you use Telegram for its social media-like features, meaning that you mainly consume content rather than producing it. But one day your friend, who also uses Telegram for similar reasons, notices you’re on the platform and decides she wants to send you a private message. Are you concerned about privacy now? And are you each going to manually turn on the “Secret Chat” feature — even though it requires four explicit clicks through hidden menus, and even though it will prevent you from communicating immediately if one of you is offline?

My strong suspicion is that many people who join Telegram for its social media features also end up using it to communicate privately. And I think Telegram knows this, and tends to advertise itself as a “secure messenger” and talk about the platform’s encryption features precisely because they know it makes people feel more comfortable. But in practice, I also suspect that very few of those users are actually using Telegram’s encryption. Many of those users may not even realize they have to turn encryption on manually, and think they’re already using it.

Which brings me to my next point.

Telegram knows its encryption is difficult to turn on, yet it continues to promote its product as a secure messenger

Telegram’s encryption has been subject to heavy criticism since at least 2016 (and possibly earlier) for many of the reasons I outlined in this post. In fact, many of these criticisms were made by experts including myself, in years-old conversations with Pavel Durov on Twitter.1

Although the interaction with Durov could sometimes be harsh, I still mostly assumed good faith from Telegram back in those days. I believed that Telegram was busy growing their network and that, in time, they would improve the quality and usability of the platform’s end-to-end encryption: for example, by activating it as a default, providing support for group chats, and making it possible to start encrypted chats with offline users. I assumed that while Telegram might be a follower rather than a leader, it would eventually reach feature parity with the encryption protocols offered by Signal and WhatsApp. Of course, a second possibility was that Telegram would abandon encryption entirely — and just focus on being a social media platform.

What’s actually happened is a lot more confusing to me.

Instead of improving the usability of Telegram’s end-to-end encryption, the owners of Telegram have more or less kept their encryption UX unchanged since 2016. While there have been a few upgrades to the underlying encryption algorithms used by the platform, the user-facing experience of Secret Chats in 2024 is almost identical to the one you’d have seen eight years ago. This, despite the fact that the number of Telegram users has grown by 7-9x during the same time period.

At the same time, Telegram CEO Pavel Durov has continued to aggressively market Telegram as a “secure messenger.” Most recently he issued a scathing criticism of Signal and WhatsApp on his personal Telegram channel, implying that those systems were backdoored by the US government, and only Telegram’s independent encryption protocols were really trustworthy.

While this might be a reasonable nerd-argument if it was taking place between two platforms that both supported default end-to-end encryption, Telegram really has no leg to stand on in this particular discussion. Indeed, it no longer feels amusing to see the Telegram organization urge people away from default-encrypted messengers, while refusing to implement essential features that would widely encrypt their own users’ messages. In fact, it’s starting to feel a bit malicious.

What about the boring encryption details?

This is a cryptography blog and so I’d be remiss if I didn’t spend at least a little bit of time on the boring encryption protocols. I’d also be missing a good opportunity to let my mouth gape open in amazement, which is pretty much what happens every time I look at the internals of Telegram’s encryption.

I’m going to handle this in one paragraph to reduce the pain, and you can feel free to skip past it if you’re not interested.

According to what I think is the latest encryption spec, Telegram’s Secret Chats feature is based on a custom protocol called MTProto 2.0. This system uses 2048-bit* finite-field Diffie-Hellman key agreement, with group parameters (I think) chosen by the server.* (Since the Diffie-Hellman protocol is only executed interactively, this is why Secret Chats cannot be set up when one user is offline.*) MITM protection is handled by the end-users, who must compare key fingerprints. There are some weird random nonces provided by the server, whose purpose I don’t fully understand* — and that in the past used to actively make the key exchange totally insecure against a malicious server (but this has long since been fixed.*) The resulting keys are then used to power the most amazing, non-standard authenticated encryption mode ever invented, something called “Infinite Garble Extension” (IGE) based on AES and with SHA2 handling authentication.*

NB: Every place I put a “*” in the paragraph above is a point where expert cryptographers would, in the context of something like a professional security audit, raise their hands and ask a lot of questions. I’m not going to go further than this. Suffice it to say that Telegram’s encryption is unusual.
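For the morbidly curious, here is a rough sketch of the IGE chaining rule itself, just to show how far it sits from standard authenticated modes like AES-GCM. This is my own toy illustration of the generic mode (each ciphertext block is E_K(plaintext ⊕ previous ciphertext) ⊕ previous plaintext), written against pycryptodome; it is emphatically not Telegram’s implementation, which layers its own key derivation and SHA2-based authentication on top, and it assumes input already padded to the 16-byte block size.

    from Crypto.Cipher import AES

    def aes_ige_encrypt(key: bytes, iv: bytes, plaintext: bytes) -> bytes:
        # The 32-byte IV supplies the two initial chaining blocks:
        # c_0 (first 16 bytes) and m_0 (second 16 bytes).
        assert len(iv) == 32 and len(plaintext) % 16 == 0
        ecb = AES.new(key, AES.MODE_ECB)  # raw block cipher, used as E_K
        prev_c, prev_m = iv[:16], iv[16:]
        out = bytearray()
        for i in range(0, len(plaintext), 16):
            block = plaintext[i:i + 16]
            masked = bytes(a ^ b for a, b in zip(block, prev_c))
            cipher_block = bytes(a ^ b for a, b in zip(ecb.encrypt(masked), prev_m))
            out += cipher_block
            prev_c, prev_m = cipher_block, block
        return bytes(out)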

If you ask me to guess whether the protocol and implementation of Telegram Secret Chats is secure, I would say quite possibly. To be honest though, it doesn’t matter how secure something is if people aren’t actually using it.

Is there anything else I should know?

Yes, unfortunately. Even though end-to-end encryption is one of the best tools we’ve developed to prevent data compromise, it is hardly the end of the story. One of the biggest privacy problems in messaging is the availability of loads of metadata — essentially data about who uses the service, who they talk to, and when they do that talking.

This data is not typically protected by end-to-end encryption. Even in applications that are broadcast-only, such as Telegram’s channels, there is plenty of useful metadata available about who is listening to a broadcast. That information alone is valuable to people, as evidenced by the enormous amounts of money that traditional broadcasters spend to collect it. Right now all of that information likely exists on Telegram’s servers, where it is available to anyone who wants to collect it.
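As a purely hypothetical illustration (the field names below are invented, not taken from Telegram), even a service that never sees message bodies can accumulate records along these lines for every message it relays — and that is plenty for traffic analysis:

    # One hypothetical per-message log entry on a relay server; the body is
    # encrypted, but everything else here is visible to the operator.
    metadata_event = {
        "sender_id": 184467,
        "recipient_id": 990331,          # or a channel/group identifier
        "timestamp": "2024-08-25T14:03:22Z",
        "message_size_bytes": 412,       # size and timing leak information too
        "client_ip": "203.0.113.7",
    }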

I am not specifically calling out Telegram for this, since the same problem exists with virtually every other social media network and private messenger. But it should be mentioned, just to avoid leaving you with the conclusion that encryption is all we need.

Main photo “privacy screen” by Susan Jane Golding, used under CC license.

Notes:

  1. I will never find all of these conversations again, thanks to Twitter search being so broken. If anyone can turn them up I’d appreciate it.

52 thoughts on “Is Telegram really an encrypted messaging app?”

  1. Hey! Awesome article. I have a question, regarding this part “ The resulting keys are then used to power the most amazing, non-standard authenticated encryption mode ever invented, something called “Infinite Garble Extension” (IGE) based on AES and with SHA2 handling authentication.*” was that sarcasm or you being genuine? I ask because you used the * symbol.

    You don’t think SHA256 and AES are secure? If I misunderstood you then sorry (: that’s why I am asking for clarification

    Thank you!

    1. I suspect he means it’s possible to make a disaster out of good components like AES+SHA2 and with a non-standard approach you are taking your chances like the 2024 edition of the Secret Service.

  2. I remember at the time Nadim Kobeissi and Taylor Hornby talking a lot about Telegram’s brokenness, and found the following:

    https://x.com/DefuseSec/status/413861043585036289

    https://www.cryptofails.com/post/70546720222/telegrams-cryptanalysis-contest

    https://news.ycombinator.com/item?id=6915741

    Unfortunately Nadim blocked me a long time ago so I can’t actually see tweets, but hopefully you still can:

    https://x.com/search?q=%40kaepora%20telegram&src=typed_query

  3. It would be excellent if you gave some alternative apps which you deem the best for encrypted messaging.

      1. Signal developed and uses the Double Ratchet, which is nice. An issue I see though is the metadata that can be collected from their servers given that Signal is centralised.

        Double Ratchet over a distributed network would seem to be a good step forward, and I wouldn’t be surprised if there aren’t already over a handful that do this properly on the big blockchains. A side-effect also would be the reduction of spam messages because each message would not be zero-cost.

      2. @JohnSmith I’m sure Moxie and crew are nice people but it’s not rubber-hose safe, so I’d rather choose security by design than security by gentlemen’s handshake.

        Don’t forget, I was responding originally to “state-of-the-art”. If you don’t care about metadata and are just happy with e2e, then stick with Signal (I personally use Signal with close friends rather than other apps).

        Sadly I could only find a single blockchain-based messaging service – Session App – which uses the Loki network (based on CryptoNote) over Tor. Nice, but it unfortunately looks abandoned.

        Given the tooling we have now towards the end of 2024, I’m willing to bet someone out there could probably hack out a gloriously safe messenger app over a few weekends as a Dapp via Tor!

        (An added bonus in 2024 is that some networks now have user’s addresses = user’s public key)

      3. @pete It says “Military-grade security” on their website so I would chuck them in the trash, but besides that it looks like they’re open source, doing messaging over a decentralised blockchain, and they say E2EE – so points there!

        My only problem now is that, as I said in my original comment, they’ve got to do it on a big network, because otherwise, with under 100 nodes, the network metadata will make it simple to trace who’s talking to whom.

        Another problem I have is that their E2EE solution is pretty thin in their documentation. So far all I can find is that their E2EE is “Military grade” and “256 bit” – a detailed whitepaper would be nice otherwise you’ve got to trust their marketing or view the source.

  4. Are you able to download your Twitter archive? You could then do a local search of those posts. At least find a particular one, and then open its URL. From there the threads should be linked.

  5. I feel the description of Secret Chat initiation complexity is a bit disingenuous; on the Android app at least, one only needs to press the ‘new message’ pencil in the bottom right and the option to start a new secret chat is front and center, along with the new group and channel options…

  6. “Even though end-to-end encryption is one of the best tools we’ve developed to prevent data compromise, it is hardly the end of the story.”

    In addition to the data and metadata in flight, I would add consideration of the security of the client and server itself. Can/has it been audited? Is there any way to attest that the client and server are built from the audited code, etc.? The chances of a deliberate backdoor seem significant if the company is indeed malicious.

    1. Sadly, audited code is not sufficient. One also needs to employ security throughout the build process to ensure artifacts are actually built from the audited code.

      1. For the clients, both Telegram and Signal provide reproducible builds. (Signal only provides that for Android, as doing reproducible builds on iOS is difficult due to how the App Store works.)

  7. Great article, Matthew. Given the challenges with Telegram’s encryption, would you recommend switching to Signal for those who prioritize privacy? Curious to hear your thoughts on the balance between usability and security.

  8. And can we talk about how Telegram is fundamentally NOT secure just by virtue of all accounts having an associated phone number?

  9. This was a really well written and professional heads up. Thanks for this piece and the time and effort you put into it.

    1. Matthew, ALL,

      The news indicates Pavel Durov has been released on 5 million bail and is facing six charges.

      https://www.politico.eu/article/france-charges-telegram-ceo-pavel-durov-released-bail/

      And that there are other outstanding arrest warrants for others in Telegram’s senior management.

      When looking at the list of charges, you will see they are all of a type that requires access to users’ content.

      Which strongly suggests one or both of,

      1, No E2EE used by users.

      2, Telegram has made user content available to authorities in one way or another.

      But there is a serious point to be considered.

      In the days of pay for message transport such as the Letter/Post, Telegram / Telex and “Plain Old Telephone Service”(POTS) there was the legal status of being “Common Carrier”. This protected the service provider from any actions the users of the service carried out on a “Don’t look or Say” basis.

      This differentiated them from printed material distributors such as Editors of Newspapers and Owners of Book and other print media publishers.

      This century so far has been about removing people’s rights of free association and free speech, and about all too obvious pushes by Governments and their “Guard Labour” agencies to force people not just into electronic communication but into very insecure electronic communications.

      However a very evident side effect of this is the clear difference between the aims of the “Police Forces” and the “Security Services”.

      Police Forces have to have “clear sight” of “message content” to be usable in court to meet the “burden of proof” of “Beyond reasonable doubt” for criminal prosecution. Security Services basically do not need access to message contents but meta-data and meta-meta-data which gives rise to “traffic analysis”.

      As far as I’m aware there is no user application or service that provides effective protection against Traffic Analysis.

      With the recent outcry in the UK over people being jailed for 20-50 months over comments made online, you have to wonder how long before the likes of Traffic Analysis and contact analysis become not just used but normalised in courts.

  10. re: twitter search, try this

    (from:matthew_d_green) (to:durov) until:2022-12-31 since:2006-01-01

  11. Well worth the read. Thank you. One aspect of Telegram you don’t mention is that (AFAIK unlike Signal or WhatsApp) Telegram makes it easy to deploy bot services (some of which have legitimate uses; some of which most certainly do not) – it would appear that Mr Durov is more interested in that side of the application than making E2EE easy and pervasive.

  12. Another factor which entirely bypasses any end-to-end encryption is the O/S’s capability to see everything that you can see. The O/S is presenting everything to you, both input and output. If there is a routine, based on the think-of-the-children excuse, helping you by monitoring for illegal content – well, it will have to know everything on the device, including pictures, to “help” you.

    Go ahead and encrypt the chat over and over and it won’t matter.

    Now, if that is only accessible to the developer it is still better than open to the public. The only reasonable thing is to never use anything online that you would not want to be seen by anyone else.

  13. A long time ago I too was guilty of bringing up CPU/OS/compiler etc security when the discussion was high up on the application layer, and vice-versa. It’s happening here in the comments of this very post and at the same time over at Metzdowd (where the discussion was about safe SSD erasure and compiler optimisations)…

    I think for your next “Intellectual Garbage Pickup” post you should talk about ‘When discussing security at a specific layer, let’s all stick to the same damn layer unless the issue is not a general issue rather than yet another derailing to Evil Maids, Evil Fabrication Plant employees, and “Trusting Trust”‘.

    It’s almost as if every security forum has an army of people trying to outdo themselves on how many things they can implement out of The Simple Sabotage Field Manual.

  14. Matthew, ALL,

    It’s been said that the crypto was a “Roll Your Own” by Mr Durov’s younger brother. As I can not find any real info on him I can not judge his potential competence.

    But one thing you really should explain better and a lot more clearly is the “Limit of two parties in E2EE” and why virtually every group / conferencing system can not be truly E2EE.

    Because it affects most if not all “Chat / Conferencing” applications.

    Further it needs to be mentioned that whilst “meta data” and “meta meta data” are technically “traffic analysis” and very fast as it does not generally involve crypto… They are not things Law Enforcement are generally interested in because they have near zero evidentiary value in a court and are generally too difficult to describe to a tired / distracted jury.

    Any way keep “ploughing your furrow” as it’s very beneficial to raising new crops of informed people.

  15. My question is, why isn’t the encryption turned on by default and why is it so hard to turn it on?

  16. @ alfiedotwtf,

    Whilst Moxie might be nice and smart in some ways…

    When secure messaging apps came out I made myself quite unpopular over comments I’d made over on the Bruce Schneier blog and some other places.

    The reason is I pointed out none of these “secure apps” are actually secure for the users, no matter what the developers say or do.

    The reason is the “weak link in the chain” of the system, which is the “elephant in the room” of all modern consumer and commercial systems connected to a method of communications.

    In all cases, due to deficiencies in “the system design”,

    “The security end point is before the communications end point and an ‘End Run Attack’ is easily possible”

    And in all cases so far it’s actually happening.

    The attacker does not need to break the app, they simply need to find a way via another app, vulnerability in another app, the OS, the Drivers, or even firmware / hardware.

    This enables them to bypass the app and “look over the user’s shoulder” at the user interface, which is something we think about much more now thanks to Apple and others doing “on-device” anti-CSAM scanning or similar, with Microsoft and its latest additions to the Win 11 betas being the most thorough. These are a very effective “back door” in the system that no “only on device” app etc can stop.

    The only solution is to take the security end point beyond the communications end point.

    To do this means the encryption and decryption has to be done outside of the Smart Device.

    I’ve described how to do this over on the Schneier blog in the past with a simple example using a pencil and paper cipher that has been proven to have “perfect secrecy” by Shannon’s criteria that all bits in any ciphertext are “equiprobable”.

    I’ve also said why it’s not going to work in the general case, and to many people’s surprise it’s not due to the significant “KeyMat” issues of One Time Pads and similar.

    It’s a user issue of “convenience” or “instant gratification”.

    Users want the “look and feel” in their minds of “being secure from the man”, but in reality they will never ever do the OpSec that is required to ensure this.

    We know this from Encrochat and similar backdoored mobile phone systems used by people with the most to lose, that is the violent and worse criminals.

    The real solution is to realise that any consumer or commercial communications device you can obtain or build is not, nor can it be, “secure”, thus you have to treat it like it is a “General Broadcast” device making any data on it or through it available to all.

    We knew this back before the foundations of computers and information theory were laid down in the 1930s, very nearly a century ago. And we also knew how to achieve the level of security needed,

    “You overlay a security channel over the broadcast channel”

    That is you ensure the security end points are beyond the reach of the communications end points.

    Mechanically the One Time Pad was about the simplest cipher you can think of that is by any reasonable measure even today secure. However the OTP has many issues not least it is tedious and slow to use.

    So not “convenient” nor “instant gratification”.

    These days we have reasonably powerful microcontrollers that can take much of “the drudge” out of the user mechanics of the OTP.

    However making actually secure systems is much harder because as you strengthen one “weak link” you now have a new “low hanging fruit” weak link to strengthen and so on.

    But security is not just about the message content.

    Eventually the weak links become “meta-data” and “meta-meta-data” subject to “traffic analysis” and “human behaviour surveillance”.

    Think on the simple case of “pizza deliveries” where the fact you order in way more than usual tells an observer there is more than just you in your home / work place even though there may be secret entrance tunnels.

    It’s the same with communications, it’s why you don’t need to have the message content to know there is a change in “action readiness” in the opponent you are observing, thus it’s reasonable to assume if the traffic pattern changes that something else has changed to cause it.

    But it gets a bit more subtle, even if you do not change your message frequency meta-data there is still meta-meta-data that can be used.

    Some times this is known as a pattern change.

    If you message four people every friday prior to meeting up for a drink, the lack of a message to one of them or the fact the message is of a different size, gives away that, the person is going to “break habit” for some reason known to you. Which is an indicator of that catchall “conspiracy”…

    Message security may be the low hanging fruit of “Operational Security”(OpSec) but fixing it just makes another fruit low hanging… The trick is to make “all fruit beyond reach” and that can be easy or hard depending on your opponents resources and if you even know they are an opponent.

    Few people even those who know their life is in danger actually go to the OpSec lengths required to give them a “lack of weakness” thus avoid giving information to a hostile opponent.

    1. Clive, you have reminded me of all the claims of using quantum entanglement to… well, I’m sure you remember better than I do. If I was the head of the NSA, I’d be wondering if it is more cost effective to hire Penn & Teller or Schneier & Norris. (FWIW: You’ve never been unpopular with me. The only time your comments gave me pause was when I thought you had a stroke or were being impersonated.)

      1. @ Bernie,

        Thank you for the kind comments.

        Hopefully more of those that used Schneier on Security as regulars will move as Bruce puts his blog into what looks like “pre-retirement mode” run-up to “End of Life close down”.

      2. Where can we find Clive on Cryptography? All I can find is Clive on Cooking, where all the recipes start with “preheat the oven” even those for frozen smoothies.

  17. I suggest you use an alternative frontend for Twitter, for example nitter.poast.org

    1. Whilst Nitter is a usable front end, Mr X has issues with it, and has tried in the past to stop it being used.

      This type of behaviour can become “Cat and Mouse” as has been seen with other major platforms or suppliers of user applications. And sadly for those that run and use front ends, it can happen at any time for any duration.

      I used to follow various people on Twitter as an “audience” not a “participant” so Nitter quickly became my chosen method of “viewing the stages at the festival”.

      But with the transition from bird to X banded snake I’ve “left the festival grounds” entirely and I suspect many many others have as well.

  18. Wow, what a wrong direction all this took. I mean there’s zero doubt about your analysis. And there’s a lot of shitty marketing which got those terms to Tg. (And my personal pun is to ask how long should we wait for the server code release promise fulfillment.) But current events aren’t about security, crypto, and so on.

    It’s most of all about freedom, free speech, and censorship. They just want to dictate to Tg and oppress its users like they do via any major platform, company, association and so on.

    Moving people to better tech than Tg is a great thing to do. But we should stand for Tg’s liberties with the same strength it stood for ours.

    1. https://core.telegram.org/mtproto

      The MTProto encryption is used on every chat in Telegram, not just the Secret chats.

      Secret chats are indeed end to end technically, but that doesn’t mean the rest of the chats aren’t encrypted because according to their documentation, they are encrypted.

      I think everyone is caught up on that, but my question would be: has there been one recent (or otherwise) incident of the encryption on Telegram being broken in either their secret or cloud chats? Anyone?

      Please share if you know of one. I’m not referring to accounts being stolen, because that happens. Any weak links in the updated MTProto encryption that have been exploited to decrypt the messages either through Telegram servers or on the devices?

      I’m very curious because I could find none.

  19. For those thinking about,

    “Where do we go and how?”

    Various people have been thinking about it for quite some time.

    One big problem is that the Internet is actually a point to point network with “Choke Points” not an unrestricted “Broadcast system” like a radio system.

    A broadcast system effectively sends to everyone in range. Thus whilst the TX location is known precisely, the location of RX stations is only known to be within a large area, and the numbers receiving unknown. This is why “Numbers Stations” work quite well.

    A point to point network means both ends have to be known. That is as with a phone you dial from a number to another number. And this is one of the plainly observable pieces of meta-data visible at any choke-point in the network (especially those that “packet switch” rather than “circuit switch”).

    Thus a big issue is how do you communicate if you do not know what the number is for the intended recipient?

    Until quite recently this was solved by use of static numbers like house addresses that were put in a read only database like a telephone directory.

    But with mobile communications this no longer works: as your location changes, so does your basic address. One solution is to put a layer over this which is a read and write database of virtual addresses, where each communications end point writes its physical address in as a field of the user’s virtual address record.

    The result is one heck of a lot of network traffic that is not user traffic and a “Rendezvous Protocol” for end points to find each other.

    Such a system “self instruments” thus can build contact maps not just on the database server but also to anyone who can see the message content of any network traffic passing destined for the server. Which is obviously a security weakness that needs to be addressed.

    Likewise it can be an obvious indicator of where the user end point is, and that it is about to communicate to another user end point. So a great deal of information via this becomes available to a passive observer on the network.

    Resolving this is somewhere between difficult and impossible to make secure with our current type of usage of point to point data networks.

    So we also need to think about fundamental changes to the “physical layers” of data networking and how we are going to change things to improve the security of our privacy.

  20. What about comparing Signal, Jami, Session, Briar, Cwtch, Tox, Ricochet Reflex and all, from the points of view of encryption, serverless, Tor relays, multiplatform, extra capacities like pics/video… etc?

  21. @ herve55555,

    “What about comparing…”

    As I point out above none of them are “secure as a system” even though their link crypto may be good.

    Because you need to get the “security end point” off the smart device that has the “communications end point”. And you should do this via a reasonably secure “energy gap” crossing system that is “instrumented”.

    Whilst doing an “energy gap” crossing is hard using “ease of use technology”, it’s comparatively only moderately hard when compared with doing it “properly instrumented” with technology.

    Back in the 1990’s I pointed out the real issue with the then online banking systems was they at best only authenticated the communications not the transactions, so “Man In The Middle” attacks were possible.

    Importantly though that all transactions should be authenticated “through the user”. That is the user formed the actual link to the authentication device to stop/limit the use of various side channel tricks in the comms channels.

    In part this is because I was the “guilty person” that worked out how to get the use of SMS/Texts to be a then viable second channel for sending “one time” codes to users a decade before (long before Smart Devices).

    The lesson is not “the communications channel has to be secure”, but “the entire system has to be secure”.

    A lesson that those security app developers appear not to have learned in three decades.

  22. @ Matthew, ALL,

    Telegram chat fails and the guilty are convicted.

    In the UK there are restrictions on reporting prior to cases going to court, and “contempt of court” is not something people want to come up against as you have to prove yourself innocent against a judge who has accused you of violating their court…

      Which may be a reason why news of the arrests and trial of three young men and the failing of their use of Telegram has had to wait until after the “guilty verdicts”.

    Rather than me do a long post I’ll just link to Brian Krebs article,

    https://krebsonsecurity.com/2024/09/owners-of-1-time-passcode-theft-service-plead-guilty/

  23. Some encrypted communication software claims to have end-to-end encryption, but no one knows how they implement these technologies. Can private keys that can be predicted be considered safe?

  24. The Telegram desktop client doesn’t seem to support Secret Chat at all.

    Telegram’s desktop apps gave the impression of being total peers of the mobile apps. For example, one can create a new Telegram account using only the desktop client. When I tried to actually do that I got blocked by opaque spam stuff, but at least it looked like it was supposed to work according to the protocols, technology, and product management. It was a welcome relief from Signal and WhatsApp trying to pin their account to me like a bag of meat, as jealous advertisers do to bust out of the web privacy sandbox by pushing mobile apps on me.

    But the first impression is false. The Secret Chat option is totally missing from the desktop clients. I can see how this could make the math easier since the Secret Chats are then 1:1. But this is not up to the technical level expected in 2024.
