Telegram with Matthew Green

We finally have an excuse to tear down Telegram! Their CEO got arrested by the French, apparently not because the cryptography in Telegram is bad, but special guest Matt Green joined us to talk about how the cryptography is bad anyway, and you probably shouldn’t use Telegram as a secure messenger of any kind!

This transcript has been edited for length and clarity.

Deirdre: Hello. Welcome to Security Cryptography Whatever. I’m Deirdre.

David: I’m David.

Thomas: I’m drinking and there’s…Matt. Hi, Matt.

Deirdre: We have a special guest today, returning champion Matt Green. How are you, Matt?

Matt: I’m good. Thanks for having me.

Deirdre: Oh, yeah. We’re bringing Matt on because…Telegram is in the news, and we finally have a reason to talk about fucking Telegram. And Matt both wrote an up to date blog post about it and has written about Telegram in the past, and we just thought he was the best person to come and just have a drink and talk about Telegram, because it’s in the news.

Matt: Thanks for having me. And I am having a drink, so that we’ve done.

Deirdre: All right. Telegram’s in the news because the CEO of Telegram, what’s his name? Pavel Durov, who is a French citizen, was arrested in France for things including basically not cooperating with law enforcement when trying to prosecute crimes happening on Telegram, such as child sexual abuse material being spread about on Telegram. And also, there were a couple of items in the press release of the indictment about, basically, operating cryptology in France without an import license? We’re used to export restrictions, but apparently France has this weird law. But, you know, there’s tons of public stuff happening on Telegram, never mind the crappy, end to end encrypted, opt-in-only private DM’s that are not used by a lot of people, and yet they are throwing that in, for reasons, into their charges. And so that’s why we’re talking about Telegram.

Thomas: I think a good place for Matt to start would be to explain to our listeners the workings of the French legal system and how it differs from our own. There was a movie, like, a year and a half ago or whatever, ‘Anatomy of a Fall’, which is like, a French criminal legal drama or whatever. And then, right as that movie came out, Debevoise and Plimpton, I’m probably pronouncing that wrong— I’m not a lawyer, but they’re a huge law firm, they published, like, this big white paper on, like, the distinctions between the French criminal legal system and ours. And it’s fucking crazy. French criminal law is bananas.

The prosecutor works for the judge. It’s how it works in France.

Deirdre: Weird.

Thomas: So, yeah, it’s fun.

Deirdre: So just to kind of cover, we’re not going to get into the other crap about, like, okay, Telegram is run by this guy, Pavel Durov. He was originally Russian. He fled Russia at some point. He created Telegram. I don’t remember if it started as a messenger or turned into a social media platform / has DM’s or the other way around or whatever, but Telegram has ostensibly 900 million users. It’s pretty big, but it’s much more of like, a kind of Twitter plus DM’s than like, end to end encrypted messenger, like WhatsApp or Signal. But it has DM’s and it has opt in private DM’s and it has that.

But also their public persona is very hands off. And this is part of the thing that got them in trouble. Instead of building in, like, ‘we couldn’t backdoor this if we wanted to’, or ‘we couldn’t snoop on your messages if we wanted to’, they’re just, ‘we don’t want to and we won’t’. And if you don’t do bad things in public channels, this is like on their website, we’ll leave you alone. They’re basically like, don’t do crimes in public, full stop. And I think that’s what’s got them into trouble. But also their whole persona is like, we don’t care.

We’re not gonna like throw you under the bus. And also we have end to end encrypted DM’s, but also the end to end encrypted DM’s are shit! So, I don’t know.

David: I just wanna note that this podcast also endorses not doing crimes in public.

Deirdre: Oh, sure. But like, unlike other platforms, like say, WhatsApp owned by Meta, or Signal, or other places that have end to end encrypted communications or DM’s or whatever it is, Meta is the largest reporter to NCMEC, the National Center for Missing and Exploited Children. They are huge and they cooperate. And this includes public information on WhatsApp. So if your profile picture is advertising bad shit, they’ll report you even though they can’t get at the contents of your end to end encrypted WhatsApp or Facebook Messenger. They cooperate like crazy, because they want to comply with the law and they’re a United States-based company, but Telegram explicitly does not cooperate. And it’s interesting because Telegram only has about 30 people. And even if they wanted to cooperate, it’s unclear if they could with so few people. They have to scale up, spend money.

Matt: And things like that, and they have cooperated, they seem to be very select—

Deirdre: They have cooperated, um, with other countries.

Thomas: Before this started up, I was talking to David. So like every other provider, they have published terms of service. And in their terms of service, they have things like, you can’t spam and you can’t advocate violence. And then the last term in the acceptable use stuff, in the terms of service, is: you may not publish illegal pornography on publicly visible Telegram channels. It’s right there. Right. You can see it right now.

Deirdre: It’s very specific

Thomas: Yeah. It’s real, real specific.

Deirdre: Yeah. Okay. But the crypto. The crypto has always been bad. The crypto is like—

Thomas: Before you go there. Okay. That’s kind of, that’s the thing I’m super interested in here, too, is just kind of like picking apart and trying to figure out how the hell they wound up where they wound up with cryptography. Right. Not as like a snarky comment, but just as kind of a sanity check. Matt, do you remember Lavabit?

Matt: I do. I remember Lavabit, yes.

Thomas: So to me, and you can tell me I’m crazy about this, and that would be valuable to me, but to me, these situations are uncannily similar. Right. So Lavabit was the encrypted email provider that Snowden was using in part at the time of, like, the release to Greenwald and all that stuff. Right? Like, they were a little bit tied up in the Snowden drama. And a lot of people on message boards know about Lavabit because they wound up in a situation where the DOJ was requiring them to disclose TLS keys because they didn’t do end to end encryption, which is a distinction we’ll get into in a minute. Right. If you had the TLS keys for Lavabit, you had all the messages there. And like, Ladar Levison, who is like the operator of Lavabit, played chicken with the DOJ, but ultimately gave in and gave them— like, there was a thing where he was almost in contempt because he tried to give the DOJ the TLS keys printed out on paper, which is an awesome move.

Matt: I remember he made the font very small. That was his trick.

Thomas: That’s great. It’s awesome. But if we’re trying to think about the war on cryptography or an escalating war against people’s privacy or whatever, these are the same situation. The issue here is don’t run providers that put you in a position where everybody knows that you can trivially comply with a CSAM investigation.

Matt: So what I think I would say about this is, and maybe I was going to save this part ’til the end because it’s not something I can say with any confidence. It’s just a feeling. You know, the difference between the Ladar— it was Ladar Levison, right? That was the Lavabit gentleman?

Deirdre: I think so.

Matt: The difference between him and the Telegram folks is I feel like Levison was basically naive, right? He didn’t know how to build a system, so he built the best system he could. And the best system he could was not very good. And then in the end, it turned out that all someone needed was a TLS secret key and then they were toast. Whereas I feel like Telegram knows exactly what it’s doing. And I feel like the great thing about knowing what you’re doing is when somebody shows up and says, you know, can I get these messages? You have three choices, right? It can be, sorry, I don’t have the keys at all, there’s nothing I can do for you, which is a bad place to be if you’re really in trouble. The second thing you could do is be like, well, I’m going to try not to give you the keys and then ultimately fail. And then the third place you can be is, hey, this is a negotiation. And I feel like that’s the position the Telegram folks are in. But again, I have no reason to believe that, but it’s interesting that that probably is more the case than the other way around.

Thomas: So Telegram crypto. I hear it’s pretty good.

David: But it notably probably doesn’t apply to whatever Pavel’s being arrested for, in that it could entirely be over the public channels that are the default in Telegram.

Matt: So, I think that would be the general case. But then there are these two charges, and I think you mentioned them, which are importing cryptology without a license. I actually had to sort of, like, read that again and try to remember my French and then go through Google Translate. ‘Without prior authorization’, maybe that’s a better phrase. I’m not quite sure what the French was. So, you know, it would be a much simpler thing to argue about if those two charges weren’t there. But those charges really do sound like this is about cryptography. So leave it to the French to kind of make everything much more confusing than it needs to be.

But yeah, Telegram’s crypto, well, we can go there, but Telegram’s crypto, we all have been around long enough to know a lot about Telegram’s crypto and its history.

Deirdre: But we’ve never talked about it on this channel in three years. We’ve just been sort of like, ah, that’s a piece of shit. So now we finally have a real reason to bring it up. One thing that, so all right, Telegram purports— so they’re not open source, number one. So we are kind of just trusting that they are saying what they say publicly, which is that they implemented MTProto 2.0. They originally had MTProto 1.0, which had some hilarious things in it that practically look like a back door, even if they thought they were putting randomness into it or whatever. But now they’re doing 2.0. And one thing that jumps out to me for MTProto 2.0 is they’re still doing finite field Diffie Hellman for these secret chats.

We mentioned this, but these are not on by default. You have to go in— on your nice blog post that we’ll link in the show notes, you have to go through several menus to get into the opt in, end to end encrypted DM option. And they’re doing parameter negotiation on the fly. You can’t send someone an encrypted DM if they’re offline. They both have to be online to agree. You can’t send it out of band, async, like you can on basically every other platform.

Thomas: Can you send Telegram DM’s that are not encrypted to people when they’re offline?

Deirdre: I think so.

Thomas: Really?

Matt: Yes, absolutely. It works just like old fashioned— Yeah, absolutely. I was sending my friend, I sent him a message saying, hey, do you mind if I start an encrypted DM with you and post pictures of it? And I was sending these messages just fine. And then when I started the encrypted DM, it was, sorry, he’s offline, can’t do anything.

Thomas: I was going to be like, if they just don’t have async message delivery, then I feel sympathy for them, but they don’t have async negotiation of the DH. That’s great. Awesome. Keep going, I’m sorry.

Deirdre: Okay, so, when you’re doing this like negotiation thing, some of us may remember, I think, Signal still kind of has it, but it’s in like a different part of the product, different surface, where like you could compare fingerprints of like the things that I’m sending you over here and the thing I’m sending over here, and you can compare them. And this is supposed to be out of band mitigation for some sort of man in the middle attacker or whatever, but for basically every other system that’s doing this they’re doing this for elliptic curve Diffie Hellman identity keys. But Telegram is doing it for on the fly, finite field Diffie Hellman negotiated key material. It’s like crypto from 25 years ago.

Matt: I don’t even think, I don’t think there are signatures on it. I actually have to check. I don’t even.

Thomas: Do they use a fixed finite field or do they negotiate the group on the fly?

Matt: No, the server gets to make one up. You ask, they give you a group, and then it generates parameters. And then you, like, painstakingly have to check all the parameters and make sure. But this is a place where we could talk about it because that’s, you know, an opportunity. There are these SNFS parameters you can come up with, and I don’t know how plausible it is to make, you know, a backdoored set of parameters. Probably not very, but, like, it’s the invitation for that kind of stuff that makes me worried.

David: Their own docs say that you should check and make sure that the prime is a safe prime, meaning that p minus one over two is also prime. But the funny thing about that is that it immediately goes on to describe how to check that you have a valid generator, and it picks the most complicated method, because if you have a safe prime, you just raise g to the p minus one over two and check if that’s p minus one. This is the one thing in cryptography I remember how to do.

But they were like, no, we need quadratic residues. It’s very, like, weird.
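[Editor’s note: here is a minimal sketch of the check David is describing, assuming the client has received a prime p and generator g from the server. The helper names are ours, not Telegram’s; this is an illustration of the safe-prime and generator checks, not Telegram’s actual client code.]

```python
# Illustrative only: validating server-supplied finite field Diffie-Hellman
# parameters along the lines Telegram's docs describe. Not Telegram's code.
import random

def is_probable_prime(n: int, rounds: int = 40) -> bool:
    """Miller-Rabin probabilistic primality test."""
    if n < 2:
        return False
    for p in (2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37):
        if n % p == 0:
            return n == p
    # Write n - 1 as d * 2^r with d odd.
    d, r = n - 1, 0
    while d % 2 == 0:
        d //= 2
        r += 1
    for _ in range(rounds):
        a = random.randrange(2, n - 1)
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(r - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False
    return True

def validate_dh_params(p: int, g: int) -> bool:
    """Safe-prime check plus the simple generator check David mentions."""
    q = (p - 1) // 2
    # p must be a safe prime: both p and (p - 1) / 2 prime.
    if not (is_probable_prime(p) and is_probable_prime(q)):
        return False
    if not (2 <= g <= p - 2):
        return False
    # For a safe prime, g generates the whole group exactly when
    # g^((p - 1) / 2) == p - 1 (mod p), i.e. g is a quadratic non-residue.
    return pow(g, q, p) == p - 1
```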

Thomas: This is like the message crypto nerd out that I was hoping for, right? But just to bring people in who are not message crypto nerds, right? If you were using curves, like, if you read, like— and we’ll get a little bit more into what Filippo wrote about the DH exchange there in the past, too— but, like, if you read Filippo’s thing, it’s like, you know, the problem with finite field Diffie Hellman is that it’s weaker for every bit of key than with elliptic curve, which sends smaller messages. It’s better studied, blah, blah, blah, blah, blah. Right? But like, the other thing about using elliptic curve is, when you build a curve system, you always fix a curve; you’re doing it over, like, P-256, or most likely curve25519. But, like, in sane systems there isn’t a negotiation about what curve you’re using. In finite field Diffie Hellman, for reasons I don’t fully have my head around, there could be the curve25519 of finite field Diffie Hellman. You could just say this is the best group to use, right? But there isn’t.

So all these systems negotiate them.

Matt: So in TLS they still support finite field Diffie Hellman. There is a set of recommended groups; I think DKG over at the ACLU actually came up with them. There’s like three you should use. And obviously Telegram does not use any of that stuff. But they could have picked something from TLS, and those are like 3072-bit, so they’re bigger than the Telegram parameters. But yes, they did the absolute worst, most error prone thing you could do in Diffie Hellman.

Deirdre: For finite field Diffie Hellman. Yeah.

Thomas: Now we say all this, but they have never been broken that way. Although you’re saying that there’s like a plausible, like, okay, there are parameters that are weak to particular, like, number field sieve attacks and things like that?

Matt: Yeah. And think about this, right? Like, so the way it works is I log in as me, you log in as you, and then one of us starts communication. The first thing we do with our logged in account is we say, hey server, give me parameters. And so imagine like, you know, some academics get together and they say that we want to look at these parameters. Are they any good? There’s no reason to believe their parameters are going to be bad, right? They’re logged in as random accounts. Nobody’s going to be targeting them. But maybe you’re a Ukrainian general and somebody knows your account is something important. Maybe you’re not going to get the same parameters.

So it’s a very, very hard thing to test for. Even if we had good algorithms for testing for that.

Thomas: This is awesome, because they are, I haven’t thought about this, but they are recapitulating the problem with web-based JavaScript cryptography, right? Which is like they could say, here are the parameters and we’re gonna use the good, like, the TLS-based parameters or whatever, right? Like the good finite field group definition or all that, right? But every time you do a chat, the server tells you what your parameters are gonna be, and you are trusting that they are giving you the same parameters as every other DM that you set up. Okay, that’s awesome.

Matt: There is no reason whatsoever to have a server handing you Diffie Hellman parameters like this. There’s just none.

Thomas: Right. It should just be in the protocol. Right. They should just say, we’re going to use these parameters.

Deirdre: Or at least do it like, say, the TLS setting, where there is a set of three, and we will figure out which one of those three sets of parameters, predetermined offline elsewhere, we are going to agree to use for this session.

David: Yeah, I don’t want to use the parameters from the man, man!

Deirdre: Yeah, well this is the corollary of that.

Thomas: Yeah, there are NIST parameters. Their problem is there aren’t brainpool DH parameters for them to use.

Deirdre: Are there not? That’s a good question.

Thomas: I killed the whole conversation dead with that weird reference, moving right along.
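[Editor’s note: here is a sketch of the alternative the hosts are describing, where the client ships with a short allow-list of vetted groups, for example the RFC 7919 ffdhe groups, and refuses anything the server invents. The values below are placeholders, not real DH groups.]

```python
# Illustrative only: pinning a fixed allow-list of DH groups instead of
# accepting whatever parameters the server hands out.
PINNED_GROUPS = {
    # In a real client these would be the published RFC 7919 primes, copied verbatim.
    "toy-group-for-illustration": (0xFFFFFFFB, 2),  # placeholder, NOT a secure group
}

def accept_server_params(p: int, g: int) -> bool:
    """Only accept parameters that exactly match a pinned, pre-vetted group."""
    return (p, g) in PINNED_GROUPS.values()
```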

Deirdre: And so, not in the current version of Telegram, MTProto 2.0, but in MTProto 1.0: okay, say you got these parameters from the server. You would do your Diffie Hellman, your finite field Diffie Hellman, over the prime field whose parameters were given to you by the server. You get the share from the other side and raise it to the secret value that you have on your side. But then they also take that shared secret value and xor a nonce into it. This was in the old version; it’s not live anymore, but this used to be how it worked.

Thomas: [dripping with mock condescension] Deirdre, Deirdre, Deirdre. What you don’t understand, what you don’t understand, and what they do understand is that some of the systems that are running the Telegram client, have bad random numbers. You can’t simply trust to run a DH. You have to mix in some server random information so you can be guaranteed that they’re using real random numbers.

Deirdre: [sarcastically] Uh huh, sure.

Thomas: Right. Matt?

Matt: Yes, Pavel Durov has much better random numbers than me, and I would rather have him choosing my randomness. The serious answer is that bug is a few years old, so I hate to pick on it, but it was a crazy bad bug because, you know, as you were about to say, what it does is it lets the Telegram server basically fix things up on both sides if they do a man in the middle attack. If they do a man in the middle attack, what’s going to happen normally is I’m going to end up with one key shared with the man in the middle, which would be Telegram, and then my counterparty is going to end up with another key shared with Telegram, which will be different. And this way they can fix it up.

So we’re seeing the same key, and the key fingerprints will match. So it was a really scary looking bug, because the implications were everything you would expect from a backdoor. Now, allegedly, ostensibly, it’s been fixed for a few years, but it’s the kind of thing where you just don’t really feel comfortable after seeing something like that in a protocol.
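[Editor’s note: here is a toy illustration of why the old XOR-a-server-nonce-into-the-key construction was so dangerous, as discussed above. Random byte strings stand in for real Diffie Hellman outputs; this is not Telegram’s code, just the arithmetic of the fix-up Matt describes.]

```python
# Illustrative only: if the final key is (DH output XOR server-chosen nonce),
# a man-in-the-middle server can force both sides onto the same key.
import os

def derive_key(dh_shared: bytes, server_nonce: bytes) -> bytes:
    """Old-style derivation: final key = DH shared secret XOR server nonce."""
    return bytes(a ^ b for a, b in zip(dh_shared, server_nonce))

# The MITM server holds two different DH secrets, one per victim.
dh_with_alice = os.urandom(32)
dh_with_bob = os.urandom(32)

# Because it also picks the nonces, it can steer both sides to one target key,
# so their key fingerprints match and the interception is invisible.
target_key = os.urandom(32)
nonce_for_alice = bytes(a ^ b for a, b in zip(dh_with_alice, target_key))
nonce_for_bob = bytes(a ^ b for a, b in zip(dh_with_bob, target_key))

assert derive_key(dh_with_alice, nonce_for_alice) == target_key
assert derive_key(dh_with_bob, nonce_for_bob) == target_key
```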

Thomas: But they’re also not, like, signing the key exchange right now, or doing a triple Diffie Hellman, authenticating against an identity key or whatever; it’s still just a plain old DH. We’ll get into my opinions about what may have happened building this whole crypto system later on. But, like, it’s still just an ‘I read this bit of Schneier and immediately put it into practice’ kind of situation.

David: Yeah. Even that backdoor is like, if you didn’t think about how to build it, you’d think you were building it more securely. It’s, like, a very Hanlon’s razor backdoor.

Matt: Okay. But I do want to add one thing, which is that this particular group of people, and I think literally this particular group of people, had multiple conversations in public about this, several of them, and I’m going by memory here, mostly involving Pavel Durov himself being involved in this conversation, or at least jumping into different Twitter threads, and some of his other folks, too. So this was not something that, like, they did not know about. They were heavily criticized by people on Twitter, you know, cryptographers on Twitter, by academics, over, let’s say by now, maybe nine years. You know, they have never upgraded this protocol to be better. And like, that starts to feel a little ugly to me at this point.

David: They did, I mean, they did rev from 1.0 to 2.0, and that took this specific backdoor out and probably made some other changes.

Matt: But in terms of, like, the Diffie Hellman parameters, and all the things that make me uncomfortable, they’re all still there.

Thomas: We’re bouncing back and forth, which I think is a good thing, but, like, we’re bouncing back and forth topically here. But I do kind of want to call out a thing that, it’s a weird thing to say about a system, so I’m not sure it’s intuitive to everybody. But when we’re talking about this MTProto system, right, where, like, we set up a DH shared key and then we run a transport protocol over that, right? They do that for DM’s. But that is also, I think, their, like, baseline protocol, right? I think it’s their normal, like, when you do client server. I think I’m right about this.

They don’t even run TLS. Right? They run TCP. They run TCP on port 443, or an HTTP transport on port 80, and they run their weird MTProto thing instead of TLS.

Matt: Well, you know what? Like, let’s not, let’s be a little fair, because, like, Signal and other such messengers use Noise, right? Noise, not TLS. And so we can’t be totally mean about this, right?

Thomas: And actually, that’s not where I’m going. Right? Like it’s not that, like, that’s a smoking gun kind of deal. It’s just, like, I think it’s kind of important to understand that there’s, like, a property of their whole system. Like, everywhere they do communication, they have these issues, right? But like, a thing that you will hear Telegram advocates say, like, again, like the most important thing to understand about Telegram is that none of this is happening by default, right? Like you have to ask for secure messaging in a one to one conversation. They don’t do it with groups, right? But they will also say they run this MTProto system with identity keys and a secure transport and blah, blah, blah, um, for all communications between the client and the server.

Ergo, their group messages are encrypted. Their group messages are encrypted because they run MTProto between the client and the server. Like, they’ll get real mad if you say that the server has plaintext, because they don’t have plaintext. They have whatever the product of this MTProto is. Like, they talk about client server encryption as opposed to end to end encryption.

Matt: But they have plain text. I mean, the whole point is their server can decrypt the messages as it has to put them into a database. And so they have plain text. I understand what they’re trying to say, but it’s certainly not true in that sense.

Deirdre: Yeah, this is the same thing that when Zoom blew up, when COVID hit and everyone started working from home, on their FAQ or whatever, they’re like, is this end to end encrypted? And they said, yes, we have encryption from your client to other clients in the Zoom. And then people had to be like, no, no, no, no. The thing that you’re saying on your website and what people are asking about is not true. And they hired a whole bunch of people to actually make deployable end to end encrypted Zoom because they have a TLS connection or a webRTC DTLS connection from the client to a server. And the server has to be in the loop to do all sorts of things like transcription and cleaning up noise and all this sort of stuff. That was a big feature and they had to like go do all this work. Zoom got burned by the thing that Telegram is advertising. And they, like— to what Matt was saying, they’ve been notified for ages that this is not, this is not true, this is not accurate, this is misleading, and they don’t care, or something.

Matt: I also just want to say, before I forget, something Deirdre said just a few minutes ago, which is it’s not open source. I kind of agree with you. I have looked multiple times over the last few years at Telegram’s clients and found that the things that were officially published were way out of date, so people could not claim that they were actually the thing that was built into the client. However, I have not looked in the last few months and it’s sort of the thing I want to do. But I do know that Pavel Durov specifically made a big post about how they now have reproducible builds for iOS, which Signal does not have. And yet I’ve also heard a lot of people saying that’s not really true. So unfortunately I’m in a position where I just, you know, it’s such a confusing mess that I don’t know the truth, and I really wish I did.

But in the past it has not been true that you could build it from the code reliably.

Deirdre: Okay, stepping back like a hot second, we just jumped right into Diffie— finite field Diffie Hellman and what was in 1.0 and what’s no longer there. Matt, can you give us a quick walkthrough of what we are supposed to get in MTProto 2.0 in these private chats from Telegram clients, whether or not we can confirm or deny that they’re actually getting shipped?

Matt: Okay, so if everything works the way it’s supposed to work. So first of all, you make a connection to somebody else. It has to be one to one, not a group chat. You make a one to one chat with somebody, you can talk to them. At this point, everything is encrypted to the server, but it is not end to end encrypted. Now you press a button; I think you go to the user’s profile page, this is four clicks on my iPhone. And then from there, there’s kind of nothing there that says encrypt. There is a little ‘more’ menu.

Matt: You click that, you click that again, there’s a ‘start encrypted chat’. And then if your partner is online then you get to start an encrypted chat. But if they’re offline, it basically just says sorry, waiting for them to come online.

Deirdre: God, I hate that so much.

David: That’s crazy. It’s giving flashbacks to, like, AOL and, like, middle school.

Deirdre: Yeah, and like Pidgin or something. Or maybe like really shitty OTR Pidgin.

Matt: Yeah. This is not, like, wild technology; this has all been handled in modern messengers for years. Okay, what you’re supposed to get, though, if you get through this process and you start a secret chat, you are supposed to get a Diffie Hellman established key with your remote counterparty that the server doesn’t know. And, you know, you have to check a key fingerprint, maybe read it over a phone, to be absolutely sure the server is not tampering with that. But if you do that, then you should have, hopefully, end to end encrypted messages that just appear as junk to Telegram servers. It’s just you and the person you’re talking to who can decrypt them.

Deirdre: And anyone who happens to be listening or whatever, who happens to be sitting on that channel or whatever…

Matt: Well, even they shouldn’t be able to see what you’re talking about. But obviously if they get on your phone then you’re in trouble.
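[Editor’s note: here is a small sketch of the out-of-band fingerprint check Matt describes: both endpoints hash the established key material and read the result to each other. The encoding below is illustrative, not Telegram’s actual fingerprint format.]

```python
# Illustrative only: deriving a short, human-comparable fingerprint from a shared key.
import hashlib

def key_fingerprint(shared_key: bytes) -> str:
    digest = hashlib.sha256(shared_key).digest()
    # Render the first 8 bytes as digit groups that are easy to read over a call.
    return " ".join(f"{b:03d}" for b in digest[:8])

# Both sides compute this locally and compare verbally; if a server substituted
# keys in a man-in-the-middle attack, the two fingerprints should not match.
```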

Deirdre: Yeah, yeah. I mean, and that is also, like, the case for basically all of these clients. Like, you have to have a little bit of trust. And to be fair about the open source stuff, WhatsApp is not open source, iMessage is not open source. There’s a lot of stuff that’s not fully open source, but you’re able to have a lot more trust in what they’re saying than the Telegram people. Signal is open source.

David: Let’s back up to the protocol again a little bit. So I feel like we all open the protocol docs on Telegram’s site and are like, good God, what is going on? And I feel like most people don’t have enough context to know why we would have that reaction. Could you maybe explain what Telegram is doing, as compared to what you might expect a sort of more normal end to end secure messaging application to do?

Matt: Okay, so a normal end to end secure messaging application would do something kind of similar to what Telegram is doing, right? It would publish some kind of key exchange, public key, maybe it’s an identity public key or something, but it would publish some public key. It would upload that to the server, whether it’s Signal or WhatsApp or whoever else is operating a system. It would also do a thing which is really nice when you want to make a connection to somebody offline, is it would also send the first move of that key exchange protocol maybe 100 times, and it would ship that up to the server as well. And that’s really nice, because when somebody else wants to send me a message and my phone is off, those first moves are ready. They can just go ahead and complete the protocol and establish a key with me. This is a little bit in the weeds, but it’s really, really important. This is a very straightforward thing. And so that’s what another messenger would do.

Then the messenger would use proper Diffie Hellman, preferably elliptic curve Diffie Hellman with normal parameters, would also check that all the messages sent by the counterparty are signed by what appears to be their public key, which is called an identity key, would do a whole bunch of checks, and then would only then actually start a communication using a normal encryption, symmetric encryption protocol, authenticated encryption, something that people trust. And every place where I’m saying these things, it sounds like it’s, you know, kind of boring, but this is every single place where I’m saying and then would do this. That is something that Telegram does in a weird, non standard way. So, like, it’s not all a given. Um, the only thing I would say is, like, with all of these protocols, there is a fear that the server could be tampering with the messages and, like, swapping out, uh, let’s say Deirdre’s public key, the key that she’s trying to send me for somebody else’s public key, like the FBI’s public key. And this can lead to man in the middle attacks on all of these different services. So it’s not a totally unique thing. This key fingerprint checking, they call it safety numbers in Signal, is supposed to help you check for that.

However, the way Telegram does everything with all these parameters and extra stuff means that the reason we’re so nervous about that is the safety numbers could maybe not work if they found some way to bypass them with these parameters. That’s one of the reasons we’re so nervous about it.
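[Editor’s note: here is a heavily simplified sketch of the ‘upload prekeys so people can reach you while you’re offline’ pattern Matt describes above, using the pyca/cryptography X25519 primitives. A real design like X3DH also signs prekeys with an identity key, combines several DH outputs, and feeds the result into a ratchet; all of that is omitted here.]

```python
# Illustrative only: asynchronous key establishment via one-time prekeys.
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey

# Bob, while online, generates a batch of one-time prekeys and uploads only the
# public halves to the server; the private halves never leave his device.
bob_prekeys = [X25519PrivateKey.generate() for _ in range(100)]
uploaded_prekeys = [k.public_key() for k in bob_prekeys]

# Later, Alice wants to message Bob while he is offline. She fetches one prekey
# from the server, does her half of the exchange, and already has a shared secret.
alice_ephemeral = X25519PrivateKey.generate()
bob_prekey_public = uploaded_prekeys[0]   # handed out (and then deleted) by the server
alice_secret = alice_ephemeral.exchange(bob_prekey_public)

# When Bob comes back online, he receives Alice's ephemeral public key and
# completes the same computation with the matching prekey private half.
bob_secret = bob_prekeys[0].exchange(alice_ephemeral.public_key())
assert alice_secret == bob_secret
```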

Deirdre: And to mitigate the whole checking-fingerprints-out-of-band thing, WhatsApp and I think maybe Facebook Messenger have basically deployed key transparency. It’s a lot of infrastructure, and you have to put trust in WhatsApp themselves. But if you already trust WhatsApp or Meta, they look at keys and they notify you: if I, Deirdre, am starting a new session with Matt, and Matt is showing me an identity key that seems very different than what Meta is used to seeing from Matt’s devices, it will alert me, rather than me having to proactively check my fingerprint and Matt’s fingerprint, where if there’s any time that I forget to check them and miss that it changed: oopsie daisy, there’s an FBI in the middle or whatever. This is a lot of infra to deploy, and, you know, for example, Signal does not have this yet, but it’s no surprise that Telegram does not have anything like this; it’s even worse.
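[Editor’s note: the client-side piece of what Deirdre describes is essentially ‘remember the identity key you last saw for a contact and complain when it changes’; real key transparency adds auditable, append-only logs on top of that. A minimal sketch of just the client-side check, with names that are ours, follows.]

```python
# Illustrative only: alerting on identity key changes instead of relying on users
# to re-check fingerprints by hand.
known_keys: dict[str, bytes] = {}   # contact id -> last identity public key seen

def check_identity_key(contact: str, presented_key: bytes) -> str:
    previous = known_keys.get(contact)
    if previous is None:
        known_keys[contact] = presented_key
        return "first contact: key recorded"
    if previous != presented_key:
        return "WARNING: identity key changed; verify out of band before trusting it"
    return "key unchanged"
```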

Matt: I think that’s kind of the story here. Right? So, like, there are some old protocols for messaging, like off the record messaging, which is called OTR. I don’t know when that came out. In the 2000s, sometime maybe before 2010, I don’t know. And it looks a lot like Telegram’s protocol today. And what we’ve seen, like, out in the real world, is that Signal came along and took OTR and then made it better and better. And then, like, WhatsApp kind of adopted Signal. And so messaging over the last 15, 16 years has been improving a bit at a time.

And you mentioned this key transparency thing. That’s just like the most recent set of improvements, but things like being able to send messages offline and using proper protocols, these are much older. Telegram has done none of that. They’re basically at the level of like the 2000 something OTR. Yeah, and they stopped.

Thomas: The OTR people are going to jump at you right now. Right. Because, like, so OTR is, like, I think of it as being roughly 2005 vintage, which is like right around the time where we were kind of solidifying notions of, like, authenticated encryption and stuff like that. Right. But like, I think they would say it’s, it’s like SIGMA. I think they’re, like, an implementation of the SIGMA key exchange, if I’m remembering that right.

So I don’t think about key exchanges that much anymore, but there’s a theoretical basis for it. And it’s not like what you would do now, because now modern messaging encryption and modern messaging key exchanges are all like the gold standard is Signal protocol. Everything’s kind of in some way intellectually derived or informed by Signal, right. But that didn’t exist back then. So they would say, like, we were based on the literature of the time is what the OTR people would say, right. And, like, Telegram can’t say that, right. Telegram is doing something much weirder than that.

Matt: And Telegram has a lot more money. Right? Like, Telegram isn’t some open source project. They have apparently hundreds of millions of users. And I don’t know how they make money, but they also started a cryptocurrency and made billions of dollars on that, I think, at some point. So they could do things. They just haven’t.

Thomas: But my other thing here, and, like, maybe the vibe shift I’m sensing here is not one that you all sense, and I will let you talk about it after I insert the vibe into the conversation. But, like, ten years ago, you could not have this conversation without it being kind of an implicit defense or advocacy of Signal, where, like, Signal was the only thing in the world that was doing this, right. That was like, doing a serious implementation of message crypto with a thought out, like, you know, theoretical design model for what they were doing. Right. And I think that, like, I think Telegram, in public discussions about security and message security capitalizes on the Signal versus Telegram thing of this whole conversation. Like, well, you’re all just shills for Moxie Marlinspike, right? Where it’s like, I said a thing about this earlier on Twitter, if you bring up Moxie Marlinspike’s name in one of these conversations, which I just did, it’s kind of a red flag for the whole conversation at this point. We have multiple good theoretical models for how to do messaging encryption. MLS is a strongly theoretical whatever for this whole thing.

We don’t even have to make reference to Signal at this point, I think, to, you know, point out how wacky Telegram is.

Matt: Even Apple iMessage now has a lot of features. It does ratcheting, it does post quantum, it does all kinds of crazy stuff. And like, yeah, everyone who’s doing serious crypto and encryption is doing it much differently.

David: Webex has ratcheting. Like, if your messaging system is behind Webex, we’ll call this, that’s my new law for secure messengers—

Thomas: Our listeners are not seeing David’s gesticulations, as he says, “behind Webex.”

David: Your security should be at least as good as Cisco Webex, which has good people working on it, credit to them.

Deirdre: And they are actively involved with MLS development and moving that sort of stuff forward. Yeah. And I just put the original OTR paper in the show notes. It doesn’t go into some of the specific details that we are, uh, cringing at in MTProto 2.0.

One of the fun things, like, to go really deep, is that their KDF is SHA-256. We’ve learned a lot about KDFs.

Thomas: I’m hoping against hope that Matt is conversant a little bit with the details of how this stuff works. I’m faking it right now because I just read the paper before this, but I’m hoping to get that walkthrough. I have to say, going into this, that it’s not a broken protocol. There are some attacks, and they’re not cosmetic attacks, but they’re not, you know, attacks in the sense that Hacker News would think of an attack on the system. Right. But it’s a fun transport protocol. Right. Like, once you’ve actually gotten through the Diffie Hellman part of this and established keys, everything else they do is also pretty wacky. How familiar are you with this, Matt?

Matt: Oh, in terms of the KDF stuff, not so much. I know that they use some bad KDFs and they do, like, MACs using SHA-2 as well. And I know that there is theoretically some attack on it, but I actually don’t know the details.
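[Editor’s note: here is a quick sketch of the difference being gestured at, contrasting ‘the KDF is just SHA-256’ with a purpose-built KDF such as HKDF (RFC 5869), via the pyca/cryptography implementation. The salt and label are illustrative, not Telegram’s.]

```python
# Illustrative only: bare hash as a "KDF" versus HKDF extract-and-expand.
import hashlib
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

shared_secret = b"\x01" * 32  # stand-in for a Diffie-Hellman output

# The bare-hash approach: no salt, no domain separation, one fixed output size.
naive_key = hashlib.sha256(shared_secret).digest()

# HKDF: salted extraction plus labeled expansion to any length you need.
hkdf = HKDF(
    algorithm=hashes.SHA256(),
    length=64,
    salt=b"per-session-salt",
    info=b"example message keys",
)
message_keys = hkdf.derive(shared_secret)
```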

Thomas: Yeah. So first of all, they’re using IGE mode, right. Any crypto nerd conversation about Telegram and MTProto has to come down to IGE mode, which is. It’s whatever. It’s fine. It’s whatever. Right?

Deirdre: Is it???

Thomas: It’s not fine

Matt: It was not fine. It was, it was kind of amusing when they did it the first time in version one of this. And then they were like heavily criticized. People wrote papers saying it wasn’t CCA secure and it wasn’t really a big deal. Like it wasn’t broken-broken, but it was just theoretically kind of broken. And then they went ahead and upgraded it to MTProto 2.0, and they kept all this stuff. So it’s not like they haven’t made changes and it’s not like they haven’t been told that there’s a problem. They just haven’t done anything.

Thomas: I thought IGE was super, super wacky. Like, until this moment, I thought IGE was super, super wacky, because it’s yet another cipher mode in a world where you have, like, CBC and CTR and CFB and, you know, OFB, whatever, and I don’t know what the fuck those are, right. But, like, you know, you really should only need to have two of them in your head, right? And they found another one, which I thought was crazy back then, but I never really looked into it. I, like, stopped there. But IGE is just CBC with an extra chain, right? The thing you’re trying to do with IGE is, um, you’ve got your CBC ciphertext that you’re building up, right? And you’re chaining the previous block. And the thing you’re trying to get is, when you have a bit flip in one of the blocks, instead of the CBC behavior, where it totally fucks the first block and then gives you a targeted edit of the next block, what they want is it totally fucks every subsequent block, which is kind of like a reasonable property to want.

Matt: Except for the reason they’re doing it. So. So first of all, there’s this whole literature on, like, self synchronizing ciphers that have this property that they don’t get screwed up forever when there’s an error. So they’re building the opposite. They’re building a self, like, non synchronizing cipher, which never comes back, right. If there’s a single bit error, then, like, everything is screwed forever. And you’d ask, like, why do you want this? And someone who has never done crypto before would say, well, this is great, because, like, if somebody tampers with my message, it’ll be really, really obvious. But the way we detect tampering in messages is we use MACs, or we use an authenticated encryption mode to actually properly detect tampering.

We don’t have to deal with weird modes that scramble up the message. And they, meaning the cryptographers at Telegram, seem to have not known about MACs as being a thing. So what they did is they invented this weird new mode where, like, a single error scrambles the message forever, and then they have a check at the end to see if the message is still scrambled. And so they kind of reinvented authenticated encryption from scratch. And it’s very boring. Everything I’m saying is very boring.

Except it’s also like finding out, you know, your car has wood inside the engine instead of metal, right? Like, it’s probably fine. It runs okay, but, like, why is there wood inside my car? It doesn’t make any sense. It’s non standard.
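[Editor’s note: for readers who want to see the property being described, here is a small IGE implementation built from raw AES blocks with pyca/cryptography. The IV layout is illustrative; the point is the last line, where one flipped ciphertext bit garbles every plaintext block from there on. This is not something to use in place of real authenticated encryption.]

```python
# Illustrative only: IGE mode and its forward error propagation.
import os
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

BLOCK = 16

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def _aes_block(key: bytes, data: bytes, encrypt: bool) -> bytes:
    cipher = Cipher(algorithms.AES(key), modes.ECB())
    ctx = cipher.encryptor() if encrypt else cipher.decryptor()
    return ctx.update(data) + ctx.finalize()

def ige_encrypt(key: bytes, iv: bytes, plaintext: bytes) -> bytes:
    # IGE chains both the previous ciphertext and the previous plaintext block:
    #   c_i = E_K(m_i XOR c_{i-1}) XOR m_{i-1}
    c_prev, m_prev = iv[:BLOCK], iv[BLOCK:]
    out = b""
    for i in range(0, len(plaintext), BLOCK):
        m = plaintext[i:i + BLOCK]
        c = xor(_aes_block(key, xor(m, c_prev), True), m_prev)
        out += c
        c_prev, m_prev = c, m
    return out

def ige_decrypt(key: bytes, iv: bytes, ciphertext: bytes) -> bytes:
    #   m_i = D_K(c_i XOR m_{i-1}) XOR c_{i-1}
    c_prev, m_prev = iv[:BLOCK], iv[BLOCK:]
    out = b""
    for i in range(0, len(ciphertext), BLOCK):
        c = ciphertext[i:i + BLOCK]
        m = xor(_aes_block(key, xor(c, m_prev), False), c_prev)
        out += m
        c_prev, m_prev = c, m
    return out

key, iv = os.urandom(32), os.urandom(32)
ct = ige_encrypt(key, iv, b"A" * 64)           # four identical plaintext blocks
tampered = bytearray(ct)
tampered[0] ^= 1                               # flip a single bit in the first block
print(ige_decrypt(key, iv, bytes(tampered)))   # every block decrypts to junk
```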

Deirdre: It’s also, like, if an adversary gets to corrupt a block, that corruption, like, propagates through all the subsequent blocks. And, like, that gives me the willies. Like, I feel like that’s just the opposite of what I want in my cipher mode, my authenticated encryption mode.

Thomas: Yeah.

Matt: The only thing I wanted to say is they could have just gotten rid of IGE and, like, swapped it for something modern. They had plenty of years of knowledge. They made the decision to, like, upgrade.

Thomas: They’re doing something weird with the algorithms, too. So friend of the show Martin Albrecht and friend of the show Kenny Paterson wrote a paper where, I don’t know if this is just funny to me because I’m ignorant of theoretical computer science and cryptography or if it genuinely is funny, but I’m reading a paper where they’re trying to do a formal model for the Telegram protocol. And, like, the formal model for the Telegram protocol has a section in there where it’s like, we need new novel assumptions for SHACAL-1 or SHACAL-2 or whatever, because, like, whatever we would normally do to prove a protocol, we need to actually assert new things about SHA-2 in order to actually talk about this protocol, which I find hysterical. But I might just be wrong. This might just be me being ignorant. Am I being ignorant or is that really funny?

Matt: It’s pretty funny because, like, usually you should be able to get away with collision resistance, or you should be able to get away with some preimage. However, I will say, generally speaking, I’m not picky about this. If you use the random oracle model to analyze SHA-2, I’m mostly okay with it. It’s not usually that bad. So I’m not sure if this is a question of cryptographers being very picky for Eurocrypt or if this is actually Telegram being really weird. I can’t tell you right now.

Thomas: So they do a weird thing with the way that they actually encrypt messages, where, like, they’re doing IGE mode. And, like, first of all, I want to point out that friend of the show, Ben Laurie, is responsible for all of this—

Deirdre: What? What is all of this?

Thomas: Everything that happened with Telegram.

Deirdre: Okay.

Thomas: Telegram would not be using, Telegram— Ben, I’m talking to you right now. Telegram would not be using IGE mode, had you not implemented it, for reasons passing understanding, in OpenSSL, like, ten years ago. For whatever reason, Ben actually implemented IGE, and Telegram uses the OpenSSL, BoringSSL implementation of IGE. Right. So there’s a weird thing going on here where, like, they don’t have a standard MAC. They have sort of like an MDC type model. The paper says it’s like an encrypt and MAC model where it’s like, if you’re reading a paper with a proof of a transport protocol and it says encrypt and MAC, and then the thing in between is not ‘then’, then something weird is happening, right? Something weird is happening here. But, like, part of the idea, I think, behind IGE is like, if you can prove that you’re chaining, like, ciphertext block corruption all the way to the end of the message, then you only have this, like a CBC MAC thing happening here, right? Like, you only have to look at the last block of the message to authenticate it. Like, I think there’s like some intuition here for why they would have done this.

Matt: Yeah, that is definitely what they’re doing. They’re like, a single bit corrupts the encryption and, like, carries forward this bad thing, and then we can just check. And it makes a lot of sense if you’re in 2006. In fact, I was just googling quickly, looking for Ben Laurie and IGE, and I’m finding, you know, mailing lists from, like, 2006 or so, people complaining about IGE being broken in OpenSSL. But it doesn’t make any sense as of 2015, or whenever it was that Telegram came out. This is years later.
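[Editor’s note: for contrast, here is a sketch of the conventional pattern the hosts keep alluding to: encrypt, then compute a MAC over the ciphertext, and verify the MAC before decrypting, instead of relying on a cipher mode to smear corruption through to the last block. AES-CTR plus HMAC-SHA-256 here is illustrative, via pyca/cryptography; a real system would more likely reach for a ready-made AEAD mode.]

```python
# Illustrative only: encrypt-then-MAC with AES-CTR and HMAC-SHA-256.
import hashlib
import hmac
import os
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

def seal(enc_key: bytes, mac_key: bytes, plaintext: bytes) -> bytes:
    nonce = os.urandom(16)
    enc = Cipher(algorithms.AES(enc_key), modes.CTR(nonce)).encryptor()
    ciphertext = enc.update(plaintext) + enc.finalize()
    tag = hmac.new(mac_key, nonce + ciphertext, hashlib.sha256).digest()
    return nonce + ciphertext + tag

def open_sealed(enc_key: bytes, mac_key: bytes, blob: bytes) -> bytes:
    nonce, ciphertext, tag = blob[:16], blob[16:-32], blob[-32:]
    expected = hmac.new(mac_key, nonce + ciphertext, hashlib.sha256).digest()
    # Check integrity before touching the plaintext, in constant time.
    if not hmac.compare_digest(tag, expected):
        raise ValueError("message was tampered with")
    dec = Cipher(algorithms.AES(enc_key), modes.CTR(nonce)).decryptor()
    return dec.update(ciphertext) + dec.finalize()
```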

Thomas: Yeah. I have a sort of thing where, like, I explained this whole system to myself by, like, if you literally left it up to Hacker News to design a transport protocol, right? And, like, you have that weird mix of people and what they, what they bring to the conversation, what they’ve actually read about. It’s like, if you’re, like, doing it from first principles, like, you could wind up in a weird place here where, like, basically you’re rederiving all of the last 20 years of cryptography without the benefit of anybody’s papers. It’s like if you tie your arm behind your back where, like, I’m going to build a new crypto system based only on the cryptography knowledge that we had in 2003, not 2004, very specifically 2003. And I’m not allowed to google anything, I might wind up with this system.

Matt: I just want to stress that I can accept that for draft one and maybe for, like, version two, but come on. We had many years of conversations and people writing papers and making fun of them. The thing that I think you and I both saw, and maybe nobody else listening remembers this, is Pavel Durov came back at us and said, you know, the truth is that my guys in Russia who invented this are just smarter than you. Like, the reason it’s so bad is that my brilliant Russian mathematicians are, you know, smarter than you. You Americans are just using this broken NSA cryptography, and we are using stuff that is not broken. Like, it was pretty nuts. And when I say nuts, I want to drill down into the nuttiness.

This is using AES and SHA. This is using algorithms that were standardized by the US government. So it’s not like it’s avoiding this kind of US government cryptography. It’s just using it in ridiculous ways. But, like, Pavel was very, very militant about the fact that, no, if we used it correctly, that would, you know, be the wrong thing, and these guys were smarter. He did not just say, like, oops, we made a mistake. He pushed back and really stuck to his guns. And I admire him for sticking to his guns, but his guns were very bad.

Deirdre: They’re very bad guns. I was going to give sort of an argument of, ah, they have such a lean team. They have approximately 30 people running Telegram for almost 900 million users worldwide or whatever. And if you want to be so lean, there’s only so much advanced development work you can do to completely upgrade your end to end encrypted, blah, blah, blah. But, yeah, it doesn’t seem like they’re just trying to keep it really lean. It seems like they just willfully don’t want to contend with the fact that what they’ve built is actually kind of a pile of crap and actually not doing their users a service at all. And it’s just kind of a fig leaf of encryption at this point.

Matt: I don’t think it’s even a fig leaf. I think it’s, I don’t know. I am trying to say this in a nice way. I think it is a deliberate effort to send the message that anytime you use a Telegram chat, it is encrypted, when it is very deliberately designed so that it will not be encrypted. That is my feeling. Now, that could be an accident, but I definitely feel like the marketing has been done wrong. Let’s just leave it at that.

Deirdre: I mean, the marketing is very— and the marketing is basically what they write on their website, the fact that they don’t cooperate with any sort of endeavors to be like, hey, someone’s trying to organize a murder on your channel, will you kind of help us prevent a murder? And they’re like, no, we don’t cooperate, for anything at all. And Pavel Durov just kind of talking like this in public for, I don’t know, 15 years or whatever. One thing that really bothers me is that even if all these choices are not willfully designed to be friction between you and actually getting the best possible secure option out of Telegram, you could be forgiven for mistaking them for that, because it’s literally: you and your person have to be online to even start an end to end encrypted chat. You have to negotiate, like, maybe you negotiate getting online over a not end to end encrypted chat.

You have to go through, like, three or four or five menus to actually find this option in the first place. And then once you’ve actually done everything and, like, you’ve checked your fingerprint, your, your randomly generated fingerprints, at the point that you’re finally communicating, even then it’s shitty. And I don’t know, it’s just, they just don’t want to do anything. They don’t care. They think they’re right. They don’t want to make it better.

Matt: So it’s like those places that say you can cancel your subscription anytime, but you have to go there in person and, like, do it in writing and stuff. And, like, that’s how I feel about starting an encrypted chat. Like, just by making little choices in the way that the user experience works, you can make it vastly less likely that anyone’s going to do the thing that you’re trying to make hard.

Thomas: So I feel like there are, like, smoking gun design decisions they’ve made with their user experience where, like, I think everybody kind of uniformly agrees that, like, this is a system designed to make messages insecure. The not being on by default, the fact that it doesn’t work async, the fact that it doesn’t work for group messages, blah, blah, blah, right? Like, yeah, sure, I’m with you 100%. I’m gonna say something that’ll make you angry, though, right? Which is if you look at Matrix or if you look at Threema, two other encrypted messaging systems that have had public papers written about them, right? So, like, you know, Martin Albrecht and his team did the thing where they went in, and I don’t know if it’s Martin Albrecht, I’m sorry if you were not the lead on this, whatever, I don’t know how academia works, but whatever. That team went in and built a formal model for Telegram and then evaluated the system against it, and they came up with some things: you can reorder messages. It depends on which direction you’re going.

Client to server versus server to client. Because the server has control over timestamps or something in one of those directions, you can change the order in which messages will appear, because I think timestamps are not part of the authenticated ciphertext in the system. And then there’s a thing where, I’m not really sure I understand what’s going on here, but there’s a retry mechanism in MTProto where, like, if you don’t hear back from the node that you’re sending a message to, right, you’ll re-encrypt it and send it again. And there’s an oracle there: if you know what was sent originally, you can verify whether the same message was sent or a different message was sent.

Those are real problems, in the sense that if you find an encrypted transport where you can reorder messages, it’s fucked. Okay, fine, I get it. It’s fucked. It’s not a good transport. No one should use it. Right. But in terms of actual, usable…

What am I trying to say here? Like, actual vulnerabilities that people would care about. Right? Like, after Nebuchadnezzar, Matrix was a steaming crater. Right? And Telegram, again, modulo the fact that it doesn’t encrypt anything at all: Telegram, their cryptography is… I was about to say ‘holding up’. The words were about to come out of my mouth. And those are not the right words.
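[Editor’s note: the standard answer to the reordering issue described above is to bind ordering metadata, such as a per-direction sequence number, into the authenticated data of each message, so a record delivered out of order simply fails to decrypt. A small sketch with AES-GCM from pyca/cryptography; the framing is illustrative, not any particular messenger’s wire format.]

```python
# Illustrative only: authenticating a sequence number alongside each message.
import os
import struct
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)
aead = AESGCM(key)

def seal(seq: int, plaintext: bytes) -> bytes:
    nonce = os.urandom(12)
    aad = struct.pack(">Q", seq)   # the sequence number is authenticated, not secret
    return nonce + aead.encrypt(nonce, plaintext, aad)

def open_sealed(expected_seq: int, blob: bytes) -> bytes:
    nonce, ciphertext = blob[:12], blob[12:]
    # Raises InvalidTag if the message was reordered, replayed, or modified.
    return aead.decrypt(nonce, ciphertext, struct.pack(">Q", expected_seq))
```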

Matt: I want to, I want to push back on that. So first of all, let’s refocus on: nothing is really encrypted. Very little is really encrypted. Let’s start there. I spent most of my blog post on the fact that, you know, by the way, it’s really hard to find this menu, and very little of my blog post on, like, all this stuff about Diffie Hellman.

Deirdre: Sure.

Matt: Because I do hear what you’re saying and I agree with some of it. But let me just say this Telegram is now not a, like new system. Matrix, how old is Matrix? Like a couple, a few years?

Deirdre: It’s several.

Matt: It’s younger than.

Deirdre: It’s younger, yes.

Matt: It’s younger than Telegram. Telegram was really broken in its early versions. And like, that backdoor bug that we mentioned that Filippo was talking about, that was the most devastating kind of bug you could find. Right?

Thomas: Uh, uh, uh, uh. Uh, uh, uh, uh. This is what I do now to just stop people from talking; I’m the best. So hold on, hold on. Because there’s a tripwire in this conversation that sets me off. Right? Like, I agree. Like, morally, directionally agree with what you’re saying. Right? Like, I know where you’re going. I’m just doing a thing now also where I’m finishing a thought, where my wife would throw a shoe at me right now if I did this to her.

Thomas: But like, whatever. So however, Matrix had the vulnerability, and I’m bringing this up because it’s like a pet issue with me: Matrix had the vulnerability where the server controlled group membership. They do encrypt group messages, which is a big thing that we want from these systems. The number one biggest problem with Telegram is that groups aren’t encrypted. Matrix does encrypted groups; that’s the calling card for Matrix.

It’s Slack, but the group messages are encrypted. But because of really severe vulnerabilities in that system, it might as well not have been, because the server controls group membership and clients are not adversarial about who’s in the group. Right. Like, clients just kind of trust the server. Like, this is the whole system where, like, group members are group members, right? So I do want to call out: every messaging system that we look at, not Threema, by the way, but every messaging system that we look at tends to have this one vulnerability that I think people are not keyed in enough on, right? Which is, you do end to end encryption, you get everything right.

You do ratcheting, you have the right primitives, but somehow the layer where we figure out group membership happens one layer up from all this stuff about how we do the key exchange, how we do the messaging crypto. There’s this higher level of semantics about how we’re going to authorize which people get to join which groups. And I have a thing here where, like, to me, if you do encrypted group messaging, the first question you ask about that system should be: what determines who’s allowed to join a group? Because group membership is key distribution, right? So, like, it’s not really fair to compare Matrix and Telegram on this, in that Telegram has the worst possible case answer here, which is simply that nothing is encrypted at all. But there’s also a truth in advertising to it.

Matt: Well, hold on a second. The defense of Telegram is, we just didn’t bother to implement group encryption at all. Right? Like, it’s not like it’s even, like, not active. It’s just they don’t support end to end encryption for groups. So in the process of not supporting end to end encryption for groups, they magically get to avoid all the bugs that showed up in Matrix. Would those bugs have existed in Telegram if they had group end to end encryption? I can’t tell you. It’s the kind of thing where, you know, if I was asked in a court case, I would say, well, that’s hypothetical, but I bet it would be a nightmare and a mess. But going back to the things that they did have, right, where you trust the server for this one essential function of giving you some parameters in key exchange for pairwise communication: in that early version that Filippo comments on, they messed it up so badly that it essentially offered no security against a malicious server.

You can’t do a lot worse than that, considering all you’re doing is pairwise communication. And the point that I was trying to get at was that, yes, Telegram has held up well now, eight to ten years after being introduced. But I think what we’re looking at is the result of a lot of band-aids. They started with a bad protocol. People would point out things that were really broken, and then they would put a band-aid on the really broken thing. And now we see a thing that is just kind of theoretically, academically broken, and not broken in the sense of, oh my God, I can’t use this because all my messages are going to be toast. And we still don’t really know if it’s any good, because between the academic description and the protocol descriptions versus the implementation, there’s probably still a big discrepancy, and we don’t know what that looks like.

David: I would also argue that server-provided Diffie-Hellman parameters are kind of DOA. Like, it’s harder than it was in the previous version to do something, but that’s still very close to a backdoor.

Deirdre: Mm hmm. I mean, especially because it would be one thing if it were, I don’t know, some well-known, pre-suggested list — maybe there’s some sort of optimization, some performance thing the server is trying to do, like, here, I shall pick from the publicly known set, here are some parameters; I don’t know, maybe it’s doing client detection, something something. But it’s not. It’s generating some random fucking number on the backend and just handing it to you, and then hoping that the client will actually check that it’s the right number and that it’s within the correct range.

And doing all this other stuff. It’s shenanigans. It just feels very shenanigans.

Matt: It feels really shenanigans. I actually have to go back and look at the attacks on Telegram paper and see what they say about possible attacks in there. I don’t remember seeing it. I’m sure it’s in there.
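For what it’s worth, the client-side checks Deirdre is gesturing at — “the right number,” “within the correct range” — look roughly like the sketch below. This is not Telegram’s actual validation code; the function names and the exact set of checks (minimum size, primality, safe-prime structure, a small whitelist of generators, and the peer’s public value being in range) are assumptions chosen to illustrate the standard hygiene for server-supplied finite-field Diffie-Hellman parameters.

```python
# Sketch of the checks a client might run on server-supplied DH parameters.
# Not Telegram's real code; names and thresholds are made up for illustration.
import secrets


def probably_prime(n: int, rounds: int = 40) -> bool:
    """Miller-Rabin probabilistic primality test."""
    if n < 2:
        return False
    for small in (2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37):
        if n % small == 0:
            return n == small
    # Write n - 1 as d * 2^r with d odd.
    d, r = n - 1, 0
    while d % 2 == 0:
        d //= 2
        r += 1
    for _ in range(rounds):
        a = secrets.randbelow(n - 3) + 2          # random base in [2, n-2]
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(r - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False                           # composite witness found
    return True


def validate_dh(p: int, g: int, peer_public: int, min_bits: int = 2048) -> None:
    """Refuse server-chosen DH parameters unless they pass basic sanity checks."""
    if p.bit_length() < min_bits:
        raise ValueError("prime is too small")
    if not probably_prime(p):
        raise ValueError("p is not prime")
    if not probably_prime((p - 1) // 2):
        raise ValueError("p is not a safe prime; small subgroups possible")
    if g not in (2, 3, 4, 5, 6, 7):
        raise ValueError("unexpected generator")
    # The peer's public value must lie strictly between 1 and p-1, otherwise
    # the shared secret collapses to a trivial value.
    if not (1 < peer_public < p - 1):
        raise ValueError("peer public value out of range")
```

The less shenanigans-prone design, of course, is to not let the server choose at all: pin a well-known group (e.g., an RFC 3526 prime) in the client and reject anything else.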

Deirdre: Yeah. To pivot away from this: it feels kind of gross to have the CEO of a company that purports to provide end-to-end encrypted chats — and we’ve covered that it’s pretty shitty, but it does implement some sort of end-to-end encryption; if everything goes perfectly, and they’re not giving you a crappy finite-field Diffie-Hellman prime, and you’re online, and you pick the thing and do everything right, it’s giving you something — it feels pretty shitty that that CEO can just land and get arrested, ostensibly for running an end-to-end encrypted messenger. We could have Mark Zuckerberg land in India and get arrested for being the CEO of the company that controls WhatsApp, for ostensibly the same reason.

Or we could have, I don’t know, pick your CEO of a company that provides an end-to-end encrypted messenger. It feels like a pretty shitty precedent. However, it’s Pavel Durov’s fault that he fucking got into this situation in the first place, because it’s not really because he’s operating crypto without a license in France. It’s because he’s not fucking complying with the French trying to actually investigate crimes that are being organized in public. The vast, vast majority of activity on Telegram is public. And he’s been waving it in the air saying, yes, we do not do that. We do not. We just do not help.

We do not comply. We do not do anything. And now all the other people who operate good end-to-end encrypted messengers find themselves, like, under threat, because this is a precedent that has now been set. It’s just a big mess and it sucks.

Matt: Can we talk about that? That’s the thing that makes me really unsure how to process this. Normally, if this was a company that ran an end-to-end encrypted messenger — even one whose philosophy I mostly disagreed with — my normal reaction would be: look, this is really bad. Even if these are not the most sympathetic defendants, we want to make sure that encryption is protected, because after they get rid of these people, they’ll come after Signal, they’ll come after WhatsApp, they’ll come after all encryption. I don’t know. I’m not quite prepared to go there. I’m very close to being prepared to go there, but I’m not quite, because I feel like this Telegram thing is such a weird unicorn of a company and such a unicorn of a service that I don’t know if there’s a hundred times more going on here than meets the eye. Maybe this is some kind of weird geopolitical thing that is not just about encryption. Maybe it’s not just about message content.

Maybe there’s something else going on. And I feel a little weird rushing to the barricades to, like, save encrypted messengers if it turns out that this has nothing really to do with that. But maybe we’ll learn soon that it does.

David: There’s just so much plaintext. I know some people who are active in the tech community who are super stressed out about this and freaking out about an attack on cryptography or an attack on free speech. And it’s really hard to tell them that this is further down the spectrum toward a plaintext moderation issue than it is an attack on cryptography.

Deirdre: Yeah. And if they had filed this indictment — whatever, I don’t know French law, but they had a press release about the full indictment, because apparently in French law they don’t publicly post the full indictment the way you do in US law, so we can’t just go look at it — but they say, you know, basically whatever the French equivalent is of obstruction of a prosecution or investigation or whatever. If it was just that, I don’t think anyone would give a fuck. They’d be like, yeah, that’s what they do. They basically say out in public on their website that that’s how they behave, and that’s how they have behaved.

But it’s the other things, like this operating-cryptography-without-a-license charge, which I think most people were surprised is even a thing — because if Pavel Durov and Telegram are found liable for operating cryptography without a correct license in France, it would imply that a whole bunch of other software that has cryptography, doesn’t have a license, and operates in France would also be liable. And that’s the scary thing. And it’s really unfortunate, and I wish we had more information, because that is the scary thing. It might just be padding the actual thing, which is “we’re trying to investigate stuff and you are obstructing us,” as opposed to “we’re trying to investigate stuff, and you have encryption, and the fact that you have encryption without a license is the thing we have a problem with.” It feels like they’re throwing it in, but we don’t know. And if it is just thrown in, it swirls all this stuff up, and it makes everything very fuzzy and kind of gross, and it’s not great.

Matt: It would be really nice if they dropped those charges or clarified those charges. Unfortunately, I just don’t think that they care. I’ve never gotten the impression that the French government is pro-encryption or cares about the privacy community very much. And so we’ll see how this boils over.

David: You’re forgetting the core tenet, which is that it’s not cryptography unless it comes from the Cryptologie region of France. Otherwise, it’s just sparkling combinatorics.

Thomas: I’m going to say right now that on Slack, David said, like, 40 minutes ago, that the one thing he had to contribute to this conversation was a sparkling-cryptography-region joke, which he finally got in.

David: I got it in.

Deirdre: And it needs to be licensed by the French government, apparently, or something like that.

Thomas: The only thing I have on the legal side of things, when trying to understand how Twitter and message-board people are reasoning through this, is that there is a species of YouTube video that I highly recommend you seek out, which is sovereign-citizen people getting pulled over by the police. Now, that’s a whole other thing, right? Like, there’s a million videos of sovcit people getting owned by the police all across the continental United States. But I’m saying there’s a very specific species of them where sovcit US people get pulled over in Canada. They’ve driven over the border into Alberta, right? And they get pulled over by the police, and they try to explain the Fifth Amendment to the Canadian cop, who’s like, we don’t have a Fifth Amendment. Which I think decodes, like, 80% of all of the message-board conversations about Telegram. It’s people trying to look at what France is doing through the lens of US procedural protections. And, like, I don’t want to get too political here. I totally do. But one thing I think is not appreciated enough is just how many weird procedural protections we have in the US that are not, in fact, common in the rest of the world.

Thomas: For instance, when a police officer or a law enforcement agent collects fucked-up evidence by, like, you know, pummeling you and getting you to admit it, or doing an illegal search of your car or whatever — we have a rule in the US called the exclusionary rule, which is that that evidence gets struck from the court case. And again, back to what I said earlier in the conversation — and we’ll have a show note somewhere with a link to this white paper, because it’s fucking awesome — about how the French legal system works: the prosecutor works for the judge in France, right? There’s no exclusionary rule in France. They could beat whatever — they’re probably beating Pavel right now. Right.

All of that evidence is admissible, because that is not a protection they have. I don’t even know if they have double jeopardy in France, right.

David: This is Thomas saying in so many words that France and the United States are two different places.

Deirdre: Indeed.

Matt: But let’s just add that Pavel Durov took on French citizenship voluntarily, like, two years ago. This wasn’t something he stumbled into. He made this part of his life.

Thomas: Oh, I don’t know.

David: Did something happen in Russia a couple years ago?

Deirdre: Yeah, but why France? He’s also a citizen of the UAE.

Matt: And, like, there are so many other EU countries you could go to that are maybe not better, but.

Deirdre: And, like, in France, apparently, you have to, you know, have French proficiency — you have to be sufficiently proficient in French to actually become a French citizen. You do not have to do that to become a citizen of Ireland, for example. And so somehow he still did.

David: You could still buy citizenship in Malta up until, like, a year or two ago, I think.

Deirdre: Yeah. So, yeah, it’s very interesting. Yes — just to do Thomas a solid: he is a citizen of France. He flew into France, he got arrested in France, and he is a French citizen. I don’t know.

Deirdre: It’s weird. Matt, thank you. Thank you for jumping on. Thomas, you have anything to say? We’re wrapping up.

Thomas: I think I’ve said enough.

David: Matt, do you have anything else that you want to say, or are you happy that you came on and had Thomas interrupt you for an hour?

Matt: No, I actually really enjoyed it. No, no. This whole thing’s a mess. What I do want to say is that there has been this big debate in the EU about doing chat control, which is scanning chat messages. And I just wanted to add this one last thing, which is that France has not been what I consider to be on the side of the angels in that particular debate. They are very pro-scanning. And so if this really is about cryptography, France is not the country I would be happiest to have doing this. So that’s really all I wanted to add. That’s pretty much it.

Deirdre: Yep. Not very happy. You know, nothing has landed yet for this chat control stuff, for these sort of internationally run, end-to-end encrypted things. But there is a reason that people really, really care about federation for building the next generation of encrypted messaging, or encrypted chat messaging: they have to care about federation if they want to operate in Europe, which will have its own laws and things like that.

Matt, thank you so much. Don’t use Telegram! Pretend it’s all public because it basically is. Don’t use Telegram, cool.

Security Cryptography Whatever is a side project from Deirdre Connolly, Thomas Ptacek and David Adrian. Our editor is Nettie Smith. You can find the podcast online @scwpod and the hosts online @durumcrustulum, @tqbf, and @davidcadrian. You can buy merchandise online at merch.securitycryptographywhatever.com. If you like the podcast, give us a five-star review wherever you rate your favorite podcasts. Thank you for listening.