NSO Group, Pegasus, Zero-Days, i(OS|Message) security

Deirdre, Thomas and David talk about NSO Group, Pegasus, whether iOS is a burning trash fire, the zero-day market, and whether rewriting all of iOS in Swift is a viable strategy for reducing all these vulns.

The memory safety of C and the blazing speed of Smalltalk.

This rough transcript has not been edited and may have errors.

Deirdre: Hello, welcome to Security Cryptography Whatever. I am Deirdre.

Thomas: Hi, I’m Tom.

David: I’m David.

Deirdre: Today we’re talking about NSO group, Pegasus, is iOS a burning trash fire or not? Do we need to do bug bounties better or not? Why isn’t everything written in Rust already? Spoiler: It takes time to replace already written software. Why you shouldn’t just give up because software is riddled with bugs.

So, recently there was a big news headline across multiple publications about an Israeli-based... was it a cyber-espionage tooling company? What do we want to call these?

David: They’re a hacking company.

Deirdre: Yeah. They buy zero days.

David: Do they buy them or do they find them?

Deirdre: I don’t know- why not both?

David: Well?

Thomas: Does anyone— does anyone find zero-day vulnerab— I always kind of assume that everybody that sells products that run implants, but are vectored through zero-day vulnerabilities, is also in the market for those vulnerabilities.

Deirdre: Mmm-hmm. I wouldn’t be surprised if NSO Group did a little bit of everything, but especially bought some, and then did not necessarily buy the exploit outright. They probably weaponized the exploits: they created the exploits themselves and delivered them. Anyway: spyware, malware, usually targeted, in the context of their product Pegasus, at mobile devices. In the news lately, The Guardian, the Washington Post, and a project run by Amnesty International, with checking from Citizen Lab, found a whole bunch of targets from a source. I don’t think we’ve ever narrowed down exactly where this list of targets came from: a bunch of phone numbers that all these journalistic bodies associated with targets such as the president of France, Emmanuel Macron, or a bunch of family members of the murdered journalist Jamal Khashoggi. And they confirmed via forensic analysis that their devices, from the time of their targeting, were littered with evidence of this Pegasus spyware, which, as far as we can tell, got onto these devices by leveraging a zero-day in iMessage, in iMessage media parsing, so that you never had to click on a link or interact with the attacker at all. You never had to click a link or download the payload or anything like that actively. They just sent something to your phone number on iMessage, and it automatically did it in the background and installed the payload.

And targets were totally owned up, and whoever had paid NSO Group for their Pegasus software got practically everything, as far as we can tell. And part of the problem with trying to track down Pegasus was that it was basically not persistent— or at least this iteration, the most recent iteration of Pegasus, was not persistent.

Previous iterations were persistent, so that if you rebooted the phone, they would still be there. I think the latest iteration is reported to be not persistent: if you rebooted the phone, it would be gone, and they would have to own you up again. But because it was happening quietly, under the radar, they could just send you a message again, and you would never even know. Part of the problem there being that iOS is so locked down, you need to use a jailbreak to do the forensic analysis, to get evidence of getting owned up.

Yeah. So, Thomas: is iOS just, like, a tire fire? And why is the, quote-unquote, most secure operating system in the world so vulnerable to someone sending you a message on iMessage, nothing visibly happening, you not noticing, and then you’re completely owned up?


Thomas: You can be the most secure operating system in common use and still be a complete tire fire, right? That’s an indictment of computer science, not so much an indictment of Apple per se. So, I mean, it’s possible that Apple is doing better than any other mainstream vendor right now at locking down its operating system, and it still might not be enough. We talk about NSO Group, but the reality here is that the economic force behind this is state-level adversaries, right.

Thomas: People with— my mental model for state-level adversaries— and I’m not an expert here, right? This is not a field I work in; I just have dumb opinions about these things. But my mental model is that state-level adversaries are adversaries with effectively unlimited budgets. So you can be fantastic on security relative to the rest of the industry, and relative to where we were in the two-thousands, and still not be well matched against an adversary that can address the gray market to find vulnerabilities and spend arbitrary amounts of money to outdo you. There’s also the fact that we’re all coming out of a legacy of not just memory-unsafe languages, which are obviously a problem and somehow still controversial, but also just legacy system design: how you construct these systems out of components and all that. It’s also possible that you can be on the ball with design and have a really clear idea of where things need to go, but no realistic amount of money you could spend would get you to a place where, right now, you would be perfectly well matched against a state-level adversary.

That’s my high-level take. I think if you read Matt Green’s post, "The Case Against Security Nihilism"— and I think nobody here is an advocate for security nihilism—

Deirdre: Can you define security nihilism?

Thomas: I don’t know. It’s a good question. After reading the post, I’m left with that question, right?

I think you can be a computer science nihilist and say we should throw all these things in the dumpster, right? Like, I’ll sign up for that, right?

Deirdre: Yo, the silicon lies to us. We ask it to do stuff, and it’s like, "yeah, we’ll take that under advisement; we’re just going to compute stuff however we want to". So I almost agree in that regard.

Thomas: It’s a recurring theme with me when I try to talk to other software developers about these problems: one of the premises that we don’t examine a lot is the idea that whatever people are doing with phones or with computers or whatever, they have to do those things.

This came up a lot during campaign security, right? Or with journalists. This comes up a lot with journalists: they have to go talk to people, it’s their job, right? But when you’re talking about sensitive targets that don’t necessarily have to go and talk to everybody, and don’t necessarily have to have one phone that’s the command center for everything they do in their life, there’s always this question of: do you have to be using these devices at all for this stuff? We’d all rather there be a world where all this stuff is safe. But it’s just like with secure messaging, where there’s a lot of secure messaging technology out there that is better than nothing, but not as good as Signal. The reframe for that is always, you know, "this is better than nothing, it’s cryptography, people aren’t going to attack the cryptography first; if you weren’t using this, you’d use nothing at all". And there is an option besides using nothing at all, which is not sending the messages at all. Not doing that. Being aware of the risk that you take when you have an online profile for this stuff. And I think that runs through all of this: we might not be at a point yet in our industry where we’re ready to support all the use cases for all of these, like, targets of the UAE. So.

Deirdre: Like, sources that you’ve never met— you don’t even know who they are, but you got contacted through some sort of SecureDrop, or maybe not SecureDrop, but— you have to trust them as far as you know them, and you might not know them at all. You just know that they are someone inside something that you’re trying to report on, and they might be sending you images or voice memos or whatever.

And that’s what you’re trying to— that’s your use case for these devices and the software. And if you can’t do that securely, then what are you even doing with this technology? Basically.

Thomas: Yeah. There’s also— and David’s here, and I’m sure he has something more intelligent to say. But I feel like the kernel of Matt Green’s post is this paragraph where, in the world Matt Green inhabits, Ivan— Ivan is the lead of security at Apple, good guy— wakes up and says, today I’m going to put NSO out of business. Let’s put a pin in the idea that the right thing to do here is to put NSO out of business. NSO’s evil, don’t get me wrong. But there’s an unexamined premise about the goal of putting a specific bad actor out of business.

But that aside: he wakes up, he decides, you know what, today I’m going to go to my bosses and get a blank check to put NSO out of business. Is there a blank check that Apple can write that solves that problem within the next four years? Is it a thing that’s possible for them to do? I don’t know.

Like, maybe it is, but it is not at all clear to me that that’s a one-paragraph question rather than an entire blog post’s worth of things we should examine about how possible that is to do.

David: Let’s replace "put NSO out of business", because that’s complicated by market dynamics, and take the business model, everything, out of it. Why don’t we just say: give Ivan a blank check to get to, let’s say, less than 10% or less than 1% of the current number of remotely exploitable vulnerabilities in iMessage, or the network stack, or anything on the iPhone that an attacker can send a message to, basically.

And ignore the fact that NSO has their own market dynamics, and that the UAE has a lot of money. The police doing traffic stops in the UAE drive Lamborghinis, at least according to Top Gear. That’s the scale of money we’re talking about here.

And so, say they’ll always be able to afford million-dollar or even hundred-million-dollar exploits (which seems unlikely to get that high, no matter how rare they are), and we take the business out of the equation. I think there are still a lot of things that could be done to improve the state of iOS. Would you disagree with anything Matt Green says if we replaced "put NSO out of business" with just "make the attack surface of message parsing on iOS better"?

Thomas: Yeah. I feel like I have two responses to that. The first, kind of silly one, is that I’m picking on the words he chose there about putting NSO in particular out of business. But the real question I’m asking is: if your threat model is this threat model, the market for these capabilities— I generally come at this thinking that any plausible cost for these kinds of implant technologies and vectors and all that, any plausible cost you come up with, is probably petty cash for the Seychelles, right? Almost everything they do involves staffing multiple people and having them travel around the world; just the logistics costs of running any kind of serious intelligence operation.

Like, I assume it’s dominated by really boring things that we don’t think about a lot, but that’s where all the money goes; it’s all health insurance, that’s what costs. And an exploit, compared to that—

David: Can you imagine the rates on that group plan, right? If you tell the insurance company that everyone is spies— the pre-existing conditions, man.

Deirdre: But they all stay in the TAO, in the very center of Fort Meade, so maybe that gives them a discount or something.

Thomas: And an exploit is essentially a CapEx cost: you pay it once, and then you have this capability, and it’s push-button for the thing you’re trying to get out of it, right? We’ve talked a lot about how police use exploits now, and things like that, and where that’s going, and, like, you can see the attraction, right?

Because the alternative to doing that, the alternative way of doing intelligence, is classic policing. Forget about the civil liberties— that sounds terrible, but forget about the civil liberties aspect of it for a second. Just the notion of tailing somebody with a team of people, 24/7, figuring out where they’re going.

Again, it’s just, like, the health insurance costs of having all those people. I have a hard time understanding how you could reasonably drive— it’s so valuable, right? The capability that people are getting out of exploits is so valuable. And, just to bring this back to computer science, right?

Like, my real question here is: is there a check that Ivan could write— maybe cost is the right way to think about it, or maybe something else is— that would drive that threshold up to the point where any government we’ve ever heard of wouldn’t pay the money in the blink of an eye? And I’m just wondering: if you could put $10 billion down on securing iOS tomorrow, do you have the people to do it? Do the processes exist to hire those people? Think about all the changes you’d have to make to iOS. Those changes are happening anyway, but they’re happening over, like, a five-year timeline—

Deirdre: All right, let’s talk about some of those changes. Things that I think of specifically: you talked about rearchitecting some parts of iOS, ideally. First things first, rewriting some of these data parsers— audio, video, images— in a memory-safe language. Because they’ve tried to put them in a sandbox, and they called it BlastDoor, and we just saw a bunch of patches come out where they were like, "ah, you can get around BlastDoor. Sorry! You can get privilege escalation into the kernel. Sorry!" That blast door is not very strong. So what else?

David: Presumably they’d still have had more without BlastDoor, right? I—

Deirdre: Presumably, but I don’t know.

David: of a sandbox. Even if you do get around it, that’s still time you have to spend getting around it. And maybe it blocks like maybe 50% of things are no longer feasible because of BlastDoor maybe it’s 10% or maybe it’s 5%. But I think it’s, I think it’s, I think it’s—

Isn’t that the nihilism that Matt Green is referring to— being like, "well, the sandbox didn’t work"? Yes, but that doesn’t mean you shouldn’t have sandboxes.

Deirdre: Oh, no. I would say sandbox and rewrite it in a memory-safe language, because there are still logic bugs that you can have in your thing written in a memory-safe language. You would like to have your sandbox around that.

David: Yeah. The sandbox is a lot cheaper.

Thomas: Right. I mean, BlastDoor—

Deirdre: Right?

Thomas: is written in Swift, right? BlastDoor is in a memory-safe language.

It is, and it’s also— I think it’s still Seatbelt sandboxes that are the thing on iOS? And friends, or acquaintances, I have that actually do iOS work are going to make fun of me for saying that. But in addition to being written in Swift, BlastDoor also has a really tightly locked-down system-call and IPC filter. So you can’t talk to the network from inside the BlastDoor sandbox, you can’t touch the file system, and you can’t talk to device drivers from it— which, if you read up to bring yourself current on iOS exploits, is where a lot of the stuff is.
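To make that concrete: Apple’s "Seatbelt" sandboxes are described by profiles in a Scheme-like policy language (SBPL). The fragment below is purely illustrative— it is not BlastDoor’s actual profile, and the Mach service name is made up— but it sketches the shape of the default-deny, narrow-IPC policy being described:

```scheme
(version 1)

;; Default-deny: anything not explicitly allowed below is refused.
(deny default)

;; The restrictions described above: no network, no file system,
;; no device-driver (IOKit) access from inside the sandbox.
(deny network*)
(deny file-read* file-write*)
(deny iokit-open)

;; Only a narrow, named IPC channel back to the rest of the system.
;; (Hypothetical service name, for illustration only.)
(allow mach-lookup (global-name "com.example.messages.reply"))
```

The design idea is that even if the parser inside the sandbox is fully compromised, the profile leaves the attacker nothing to call: no sockets, no files, no drivers, just one vetted IPC endpoint.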

David: Exploits in device drivers? What is this, Windows in 2005?

Thomas: But if you read the Amnesty report, the forensic methodology report that they published, it looks like in a lot of cases— and again, I’m saying this too often, but I’m probably wrong about all of this— what happens is you don’t send a complicated file to iMessage directly, per se, so much as you send something to iMessage that causes some other application on iOS, one that was not considered part of the attack surface, to interact with the attacker directly.

So you’re actually working through Apple Music, for instance— and nobody thinks of Apple Music as attack surface.

Deirdre: Uh-huh

Thomas: There’s no zero-click Apple Music— except there is, if you can trigger Apple Music through an iMessage. There was a fad for these kinds of vulnerabilities back in, like, 2010, when it was all URL handlers: a really common vulnerability in an iOS app is that you have a URL handler that does something based on a URL click, and essentially you can synthesize roughly the same kind of thing out of an iMessage. So it doesn’t really matter how you sandbox iMessage if iMessage can also trigger Photos or Apple Music, or things where you’re interacting with that application instead. Which brings us back to all of iOS. All of iOS, that’s the issue, right? It’s every application that’s built on iOS.

David: I think the first— or at least among the first was Charlie Miller. I don’t know who the first person on iOS was, and I’m not going to make a claim, because someone will get mad at me. But I remember Charlie Miller found an exploit on the iPhone in the very early days of the iPhone. And it was in the URL parser.

Deirdre: Mmm. Not surprised.

David: 2007.

Deirdre: Okay. So, Thomas, how would you rearchitect iOS? Like, do you have any— if you were Thomas Ptacek né Ivan, and you were handed a big budget and had everything solved for you, what would your plan of attack be to deal with this sort of thing?

Thomas: I would turn that job down.

Deirdre: Okay.

If they give you $10 billion, and several billion of that is just for you to take the job?

Thomas: I mean, $10 billion and I’ll take the job. I won’t do a good job of it; I’m just warning them in advance. If they’re thinking about giving that money to me, that’s not value they’re going to get out of me. So.

David: It’s like when OPM was offering, like, $120K to be their CSO. I—

Deirdre: That’s it?

David: Not touching that. $120K, and having to testify to Congress on, arguably, a consistent basis.

Thomas: But that is not a problem that Apple has, right? If you talk to the security people on Slacks or whatever, they all know that. Every one of them has a rough idea of the investment Apple puts into security. But for the broader audience, most of the people reading the security nihilism post, it’s not clear to them at all how much Apple invests in security. And I’m going to sound like I’m shilling for Apple here— I have no commercial relationship with them or whatever— but they’re one of the three biggest investors in information security, software security, in the world. If you’re talking about doing substantially better than they are, at a different level from where they are, you’re talking about something no commercial entity in the world does. But you’ll read responses to this where "Apple needs to take this stuff seriously, at last, at long last". And it’s like, yeah, I don’t have a way to finish that sentence.

Deirdre: Yeah. It’s like, well,

Thomas: sputtering.

Deirdre: Like, I would say the only thing I can think of that might be on a better footing is Fuchsia. Fuchsia is coming; Fuchsia is all Rust, except for a microkernel written in C and C++, just because they had to write it when Rust was barely at 1.0 or whatever. But Fuchsia is like a toy OS over in a corner that’s vaguely going to replace their ad-hoc Linux/Unix stuff; it’s not iOS or Android, on multiple billions of devices around the world. So yeah, I agree that trying to say "oh, Apple doesn’t take security seriously"— that would preclude practically everyone else in the world.

Okay. Pretend money’s no object and you actually want to do something. Do you have an idea of something that needs to be rearchitected about iOS? Or is it literally that all these components are vulnerable on their own, it doesn’t really matter how you reorganize them, and you just need to drill into all of them, because that is just how an operating system like this works?

David: Okay. So, it’s been a while since I’ve looked at the details of how iMessage works, but if you let me handwave a little bit: the messages are almost entirely XML— or XML is a large component of it at this point— and for all of the things that they do, it’s basically a file system, right?

An iMessage message can have attachments. It can have links that are parsed out into other stuff that creates network calls. It can have games embedded in it, with App Clips. It can launch an app, right? There’s a lot of stuff going on there. And if I had to venture a guess, right—

If you remember the early days of iMessage and iPhone texting, it was just like, oh, your message is going over the internet, and it’s blue now. And you could make the iMessage whale. Does anyone remember the iMessage whale?

Where you hit enter a bunch of times— if you just hit enter a bunch of times, and you did a period, or a bunch of underscores and another period, the little tail of the bubble looked like the tail of a whale?

And so the message looked like a whale, because they had shading and lighting on the sides; it actually looked like one, before they did the flat redesign.

Anyway, Google "iMessage whale".

But right: it began as something that was a replacement for SMS, and it’s now something that can send arbitrary content, with attachments, and, like, code, to some extent.

And so I would believe that there exists a simpler implementation of iMessage than what they’re using now. And whenever you simplify any sort of network-ish protocol, it becomes easier to implement securely. And then, on top of that, ripping out as much of it as possible and reimplementing it in a memory-safe language, or a closer-to-memory-safe language like Zig—

Deirdre: Would they do it in Swift?

David: I suspect they would do it in Swift. When I’ve talked to Apple people, they’ve basically said the percolation of Swift throughout Apple is just in a weird state, like where you can and cannot use Swift in the build process, doing more with build stuff.

Someone from Apple can probably correct me, but—

Deirdre: I would definitely like to rewrite a bunch of this stuff, because even in the most recent patches that have come out for iOS, macOS, iPadOS— I forget what it’s actually called— more than half of the vulnerabilities that have been fixed are memory-safety vulnerabilities. I think it was the "fish in a barrel" account, or whatever that account is, that keeps track of all of them. And that’s very handy.

David: The analogy I would use would be Word documents, from, like, the nineties through the early two-thousands to now. Not that I recommend opening an untrusted Word document from anybody, but Word document serialization began as: take a pointer to the struct that is the Word document, call, like, the write system call on that pointer, and great, we’re done; and then it would read it back in directly to the pointer. And then, with the .docx format, they switched to an XML-based format, which is hopefully easier to parse than a C-struct-based format, and stuff did get better over time.

Deirdre: And they did disable certain macros in Microsoft Word. And so, yeah, opening a random Word document on modern Windows is just a completely different stack of both capability and implementation than it was 10 or 15 years ago.

David: Yeah. And that took, what, five years? Ten years? I don’t know how long, but it’s clearly possible for a large tech company to take what is effectively a file format— which is what iMessage is— and redo it: better, stronger, harder, faster.

Deirdre: with a lot of changes. to windows though. So like from the late nineties, two thousands to where windows exists now is like a different beast of separating what your capabilities on the system. I’ve heard arguments that basically iOS has separation between apps to some degree, but it could be a lot better.

I think reimplementing iMessage, and simplifying the protocol and the stack in the ways you just described, is necessary and good— but also, don’t forget the rest of iOS, I guess. Something to that degree.

Thomas: When Microsoft did that, it was a root-and-branch effort across the entire tree for them. In a previous life I was kind of peripherally involved with some of that stuff, and a reasonable mental model is that they probably put as much energy into pentesting—

David: I was in middle school.

Deirdre: school.

Thomas: Yeah. They probably put as much effort into fuzzing and pentesting Minesweeper as any company has into their own products. Everything got hit. And they also really pissed off a lot of their users with the subsequent releases that metabolized all that separation work they were doing.

And so they did those two giant things. And— I don’t know; this is three people who don’t really do iOS security work talking about iOS security, which is great— but I don’t know the extent to which Apple is doing that kind of root-and-branch memory-safety work. By the way, Microsoft didn’t do root-and-branch memory safety.

They did root-and-branch "don’t call strcpy and friends anymore", which is not quite the same thing. And for all of that, is Windows in a place right now that’s substantially better than iOS? I don’t know that that’s true. So it’s kind of unproven to me whether you can just repeat what Microsoft did and get to a much better place.

For a while they stopped worms, right? And maybe this is our modern equivalent of worms. But they didn’t end vulnerabilities on the platform.

David: I think the analogy is just that it was a major upgrade to the state of security. I don’t think anyone was trying to claim that Windows is more secure or has fewer exploits than iOS— although, I don’t know, maybe it does. But if someone was talking to me and saying "I have sensitive stuff", or if I was in charge of securing a CEO’s devices, I’d be like: you’re getting rid of your Windows PC, we’re getting rid of your Mac, and you’re doing all of your work on an iPad.

Deirdre: An iPad or a Chromebook or something. That’s a good point. Basically, our default is: if you need a device, and you think, or we think, you might be a high-value target, use an iPad for all of your document handling— if you’re a journalist, or a political candidate, or a principal— or possibly a Chromebook.

Where would we compare macOS to where Windows is? Because macOS used to have this shine of, oh, you don’t have all of the viruses and malware and crapware that went after Windows, because Windows was, like, 90% of the market or whatever. And I don’t know where it’s at now.

Like, "oh, you don’t have all these problems on macOS". You can make an argument about popularity versus actual vulnerability. I don’t have a good gut feeling about whether Windows now, fully patched, whatever Windows we’re on now, has a security architecture that’s possibly better than, or comparable to, macOS.

Just because macOS has been languishing; they haven’t been investing in it as much as they used to.

David: I think you are less likely to torrent a pirated copy of software that actually has malware on macOS than you are on Windows.

I don’t have anything deeper to say than that.

Thomas: Kind of in the continuing theme of three mostly-crypto vulnerability researchers talking about platform security: I haven’t talked to anybody who believes that macOS is more secure, or even comparably secure, to a secure configuration of Windows 10 or whatever it is right now. But I also don’t know anybody who thinks that Windows 10 is more secure than, or comparably secure to, an iPad— for different reasons. It’s not because the iPad is perfectly locked down, as you can obviously see. But if I was talking to a campaign or a journalist, and giving them advice on how to deal with incoming messages and stuff like that, I’d still a hundred percent tell them: use Gmail for everything, and look at your attachments on an iPad, because if you do that on Windows, you’re just going to get owned.

For reasons that probably don’t have that much to do with the intrinsic software security of the Microsoft Office platform, and more to do with how flexible desktop environments are compared to tablet and phone environments.

Deirdre: There was the Apple v. Epic trial— the game company that pushed Apple’s buttons and got kicked off the App Store, because they were trying to actually do payments through their own store, inside their app. And they brought James Mickens as an expert witness, basically to compare macOS and iOS, to testify to their security, and to whether all of these store restrictions are good for security or not.

And basically he was like—

David: Did he deliver it as a standup routine?

Deirdre: He did not; he delivered it quite well. He was a well-delivered expert witness in a real legal trial with lots of money on the line. But he basically made the argument that macOS and iOS are not significantly different, except for the fact that on macOS, you can just download a random executable from the internet and run it. You could also go through the App Store, but the OS lets you just click on something random that you downloaded from your web browser and execute it.

And iOS doesn’t let you do that, and there’s not a significant security reason why.

So anyway, yeah. It’s the flexible desktop stuff.

Thomas: that seems wrong.

Does that seem off to you— that there’s not a significant security reason to have that one single App Store channel for installing applications? It seems like, for end-user security, you’re a lot better off if you have to get your applications through the App Store than just downloading them off the internet.

Deirdre: Yes, but I think the argument that the non-Apple side was trying to make is that if you really cared about this on your desktop OS, you would enforce it the same way you enforce it on iOS— only let people install things through the App Store on macOS— and they don’t. And so it was like, why?

Thomas: Right. And I mean, I think—

Deirdre: The answer is money, so that they can take a cut of sales.

Thomas: But we know what the real answer is there, right? You can just look at Hacker News: every time any story about Apple platform security comes up, there are 10 comments about how this is the beginning of the end, and Apple is going to take away from macOS the ability to side-load applications, or run my own applications, so they can control the whole platform.

And you’ll have to use the App Store. It’s a conspiracy theory that everyone has believed for 15 years now, right. And the reason they don’t is because they can’t— because their users won’t let them—

David: They’re not—

Thomas: —the way they can with the phones.

David: They’re different platforms for different purposes. And what the people on the internet don’t realize is what happens instead: normal, non-security, non-developer, non-engineer people just do all of their computing on their phone. It’s not "the internet" and "the mobile internet" anymore.

It’s the mobile internet that’s just "the internet" now, and then there’s the desktop internet.

Deirdre: and maybe your iPad,

David: It’s not "computing" and "mobile computing"; it’s computing and desktop computing. If I put on my venture-capitalist market-analysis hat here: the market is going in that direction, and the Mac and Windows will continue to be a thing, running more or less how they do now. It’s just that fewer and fewer people will use them.

Deirdre: Yeah.

David: And I believe that even games are going to go to different platforms; look at the Steam Deck. Now we're way off topic.

Deirdre: Ooh, I'm looking at the Steam Deck. I'm interested in that. This is the 'whatever' part of 'Security Cryptography Whatever'.

David: It's better for most users to have either one or a small number of app stores from which to download software.

Deirdre: We, as a bunch of people who are nerds about cryptography, can argue that someone signed this binary, and it can only be installed on this operating system if we're able to check the chain of signatures back to a trusted root or something, and we think that that is good.

But it does go through the somewhat arbitrary review, subject to Apple's corporate values about things that get approved and audited through the app store review process. As opposed to just, you register with Apple, and as soon as you get a developer key pair you can just start whipping out signed binaries, and they're distributed through the app store channel.

Do you think there’s any actual value in that?

Thomas: A hundred percent I do. I would compare it to, ah, this is terrible, right? Random topics that we all think about: David thinks that my random topic is meats, and my real random topic is local government right now. But it's analogous to everything that happens in local government and passing ordinances.

So for instance, where I live, we're probably going to regulate Airbnb. We're going to license it: people can only rent out their places if they have a license. We're worried about people having parties and stuff in their apartments; that's been a recurring problem.

Right? And the people who are opposed to that point at it and say, well, there's nothing about this license process that's going to prevent somebody from having a party at their place; no one can seriously check before people get in. But the point of the licensure isn't that the licensure itself prevents the bad thing from happening: it's that it gives you a control point, right?

Like, the real teeth there are that your license gets revoked, and then you can't rent that place out again. In the same way, this single-channel app store with approval allows them to limit the size of the addressable market for a piece of bad software.

You know, at some point you get fairly large, or lots of people start using your thing. But you can only keep selling it if Apple approves. And at some point Apple catches up to the fact, probably because somebody told them about it, or maybe because they found it themselves; they do a substantial amount of reverse engineering and things like that.

Who knows. Either way, they figure it out, and then they revoke the license for it. Right? And that's it. You can still get malicious software onto the app store,

Deirdre: and that has happened

Thomas: but not at a huge audience size, right? At some point it languishes; only a couple of people ever touch it.

It's hard to drive-by install an application on iOS for people. Maybe it's not; again, what do I know? But the real teeth there are that they can revoke your ability to keep pushing new versions. And that seems self-evidently valuable to me.

Deirdre: What about an extra app store? Does it have to be controlled by Apple? Because they let you do this on Android: they used to let you install different app stores. I don't know if they still do. Same argument?

David: I think if Apple just cut the fees, everyone would shut up about this.

Deirdre: Yes, they probably would. They probably would.

David: Like, the difference between a reasonable fee and what their current fee is, is somewhere in the range of 10 to 50 billion dollars in revenue. So I understand why they're not.

Deirdre: The default is a 30% cut, there's that. And they don't let you make other sales, or even point you to other sales inside the apps, or anything like that. You have to make the transaction as part of the app store, and they hate that too. Yeah. Okay.

So we've talked a little bit about NSO group, Pegasus, zero-days, security nihilism, whether iOS is really a burning trash fire, and how you would actually fix it a little bit. What about bug bounties? Is Apple pricing theirs right? I think the highest bug bounty they will pay is like a million dollars or something like that. Is that too low for the sort of vulnerabilities that Pegasus is leveraging? Because for a product like Pegasus, you pay a one-time cost to buy the zero-day, and then you can deploy it over and over and over again until it gets patched. Thomas, what do you think?

Thomas: You keep asking me. I'm not just over my skis in this discussion; my skis are miles behind me.

David: I think it's ridiculous to compare the market for bug bounties to the market for exploits. People have moral values, as well as agenda-related goals or whatever, as to what they're trying to do. I don't think there's ever been a person who said, well, I could sell this vulnerability through the Apple bug bounty program for a million dollars, or I could sell it to NSO group for $3 million, even though that's probably still more than NSO pays; I don't believe they're paying that much. So I don't think that's people's consideration. And the people who are finding and selling exploits are, for the most part, not the people who are going to bug bounties.

Deirdre: I think at one point that was kind of the raw consideration. I read Nicole Perlroth's book, 'This Is How They Tell Me the World Ends'. It's basically a history of the zero-day market at a pretty high level.

Before bug bounties were really a thing (and if anyone has more direct experience, I'm talking about what I read elsewhere, not direct experience), people would try to report things because they wanted to get them fixed, for no cash. And then they would just get rebuffed: oh, our lawyers are going to sue you for finding a vulnerability and exposing it and making us look bad. And they're like, well, fuck. And then people started paying them cash under the table, not telling them where it went, and just giving them these no-competes, basically, on these vulns, and they're like, okay, I'm going to go over here. Then the bug bounties started to come up, and they were chump change, and still a pain in the ass, compared to selling to a broker or, eventually, a state actor or an NSO group or whatever.

So I don't think it's necessarily a million dollars with Apple versus $3 million with NSO group in general. But for the actual actors in this kind of rarefied market, it might not be as easy a choice as you and I might think.

Thomas: Yeah, you know, I don't know what I'm talking about here either. But two things I think I know. The first of them is that the direct price comparison between what people think the gray and black markets pay and bounty prices doesn't work, because the terms are different, right? Whatever the number is that you're getting on an exploit that you sell privately, you're not getting that in a lump-sum payment.

You're getting it in tranches. You continue to get tranches until the vulnerability is burnt. And you don't know, you don't know when...

Deirdre: I'd never heard of that.

Thomas: There was a talk at Black Hat two years ago.

David: 2019, yeah. By Maor Schwartz, I think is his name? Yeah, M-A-O-R. He's a broker.

Thomas: And you read it, or you hear it, and it's obviously, this must be how everything works, because it's such a sensible way to run things. But you get paid for a vulnerability as long as the vulnerability continues to work. And you, as the person who reported the vulnerability, don't know when it's going to stop working. You're gambling when you do it that way.

Right. So that's one thing.

David: It is a contract that looks very similar to a B2B software sale. It's kind of funny; it's like the same industry. I was watching that talk at Black Hat that year, and I'm like, they're just talking about the contract negotiation that we were going through at Censys, like with anybody. All of this stuff is there; there's a support team,

Deirdre: In a way, this reminds me of how all these ransomwares are now ransomware-as-a-service. This is so weird, all these worlds colliding. Anyway, continue.

Thomas: My other thing I think I know here is that we're not necessarily well-served by Apple driving up NSO's costs. Not just because putting NSO out of business is hard, but also because, by driving up exploit costs, you can potentially be helping NSO, right? Because part of their business model is recurring revenue from the implant that they build, Pegasus, or whatever that framework is called.

But part of it is also them capturing some of the value of exploit development. And as exploits get more expensive, they take a cut of that, and they're going to get some revenue from that. Ultimately, the people you're really charging here aren't NSO; they're the intelligence services of European governments.

And those people can pay indefinitely more. So at some point you're just making NSO very, very rich by driving those bounty prices up. That's not to say that we shouldn't drive bounty prices up; that's not the headline I would want taken away here. But I wonder how the dynamics work.

I wonder how you're actually solving a problem by making the market hotter for NSO.

Deirdre: So Apple's driving those costs up, the idea being that you want the people who have these vulns, or who found them, to go to Apple and give them to Apple, and Apple will pay you the rates you would get if you went across the street and gave it to NSO group or a comparable broker. So it's driving up the cost. Are you saying that would drive up the cost of NSO group's offerings when they sell to the ministry of information of France, or Saudi Arabia, or whoever their customers are?

Thomas: My thought about this is that the backstop for the pricing of vulnerabilities is ultimately the people who are actually using the vulnerabilities, right? That's not NSO, it's NSO's customers. So if you drive the prices up, the prices are just going to get passed through to those customers.

If those customers are price-sensitive, then that's valuable. If they're not price-sensitive, and I think there's good reason to believe they're not, then what you're really doing is just increasing the amount of revenue flowing through the system. And if part of NSO's business model is that they take a 30% cut of all the revenue flowing between the exploit developer and the ministry of information in France, then you're helping NSO by doing that.

Deirdre: Got it. Okay. So where do we think the break point is between the manufacturers (Apple, Google, Microsoft, whoever) who are trying to catch all of these extreme vulns beforehand, so they don't get out into the wild, and NSO basically having a less attractive product because they cannot buy these vulns as cheaply anymore, or they only have one, or they don't have a bunch of pretty great zero-days.

They only have one pretty great zero-day and a couple of other, single-digit days. Yes, they might be ripping off Saudi Arabia or the ministry of information in France, but their offering is less good. It's a less good attack, because Apple, Microsoft and Google are buying up the vulns and patching them.

Is there a balance?

David: I mean, I think it's just like any other market: there's supply and demand. The demand is probably fairly constant right now, in that intelligence agencies or governments are trying to use what they can use, and if stuff works well, they'll keep doing it.

If it doesn't, they'll stop. But the intelligence work is still going on, and the money is still there on that side. And then there's some number and cost associated with how many exploits can be found and at what rate. So I would believe that if you raise the bar, and you do some of the engineering changes that we discussed earlier, that could decrease the rate at which we find exploits in, say, iOS and iMessage, which then decreases the supply that NSO has, or drastically increases the price per exploit that NSO and other organizations need.

So I would expect that, as the security of the products increases, some of the market might shrink or consolidate. I don't know whether that's actually going to reduce it. It may be, as Tom said, that NSO group ends up making more money in that case, but maybe they have to be a little more diligent about who they target and when, or how many contracts they sign, or the rate at which they can do something.

Maybe it's the time-to-deliver in their contracts. If you've signed one of the recurring contracts with them, and you aren't paying per exploit, you're getting a service in which you say, hey, I have a person and you need to go get them. Maybe the number of people you can go after per year goes down, or maybe the cost of that service drastically goes up. Maybe NSO group survives and some of the smaller ones go out. I don't know. I assume that right now, because we're hearing about NSO so much, they're doing better than at least the other public-ish facing exploit groups. And I totally believe that they have bought out other ones, or their employees, and so on. Right.

Deirdre: It definitely...

David: We'll see more of that.

Deirdre: It definitely seems like they're not hurting, and they are actually expanding their clientele. And even though they say, oh, this is for catching terrorists and things like that, it's like, okay, you're selling it to Saudi Arabia and they're murdering journalists with it. But there seem to be indications that they are going further, to local, intra-state law enforcement as well, and that's a bit scary too.

I basically agree with everything you just said. I want the software to not be bad; I want it to be good. And if it's riddled with vulnerabilities, that is bad. And I would like whoever is making the software to pay whoever finds the vulnerabilities to hand them over, so that it will not be bad, it will be good software, as far as humanly possible within economic costs. So I kind of just want to pay for the vulns. And if that does make the software harder to exploit, even if you have to raise prices, if you're NSO group, to try to deliver a product, I am okay with that.

Thomas: I'm okay with it, right? Anything that you can do that makes vulnerabilities more expensive, I, for computer-science reasons, think is a good thing. I just think it's less a question of whether it's the right move for Apple to make, and more whether we're going to get what we expect when that happens.

So I guess I would put the question to both of you: do you think it's possible for Apple to price NSO, specifically NSO, out of this market? Do you think it's possible to do it? Is there a check that they can write that would do it?

Deirdre: For people submitting things to their bug bounty? I don't know about pricing them out, although they are a trillion-dollar company. They have many billions of cash in the bank, so it's not impossible. I mean, it's certainly possible. They could literally have a billion-dollar bug bounty and it would not hurt their business. I don't think NSO would really be able to compete with that, but I don't think they will.


Thomas: So you're a yes on that. Yeah.

Deirdre: So I think it is very possible, in this world, with this Apple, in the year of our Lord 2021, that they could do this. They probably will not do it, to really blow them out of the water.

But I think it's very possible to match them, and maybe go just in front of them, and keep even with them. And I think that would be much more viable from a business perspective. But I don't think they're there. I don't think a million dollars is the cap they need; I think they need to go a couple more million dollars to really compete, so that if you're trying to figure out where you want to give your exploit or your vuln, you have to be like: I really have to flip a coin between Apple and NSO group or whomever.

Then maybe values and things like that will actually come into play. I don't know. Maybe I'm too optimistic about some of these people who are selling their exploits. I don't think they're there yet, but I don't know. I am not in this market.

David: I don't think Apple could buy out that market. For starters, not everybody selling vulnerabilities is a US tech worker with plenty of options for contracting or for what they're doing with their life, right? If I am in a less economically stable country, and I don't have a way to get into the bug bounty for Apple, for whatever reason, or I won't be taken seriously, or I don't have a way to accept US dollars, or I don't have a bank account, but someone is willing to pay me a lot of money for this, or even not that much money, but they're here: great.

Or if I'm in the military of another country, right? It's not like someone working for the PLA can just decide that they want to go and sell to Apple instead. So there's just no way they could buy everything, no matter what their price was.

And again, beyond that, I think the thing that would really answer this question, what I would love to see, would be if NSO made a pitch deck to raise from VCs that described their market and their financials. That would be fascinating.

Obviously you shouldn't fund them, but that's the thing that would reveal this, and we don't have it. The closest we have are some of the talks from people who have been brokers, explaining what makes a good or a bad vulnerability, the difference between what they're looking for and what most bug bounties pay for, and what they expect the size of the supply side of the market to be, meaning the people who are good at finding these vulnerabilities. So yeah, I think you could decrease the supply. I think you could decrease the rate, and probably drive up NSO's prices. Would that make it go away? No, but I think you can certainly make the situation better than it is now, in a way that maybe still drives up NSO's prices, but maybe reduces the number of targets it can be used on. And even if all it does is drive up NSO's prices, and NSO makes more money, we're still probably in a better situation.

Thomas: What percentage of this problem do you think Apple could make go away if they just rewrote everything that was on the platform in Swift and Rust?

Deirdre: Apparently at least half of all of their vulns are memory-safety related. That doesn't mean half of all the exploits that NSO group would leverage in their toolkit or their payload, but it does mean half as many links in the chain for you to put together and do something with.

Yeah. So that's a good rule of thumb.

David: I'd go with 65%, and I'm not making that number up. Alex Gaynor has done some empirical research on, if you have a large code base, what percentage of the bugs you know about and the CVEs were caused by memory-unsafety issues. And it's basically 65%, and it holds across Android, iOS, Chrome, Windows, and Firefox's CSS subsystem. So if you can get rid of 65% of your bugs, that's going to have an impact on the supply.

Thomas: I'm more optimistic than both of you in this one weird case, right? I would imagine that for the kinds of exploits that NSO needs for the things they're trying to do, you could be north of 80%. If you get rid of all of the memory-safety vulnerabilities, or, more candidly, with the question I asked, if you rewrote everything in Swift and Rust, whatever memory-safety issues are left over, I still think you'd be down to a very small fraction of the bugs that NSO needs to provide the value they're providing now.

Deirdre: Eliminating all the memory-safety bugs does not include logic bugs, but, and correct me if I'm wrong, those seem less powerful than most of the memory-safety bugs, because they just give you a better platform to leverage into using the next one in your chain.

Why do you think it's 85%?

Thomas: I think it’s north of 80%.

Deirdre: Right, why do you think it's so high? It's not just one-to-one: if you remove 65% of the vulnerabilities, and the ones you removed are memory-safety ones, do you agree with my smell test that memory-safety vulnerabilities basically let you do more than general logic vulnerabilities do?

Thomas: I think, disproportionately, the vulnerabilities that give you code execution, which is what they need to do implants, are memory-safety vulnerabilities. And I think even more so for the vulnerabilities you need to do local privilege escalation, to go from inside the sandbox to outside the sandbox and into the kernel. Either somebody is going to correct me and point out a chain that doesn't involve a memory-safety vulnerability, or, I think, if you take those vulnerabilities off the table, those bug chains are all pretty much broken.

Deirdre: yeah, I think I agree.

David: Where else do you get code execution besides, like, memory unsafety? Right.

Thomas: You'll probably still have platform memory-safety vulnerabilities. Like, you'll be keeping a page-table reference counter somewhere, someplace where the interface between the Rust and the platform starts hitting, right? You'll still have vulnerabilities that are that low-level, and there's nothing the Rust runtime can really do about it.

I think you'll still have...

David: Writes to, gosh, what's it called, the memory addresses that are actually hardware, right? You can't write to those without unsafe. And

Thomas: Right. ARM doesn't yet have a borrow checker. Someday. But right now, yeah.

Deirdre: Even when you're using the unsafe things in Rust, there are still a ton of checks that give you a lot more safety than writing the same thing in C, but
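[Editor's aside: a minimal Rust sketch of the point Deirdre is making here, written for this transcript rather than taken from the episode. The `unsafe` keyword only unlocks a handful of specific operations, such as dereferencing a raw pointer; the rest of the language's checks stay on around it.]

```rust
// Sketch: `unsafe` in Rust is scoped, not a global off-switch.
// The hypothetical helper below needs `unsafe` only for the raw-pointer
// dereference itself; everything around it keeps Rust's normal checks.

/// Read the first byte of a slice through a raw pointer.
fn first_byte(bytes: &[u8]) -> Option<u8> {
    if bytes.is_empty() {
        return None; // callers still can't make us read past the buffer
    }
    // SAFETY: the slice is non-empty, so the pointer is valid for a
    // one-byte read. This dereference is the only `unsafe` operation.
    Some(unsafe { *bytes.as_ptr() })
}

fn main() {
    let data = vec![1u8, 2, 3];
    assert_eq!(first_byte(&data), Some(1));
    assert_eq!(first_byte(&[]), None);
    // Outside the `unsafe` block, checked access still applies: `get`
    // returns None instead of reading arbitrary memory, as C would.
    assert!(data.get(10).is_none());
    println!("ok");
}
```

In C, the empty-slice and out-of-bounds cases would be silent undefined behavior; here the `unsafe` region is one expression wide and everything else is still checked.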

David: So this is just 'you have to write it in Rust', and

Deirdre: Hey, hey.

David: to experiment.

I mean, I like Rust too, but like

Deirdre: What do you mean by...

David: By that, the example I always give is: I was once working on some build tooling, and in five minutes in Python I got farther than in three days of Rust, just trying to map out a graph of dependencies and do some stuff, because the code runs even when it doesn't compile.

And I don't need to know what I'm doing ahead of time.
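[Editor's aside: a hypothetical illustration of the restructuring David alludes to, not his actual tooling. Pointer-heavy graph code from Python or C doesn't port directly to Rust; the idiomatic workaround is to store a dependency graph as indices into a `Vec` instead of references, which is exactly the kind of up-front redesign that makes the first three days slow.]

```rust
// Sketch: a dependency DAG stored as indices into a Vec, the common
// Rust pattern that sidesteps self-referential pointer graphs.
struct Graph {
    names: Vec<&'static str>,
    deps: Vec<Vec<usize>>, // deps[i] = indices node i depends on
}

impl Graph {
    /// Post-order walk: emit each node after all of its dependencies.
    fn build_order(&self) -> Vec<&'static str> {
        let mut seen = vec![false; self.names.len()];
        let mut order = Vec::new();
        for i in 0..self.names.len() {
            self.visit(i, &mut seen, &mut order);
        }
        order
    }

    fn visit(&self, i: usize, seen: &mut Vec<bool>, order: &mut Vec<&'static str>) {
        if seen[i] {
            return;
        }
        seen[i] = true;
        for &d in &self.deps[i] {
            self.visit(d, seen, order);
        }
        order.push(self.names[i]);
    }
}

fn main() {
    // Hypothetical build graph: app depends on lib, lib depends on std.
    let g = Graph {
        names: vec!["app", "lib", "std"],
        deps: vec![vec![1], vec![2], vec![]],
    };
    // Dependencies come out before their dependents.
    assert_eq!(g.build_order(), vec!["std", "lib", "app"]);
    println!("ok");
}
```

The index-based layout keeps the borrow checker happy, but arriving at it is the "think about your problem in a completely new way" step; in Python you'd just wire up object references and run it.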

Deirdre: Okay. All right. All right.

Cool. So that seems to circle back to what we were talking about at the beginning, which is that, quote-unquote, 'rewriting it all' in Swift slash Rust or another memory-safe language would give you extreme bang for your buck in reducing the number of vulnerabilities, and the most impactful vulnerabilities, at least for getting privilege escalation and code execution, period.

So should they just do it? One, I think they should rewrite it all in Rust or Zig or Swift, if you're Apple. Two, do we have any signals that they're doing it? It's Apple; it's like a locked box, and they threaten your children if you leak anything.

I haven't seen signals externally about them trying to rewrite some of these most vulnerable parts in Swift. They have this sandbox that's written in Swift around iMessage. But, yeah, if you're the head of platform security for Apple, why aren't you just doing this right now?

Maybe they are, but I don’t think they are?

Thomas: You know, you said it's a lot of bang for your buck, and I don't know what the ratio is there, so I don't know the amount of bang, but it's a lot of bucks that you're talking about. Because, again, it's root and branch: everything there is pretty much getting torn out. So there's that problem. There was another thing I was going to say, and I forgot what I was going to say.

Deirdre: that’s fine.


David: My hot take has always been that the way to improve the security of an organization is to become the VP of engineering, which I think I'm also stealing from Alex Gaynor; he might've said that first. But you need to know something about engineering, right, and be able to actually do that job. And again, this is a tractable engineering problem. It's just very hard because it's very large, and even if you have an unlimited budget, it's still hard.

Thomas: I think it's also the solution space for this problem. Right now it's easy to look at the landscape and say, just rewrite everything in Swift and Rust, but how long have those languages been viable for doing large-scale systems work? It's been a little while, but it hasn't been ten years, wherein they could have literally rewritten everything in Rust.

And also, when you tell Apple that they're rewriting everything in Rust, we imagine that you're telling Alex Gaynor and his acolytes to rewrite things in Rust, but actually you're addressing the entire population of application programmers. And as somebody who has recently gone from another language to Rust, the learning curve there is not gentle.

Deirdre: I forget that they're written in Objective-C; the applications themselves are written in Objective-C. Whereas on Android, at least, they're all written in Java, or the Android flavor of Java.

David: The memory safety of C and the blazing speed of Smalltalk.

Thomas: Okay. Yeah. I think the post that Matt Green wrote mentioned that they should be using things like pointer authentication, and they do. But Objective-C actually gives you vectors that work around PAC. There are probably ways in which Objective-C is substantially worse than C from a security perspective.

Deirdre: Can you tell me more? I don't really know any details about Objective-C. I've never written it.

Thomas: I think you just get a whole bunch of pointers, like a bunch of function pointers, that are easy to target and overwrite, that you don't get in a standard C runtime.

There was an ISA pointer or something like that, that was reliably there: you can find it and send messages through it.

David: Yeah, it has a built-in message-passing and object framework that is all just pointer-based, and that's on everything.

Thomas: There was a PAC bypass that involved Objective-C runtime stuff.

David: This is why the fact that rewriting things in Rust requires you to think about things in a completely new way, and restructure your problem as basically a DAG of data transformations, is more difficult in certain cases than others. Compare that to something like Zig, which does not offer quite the guarantees Rust does, but is still way, way safer, and can compile with complete safety by just inserting enough checks, right? More innovation in the programming-language space, even beyond Rust, is probably going to result in languages more suitable for some of these things. Not that you can't do all of this from scratch in Rust; I think that's the whole value proposition of the Oxide Computer Company. But starting from scratch and replacing bits are, you know, two different tasks.

Deirdre: That's true. And I wouldn't be surprised if Apple, with their own programming language, would go with Swift before they go with Rust, at least for a lot of pieces. Yeah.

Okay. Facts about bears. Questions about meats.

Thomas: My bear fact for the day is: I recommend everybody read Stephen Herrero's 'Bear Attacks: Their Causes and Avoidance'. Manchester Boesky said it was one of his favorite books, and I'm reading it. My alternate title for it is 'When You Are Engulfed in Bears', and it is fantastic. And I will come armed with more bear facts further into this thing that we're trying to do here.

Deirdre: Awesome. I am pulling it up now. This is a great book.

Thomas: It is what it says on the label. It's just bear attacks: stories about bear attacks and how they happened.

David: The book behind the movie '127 Hours', I think it's called 'Between a Rock and a Hard Place', is about the guy who had to cut his arm off because it got stuck under a boulder or something when he was rock climbing. He has other stories in there, including one where a bear followed him for about 24 hours, and I believe he did what you are not supposed to do and just yelled at the bear. But if you want even more bear stories.

Thomas: If the bear is trying to eat you, and it is a grizzly bear, then per the bear flowchart you're supposed to yell. If the

Deirdre: A flow chart?

Thomas: There is. But if the bear is not trying to eat you, which is the much more common case, and it's a grizzly bear, then you are not supposed to scare it or make noises, because it will attack you until you stop making noises. That is what I now know about bears.

Deirdre: Wow. This is handy, useful information. Not just random facts: useful.

David: I was in northern Michigan recently, and I had heard that in the place where I was, there had been some bear sightings nearby. And I was going for a run one morning. When I go for runs, I usually don't wear my glasses, because they get loose and they bounce around, and I don't have contacts because I find them gross.



David: And so I was running back, going down this dirt road, and I just see this black blob.

And I'm like, I don't know if this is a bear or a dog. It turns out it was a dog.

But if you're worried about bears, bring your glasses with you. That's the other piece of advice.

Thomas: That is not a good bear story. But, on that note,