by ruik on 12/27/23, 3:47 PM with 399 comments
by sweis on 12/27/23, 9:53 PM
by mike_hearn on 12/27/23, 6:41 PM
I was willing to believe that maybe it was just a massive NSA-scale research team up until the part with a custom hash function sbox. Apple appears to have known that the feature in question was dangerous, and deliberately both hid it, whatever it is, and then went further and protected it with a sort of (fairly weak) digital signing feature.
As the blog post points out, there's no obvious way you could find the right magic knock to operate this feature short of doing a full silicon teardown and reverse engineering (impractical at these nodes). That leaves hacking the developers to steal their internal documentation.
The way it uses a long chain of high-effort zero-days only to launch an invisible Safari that then starts from scratch, loading a web page that uses a completely different chain of exploits to re-hack the device, is also indicative of a massive organization with truly abysmal levels of internal siloing.
Given that the researchers in question are Russians at Kaspersky, this pretty much has to be the work of the NSA or maybe GCHQ.
Edit: misc other interesting bits from the talk: the malware can enable ad tracking, and can also detect the cloud iPhone hosting services that are often used by security researchers. The iOS/macOS malware platform seems to have been in development for over a decade and actually runs ML on the device to do object recognition and OCR on photos, to avoid uploading image bytes: they only upload the ML-generated labels. They truly went to a lot of effort, but all that was no match for a bunch of smart Russian students.
I'm not sure I agree with the speaker that security through obscurity doesn't work, however. This platform has been in the wild for ten years, and nobody knows how long they've been exploiting this hidden hardware "feature". If the hardware feature had been openly documented, it would have been found much, much sooner.
by DantesKite on 12/27/23, 9:50 PM
“This iMessage exploit is crazy. TrueType vulnerability that has existed since the 90s, 2 kernel exploits, a browser exploit, and an undocumented hardware feature that was not used in shipped software”
https://x.com/sweis/status/1740092722487361809?s=46&t=E3U2EI...
by Muehe on 12/27/23, 6:38 PM
https://streaming.media.ccc.de/37c3/relive/a91c6e01-49cf-422...
(talk starts at minute 26:20)
by cf1241290841 on 12/28/23, 12:55 AM
According to him¹, the exploit chain was likely worth somewhere in the region of an 8-digit dollar figure.
¹ https://en.wikipedia.org/wiki/Felix_von_Leitner
I guess somebody is going to get fired.
by londons_explore on 12/27/23, 10:42 PM
Even though no public documentation exists, I'm sure thousands of Apple engineers have access to a modded gdb or other tooling to make use of it.
by transpute on 12/27/23, 7:04 PM
For Wi-Fi–only devices, the Messages app is hidden.
For devices with Wi-Fi and cellular, the Messages app is still available, but only the SMS/MMS service can be used.
SMS/MMS messages and non-emergency cellular radio traffic can be disabled by a SIM PIN, e.g. when using the device for an extended period via WiFi.

by londons_explore on 12/27/23, 9:26 PM
And for a single-bit input, the hash value is a single entry from the sbox table. That means this hash algorithm could reasonably have been reverse engineered without internal documentation.
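To illustrate the point: if the hash is linear over GF(2), as sbox-per-bit hashes typically are, hashing each single-bit input returns exactly one table entry, so the whole table leaks. This is a toy model (the sbox values and the 64-bit width are invented for illustration, not Apple's actual design):

```python
import random

random.seed(0)
NBITS = 64  # hypothetical width of the data word being hashed
SBOX = [random.getrandbits(20) for _ in range(NBITS)]  # toy 20-bit sbox table

def sbox_hash(data: int) -> int:
    """Toy linear hash: XOR together the sbox entry for every set bit."""
    h = 0
    for i in range(NBITS):
        if (data >> i) & 1:
            h ^= SBOX[i]
    return h

# Because the hash is linear, hashing a single-bit value returns exactly
# one table entry, so probing one bit at a time recovers the whole table:
recovered = [sbox_hash(1 << i) for i in range(NBITS)]
assert recovered == SBOX
```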
by londons_explore on 12/27/23, 8:17 PM
Mere differences in timing could have indicated that the address was valid, and then the hash could perhaps have been brute-forced too, since it is effectively a 20-bit hash.
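The brute-force claim is easy to sanity-check: a 20-bit hash has only 2^20 (about a million) possible values, so an attacker who can test guesses one at a time exhausts the whole space almost instantly. A minimal sketch (the target value and the accept oracle are placeholders, standing in for "does the hardware accept this write?"):

```python
# A 20-bit hash space has only 2**20 = 1,048,576 candidates.
TARGET = 0xABCDE  # hypothetical correct hash for some (address, data) pair

def accepted(guess: int) -> bool:
    # Stand-in for observing whether the hardware applies the write.
    return guess == TARGET

# Enumerate the entire space; worst case is ~1M cheap trials.
found = next(g for g in range(1 << 20) if accepted(g))
assert found == TARGET
```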
by soupdiver on 12/27/23, 6:34 PM
by WalterBright on 12/27/23, 6:02 PM
by throwaway81523 on 12/28/23, 4:19 AM
by stefan_ on 12/27/23, 7:33 PM
by trustingtrust on 12/28/23, 3:34 AM
The latter works when you are not as big as Apple. When you are as big as Apple, you are a very hot target for attackers. There is always effort vs. reward when it comes to exploiting vulnerabilities: the amount of effort that goes into all this is worth thousands of dollars even if someone is doing it just for research. If I were doing this for some random AliExpress board it would be worth nothing, and security by obscurity would probably mean no one really cares, so the latter part works there. But I wonder what Apple is thinking when they rely on obscurity, because people must start working on exploiting new hardware from day 1. You can literally get one on every corner in a city these days.

Hardware security by obscurity would be fine for, say, cards sold by someone like Nvidia to only a few cloud customers, which are assumed obsolete in a few years, so even if someone gets those on eBay the reward is very low. iPhones, on the other hand, are very much a consumer device, and people hang on to them for a very long time.
by I_Am_Nous on 12/27/23, 8:51 PM
Reminder to reboot your iPhone at least weekly if you are concerned about this kind of attack.
by kevinwang on 12/27/23, 8:20 PM
by codedokode on 12/27/23, 10:53 PM
by kristofferR on 12/28/23, 12:58 PM
by luke-stanley on 12/27/23, 10:10 PM
by neilv on 12/27/23, 6:42 PM
Did the systems software developers know about these registers?
by amai on 12/28/23, 3:58 PM
by Despegar on 12/27/23, 7:56 PM
by mb4nck on 12/27/23, 9:05 PM
by patrickhogan1 on 12/27/23, 7:36 PM
by jeffreygoesto on 12/27/23, 7:16 PM
by xvector on 12/27/23, 7:24 PM
by g-b-r on 12/28/23, 2:41 AM
Since they're supposed to be disabled in production, what would be their point?
I'm no electronic engineer, but isn't it best for them to be fast and simple, to reduce the chance that they cause interference themselves..?
And isn't it highly unlikely that an attacker in the supply chain (TSMC??) would be able to reliably plant this in all Apple chips from the A12 to the A16 and the M1?
by anotherhue on 12/27/23, 6:02 PM
by Luc on 12/28/23, 1:41 PM
From https://securelist.com/triangulation-validators-modules/1108...
by MagicMoonlight on 12/29/23, 8:56 AM
by dang on 12/28/23, 3:43 AM
4-year campaign backdoored iPhones using advanced exploit - https://news.ycombinator.com/item?id=38784073
(We moved the comments hither, but the article might still be of interest)
by apienx on 12/27/23, 10:52 PM
by LanzVonL on 12/27/23, 8:31 PM
by Liebnitz on 12/31/23, 12:41 AM
by Liebnitz on 12/31/23, 12:32 AM
by guwop on 12/27/23, 8:56 PM
by cf1241290841 on 12/28/23, 1:19 AM
Told you so.
edit: The fact that this obvious statement gets upvoted above the Apple backdoor at 22:40 in the talk also says a lot.
edit1: https://imgur.com/a/82JV7I9
by hcarrega on 12/27/23, 6:23 PM
by cedws on 12/27/23, 8:29 PM
This is getting ridiculous. How many iMessage exploits have there now been via attachments? Why aren't Apple locking down the available codecs? Why isn't BlastDoor doing its job?
This is really disappointing to see time and time again. If a simple app to send and receive messages is this hard to get right, I have very little hope left for software.
by pushcx on 12/27/23, 5:46 PM
by kornhole on 12/27/23, 6:02 PM
by nothercastle on 12/27/23, 5:38 PM
by haecceity on 12/27/23, 11:22 PM
by ThinkBeat on 12/27/23, 8:54 PM
They have the best possible insight into the hardware and software at all stages I should think.
by jacooper on 12/27/23, 10:20 PM
by codedokode on 12/28/23, 11:21 AM
by hnburnsy on 12/27/23, 5:54 PM
Why isn't Apple detecting the spyware/malware payload? If only apps approved by Apple are allowed on an iPhone, detection should be trivial.
And why has no one bothered to ask Apple or ARM about this 'unknown hardware'?
>If we try to describe this feature and how the attackers took advantage of it, it all comes down to this: they are able to write data to a certain physical address while bypassing the hardware-based memory protection by writing the data, destination address, and data hash to unknown hardware registers of the chip unused by the firmware.
And finally does Lockdown mode mitigate any of this?
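The quoted mechanism can be modeled in a few lines: a block of "hardware" that applies a write to protected memory only when the caller supplies a matching hash alongside the address and data. Everything here (register layout, the hash mixer) is invented for illustration and reflects nothing of Apple's actual design, but it shows why guessing the interface blind is hopeless:

```python
def toy_hash(addr: int, data: int) -> int:
    # Arbitrary 20-bit mixer; the real sbox-based hash was undocumented.
    return (addr * 0x9E3779B1 ^ data) & 0xFFFFF

class HiddenWriteBlock:
    """Toy model of the hidden MMIO feature described in the quote."""

    def __init__(self):
        self.memory = {}  # stands in for protected physical memory

    def write(self, addr: int, data: int, supplied_hash: int) -> bool:
        if supplied_hash != toy_hash(addr, data):
            return False          # wrong "knock": write silently dropped
        self.memory[addr] = data  # bypasses normal memory protection
        return True

hw = HiddenWriteBlock()
assert not hw.write(0x1000, 0x41, 0)                   # bad hash, rejected
assert hw.write(0x1000, 0x41, toy_hash(0x1000, 0x41))  # correct hash, applied
assert hw.memory[0x1000] == 0x41
```

Without knowing `toy_hash`, an attacker sees only silently dropped writes, which is exactly why the researchers concluded the attackers must have had inside knowledge.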
by WhackyIdeas on 12/27/23, 6:31 PM
Apple, like any other US company, has to abide by the law and do what it is told. If that means hardware backdoors, software backdoors, or giving the NSA a heads-up about a vulnerability during the time it takes to fix it (so the NSA can make good use of it in the meantime), then they will.
Only someone with great sway (like Jobs) could have resisted something like this without fear of the US Govt coming after him. His successor either didn’t have that passion for privacy or the courage to resist working with the NSA.
Anyone, anywhere with an iPhone will be vulnerable to NSA being able to break into their phone anytime they please, thanks to Apple. And with Apple now making their own silicon, the hardware itself will be even more of a backdoor.
Almost every single staff member at Apple will obviously be none the wiser about this, and unable to do anything about it even if they were - and their phones will be just as fair game to tap whenever the spies want.
I am speculating. But in my mind, it’s really quite obvious. Just like how Prism made me win an argument I had with someone who was a die hard Apple fan and thought they would protect privacy at all costs… 6 months later, Snowden came along and won me that argument.