by rquantz on 2/24/16, 11:34 PM with 401 comments
by tptacek on 2/24/16, 11:40 PM
To me, the more profound consideration is this: if you use a strong alphanumeric passcode, there has been nothing Apple could do for many years to unlock your phone. The AES-XTS key that protects data on the device is derived from your passcode via PBKDF2. These devices were already fenced off from the DOJ, as long as their operators were savvy about opsec.
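The derivation described here can be sketched with Python's standard library. The salt, iteration count, and key length below are illustrative stand-ins (a real iPhone also tangles the passcode with a hardware UID key that never leaves the chip, which isn't modeled here):

```python
import hashlib

# Illustrative stand-ins -- a real device uses a per-device salt and an
# iteration count calibrated to its hardware.
passcode = b"correct horse battery staple"
device_salt = bytes(16)
iterations = 100_000

# Derive a 256-bit key. This is why a strong passcode matters: every
# guess costs the attacker the full PBKDF2 work factor.
key = hashlib.pbkdf2_hmac("sha256", passcode, device_salt, iterations, dklen=32)
print(key.hex())
```

The point of the iteration count is to make offline guessing expensive; the point of the device-bound salt (and, on real hardware, the UID key) is to force that guessing to happen on the device itself.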
by cromwellian on 2/25/16, 12:10 AM
I mean, we're talking about threat models where chip-level doping has been shown as an attack. This just seems to be a variation on the tamper-resistant copy-protection dongles we've had forever: someone builds a secure system premised on a secret held in a tiny tamper-resistant component, and the tamper resistance is eventually cracked.
It might be that you don't even need to exfiltrate the UID from the Secure Enclave; what the FBI needs is to test a large number of PIN codes without triggering the backoff timer or the wipe. But the wipe mechanism and backoff timer run in the application processor, not in the enclave, and so they are susceptible to cracking attacks the same way many copy-protection techniques are.
You may not need to crack the OS, or even upload new firmware. You just need to disable the mechanism that wipes the device and throttles wrong guesses. For example, if you can manage to corrupt or patch the part of the system that does that, you can try thousands of PINs without worrying about triggering the timer or wipe, and without needing to upload a whole new firmware.
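The arithmetic behind this is stark: with the backoff and wipe out of the way, a 4-digit PIN space is only 10,000 candidates. A toy sketch, where `try_pin` is a stand-in for the real unlock attempt:

```python
from itertools import product

def try_pin(pin: str) -> bool:
    # Stand-in for the real unlock check; only the search-space size matters.
    return pin == "7394"

found = None
# 10,000 candidates for a 4-digit PIN: seconds of work once the
# backoff timer and wipe counter are out of the way.
for digits in product("0123456789", repeat=4):
    pin = "".join(digits)
    if try_pin(pin):
        found = pin
        break
print(found)
```

This is why the escalating delay and the 10-try wipe are the entire defense for short numeric passcodes: the key space itself offers essentially none.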
I used to crack disk protection on the Commodore 64, and no matter how sophisticated the mechanism, all I really needed to do was figure out one memory location to insert a NOP into, or change a BNE/BEQ branch destination, and I was done. Cracking often came down to mutating 1 or 2 bytes in the whole system.
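The one-byte patch described above can be illustrated in miniature. The opcodes are real 6502 values (BNE = 0xD0, NOP = 0xEA), but the surrounding code fragment is invented for the example:

```python
# Toy model of the classic crack: a copy-protection check that branches
# to a failure handler on a bad flag.
code = bytearray([0xA5, 0x10,   # LDA $10  ; load protection flag
                  0xD0, 0x05,   # BNE +5   ; branch to "bad disk" handler
                  0xEA])        # ...rest of the loader

# The crack: overwrite the branch with two NOPs so execution falls
# straight through and the protection never fires.
code[2] = 0xEA
code[3] = 0xEA
print(code.hex())
```

The analogy to the iPhone case: the attacker doesn't need to understand the whole system, only to locate and neutralize the single conditional that enforces the policy.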
(BTW, why the downvote? If you think I'm wrong, post a rebuttal)
by geertj on 2/25/16, 1:26 PM
The cynical side of me says that Apple's marketing tactics have worked. But I've got a feeling, heck, I want to believe, that this is actually driven by company values and not a short-term marketing benefit.
by JustSomeNobody on 2/25/16, 1:50 AM
by abalone on 2/25/16, 12:19 AM
The reason iCloud data will always be accessible by Apple, and thus governments, is not because Apple wants to make it accessible to governments. It's so that Apple can offer customers the very important feature of accessing their own data if they forget or otherwise don't have the password. That is an essential feature, and why this aspect will never change.
When someone passes away, for example, it would be a terrible compounding tragedy if all their photos from their whole life passed away along with them, because they didn't tell anyone their password or where they kept the backup key. So Apple wants and needs to provide an alternative way to recover the account. (For example, they will provide access to a deceased person's account if their spouse can obtain a court order proving the death and relationship.)
Harvard recently published a paper (called "Don't Panic") that essentially states the same thing.[1] Governments shouldn't "panic" because in most cases consumers will not be exclusively using unbreakable encryption, because it has tradeoffs that aren't always desirable.
And the reason most consumers should be backing up to iCloud is similar: that's how you prevent the tragedy of losing your data if you lose your phone.
Just something to keep in mind when discussing the "going dark" and "unhackable" news items.
It is worth noting however that people who do "have something to hide" from governments probably won't be using iCloud, if they know what they're doing. Then again if they know what they're doing, they wouldn't use anything that is backdoored anyway. So the naive criminals will still probably be hackable, and that's about all we can hope for.
[1] https://cyber.law.harvard.edu/pubrelease/dont-panic/Dont_Pan...
by kazinator on 2/25/16, 3:33 PM
Since 197X, people have had home computers (and institutional computers for two decades before that) on which the FBI could install anything it wanted, if that equipment fell into its hands. This fact never made news headlines; it was taken for granted that the computer is basically the digital equivalent of a piece of stationery written in pencil.
There is nothing wrong with that situation, and on such equipment, you can secure your data just fine.
No machine can be trusted once it has been under someone else's physical control. Here is a proof: if I get my hands on your device, I can replace it with a physically identical device which looks exactly like yours but is actually a man-in-the-middle (MITM). (I can put the fake device's board into your original plastic and glass, so it will have the same scratches, wear, grime pattern, and whatever other markings distinguish the device as yours.) My fake device will collect the credentials you enter; those are immediately sent to me, and I play them against the real device to get in.
Apple are trying to portray themselves as a champion of security, making clueless users believe that the security of a device rests in the manufacturer's hands. This could all be in collaboration with the FBI, for all we know. Two versions of Big Brother are playing the "good guy/bad guy" routine, so you would trust the good guy, who is basically just one of the faces of the same thing.
by n0us on 2/24/16, 11:47 PM
I'm not well versed in security, so excuse my ignorance, but what if there were a way to solder a chip onto the board that allows access to the secure enclave? Every time an iPhone is made, a companion chip is produced that contains some kind of access key which only works for that device, and someone is required to foot the bill for storing them.
by alfiedotwtf on 2/25/16, 12:51 AM
- George Orwell, 1984
- Apple, 2016
by Evolved on 2/25/16, 5:08 PM
@Udik: I could just keep my tax documents in printed plaintext on top of my dresser but I opt to keep them locked up. Privacy and security are important. If people who utilize privacy/security tools are up to no good then why does the U.S. Gov't have a clause for not revealing information due to State Secrets? Why do we set our Facebook profiles to private? Why have passwords at all on anything? Are you beginning to see the point?
by condour75 on 2/24/16, 11:47 PM
by drcode on 2/25/16, 3:51 AM
by nickpsecurity on 2/25/16, 12:34 AM
https://news.ycombinator.com/item?id=10906999
Apple is far from having a secure phone right now. The NSA certainly has ways to bypass this, based on my attack framework and their prior work. They just don't want them to be known. They pulled the same stuff in the past, where the FBI talked about how they couldn't beat iPhones, but the NSA had them in the leaks and was parallel-constructing cases for the FBI. So the current crop are probably compromised, but reserved for targets worth the risk.
That said, modifying CPU to enable memory + I/O safety, restricting baseband, an isolation flow for hardware, and some software changes could make a system where 0-days were rare enough to be worth much more. Oh yeah, they'll have to remove the debugging crap out of their chips and add TEMPEST shielding. Good luck getting either of those two done. ;)
by ianamartin on 2/25/16, 2:25 AM
I want it all to go when I do. Hell, I want some of it to go now.
After I'm gone, I want to leave no part of my existence on the internet.
I realize that's not possible. But I want to minimize my footprint.
It is totally possible for a local device. I have a deadswitch on all my computers. If I don't log in and set an alive flag via the command line on any of my computers for more than a week, that computer securely wipes itself.
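A minimal sketch of such a deadswitch, assuming a flag file whose modification time records the last check-in; the wipe itself is left as a placeholder (a real one would destroy the disk-encryption key rather than overwrite files):

```python
import os
import time

ALIVE_FLAG = os.path.expanduser("~/.alive")  # touched on each check-in
MAX_AGE = 7 * 24 * 3600                      # one week, in seconds

def check_in() -> None:
    # Run from the command line to reset the clock.
    open(ALIVE_FLAG, "a").close()
    os.utime(ALIVE_FLAG, None)

def deadswitch() -> bool:
    # Run periodically (e.g. from cron); True means "wipe now".
    try:
        age = time.time() - os.path.getmtime(ALIVE_FLAG)
    except FileNotFoundError:
        age = float("inf")  # never checked in: treat as stale
    return age > MAX_AGE
```

Scheduling `deadswitch` from cron and wiring its `True` result to an actual key-destruction step is the part that varies per system.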
Let it be known, I have nothing to hide. I just think this is the best way to do things.
Edit: My reason for this is the frequency with which I encounter people who are no longer alive. It's a harsh thing to look at a link to something someone said, someone you used to know, and then suddenly realize, "Oh shit. He's dead. And I used to be his best friend."
I know facebook has memorial pages, but those are difficult to get.
by wahsd on 2/24/16, 11:42 PM
What encryption and security really does is create scarcity of access to information and data in order to force a market solution where government groups have to prioritize their efforts and apply them deliberately.
by studentrob on 2/25/16, 12:31 AM
The only reason previous wiretapping laws were passed is because they weren't in the limelight and the public never had a chance to weigh in. Let's make this an election issue
by zobzu on 2/25/16, 6:25 AM
Nothing is 100% crack-proof, and crypto certainly isn't. It goes from child's play to "you actually need the knowledge" to "this is actually hard now" (but not impossible).
by jarcoal on 2/24/16, 11:47 PM
by jarjoura on 2/25/16, 2:54 AM
It's such a grey area and I will probably get down voted for commenting this way. I 100% agree that the power, in the wrong hands, is horrible, but can't we talk about this in a way where there's some kind of middle ground? All I've been reading are either extremes.
by drdrey on 2/25/16, 1:13 AM
I was a bit surprised by the clickbait-y nature of the HN title, but we can see in the nytimes URL that "Apple Is Said to Be Working on an iPhone Even It Can't Hack" was the original title.
by draw_down on 2/24/16, 11:44 PM
by awqrre on 2/24/16, 11:55 PM
1. http://www.pcgamer.com/john-mcafee-on-his-fbi-iphone-hack-of...
2. http://arstechnica.com/staff/2016/02/mcafee-will-break-iphon...
edit: added source #2; see Google for additional sources...
by blinkingled on 2/25/16, 2:09 AM
It's not an attainable goal in practice. Today they generate a per-device customized update that can be installed without user intervention. Even if tomorrow they enforce user intervention, they still retain the capability to push a targeted update to a specific device on a law-enforcement or court order. The user has no way of telling what the update did.
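Roughly, per-device update personalization works by binding the signed blob to a device identifier, so a blob produced for one phone won't validate on another. A toy sketch using HMAC as a stand-in for Apple's real (asymmetric) signing scheme; the key and identifier values are invented:

```python
import hashlib
import hmac

# Stand-in for Apple's private signing key (the real scheme uses
# asymmetric signatures, not a shared secret).
SIGNING_KEY = b"stand-in-for-apple-signing-key"

def personalize(firmware_hash: bytes, device_id: bytes) -> bytes:
    # Server side: the signature covers the image AND the device ID,
    # so the blob can't be replayed onto a different phone.
    return hmac.new(SIGNING_KEY, firmware_hash + device_id, hashlib.sha256).digest()

def device_accepts(firmware_hash: bytes, device_id: bytes, sig: bytes) -> bool:
    # Device side: recompute and compare in constant time.
    return hmac.compare_digest(sig, personalize(firmware_hash, device_id))

sig = personalize(b"fw-v9.3", b"DEVICE-AAAA")
print(device_accepts(b"fw-v9.3", b"DEVICE-AAAA", sig))  # accepted on the right device
print(device_accepts(b"fw-v9.3", b"DEVICE-BBBB", sig))  # rejected elsewhere
```

The comment's point follows directly: the same machinery that stops replay attacks is exactly what lets the signer produce a valid blob for any single chosen device.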
by zekevermillion on 2/25/16, 12:57 AM
by parkej60 on 2/25/16, 4:00 PM
Full disclosure: I understand this was a person's work phone. This statement is posted solely to stimulate theoretical discussion.
by wantreprenr007 on 2/25/16, 9:45 AM
(Somehow, I feel iMessage and related apps are MITMable because there is no mandatory, mutual, out-of-band validation of a recipient's identity.)
by malandrew on 2/25/16, 9:26 AM
by beshrkayali on 2/25/16, 2:09 PM
by nxzero on 2/25/16, 1:56 AM
by bunkydoo on 2/25/16, 4:20 AM
by riquito on 2/25/16, 1:14 AM
(of course if the phone is not in use anymore it doesn't apply)
by gaia on 2/25/16, 10:27 PM
If the software (Android) had the same type of protection (if the wrong PIN is entered 10 times it destroys the key), would this device be at par with the iOS approach?
by Aoyagi on 2/25/16, 9:17 AM
by joezydeco on 2/25/16, 12:16 AM
If Apple can't launch new iOS versions, can they still launch new iPhones?
by Gratsby on 2/25/16, 8:47 AM
by jokoon on 2/25/16, 1:30 PM
by frb on 2/25/16, 1:26 PM
by morninj on 2/24/16, 11:58 PM
by emodendroket on 2/25/16, 2:34 PM
by alexnewman on 2/25/16, 2:49 PM
by tempodox on 2/25/16, 1:45 PM
by ADRIANFR on 2/25/16, 4:18 AM
by pmarreck on 2/25/16, 2:37 AM
by joering2 on 2/25/16, 3:04 AM
Had he lost to the DOJ, here is what would (might) have happened:
- he would gladly have unlocked this phone and billed the DOJ for the time spent redesigning iOS
- going forward, he would label each phone's box in red letters: CONTAINS GOVERNMENT-REQUIRED BACKDOOR (I doubt the government could forbid him from doing that)
- he would then stop selling devices in Apple stores directly and only allow ordering them with direct home delivery from an Apple website hosted and operated outside the USA
- all the shipping would be done directly from China, bypassing the US tax system altogether
- shortly after, he would remove the backdoored iOS from devices not sold directly on US soil
That would be a big fat middle finger to the DOJ.