by serhack_ on 6/10/24, 9:53 PM with 373 comments
by hexage1814 on 6/11/24, 5:41 AM
https://www.devever.net/~hl/webcrypto
And to be fair, this doesn't apply only to this case. Even the data you have stored locally, Apple could access if they wanted to; they certainly have the power to do it if they so wish or if ordered to by the government. They might have done it already and just not told anyone, for obvious reasons. So I would argue the best you could say is that it's private in the sense that only Apple knows/can know what you are doing, rather than a larger number of entities.
Which, you could argue, is a win when the alternatives will leak your data to many more parties... But it's still far from being the unbreakable cryptography it's portrayed to be.
by loteck on 6/10/24, 11:55 PM
(I wonder if Matt realizes nobody can read his tweets without an X account? Use BlueSky or Masto, man)
Edit: here's his thread combined https://threadreaderapp.com/thread/1800291897245835616.html?...
by zmmmmm on 6/11/24, 3:56 AM
I get that there's a benefit to what they are doing. But the problem with selling a message of trust is that you absolutely have to be 100% truthful about it, and their failure to be transparent that people's data is still subject to access like this poisons the larger message they are selling.
by kfreds on 6/11/24, 8:52 AM
Apple's Private Cloud Compute seems to be conceptually equivalent to System Transparency - an open-source software project my colleagues and I started six years ago.
I'm very much looking forward to more technical details. Should anyone at Apple see this, please feel free to reach out to me at stromberg@mullvad.net. I'd be more than happy to discuss our design, your design, and/or give you feedback.
Relevant links:
- https://mullvad.net/en/blog/system-transparency-future
- http://system-transparency.org (somewhat outdated)
by Shank on 6/10/24, 11:25 PM
Yes!
> Software will be published within 90 days of inclusion in the log, or after relevant software updates are available, whichever is sooner.
I think this theoretically leaves a 90-day maximum gap between publishing vulnerable software and the potential for discovery. I sincerely hope that the actual availability of images is closer to instant than to that maximum, though.
by ein0p on 6/11/24, 1:52 AM
by zer00eyz on 6/11/24, 12:16 AM
Who is this for? Don't get me wrong, I think it's a great effort. This is some A+ nerd stuff right here. It's speaking my language.
But I'm just going to figure out how to turn off "calls home", because I don't want it doing this at all.
Is this speaking to me so I tell others "Apple is the most secure option"? I don't want to tell others "Linux" because I don't want to do tech support for that.
At this point I feel like an old man shouting "Damn you, keep your hands off my data".
by WatchDog on 6/11/24, 4:44 AM
The main difference seems to be verifiability down to the firmware level.
Nitro Enclaves does not provide measurements of the firmware[0] or the hypervisor; furthermore, they state that the hypervisor code can be updated transparently at any time[1].
Apple is going to provide images of the Secure Enclave Processor operating system (sepOS), as well as the bootloader.
It also sounds like they will provide the source code for these components too, although the blog post isn't clear on that.
[0]: https://docs.aws.amazon.com/enclaves/latest/user/set-up-atte....
[1]: https://docs.aws.amazon.com/pdfs/whitepapers/latest/security...
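To make that "verifiability down to the firmware level" point concrete, here is a minimal sketch of the kind of check a researcher could run once images and log entries are published: hash the published binary and compare it against the measurement recorded in the transparency log. The function and parameter names below are placeholders, not anything Apple has published.

    import CryptoKit
    import Foundation

    // Placeholder names; Apple has not published client tooling like this.
    // The idea: a logged "measurement" is just a digest of a software image,
    // so anyone holding the published image can recompute and compare it.
    func imageMatchesLog(imageData: Data, loggedMeasurementHex: String) -> Bool {
        let digest = SHA256.hash(data: imageData)
        let hex = digest.map { String(format: "%02x", $0) }.joined()
        return hex == loggedMeasurementHex.lowercased()
    }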
by advael on 6/10/24, 11:30 PM
I think depending on how this plays out, Apple might manage to earn some of the trust its users have in it, which would be pretty cool! But even cooler would be if we get full chain-of-custody audits, which I think will have to entail opening up some other bits of their stack.
In particular, the cloud OS being open-source, if they make good on that commitment, will be incredibly valuable. My main concern right now is that if virtualization is employed in their actual deployment, there could be a backdoor that passes keys from secure enclaves in still-proprietary parts of the OSes running on user devices to a hypervisor we didn't audit that can access the containers. Surely people with more security expertise than me will have even better questions.
Maybe Apple will be responsive to feedback from researchers and this could lead to more of this toolchain being auditable. But even if we can't verify that their sanctioned use case is secure, the cloud OS could be a great step forward in secure inference and secure clouds, which people could independently host or build an independent derivative of
The worst case is still that they just don't actually do it, but it seems reasonably likely they'll follow through on at least that, and then the worst case becomes "Super informative open-source codebase for secure computing at scale just dropped" which is a great thing no matter how the other stuff goes
by yla92 on 6/11/24, 12:38 AM
Interesting to see Swift on Server here!
by nardi on 6/11/24, 7:15 AM
by v4dok on 6/11/24, 10:31 AM
This is confidential computing under another name. Intel, AMD, and Nvidia have been working on it for years. OpenAI released a blog post some time ago where they mentioned this as the "next step". It's exciting that Apple went ahead and deployed it first; it will motivate the rest as well.
by ramesh31 on 6/11/24, 2:13 AM
by piccirello on 6/11/24, 1:04 AM
by tzs on 6/11/24, 11:41 AM
I wonder if there is anything that enforces an upper limit on the time between reboots?
Since they are building their own chips it would be interesting to include a watchdog timer that runs off an internal oscillator, cannot be disabled by software, and forces a reboot when it expires.
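A toy model of that idea, just to pin down the behavior being described (this does not reflect any real Apple silicon): the only operation exposed to software is feeding the timer, there is deliberately no way to disable it, and expiry always forces a reboot, which bounds how long any given software image can keep running.

    import Foundation

    // Illustrative only: a watchdog that software can feed but never disable.
    final class Watchdog {
        private let limit: TimeInterval
        private var remaining: TimeInterval

        init(limit: TimeInterval) {
            self.limit = limit
            self.remaining = limit
        }

        // The only control software gets: reset the countdown. No disable() exists.
        func feed() { remaining = limit }

        // In real hardware this would be driven by an internal oscillator;
        // here it is simply called periodically.
        func tick(elapsed: TimeInterval, forceReboot: () -> Void) {
            remaining -= elapsed
            if remaining <= 0 { forceReboot() }
        }
    }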
by j0e1 on 6/11/24, 1:00 AM
Let the games begin!
by bayareabadboy on 6/11/24, 12:46 AM
by ethbr1 on 6/10/24, 10:32 PM
Airtag anonymity was pretty cool, technically speaking, but a peripheral use case for me.
To me, PCC is a well-reasoned, surprisingly customer-centric response to the fact that due to (processing, storage, battery) limitations not all useful models can be run on-device.
And they tried to build a privacy architecture before widely deploying it, instead of post-hoc bolting it on.
>> 4. Non-targetability. An attacker should not be able to attempt to compromise personal data that belongs to specific, targeted Private Cloud Compute users without attempting a broad compromise of the entire PCC system. This must hold true even for exceptionally sophisticated attackers who can attempt physical attacks on PCC nodes in the supply chain or attempt to obtain malicious access to PCC data centers.
Oof. That's a pretty damn specific (literally) attacker, and it's impressive that it made it into their threat model.
And neat use of onion-style encryption to expose the bare minimum necessary for routing, before the request reaches its target node. Also [0]
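For intuition on that layering, here is a very rough sketch (not Apple's implementation - the real scheme is HPKE-based Oblivious HTTP per RFC 9458): the client encrypts the request to the target node's public key, so the relay in the middle only ever sees an opaque blob plus minimal routing metadata. All names here are made up for illustration.

    import CryptoKit
    import Foundation

    // Hypothetical types for illustration; not Apple's actual protocol.
    struct WrappedRequest {
        let routingHint: String                                  // the only field the relay can read
        let clientPublicKey: Curve25519.KeyAgreement.PublicKey   // lets the node derive the same key
        let ciphertext: Data                                     // opaque to the relay
    }

    func wrap(_ plaintext: Data,
              for nodeKey: Curve25519.KeyAgreement.PublicKey,
              routingHint: String) throws -> WrappedRequest {
        // Fresh ephemeral key per request, so requests cannot be linked by key reuse.
        let ephemeral = Curve25519.KeyAgreement.PrivateKey()
        let shared = try ephemeral.sharedSecretFromKeyAgreement(with: nodeKey)
        let key = shared.hkdfDerivedSymmetricKey(using: SHA256.self,
                                                 salt: Data(),
                                                 sharedInfo: Data("request".utf8),
                                                 outputByteCount: 32)
        let sealed = try AES.GCM.seal(plaintext, using: key)
        return WrappedRequest(routingHint: routingHint,
                              clientPublicKey: ephemeral.publicKey,
                              ciphertext: sealed.combined!)
    }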
>> For example, the [PCC node OS] doesn’t even include a general-purpose logging mechanism. Instead, only pre-specified, structured, and audited logs and metrics can leave the node, and multiple independent layers of review help prevent user data from accidentally being exposed through these mechanisms.
My condolences to Apple SREs, between this and the other privacy guarantees.
>> Our commitment to verifiable transparency includes: (1) Publishing the measurements of all code running on PCC in an append-only and cryptographically tamper-proof transparency log. (2) Making the log and associated binary software images publicly available for inspection and validation by privacy and security experts. (3) Publishing and maintaining an official set of tools for researchers analyzing PCC node software. (4) Rewarding important research findings through the Apple Security Bounty program.
So binary-only for the majority, except the following:
>> While we’re publishing the binary images of every production PCC build, to further aid research we will periodically also publish a subset of the security-critical PCC source code.
>> In a first for any Apple platform, PCC images will include the sepOS firmware and the iBoot bootloader in plaintext, making it easier than ever for researchers to study these critical components.
[0] Oblivious HTTP, https://www.rfc-editor.org/rfc/rfc9458
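On point (1), the "append-only and cryptographically tamper-proof" log: the post gives few details, but the basic property such logs provide can be illustrated with a simple hash chain in which every entry commits to the hash of everything before it (real transparency logs typically use Merkle trees, as in Certificate Transparency). This is only an illustration, not Apple's design.

    import CryptoKit
    import Foundation

    // Toy hash-chain log; Apple's concrete design is not described in the post.
    struct LogEntry {
        let measurement: Data   // e.g. the SHA-256 digest of a PCC software image
        let previousHash: Data  // hash of the prior entry, chaining the log together

        var hash: Data {
            var hasher = SHA256()
            hasher.update(data: previousHash)
            hasher.update(data: measurement)
            return Data(hasher.finalize())
        }
    }

    // Rewriting an earlier entry changes every hash after it, so auditors
    // comparing log heads will notice the tampering.
    func chainIsConsistent(_ entries: [LogEntry]) -> Bool {
        var expectedPrevious = Data(count: 32)  // genesis value: 32 zero bytes
        for entry in entries {
            guard entry.previousHash == expectedPrevious else { return false }
            expectedPrevious = entry.hash
        }
        return true
    }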
by krosaen on 6/11/24, 12:56 PM
by paul2paul on 6/11/24, 6:47 AM
by tiffanyh on 6/11/24, 2:56 AM
And what will the PCC chassis look like for these compute devices (will it be a display-less iPad)?
by jaydeegee on 6/11/24, 12:38 AM
by vlovich123 on 6/11/24, 3:27 AM
by dymk on 6/11/24, 1:48 PM
There’s precedent for this sort of thing as well, like Apple TVs or iPads acting as HomeKit hubs and processing security cam footage on-device.
Maybe they’ll open that up in the future.
by gigel82 on 6/11/24, 2:54 AM
Or better yet, make the APIs public and pluggable so that one can choose an off-device AI processor themselves if one is needed.
by KETpXDDzR on 6/14/24, 2:46 AM
by asp_hornet on 6/11/24, 7:24 AM
by CGamesPlay on 6/11/24, 3:33 AM
by thomasahle on 6/10/24, 10:42 PM
by renegade-otter on 6/11/24, 8:29 PM
Or "Professional" version of software that removes all those annoying "AI" features.
by cherioo on 6/10/24, 11:56 PM
by croes on 6/11/24, 5:54 AM
What about second hand iPhone users?
by EternalFury on 6/11/24, 4:48 AM
by SirensOfTitan on 6/11/24, 12:31 AM
... I suppose this is ultimately a question that will be tested sooner or later in the US.
by clipjokingly on 6/11/24, 2:10 AM
by m3kw9 on 6/11/24, 1:19 AM
by solarkraft on 6/11/24, 1:38 AM
One key part though will be the remote attestation that the servers are actually running what they say they're running. Without any access to the servers, how do we do that? Am I correctly expecting that that part remains a "trust me bro" situation?
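From the blog post, the intended answer is attestation rather than pure trust: the device is supposed to verify the node's signed measurements against software published in the transparency log before sending any data. Roughly, and with all names hypothetical (the real verification involves hardware-rooted certificate chains not sketched here):

    import CryptoKit
    import Foundation

    // Hypothetical sketch of a client-side attestation check; not Apple's API.
    struct Attestation {
        let measurements: Data   // digests of sepOS, bootloader, OS image, etc.
        let signature: Data      // produced by a key bound to the node's hardware
    }

    func willingToSend(attestation: Attestation,
                       nodeHardwareKey: P256.Signing.PublicKey,
                       publishedMeasurements: Set<Data>) throws -> Bool {
        // 1. The measurements must be signed by hardware we recognize.
        let sig = try P256.Signing.ECDSASignature(rawRepresentation: attestation.signature)
        guard nodeHardwareKey.isValidSignature(sig, for: attestation.measurements) else {
            return false
        }
        // 2. The measured software must appear in the public transparency log.
        return publishedMeasurements.contains(attestation.measurements)
    }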
by candiddevmike on 6/10/24, 11:11 PM
by rldjbpin on 6/11/24, 5:58 PM
I am happy for those who see the positives here, but for the skeptic, a toggle to prevent any online processing would be more satisfactory.
by rmbyrro on 6/11/24, 12:13 AM
The absolute worst acronym for anything even remotely related to personal privacy.
by system7rocks on 6/11/24, 5:54 AM
by Havoc on 6/10/24, 11:30 PM
by nerdright on 6/11/24, 3:18 AM
This is clearly a company with an identity, unlike Microsoft and Google who are very confused.
by throwaway369 on 6/11/24, 1:14 AM
by nisten on 6/11/24, 9:51 AM
Was the cloud non-private before? Was it not secure in the first place? Do my Siri searches no longer end up as Google ads metadata now? Are the feds no longer able to get rubber-stamp access to my i C L O U D now?
You are a naive idiot for believing that this is anything but security theater to address the emotional needs of AI anxiety in and outside the company.
Just my opinion.
by goupil on 6/11/24, 4:44 AM
by whatever1 on 6/11/24, 3:44 PM
by jeffbee on 6/10/24, 10:58 PM