by aral on 8/9/21, 3:55 PM with 79 comments
by foxyv on 8/9/21, 4:02 PM
Judging by how police respond to these leads, you can end up in jail based on this "evidence." While you wait for a six-month investigation to be completed, you lose your job and get an arrest record, even if your photo is just a picture of static on a TV that produced a false positive.
It reminds me of the drug-sniffing dog that indicates the presence of drugs 100% of the time. Probable cause, made to order.
by mortenjorck on 8/9/21, 4:33 PM
Initially, I saw the controversy as overblown: It's the exact same content scan that already occurs when uploading images to iCloud; it will still only occur when uploading, and the only change is where it takes place.
I now see that as a reductionist take. Where it takes place does matter. The lines between client and server have been slowly blurred over the past decade to the point where a move like this may seem trivial to many, but ultimately, it is not. It becomes a foothold for so much more, and despite Apple's detailed assurances of all the friction they've installed onto this particular slippery slope, to step onto it at all is a step too far.
by system16 on 8/9/21, 4:22 PM
Apple's FAQ, meant to quell some of the backlash over this, just makes it sound even worse in my opinion, with gems like this:
_Could governments force Apple to add non-CSAM images to the hash list?_
_Apple will refuse any such demands._
Bullshit. If China "requests" this under threat of banning iPhone sales in the country? They'll just say "Apple must operate under the laws of the countries it operates in" and claim their hands are tied. Which is most likely how this whole thing started.
https://www.apple.com/child-safety/pdf/Expanded_Protections_...
Maybe there will be more of an uproar when this inevitably comes to macOS.
by ksec on 8/9/21, 7:57 PM
Surprised that this is still a thing. Apple has made it very clear in their App Store case that they do not need developers and apps on their platform, and that Apple operating their App Store has been a benefit, or more like a gift, to developers for access to their users.
by SavageBeast on 8/9/21, 4:17 PM
The Fappening Part II By Apple
https://en.wikipedia.org/wiki/ICloud_leaks_of_celebrity_phot...
That which CAN happen WILL happen.
by bogwog on 8/9/21, 4:52 PM
They could have done this quietly without telling anyone, maybe with a vaguely worded update to the terms of service in the next mandatory iOS update that nobody reads anyway.
by rvz on 8/9/21, 4:13 PM
Hence, Apple Inc. has a very strange definition of what 'privacy' means.
Always with privacy in mind.™ /s
by valparaiso on 8/9/21, 4:23 PM
Apple's approach is less intrusive than Google's and Microsoft's: they don't touch your photos in iCloud except once you've passed the threshold, at which point Apple workers gain the technical ability to decrypt your flagged (not your regular) photos and manually compare them with images from the database. Also, the iPhone doesn't trigger photo scanning at all if you don't upload photos to iCloud.
From a technical and privacy standpoint they have the best approach, and it seems the people who are mad don't even understand what Apple is doing.
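To make the threshold mechanism concrete, here is a minimal sketch in Swift of what client-side matching with a reporting threshold could look like. It is purely illustrative: the real system uses NeuralHash with blinded hashes and private set intersection rather than plain string comparison, and every name here, along with the example threshold value, is a hypothetical stand-in.

```swift
import Foundation

// Hypothetical result type: how many photos matched the known-image
// hash set, and whether the reporting threshold was crossed.
struct MatchResult {
    let matchedCount: Int
    let thresholdReached: Bool
}

/// Compares per-photo perceptual hashes (modeled here as plain strings)
/// against a set of known-CSAM hashes. In this simplified model, nothing
/// becomes decryptable or human-reviewable until `threshold` is reached.
func scanBeforeUpload(photoHashes: [String],
                      knownHashes: Set<String>,
                      threshold: Int) -> MatchResult {
    let matched = photoHashes.filter { knownHashes.contains($0) }.count
    return MatchResult(matchedCount: matched,
                       thresholdReached: matched >= threshold)
}

// Example: two matches against a hypothetical threshold of 30 stays
// below the line, so no manual review is triggered in this model.
let result = scanBeforeUpload(
    photoHashes: ["a1b2", "ffee", "0042"],
    knownHashes: ["a1b2", "ffee"],
    threshold: 30
)
print(result.thresholdReached) // false
```

The key design property this sketch tries to capture is that a single false positive, like the TV-static example above, is not enough on its own to expose anything; only an accumulation of matches past the threshold triggers review.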
Android users never cared, but when the news comes to Apple, everyone loses their shit. I can't believe people are that weird.