from Hacker News

Apple wants to redefine what it means to violate your privacy. We mustn’t let it

by aral on 8/9/21, 3:55 PM with 79 comments

  • by foxyv on 8/9/21, 4:02 PM

    The thing that scares me isn't so much the violation of privacy. It's the idea that some computer algorithm can accuse me of a crime automatically with no evidence and generate an investigation.

    Judging by how police respond to these leads, you can end up in jail based on this "evidence." While you wait six months for the investigation to complete, you lose your job and gain an arrest record. Even if your photo was just a picture of TV static that produced a false positive.

    It reminds me of the drug dog that indicates the presence of drugs 100% of the time. Probable cause, made to order.
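    The base-rate problem behind false positives like this one can be made concrete with a quick Bayes calculation. All numbers below are invented for illustration, not Apple's actual error rates:

    ```python
    # Illustration of the base-rate fallacy: even an accurate detector
    # mostly flags innocent people when true positives are very rare.
    # Every number here is a made-up assumption for the sake of example.

    def positive_predictive_value(prevalence, sensitivity, false_positive_rate):
        """P(actually guilty | flagged), via Bayes' rule."""
        true_pos = prevalence * sensitivity
        false_pos = (1 - prevalence) * false_positive_rate
        return true_pos / (true_pos + false_pos)

    # Assume 1 in a million photos is actually illegal, the detector
    # catches 99% of them, and it falsely flags 1 in 10,000 innocent photos.
    ppv = positive_predictive_value(1e-6, 0.99, 1e-4)
    print(f"Chance a flagged photo is actually illegal: {ppv:.1%}")
    ```

    Under these assumed numbers, roughly 99% of flagged photos would be innocent, which is the "made to order" probable cause the comment describes.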

  • by mortenjorck on 8/9/21, 4:33 PM

    I've been persuaded on this over the past few days.

    Initially, I saw the controversy as overblown: It's the exact same content scan that already occurs when uploading images to iCloud; it will still only occur when uploading, and the only change is where it takes place.

    I now see that as a reductionist take. Where it takes place does matter. The lines between client and server have been slowly blurred over the past decade to the point where a move like this may seem trivial to many, but ultimately, it is not. It becomes a foothold for so much more, and despite Apple's detailed assurances of all the friction they've installed onto this particular slippery slope, to step onto it at all is a step too far.

  • by system16 on 8/9/21, 4:22 PM

    I can't understand how anyone - even Apple's usual cheerleaders like Gruber - can justify defending this with a straight face. It's scanning your content on your device, without your consent. Full stop.

    Apple's FAQ, meant to quell some of the backlash, just makes things sound even worse in my opinion, with gems like this:

    _Could governments force Apple to add non-CSAM images to the hash list?_

    _Apple will refuse any such demands._

    Bullshit. If China "requests" this under threat of banning iPhone sales in China? Apple will just say it "must operate under the laws of the countries it operates in" and that its hands are tied. Which is most likely how this whole thing started.

    https://www.apple.com/child-safety/pdf/Expanded_Protections_...

    Maybe there will be more of an uproar when this inevitably comes to macOS.

  • by squarefoot on 8/9/21, 4:08 PM

    I know next to nothing about this topic, so my only comment is that every time I read something along the lines of "please think of the children!" it raises warning flags.

  • by josh_today on 8/9/21, 4:20 PM

    What Apple is doing is the equivalent of the police one day deciding to search everyone's physical home photo albums just in case there's a picture of an illegal activity.

  • by 0x0 on 8/9/21, 4:07 PM

    Will the next NSOGroup Pegasus malware feature a swatting-as-a-service plugin that makes the victim phones self-report?

  • by jaywalk on 8/9/21, 4:26 PM

    A lot of people seem to be forgetting, or not know about, the other aspect of what Apple will be doing: scanning iMessage pics sent or received by minors for nudity. If it's detected, their parents will be notified and have the ability to view the pic in question.

  • by ksec on 8/9/21, 7:57 PM

    >I will not write another line of code for their platforms ever again.

    Surprised that this is still a thing. Apple made it very clear in its App Store case that it does not need developers and apps on its platform, and that operating the App Store is a benefit, or more like a gift, to developers for access to its users.

  • by SavageBeast on 8/9/21, 4:17 PM

    Apple apparently has the ability to look at pix stored in iCloud. I wonder whose pix they will start looking at first?

    The Fappening Part II By Apple

    https://en.wikipedia.org/wiki/ICloud_leaks_of_celebrity_phot...

    That which CAN happen WILL happen.

  • by bogwog on 8/9/21, 4:52 PM

    Serious question: why did Apple bother making that announcement at all? I can't imagine they're naive enough to think it would be good press for them?

    They could have done this quietly without telling anyone, maybe with a vaguely-worded update to the terms of service for the next mandatory iOS update that nobody reads anyways.

  • by rvz on 8/9/21, 4:13 PM

    Exactly. It is rephrasing the definition, re-defining what they 'really mean' by their stated intentions.

    Hence, Apple Inc. has a very strange definition of what they think 'privacy' means.

    Always with privacy in mind.™ /s

  • by vxNsr on 8/9/21, 4:12 PM

    If you wanna screw someone over and you know they have an iPhone with iCloud backup set up, you can WhatsApp them a pic whose hash matches the CSAM database.

  • by valparaiso on 8/9/21, 4:23 PM

    As a back-end engineer I can't understand this outrage.

    Apple's approach is less intrusive than Google's or Microsoft's, since they don't touch your photos in iCloud except once you've passed the match threshold; only then do Apple workers gain the technical ability to decrypt the detected (not regular) photos and manually compare them with images from the database. Also, the iPhone doesn't trigger photo scanning at all if you don't upload to iCloud.

    From a technical and privacy standpoint they have the best approach, and it seems people are mad without even understanding what Apple is doing.

    Android users never cared, but when the news comes to Apple everyone is losing their shit. I can't believe people are that weird.
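    The threshold mechanism this comment describes can be sketched in a few lines. This is purely an illustration of the match-count-then-review idea; Apple's actual system uses NeuralHash perceptual hashes, private set intersection, and threshold secret sharing, none of which are reproduced here, and the hash values are stand-ins:

    ```python
    # Toy sketch of threshold-gated matching, NOT Apple's real protocol.
    # In the real design the device cannot even learn which photos matched;
    # here we just model "no human review until enough matches accumulate."

    THRESHOLD = 3  # assumed number of matches before review is possible

    def count_matches(upload_hashes, blocklist):
        """Count how many uploaded photo hashes appear in the blocklist."""
        return sum(1 for h in upload_hashes if h in blocklist)

    def review_allowed(upload_hashes, blocklist, threshold=THRESHOLD):
        """Manual review (decrypting the matched photos) only becomes
        possible once the match count reaches the threshold."""
        return count_matches(upload_hashes, blocklist) >= threshold
    ```

    Under this model a single match, including a single false positive, reveals nothing; only an account accumulating several matches crosses into human review, which is the property the comment is pointing at.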