from Hacker News

Apple's plan to “think different” about encryption opens a backdoor to your life

by bbatsell on 8/5/21, 8:20 PM with 824 comments

  • by threatofrain on 8/5/21, 8:42 PM

  • by trangus_1985 on 8/5/21, 8:30 PM

    I've been maintaining a spare phone running lineage os exactly in case something like this happened - I love the apple watch and apple ecosystem, but this is such a flagrant abuse of their position as Maintainers Of The Device that I have no choice but to switch.

    Fortunately, my email is on a paid provider (Fastmail), my photos are on a NAS, and I've worked hard to get all of my friends on Signal. While I still use Google Maps, I've been trialing OSM alternatives for a minute.

    The things they've described are, in general, reasonable and probably good in the moral sense. However, I'm not sure that I support what they are implementing for child accounts (as a queer kid, I was terrified of my parents finding out). On the surface, it seems good - but I am concerned about other snooping features that this portends.

    However, with the iCloud Photos CSAM scanning, it is also a horrifying precedent that the device I put my life into is scanning my photos and reporting on bad behavior (even if the initial dataset is the most reprehensible behavior).

    I'm saddened by Apple's decision, and I hope they recant, because it's the only way I will continue to use their platform.

  • by triska on 8/5/21, 8:39 PM

    I remember an Apple conference where Tim Cook personally assured us that Apple is fully committed to privacy, that everything is so secure because the iPhone is so powerful that all necessary calculations can happen on the device itself, and that we are "not the product". I think the Apple CEO said some of this in the specific context of speech processing, yet it seemed a specific case of a general principle upheld by Apple.

    I bought an iPhone because the CEO seemed to be sincere in his commitment to privacy.

    What Apple has announced here seems to be a complete reversal from what I understood the CEO to be saying at the conference only a few years ago.

  • by c7DJTLrn on 8/5/21, 9:37 PM

    Catching child pornographers should not involve subjecting innocent people to scans and searches. Frankly, I don't care if this "CSAM" system is effective - I paid for the phone, it should operate for ME, not for the government or law enforcement. Besides, the imagery already exists by the time it's been found - the damage has been done. I'd say the authorities should prioritise tracking down the creators but I'm sure their statistics look much more impressive by cracking down on small fry.

    I've had enough of the "think of the children" arguments.

  • by geraneum on 8/5/21, 10:21 PM

    Didn’t they [Apple] make the same points that EFF is making now, to avoid giving FBI a key to unlock an iOS device that belonged to a terrorist?

    “ Compromising the security of our personal information can ultimately put our personal safety at risk. That is why encryption has become so important to all of us.”

    “… We have even put that data out of our own reach, because we believe the contents of your iPhone are none of our business.”

    “ The FBI may use different words to describe this tool, but make no mistake: Building a version of iOS that bypasses security in this way would undeniably create a backdoor. And while the government may argue that its use would be limited to this case, there is no way to guarantee such control.”

    Tim Cook, 2016

  • by Shank on 8/5/21, 9:05 PM

    I really love the EFF, but I also believe the immediate backlash is (relatively) daft. There is a potential for abuse of this system, but consider the following too:

    1. PhotoDNA is already scanning content from Google Photos and a whole host of other service providers.

    2. Apple is obviously under pressure to follow suit, but they developed an on-device system, recruited mathematicians to analyze it, and published the results, as well as one in-house proof and one independent proof showing the cryptographic integrity of the system.

    3. Nobody, and I mean nobody, is going to successfully convince the general public that a tool designed to stop the spread of CSAM is a "bad thing" unless they can show concrete examples of the abuse.

    For one and two: given the two options, would you rather that Apple implement serverside scanning, in the clear, or go with the on-device route? If we assume a law was passed to require serverside scanning (which could very well happen), what would that do to privacy?

    For three: It's an extremely common trope to say that people do things to "save the children." Well, that's still true. Arguing against a CSAM scanning tool, which is technically more privacy preserving than alternatives from other cloud providers, is an extremely uphill battle. The biggest claim here is that the detection tool could be abused against people. And that very well may be possible! But the whole existence of NCMEC is predicated on stopping the active and real danger of child sex exploitation. We know with certainty this is a problem. Compared to a certainty of child sex abuse, the hypothetical risk from such a system is practically laughable to most people.

    So, I think again, the backlash is daft. It's been about two days since the announcement became public (via leaks). The underlying mathematics behind the system has barely been published [0]. It looks like the EFF rushed to make a statement here, and in doing so, it doesn't look like they took the time to analyze the cryptography system, to consider the attacks against it, or to consider possible motivations and outcomes. Maybe they did, and they had advance access to the material. But it doesn't look like it, and in the court of public opinion, optics are everything.

    [0]: https://www.apple.com/child-safety/pdf/Alternative_Security_...

  • by Wowfunhappy on 8/5/21, 11:36 PM

    This isn't the biggest issue at play, but one detail I can't stop thinking about:

    > If an account held by a child under 13 wishes to send an image that the on-device machine learning classifier determines is a sexually explicit image, a notification will pop up, telling the under-13 child that their parent will be notified of this content. [...] For users between the ages of 13 and 17, a similar warning notification will pop up, though without the parental notification.

    Why is it different for children under 13, specifically? The 18-year cutoff makes sense, because turning 18 carries legal weight in the US (as decided via a democratic process), but 13?

    13 is an age when many parents start granting their children more freedom, but that's very much rooted in one's individual culture—and the individual child. By giving parents fewer options for 13-year-olds, Apple—a private company—is pushing their views about parenting onto everyone else. I find that a little disturbing.

    ---

    Note: I'm not (necessarily) arguing for greater restrictions on 13-year-olds. Privacy for children is a tricky thing, and I have mixed feelings about this whole scheme. What I know for sure, however, is that I don't feel comfortable with Apple being the one to decide "this thing we've declared an appropriate invasion of privacy for a 12-year-old is not appropriate for a 13-year-old."

  • by strogonoff on 8/5/21, 10:54 PM

    If Mallory gets a lawful citizen Bob to download a completely innocuous looking but perceptual-CSAM-hash-matching image to his phone, what happens to Bob? I imagine the following options:

    - Apple sends Bob’s info to law enforcement; Bob is swatted or his life is destroyed in some other way. Worst, but most likely outcome.

    - An Apple employee (or an outsourced contractor) reviews the photo, comparing it to the CSAM source image used for the hash. Bob is swatted only if the image matches according to human vision. This requires there to be some sort of database of CSAM source images, which strikes me as unlikely.

    - An Apple employee or a contractor reviews the image for abuse without comparing it to the CSAM source, using their own subjective judgement. Better, but implies Apple employees could technically SWAT Apple users.

  • by farmerstan on 8/5/21, 11:34 PM

    Police routinely get drug sniffing dogs to give false positives so that they are allowed to search a vehicle.

    How do we know Apple or the FBI don’t do this? If they want to search someone’s phone, all they need to do is enter a hash of a photo they know is on the target’s phone and voila, instant access.

    Also, how is this not a violation of the 14th amendment? I know Apple isn’t part of the government but they are basically acting as a de facto agent of the police by scanning for crimes. Using child porn as a completely transparent excuse to start scanning all our material for anything they want makes me very angry.

  • by cwizou on 8/5/21, 10:27 PM

    The FT article mentioned it was US only, but I'm more afraid of how other governments will try to pressure Apple to adapt said technology to their needs.

    Can they trust a random government to give them a database of only CSAM hashes and not insert some extra politically motivated content that they deem illegal?

    Because once you've launched this feature in the "land of the free", other countries will require their own implementation for their own needs and demand (through local legislation which Apple will need to abide by) control of said database.

    And how long until they also scan browser history for the same purpose? Why stop at pictures? This is opening a very dangerous door that many here will be uncomfortable with.

    Scanning on their premises (considering they can, as far as we know?) would be a much better choice; this is anything but privacy forward (whatever the linked "paper" tries to say).

  • by lovelyviking on 8/6/21, 7:24 AM

    - Apple: Dear User, We are going to install Spyware Engine in your device.

    - User: Are you out of your f... mind?

    - Apple: It's for child protection.

    - User: Ah, ok, no problem, please install spyware and do later whatever you wish and forget about any privacy, the very basis of rights, freedom and democracy.

    This is, by the way, how Russia started filtering the web of political opponents. All necessary controls were put in place under the same slogan: "to protect children".

    Yeah, right.

    Are modern people so naive and dumb that they can't think 2 steps forward? Is that why it's happening?

    Edit: Those people would still need to explain how growing up into a society without privacy, freedom and democracy, under authoritarian practices, will make those children any 'safer' ...

  • by hncurious on 8/5/21, 9:31 PM

    Apple employees successfully pressured their employer to fire a new hire and are petitioning to keep WFH.

    https://www.vox.com/recode/2021/5/13/22435266/apple-employee...

    https://www.vox.com/recode/22583549/apple-employees-petition...

    Will they apply that energy and leverage to push back on this?

    How else can this be stopped before it goes too far? Telling people to "Drop Apple" is even less effective than "Delete Facebook".

  • by iamleppert on 8/5/21, 9:55 PM

    It’s pretty trivial to iteratively construct an image that has the same hash as another, completely different image if you know what the hash should be.

    All one needs to do, in order to flag someone or get them caught up in this system, is to gain access to this list of hashes and construct an image. This data is likely to be sought after as soon as this system is implemented, and it will only be a matter of time before a data breach exposes it.

    Once that is done, the original premise and security model of the system will be completely eroded.
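
    For context, here is roughly how a simple perceptual hash works. This is a generic 64-bit "average hash" (aHash) sketch, not Apple's NeuralHash (whose details were not fully public at the time), and Pillow is an assumed dependency. The point is how little information survives in the fingerprint, which is why targeted near-collisions against such schemes are considered plausible.

      # Illustrative only: a generic 64-bit "average hash" (aHash), not Apple's NeuralHash.
      # Requires Pillow: pip install Pillow
      from PIL import Image

      def average_hash(path, hash_size=8):
          """Downscale to hash_size x hash_size grayscale, threshold each pixel against the mean."""
          img = Image.open(path).convert("L").resize((hash_size, hash_size))
          pixels = list(img.getdata())
          mean = sum(pixels) / len(pixels)
          bits = "".join("1" if p > mean else "0" for p in pixels)
          return int(bits, 2)

      def hamming_distance(h1, h2):
          """Bits that differ; matchers typically treat small distances as a hit."""
          return bin(h1 ^ h2).count("1")

      # Only 64 bits of low-frequency luminance survive, so visually different images
      # can land close together, and an attacker who knows a target hash has a small,
      # smooth space to search toward.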

    That said, if this does get implemented I will be getting rid of all my Apple devices. I’ve already switched to Linux on my development laptops. The older I get, the less value Apple products have to me. So it won’t be a big deal for me to cut them out completely.

  • by haskaalo on 8/5/21, 11:32 PM

    At this point, I think phones can be compared to a home in terms of privacy.

    In your house, you might have private documents, or do things you don't want other people to see, just like what we have on our phones nowadays.

    The analogy I'm trying to make: if the government suddenly decided to install cameras in every house on the premise of making sure no pedophile is abusing a child, with the assurance that the cameras never send data unless the locally run AI detects something, I believe that would shock everyone.

  • by skee_0x4459 on 8/5/21, 10:38 PM

    wow. in the middle of reading that, i realized that this is a watershed moment. why would apple go back on their painstakingly crafted image and reputation of being staunchly pro privacy? its not for the sake of the children (lol). no, something happened that has changed the equation for apple. some kind of decisive shift has occurred. maybe apple has finally caved in to the chinese market, like everyone else in the US, and is now making their devices compatible with chinese surveillance. or maybe the US government has finally managed to force apple to crack open its shell of encryption in the name of a western flavored surveillance. but either way, i think it is a watershed moment because securing privacy will from this moment onward be a fringe occupation in the west. unless a competitor rises up -- but thats impossible because there arent enough people who care about privacy to sustain a privacy company. thats the real reason why privacy has died today.

    if you really want to save the children, why not build the scanning into safari? scan the whole phone! just scan it all. its really no different than what they are doing. its not like they would have to cross the rubicon to do it, not anymore anyway.

    and also i think it's interesting how kids will adjust to this. i think a lot of kids won't hear about this and will find themselves caught up in a child porn case.

    im so proud of the responses that people seem to generally have. it makes me feel confident in the future of the world.

    isnt there some device to encrypt and decrypt messages with a separate device that couples to your phone? like a device fit into a case and that has a keyboard interface built into a screen protector with indium oxide electrodes.

  • by Waterluvian on 8/5/21, 11:45 PM

    If I go on 4chan and an illegal image loads and caches into my phone before moderators take it down or I hit the back button, will Apple’s automated system ruin my life?

    This kind of stuff absolutely petrifies me because I’m so scared of getting accidentally scooped up for something completely unintentional. And I do not trust police one bit to behave like intelligent adult humans.

    Right now I feel like I need to stop doing ANYTHING that goes anywhere outside the velvet ropes of the modern commercial internet. That is, anywhere that cannot pay to moderate everything well enough that I don’t run the risk of having my entire life ruined because some #%^*ing algorithm picks up on some content I didn’t even choose to download.

  • by roody15 on 8/5/21, 9:59 PM

    My two cents: I get the impression this is related to NSO Pegasus software. So once the Israeli firm's leaks were made public, Apple had to respond and has patched some security holes that were exposed publicly.

    NSO used exploits in iMessage to enable them to grab photos, texts among other things.

    Now shortly after Apple security patches we see them pivot and now want to “work” with law enforcement. Hmmm almost like once access was closed Apple needs a way to justify “opening” access to devices.

    Yes I realize this could be a stretch based on the info. Just seems like an interesting coincidence… back door exposed and closed…. now it’s back open… almost like governments demand access

  • by shrimpx on 8/6/21, 2:01 AM

    From Apple's original text[0]:

    > Apple’s method of detecting known CSAM is designed with user privacy in mind. Instead of scanning images in the cloud, the system performs on-device matching [...]

    It's incredible that Apple arrived at the conclusion that client-side scanning that you cannot prevent is more private than cloud-scanning.

    Since they claim they're only scanning iCloud content, why not scan in the cloud?

    They decided the most private way is to scan iCloud content before it's uploaded to the cloud... Because if they scanned in the cloud it would be seen as a breach of privacy and is bad optics for a privacy-focused company? But scanning on the physical device that they have described as "personal" and "intimate" has better optics? That's amazing.

    This decision can only be read as Apple paving the way to scanning all content on the device, to bypass the pesky "Backup to iCloud" options being turned off.

    [0] https://www.apple.com/child-safety/

  • by nicetryguy on 8/5/21, 9:48 PM

    I'm looking forward to this platform being expanded to facially ID against more databases such as criminals, political dissenters, or anyone with an undesirable opinion so that SWAT teams can barge into the homes of false positive identifications to murder them and their dogs.
  • by blintz on 8/6/21, 2:02 AM

    One disappointing development from a larger perspective is that many privacy-preserving technologies (multi-party computing, homomorphic encryption, hardware enclaves, etc) are actually getting used to build tools that undermine once-airtight privacy guarantees. E2E starts to become… whatever this is.

    A more recent example is how private set intersection became an easy way to get contact tracing tech everywhere while maintaining an often perfunctory notion of privacy.

    I wonder where large companies will take this next. It behooves us cryptography/security people who actually care about not walking down this slippery slope to fight back with tech of our own.

    This whole thing also somewhat parallels the previous uses of better symmetric encryption and enclaves technologies for DRM and copyright protection.

  • by endisneigh on 8/5/21, 8:50 PM

    Unless the entire stack you’re using is audited and open source this sort of thing is inevitable.

    As far as this is concerned, seems like if you don’t use iMessage or iCloud you’re safe for now.

  • by avnigo on 8/6/21, 8:36 AM

    > Once a certain number of photos are detected, the photos in question will be sent to human reviewers within Apple, who determine that the photos are in fact part of the CSAM database. If confirmed by the human reviewer, those photos will be sent to NCMEC, and the user’s account disabled.

    Chilling. Why have human reviewers, unless false positives are bound to happen (a certainty given the aggregate number of photos to be scanned)?

    So, in effect, Apple has hired human reviewers to police your photos that an algorithm has flagged. Whether you knowingly consent or not (through some fine print), you are being subjected to a search without probable cause.

    This is not the future I was looking forward to.

  • by DaveSchmindel on 8/5/21, 11:40 PM

    (1) I'm a bit frustrated, as a true Apple "bitch", at the irony here. As a loyal consumer, I am (likely) never going to be privileged enough to know exactly which part of Apple's budget allowed for this implementation to occur. I can only assume that such data would speak volumes as to _why_ the decision to introduce CSAM detection this way has come to light.

    (2) I'm equally intrigued by the paradox that in order for the algorithms that perform the CSAM detection to work, it must require some data set that represents these reprehensible images (which are illegal to possess).

  • by falcolas on 8/5/21, 10:27 PM

    Apple,

    Not that you care, but this is the straw that's broken this camel's back. It's too ripe for abuse, it's too invasive, and I don't want it.

    You've used one of the Four Horsemen of the Infocalypse perfectly… and so I'm perfectly happy to leave your ecosystem.

    Cheers.

  • by outworlder on 8/5/21, 9:28 PM

    > these notifications give the sense that Apple is watching over the user’s shoulder—and in the case of under-13s, that’s essentially what Apple has given parents the ability to do.

    Well, yes? Parents are already legally responsible for their young children and under their supervision. The alternative would be to not even give such young children these kind of devices to begin with - which might actually be preferable.

    > this system will give parents who do not have the best interests of their children in mind one more way to monitor and control them

    True. But the ability to send or receive explicit images would most likely not be the biggest issue they would be facing.

    I understand the slippery slope argument the EFF is making, but they should keep to the government angle. Having the ability for governments to deploy specific machine learning classifiers is not a good thing.

  • by arihant on 8/5/21, 10:34 PM

    I’m very concerned that a bunch of false positives will send people’s nudes to Apple for manual review. I don’t trust Apple’s on-device ML for something this sensitive. I also can’t imagine that Apple won’t now be forced to implement government-mandated filtering and reporting on iMessage. And this will likely affect others like WhatsApp because now governments know that there is a way to do this on E2E.

    What are some other fully encrypted photo options out there?

  • by young_unixer on 8/6/21, 1:54 AM

    Lately, I've been on the fence about open source software, and I've been tempted by proprietary programs. Mainly because FOSS is much less polished than commercial closed-source software, and I care about polish. I even contemplated buying an Apple M1 at some point.

    But now I'm reminded of how fucking awful and hostile Apple and other companies can be. I'm once again 100% convinced that free software is the only way to go, even if I have to endure using software with ugly UIs and bad UX. It will be worth it just not to have to use software written by these assholes.

    Stallman was right.

  • by babesh on 8/5/21, 9:11 PM

    Apple is part of the power structure of the US. That means that it has a hand in shaping the agenda for the US but with that power comes the responsibility to carry out the agenda.

    This also means that it is shielded from attack by the power structure. That is the bargain that the tech industry has struck.

    The agenda is always towards increasing power for the power structure. One form of power is information. That means that Apple is inexorably drawn towards increasing surveillance. Also, Apple’s massive customer base both domestic and overseas is a juicy surveillance target.

  • by panny on 8/5/21, 10:50 PM

    I left Apple behind years ago after using their gear for more than a decade. I recently received a new M1 laptop from work and liked it quite a bit. It's fast, it's quiet, it doesn't get hot. I liked it so much, that I was prepared to go back full Apple for a while. I was briefly reviewing a new iPhone, a M1 mini as a build server, a display, and several accessories to go along with a new M1 laptop for myself. (I don't like to mix work and personal)

    Then this news broke. Apple, you just lost several thousand dollars in sales from me. I had items in cart and was pricing everything out when I found this news. I will spend my money elsewhere. This is a horrendous blunder. I will not volunteer myself up to police states by using your gear now or ever again in the future. I've even inquired about returning the work laptop in exchange for a Dell.

    Unsafe at any speed. Stallman was right. etc etc etc.

  • by stakkur on 8/6/21, 12:22 AM

    Imagine if the government said they were installing a backdoor in your checking account to 'anonymously' analyze your expenses and payees, 'just to check for known illegal activity'. Every time you use your debit card or pay a bill, the government analyzes it to see if it's 'safe'.
  • by imranhou on 8/5/21, 10:51 PM

    I think it's easy to say no to any solution, but harder to say "this is bad, but we should do this instead to solve the problem". In a world with ubiquitous/distributed communication, the ideas that come up would generally avoid direct interception but need some way to identify a malicious transaction.

    When saying no to ideas like this, we should at the same time attempt to also share our thoughts on what would be an acceptable alternative solution.

  • by dsign on 8/6/21, 10:11 AM

    Apple is not a dumb company; they did this fully aware of the backlash they would receive, very likely impacting their bottom line. Two scenarios come to mind:

    1. They expect most people will shrug and let themselves be scanned. That is, this privacy invasion will result in minimal damage to the Apple brand, or

    2. They know privacy-savvy people will from now on put them in the same league as Android, and they are prepared to take the financial loss.

    Scenario 1 is the most plausible, though it hints at an impish lack of consideration for their customers.

    Scenario 2 worries me most. No smart company does something financially counter-productive unless under dire pressure. What could possibly make Apple shoot itself in the foot and announce it publicly? In other words, Apple's actions, from my perspective, look like a dead canary.

  • by hungryforcodes on 8/6/21, 5:05 AM

    Am I being cynical?

    https://techcrunch.com/2021/04/28/apple-record-china-2021/

    Apple's iPhone revenue just doubled from last year in China -- now 17 billion. That's not a small number. The play against Huawei has done its job, apparently -- it's quite mortally injured.

    For sure the CCP would love to scan everyone's phones for files or images it finds troubling and for sure every country will eventually be allowed to have its own entries in this database or even their own custom DB.

    So my cynical side says... Apple just sold out. MASSIVELY. The losers -- pretty much everyone that buys their phones.

  • by tomxor on 8/5/21, 8:50 PM

    I keep thinking, It's like they are trying to be the most ironic company in history...

    But then I have to remind myself, the old Apple is long gone, the new Apple is a completely different beast, with a very different concept of what it is marketing.

  • by Sunspark on 8/5/21, 10:52 PM

    This is going to do wonders for Apple's marketshare once the teenagers realize that Apple is going to be turning them in to the police.

    Teens are not stupid. They'll eventually clue-in that big brother is watching and won't appreciate it. They'll start by using other messengers instead of imessage and then eventually leaving the ecosystem for Android or whatever else comes down the pike in the future.

  • by walterbell on 8/6/21, 12:04 AM

    Now that we know iPhones have the ability to perform frame-level, on-device PhotoDNA hashing of videos and photos, could the same infrastructure be used to identify media files which are attempting to exploit the long list of buffer overflows that Apple has patched in their image libraries, as recently as 14.7.1?

    This would be super useful for iPhone security, e.g. incoming files could be scanned for attempting to use (closed) exploits, when the user can easily associate a malicious media file with the message sender or origin app/site.

    On jailbroken devices (e.g. iPhone 7 and earlier with unpatchable boot ROMs), is there a Metasploit equivalent for iOS, which aggregates PoCs for public exploits?

    A related question: will PhotoDNA hashing take place continuously or in batch, e.g. overnight? How will associated Battery/Power usage be accounted, e.g. attributed to generic "System" components or itemized separately? If the former, does that create class-action legal exposure for a post-sale change in device "fitness for purpose"?

  • by tlogan on 8/5/21, 11:11 PM

    Oh well… it always starts with “protect the children”. Then “protect us from terrorists”, then “terrorist sympathizers“, …

    And I bet that Saudis and other oppressive regimes will use this to detect other “crimes”.

  • by strictnein on 8/5/21, 9:11 PM

    This is an excellent example of how far off the rails the EFF has gone. This is completely false:

    > "Apple is planning to build a backdoor into its data storage system and its messaging system"

  • by gcanyon on 8/6/21, 12:08 AM

    Are child porn viewers actually going to use iCloud backup? That seems like even the stupidest person would know not to do that.

    So I'll propose an alternative theory: Apple is doing this not to actually catch any child pornographers, but to ensure that any CP won't actually reach their servers. Less public good, more self-serving.

  • by mcone on 8/5/21, 9:09 PM

    I wish there was a privacytools.io for hardware. I've been an iPhone user since the beginning but now I'm interested in alternatives. Last I checked, PinePhone was still being actively developed. Are there any decent phones that strike a balance between privacy and usability?
  • by Animats on 8/6/21, 12:03 AM

    Is Apple under some legal pressure to do this? Is there some kind of secret deal here: "put in spyware and we back off on antitrust?"
  • by j1elo on 8/5/21, 11:21 PM

    I'm not sure what the point is; in this day and age, I'm pretty sure that if your 14-year-old wants to send a nude picture, if they really have already reached that decision, they will do it.

    The only practical barrier here is that their parents have educated them and their mental model arrives on its own at "no, this is a very bad idea" instead of "yes, I want to send this pic". Anything else, including petty prohibitions from their parents, will not be a decision factor in most cases. Have we forgotten how it was to be a teenager?

    (I mean people, both underage and criminals, will just learn to avoid apple and use other channels)

  • by nick_naumov on 8/5/21, 11:23 PM

    Goodbye Apple! I have trusted you for 12 years. All I wanted was you to trust me.
  • by NazakiAid on 8/5/21, 10:50 PM

    Wait until a corrupt government starts forcing Apple or Microsoft to scan for leaked documents exposing them and then automatically notifying them. Just one of the many ways this could go wrong in the future.
  • by dalbasal on 8/6/21, 12:21 PM

    "”Apple sells iPhones without FaceTime in Saudi Arabia, because local regulation prohibits encrypted phone calls. That's just one example of many where Apple's bent to local pressure. What happens when local regulations in Saudi Arabia mandate that messages be scanned not for child sexual abuse, but for homosexuality or for offenses against the monarchy?”"

    Good question. Companies have to follow laws. The naive, early 2000s notion that the internet was unstoppable and ungovernable was mistaken. Apple, Google and the other internet bottlenecks were, it turned out, the pathway to a governable internet. That fight is lost.

    Now that it's governable, attention needs to be on those governing... governments, parliaments, etc.

    The old version of freedom of speech and such didn't come from the divine. They were created and codified and now we have them. We need to do that again. Declare new, big, hairy freedoms that come with a cost that we have agreed to pay.

    There are dichotomies here, and if we deal with them one droplet at a time, they'll be compromised away. "Keep your private messages private" and "Prevent child pornography and terrorism in private messages" are incompatible. But, no one is going to admit that they are choosing between them... not unless there's an absolut-ish principle to defer to.

    Once you're scanning email for ad targeting, it's hard to justify not scanning it for child abuse.

  • by n_io on 8/6/21, 12:39 AM

    This is exactly the event that I’ve been preparing for. I figured out long ago that it’s not a matter of if, but when, Apple fully embraces the surveillance economy. This seems to be a strong step in that direction. As dependent as I’ve been on the Apple ecosystem, I’ve been actively adopting open source solutions in place of the Apple incumbents so that when I have to fully pull the plug, I can at least soften the blow.

    In place of Mail: Tutanota
    In place of iMessage: Signal
    And so on…

  • by chinchilla2020 on 8/6/21, 7:44 PM

    Child abusers are dumb, but smart enough to know not to upload pictures to the cloud.

    If I was a conspiracy type, I would assume this is more likely to be Apple responding to an NSA request to decrypt data.

    This idea will be gradually expanded:

    1. To detect child abuse (unsuccessfully)

    2. To detect terrorism (also unsuccessfully)

    3. To detect criminal activity (successful only against low-level criminals)

    4. To detect radical political views as defined by Apple corporation

    5. To detect human behaviors that are not supported by Apple's corporate vision

  • by hamburgerwah on 8/5/21, 11:08 PM

    It will take a matter of days for other parties, including copyright holders, to get in on this action, if they have not already. The infrastructure will then be compromised by human intelligence so that it can be used by intelligence agencies to find people hitting red-flag words like Snowden and WikiLeaks. But let's be real for a moment: anyone who thinks Apple cares about security or privacy over profits is in some way kidding themselves.
  • by robertwt7 on 8/5/21, 11:05 PM

    And here I thought that Tim Cook sincerely respected everyone’s privacy.

    Apparently I was wrong. I loved Apple products and the ecosystem. Not sure what to switch to after this :/

  • by joering2 on 8/6/21, 1:28 AM

    > This means that when the features are rolled out, a version of the NCMEC CSAM database will be uploaded onto every single iPhone.

    Question - if most people literally don't want to have anything to do with CP, isn't uploading a hash database of that material to their phones precisely that?

    For once I think I will feel disgusted walking around with my phone in a pocket; a phone that is full of hashes of child porn. That's a terrible feeling.

  • by djanogo on 8/5/21, 11:04 PM

    Why didn't Apple just add an option in Screen Time to block all images in iMessage? That would have let parents choose what's best for their kids.
  • by kntoukakis on 8/6/21, 5:58 AM

    From https://www.apple.com/child-safety/

    “The threshold is set to provide an extremely high level of accuracy and ensures less than a one in one trillion chance per year of incorrectly flagging a given account.”

    How did they calculate this? Also, I can imagine more than a trillion photos being uploaded to iCloud a year.
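
    For a sense of how a match threshold drives the aggregate rate down, here is a back-of-the-envelope binomial calculation. Every number below (per-image false-positive rate, photos per year, threshold) is a hypothetical assumption for illustration, not Apple's published figure.

      # Back-of-the-envelope only: every number below is an assumption, not Apple's figure.
      from math import comb

      p = 1e-6        # assumed per-image false-positive rate of the hash match
      n = 20_000      # assumed photos uploaded per account per year
      threshold = 30  # assumed number of matches required before human review

      # P(at least `threshold` false matches) for one account under a binomial model;
      # only the first few terms matter because they fall off extremely fast.
      prob = sum(comb(n, k) * p**k * (1 - p)**(n - k)
                 for k in range(threshold, threshold + 20))
      print(f"~{prob:.3g} chance per account per year under these assumptions")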

  • by iamnotwhoiam on 8/6/21, 2:19 AM

    If sexual images exchanged by a kid are saved to the parent’s phone then doesn’t that put the parent at risk for charges if the kids are sexting?
  • by scratchmyhead on 8/6/21, 6:59 PM

    If Apple broadcasts their surveillance strategy so publicly, wouldn't criminals stop using Apple products and delete their iCloud data immediately? Who will be left to "catch" at that point? The most incompetent criminals?

    I'm missing how this will actually work if perpetrators knew Apple was going to analyze their data beforehand. Could someone explain?

  • by neilv on 8/5/21, 11:52 PM

    The article spends time on the implications for kids messaging other kids. Though I think parents as a group might tend to lean more towards wanting that snooping going on.

    Separate from kids, I wonder whether Apple is shooting itself in the foot with teens.

    Teens should start caring about privacy around then, are very peer/fashion-sensitive, and have shown that they'll readily abandon platforms. Many parents/teachers/others still want to be treating teens as children under their power, but teens have significant OPSEC motivation and ability.

    Personally, I'd love to see genuinely good privacy&security products rushing to serve the huge market of a newly clueful late-teen generation. The cluefulness seems like it would be good for society, and market forces mean the rest of us then might also be able to buy products that aren't ridiculously insecure and invasive.

  • by didibus on 8/6/21, 12:03 AM

    I have a question, does this mean that Apple will have a way to decrypt photos in iCloud?

    It seems this can then be a security risk, since Apple could be breached and they'd have the means to server side decrypt things.

    If it was simply that client-side end-to-end encryption can be turned on/off based on whether the account is a child account or not (or as a configuration for parental control), that would be different.

    If it's just a config, then I mean the slippery slope always existed; Apple could always be forced into changing the settings of what gets end-to-end encrypted and when.

    But if this means that all photos are sent unencrypted to Apple at some point, or sent to Apple in a way they can decrypt, then it does open the door to your photos not being securely stored and attackers being able to steal them. That seems a bit of an issue.

  • by voidmain on 8/6/21, 12:50 AM

    This seems less concerning than the fact that iCloud backup is not end-to-end encrypted in the first place.
  • by viktorcode on 8/6/21, 4:46 AM

    On-device data scanning, however well-intended it may be, is an invasion of privacy. Server-side scanning is an entirely different matter, because it is an optional service which may come with any clauses its provider may deem necessary.

    I understand that it doesn't scan everything, but it doesn't matter. What matters is that there's an implemented technical capability to run scans against an external fingerprint database. It's a tool which may be used for many needs.

    I hope some countries will prohibit Apple doing that. Germany with its strict anti-snooping laws comes to mind. Maybe Japan. The more, the better.

    Oh, and by the way, every tech-savvy sex predator now knows what they should avoid doing. As always with mass privacy invasions: criminals are the last to suffer from it.

  • by dep_b on 8/6/21, 8:25 AM

    So we have a person who is technical enough to find known CP - the stuff that's already automatically filtered out by Google and co, because those same hashes are already checked against all the images they index. So knowledge of the dark web should be assumed, something I don't even know how to use, let alone how to find the filth on there.

    Yet....dumb enough to upload it unencrypted to iCloud instead of storing it in a strongly encrypted folder on their PC?

    The two circles in this diagram have a very thin overlap I think.

    Dumb move by Apple, privacy is either 100% private or not private.

    Unless somebody can enlighten me that like 23% of all investigated pedophiles that had an iPhone seized had unencrypted CP on their iCloud accounts? I am willing to be proven wrong here.

  • by sadness3 on 8/6/21, 4:49 AM

    For me, this crosses a line. There should be no need to "strike a balance" with authorities wanting what are essentially unwarranted searches. The right balance is, "fuck off".

    I'm looking into privacy phones for the first time and will be switching.

  • by mccorrinall on 8/5/21, 10:37 PM

    They are putting their own users under surveillance. Didn’t expect that from Apple.
  • by Grustaf on 8/6/21, 6:54 AM

    The articles I've read say:

    _Hashes_ of photos will be scanned for _known_ abusive material, client side.

    So the only thing Apple can find out about you is if you have some of these known and catalogued images. They will definitely not know if you have other nude photos, including of your children.
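
    For intuition, the naive version of "match against a list of known hashes" looks like the sketch below. This is emphatically not Apple's actual protocol: Apple describes a perceptual hash (NeuralHash) plus private set intersection so that the device does not learn match results, and the database entries here are hypothetical placeholders.

      # Naive "match against known hashes" only. Apple's real system uses a perceptual
      # hash (NeuralHash) plus private set intersection, so the device never learns
      # match results the way this toy does.
      import hashlib

      KNOWN_HASHES = {  # hypothetical placeholder fingerprints of known material
          "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
      }

      def fingerprint(image_bytes: bytes) -> str:
          # Stand-in: a cryptographic hash. A real system uses a *perceptual* hash so
          # that re-encoded or resized copies of the same image still match.
          return hashlib.sha256(image_bytes).hexdigest()

      def is_known(image_bytes: bytes) -> bool:
          return fingerprint(image_bytes) in KNOWN_HASHES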

    The other, separate feature is a parental control feature. You as a parent can be told if your children send or receive nude photos. This obviously sacrifices some of their privacy, but that is what parenting is. It's not more intrusive than Screen Time, or any number of things you might do as a parent to make sure your children are safe.

  • by bogomipz on 8/6/21, 2:00 PM

    The article states:

    >"The (unauditable) database of processed CSAM images will be distributed in the operating system (OS), the processed images transformed so that users cannot see what the image is, and matching done on those transformed images using private set intersection where the device will not know whether a match has been found"

    Am I reading this correctly in that Apple will essentially be pushing out contraband images to user's phones? Couldn't the existence of these images on a user's phone potentially have consequences and potentially be used against an unwitting iPhone user?

  • by Calvin02 on 8/5/21, 8:49 PM

    I think the issue is that what the tech community sees as privacy is different than what the general public thinks of as privacy.

    Apple, very astutely, understood that difference and exploited the latter to differentiate its phones from its main competitor: cheap(er) Android phones.

    Apple didn’t want the phones to be commoditized, like personal computers before it. And “privacy” is something that you can’t commoditize. Once you own that association, it is hard to fight against it.

    Apple also understands that the general public will support its anti-child-exploitation efforts and will not see this as a violation of privacy.

  • by Clubber on 8/6/21, 2:38 AM

    What are some options for phones that don't spy on me or my children?
  • by RedComet on 8/6/21, 7:48 AM

    It won't be long before this is turned on political dissidents.

    * knock knock * "we received an anonymous report that you have hate speech and an illegal meme on your phone, please come with us"

  • by m3kw9 on 8/5/21, 10:03 PM

    Gonna get downvoted for this, but I may be one of the few that supports this, and I hope they catch these child exploiters by the boatload, save 1000s of kids from traffickers, and jail their asses.
  • by tango-unchained on 8/6/21, 12:12 AM

    Advancement of the surveillance state is especially terrifying after this past summer of police abuse. We already know that in our country people in power abuse their authority and nothing happens (unless international protests prompt an action). This just collects more power under the disgusting guise of "won't somebody think of the children" while calling the people opposed pedophile supporters.

    Does anybody have recommendations on what to do to help oppose this instead of just feeling helpless?

  • by etempleton on 8/5/21, 10:06 PM

    I think this is probably the reasonable and responsible thing for Apple to do as a company, even if it goes against their privacy ethos. Honestly, they have probably been advised by their own lawyers that this is the only way to cover themselves and protect shareholder value.

    The question will be if Apple will bend to requests to leverage this for other reasons less noble than the protection of children. Apple has a lot of power to say no right now, but they might not always have that power in the future.

  • by stereoradonc on 8/6/21, 12:55 AM

    The privacy creep usually happens by building narratives around CSAM. Yes, agreed it was objectionable, but there was no "scientific analysis" that such measures would prevent dissemination in the first place. Surveillance is morally discreditable, and Apple seems to have tested the waters well - by building a privacy narrative and then screwing the users in the process. Most users believe it is "good for them". Though, it remains the most restrictive system.
  • by contingencies on 8/5/21, 11:25 PM

    Never buying another Apple product.
  • by fetzu on 8/6/21, 7:09 AM

    I honestly fail to see how the “oppressive regimes could just turn the on-device scanning into a state surveillance tool” is not a slippery slope argument when on-device scanning and classification (NN for image processing and classification) has been going on for years on iOS devices.

    It just seems very paradoxical to be using a cloud-based photo and/or un-encrypted backup service and then worry about one’s privacy being at risk.

  • by superkuh on 8/5/21, 11:46 PM

    I guess Apple has given up on Apple Pay and becoming a bank. Without that as motivation for security this is probably the first of many compromises to come.
  • by Guthur on 8/6/21, 10:41 AM

    I think it's becoming very apparent that through apathy, indoctrination, and fear, freedom will be well and truly stamped out.

    You just have to say "for the greater good" and you can get away with anything. Over the last year and a half so many have been desensitised to overbearing collectivism that at this stage I think governments and their Big Corp lackeys could get away with just about anything now.

  • by citizenpaul on 8/6/21, 4:34 PM

    I fully support this. History has shown us that humanity and especially their governments are very well equipped to deal with near godlike power of surveillance. There are basically no examples of this power being abused through all of history. Maybe a couple of bad apples. We should really look into how this can be expanded. Imagine if crime could be stopped before it starts.
  • by akouri on 8/6/21, 3:07 AM

    Nobody is talking about the performance implications to the photos and messages app. All these image hashes and private set intersection operations are going to eat CPU and battery life.
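
    As a very rough floor on the per-photo cost, here is a crude timing of a trivial downscale-and-hash pass, assuming Pillow and a synthetic 12-megapixel image. The real pipeline (a neural-network hash plus private set intersection cryptography) would cost considerably more, so this only bounds the concern from below.

      # Crude timing of a downscale-and-hash pass on a synthetic 12MP image (Pillow required).
      # This is a lower bound: NeuralHash runs a neural network, and PSI adds cryptography on top.
      import time
      from PIL import Image

      img = Image.new("RGB", (4000, 3000), color=(120, 90, 60))  # stand-in for a 12MP photo

      runs = 10
      start = time.perf_counter()
      for _ in range(runs):
          small = img.convert("L").resize((8, 8))
          pixels = list(small.getdata())
          mean = sum(pixels) / len(pixels)
          h = sum(1 << i for i, p in enumerate(pixels) if p > mean)
      elapsed = (time.perf_counter() - start) / runs
      print(f"~{elapsed * 1000:.1f} ms per photo for even a trivial perceptual hash")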

    This is the downside to upgrading your iOS version. Once you update, it's not like you can go back, either. You're stuck with a slower, more power-hungry phone for the life of the phone.

  • by egotripper on 8/6/21, 7:47 AM

    Who ordered Apple to do this, "or else?" What was the "or else?" How easy will it be to expand this capability by Apple or anyone outside of Apple?

    I expect that any time you take a photo, the scan will be performed right away, and the results file will be waiting to be sent the next time you enable voice and data.

    This capability crushes the trustworthiness of the devices.

  • by _robbywashere on 8/6/21, 12:44 AM

    This is waaaay too turnkey for searching images on our devices that someone/something doesn’t like. Absolutely terrifying.
  • by _carl_j_b_223 on 8/6/21, 5:14 AM

    Does Apple really think those bastards share their disgusting content via iCloud or message each other via iMessage? Even if some idiots did, they'll have stopped by now. So even if Apple has pure good intentions, it'll be pretty useless, and so Apple doesn't even have to start with these kinds of questionable practices.
  • by citboin on 8/6/21, 3:16 AM

    All of my hardware is outdated so I was about to make the jump to Apple all across the board. Now I’m probably going to dive into the deep end and go into FOSS full throttle. I’m going to investigate Linux OEM vendors tonight. The only one that I know of is System76. Are there any Linux-based iPad competitors?
  • by everyone on 8/5/21, 10:11 PM

    When you upload any build to the App Store, before you can have it in TestFlight or submit it for release, you have to fill out this questionnaire asking "does your app use encryption?" If you say yes, you're basically fucked; good luck releasing it. You have to say no, as far as I'm aware.
  • by dukeofdoom on 8/5/21, 9:15 PM

    Technocrats are the new railway tycoons
  • by rotbart on 8/5/21, 11:01 PM

    As a former 13-year-old, that would be the end of 13-year-olds using iMessage... I smell an opportunity.
  • by roamerz on 8/6/21, 4:33 AM

    Bad Apple. Today it is something socially unacceptable - child exploitation. The reason that is used as the justification is plainly obvious. What will be the next socially unacceptable target? Guess it depends on who the ruling class is. Very disappointed in this company’s decision.
  • by lenkite on 8/6/21, 7:47 AM

    Can the legions of Apple apologists on this forum at least agree that all the talk about how well the iPhone supports individual privacy is just a bunch of bald-faced lies?

    I mean they use the privacy argument to avoid side-loading apps, lol. But scanning your photos is OK.

    What absolute hypocrisy.

  • by miika on 8/8/21, 8:29 AM

    People at Apple really think that someone who has such images would add them to iCloud Library?
  • by kevin_thibedeau on 8/5/21, 9:57 PM

    It would be a shame if we had to start an investigation into your anti-competitive behavior...
  • by klempotres on 8/5/21, 9:36 PM

    Technically speaking, if Apple plans to perform PSI on device (as opposed to what Microsoft does), how is it that "the device will not know whether a match has been found"?

    Is there anyone who's familiar with the technology so they can explain how it works?
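
    A heavily simplified answer, offered as an assumption-laden sketch rather than a description of Apple's actual design: the published protocol layers NeuralHash, a blinded hash table, safety vouchers, and threshold secret sharing, but the core trick that lets the server rather than the device learn about matches can be illustrated with a toy Diffie-Hellman-style oblivious PRF membership check. The group parameters below are toy-sized and insecure.

      # Toy illustration of why the *server* (not the device) can learn a match:
      # the device only ever handles blinded values and never sees the server's set.
      # This is a simplified DH-OPRF membership check, NOT Apple's actual protocol
      # (which adds blinded tables, safety vouchers, and a threshold scheme).
      import hashlib
      import secrets

      p = 10007            # toy safe prime (p = 2q + 1); real systems use elliptic curves
      q = (p - 1) // 2     # prime order of the subgroup generated by g
      g = 4                # generator of the order-q subgroup

      def H(item: bytes) -> int:
          """Hash an item into the order-q subgroup."""
          return pow(g, int.from_bytes(hashlib.sha256(item).digest(), "big") % q, p)

      # Server: holds a secret key b and a blinded database {H(y)^b}.
      b = secrets.randbelow(q - 1) + 1
      database = {pow(H(y), b, p) for y in [b"known-image-1", b"known-image-2"]}

      # Device: blinds its item with a one-time secret a, so the server never sees H(x).
      x = b"known-image-1"
      a = secrets.randbelow(q - 1) + 1
      blinded = pow(H(x), a, p)                     # device -> server
      evaluated = pow(blinded, b, p)                # server -> device: H(x)^(a*b)
      unblinded = pow(evaluated, pow(a, -1, q), p)  # device -> server: H(x)^b

      # Only the server can check membership; the device never sees `database`,
      # so it cannot tell whether its photo matched.
      print("server sees a match:", unblinded in database)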

  • by xbmcuser on 8/6/21, 5:25 AM

    Apple scanning for law enforcement in one country gives a proof of concept for another country to ask for the same for their own laws. And a country with a big enough market can easily arm-twist Apple into complying, as $$ means more than all the privacy they talk about.
  • by gowld on 8/6/21, 12:12 AM

    I get the concern, but "Corporation X can be compromised by the State, which is evil" is not a problem with the corporation. It's a problem with your civilization.

    If you don't trust the rule of law, Apple can't fix that for you.

  • by jason2323 on 8/6/21, 2:00 AM

    What's the alternative here? What other viable alternative operating system will we use?
  • by suizi on 8/5/21, 11:10 PM

    https://news.ycombinator.com/item?id=28081184 The NCMEC already had its problems. But this takes it to a whole new level.
  • by suizi on 8/6/21, 1:20 AM

    The FBI doesn't even have the resources to review all the reports they do get (we learned that in 2019), and yet they want to intrude on everyone's rights to get even more to investigate (which they won't).
  • by mactavish88 on 8/6/21, 11:09 AM

    What kind of amateur criminal would store illegal material in their iCloud account?
  • by fortran77 on 8/6/21, 3:12 AM

    What’s to stop a malicious person from sending a prohibited image to an unsuspecting person, and causing the target to get into legal trouble for which there is no legal defense ("strict liability" for possession)?
  • by XorNot on 8/5/21, 11:08 PM

    Various copyright enforcement lobbies are all furiously drafting letters right now.
  • by rStar on 8/6/21, 8:20 AM

    i’m ashamed of every single apple employee who worked to make this happen. their work will be used to subjugate the most vulnerable among us. i hope you all hate yourselves forever for your cowardice and immorality.
  • by FpUser on 8/5/21, 9:25 PM

    Luckily I only use phone to make phone calls, offline GPS and to control some gizmos like drones. Do not even have data plan. Not an Apple customer either so I guess my exposure to things mentioned is more limited.
  • by slaymaker1907 on 8/5/21, 9:46 PM

    I'd be surprised if this goes through as is since you can't just save this stuff indefinitely. Suppose a 14 year old sexts a 12 year old. That is technically child porn and so retention is often illegal.
  • by xbar on 8/6/21, 5:08 AM

    Police-state-designed device.
  • by hmwhy on 8/6/21, 7:23 AM

    And, in the meantime, Roblox is promoted in the App Store.

    For context, see https://news.ycombinator.com/item?id=20620102

  • by barrkel on 8/6/21, 5:47 AM

    Once this tech is implemented, courts will direct it to be used in situations Apple did not intend. Apple will have created a capability and the courts will interpret refusal to expand its use as contempt.
  • by thysultan on 8/6/21, 5:07 AM

    All that expansive "privacy" marketing undone by a single move.
  • by jra_samba on 8/6/21, 1:51 AM

    Sorry Apple fans, but you have been living in the very definition of "The Hotel California".

    Apple has altered the deal. Pray they do not alter it any further.

    Now you have to live with the consequences of convenience.

  • by Spooky23 on 8/5/21, 9:21 PM

    This article is irresponsible hand-waving.

    “ When Apple releases these “client-side scanning” functionalities, users of iCloud Photos, child users of iMessage, and anyone who talks to a minor through iMessage will have to carefully consider their privacy and security priorities in light of the changes, and possibly be unable to safely use what until this development is one of the preeminent encrypted messengers.”

    People sending messages to minors that trigger a hash match have more fundamental things to consider, as they are sending known photos of child exploitation to a minor.

    The EFF writer knows this, as they describe the feature in the article. They should be ashamed of publishing this crap.

  • by mrwww on 8/6/21, 12:17 PM

    So if your Apple ID/iCloud gets compromised, and somebody saves an album of CP to your iCloud Photos, it is then only a question of time until the police come knocking?
  • by mulmen on 8/6/21, 12:58 AM

    Will my photos still be scanned if I do not use iCloud Photos?
  • by christkv on 8/6/21, 4:08 PM

    Why don’t they just run their trained classifier on the phone itself to do this stuff? There should not be any need to do this on the server, no matter what they say.
  • by alfiedotwtf on 8/6/21, 1:41 AM

    Let's call it out for what it is - Apple's Dragnet.
  • by villgax on 8/6/21, 2:08 AM

    The impact a false positive can have on relations between parents & friends of the family is huge, for something as banal as an art poster or music cover art.
  • by beebeepka on 8/6/21, 7:30 AM

    My fellow Earthicans, we enjoy so much freedom, it's almost sickening. We're free to chose which hand our sex-monitoring chip is implanted in.
  • by andrewmcwatters on 8/5/21, 10:40 PM

    I suspect Apple is subject to government and gag orders and Microsoft has already been doing this with OneDrive but no one has heard about it yet.
  • by tw600040 on 8/6/21, 1:18 AM

    I wish there existed some decentralized device that can do iCloud backups, and people could just buy that device and set it up in their home.
  • by anupamchugh on 8/6/21, 6:48 AM

    By notifying parents of children under 13 for image abuse, looks like Apple wants to be both the police and the parent of iPhone owners.
  • by swiley on 8/5/21, 10:40 PM

    I'm really worried about everyone. Somehow I've missed this until now and I've felt sick all day since hearing about it.
  • by michalu on 8/7/21, 12:05 PM

    This will have only one effect: pedophiles will stop using iOS, and for all the rest of us our privacy will remain compromised.
  • by temeritatis on 8/5/21, 9:41 PM

    the road to hell is paved with good intentions
  • by alana314 on 8/6/21, 1:34 AM

    I thought Apple's iMessage wasn't end-to-end anyway but instead used a key stored on Apple's servers?
  • by jimt1234 on 8/5/21, 11:16 PM

    > ... a thoroughly documented, carefully thought-out, and narrowly-scoped backdoor is still a backdoor.
  • by shadowhack on 8/8/21, 2:29 PM

    They want to do this, but aren't interested in taking out apps like Kik from their App Store...
  • by volta83 on 8/6/21, 8:28 AM

    So Apple is putting a database of child pornography on my phone?

    I’d rather not have that on my phone.

  • by xyst on 8/5/21, 11:24 PM

    If this project goes live, I would drop Apple in a heart beat.
  • by wellthisisgreat on 8/5/21, 9:16 PM

    Apple's parental controls are HORRIBLE. There are at least 20% false positives there, flagging all sorts of absolutely benign sites as "adult".

    Any kind of machine-based contextual analysis of users' content will be a disaster.

  • by Drblessing on 8/5/21, 10:07 PM

    Use Signal, y'all
  • by 14 on 8/6/21, 2:56 AM

    Will the jailbreakers be able to disable this feature?
  • by throw7 on 8/5/21, 11:10 PM

    The question that should be asked is if you think it's ok if the U.S. gov't looks at every picture you take and have taken and store and will store. The U.S. gov't will access, store, and track that information on you for your whole life. Past pictures. Present pictures. Future pictures.

    I don't use apple products, but if I found out google was scanning my photos on photos.google.com on behalf of the government I would drop them. I'm not saying it wouldn't hurt, because it definitely would, but in a capitalistic country this is the only way to fight back.

  • by RightTail on 8/5/21, 9:34 PM

    This is going to be used to suppress political dissidents aka "populist/nationalist right" aka the new alqaeda

    searching for CP is the original pretext

  • by unstatusthequo on 8/5/21, 10:14 PM

    4th Amendment. Plaintiff lawyers gear up.
  • by nullc on 8/6/21, 6:08 AM

    Your smartphone or desktop computer is your agent. You can't accomplish many necessary tasks without it, you're nearly required by law to use one. It handles your most private data, and yet you have no real visibility into its actions. You just have to trust it.

    As such, it should NEVER do anything that isn't in your best interest-- to the greatest extent possible under the law. Your relationship with your personal computer is closer and more trusted than your relationship with your doctor or lawyer-- in fact, you often communicate with these parties via your computer.

    We respect the confidentiality you enjoy with your professional agents, but that confidentiality cannot functionally exist if your computing devices are not equally duty-bound to act in their users' best interest!

    This snitching 'feature' is a fairly general-purpose tracing/tracking mechanism-- we are to assume that the perceptual hashes are exclusively of unlawful images (though I can't actually find a firm, binding assertion of that!)-- but there is nothing assuring us of that except blind trust. (A toy sketch of this kind of fuzzy hash matching follows below.)

    Even if the list today exclusively has unlawful images there is no guarantee that tomorrow it won't have something different-- no guarantee that some hysterical political expediency won't put images associated with your (non-)religion or ethnicity into it, no guarantee that the facility serving these lists won't be hacked or abused by insiders. Considering that possession of child porn is a strict liability crime, Apple has presumably not validated the content of the list itself, and you certainly won't be allowed to check it. Moreover, even if there were some independent vetting of the list content, there is nothing that would prevent targeted parties from being given a different, unvetted list without their knowledge.

    The pervasive scanning can also be expected to dramatically increase the effectiveness of framing. It's kind of cliche that the guilty person often claims "I was framed"-- but part of the reason that framing is rare is because the false evidence has to intersect a credibly motivated investigation, and the two seldom intersect except where there are other indicators of guilt. With automated scanning it would be much more reliable to cause someone a world of trouble by slipping some indicated material onto their device, and so framing would have a much better cost/benefit trade-off.

    Any one of the above flaws is sufficiently fatal on its own-- but add to it the potential for inadvertent false positives both in the hash matching and in the construction of the lists. Worse, it'll probably be argued that the detailed operation of the system must be kept secret from the very users whose systems it runs on, specifically because knowledge of the operation would greatly simplify the malicious construction of intentional false positives, which could be used for harassment by causing spurious investigations.

    In my view Apple's actions here aren't just inappropriate, they're unambiguously unethical and in a more thoughtful world they'd be a violation of the law.
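
    A minimal sketch of the fuzzy perceptual-hash matching described above, assuming a toy average hash plus a Hamming-distance cutoff. This is not Apple's NeuralHash, and the threshold value here is made up purely for illustration; the point is only that "match" is a similarity judgment, which is where both accidental false positives and deliberately crafted collisions enter.

        # Toy perceptual hash (average hash) with threshold matching.
        # Illustrative only: Apple's system uses NeuralHash, a learned
        # embedding, and its real match threshold is not public.

        def average_hash(pixels):
            """pixels: 2D list of grayscale values (e.g. a small thumbnail).
            Returns a bit string: 1 where a pixel is above the mean."""
            flat = [p for row in pixels for p in row]
            mean = sum(flat) / len(flat)
            return ''.join('1' if p > mean else '0' for p in flat)

        def hamming(a, b):
            """Number of differing bits between two equal-length bit strings."""
            return sum(x != y for x, y in zip(a, b))

        # Two slightly different "images": the second has every pixel nudged a little.
        img_a = [[10, 200, 30, 240], [15, 190, 25, 250],
                 [12, 210, 35, 230], [11, 205, 28, 245]]
        img_b = [[12, 198, 33, 238], [14, 192, 22, 252],
                 [13, 208, 38, 228], [10, 207, 26, 247]]

        MATCH_THRESHOLD = 2   # hypothetical value, chosen only for this demo
        d = hamming(average_hash(img_a), average_hash(img_b))
        print(d, 'match' if d <= MATCH_THRESHOLD else 'no match')

    Because matching is deliberately tolerant of small changes, anyone who learns the hash function can in principle construct an innocuous-looking image whose hash lands within the threshold of a listed hash, which is exactly the framing scenario described above.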

  • by bississippi on 8/5/21, 9:20 PM

    First they built a walled garden beautiful on the inside and excoriated competitors [1] for their lack of privacy. Now that the frogs have walked into the walled garden, they have started to boil the pot [2]. I don’t think the frogs will ever find out when to get out of the pot.

    [1] https://www.vox.com/the-goods/2019/6/4/18652228/apple-sign-i...

    [2] https://en.wikipedia.org/wiki/Boiling_frog

  • by alisonkisk on 8/6/21, 10:32 PM

    OP completely misunderstands the situation.

    > iOS and iPadOS will use new applications of cryptography to help limit the spread of CSAM online, while designing for user privacy. CSAM detection will help Apple provide valuable information to law enforcement on collections of CSAM in iCloud Photos.

    WhatsApp is not a hosting service.

  • by TroisM on 8/6/21, 1:40 AM

    at least they don't lie about their spying on your device anymore...
  • by websites2023 on 8/5/21, 8:50 PM

    Apple's battle is against Surveillance Capitalism, not against state-level surveillance. In fact, there is no publicly traded company that is against state-level surveillance. It's important not to confuse the two.

    Think of it this way: If you want to hide from companies, choose Apple. If you want to hide from the US Government, choose open source.

    But if your threat model really does include the US government or some other similarly capable adversary, you are well and truly fucked already. The state-level apparatus for spying on folks through metadata and traffic interception is now more than a decade old.

  • by cblconfederate on 8/5/21, 10:15 PM

    Makes you rally for NAMBLA
  • by thedream on 8/5/21, 10:34 PM

    The Cult Of The Apple hawks its slimy surveillance Snake Oil to a gluttonous throng of thralls.

    So where's the news?

  • by shmerl on 8/5/21, 9:24 PM

    Is anyone even using Apple if they care about privacy and security?
  • by aetherspawn on 8/6/21, 12:56 AM

    Yeah, sure. I’m happy to be downvoted to hell, but I know people who would have benefited greatly from this (perhaps had entirely different lives) if it had been implemented 10 years ago.

    Convince me that a strong step to ending CSA at the expense of a little privacy is a bad thing.

  • by new_realist on 8/5/21, 9:57 PM

    Moral panics are nothing new, and have now graduated into the digital age. The last big one I remember was passage of the DMCA in 1998; it was just absolutely guaranteed to kill the Internet! And as per usual, the Chicken Littles of the world were proven wrong. The sky will not fall in this case, either. Unfortunately civilization has produced such abundance and free time that outrage viruses like this one will always circulate. Humans need something to spend their energy on.
  • by edison112358 on 8/5/21, 8:50 PM

    “This means that when the features are rolled out, a version of the NCMEC CSAM database will be uploaded onto every single iPhone.”

    So every iPhone will now host the explicit images from the National Center for Missing & Exploited Children database.

  • by new_realist on 8/5/21, 9:15 PM

    Studies have shown that CCTV reduces crime (https://whatworks.college.police.uk/toolkit/Pages/Interventi...). I expect results here will be even better.

    This technology uses secret sharing to ensure a threshold of matching images is reached before any photos are flagged (a toy sketch of the threshold mechanism follows below). In this case, it's even more private than CCTV.

    Totalitarian regimes do not need some magic bit of technology to abuse citizens; that has been clear since the dawn of time. Those who are concerned about abuse would do well to direct their efforts towards maintenance of democratic systems: upholding societal, political, regulatory and legal checks and balances.

    Criminals are becoming better criminals by taking advantage of advances in technology right now, and, for better or worse, it's an arms race; society will simply not accept criminals gaining the upper hand.

    If not proven necessary, society is capable of reverting to prior standards (habeas corpus resumed after the Civil War, and parts of the Patriot Act have expired, for example).
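
    The threshold idea referred to above maps onto a standard k-of-n secret-sharing scheme. Below is a minimal Shamir-style sketch; it is not Apple's actual protocol, and the prime, the secret value, and the choice of k = 3 and n = 5 are arbitrary illustration values. With fewer than k shares the secret stays hidden; with k shares it reconstructs exactly.

        # Toy k-of-n threshold secret sharing (Shamir's scheme).
        # Requires Python 3.8+ for the modular inverse via pow(x, -1, p).
        import random

        PRIME = 2_147_483_647            # a Mersenne prime, fine as a demo field

        def make_shares(secret, k, n):
            """Split `secret` into n points on a random degree-(k-1) polynomial."""
            coeffs = [secret] + [random.randrange(PRIME) for _ in range(k - 1)]
            def poly(x):
                return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
            return [(x, poly(x)) for x in range(1, n + 1)]

        def reconstruct(shares):
            """Lagrange interpolation at x = 0 over the prime field."""
            secret = 0
            for i, (xi, yi) in enumerate(shares):
                num, den = 1, 1
                for j, (xj, _) in enumerate(shares):
                    if i != j:
                        num = num * -xj % PRIME
                        den = den * (xi - xj) % PRIME
                secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
            return secret

        shares = make_shares(secret=424242, k=3, n=5)
        print(reconstruct(shares[:2]) == 424242)   # almost certainly False: below threshold
        print(reconstruct(shares[:3]) == 424242)   # True: threshold reached

    The analogy is loose, and this is only a reading of the public description (in which each match contributes a cryptographic "safety voucher"), but the claimed property is the same: the server learns nothing about individual matches until a threshold number of them has accumulated.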