by bbatsell on 8/5/21, 8:20 PM with 824 comments
by threatofrain on 8/5/21, 8:42 PM
by trangus_1985 on 8/5/21, 8:30 PM
Fortunately, my email is on a paid provider (Fastmail), my photos are on a NAS, and I've worked hard to get all of my friends onto Signal. While I still use Google Maps, I've been trying out OSM alternatives for a while.
The things they've described are, in general, reasonable and probably good in the moral sense. However, I'm not sure that I support what they are implementing for child accounts (as a queer kid, I was terrified of my parents finding out). On the surface, it seems good - but I am concerned about other snooping features that this portends.
However, the iCloud Photos CSAM scanning also sets a horrifying precedent: the device I put my life into is scanning my photos and reporting on bad behavior (even if the initial dataset covers only the most reprehensible behavior).
I'm saddened by Apple's decision, and I hope they recant, because it's the only way I will continue to use their platform.
by triska on 8/5/21, 8:39 PM
I bought an iPhone because the CEO seemed to be sincere in his commitment to privacy.
What Apple has announced here seems to be a complete reversal of what I understood the CEO to be saying at the conference only a few years ago.
by c7DJTLrn on 8/5/21, 9:37 PM
I've had enough of the "think of the children" arguments.
by geraneum on 8/5/21, 10:21 PM
“Compromising the security of our personal information can ultimately put our personal safety at risk. That is why encryption has become so important to all of us.”
“… We have even put that data out of our own reach, because we believe the contents of your iPhone are none of our business.”
“The FBI may use different words to describe this tool, but make no mistake: Building a version of iOS that bypasses security in this way would undeniably create a backdoor. And while the government may argue that its use would be limited to this case, there is no way to guarantee such control.”
Tim Cook, 2016
by Shank on 8/5/21, 9:05 PM
1. PhotoDNA is already scanning content from Google Photos and a whole host of other service providers.
2. Apple is obviously under pressure to follow suit, but they developed an on-device system, recruited mathematicians to analyze it, and published the results, as well as one in-house proof and one independent proof showing the cryptographic integrity of the system.
3. Nobody, and I mean nobody, is going to successfully convince the general public that a tool designed to stop the spread of CSAM is a "bad thing" unless they can show concrete examples of the abuse.
For one and two: given the two options, would you rather that Apple implement serverside scanning, in the clear, or go with the on-device route? If we assume a law was passed to require serverside scanning (which could very well happen), what would that do to privacy?
For three: It's an extremely common trope to say that people do things to "save the children." Well, that's still true. Arguing against a CSAM scanning tool, which is technically more privacy preserving than alternatives from other cloud providers, is an extremely uphill battle. The biggest claim here is that the detection tool could be abused against people. And that very well may be possible! But the whole existence of NCMEC is predicated on stopping the active and real danger of child sex exploitation. We know with certainty this is a problem. Compared to a certainty of child sex abuse, the hypothetical risk from such a system is practically laughable to most people.
So, I think again, the backlash is daft. It's been about two days since the announcement became public (via leaks). The underlying mathematics behind the system has only just been published [0]. It looks like the EFF rushed to make a statement here, and in doing so, it doesn't look like they took the time to analyze the cryptographic system, to consider the attacks against it, or to consider possible motivations and outcomes. Maybe they did, and they had advance access to the material. But it doesn't look like it, and in the court of public opinion, optics are everything.
[0]: https://www.apple.com/child-safety/pdf/Alternative_Security_...
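To make the shape of such a scheme concrete, here is a deliberately simplified sketch of client-side matching with a reporting threshold. It uses a plain average hash via Pillow/NumPy as a stand-in, and the function names and threshold value are illustrative; Apple's actual design (NeuralHash, private set intersection, threshold secret sharing) additionally hides individual match results from the device itself.

    # Toy sketch of threshold-gated perceptual-hash matching (not Apple's NeuralHash/PSI).
    from PIL import Image
    import numpy as np

    def average_hash(path, size=8):
        # 64-bit average hash: downscale, grayscale, then threshold each pixel at the mean.
        img = Image.open(path).convert("L").resize((size, size))
        pixels = np.asarray(img, dtype=np.float32)
        bits = (pixels > pixels.mean()).flatten()
        return int("".join("1" if b else "0" for b in bits), 2)

    def scan_library(image_paths, known_hashes, threshold=30):
        # Count images whose hash appears in the known-bad set; act only past the threshold.
        # Real systems also tolerate small Hamming distances rather than requiring exact equality.
        matches = sum(1 for p in image_paths if average_hash(p) in known_hashes)
        return matches >= threshold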
by Wowfunhappy on 8/5/21, 11:36 PM
> If an account held by a child under 13 wishes to send an image that the on-device machine learning classifier determines is a sexually explicit image, a notification will pop up, telling the under-13 child that their parent will be notified of this content. [...] For users between the ages of 13 and 17, a similar warning notification will pop up, though without the parental notification.
Why is it different for children under 13, specifically? The 18-year cutoff makes sense, because turning 18 carries legal weight in the US (as decided via a democratic process), but 13?
13 is an age when many parents start granting their children more freedom, but that's very much rooted in one's individual culture—and the individual child. By giving parents fewer options for 13-year-olds, Apple—a private company—is pushing their views about parenting onto everyone else. I find that a little disturbing.
---
Note: I'm not (necessarily) arguing for greater restrictions on 13-year-olds. Privacy for children is a tricky thing, and I have mixed feelings about this whole scheme. What I know for sure, however, is that I don't feel comfortable with Apple being the one to decide "this thing we've declared an appropriate invasion of privacy for a 12-year-old is not appropriate for a 13-year-old."
by strogonoff on 8/5/21, 10:54 PM
- Apple sends Bob’s info to law enforcement; Bob is swatted or his life is destroyed in some other way. Worst, but most likely outcome.
- An Apple employee (or an outsourced contractor) reviews the photo, comparing it to the CSAM source image used for the hash. Only if the image matches according to human vision is Bob swatted. This requires there to be some sort of database of CSAM source images, which strikes me as unlikely.
- An Apple employee or a contractor reviews the image for abuse without comparing it to the CSAM source, using their own subjective judgement. Better, but it implies Apple employees could technically SWAT Apple users.
by farmerstan on 8/5/21, 11:34 PM
How do we know Apple or the FBI don't do this? If they want to search someone's phone, all they need to do is enter a hash of a photo they know is on the target's phone and voilà, instant access.
Also, how is this not a violation of the 14th Amendment? I know Apple isn't part of the government, but they are basically acting as a de facto agent of the police by scanning for crimes. Using child porn as a completely transparent excuse to start scanning all our material for anything they want makes me very angry.
by cwizou on 8/5/21, 10:27 PM
Can they trust a random government to give them a database of only CSAM hashes and not insert extra politically motivated content that they deem illegal?
Because once you've launched this feature in the "land of the free", other countries will require their own implementation for their own needs and demand (through local legislation, which Apple will have to abide by) control over said database.
And how long until they also scan browser history for the same purpose? Why stop at pictures? This is opening a very dangerous door that many here will be uncomfortable with.
Scanning on their own premises (which, as far as we know, they can already do) would be a much better choice; this is anything but privacy-forward, whatever the linked "paper" tries to say.
by lovelyviking on 8/6/21, 7:24 AM
- User: Are you out of your f... mind?
- Apple: It's for children protection.
- User: Ah, ok, no problem, please install spyware and do later whatever you wish and forget about any privacy, the very basis of rights, freedom and democracy.
This is, by the way, how Russia started filtering the web to suppress political opponents. All the necessary controls were put in place under the same slogan: "to protect children".
Yeah, right.
Are modern people so naive and dumb that they can't think two steps ahead? Is that why this is happening?
Edit: Those people would still need to explain how growing up into a society without privacy, freedom and democracy, under authoritarian practices, will make those children any 'safer'...
by hncurious on 8/5/21, 9:31 PM
https://www.vox.com/recode/2021/5/13/22435266/apple-employee...
https://www.vox.com/recode/22583549/apple-employees-petition...
Will they apply that energy and leverage to push back on this?
How else can this be stopped before it goes too far? Telling people to "Drop Apple" is even less effective than "Delete Facebook".
by iamleppert on 8/5/21, 9:55 PM
All one needs to do, in order to flag someone or get them caught up in this system, is to gain access to the list of hashes and construct a matching image. This data is likely to be sought after as soon as this system is implemented, and it will only be a matter of time before a data breach exposes it.
Once that is done, the original premise and security model of the system will be completely eroded.
That said, if this does get implemented I will be getting rid of all my Apple devices. I’ve already switched to Linux on my development laptops. The older I get, the less value Apple products have to me. So it won’t be a big deal for me to cut them out completely.
by haskaalo on 8/5/21, 11:32 PM
In your house, you might have private documents or do things you don't want other people to see, just like what we have on our phones nowadays.
The analogy I'm trying to make: if the government suddenly decided to install cameras in every house on the premise of making sure no pedophile is abusing a child, with the assurance that the cameras never send data unless locally run AI detects something, I believe that would shock everyone.
by skee_0x4459 on 8/5/21, 10:38 PM
If you really want to save the children, why not build the scanning into Safari? Scan the whole phone! Just scan it all. It's really no different from what they are doing. It's not like they would have to cross the Rubicon to do it; not anymore, anyway.
I also think it's interesting how kids will adjust to this. A lot of kids won't hear about it and will find themselves caught up in a child porn case.
I'm so proud of the responses people generally seem to have. It makes me feel confident about the future of the world.
Isn't there some device to encrypt and decrypt messages, a separate device that couples to your phone? Like one that fits into a case, with a keyboard interface built into a screen protector with indium oxide electrodes.
by Waterluvian on 8/5/21, 11:45 PM
This kind of stuff absolutely petrifies me because I’m so scared of getting accidentally scooped up for something completely unintentional. And I do not trust police one bit to behave like intelligent adult humans.
Right now I feel like I need to stop doing ANYTHING that goes anywhere outside the velvet ropes of the modern commercial internet. That is, anywhere that cannot pay to moderate everything well enough that I don’t run the risk of having my entire life ruined because some #%^*ing algorithm picks up on some content I didn’t even choose to download.
by roody15 on 8/5/21, 9:59 PM
NSO used exploits in iMessage to enable them to grab photos, texts among other things.
Now, shortly after Apple patched those security holes, we see them pivot to wanting to "work" with law enforcement. Hmm, almost like once access was closed, Apple needed a way to justify "opening" access to devices.
Yes, I realize this could be a stretch based on the available info. It just seems like an interesting coincidence… back door exposed and closed… now it's back open… almost like governments demanded access.
by shrimpx on 8/6/21, 2:01 AM
> Apple’s method of detecting known CSAM is designed with user privacy in mind. Instead of scanning images in the cloud, the system performs on-device matching [...]
It's incredible that Apple arrived at the conclusion that client-side scanning that you cannot prevent is more private than cloud-scanning.
Since they claim they're only scanning iCloud content, why not scan in the cloud?
They decided the most private way is to scan iCloud content before it's uploaded to the cloud... Because if they scanned in the cloud it would be seen as a breach of privacy and would be bad optics for a privacy-focused company? But scanning on the physical device that they have described as "personal" and "intimate" has better optics? That's amazing.
This decision can only be read as Apple paving the way to scanning all content on the device, to bypass the pesky "Backup to iCloud" options being turned off.
by nicetryguy on 8/5/21, 9:48 PM
by blintz on 8/6/21, 2:02 AM
A more recent example is how private set intersection became an easy way to get contact tracing tech everywhere while maintaining an often perfunctory notion of privacy.
I wonder where large companies will take this next. It behooves us cryptography/security people who actually care about not walking down this slippery slope to fight back with tech of our own.
This whole thing also somewhat parallels the earlier use of better symmetric encryption and enclave technologies for DRM and copyright protection.
by endisneigh on 8/5/21, 8:50 PM
As far as this is concerned, seems like if you don’t use iMessage or iCloud you’re safe for now.
by avnigo on 8/6/21, 8:36 AM
Chilling. Why have human reviewers, unless false positives are bound to happen (a virtual certainty given the aggregate volume of photos to be scanned)?
So, in effect, Apple has hired human reviewers to police the photos that an algorithm has flagged. Whether you knowingly consent or not (through some fine print), you are being subjected to a search without probable cause.
This is not the future I was looking forward to.
by DaveSchmindel on 8/5/21, 11:40 PM
(2) I'm equally intrigued by the paradox that, in order for the algorithms that perform CSAM detection to work, they must rely on some data set that represents these reprehensible images (which are illegal to possess).
by falcolas on 8/5/21, 10:27 PM
Not that you care, but this is the straw that's broken this camel's back. It's too ripe for abuse, it's too invasive, and I don't want it.
You've used one of the Four Horsemen of the Infocalypse perfectly… and so I'm perfectly happy to leave your ecosystem.
Cheers.
by outworlder on 8/5/21, 9:28 PM
Well, yes? Parents are already legally responsible for their young children, who are under their supervision. The alternative would be to not even give such young children these kinds of devices to begin with - which might actually be preferable.
> this system will give parents who do not have the best interests of their children in mind one more way to monitor and control them
True. But the ability to send or receive explicit images would most likely not be the biggest issue they would be facing.
I understand the slippery slope argument the EFF is making, but they should stick to the government angle. Giving governments the ability to deploy specific machine learning classifiers is not a good thing.
by arihant on 8/5/21, 10:34 PM
What are some other fully encrypted photo options out there?
by young_unixer on 8/6/21, 1:54 AM
But now I'm reminded of how fucking awful and hostile Apple and other companies can be. I'm once again 100% convinced that free software is the only way to go, even if I have to endure using software with ugly UIs and bad UX. It will be worth it just not to have to use software written by these assholes.
Stallman was right.
by babesh on 8/5/21, 9:11 PM
This also means that it is shielded from attack by the power structure. That is the bargain that the tech industry has struck.
The agenda is always towards increasing power for the power structure. One form of power is information. That means that Apple is inexorably drawn towards increasing surveillance. Also, Apple’s massive customer base both domestic and overseas is a juicy surveillance target.
by panny on 8/5/21, 10:50 PM
Then this news broke. Apple, you just lost several thousand dollars in sales from me. I had items in cart and was pricing everything out when I found this news. I will spend my money elsewhere. This is a horrendous blunder. I will not volunteer myself up to police states by using your gear now or ever again in the future. I've even inquired about returning the work laptop in exchange for a Dell.
Unsafe at any speed. Stallman was right. etc etc etc.
by stakkur on 8/6/21, 12:22 AM
by imranhou on 8/5/21, 10:51 PM
When saying no to ideas like this, we should at the same time try to share our thoughts on what an acceptable alternative solution would be.
by dsign on 8/6/21, 10:11 AM
1. They expect most people will shrug and let themselves be scanned. That is, this privacy invasion will result in minimal damage to the Apple brand, or
2. They know privacy-savvy people will from now on put them in the same league as Android, and they are prepared to take the financial loss.
Scenario 1 is the most plausible, though it hints at an impish lack of consideration for their customers.
Scenario 2 worries me most. No smart company does something financially counterproductive unless under dire pressure. What could possibly make Apple shoot itself in the foot and announce it publicly? In other words, Apple's actions, from my perspective, look like a dead canary.
by hungryforcodes on 8/6/21, 5:05 AM
https://techcrunch.com/2021/04/28/apple-record-china-2021/
Apple's iPhone revenue in China just doubled from last year -- now $17 billion. That's not a small number. The play against Huawei has done its job, apparently -- Huawei is quite mortally wounded.
For sure the CCP would love to scan everyone's phones for files or images it finds troubling, and for sure every country will eventually be allowed its own entries in this database, or even its own custom DB.
So my cynical side says... Apple just sold out. MASSIVELY. The losers -- pretty much everyone that buys their phones.
by tomxor on 8/5/21, 8:50 PM
But then I have to remind myself, the old Apple is long gone, the new Apple is a completely different beast, with a very different concept of what it is marketing.
by Sunspark on 8/5/21, 10:52 PM
Teens are not stupid. They'll eventually clue in that big brother is watching and won't appreciate it. They'll start by using other messengers instead of iMessage and then eventually leave the ecosystem for Android or whatever else comes down the pike in the future.
by walterbell on 8/6/21, 12:04 AM
This would be super useful for iPhone security, e.g. incoming files could be scanned for attempts to use (closed) exploits, so the user can easily associate a malicious media file with the message sender or origin app/site.
On jailbroken devices (e.g. iPhone 7 and earlier with unpatchable boot ROMs), is there a Metasploit equivalent for iOS, which aggregates PoCs for public exploits?
A related question: will PhotoDNA hashing take place continuously or in batch, e.g. overnight? How will associated Battery/Power usage be accounted, e.g. attributed to generic "System" components or itemized separately? If the former, does that create class-action legal exposure for a post-sale change in device "fitness for purpose"?
by tlogan on 8/5/21, 11:11 PM
And I bet that Saudis and other oppressive regimes will use this to detect other “crimes”.
by strictnein on 8/5/21, 9:11 PM
> "Apple is planning to build a backdoor into its data storage system and its messaging system"
by gcanyon on 8/6/21, 12:08 AM
So I'll propose an alternative theory: Apple is doing this not to actually catch any child pornographers, but to ensure that any CP won't actually reach their servers. Less public good, more self-serving.
by mcone on 8/5/21, 9:09 PM
by Animats on 8/6/21, 12:03 AM
by j1elo on 8/5/21, 11:21 PM
The only practical barrier here is that their parents have educated them and their mental model arrives on its own at "no, this is a very bad idea" instead of "yes, I want to send this pic". Anything else, including petty prohibitions from their parents, will not be a decision factor in most cases. Have we forgotten what it was like to be a teenager?
(I mean people, both underage and criminals, will just learn to avoid apple and use other channels)
by nick_naumov on 8/5/21, 11:23 PM
by NazakiAid on 8/5/21, 10:50 PM
by dalbasal on 8/6/21, 12:21 PM
Good question. Companies have to follow laws. The naive, early 2000s notion that the internet was unstoppable and ungovernable was mistaken. Apple, Google and the other internet bottlenecks were, it turned out, the pathway to a governable internet. That fight is lost.
Now that it's governable, attention needs to be on those governing... governments, parliaments, etc.
The old version of freedom of speech and such didn't come from the divine. They were created and codified and now we have them. We need to do that again. Declare new, big, hairy freedoms that come with a cost that we have agreed to pay.
There are dichotomies here, and if we deal with them one droplet at a time, they'll be compromised away. "Keep your private messages private" and "Prevent child pornography and terrorism in private messages" are incompatible. But, no one is going to admit that they are choosing between them... not unless there's an absolut-ish principle to defer to.
Once you're scanning email for ad targeting, it's hard to justify not scanning it for child abuse.
by n_io on 8/6/21, 12:39 AM
In place of Mail: Tutanota
In place of iMessage: Signal
And so on…
by chinchilla2020 on 8/6/21, 7:44 PM
If I were a conspiracy type, I would assume this is more likely to be Apple responding to an NSA request to decrypt data.
This idea will be gradually expanded:
1. To detect child abuse (unsuccessfully)
2. To detect terrorism (also unsuccessfully)
3. To detect criminal activity (successful only against low-level criminals)
4. To detect radical political views as defined by Apple corporation
5. To detect human behaviors that are not supported by Apple's corporate vision
by hamburgerwah on 8/5/21, 11:08 PM
by robertwt7 on 8/5/21, 11:05 PM
Apparently I was wrong; I loved Apple products and the ecosystem. Not sure what to switch to after this :/
by joering2 on 8/6/21, 1:28 AM
Question - if most people literally don't want to have anything to do with CP, isn't uploading a hash database of that material to their phones precisely that?
For once I think I will feel disgusted walking around with my phone in a pocket; a phone that is full of hashes of child porn. That's a terrible feeling.
by djanogo on 8/5/21, 11:04 PM
by kntoukakis on 8/6/21, 5:58 AM
“The threshold is set to provide an extremely high level of accuracy and ensures less than a one in one trillion chance per year of incorrectly flagging a given account.”
How did they calculate this? Also, I can imagine more than a trillion photos being uploaded to iCloud a year.
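Apple hasn't published the derivation, but the per-account (rather than per-photo) framing is what makes a number like that possible: with a per-image false-match probability p and a reporting threshold of t matches, the chance that an account full of innocent photos crosses the threshold falls off combinatorially. A rough back-of-the-envelope sketch, using purely illustrative numbers rather than Apple's parameters:

    # Back-of-the-envelope: P[account flagged] = P[Binomial(n, p) >= t].
    # n, p and t below are assumptions for illustration, not Apple's published figures.
    from math import comb

    def p_account_flagged(n_photos, p_false_match, threshold, terms=50):
        # Sum the first `terms` tail terms of the binomial; later terms are negligible for small p.
        return sum(
            comb(n_photos, k) * p_false_match**k * (1 - p_false_match)**(n_photos - k)
            for k in range(threshold, min(threshold + terms, n_photos) + 1)
        )

    # 10,000 photos/year, a 1-in-a-million per-image false match, threshold of 10 matches:
    print(p_account_flagged(10_000, 1e-6, 10))  # ~2.7e-27, far below one in a trillion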
by iamnotwhoiam on 8/6/21, 2:19 AM
by scratchmyhead on 8/6/21, 6:59 PM
I'm missing how this will actually work if perpetrators know beforehand that Apple is going to analyze their data. Could someone explain?
by neilv on 8/5/21, 11:52 PM
Separate from kids, I wonder whether Apple is also shooting itself in the foot with teens.
Teens start caring about privacy around that age, are very peer/fashion-sensitive, and have shown that they'll readily abandon platforms. Many parents/teachers/others still want to treat teens as children under their power, but teens have significant OPSEC motivation and ability.
Personally, I'd love to see genuinely good privacy&security products rushing to serve the huge market of a newly clueful late-teen generation. The cluefulness seems like it would be good for society, and market forces mean the rest of us then might also be able to buy products that aren't ridiculously insecure and invasive.
by didibus on 8/6/21, 12:03 AM
It seems this can then be a security risk, since Apple could be breached and they'd have the means to decrypt things server-side.
If it were simply that client-side end-to-end encryption can be turned on/off based on whether the account is a child account (or as a configuration for parental control), that'd be different.
As just a config, I mean, the slippery slope always existed: Apple could always be forced into changing the settings of what gets end-to-end encrypted and when.
But if this means that all photos are sent unencrypted to Apple at some point, or sent to Apple in a way they can decrypt, then it does open the door to your photos not being securely stored and attackers being able to steal them. That seems a bit of an issue.
by voidmain on 8/6/21, 12:50 AM
by viktorcode on 8/6/21, 4:46 AM
I understand that it doesn't scan everything, but it doesn't matter. What matters is that there's an implemented technical capability to run scans against an external fingerprint database. It's a tool which may be used for many purposes.
I hope some countries will prohibit Apple doing that. Germany with its strict anti-snooping laws comes to mind. Maybe Japan. The more, the better.
Oh, and by the way, every tech-savvy sex predator now knows what they should avoid doing. As always with mass privacy invasions: criminals are the last to suffer from it.
by dep_b on 8/6/21, 8:25 AM
Yet....dumb enough to upload it unencrypted to iCloud instead of storing it in a strongly encrypted folder on their PC?
The two circles in this diagram have a very thin overlap I think.
Dumb move by Apple; privacy is either 100% private or not private at all.
Unless somebody can enlighten me that like 23% of all investigated pedophiles that had an iPhone seized had unencrypted CP on their iCloud accounts? I am willing to be proven wrong here.
by sadness3 on 8/6/21, 4:49 AM
I'm looking into privacy phones for the first time and will be switching.
by mccorrinall on 8/5/21, 10:37 PM
by Grustaf on 8/6/21, 6:54 AM
_Hashes_ of photos will be scanned for _known_ abusive material, client side.
So the only thing Apple can find out about you is if you have some of these known and catalogued images. They will definitely not know if you have other nude photos, including of your children.
The other, separate feature is a parental control feature. You as a parent can be told if your children send or receive nude photos. This obviously sacrifices some of their privacy, but that is what parenting is. It's not more intrusive than Screen Time, or any number of things you might do as a parent to make sure your children are safe.
by bogomipz on 8/6/21, 2:00 PM
>"The (unauditable) database of processed CSAM images will be distributed in the operating system (OS), the processed images transformed so that users cannot see what the image is, and matching done on those transformed images using private set intersection where the device will not know whether a match has been found"
Am I reading this correctly in that Apple will essentially be pushing out contraband images to users' phones? Couldn't the existence of these images on a user's phone have consequences and potentially be used against an unwitting iPhone user?
by Calvin02 on 8/5/21, 8:49 PM
Apple, very astutely, understands that difference and exploited the latter to differentiate its phones from its main competitor: cheap(er) android phones.
Apple didn’t want the phones to be commoditized, like personal computers before it. And “privacy” is something that you can’t commoditize. Once you own that association, it is hard to fight against it.
Apple also understands that the general public will support its anti child exploitation and the public will not see this as a violation of privacy.
by Clubber on 8/6/21, 2:38 AM
by RedComet on 8/6/21, 7:48 AM
* knock knock * "we received an anonymous report that you have hate speech / an illegal meme on your phone, please come with us"
by m3kw9 on 8/5/21, 10:03 PM
by tango-unchained on 8/6/21, 12:12 AM
Does anybody have recommendations on what to do to help oppose this instead of just feeling helpless?
by etempleton on 8/5/21, 10:06 PM
The question will be if Apple will bend to requests to leverage this for other reasons less noble than the protection of children. Apple has a lot of power to say no right now, but they might not always have that power in the future.
by stereoradonc on 8/6/21, 12:55 AM
by contingencies on 8/5/21, 11:25 PM
by fetzu on 8/6/21, 7:09 AM
It just seems very paradoxical to be using a cloud-based photo and/or unencrypted backup service and then worry about one's privacy being at risk.
by superkuh on 8/5/21, 11:46 PM
by Guthur on 8/6/21, 10:41 AM
You just have to say "for the greater good" and you can get away with anything. Over the last year and a half, so many have been desensitised to overbearing collectivism that at this stage I think governments and their Big Corp lackeys could get away with just about anything now.
by citizenpaul on 8/6/21, 4:34 PM
by akouri on 8/6/21, 3:07 AM
This is the downside to upgrading your iOS version. Once you update, it's not like you can go back, either. You're stuck with a slower, more power-hungry phone for the life of the phone.
by egotripper on 8/6/21, 7:47 AM
I expect that any time you take a photo, the scan will be performed right away, and the results file will be waiting to be sent the next time you enable voice and data.
This capability crushes the trustworthiness of the devices.
by _robbywashere on 8/6/21, 12:44 AM
by _carl_j_b_223 on 8/6/21, 5:14 AM
by citboin on 8/6/21, 3:16 AM
by everyone on 8/5/21, 10:11 PM
by dukeofdoom on 8/5/21, 9:15 PM
by rotbart on 8/5/21, 11:01 PM
by roamerz on 8/6/21, 4:33 AM
by lenkite on 8/6/21, 7:47 AM
I mean they use the privacy argument to avoid side-loading apps, lol. But scanning your photos is OK.
What absolute hypocrisy.
by miika on 8/8/21, 8:29 AM
by kevin_thibedeau on 8/5/21, 9:57 PM
by klempotres on 8/5/21, 9:36 PM
Is there anyone who's familiar with the technology so they can explain how it works?
by xbmcuser on 8/6/21, 5:25 AM
by gowld on 8/6/21, 12:12 AM
If you don't trust the rule of law, Apple can't fix that for you.
by jason2323 on 8/6/21, 2:00 AM
by suizi on 8/5/21, 11:10 PM
by suizi on 8/6/21, 1:20 AM
by mactavish88 on 8/6/21, 11:09 AM
by fortran77 on 8/6/21, 3:12 AM
by XorNot on 8/5/21, 11:08 PM
by rStar on 8/6/21, 8:20 AM
by FpUser on 8/5/21, 9:25 PM
by slaymaker1907 on 8/5/21, 9:46 PM
by xbar on 8/6/21, 5:08 AM
by hmwhy on 8/6/21, 7:23 AM
For context, see https://news.ycombinator.com/item?id=20620102
by barrkel on 8/6/21, 5:47 AM
by thysultan on 8/6/21, 5:07 AM
by jra_samba on 8/6/21, 1:51 AM
Apple has altered the deal. Pray they do not alter it any further.
Now you have to live with the consequences of convenience.
by Spooky23 on 8/5/21, 9:21 PM
“ When Apple releases these “client-side scanning” functionalities, users of iCloud Photos, child users of iMessage, and anyone who talks to a minor through iMessage will have to carefully consider their privacy and security priorities in light of the changes, and possibly be unable to safely use what until this development is one of the preeminent encrypted messengers.”
People sending messages to minors that trigger a hash match have more fundamental things to consider, as they are sending known photos of child exploitation to a minor.
The EFF writer knows this, as they describe the feature in the article. They should be ashamed of publishing this crap.
by mrwww on 8/6/21, 12:17 PM
by mulmen on 8/6/21, 12:58 AM
by christkv on 8/6/21, 4:08 PM
by alfiedotwtf on 8/6/21, 1:41 AM
by villgax on 8/6/21, 2:08 AM
by beebeepka on 8/6/21, 7:30 AM
by andrewmcwatters on 8/5/21, 10:40 PM
by tw600040 on 8/6/21, 1:18 AM
by anupamchugh on 8/6/21, 6:48 AM
by swiley on 8/5/21, 10:40 PM
by michalu on 8/7/21, 12:05 PM
by temeritatis on 8/5/21, 9:41 PM
by alana314 on 8/6/21, 1:34 AM
by jimt1234 on 8/5/21, 11:16 PM
by shadowhack on 8/8/21, 2:29 PM
by volta83 on 8/6/21, 8:28 AM
I’d rather not have that on my phone.
by xyst on 8/5/21, 11:24 PM
by wellthisisgreat on 8/5/21, 9:16 PM
Any kind of machine-based contextual analysis of users' content will be a disaster.
by Drblessing on 8/5/21, 10:07 PM
by 14 on 8/6/21, 2:56 AM
by throw7 on 8/5/21, 11:10 PM
I don't use apple products, but if I found out google was scanning my photos on photos.google.com on behalf of the government I would drop them. I'm not saying it wouldn't hurt, because it definitely would, but in a capitalistic country this is the only way to fight back.
by RightTail on 8/5/21, 9:34 PM
searching for CP is the original pretext
by unstatusthequo on 8/5/21, 10:14 PM
by nullc on 8/6/21, 6:08 AM
As such, it should NEVER do anything that isn't in your best interest-- to the greatest extent possible under the law. Your relationship with your personal computer is closer and more trusted than your relationship with your doctor or lawyer-- in fact, you often communicate with these parties via your computer.
We respect the confidentiality you enjoy with your professional agents, but that confidentiality cannot functionally exist if your computing devices are not equally duty bound to act in their users' best interest!
This snitching 'feature' is a fairly general purpose tracing/tracking mechanism-- We are to assume that the perceptual hashes are exclusively of unlawful images (though I can't actually find a firm, binding assertion of that!)-- but there is nothing assuring that to us except for blind trust.
Even if the list today exclusively has unlawful images there is no guarantee that tomorrow it won't have something different-- no guarantee that some hysterical political expediency won't put images associated with your (non-)religion or ethnicity into it, no guarantee that the facility serving these lists won't be hacked or abused by insiders. Considering that possession of child porn is a strict liability crime, Apple has presumably not validated the content of the list themselves, and you certainly won't be allowed to check it. Moreover, even if there were some independent vetting of the list content, there is nothing that would prevent targeted parties from being given a different, unvetted list without their knowledge.
The pervasive scanning can also be expected to dramatically increase the effectiveness of framing. It's kind of cliche that the guilty person often claims "I was framed"-- but part of the reason that framing is rare is because the false evidence has to intersect a credibly motivated investigation, and it seldom does except where there are other indicators of guilt. With automated scanning it would be much more reliable to cause someone a world of trouble by slipping some indicated material onto their device, and so framing would have a much better cost/benefit trade-off.
Any one of the above flaws is sufficiently fatal on its own-- but add to them the potential for inadvertent false positives both in the hash matching and in the construction of the lists. Worse, it'll probably be argued that the detailed operation of the system must be kept secret from the very users whose systems it runs on, specifically because knowledge of the operation would greatly simplify the malicious construction of intentional false positives, which could be used for harassment by causing spurious investigations.
In my view Apple's actions here aren't just inappropriate, they're unambiguously unethical and in a more thoughtful world they'd be a violation of the law.
by bississippi on 8/5/21, 9:20 PM
[1] https://www.vox.com/the-goods/2019/6/4/18652228/apple-sign-i...
by alisonkisk on 8/6/21, 10:32 PM
> iOS and iPadOS will use new applications of cryptography to help limit the spread of CSAM online, while designing for user privacy. CSAM detection will help Apple provide valuable information to law enforcement on collections of CSAM in iCloud Photos.
WhatsApp is not a hosting service.
by TroisM on 8/6/21, 1:40 AM
by websites2023 on 8/5/21, 8:50 PM
Think of it this way: If you want to hide from companies, choose Apple. If you want to hide from the US Government, choose open source.
But if your threat model really does include the US government or some other similarly capable adversary, you are well and truly fucked already. The state-level apparatus for spying on folks through metadata and traffic interception is now more than a decade old.
by cblconfederate on 8/5/21, 10:15 PM
by thedream on 8/5/21, 10:34 PM
So where's the news?
by shmerl on 8/5/21, 9:24 PM
by aetherspawn on 8/6/21, 12:56 AM
Convince me that a strong step toward ending CSA at the expense of a little privacy is a bad thing.
by new_realist on 8/5/21, 9:54 PM
by new_realist on 8/5/21, 9:57 PM
by edison112358 on 8/5/21, 8:50 PM
So every iPhone will now host the explicit images from the National Center for Missing & Exploited Children database.
by new_realist on 8/5/21, 9:15 PM
This technology uses secret sharing to ensure a threshold of matching images is met before photos are flagged. In this case, it's even more private than CCTV.
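For readers unfamiliar with the threshold idea, a toy Shamir secret-sharing sketch (purely didactic; not Apple's actual construction or parameters) shows why individual matches reveal nothing on their own and only a full threshold of shares unlocks anything:

    # Toy Shamir threshold secret sharing over a prime field.
    import random

    PRIME = 2**127 - 1  # a Mersenne prime, large enough for a demo

    def make_shares(secret, threshold, num_shares):
        # Points on a random polynomial of degree threshold-1 whose constant term is the secret.
        coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
        f = lambda x: sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
        return [(x, f(x)) for x in range(1, num_shares + 1)]

    def reconstruct(shares):
        # Lagrange interpolation at x = 0 recovers the constant term (the secret).
        secret = 0
        for i, (xi, yi) in enumerate(shares):
            num, den = 1, 1
            for j, (xj, _) in enumerate(shares):
                if i != j:
                    num = (num * -xj) % PRIME
                    den = (den * (xi - xj)) % PRIME
            secret = (secret + yi * num * pow(den, PRIME - 2, PRIME)) % PRIME
        return secret

    shares = make_shares(123456789, threshold=3, num_shares=5)
    assert reconstruct(shares[:3]) == 123456789   # threshold met: secret recovered
    assert reconstruct(shares[:2]) != 123456789   # below threshold: wrong value (with overwhelming probability)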
Totalitarian regimes do not need some magic bit of technology to abuse citizens; that's been clear since the dawn of time. Those who are concerned about abuse would do well to direct their efforts towards maintenance of democratic systems: upholding societal, political, regulatory and legal checks and balances.
Criminals are becoming better criminals by taking advantage of advancements in technology right now, and, for better or worse, it's an arms race and society will simply not accept criminals gaining the upper hand.
If not proven necessary, society is capable of reverting to prior standards (Habeas Corpus resumed after the Civil War, and parts of the Patriot Act have expired, for example).