by monort on 8/23/22, 9:15 AM with 396 comments
by paganel on 8/23/22, 12:50 PM
> The EU wants to oblige providers to search all private chats, messages, and emails automatically for suspicious content – generally and indiscriminately. The stated aim: To prosecute child pornography. [1]
Yeah, that will go down well, a central government checking our private conversations for "suspicious content". Of course they would use the "think of the children" trope, they could also have gone with the "think of the bad terrorists" trope, but that would have been too American, too cowboy-ish, we need to feel special, we're Europeans, after all.
Minus some street protests I don't think we can actually stop this, and even then I have my very big doubts. It so happens that I live on the EU periphery (I still need to present my ID card if I want to travel to Budapest or west from there), and it sickens me to see that my privacy depends on countries and electorates over which I have no say (like Germany, with all due respect to the Germans who still care about their privacy). Why should my privacy be made fun of because of decisions taken by people from halfway across the continent, with whom I have no direct connection and no shared past? Did they have a Securitate-like thing? Many of them didn't, and even for those that did (like the same Germans), it looks like it doesn't matter at this point; they're all too happy to see their private political conversations scrutinised 24/7.
F. that, the only viable solution I see for my country is an exit from the EU, but the money (still) coming in from Bruxelles is too good to leave aside for pesky political principles, so of course no serious politician from around these parts frames the problem that way.
[1] https://www.patrick-breyer.de/en/posts/messaging-and-chat-co...
by Tainnor on 8/23/22, 12:22 PM
This was different under the previous government, where the "law & order" mentality was much more entrenched and which did nothing to prevent e.g. upload filters (despite promising to do so).
So I try to maintain some hope that at least Germany as a member state could tank this awful bill.
edit:
Here's the list of 61 questions that the German government sent to the EU concerning the bill (at the end of the article, in English): https://netzpolitik.org/2022/chatkontrolle-bundesregierung-l...
From a cursory reading, it reads to me like the diplomatic equivalent of "what you're proposing doesn't make any sense".
by vaylian on 8/23/22, 11:11 AM
Saying that this is about child protection is a blatant lie. This serves only as a stepping stone to introduce other screening criteria later. And with opaque ML models it will be very tedious to determine what the model is supposed to find.
by chucklenorris on 8/23/22, 11:28 AM
How big is the risk of a child being groomed through these electronic means? Is it comparable to being struck by lightning? Which is worse, weighted by probability: being sexually assaulted as a child, or being wrongly suspected and having your life turned upside down for years by these algorithms? We already see such things happen with relatively minor stakes, like having your Google account closed by an algorithmic mishap.
How was this 10% false-positive figure determined? Is it only an expectation of false positives, or an actual statistic? What does 10% mean in the context of mass surveillance?
It might well be that millions of children are groomed and assaulted every year through chats. I don't have the data, so I cannot say. I was under the impression that most sexual assault cases happen within the family, not with strangers.
What's worrying though is that these decisions are taken behind closed doors without any oversight, on the hope that they might save a child and possibly putting our lives in the hands of algorithmic justice.
by hnhg on 8/23/22, 10:21 AM
by sparsely on 8/23/22, 9:50 AM
That 10% is the share of flagged images which are actually OK. Whether this represents a large fraction of all legal content depends on how much illegal content there is. It would be better if they quoted the false-positive rate and false-negative rate as fractions of legal and illegal images respectively.
e.g. if 1/100,000,000 legal images are flagged incorrectly, and 100% of illegal images are flagged correctly, then a corpus of 100,000,000 legal images + 9 illegal images would result in the stats in the headline. That seems like a pretty good system (ignoring any principled objections to the scanning in the first place).
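The arithmetic in that example can be checked with a short sketch. The corpus sizes and rates here are the hypothetical numbers from the comment above, not real data about any deployed system:

```python
# Hypothetical scenario from the comment: 100,000,000 legal images
# with a 1-in-100,000,000 false-positive rate, plus 9 illegal images,
# all of which are flagged correctly.
legal_images = 100_000_000
illegal_images = 9
false_positive_rate = 1 / 100_000_000  # fraction of legal images wrongly flagged
true_positive_rate = 1.0               # fraction of illegal images correctly flagged

false_positives = legal_images * false_positive_rate   # 1.0 wrongly flagged
true_positives = illegal_images * true_positive_rate   # 9.0 correctly flagged
total_flagged = false_positives + true_positives       # 10.0 flags in total

# Share of all flags that point at legal content: the "10%" headline figure.
wrong_flag_share = false_positives / total_flagged
print(f"{wrong_flag_share:.0%} of flagged images are legal")  # prints "10% of flagged images are legal"
```

The point of the sketch is that a headline like "10% of flags are wrong" says nothing by itself: the same 10% is consistent with an extremely accurate classifier, because the base rate of illegal content is so tiny that even a one-in-a-hundred-million error rate on legal images contributes a comparable number of flags.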
by altacc on 8/23/22, 9:46 AM
I also didn't find anything in there about expectations for reducing numbers of false negatives (where automation fails to flag suspicious activity). Content control is basically just PR if it ignores the majority of activity it is designed to police.
by seper8 on 8/23/22, 9:45 AM
by DangerousPie on 8/23/22, 10:20 AM
Whether that makes this whole plan a good idea or not is obviously a very different question, but I think it's important to be clear about what this number actually means.
by uvesten on 8/23/22, 10:25 AM
by isaacfrond on 8/23/22, 9:51 AM
by Neil44 on 8/23/22, 9:44 AM
by tomekn on 8/23/22, 10:21 AM
From the perspective of tech companies, they are being put between a rock and a hard place by simultaneously being asked for more privacy, and also less privacy.
by dylkil on 8/23/22, 10:00 AM
by shakaijin on 8/23/22, 12:04 PM
by jernejzen on 8/23/22, 12:56 PM
by mnd999 on 8/23/22, 10:00 AM
by throwaway22032 on 8/23/22, 9:55 AM
Is Signal subject to this? Telegram? Do we need something "less mainstream"?
by charlieyu1 on 8/23/22, 9:57 AM
by diziet on 8/23/22, 10:02 AM
by DubiousPusher on 8/23/22, 2:06 PM
If the answer is yes, then regardless of the accuracy of the system or the mass nature of the communication network this is objectionable law making.
IMO, it is a fundamental human right to communicate privately.
The only real question is what the responsibility of a third party is. If I give a shipper illicit material, are they responsible for inspecting it and reporting it? I'm personally unaware of the law regarding this, but I assume your shipper is not required by law to open every package it ships and report on it. Are they required to open a percentage?
If not, then what the state is claiming here is a right of convenience. It happens that digital communication is easier to inspect than crates, and so the state creates an expectation of one third party that it does not of another.
by seper8 on 8/23/22, 9:48 AM
Let's hope it's a typical EU project and will take at least a decade to complete, or better, let's just hope it will outright fail.
by Am4TIfIsER0ppos on 8/23/22, 12:12 PM
by choeger on 8/23/22, 12:25 PM
So they're fine with reading 10% of all messages? Probably more, because of context? Besides this obviously being a massive DDoS on whatever dystopian spy centre of sanitary thoughts they want to build, I wonder how the big EU honchos get their free pass on that? Or did they simply not consider that they're going to get monitored as well?
by lakomen on 8/23/22, 2:36 PM
Also I thought chat control was off the table? Did that change?
by therealmarv on 8/23/22, 2:15 PM
Not even when driving a car am I tracked all the time. And I think driving a car can also be dangerous.
by perryizgr8 on 8/23/22, 10:29 AM
by pelasaco on 8/23/22, 10:26 AM
Probably later they'll extend that to "combating right-wing opinion", because nobody is against "combating right-wing opinion", or even to "combating climate change", because "who is against that?", right?
Sounds like a lot of "paranoid people" were just right, I guess?
by idrathernot on 8/23/22, 11:47 AM
by erdos4d on 8/23/22, 1:16 PM
by hammyhavoc on 8/23/22, 1:28 PM
by ho_schi on 8/23/22, 9:44 AM
Okay.
Encryption is not only important to protect private communication, but would also help perpetrators/criminals
No.
It is there to protect us from perpetrators, criminals, and all the people who think they are on the good side. The road to hell is paved with good intentions. Authoritarian regimes on our planet have always thought they were the "good guys". Encryption is actually there to protect us from you!
The mothers and fathers of the German Grundgesetz (~ constitution) learned that the hard way.
by GuB-42 on 8/23/22, 10:36 AM
I don't have the numbers, but I think that during an investigation way more than 10% of suspects did nothing wrong. In fact, some people estimate that 10% of convictions are wrongful (though I think that's an overestimate). A 90% accurate system may actually end up preventing false arrests, search warrants, etc. A win for privacy!
The real concern is the potential for abuse, not the 90% bar, which I think is completely reasonable.
by gloppers on 8/23/22, 10:06 AM
This regulation is for the purposes of criminal investigation into serious harms against children, not for spying on whatever innocuous messages you happen to be sharing with friends and family. The privacy fears are way overblown for us ordinary people.
Paedophiles, on the other hand, do not deserve privacy. They need to be scrutinised their entire lives to keep children - the targets of their vile depravity - from harm.
I support this regulation; every parent should.