from Hacker News

EU plan to scan private messages for child abuse meets fresh scandal

by daenney on 10/24/23, 4:19 PM with 112 comments

  • by bloopernova on 10/24/23, 5:12 PM

    Astonishing. The EU was caught using targeted ads on Twitter to drum up support for a very intrusive law.

    Smells really bad, but I'm too cynical to believe that anyone responsible will face consequences.

  • by ChrisMarshallNY on 10/24/23, 4:43 PM

    Sort of reminds me of the famous "Opinion Polls" segment of Yes, Prime Minister:

    https://www.youtube.com/watch?v=ahgjEjJkZks

  • by shrubble on 10/24/23, 5:46 PM

    I am sure they would never exempt themselves from such monitoring, correct? Given the Dutroux scandal, which revealed some rather shady behavior on the part of multiple people in positions of power, it seems likely there would be rules for the little people but not for others.

  • by nixass on 10/24/23, 4:35 PM

  • by fsflover on 10/24/23, 4:34 PM

    Related discussion: https://news.ycombinator.com/item?id=37958473

    EU Commissioner as double agent of foreign interference (patrick-breyer.de)

  • by fwungy on 10/24/23, 5:14 PM

    "It's for the children"

    Big red flag.

  • by pembrook on 10/24/23, 6:45 PM

    If Europeans are going to voluntarily allow their government to continuously scan all of their private data 24/7, can the rest of the world stop wasting time caring about GDPR please?

    Moving every cloud service to be EU-hosted is going to look really stupid once it means you’ve just centralized all the world's data under a 24/7 foreign surveillance & espionage dragnet.

    If I had to guess, I’d say the amount of money spent globally on lawyers re: GDPR must exceed the gross national product of Greece. For it to all backfire in this way would be quite humorous.

  • by eigenket on 10/24/23, 10:08 PM

    This is not really a "fresh" scandal; we had a Hacker News discussion about exactly this 10 days ago.

  • by matheusmoreira on 10/24/23, 7:19 PM

    Just cut out the "for Child Abuse" part. They want to scan people's private messages. The reason doesn't matter.

  • by awsthr0w4y on 10/24/23, 8:01 PM

    The first horseman of the Infocalypse has arrived.

  • by hsbauauvhabzb on 10/24/23, 8:23 PM

    I often wonder if we’re emerging as a society that must agree on free speech but also on ethical compromises. Extreme left- and right-wing views are both problematic, and there is some validity to law enforcement being unable to track electronic crime due to encryption.

    Don’t get me wrong, I don’t trust any proposed laws in their current state. But as a society we trust judges to sentence people for crimes, and referees to call fouls in sport. I just wish we would form a legitimate ethics committee that would act in good faith on society's behalf, because rights will inevitably erode, and without one we will be left with less.

    Edit: I’m sure this will be an unpopular opinion, but I’m interested to hear contrasting thoughts, so please vocalize disagreements.

  • by zosima on 10/24/23, 6:52 PM

    People are so enamored with the idea of the EU, and of cooperation across European borders, that they fail to see what kind of entity it really is and which people control it.

    If this monster is not controlled it will grow to become another USSR, completely stifling all innovation in Europe, 24/7 censoring what is appropriate for citizens to read and see, while continuously spying on everyone, everywhere.

    It can't be allowed to continue.

  • by karaterobot on 10/24/23, 6:00 PM

    I'm simply astonished to find that a body which condemns the use of microtargeted Facebook ads to influence elections would do this. I'm equally shocked to find out that they would commission a fake survey to provide pseudoscientific cover to their self-serving positions. It's almost enough to shatter my faith that people in government are working on my behalf, and make me think they're all just hypocrites. Next, I might start to question the validity of surveys on political topics, but fortunately I'm still pretty naive and credulous.
  • by gosub100 on 10/24/23, 7:55 PM

    Disclaimer: I acknowledge that this is a whataboutism argument.

    Why not use the phone to detect physical and verbal abuse? The microphone is always on for most users, we have algos that can detect the sound of an adult yelling, a child crying, and the percussive sound of a human hitting another human. If 2/3 of those sounds are detected, why not forward that to NCMEC for further review? It doesn't have to be "1 yell, 1 cry, break down the door", it could be a pattern detector that logs the times, the severity, and the mode. If it hears objects being thrown, people yelling, kids crying, why isn't that sufficient to call authorities? How come "think of the children" suddenly doesn't work anymore?

    I'm being 50% facetious here because obviously that's a huge privacy issue. But the same logic used for CSAM scanning still applies. Why address only one third of abuse and ignore the verbal/emotional and physical kinds? Those kids need justice too.
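    The hypothetical pattern detector described above (log rather than alert, and trigger only when at least two of the three sound classes co-occur) could be sketched roughly like this. The category names, time window, and severity scale are illustrative assumptions, not any real system:

    ```python
    from dataclasses import dataclass

    # Hypothetical sound classes from the comment: an adult yelling,
    # a child crying, a percussive impact. Names are illustrative only.
    CATEGORIES = {"yell", "cry", "impact"}

    @dataclass
    class SoundEvent:
        t: float         # timestamp in seconds (assumed)
        category: str    # one of CATEGORIES
        severity: float  # classifier confidence 0.0-1.0 (assumed)

    def flag_incidents(events, window=60.0, min_distinct=2):
        """Log windows where >= min_distinct of the categories co-occur.

        Returns (window_start, categories, max_severity) tuples rather
        than raising an immediate alert -- the "pattern detector" idea
        of recording time, severity, and mode for later review.
        """
        incidents = []
        events = sorted(events, key=lambda e: e.t)
        for i, anchor in enumerate(events):
            in_window = [e for e in events[i:] if e.t - anchor.t <= window]
            cats = {e.category for e in in_window} & CATEGORIES
            if len(cats) >= min_distinct:
                incidents.append(
                    (anchor.t, cats, max(e.severity for e in in_window))
                )
        return incidents
    ```

    With a yell at t=0 and a cry ten seconds later, one incident is logged; an isolated yell three minutes on is not.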

  • by throwaway290 on 10/24/23, 5:30 PM

    We should separate anti-abuse laws that don't compromise e2e (like Apple's hash matching) from laws that ban e2e. Once we do, we should fight for one and against the other.

    If the abuse is real and e2e is key to it, taking measures against abuse while keeping e2e is important. Fighting against any and all measures is immoral.
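    For context, the kind of hash matching the comment alludes to works roughly like this minimal sketch: the client compares a perceptual hash of its own media against a list of known hashes before upload, so nobody's encrypted traffic is inspected in transit. The 64-bit hashes and the bit-distance threshold here are illustrative assumptions, not Apple's actual NeuralHash parameters:

    ```python
    # Sketch of client-side perceptual-hash matching. Values are toy
    # 64-bit integers standing in for perceptual hashes of images.

    def hamming(a: int, b: int) -> int:
        """Number of differing bits between two hashes."""
        return bin(a ^ b).count("1")

    def matches_blocklist(h: int, blocklist: set, max_dist: int = 4) -> bool:
        """True if h is within max_dist bits of any known-bad hash.

        Perceptual hashes of near-identical images differ by a few
        bits, so a small threshold catches minor re-encodings while
        unrelated images stay far apart. max_dist=4 is an assumption.
        """
        return any(hamming(h, bad) <= max_dist for bad in blocklist)
    ```

    The point relevant to the thread: this check runs entirely on the client against a fixed list, which is why it can coexist with end-to-end encryption in a way a transport-level scanning mandate cannot.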