from Hacker News

Encryption Is Not a Crime

by freddyym on 4/17/25, 12:56 PM with 211 comments

  • by i5heu on 4/17/25, 1:37 PM

    I do not like these "not a" framings, because they always sound suspicious, like "we are not a cult".

    It puts the idea into the world that it could be a crime and maybe that it is the status quo.

    Much better IMHO is something like "Encryption is a fundamental right.", "Encryption protects everyone.", "Without encryption there is no democracy." and so on.

    Maybe "Don’t let them take your right to privacy."

  • by kube-system on 4/17/25, 1:32 PM

    This is too many words to convince someone who already doesn’t believe this.

    Put more simply: the modern internet doesn’t work without encryption, it is a fundamental part of the technology. Without it, anyone could log into any of your accounts, take your money, messages, photos, anything.

  • by cjs_ac on 4/17/25, 1:52 PM

    End-to-end encryption is a good thing, and so is this website providing information about how to use it.

    But this particular article represents a particular pathology surrounding freedom. Freedom is supposed to be about doing what you want. It's not about making florid speeches about how free you supposedly are. If you want to use end-to-end encryption, just use it, and maybe offer advice to others on how to use it.

    There are some politicians who have decided that only bad people use encryption. Going up to one of these politicians and trying to explain that you use encryption but you're actually a good person won't convince them that encryption's okay, it'll just convince them that you're a bad person. Politics is one of those things that attracts people who just want to find the shortest route to a decision about who are the good people and who are the bad people, and keeping secrets isn't something that those sorts of people like other people doing.

    Unless you have evidence that the government is rounding up people just for using encryption, all this sort of advocacy does is to draw attention to you having something to hide, and therefore probably being some sort of wrong'un. If the government is rounding up people for using encryption, that's a specific threat you need to respond to, and starting a public campaign is not the right response.

  • by loftsy on 4/17/25, 1:19 PM

    Something is a crime if society determines that it should be so. Nothing more.

    Clearly the pressure on government to write these laws is coming from somewhere. You should engage with the arguments the other side makes.

  • by elric on 4/17/25, 1:29 PM

    If we had trustworthy governments, or trustworthy police agencies, then maybe mandated backdoors wouldn't be all that bad. But if anything, recent events have clearly demonstrated that governments are not trustworthy; even a government that is trustworthy today could become an evil regime tomorrow, and handing power over literally anything to such an organization does not seem wise.

  • by ajsnigrutin on 4/17/25, 1:13 PM

    Not a crime, but somehow our dear EU overlords try every year or so to make it a crime in any way possible (e.g. chat control).

    If we want to play in a world with full transparency, let's start with the politicians!

  • by bitbasher on 4/17/25, 2:14 PM

    The problem is the average person doesn't care very much or understand it.

    If you ask anyone if privacy matters they will of course say yes. If you ask them why they use software with telemetry or websites with Google Analytics they will simply shrug.

    If you ask them if it's alright for the NSA to collect and analyze data from everyone, they will say yes, because they have nothing to hide.

    People don't know what privacy is. They don't know what they are fighting for or where the fight is taking place.

    If you take that and then add encryption to the mix... and you have politicians and agency plants talking about "saving the children from online pedos" by banning these "encryption apps and technology"....

  • by mohi-kalantari on 4/17/25, 1:16 PM

    It’s honestly annoying how often experts speak up about this, and still nothing changes. We’re stuck in the same cycle—fear gets in the way, and in the end, it’s our privacy and security that suffer. If anything, this should be a sign to invest in stronger encryption and better law enforcement tactics that don’t mess with the tools keeping us safe online.

  • by OhMeadhbh on 4/17/25, 3:44 PM

    Also... we're throwing around words like "crime" and "terror" and talking about shadowy quasi-governmental organizations encroaching on civil rights to privacy. I offer this commentary from the Eurythmics' score to Michael Radford's 1984 film "1984" to serve as background music for our discussions.

    https://youtu.be/IcTP7YWPayU

  • by jagger27 on 4/17/25, 1:53 PM

    Encryption is a threat to power structures. Of course, if you're in power and you're under threat, you criminalize the threat.

    As long as we preserve the knowledge of one-time pads, they will not take this power from us.
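
    The one-time pad the comment refers to can be sketched in a few lines. Assuming the key is truly random, as long as the message, and never reused, the scheme is information-theoretically secure:

```python
import secrets

def otp(data: bytes, key: bytes) -> bytes:
    """XOR data with a same-length key; encryption and decryption are identical."""
    assert len(key) == len(data)
    return bytes(d ^ k for d, k in zip(data, key))

message = b"meet at dawn"
key = secrets.token_bytes(len(message))  # key must be as long as the message
ciphertext = otp(message, key)
assert otp(ciphertext, key) == message   # XOR with the same key recovers the plaintext
```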

  • by kubb on 4/17/25, 1:15 PM

    There's an abstract argument template that I've noticed floating around. It goes like this:

      1. There's a thing T in the world, and that thing has negative outcomes X, Y, Z, and positive outcomes A, B, C.
      2. Some people believe that Y and Z are so bad, that they want to partly compromise C to diminish them.
      3. However that will never work! And they'll definitely also take B if we let them mess with C.
      4. Besides, C is so important, that we should accept Y and Z to have it.
    
    I've heard it many times before. Reading this post feels like watching a rerun of Friends.

  • by jmclnx on 4/17/25, 2:09 PM

    Seems to be geared towards Apple, but informative nevertheless.

    To me, the only sure end-end encryption is gnupg, where you personally create the keys and distribute.
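
    As a sketch of that workflow (the email address and filenames below are placeholders), generating, distributing, and using your own keys with GnuPG looks roughly like this:

```shell
# Generate a keypair locally; prompts ask for algorithm, name, and email
gpg --full-generate-key

# Export the public key so correspondents can encrypt messages to you
gpg --armor --export alice@example.com > alice.pub

# A correspondent imports your public key and encrypts a file to you
gpg --import alice.pub
gpg --encrypt --recipient alice@example.com secret.txt

# Only the holder of the matching private key can decrypt
gpg --decrypt secret.txt.gpg
```

    The point of the comment is that no third party ever holds the private key; it is generated and kept on your own machine.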

  • by jaxn on 4/17/25, 1:58 PM

    I believe encryption is the most important 2nd Amendment issue of our time, but I never see it framed that way.

  • by SirMaster on 4/17/25, 1:56 PM

    This kind of reminds me about the same sort of assertion that BitTorrent is not illegal.

  • by DarkWiiPlayer on 4/17/25, 1:16 PM

    > Ignoring experts doesn't make facts disappear

    And yet it seems like every last politician, without literally a single exception, thinks that it does work that way.

  • by EGreg on 4/17/25, 1:59 PM

    Just a bit more color on where this war on encryption is currently being fought:

    https://community.qbix.com/t/the-global-war-on-end-to-end-en...

  • by OhMeadhbh on 4/17/25, 2:55 PM

    As a software engineer who specialized in cryptography in the 1990s and didn't work for the NSA (working for RSADSI, Bell Canada and Certicom) I feel I have an informed vantage point from which to offer notes.

    a) This seems like a decent introduction to the subject of cryptographic regulation in the last 30 years. It's far from exhaustive, however. I do appreciate the collected references from diverse points in the last several decades.

    b) I would have mentioned "Sink Clipper" and the ACLU "dotRights" campaigns. Neither is especially easy to find in the increasingly enshittified Google cache, but Le Monde Diplomatique has this article, complete with a link to the Sink Clipper poster (I think from the mind of Kurt Stammberger) that no collection of CypherPunk-oriented ephemera from the era can be without: https://mondediplo.com/openpage/selling-your-secrets

    The ACLU dotRights.org site seems to have receded into history, but some of its content is still available at the archive. For example: https://web.archive.org/web/20100126102126/http://dotrights....

    c) Herb Lin presented a very nice paper back in the day comparing PROPOSED encryption regulation with ACTUAL encryption regulation. I think the thesis was that, through the 90s, proposed regulation was increasingly draconian (Clipper, etc.) but actual regulation was liberalizing (effective deregulation of open-source tools). I found Herb's page at Stanford and heartily recommend it, if for no other reason than its sheer volume of written material: https://herblin.stanford.edu/recent-publications/recent-publ...

    d) I was a little surprised the wired article linked to at the beginning of the piece didn't have that issue's front cover, which was sort of a cultural touchstone at the time. But you can see it here: https://pluralistic.net/2022/03/27/the-best-defense-against-... - and this one: https://www.reddit.com/r/Bitcoin/comments/1cgpktp/31_years_a... (dang, look at those non-receding hairlines!)

    e) Making the web "secure" or "private" is like putting lipstick on a pig. Modern web technology is designed to de-anonymize and collect identifying information to enable targeted ad delivery. Though I generally respect Moxie Marlinspike and have no great beef with Signal, there has been a concerted effort to exploit its device sharing protocol, and your carrier and national governments can easily extract traffic analysis info from people using it. Were I to add one sentence to this guide, it would be "While these tools are better than nothing, they are far from perfect."

    f) The guide seems to conflate encryption with privacy. Encryption technology can enable privacy, but you're not going to get privacy from encryption technology unless you pair it with well reasoned policy (for organizations) and operational guidelines (for both organizations and individuals.)

    The extreme example is to say "nothing stops a participant in an encrypted communication from sharing the un-encrypted plaintext after it's recovered." People earnestly trying to maintain message security probably know not to do that, but when talking about exchanging keys and figuring out which keys or organizations you should trust, it's easy for even the well-informed to make privacy-eroding decisions.

    So... I think this article is a good jumping off point, covering material I would call "required, but not sufficient." I would just view it as the beginning of a deep-dive instead of the end.

  • by wulfstan on 4/17/25, 1:33 PM

    Playing devil's advocate here...

    What is wrong with:

    * an expiring certificate

    * issued by the device manufacturer or application creator

    * to law enforcement

    * once a competent court of law has given approval

    * that would allow a specific user's content to be decrypted prior to expiry

    There are a million gradations of privacy from "completely open" to "e2e encrypted". Governments (good ones!) are rightly complaining that criminals are using encryption to commit particularly awful crimes. Politicians are (mistakenly) asking for a master key, but what I feel we, as a community, should support is some fine-grained legal process that would allow limited access to user information if justified by a warrant.

    Competent jurisdictions allow this for physical search and seizure. It's not unreasonable to ask for the same thing to apply to digital data.
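
    The warrant-scoped access being proposed can be sketched as a simple authorization check. All names here are hypothetical, and a real scheme would need key management, auditing, and cryptographic enforcement rather than a mere policy check:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class DecryptionWarrant:
    subject_id: str    # the specific user named in the court order
    issued_by: str     # the approving court
    expires: datetime  # validity window of the certificate

def may_decrypt(warrant: DecryptionWarrant, user_id: str, now: datetime) -> bool:
    # Access is limited to the named user and to the period before expiry.
    return warrant.subject_id == user_id and now < warrant.expires

w = DecryptionWarrant("user-42", "District Court", datetime(2026, 1, 1, tzinfo=timezone.utc))
assert may_decrypt(w, "user-42", datetime(2025, 6, 1, tzinfo=timezone.utc))
assert not may_decrypt(w, "user-99", datetime(2025, 6, 1, tzinfo=timezone.utc))
```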

  • by 9rx on 4/17/25, 1:19 PM

    > It's not a crime to lock your home's door for protection, why would it be a crime to lock your digital door?

    A locked home's door is still trivially opened. You can pick the lock or even apply simple brute force; neither is all that difficult, and open it will. Similarly, I don't suppose anyone would be concerned about you using rot13 encryption. If a home could be sealed to the same degree as strong encryption, it absolutely would be a crime, for better or worse.
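
    The rot13 comparison is easy to demonstrate: Python's standard library ships the codec, and anyone can reverse it without any key at all:

```python
import codecs

ciphertext = codecs.encode("attack at noon", "rot13")
print(ciphertext)  # nggnpx ng abba
# Decoding needs no secret; rot13 is obfuscation, not encryption.
assert codecs.decode(ciphertext, "rot13") == "attack at noon"
```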

  • by Fokamul on 4/17/25, 1:45 PM

    I'm not surprised. The UK has become a literal African/Middle-East hell hole. They've kicked out all the working immigrants and replaced them with ultra-religious freaks.

    And of course, the UK being a country where every form of self-defense is the most serious crime (when attacked you must call the police, then lie on the ground and die) is the cherry on top.