by ekr____ on 12/28/24, 9:28 PM with 128 comments
by rkagerer on 12/31/24, 4:28 PM
I love those classic tools from the likes of Sysinternals or Nirsoft. I didn't hesitate to give them full access to my machine, because I was confident they'd (mostly) work as expected. Although I couldn't inspect their source, I could reason about how they should behave, and the prevailing culture of the time was one where I knew the developers and I shared a common set of expectations.
Their creators didn't tend to pull stunts like quietly vacuuming all your data up to themselves. When they did want feedback, they asked you for it first.
There wasn't such a potent "extract value" anti-culture, and successful companies recognized that enduring value came from working in the user's best interest (e.g. early Google resisted cluttering their search results).
Although silos existed (like proprietary data formats), there was at least an implicit acknowledgement and expectation that you retained ownership and control over the data itself.
Distribution wasn't locked behind app stores. Heck, license enforcement in early Office and Windows was based on the honour system - talk about an ecosystem of trust.
One way to work toward a healthier zeitgeist is to advocate tirelessly for the user at every opportunity you get, and stand by your gut feeling of what is right - even when faced with opposing headwinds.
by whatever1 on 12/31/24, 4:17 PM
The stakes are now higher, with data being so important and the advent of algorithms that affect people directly. From health insurance claims to automated trading, social media drugs and AI companions, bad code today can and does ruin lives.
Software engineers, like every other engineer, have to be held accountable for the code they sign off on and ship. Their livelihoods should be on the line.
by rini17 on 12/29/24, 2:20 AM
by geokon on 12/31/24, 8:40 AM
If you're going to be militant and absolutist about things, that seems like the best place to start.
And then probably updating your software incredibly slowly, at a rate at which it can actually be reviewed.
Software churn is so incredibly high that my impression is that only some core encryption algorithms really get scrutinized.
by lesuorac on 12/31/24, 1:29 PM
Of course, a warranty still carries the counterparty risk that the company goes out of business (probably because of all the lawsuits over its bad software ...).
by cbxjksls on 12/31/24, 8:28 PM
A big problem is user-hostile software (and products in general). I'm not able to walk into a Walmart and buy a TV, or walk into a dealership and buy a new car, because there are no options that aren't user-hostile.
Options exist, but I have to go out of my way to buy them.
by spenrose on 12/31/24, 4:14 PM
* https://www.cs.cmu.edu/~rdriley/487/papers/Thompson_1984_Ref...
by ryukafalz on 12/31/24, 4:41 PM
But aside from that, yes, we still need to trust our software to a large degree, especially on desktop operating systems. I would like to see more object-capability systems start to show up so we can more effectively isolate software that we don't fully trust. (WebAssembly and WASI feel like they might be particularly interesting in that regard.)
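To make the object-capability idea concrete, here is a minimal plain-Rust sketch (the function names and the input file path are illustrative, not any real API): the confined component can only touch handles the host explicitly hands it, which is the same discipline WASI applies when it only preopens specific directories for a module.

    // Illustrative sketch of capability-passing vs. ambient authority.
    // Real ocap systems (and WASI's preopen model) enforce this at the
    // runtime boundary instead of relying on programmer discipline.
    use std::fs::File;
    use std::io::{self, Read};

    // Ambient authority: the plugin can open any path the process can
    // reach, so installing it means trusting it with everything.
    #[allow(dead_code)]
    fn ambient_plugin() -> io::Result<String> {
        let mut s = String::new();
        File::open("/etc/passwd")?.read_to_string(&mut s)?; // nothing stops this
        Ok(s)
    }

    // Capability style: the plugin receives only the handles it was
    // explicitly granted and has no way to name anything else.
    fn confined_plugin(mut granted: File) -> io::Result<String> {
        let mut s = String::new();
        granted.read_to_string(&mut s)?;
        Ok(s)
    }

    fn main() -> io::Result<()> {
        // The host decides what the plugin may touch, then hands over only that.
        let allowed = File::open("plugin-input.txt")?; // hypothetical input file
        println!("{}", confined_plugin(allowed)?);
        Ok(())
    }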
by superkuh on 12/31/24, 3:52 PM
An installed binary is much more verifiable, secure, and trustworthy.
by yu3zhou4 on 12/31/24, 10:49 AM
by missing-acumen on 12/31/24, 9:50 AM
Today's most advanced projects can compile pretty much arbitrary Rust code into provable RISC-V programs (using SNARKs).
Imo that solves a good chunk of the problem of proving to software users that what they get is what they asked for.
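For a rough sense of the workflow these zkVM projects expose, here is a mock in plain Rust. The names are made up rather than any real crate's API, and the "receipt" is an ordinary hash with no cryptographic weight; a real SNARK receipt can't be forged and can be checked without re-running the program.

    // Mock of the prove/verify data flow around a zkVM. The vendor runs the
    // program and emits (output, receipt); the user checks the receipt against
    // the hash of the exact program they asked for, without re-executing it.
    use std::collections::hash_map::DefaultHasher;
    use std::hash::{Hash, Hasher};

    // Identifies the exact program the user asked for (hash of its RISC-V image).
    fn program_id(program_image: &[u8]) -> u64 {
        let mut h = DefaultHasher::new();
        program_image.hash(&mut h);
        h.finish()
    }

    // Vendor side: execute the program and bind (program, input, output)
    // together. A real prover emits a SNARK here, not a forgeable hash.
    fn prove(program_image: &[u8], input: &[u8]) -> (Vec<u8>, u64) {
        let output: Vec<u8> = input.iter().rev().cloned().collect(); // stand-in computation
        let mut h = DefaultHasher::new();
        (program_id(program_image), input, output.as_slice()).hash(&mut h);
        (output, h.finish())
    }

    // User side: accept the output only if the receipt matches the program
    // they asked for and the input they supplied.
    fn verify(expected_program: u64, input: &[u8], output: &[u8], receipt: u64) -> bool {
        let mut h = DefaultHasher::new();
        (expected_program, input, output).hash(&mut h);
        h.finish() == receipt
    }

    fn main() {
        let program = b"riscv-image-bytes";
        let input = b"what the user asked for";
        let (output, receipt) = prove(program, input);
        assert!(verify(program_id(program), input, &output, receipt));
        println!("output accepted: {}", String::from_utf8_lossy(&output));
    }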
by ghjfrdghibt on 12/31/24, 12:32 PM
by woadwarrior01 on 12/31/24, 1:06 PM
by BlueTemplar on 12/31/24, 11:52 AM
by Timber-6539 on 12/31/24, 9:25 AM
by mikewarot on 12/31/24, 7:06 PM
Imagine if we ran the electrical grid this way... with inspections, certifications, and all manner of paperwork. That world would be hell.
Instead we carefully limit capabilities at the source, with circuit breakers, fuses, and engineering of the same so that the biggest circuit breakers trip last.
Capability-based operating systems likewise limit capabilities at the source and never trust the application. CapROS, KeyKOS, and EROS have led the way. I'm hopeful Hurd or Genode can be our daily driver in the future. Wouldn't it be awesome to be able to just use software without trusting it?
by localghost3000 on 12/31/24, 8:39 PM
by egypturnash on 12/31/24, 5:40 PM
"image by chatgpt"
I'm just gonna assume the rest of this post is also AI-generated waffle. *closes tab*
by mwkaufma on 12/31/24, 8:05 PM
by paulnpace on 12/31/24, 4:56 PM