by womitt on 3/16/22, 12:00 PM with 18 comments
by russnewcomer on 3/16/22, 12:22 PM
That’s not really _ethics_ as I understand the term. Software ethics would do more to plumb questions like: is it ethical for the WebKit team to ship a feature that makes one of Apple’s business goals more viable at the cost of increasing the iPhone's daily energy consumption by 3%? That is actual ethics, not ‘did I make this system easy to extend?’
by tgv on 3/16/22, 12:33 PM
> There have been high-profile calamities caused by software over the past decades. Software developers need to start discussions about what may be done before a disaster happens that takes control out of our hands.
The next paragraph mentions cars, so apparently losing control of something we do not actually control at this moment would be worse than a disaster caused by car software?
> You can’t use the excuse that you’re just building a game or a thermostat, so it doesn’t matter. Everything matters. We need to develop the proper emotional attitude to building software; that’s the role of software engineering ethics.
Everything matters? That's a wild take. I don't consider a bug in Farmville to be of the same gravity as one in a self-driving car, an aircraft autopilot, medical device automation, or the controls of a steel oven, to mention just a few.
What follows in the article is a proposal for a mixture of personal responsibility and a trust chain that can only work when there are laws and penalties, and which will reduce the number of active software developers. By about 95%, I'd say. It would get rid of Farmville, so perhaps there is an upside.
by ysleepy on 3/16/22, 12:20 PM
by eternityforest on 3/16/22, 12:42 PM
Individual ethics are important, but some of the more important decisions are made at the company level. As a user, I am only minimally affected by code quality... because these days it's largely all pretty good.
What I am affected by is vendor lock-in, weird cloud-only SaaS offerings, and IoT devices that could be remotely bricked at any time.
I also think piloting is the wrong metaphor. We absolutely should put the responsibility on the computers instead of ourselves as much as possible.
Our tools should make it very hard to write bugs. We should have AI and strong types and static analysis watching over us at all times.
And we should have reviews and best practices and all that too.
If people keep making a mistake, we should teach our computers to catch it.
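To make that concrete (a hypothetical sketch, not something from the thread): a recurring human mistake like mixing up units can be encoded into the type system so the compiler rejects it. A minimal TypeScript example, with all names invented for illustration:

    // Branded number types: structurally identical, but the compiler
    // refuses to mix them up. All names here are invented.
    type Meters = number & { readonly __unit: "meters" };
    type Feet = number & { readonly __unit: "feet" };

    const meters = (n: number): Meters => n as Meters;
    const feet = (n: number): Feet => n as Feet;

    function fireThrusters(altitude: Meters): void {
      console.log(`firing at ${altitude} m`);
    }

    const reading = feet(120_000);

    // fireThrusters(reading);
    // ^ compile error: 'Feet' is not assignable to 'Meters'.
    //   The mistake is caught by the computer, not by careful reading.

    fireThrusters(meters(36_576)); // fine: the conversion is explicit

Once the rule is captured in a type or a lint, nobody has to be "real smart and real careful" about it ever again.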
Programmers have already proven they can't be trusted. Just because you can write good code by "being real smart and real careful" doesn't mean we all can.
I'm also suspicious of personal joy. For programmers that often means reinventing wheels. Even if something is totally rock solid, a lot of coders won't really be satisfied by a day where all they did was string together some NPM libraries.
Programmers like to write suckless style stuff. They like experiments. They like hacks. They like rebuilding stuff from scratch.
They do not like using a 10k-line library with real security audits for something they know they could do in ten lines if they sacrificed a few edge cases.
Programmers care way more about code than software. In fact it seems they kind of look down on the world for depending entirely on software. They seem to see the whole modern world as just a nightmare mess of spyware.
They're not going to feel as good about making great software as they would about making the absolute minimum possible amount of software.
But these are all just details, and the idea in general seems great.
by trentnix on 3/16/22, 12:22 PM
Nonsense - engineering ethics have been discussed, codified, and taught for decades. Why are engineering ethics insufficient to cover building software?