by dguo on 2/6/22, 10:01 PM with 193 comments
by emehrkay on 2/7/22, 3:05 AM
by a2tech on 2/7/22, 12:04 AM
by TheOtherHobbes on 2/7/22, 1:59 AM
I'd suggest there's a difference between "I don't understand how this is supposed to work" and "I do understand how this is supposed to work, and I don't understand why it isn't working."
Seat belt story seems to be Type 1. But what if a lot of "stupid" design decisions are actually Type 2?
And the reasons may or may not be good - somewhere on a scale from real budget and/or time constraints, lack of insight, indifference, and penny-pinching economics, all the way to passive-aggressive user hostility.
When Apple removed MagSafe, no doubt there were perfectly good internal justifications for that decision. But ultimately the people outside the company who said it was a poor move from a user POV turned out to be right.
by irrational on 2/7/22, 1:30 AM
It’s unfortunate how many people never come to this realization.
by bede on 2/7/22, 12:05 AM
by codegeek on 2/7/22, 1:10 AM
by gyulai on 2/7/22, 8:19 AM
I feel like, at this point in my career, and in the specific context where I work, this has now become the bottleneck to my career progress.
The problem is: it works as a self-fulfilling prophecy based on the fallacies of social proof and the fundamental attribution error. People reach for the snap judgment "this is overcomplicated" because they expect me to be the kind of person who just overcomplicates things. And they feel easily justified in that judgment given that other people also tend to make it about me. This means they budget even less time for trying to understand the complexities in the problems I solve before reaching for the easy judgment of "this is overcomplicated".
So: the guy who always overcomplicates things, needlessly draining the company's cognitive resources and generally being a nuisance, ends up looking much the same from the outside as the guy who could be the company's strongest engineering asset. He could be the guy who puts the company ahead of the competition, always understanding the engineering problems more deeply and solving them more thoroughly than others, in a way that the competition can't easily replicate... if only the people tasked with passing judgment could overcome biased and lazy decision-making.
by Tehdasi on 2/7/22, 3:40 AM
I find this is particularly the case when the design flaw is especially bad: the developers end up adding complexity to work around the flaw, which has the effect of hiding it, making it a lot less obvious to the novice.
by seanalltogether on 2/6/22, 11:57 PM
by nojs on 2/7/22, 12:04 AM
> How do you overcome schlep blindness? Frankly, the most valuable antidote to schlep blindness is probably ignorance. Most successful founders would probably say that if they'd known when they were starting their company about the obstacles they'd have to overcome, they might never have started it. Maybe that's one reason the most successful startups of all so often have young founders.
by cobertos on 2/7/22, 12:37 AM
My default nowadays is that I never know enough, even in the fields I'm specialized in. Part of it, I think, is just a limit on how much information I can accurately refer to or pull up in my brain at any given time.
by puffoflogic on 2/7/22, 12:41 AM
What's interesting is that I now know something that the engineers designing the system should have known but apparently didn't, or knew and didn't care about. So sometimes (or much of the time?) users do know something the engineers don't.
by helsinkiandrew on 2/7/22, 7:29 AM
A single rude salesperson and they won't set foot in the entire chain; one bad cop or doctor and the whole police force or medical establishment is corrupt. Listen to a politician giving a speech and people become lifelong supporters even when their policies change radically.
There is something deeply embedded in the human mind that makes us very susceptible to certain stories - against all evidence, if they fit our preconceived perceptions or coincide with something in our memory.
by Graffur on 2/6/22, 11:16 PM
by justnotworthit on 2/7/22, 5:08 AM
I'd also add: ...should be proportional to my need to pass judgement, or the benefit I get from doing so.
Postpone judgement and you observe more. It's counterproductive to stamp everything in life "good/bad", "dumb/smart", or "friend/foe" as soon as possible.
by nyokodo on 2/6/22, 11:47 PM
by unityByFreedom on 2/7/22, 12:58 AM
by foreigner on 2/7/22, 8:49 AM
by WaitWaitWha on 2/6/22, 11:32 PM
I judge something 'proportional to how much I know about it', and in the context in which it was developed.
Designers take things outside of the design into consideration. The design may remain alive and in use (though not necessarily for its originally intended use) long after that original outside environment has changed.
by csours on 2/7/22, 12:53 AM
My logic:
Being a test pilot is dangerous. Only idiots do dangerous things. Therefore test pilots must be idiots.
Later (embarrassingly later) I learned that many test pilots were also engineers. This made me reconsider my opinion. I learned to be very careful when judging intelligence, and also about the limits of inference.
by quickthrower2 on 2/7/22, 12:42 AM
If I find a stupidly designed product, it is often because it was the cheap option. Spending more saves money in the long run. Water bottles spring to mind - most of them leak or get damaged by dish-washing sooner or later. Spending $20 on a water bottle is cheaper than spending $5 ten times. Although a high price isn't a guarantee of quality either!
by jonpalmisc on 2/6/22, 11:11 PM
> My willingness to judge something should be proportional to how much I know about it.
I think it’s a sound rule. Going to try to remind myself of this more.
by spullara on 2/7/22, 8:23 AM
by sturza on 2/7/22, 7:15 AM
by Scarblac on 2/7/22, 10:14 AM
by hn_throwaway_69 on 2/7/22, 12:29 AM
When everyday folk start doubting the safety and/or effectiveness of mRNA vaccines, this is what comes to my mind.
by spchampion2 on 2/7/22, 12:18 AM
by burlesona on 2/7/22, 4:46 AM
by captainmuon on 2/7/22, 12:21 PM
Be it some technical choice at work that had obvious flaws. Or when we were renovating our house and, as a layperson, I noticed a serious problem the experts didn't see. Or when I was reading about poststructuralism or critical theory at university: I had a feeling it was just a lot of word games around a couple of important ideas - then I put in the work, read the books, went to the courses, and yup, that was basically true.
Looking at it from the other side, as an expert on some topics, I know there are a lot of things we do that are not justified by the "subject matter"; we just do them because we have always done them, or because a pointy-haired boss decided so. Or we have operational blindness and can't notice the flaws anymore.
by halikular on 2/7/22, 7:42 AM
by deltaonefour on 2/6/22, 11:54 PM