by ssdspoimdsjvv on 7/28/22, 7:58 AM with 126 comments
by joe-collins on 7/28/22, 8:50 AM
by dusted on 7/28/22, 8:48 AM
But rationally, I know that a lot of programmers don't care about computers, or even about programming, and to them it's just a job. And I guess I have no right to judge that.
by hericium on 7/28/22, 8:40 AM
This theoretical person from the title does use the internet, and would like to know how many megabytes per second can be pushed through their n-megabit link, no?
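(As a rough worked example, with the link speed picked arbitrarily: a 100 Mbit/s connection moves at most 100 / 8 = 12.5 MB/s, before any protocol overhead.)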
by basilgohar on 7/28/22, 9:13 AM
If they embrace new knowledge and appreciate those who bring it to them, then I couldn't think of someone better for this art. If, on the other hand, they dismiss anything they don't immediately see the value in, even at the urging of someone more senior or experienced, then I would say that is the red flag.
by jb1991 on 7/28/22, 8:30 AM
by brunooliv on 7/28/22, 8:58 AM
But... does it matter?
Unless you are working more directly with hardware or embedded systems, it ABSOLUTELY does not matter at all.
What matters is to have a critical eye for numbers when they matter and/or are "off". That is much more valuable for the day-to-day work.
"Damn, before we could easily hit this endpoint X times per second and after that refactor and library change it's down to X/5? We need to have a look".
Sizes of DB tables, latency of endpoints, and being able to properly read stack traces or logs from a third-party service are much more valuable.
PS: people writing in the comments about not knowing how text encoding works, about sending data through sockets, etc... Folks, you know how to do these things because they are required of you at your day job; you probably had exposure to proper patterns and got help from experienced people within that particular niche at your company. That's all. Knowing 1 byte is 8 bits is absolutely meaningless knowledge on its own.
I needed to know these "standard sizes" for my CS degree, sure. Do they matter for my day-to-day work? Not in the slightest.
by mjbeswick on 7/28/22, 8:12 AM
by TheCoelacanth on 7/28/22, 3:12 PM
It has no bearing on actually doing the job, but how on earth did you manage to learn the necessary skills without learning that?
by seren on 7/28/22, 8:20 AM
by joshka on 7/28/22, 9:02 AM
Bits and bytes underpin:
- storage capacity
- speed of communication
- speed of computation
- representation of numeric values
- encoding of strings
Bits and Bytes are the fundamental units of size and representation, and knowing them is critical to understanding most other things that build on top of them.
For high-level software, you see this come up in:
- integer limitations (e.g. database identifiers)
- internationalization of user interfaces
- floating point numeric comparison and other mathematical operations
- communications over various transport layers
- speed of the various abstractions used to make higher-level software fast/good
by vbezhenar on 7/28/22, 8:34 AM
These days one should even know exactly how a double is represented in hardware, as it's the main numeric type in JS, and one should clearly understand the bounds of the integers that can be represented in a double without precision loss.
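A quick sketch of that bound (shown here in Python, whose float is the same IEEE 754 double that JavaScript's Number uses):

    # Integers are exact in a double only up to 2**53; one past that rounds back down.
    print(float(2**53))                      # 9007199254740992.0
    print(float(2**53) == float(2**53 + 1))  # True -- precision is lost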
by WelcomeShorty on 7/28/22, 11:04 AM
I would say it depends totally on the tasks of the coder. If you fiddle around with some ASP: NO. If you are squeezing every last bit for Netflix performance: YES.
Our domain is so incredibly huge, deep, wide & diverse that I do not expect regular coders to be familiar with most details.
by blikdak on 8/5/22, 5:53 AM
by shafyy on 7/28/22, 8:33 AM
by exyi on 7/28/22, 9:43 AM
Even as a computer user: internet speeds are regularly given in megabits per second. How do you estimate how long a 5 GB file will take to upload without knowing it's 8 b/B?
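(Rough worked example, with the 20 Mbit/s upload rate assumed purely for illustration: 5 GB is 5 × 8 = 40 gigabits = 40,000 megabits; at 20 Mbit/s that's 40,000 / 20 = 2,000 seconds, a bit over half an hour, ignoring overhead.)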
by dragonelite on 7/28/22, 8:36 AM
But then again, knowing how many bits go in a byte isn't needed to be productive in 99% of cases.
Then again, when a professional software developer can't tell me how many bits go in a byte, or can't at least explain to me how to read/use a call stack/stack frame in an IDE or terminal, etc., I really start to wonder how experienced and professional that developer really is.
by alkonaut on 7/28/22, 9:04 AM
And not just “should be able to look it up”, but should know it in their sleep.
Should also know other esoterica, e.g. how a string could be zero-terminated in one case but length-prefixed in others, how there could be one, two, four, or a variable number of bytes per character/glyph in a string, and so on. Not because one needs this every day, but a) because doing work adjacent to it means you eventually shoot yourself in the foot, and b) doing any amount of work will expose you to this, and not having picked up on it means you aren’t picking up things.
by AlphaGeekZulu on 7/28/22, 8:52 AM
Much more important, in my eyes, and more sophisticated, is the concept of the computer "word": a professional software dev should know how many bytes go in a word, how word length corresponds to bus width and memory organisation, and how types depend on word length.
And in my humble opinion, Unicode encoding is such a basic concept that every dev should understand it at the bit level.
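For instance, a minimal Python sketch of how UTF-8 spends one to four bytes per character (the sample characters are arbitrary):

    for ch in ("A", "é", "€", "🙂"):
        encoded = ch.encode("utf-8")
        print(ch, len(encoded), encoded.hex(" "))
    # A 1 41
    # é 2 c3 a9
    # € 3 e2 82 ac
    # 🙂 4 f0 9f 99 82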
by oxff on 7/28/22, 9:11 AM
by superchroma on 7/28/22, 8:28 AM
I think what matters is that you're reasoning about things and being thorough and careful. Broadly, I haven't had someone give me a pop quiz at my job yet.
by aoshifo on 7/28/22, 9:04 AM
I mean, it's something you will inevitably stumble upon. I'd say that even if you actively try to avoid learning how many bits go in a byte you will one day wake up screaming "Fuuuuuuck, I never wanted to know that there are 8 bits in a byte!!!!!" There, now you know, too!
by postcynical on 7/28/22, 9:20 AM
Hiring-wise, if you've spent a few years programming without ever learning about "that bits&bytes thing", an employer is tempted to doubt your education, level of curiosity, and/or general interest in your field.
by MattPalmer1086 on 7/28/22, 10:48 AM
To me, knowing how computers work is something I thought every programmer would just have as fundamental knowledge (note that bytes don't always have 8 bits, but 8 is the most common).
I guess there is not an absolute need to know this if it doesn't affect business-logic-type programming or UIs, although I might think a person who didn't know it was not very well educated or curious.
by Kim_Bruning on 7/28/22, 9:09 AM
It's surprising how far some people can get with their hands tied behind their backs.
I wouldn't recommend it of course. Also it might possibly be a kind of warning flag: how can someone constantly be exposed to this stuff and never pick it up? But somehow it does sometimes happen!
(see also e.g. https://news.ycombinator.com/item?id=32216904, "Experienced C programmer who can't fizzbuzz?")
[For folks wondering why I'm talking about counting: bits and bytes are all about learning to count with just 1 finger: 1, 10, 11, 100, 101, 110, 111, 1000, ... And the largest number you can count up to in a byte is 1111 1111, of course (a byte has 8 Binary digITs, and the highest number you can count to in a byte is when all those digits are 1s; in decimal that's 255. If you use a computer a lot, you might have run into 255 rather often).]
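A tiny Python check of the same idea:

    print(0b1111_1111)         # 255 -- eight binary digits, all set to 1
    print(2**8 - 1)            # 255 -- the same count, written differently
    print(format(255, "08b"))  # 11111111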
by throwaway2016a on 7/28/22, 1:55 PM
> Is this number positive or negative?
> 0101 0101
They expected me to ask about things like endianness and to know that the most significant bit is the only one we care about to answer that question.
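For what it's worth, a small Python illustration of reading that byte as a signed (two's complement) value:

    import struct
    # "b" unpacks one byte as a signed 8-bit integer (two's complement).
    print(struct.unpack("b", bytes([0b0101_0101]))[0])  # 85  -- top bit 0, so positive
    print(struct.unpack("b", bytes([0b1010_1010]))[0])  # -86 -- top bit 1, so negative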
While bits in a byte "is" a "low level" topic, it is one of the more useful ones, even for CRUD.
My instinct is that "yes"... basic knowledge like this is important. After all, even as an API developer I have often had to know things like how many bytes an int is on my machine, how bitwise operators work, whether the system is little endian or big endian, etc.
Also, how do you understand Unicode (something very important for i18n) if you don't know the answer to this question?
But then I had to go back and second guess myself. Most of this stuff is abstracted out now.
I would say it comes down to what type of data you are dealing with. If you're dealing with files in binary formats, compilers, etc., then yes, you need that stuff. And yes, a simple CRUD app does sometimes need to deal with that stuff. But if not? I guess it's fine not to know.
by zerof1l on 7/29/22, 9:42 AM
A "professional software dev" is a very broad category. In most of web development, you can get away just fine without knowing how many bits are in a byte. On the other hand, in Arduino you'd encounter bits and bytes on a regular basis.
As for me, if somebody were to ask me this question during an interview, I would assume it's a trick question. There were some exotic architectures in the past, such as old IBM machines, that stored data in 6-bit units.
by beej71 on 7/29/22, 3:57 AM
If I were hiring a generalist hacker type, they'd better know the answer. I expect a generalist to be able to do all kinds of tasks. But a specialist on something high level might get a pass.
by wruza on 7/28/22, 10:32 AM
If we're talking about someone with some sort of baseline certification (school, degree, etc.), it's hard to imagine that it's not in the educational program; e.g. UTF-8 hints at it directly in the name. Can a programmer read a text file without knowing about UTF-8, at least as an enum-like parameter? Or know floating-point limitations, which every programmer should know, and not notice that float32 is 4 bytes wide? Or choose a #rrggbb color without ever realizing that each 00..FF part stands for 2 hex digits / 8 bits?
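For example, a quick Python sketch of the #rrggbb point (the colour value here is arbitrary):

    color = "#ff8000"
    r, g, b = (int(color[i:i + 2], 16) for i in (1, 3, 5))
    print(r, g, b)  # 255 128 0 -- each two-hex-digit pair is one 8-bit channel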
I think in reality it’s either a sign of a very narrow specialization, very low experience or just a complete lack of awareness. It’s like going through a forest for months and never noticing a fallen tree.
by Vanit on 7/28/22, 8:39 AM
by jacknews on 7/28/22, 1:43 PM
Sure you might not actually need to know it to do your job, at least most of the time. But sometimes you will, and honestly it's a bit ridiculous not to know it.
by ragebol on 7/28/22, 9:24 AM
So as always: it depends. A front-end dev: nope, don't think so. Embedded software dev: you don't become one without knowing.
by tpxl on 7/28/22, 8:55 AM
Reminder that 8-bit bytes were standardized in 1993 (ISO/IEC 2382-1:1993, according to Wikipedia), and before that it was common to have bytes of 6, 8, or 9 bits.
by beardyw on 7/28/22, 8:16 AM
by artemonster on 7/28/22, 8:54 AM
by mettamage on 7/28/22, 8:50 AM
by maxbaines on 7/28/22, 8:24 AM
by raxxorraxor on 7/28/22, 8:38 AM
It is quite useful knowledge though, at least in some applications. I do embedded systems where I use bit operations regularly. I also do visualizations in the browser, mostly with JS, and I cannot remember ever needing to know that a byte is (usually) 8 bits. Thinking about it, I don't know the exact size of the data types I often use either...
by irvingprime on 7/31/22, 5:42 PM
A better fit for your question would be, should a dev know the difference between big endian and little endian? (Hint: Yes)
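A small Python illustration of that difference (the value is arbitrary):

    import struct
    n = 0x12345678
    print(struct.pack(">I", n).hex(" "))  # 12 34 56 78  -- big-endian byte order
    print(struct.pack("<I", n).hex(" "))  # 78 56 34 12  -- little-endian byte order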
by MrDresden on 7/28/22, 10:04 AM
I would internally question the knowledge of someone claiming to be a professional in this field if they didn't know this fact.
Though that doesn't mean they couldn't be proficient and able to do their job.
by jareklupinski on 7/28/22, 1:48 PM
by anotherhue on 7/28/22, 9:57 AM
If they've learned enough to market themselves as a software person, then they should have learned about bytes.
by iib on 7/28/22, 8:48 AM
by StupidOne on 7/28/22, 9:17 AM
by forinti on 7/28/22, 3:16 PM
Bits, bytes, words, doubles, floats, ints... This information is everywhere. Every coding book starts with this.
by farseer on 7/29/22, 7:21 AM
by retrac98 on 7/28/22, 8:41 AM
by drewcoo on 7/28/22, 1:22 PM
Should devs also know economics and philosophy and English literature?
by jiveturkey on 7/28/22, 8:02 AM
by xupybd on 7/28/22, 8:51 AM
I don't think so.
by orangepanda on 7/28/22, 8:42 AM
by lupinglade on 7/28/22, 9:50 AM