from Hacker News

Ask HN: Should a professional software dev know how many bits go in a byte?

by ssdspoimdsjvv on 7/28/22, 7:58 AM with 126 comments

Would you say that if you program computers for a living you should always know there are (usually) 8 bits in a byte? Or is it perfectly fine not to know that / can it be considered specialized knowledge if you only program high level software, e.g. front-end web or CRUD business applications?
  • by joe-collins on 7/28/22, 8:50 AM

    Even if someone doesn't need to work at a low enough level to use that information regularly, not knowing it suggests a severe lack of fundamentals and hints at trouble on the horizon. Someone could play at chemist, working with raw materials and following recipes without knowing about electron orbitals, but without that understanding they may make disastrous choices when given insufficient oversight, and their capacity for problem-solving is limited.
  • by dusted on 7/28/22, 8:48 AM

    Emotionally, I want to say yes, they should know how many bits are in a byte, how many bytes are in a kilobyte (1000) and in a kibibyte (1024), and what big- and little-endianness are, and at least some part of the computing history that has led us to where we are now..

    But rationally, I know that a lot of programmers don't care about computers, or even about programming and to them it's just a job.. And I guess I have no right to judge that..

  • by hericium on 7/28/22, 8:40 AM

    I wouldn't call someone who doesn't know the basics "a professional". And this is basic knowledge not only for programming but for networking, too.

    This theoretical person from the title does use the internet and would like to know how many megabytes per second can be pushed through their n-megabit link, no?

  • by basilgohar on 7/28/22, 9:13 AM

    What I have observed, as more and more people from a wider background enter into computer science and software engineering, is that it's not so much what they know or don't know that matters, but rather, what is their attitude towards learning something they don't already know.

    If they embrace new knowledge and appreciate those who bring it to them, then I couldn't think of someone better for this art. If, on the other hand, they dismiss anything whose value they don't immediately see, even when urged by someone more senior or experienced, then I would say that is the red flag.

  • by jb1991 on 7/28/22, 8:30 AM

    Absolutely they should know. This comes up so many times even in high-level programming that anyone who’s doing anything beyond very trivial student applications should know this stuff.
  • by brunooliv on 7/28/22, 8:58 AM

    I think any person remotely interested in computers or programming can and should "know" that a byte is 8 bits.

    But... does it matter?

    Unless you are working with hardware or in embedded systems more directly, it ABSOLUTELY does not matter at all.

    What matters is to have a critical eye for numbers when they matter and/or are "off". That is much more valuable for the day-to-day work.

    "Damn, before we could easily hit this endpoint X times per second and after that refactor and library change it's down to X/5? We need to have a look".

    Sizes of DB tables, latency of endpoints, and being able to properly read stack traces or logs from a third-party service are much more valuable.

    PS: people writing in the comments about not knowing how text encoding works, about sending data through sockets, etc... Folks, you know how to do these things because they are required of you at your day job; you probably had exposure to proper patterns and got help from experienced people within that particular niche at your company. That's all. Knowing 1 byte is 8 bits is absolutely meaningless knowledge on its own.

    I needed to know these "standard sizes" for my CS degree, sure. Do they matter for my day-to-day work? Not in the slightest.

  • by mjbeswick on 7/28/22, 8:12 AM

    Absolutely they should, as without knowing what a byte is there is little chance someone would know how data is represented in computer systems.
  • by TheCoelacanth on 7/28/22, 3:12 PM

    That's like if a professional writer didn't know how many letters are in the alphabet.

    It has no bearing on actually doing the job, but how on earth did you manage to learn the necessary skills without learning that?

  • by seren on 7/28/22, 8:20 AM

    It is important to understand how many megabytes per second can be sent through a 100 megabit/s Ethernet link, for example (or whatever communication link).
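
    A quick sketch of that napkin math (the 100 Mbit/s figure is the link's nominal rate; real throughput is a bit lower due to protocol overhead):

      // 8 bits per byte, so divide by 8 to go from megabits to megabytes.
      const linkMbps = 100;                     // nominal link speed in megabits/s
      const megabytesPerSecond = linkMbps / 8;
      console.log(`${linkMbps} Mbit/s = ${megabytesPerSecond} MB/s`); // 100 Mbit/s = 12.5 MB/s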
  • by joshka on 7/28/22, 9:02 AM

    Yes, regardless of how much these are abstracted in modern software development. Size is an important aspect of many things in a computer. It impacts:

    - storage capacity

    - speed of communication

    - speed of computation

    - representation of numeric values

    - encoding of strings

    Bits and bytes are the fundamental units of size and representation, and knowing them is critical to understanding most other things that build on top of them.

    For high level software, you see this come up in:

    - integer limitations (e.g. database identifiers)

    - internationalization of user interfaces

    - floating point numeric comparison and other mathematical operations

    - communications over various transport layers

    - speed of various abstractions that are used to make the higher level software fast / good

  • by vbezhenar on 7/28/22, 8:34 AM

    This is basic knowledge. Of course one should know how many bits go in a byte. It's like asking if one should know how to replace a bulb. Not just a software developer, every person should know that. It's like knowing that there are 1024 bytes in a kibibyte or 1000 meters in a kilometer.

    These days one should even know exactly how a double is represented in hardware, as it's the main numeric type in JS, and one should clearly understand the bounds of the integers which can be represented in a double without precision loss.
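
    A minimal illustration of those bounds (both facts follow from the double's 53-bit significand):

      // Integers are exact in a double only up to 2^53 - 1.
      console.log(Number.MAX_SAFE_INTEGER === 2 ** 53 - 1); // true
      console.log(2 ** 53 === 2 ** 53 + 1);                 // true: precision is lost past the bound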

  • by WelcomeShorty on 7/28/22, 11:04 AM

    Should a programmer also know about FS block sizes? About Ethernet frame sizes? About multithreading? Memory access bandwidth? CPU architecture? Stored procedures in a DB? Current encryption schemes?

    I would say it depends totally on the tasks of the coder. If you fiddle around with some ASP: NO. If you are squeezing every last bit for Netflix performance: YES.

    Our domain is so incredibly huge, deep, wide & diverse that I do not expect regular coders to be familiar with most details.

  • by blikdak on 8/5/22, 5:53 AM

    Yes. Try adding 1 to a number which is already (in binary) all '1's. If you don't know the size of a byte, or of an 'int' or a 'long int', etc., for the language you are using, you will be mystified why it is now 0, or maybe some other number your language decided was sensible to use (signed vs. unsigned, etc.). Then extrapolate that a hundredfold when dealing with floating point and presenting a sensible total in your shopping-cart web app. If you don't know this stuff, learn it or stay away from programming.
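
    A small sketch of that wrap-around, using a typed array to get a genuine 8-bit cell:

      // A Uint8Array element holds an unsigned 8-bit value, so it wraps at 2^8.
      const cell = new Uint8Array(1);
      cell[0] = 0b11111111; // all eight bits set: 255
      cell[0] += 1;         // 256 doesn't fit in 8 bits
      console.log(cell[0]); // 0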
  • by shafyy on 7/28/22, 8:33 AM

    I don't think it's required to work as an average professional web dev. If you want to be a good developer and keep getting better, understanding the fundamentals of computer science is very important (and makes the job more interesting, in my mind).
  • by exyi on 7/28/22, 9:43 AM

    People say you should know something about lower-level programming, and they are right IMHO.

    Even as a computer user: internet speeds are regularly given in megabits per second. How do you estimate how long a 5GB file will take to upload without knowing it's 8 bits to a byte?
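
    A rough estimate, assuming a 100 Mbit/s uplink purely for illustration:

      // 5 GB ≈ 5000 MB = 40,000 megabits at 8 bits per byte.
      const fileMegabits = 5 * 1000 * 8;
      const uplinkMbps = 100;                    // assumed uplink speed
      const seconds = fileMegabits / uplinkMbps; // 400 s
      console.log(`about ${(seconds / 60).toFixed(1)} minutes`); // about 6.7 minutes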

  • by dragonelite on 7/28/22, 8:36 AM

    Of course people should know some of the basics of hardware and software development. Not knowing the basics, or the chain of steps a framework or a protocol takes, means you might overlook optimisation opportunities or pain points in a certain workflow.

    But then again, knowing how many bits go in a byte is not needed to be productive in 99% of the cases.

    Then again, when a professional software developer can't tell me how many bits go in a byte, or can't at least explain to me how to read/use a call stack/stack frame in an IDE or terminal etc., I really start to wonder how experienced and professional that developer really is.

  • by alkonaut on 7/28/22, 9:04 AM

    Yes. This comes up even in the highest level and most trivial work.

    And not just “should be able to look it up”, but should know it in their sleep.

    Should also know other esoterica, e.g. how a string could be zero-terminated in one case but length-prefixed in others, or how there could be one, two, four, or a variable number of bytes per character/glyph in a string, and so on. Not because one needs this every day, but a) because doing work adjacent to it means you eventually shoot yourself in the foot, and b) doing any amount of work will expose you to this, and not having picked up on it means you aren't picking things up.
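
    The variable-width point is easy to demonstrate; the byte counts below are standard UTF-8:

      // UTF-8 uses one to four bytes per code point, so string length ≠ byte length.
      const s = "héllo🙂";
      console.log(s.length);                            // 7 UTF-16 code units (the emoji takes two)
      console.log(new TextEncoder().encode(s).length);  // 9 bytes in UTF-8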

  • by AlphaGeekZulu on 7/28/22, 8:52 AM

    Yes, probably any software developer should know the trivial fact that a byte consists of 8 bits (nowadays), even if the developer is not involved with bit operations.

    Much more important, in my eyes, and more sophisticated, is the concept of the computer "word": a professional software dev should know how many bytes go in a word, how word length corresponds to bus width and memory organisation, and how types depend on word length.

    And in my humble opinion, Unicode encoding is such a basic concept, that every dev should understand it on a bit level.

  • by oxff on 7/28/22, 9:11 AM

    If you can't do napkin math with bits and bytes to calculate capacities, bandwidths, flows etc. you are lacking one of the most essential skills there are, and need to pick up the slack ASAP.
  • by superchroma on 7/28/22, 8:28 AM

    Probably not; it's something you can look up, and the 1000 vs 1024 naming distinction (e.g. kibi vs kilo) continues to be something that most developers I encounter aren't up to speed on. If I need to do arithmetic with such units, I use libraries with classes that model them, at the very least to offer some safety for others when they look at my code.

    I think what matters is that you're reasoning about things and being thorough and careful. Broadly, I haven't yet had someone give me a pop quiz at my job.

  • by aoshifo on 7/28/22, 9:04 AM

    I was about to say "no", until I realized just how much I have internalized this knowledge. In front-end you may be able to write decent applications without knowing it, but for CRUD apps I would say it is impossible to avoid.

    I mean, it's something you will inevitably stumble upon. I'd say that even if you actively try to avoid learning how many bits go in a byte you will one day wake up screaming "Fuuuuuuck, I never wanted to know that there are 8 bits in a byte!!!!!" There, now you know, too!

  • by postcynical on 7/28/22, 9:20 AM

    Based on my 25 years of experience in that line of work, this knowledge is not required at all and won't make you write better code. But some coworkers will question your competence, which is a disadvantage.

    Hiring-wise, if you've spent a few years programming without ever learning about "that bits & bytes thing", an employer is tempted to doubt your education, level of curiosity and/or general interest in your field.

  • by MattPalmer1086 on 7/28/22, 10:48 AM

    What a surprising question!

    To me, knowing how computers work is just something I thought every programmer would know as fundamental knowledge (note that bytes don't always have 8 bits, but 8 is the most common).

    I guess there is no absolute need to know this if it doesn't affect business-logic-type programming or UIs, although I might think a person was not very well educated or curious if they didn't know it.

  • by Kim_Bruning on 7/28/22, 9:09 AM

    There's a surprising number of programmers who can't count in binary, or who can't do bitwise operations, or... any number of things.

    It's surprising how far some people can get with their hands tied behind their backs.

    I wouldn't recommend it of course. Also it might possibly be a kind of warning flag: how can someone constantly be exposed to this stuff and never pick it up? But somehow it does sometimes happen!

    (see also eg. : https://news.ycombinator.com/item?id=32216904 Experienced C programmer who can't fizzbuzz?)

    [ For folks wondering why I'm talking about counting: bits and bytes are all about learning to count with just 1 finger: 1, 10, 11, 100, 101, 110, 111, 1000, ... And the largest number you can count up to in a byte is 1111 1111, of course (a byte has 8 Binary digITs, and the highest number you can count to in a byte has all those digits be 1s; in decimal this is 255). If you use a computer a lot, you might have run into 255 more often than you realize. ]
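
    The same counting, sketched in code:

      // Eight binary digits, all set to 1, is the largest value a byte can hold.
      console.log(parseInt("11111111", 2)); // 255
      console.log((255).toString(2));       // "11111111"
      console.log(2 ** 8 - 1);              // 255 again: 2^8 values, 0 through 255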

  • by throwaway2016a on 7/28/22, 1:55 PM

    The first tech question I was asked at an interview 20 years ago was:

    > Is this number positive or negative?
    > 0101 0101

    They expected me to ask about things like endianness and to know that the sign is carried by the most significant bit, which is the only one we care about to answer that question.
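
    A sketch of the expected reasoning, assuming two's complement and the usual most-significant-bit-first notation:

      // In two's complement the most significant bit is the sign bit.
      const pos = new Int8Array([0b01010101]);
      console.log(pos[0]); // 85: top bit clear, so positive
      const neg = new Int8Array([0b10101010]);
      console.log(neg[0]); // -86: top bit set, so negative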

    While bits in a byte "is" a "low level" topic, it is one of the more useful ones, even for CRUD.

    My instinct is that "yes"... basic knowledge like this is important. After all, even as an API developer I have often had to know things like how many bytes an int is on my machine, how bitwise operators work, whether the system is little endian or big endian, etc.

    Also, how do you understand Unicode (something very important for i18n) if you don't know the answer to this question?

    But then I had to go back and second-guess myself. Most of this stuff is abstracted away now.

    I would say it comes down to what type of data you are dealing with. If you're dealing with files in binary formats, compilers, etc., then yes, you need that stuff. And yes, a simple CRUD app does sometimes need to deal with that stuff. But if not? I guess it's fine not to know.

  • by zerof1l on 7/29/22, 9:42 AM

    My answer to the question of whether X should know Y is yes, if it's what you do on a daily basis. If it's relevant but needed only occasionally, then just having a vague awareness that it exists and being able to find the exact information when needed is enough. In my opinion, a professional is not somebody who knows absolutely everything there is to know in the field, but someone who has a lot of hands-on experience and can quickly fill in the gaps in their knowledge when necessary.

    A "professional software dev" is a very broad category. In most of web development, you can get away just fine without knowing how many bits are in a byte. On the other hand, in Arduino you'd encounter bits and bytes on a regular basis.

    As for me, if somebody were to ask me this question during an interview, I would assume that it's a trick question. There were some exotic architectures in the past, such as certain IBM machines, that used 6-bit bytes.

  • by beej71 on 7/29/22, 3:57 AM

    20 years ago? Definitely. Today things are a little different. So much abstraction makes it possible to plug programs together without knowing such things.

    If I were hiring a generalist hacker type, they'd better know the answer. I expect a generalist to be able to do all kinds of tasks. But a specialist on something high level might get a pass.

  • by wruza on 7/28/22, 10:32 AM

    That’s a strange question. If by professional you mean by trade, then yes, many programmers don’t work with bits or type limits directly and could do fine without that knowledge.

    If by some sort of baseline certification (school, degree, etc.), it's hard to imagine that it's not in an educational program; e.g. UTF-8 hints at it directly in the name. Can a programmer read a text file without knowing about UTF-8, at least as an enum-like parameter? Or know floating-point limitations, which every programmer should know, and not notice that float32 is 4 bytes wide? Or choose an #rrggbb color never realizing that each 00..FF part stands for two hex digits / 8 bits?
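
    The colour example, spelled out (the particular colour value is arbitrary):

      // Each rr/gg/bb pair is two hex digits = 8 bits = one byte per channel.
      const color = "#3fa7ff";
      const [r, g, b] = [1, 3, 5].map(i => parseInt(color.slice(i, i + 2), 16));
      console.log(r, g, b); // 63 167 255, each in the one-byte range 0..255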

    I think in reality it's either a sign of a very narrow specialization, very little experience, or just a complete lack of awareness. It's like going through a forest for months and never noticing a fallen tree.

  • by Vanit on 7/28/22, 8:39 AM

    Yep. Even in a language like JavaScript it'll come up for handling binary data blobs/streams.
  • by jacknews on 7/28/22, 1:43 PM

    It's almost like a journalist or novelist not knowing there are 26 letters in the alphabet.

    Sure you might not actually need to know it to do your job, at least most of the time. But sometimes you will, and honestly it's a bit ridiculous not to know it.

  • by ragebol on 7/28/22, 9:24 AM

    I think it's hard not to know this, but outside of embedded I don't think I ever needed that knowledge.

    So as always: it depends. A front-end dev: nope, don't think so. Embedded software dev: you don't become one without knowing.

  • by tpxl on 7/28/22, 8:55 AM

    They need not know by heart, but they should be able to find out if necessary.

    Reminder that 8-bit bytes were standardized in 1993 (ISO/IEC 2382-1:1993 according to Wikipedia), and before that it was common for bytes to be 6, 8 or 9 bits.

  • by beardyw on 7/28/22, 8:16 AM

    Modern JavaScript has byte representations like Uint8Array so I would say yes.
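
    For instance:

      // Typed arrays make the 8-bit byte explicit even in high-level JS/TS.
      const bytes = new Uint8Array([0x48, 0x69]);   // two bytes
      console.log(bytes.BYTES_PER_ELEMENT * 8);     // 8 bits per element
      console.log(new TextDecoder().decode(bytes)); // "Hi"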
  • by artemonster on 7/28/22, 8:54 AM

    The infestation of the workforce by bootcamped parrots who puke out memorized nonsense and call themselves professional programmers is going to further degrade the quality of code everywhere.
  • by mettamage on 7/28/22, 8:50 AM

    The problem with purely learning things on a need-to-know basis is that every debugging task that goes outside the dev's programming knowledge becomes an incredible scavenger hunt.
  • by maxbaines on 7/28/22, 8:24 AM

    Hmmm, this is interesting. My background is industrial automation, where every day I would work with bits and bytes. But today, and for the last 10 or so years, I have been building higher-level software: machine data collection, web, DB, etc. I can honestly say not once have I had to deal with bits or bytes, and I cannot think of an instance where I have needed the knowledge either. So I guess it's not a requirement for software development today. But I still feel it's important knowledge, though I can't qualify why.
  • by raxxorraxor on 7/28/22, 8:38 AM

    I guess you could develop in a high-level language and not know about it. But I guess at some point you would come into contact with that knowledge.

    It is quite useful knowledge though, at least in some applications. I do embedded systems, where I use bit operations regularly. I also do visualizations in the browser, with JS mostly, and I cannot remember ever needing to know that a byte is 8 bits (usually). Thinking about it, I don't know the exact size of the datatypes I often use either...

  • by irvingprime on 7/31/22, 5:42 PM

    Functionally, you don't usually need to know this kind of thing. But it's something that is taught in CS 101, pretty close to day 1. Yes, you should know stuff that basic!

    A better fit for your question would be, should a dev know the difference between big endian and little endian? (Hint: Yes)

  • by MrDresden on 7/28/22, 10:04 AM

    Yes. Knowing that 8 bits make up a byte is basic knowledge, akin to knowing that H2O is the chemical formula of water.

    I would internally question the knowledge of someone claiming to be a professional in this field if they didn't know this fact.

    Though that doesn't mean they couldn't be proficient and able to do their job.

  • by jareklupinski on 7/28/22, 1:48 PM

    Yes, so that you can make people chuckle when you tell them 4 bits is a 'nibble'
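
    And a small illustration of extracting them (the value is arbitrary):

      // A nibble is half a byte: shift and mask to split one into two.
      const byte = 0xab;
      console.log(((byte >> 4) & 0xf).toString(16)); // "a", the high nibble
      console.log((byte & 0xf).toString(16));        // "b", the low nibble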
  • by anotherhue on 7/28/22, 9:57 AM

    Uneducated people do not determine whether basic knowledge should be classified as 'specialised'.

    If they've learned enough to market themselves as a software person, then they should have learned about bytes.

  • by iib on 7/28/22, 8:48 AM

    The highest respect goes to the person answering `CHAR_BIT`, of course.
  • by StupidOne on 7/28/22, 9:17 AM

    When you are working as a backend developer and your clients are mobile devices, whether your message is 8 times smaller or bigger is the difference between up-and-running and an outage.
  • by forinti on 7/28/22, 3:16 PM

    I find it hard to believe someone who codes doesn't know this.

    Bits, bytes, words, doubles, floats, ints... This information is everywhere. Every coding book starts with this.

  • by farseer on 7/29/22, 7:21 AM

    Unless you only write excel formulas or strictly do GUI development, I think you should know this. It is also a great phone screening question.
  • by retrac98 on 7/28/22, 8:41 AM

    It depends on whether it's going to meaningfully affect their ability to develop software. In a lot of modern software engineering roles it won't.
  • by drewcoo on 7/28/22, 1:22 PM

    To all of the people who think devs should know this:

    Should devs also know economics and philosophy and English literature?

  • by jiveturkey on 7/28/22, 8:02 AM

    Should you even know what a byte is?
  • by xupybd on 7/28/22, 8:51 AM

    I would say so. But I'm trying to think if I've ever had to use that?

    I don't think so.

  • by orangepanda on 7/28/22, 8:42 AM

    Bytes/bits are themselves an abstraction. If you don't work with this representation directly, then this trivia can be added to the list of things you don't know.
  • by lupinglade on 7/28/22, 9:50 AM

    Do you really have to ask?