by harscoat on 9/7/22, 8:07 PM with 49 comments
by vba616 on 9/8/22, 5:10 AM
I think it's interesting to consider how this is/is not true in a real life context.
If someone tells you a fact you already know, then it could be communicating several different things.
1. It might mean *they think* you don't know it.
2. It might mean they want to make sure *you know* they know you know it.
3. It might mean they believe you know it and want to communicate they agree.
4. It might mean they are hoping for you to (or not to) contradict/argue/refute it.
If, however, you can't tell which of those (or some other alternative) is intended, then no information was communicated at all. Except... there is still a message communicated: that they have some interest or concern regarding the topic, which is different from remaining silent.
Whenever someone says anything, true, false, nonsense, lie, they are communicating a true and possibly significant fact - that they chose to say that thing, in whatever circumstance.
I am thinking of "Gödel, Escher, Bach", if you can't tell.
by zbobet2012 on 9/7/22, 11:55 PM
This is, in fact, generally incorrect. Movie files are mostly compressed via _lossy_ compression, which is not bound by the Shannon entropy limit (that limit applies to lossless coding). Entropy-based encoding, as described in this article, is generally only the final stage in a video "codec" (an algorithm to compress and decompress video). The first stage relies on finding information in the visual field that humans are unlikely to notice the loss of, and discarding it. The second stage does interframe (between-frames) compression by searching for blocks of pixels that match across frames. Entropy coding is generally the final stage.
This, by the way, is why zipping a video file can make it... larger.
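A minimal Python sketch of why (random bytes stand in for an already entropy-coded video stream, which is statistically close to random):

    import os, zlib

    # High-entropy bytes, standing in for an already-compressed video stream.
    data = os.urandom(1_000_000)
    recompressed = zlib.compress(data, 9)
    print(len(data), len(recompressed))  # the "recompressed" size comes out slightly larger

zlib can't find any redundancy to exploit, so its framing overhead makes the output a bit bigger than the input.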
by ttctciyf on 9/7/22, 11:17 PM
While this is generally true in practice, strictly speaking the shortest possible encoding of a sequence of bits could be smaller than this would suggest, since it's possible a pattern exists that has escaped notice.
For example: the first 10 billion digits of pi pass statistical tests for randomness, but can be generated by a short program which calculates them until the desired length is reached, in effect "compressing" them to the length of the generator program.
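As a concrete sketch, here is one such short generator program (Gibbons' unbounded spigot algorithm, in Python; any desired number of digits is in effect "compressed" down to these few lines plus the count):

    # Gibbons' unbounded spigot algorithm: a short program whose output
    # is the decimal expansion of pi, one digit at a time.
    def pi_digits():
        q, r, t, k, m, x = 1, 0, 1, 1, 3, 3
        while True:
            if 4*q + r - t < m*t:
                yield m
                q, r, m = 10*q, 10*(r - m*t), (10*(3*q + r))//t - 10*m
            else:
                q, r, t, k, m, x = q*k, (2*q + r)*x, t*x, k + 1, (q*(7*k + 2) + r*x)//(t*x), x + 2

    gen = pi_digits()
    print(''.join(str(next(gen)) for _ in range(20)))  # 31415926535897932384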
Because of considerations like this, Algorithmic Information Theory[1] equates the information content of a bit sequence with the length of the shortest program which will generate it - though this has the drawback of being generally uncomputable - an intriguingly different paradigm.
1: see https://en.wikipedia.org/wiki/Algorithmic_information_theory
by acjohnson55 on 9/7/22, 10:18 PM
Claude Shannon was a titan. He's also known for publishing what some call the most important master's thesis of all time, https://en.wikipedia.org/wiki/A_Symbolic_Analysis_of_Relay_a....
by rkp8000 on 9/7/22, 9:32 PM
by mpalmer on 9/7/22, 9:52 PM
Speaking of information and decompression, my brain initially interpreted the title as a breaking news headline about a female supervillain who's also a theory nerd.
It was only the added context of the domain name that fixed the error.
by javajosh on 9/7/22, 11:52 PM
by _wldu on 9/7/22, 10:37 PM
A four-digit numeric PIN (that we know) has 0 bits of entropy. There is no uncertainty about what the PIN actually is. A randomly selected one (that we do not know) has just over 13 bits:

    import math
    print(math.log(10)/math.log(2)*4)  # log2(10^4): bits for a uniform 4-digit PIN
    13.28771237954945
The more entropy, the more uncertain we are.
However, humans are not random. We use the year we were born, some keyboard sequence, or some other predictable number as our PIN. We don't know exactly how much entropy these PINs have (there is some degree of uncertainty), but we do know it is significantly less than 13 bits, as the sketch below illustrates.
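A back-of-the-envelope sketch in Python (the PIN frequencies here are made up purely for illustration, not real data): with a few popular PINs dominating, the Shannon entropy H = -sum(p * log2 p) drops well below the uniform maximum.

    import math

    # Purely illustrative frequencies (not real data): a few popular PINs
    # take a large share; the remaining 9,996 split the rest uniformly.
    popular = {"1234": 0.10, "1111": 0.06, "0000": 0.04, "2580": 0.02}
    rest_count = 10_000 - len(popular)
    rest_p = (1 - sum(popular.values())) / rest_count

    h = -sum(p * math.log2(p) for p in popular.values())
    h -= rest_count * rest_p * math.log2(rest_p)
    print(h)  # roughly 11.5 bits, below the 13.29-bit uniform maximum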
by bob1029 on 9/7/22, 10:54 PM
The actual information content of a thing has always surprised me, especially when factoring in slightly imperfect (but passable) representations such as JPEG or MP3. Losing some information seems entirely acceptable in many cases.
by ffhhj on 9/8/22, 1:29 AM
http://www-formal.stanford.edu/jmc/slides/dartmouth/dartmout...