by bdr on 11/13/23, 9:12 PM with 10 comments
by derbOac on 11/15/23, 3:08 PM
The other day I was talking about something related with my daughter, who was learning about rounding in elementary school. We ended up discussing, very vaguely and in elementary school terms, accuracy in calculations versus the number of operations, and the tradeoff between them, and how in practice you're always rounding at some level, so that tradeoff always exists.
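A minimal Python sketch of that tradeoff (my own toy example, not from the thread): if every intermediate result is rounded, say to two decimal places the way a child might, the error grows with the number of operations performed.

```python
# Toy example: rounding each intermediate result to 2 decimal places.
# The per-step rounding error (~0.00333 here) accumulates linearly
# with the number of operations.
total_exact = 0.0
total_rounded = 0.0
for _ in range(1000):
    total_exact += 1 / 3
    total_rounded = round(total_rounded + round(1 / 3, 2), 2)

print(total_exact)    # ~333.333
print(total_rounded)  # 330.0
```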
I also do research in information theory, and somehow the topic Tao discusses seems related. In that area there's always some potential or actual loss of information due to representational and computational constraints: things are always being discretized, and any representation has some information cost. What Tao is talking about is an information cost too, but cast in terms of numerical accuracy rather than stochastic terms.
This is all very vague in my head, but it seems like there is some path from stochastic information costs of representation to deterministic ones, along the lines of approximations and limits. People use probabilistic arguments to prove deterministic statements, for instance, and there are pseudorandom numbers; I imagine you could treat both what Tao is talking about and more traditional information theory problems in the same framework.
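One concrete bridge (again my own toy illustration, not from the thread): quantizing a real number to k bits is a deterministic version of an information cost, with each extra bit buying one halving of the worst-case error.

```python
import math

# Toy illustration: quantizing x in [0, 1) to k bits. Each extra bit
# halves the worst-case error, so accuracy has a per-bit "information
# cost" -- a deterministic analogue of stochastic rate-distortion costs.
x = math.pi - 3  # 0.14159265...
for k in (4, 8, 16):
    q = round(x * 2**k) / 2**k   # nearest k-bit dyadic rational
    print(k, q, abs(x - q))      # error is at most 2**-(k+1)
```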
by tgv on 11/14/23, 12:36 PM
Convergence is presented as just a pattern. It doesn't have to be economic, but the example naturally suggests convergence, so that's fine.
Continuity and differentiability didn't make sense to me in that framing, though. You don't "buy" continuity: there's no (increasing) value attached to smaller intervals, at least not in my understanding of it.
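For reference, a sketch of the definition in play (my gloss of the "buying" metaphor, not necessarily the article's): continuity of f at x0 says every output accuracy epsilon can be had at some input price delta.

```latex
% Continuity of f at x_0: every output accuracy \varepsilon can be met
% by some input tolerance \delta (the "price", in the economic reading).
\forall \varepsilon > 0 \;\; \exists \delta > 0 : \quad
  |x - x_0| < \delta \;\Longrightarrow\; |f(x) - f(x_0)| < \varepsilon
```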
by robertlagrant on 11/14/23, 4:59 PM
On a much more basic level, I plugged e into formulae throughout my schooling to the age of 18, and only later realised that e is the number of dollars you'd have in a bank account that started with $1, after one year of 100% interest continuously compounded.
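A quick sketch of that fact: compounding $1 at 100% annual interest n times in the year yields (1 + 1/n)**n dollars, which tends to e as n grows.

```python
import math

# $1 at 100% annual interest, compounded n times in the year:
# n = 1 is annual, n = 12 monthly, n -> infinity is continuous.
for n in (1, 12, 365, 1_000_000):
    print(n, (1 + 1 / n) ** n)

print(math.e)  # 2.718281828459045, the continuously compounded balance
```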
by timeagain on 11/14/23, 6:32 PM