by jjhawk on 7/21/20, 11:11 PM with 217 comments
by Koshkin on 7/22/20, 2:09 AM
by dopu on 7/22/20, 4:16 AM
by JadeNB on 7/21/20, 11:59 PM
Usually Tao's posts are so insightful, and crystallise some idea so perfectly that it feels like I was just on the cusp of discovering it myself—a rare talent, and hard to cultivate since it goes against the ego. In this case, though: I'm a professional mathematician, and as prone as anyone in my discipline to use mathematical language to describe not strictly mathematical things, but the pseudo-mathematisation here ("Notation^{-1}(C)", for example) seems more like wit than clarity. Not that there's anything wrong with wit, but in this case it seems to me that it's at the expense of, rather than a pleasant addition to, the central point.
I'd like to hear especially from anyone who isn't a professional mathematician: did you feel that this post improved your understanding of the purpose and function of good notation?
(EDIT: I was scared about making this post, since there's rightfully a lot of respect and appreciation for Tao—and I hope it's clear that I concur on both counts—and I wasn't sure how my reticence on his post would go over; but I'm super glad I asked. Thanks so much to everyone downthread; these are wonderful responses and I feel that it benefited me a lot to read them.)
by moonchild on 7/22/20, 1:12 AM
c ≡ u +.× v
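A plain-Python sketch of what the APL/J inner product `u +.× v` computes (combine elementwise with ×, then reduce with +; the sample values are illustrative):

```python
# APL/J's  c ≡ u +.× v  is the inner product: multiply pairwise, then sum.
u = [1, 2, 3]
v = [4, 5, 6]
c = sum(x * y for x, y in zip(u, v))
print(c)  # 32
```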
1. https://www.jsoftware.com/papers/tot.htm
by emmanueloga_ on 7/22/20, 12:40 AM
by Koshkin on 7/22/20, 12:45 AM
by riazrizvi on 7/22/20, 12:53 AM
by Darkstryder on 7/22/20, 7:53 AM
My biggest pet peeve with mathematical symbols is the difficulty of looking them up when you don’t know them already. If I’m reading a text on a topic I'm unfamiliar with, I can at least google the keywords I don't know. This is difficult with symbols.
by btrettel on 7/22/20, 2:42 AM
One that I like is that in Einstein notation you can't have 3 of the same index, e.g., u_i u_i u_i is invalid.
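A plain-Python reading of the convention (the vectors here are illustrative): a repeated index implies a sum over it, and an index may appear at most twice in a term, which is why u_i u_i u_i is ill-formed.

```python
# Einstein summation convention: a repeated index implies a sum over it.
# u_i v_i  therefore means  sum_i u[i] * v[i]  (an inner product).
u = [1.0, 2.0, 3.0]
v = [4.0, 5.0, 6.0]
c = sum(u[i] * v[i] for i in range(len(u)))
print(c)  # 32.0
```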
by wavegeek on 7/22/20, 10:13 AM
I would add
1. Clearly telegraphing notations. Not hiding them in the middle of long paragraphs or even, and yes I have seen this a few times, defining essential notation in an optional exercise.
2. Having a glossary of notations, so people don't have to remember every single notation and to read every word of the book sequentially.
3. Not creating low-value notations that may be used only once and then possibly forgotten. I have read books with more than one new notation or definition per page, mostly forgotten thereafter, but with some random subset needed later, and you have no way of knowing which.
by vii on 7/22/20, 3:06 AM
Programming languages are notations within this framework - and domain-specific languages, while much more efficient, are unpopular because the costs of changing notation, in terms of training people, are too high.
The cost of communicating the notation is captured in a few of the desiderata (e.g. 1, 7), but in practice it is the most important one. If we want to be easily understood, we should speak a common language!
by jjhawk on 7/21/20, 11:12 PM
by enriquto on 7/22/20, 9:15 AM
    c = 0
    for i in range(u.size):
        c = c + u[i] * v[i]
or
    c = u.T @ v
and even if the result is identical, the computation is not, the first one being orders of magnitude slower. There is no good reason for it to be so, unfortunately.
by eternalban on 7/22/20, 11:31 AM
I suggest that should one strive to 'fine tune' notation N to possess the above two qualities for a family of objects in X, other categories of objects in X will become opaque and difficult to express, i.e. a domain specific notation.
by peignoir on 7/22/20, 1:37 PM
by peter303 on 7/22/20, 3:42 AM
by foobar_ on 7/22/20, 2:18 AM
I think in the future programming will force all mathematicians to code or give out simulations. Most mathematical notation was intended to be throwaway by the original authors; that's why there are so many notations. Trying to find relevance in them is a pointless exercise. Much like 80x20, tabs vs spaces ... most of the original intent is lost and what survives is guff meant for ceremonious purposes.
by zitterbewegung on 7/22/20, 12:07 AM
I think he has a good idea for the most part, and if you did formalize this notation, there is a good chance that someone in the computer science domain would eventually program something that could interpret it. Lisp comes to mind.
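As a rough illustration of the idea (the expression format and operator table here are my own assumptions, not anything from the post), a tiny evaluator for Lisp-style prefix expressions fits in a few lines:

```python
# A minimal sketch of interpreting Lisp-style prefix notation in Python.
# Expressions are nested tuples such as ("+", 1, ("*", 2, 3)).
import operator

OPS = {"+": operator.add, "-": operator.sub, "*": operator.mul}

def evaluate(expr):
    """Recursively evaluate a prefix expression; bare numbers evaluate to themselves."""
    if not isinstance(expr, tuple):
        return expr
    op, *args = expr
    vals = [evaluate(a) for a in args]
    result = vals[0]
    for v in vals[1:]:
        result = OPS[op](result, v)
    return result

print(evaluate(("+", 1, ("*", 2, 3))))  # 7
```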
by pubby on 7/22/20, 12:28 AM
Now imagine if those words I gave you were in Vietnamese, or some language you don't speak. Suddenly the task becomes much more confusing. You aren't remembering a small handful of objects and ideas, but instead trying to juggle the individual syllables in your head.
Math notation sucks because none of it maps to things non-mathematicians know. Every time a new symbol is introduced, whether it be a Greek letter or an operator, it's one more mapping your brain has to create to remember it. And on top of this, you also have to remember the English names too. Yes I said names - most math concepts have so many different names it's crazy. Even basic arithmetic can't escape this. There are two names for multiplication (multiply, product) and four common notations for representing it (*, x, ·, and whatever you call it when two variables are next to each other).