from Hacker News

Terry Tao on some desirable properties of mathematical notation

by jjhawk on 7/21/20, 11:11 PM with 217 comments

  • by Koshkin on 7/22/20, 2:09 AM

    Mathematical notation is great at facilitating formal manipulations. This is its critical feature; without it we would be stuck at the level of ancient mathematics, and it is the reason notation was invented a few hundred years ago in the first place. That said, I find that notation is often abused in texts as a mere substitute for ordinary human language. While this compresses the text, it does nothing to help the reader better understand what is being said; instead it reads like a crazy mess of characters and other marks in a multitude of fonts, styles, and sizes whose only apparent purpose is to cause eye strain.
  • by dopu on 7/22/20, 4:16 AM

    Is it just me, or does probability theory in general have fairly terrible notation? Random variables and their distributions are ambiguous because they are distinguished only by upper versus lower case, likelihood functions are written alternately as L() or p(), and p() (with different arguments) is used to refer to different probability distributions. Perhaps I'm having such a difficult time grokking probability theory simply because it's difficult stuff, but I often find myself immensely frustrated with the notation.
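
    In code, the overloading of p() disappears, because the two densities are forced to carry distinct names. A toy sketch of this (the names p_X and p_Y are my own, purely illustrative):

```python
import math

# On paper both of these might be written p(.); in code they have to be named apart.
def p_X(x, mu=0.0, sigma=1.0):
    """Density of a normal random variable X."""
    return math.exp(-((x - mu) / sigma) ** 2 / 2) / (sigma * math.sqrt(2 * math.pi))

def p_Y(y, lam=1.0):
    """Density of an exponential random variable Y."""
    return lam * math.exp(-lam * y) if y >= 0 else 0.0

# "p(0)" on paper is ambiguous between the two; here the name picks the distribution:
print(p_X(0.0))  # 1/sqrt(2*pi), about 0.3989
print(p_Y(0.0))  # exactly 1.0
```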
  • by JadeNB on 7/21/20, 11:59 PM

    I found this post a shame. (The post itself, not putting it here; I love seeing math posts on HN, and automatically upvote. Bringing hackers and mathematicians together is highly worthwhile for both.)

    Usually Tao's posts are so insightful, and crystallise some idea so perfectly that it feels like I was just on the cusp of discovering it myself—a rare talent, and hard to cultivate since it goes against the ego. In this case, though: I'm a professional mathematician, and as prone as anyone in my discipline to use mathematical language to describe not strictly mathematical things, but the pseudo-mathematisation here ("Notation^{-1}(C)", for example) seems more like wit than clarity. Not that there's anything wrong with wit, but in this case it seems to me that it's at the expense of, rather than a pleasant addition to, the central point.

    I'd like to hear especially from anyone who isn't a professional mathematician: did you feel that this post improved your understanding of the purpose and function of good notation?

    (EDIT: I was scared about making this post, since there's rightfully a lot of respect and appreciation for Tao—and I hope it's clear that I concur on both counts—and I wasn't sure how my reticence on his post would go over; but I'm super glad I asked. Thanks so much to everyone downthread; these are wonderful responses and I feel that it benefited me a lot to read them.)

  • by moonchild on 7/22/20, 1:12 AM

    Another interesting notation is Iverson notation. See Notation as a Tool of Thought[1]. Here's the inner product (note that this is actually the general inner product):

      c ≡ u +.× v
    
    1. https://www.jsoftware.com/papers/tot.htm
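
    The `+.×` form generalizes: in APL/J, any reduction can be paired with any pairwise operation. A rough Python sketch of that idea (the helper name `inner` is mine, not from the paper):

```python
import operator
from functools import reduce

def inner(f, g, u, v):
    """Generalized inner product: reduce with f over pairwise applications of g.
    APL/J writes this as f.g; the ordinary dot product is +.x."""
    return reduce(f, (g(a, b) for a, b in zip(u, v)))

# Ordinary dot product, +.x : 1*4 + 2*5 + 3*6 = 32
dot = inner(operator.add, operator.mul, [1, 2, 3], [4, 5, 6])

# Min-plus inner product (the algebra behind shortest paths): min(1+4, 2+5, 3+6) = 5
minplus = inner(min, operator.add, [1, 2, 3], [4, 5, 6])
```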
  • by emmanueloga_ on 7/22/20, 12:40 AM

    The discussion of mathematical notation reminds me of Guy Steele's talk "It's Time for a New Old Language", discussed previously on HN [1]. That talk focused on the math notation used in computer science papers, but I feel a similar analysis could be extended to other areas of math.

    1: https://news.ycombinator.com/item?id=15473199

  • by Koshkin on 7/22/20, 12:45 AM

    Difficulties, if any, perceived or real, arising in connection with notation are usually incomparably smaller than those presented by the subject itself. (Personally, I only wish mathematical notation were better integrated with software in general and programming languages in particular.)
  • by riazrizvi on 7/22/20, 12:53 AM

    Unambiguity is a slippery criterion. Mathematical notation must be concise, because a key purpose is to provide understanding, which it achieves by focused abstraction. So when you search for notation to model some real-world system, you leave things out; this leaves room for interpretation when mapping back to the real world, i.e., there is ambiguity. I think this #1 item should really be termed consistency, because above all, notation must not contradict itself.
  • by Darkstryder on 7/22/20, 7:53 AM

    Steal this idea: a Shazam for mathematical notation. In an app you would draw (or take a picture of) a mathematical symbol you don't recognize and get a link to the appropriate Wikipedia page.

    My biggest pet peeve with mathematical symbols is the difficulty of looking them up when you don’t know them already. If I’m reading a text on a topic I'm unfamiliar with, I can at least google the keywords I don't know. This is difficult with symbols.

  • by btrettel on 7/22/20, 2:42 AM

    Terry Tao mentions that notation can help with error detection. Anyone here aware of some good examples?

    One that I like is that in Einstein notation you can't have 3 of the same index, e.g., u_i u_i u_i is invalid.
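
    That check can even be mechanized: in a single Einstein-convention term, no index may appear more than twice. A minimal sketch of such a validator (the function name and the string encoding of indices are my own simplification, not a full tensor grammar):

```python
from collections import Counter

def valid_einstein_term(indices):
    """True if no index letter appears more than twice in one term."""
    return all(count <= 2 for count in Counter(indices).values())

print(valid_einstein_term("ii"))   # u_i u_i, a contraction: True
print(valid_einstein_term("iii"))  # u_i u_i u_i: False
print(valid_einstein_term("ijj"))  # u_i v_j w_j: True
```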

  • by wavegeek on 7/22/20, 10:13 AM

    I like his point about lack of ambiguity. Nothing makes me want to punch an author in the head (without, to be clear, any possibility that I would actually do it) like lazily creating an ambiguous notation that is supposed to be "clear from context" but rarely is. The Einstein summation convention, for example, is to be ignored "when clear from context".

    I would add

    1. Clearly telegraphing notations. Not hiding them in the middle of long paragraphs or even (and yes, I have seen this a few times) defining essential notation in an optional exercise.

    2. Having a glossary of notations, so people don't have to remember every single notation or read every word of the book sequentially.

    3. Not creating low-value notations that may be used only once and then forgotten. I have read books with more than one new notation or definition per page, mostly forgotten thereafter, yet with some random subset needed later, and you have no way of knowing which.

  • by vii on 7/22/20, 3:06 AM

    Enumerating what we want from notation helps us understand how far we are from the ideal. The whimsical introduction of Notation to talk about notation makes it practical. Given a domain in mathematics, adding notation (e.g., for modular arithmetic) can make complex notions pretty to express and quick to prove. I used to really enjoy this and tried to redefine notation for each exposition. It's shorter and prettier, but it just pushes complexity into the notation :) and teaching people a new notation is expensive; unless it is used repeatedly, it costs more than laying out the details in a less concise notation.

    Programming languages are notations within this framework, and domain-specific languages, while much more efficient, are unpopular because the costs of changing notation, in terms of training people, are too high.

    The cost of communicating the notation is captured in a few of the desiderata (e.g., 1 and 7), but in practice it is the most important one. If we want to be easily understood, we should speak a common language!

  • by jjhawk on 7/21/20, 11:12 PM

  • by enriquto on 7/22/20, 9:15 AM

    It would be nice to have an equivalent post about programming languages. The fact that different programs can express an identical computation is important. For example, in Python/NumPy you can write

        c = 0
        for i in range(u.size):
            c = c + u[i] * v[i]
    
    or

        c = u.T @ v
    
    and even though the result is identical, the computation is not: the first is orders of magnitude slower. There is no good reason for it to be so, unfortunately.
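
    The usual explanation for the gap: the loop executes Python bytecode for every element, while `@` dispatches once into compiled BLAS. A rough sketch for checking both claims yourself (no specific timings are claimed; they vary by machine):

```python
import timeit

import numpy as np

rng = np.random.default_rng(0)
u = rng.random(100_000)
v = rng.random(100_000)

def loop_dot(u, v):
    # One round of Python bytecode per element.
    c = 0.0
    for i in range(u.size):
        c = c + u[i] * v[i]
    return c

# The two forms agree up to floating-point rounding...
assert np.isclose(loop_dot(u, v), u @ v)

# ...but differ hugely in speed.
t_loop = timeit.timeit(lambda: loop_dot(u, v), number=5)
t_vec = timeit.timeit(lambda: u @ v, number=5)
print(f"loop: {t_loop:.4f}s   vectorized: {t_vec:.4f}s")
```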
  • by eternalban on 7/22/20, 11:31 AM

    "Preservation of quality, II" and "Suggestiveness, I" likely manifest together.

    I suggest that should one strive to 'fine-tune' a notation N to possess the above two qualities for one family of objects in X, other categories of objects in X will become opaque and difficult to express, i.e., it becomes a domain-specific notation.

  • by peignoir on 7/22/20, 1:37 PM

    Would anyone be interested in helping to build a Google Translate for math?
  • by peter303 on 7/22/20, 3:42 AM

    Tao is a rare person with a 200 IQ.
  • by foobar_ on 7/22/20, 2:18 AM

    No one uses mathematical notation for practical purposes. It is just like medieval music notation, which is neither practical nor what modern composers use; their notation is more visual in nature. In fact, modernism is a rejection of medievalism.

    I think that in the future programming will force all mathematicians to code or to publish simulations. Most mathematical notation was intended by its original authors to be throwaway; that's why there are so many notations. Trying to find relevance in them is a pointless exercise. Much like 80/20 or tabs vs. spaces, most of the original intent is lost, and what survives is guff kept for ceremonial purposes.

  • by zitterbewegung on 7/22/20, 12:07 AM

    This looks like a similar approach to TLA+, but it seems more like a domain-specific markup language.

    I think he has a good idea for the most part, and if you did formalize this notation, there is a good chance that someone in the computer science domain would eventually write something that could interpret it. Lisp comes to mind.

  • by pubby on 7/22/20, 12:28 AM

    Imagine I give you a list of words and ask you to remember them. 5 minutes later, I ask you to give me those words in reverse order. Not too hard, right?

    Now imagine if those words I gave you were in Vietnamese, or some language you don't speak. Suddenly the task becomes much more confusing. You aren't remembering a small handful of objects and ideas, but instead trying to juggle the individual syllables in your head.

    Math notation sucks because none of it maps to things non-mathematicians know. Every time a new symbol is introduced, whether it's a Greek letter or an operator, it's one more mapping your brain has to create to remember it. And on top of this, you also have to remember the English names too. Yes, I said names: most math concepts have so many different names it's crazy. Even basic arithmetic can't escape this. There are two names for multiplication (multiply, product) and four common notations for representing it (*, ×, ·, and juxtaposition, i.e., two variables written next to each other).