by dhruvp on 4/23/19, 5:25 PM with 93 comments
by _hardwaregeek on 4/23/19, 8:39 PM
by andrewla on 4/23/19, 7:41 PM
Personally I found the prospect of tensor algebra to be much more intuitive than either of these; with matrices thrown in mostly as a computational device. Even a vector (through the dot product) is just a linear function on other vectors, and the notion of function composition carries through to that and to higher-order tensors.
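A tiny sketch of the point above: a fixed vector, through the dot product, acts as a linear function on other vectors (all names and numbers here are illustrative, not from the comment):

```python
# A fixed vector v defines a linear function f(w) = v . w on other vectors.

def dot(v, w):
    return sum(a * b for a, b in zip(v, w))

v = [2.0, -1.0, 3.0]
f = lambda w: dot(v, w)

w = [1.0, 0.0, 2.0]
u = [0.5, 4.0, -1.0]
c = 3.0

# Linearity: f(w + u) = f(w) + f(u) and f(c*w) = c*f(w)
assert f([wi + ui for wi, ui in zip(w, u)]) == f(w) + f(u)
assert f([c * wi for wi in w]) == c * f(w)
```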
Covariance and contravariance are a little more complicated to completely grok, but for most applications in Euclidean space (where the metric is the identity) the distinction is of more theoretical interest anyway.
by munchbunny on 4/23/19, 8:58 PM
That was taught right after a unit on complex numbers and trigonometry so that we could see the parallels between composing polynomial functions on complex numbers and composing affine transformations.
To this day I think that was one of the most beautiful and eye opening lessons I've had in mathematics.
In hindsight, I think I got lucky that the teachers who wrote the curriculum this way were math, physics, and comp sci masters/phd's who looked at their own educations and decided that geometry class was a great Trojan horse for linear algebra.
by whatshisface on 4/23/19, 7:01 PM
by dhruvp on 4/23/19, 7:43 PM
When I was first introduced to matrices (high school), it was in the context of systems of equations. Matrices were a shorthand for writing out the equations and happened to have interesting rules for addition etc. It took me a while to think of them as functions in their own right and not just tables. This post is my attempt to relearn them as functions, which has helped me develop a much stronger intuition for linear algebra. That's my motivation for this post and why I decided to work on it. Feedback is more than welcome.
by michelpp on 4/23/19, 8:30 PM
https://www.3blue1brown.com/essence-of-linear-algebra-page
Math is Fun also has a nice writeup that explains matrix multiplication with a real-world example of a bakery making pies and tracking costs:
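A hedged sketch of that bakery-style example (the actual numbers in the writeup may differ; these are made up): rows are pie types, columns are ingredient quantities, and multiplying by a cost vector gives a cost per pie.

```python
# Each row of `recipes` lists ingredient quantities per pie type;
# `ingredient_cost` gives cost per unit of each ingredient.
# All numbers below are illustrative, not from the Math is Fun article.

recipes = [            # flour, sugar, butter (units per pie)
    [3, 1, 2],         # apple pie
    [2, 2, 1],         # cherry pie
]
ingredient_cost = [0.5, 0.8, 1.2]  # cost per unit

def matvec(M, v):
    """Matrix-vector product: one dot product per row."""
    return [sum(m * x for m, x in zip(row, v)) for row in M]

cost_per_pie = matvec(recipes, ingredient_cost)
# apple pie: 3*0.5 + 1*0.8 + 2*1.2 = 4.7
```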
by noobermin on 4/24/19, 1:08 AM
PS. In case you didn't know, affine transformations are not linear:
f(x) = mx + b =>
f(x + y) = m(x + y) + b ≠ (mx + b) + (my + b) = f(x) + f(y),
f(cx) = mcx + b ≠ c(mx + b) = c f(x)
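A quick numeric check of the comment above: with any nonzero offset b, f(x) = mx + b fails both linearity conditions (the particular values of m, b, x, y, c below are arbitrary).

```python
# f(x) = m*x + b is affine, not linear, whenever b != 0.
m, b = 2.0, 5.0
f = lambda x: m * x + b

x, y, c = 3.0, 4.0, 10.0

assert f(x + y) != f(x) + f(y)   # additivity fails
assert f(c * x) != c * f(x)      # homogeneity fails
```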
by tptacek on 4/23/19, 11:12 PM
https://www.dhruvonmath.com/2019/04/04/kernels/
The matrix/function stuff is elementary enough that I understand it intuitively (I suck at math), although it's neat to be reminded that given enough independent points you can reconstruct the function (this breaks a variety of bad ciphers, sometimes including ciphers that otherwise look strong).
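A toy version of the "enough independent points" observation: an unknown 2x2 linear map can be recovered exactly from its action on two independent inputs (the "secret" matrix here is made up for illustration).

```python
import numpy as np

# An unknown linear map...
secret = np.array([[2.0, 1.0],
                   [0.0, 3.0]])

# ...observed on two linearly independent inputs (the columns of X).
X = np.array([[1.0, 1.0],
              [0.0, 1.0]])
Y = secret @ X                    # observed outputs

# Solve M @ X = Y for M: the map is fully determined.
recovered = Y @ np.linalg.inv(X)
assert np.allclose(recovered, secret)
```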
The kernel post actually does some neat stuff with the kernel, which I found more intuitively accessible than (say) what Strang does with nullspaces.
by adenadel on 4/23/19, 7:12 PM
by meuk on 4/24/19, 7:27 AM
For a matrix M, define f_M(x) = M * x. Then f_{A * B}(x) = f_A(f_B(x)), so that f_{(A * B) * C}(x) = f_{A * B}(f_C(x)) = f_A(f_B(f_C(x))), and also f_{A * (B * C)}(x) = f_A(f_{B * C}(x)) = f_A(f_B(f_C(x))).
So f_{(A * B) * C}(x) = f_{A * (B * C)}(x) = f_A(f_B(f_C(x))).
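A numeric spot-check of the argument above: because matrix multiplication is function composition, it is associative (the matrices here are arbitrary random examples).

```python
import numpy as np

rng = np.random.default_rng(0)
A, B, C = (rng.standard_normal((3, 3)) for _ in range(3))
x = rng.standard_normal(3)

# (A*B)*C == A*(B*C), and both act on x as A(B(C(x))).
assert np.allclose((A @ B) @ C, A @ (B @ C))
assert np.allclose(((A @ B) @ C) @ x, A @ (B @ (C @ x)))
```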
by adamnemecek on 4/23/19, 9:53 PM
http://www.reproducibility.org/RSF/book/bei/conj/paper_html/...
Esp the ray tracing/topology relationship is nuts.
by ivan_ah on 4/23/19, 7:54 PM
Here is a video tutorial that goes through some of the same topics (build up matrix product from the general principle of a linear function with vector inputs): https://www.youtube.com/watch?v=WfrwVMTgrfc
Associated Jupyter notebook here: https://github.com/minireference/noBSLAnotebooks/blob/master...
by Jun8 on 4/23/19, 7:24 PM
One question that usually pops up that I was confused about till recently: are rank-two tensors equivalent to matrices? The answer is no, e.g. see here: https://physics.stackexchange.com/questions/20437/are-matric...
by S4M on 4/24/19, 7:35 AM
by sytelus on 4/24/19, 10:57 AM
by zwieback on 4/23/19, 7:35 PM
by kregasaurusrex on 4/23/19, 7:38 PM
by diehunde on 4/23/19, 8:18 PM
by mikorym on 4/23/19, 8:09 PM
by je42 on 4/24/19, 6:16 PM
by Grustaf on 4/24/19, 7:04 AM
by j7ake on 4/23/19, 7:08 PM