by organicfigs on 5/12/20, 4:55 AM with 200 comments
by katzgrau on 5/12/20, 12:13 PM
Unfortunately I'm one of those people who tends to reject the process until I understand why it works.
If it weren't for Strang's thoughtful and sometimes even entertaining lectures via OCW, I probably would have failed the course. Instead, as the material became considerably more abstract and actually required understanding, I had my strongest exam scores. I didn't even pay attention in class, and I finished with an A. My first exam was a 70/100, below the class average, so the fact that I got an A overall suggests how poorly the rest of the class must have done on the later material, where I felt strongest thanks to the videos.
So anyway, thank you Gilbert Strang.
by pengaru on 5/12/20, 11:57 AM
At some point it's like "Wait, is linear algebra really just about heaps of multiplication and addition? Like every dimension gets multiplied by values for every dimension, and values 0 and 1 are way more interesting than I previously appreciated. That funny identity matrix with the diagonal 1s in a sea of 0s, that's just an orthonormal basis where each corresponding dimension's axis is getting 100% of the multiplication like a noop. This is ridiculously simple yet unlocks an entire new world of understanding, why the hell couldn't my textbooks explain it in these terms on page 1? FML"
I'm still a noob when it comes to linear algebra and 3D stuff, but it feels like all the textbooks in the world couldn't have taught me what some hands-on 3D graphics programming impressed upon me rather quickly. Maybe my understanding is all wrong; feel free to correct me, as I'm entirely self-taught on this subject.
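For what it's worth, here is a tiny NumPy sketch of that "identity is a no-op" observation (my own toy example, not from any textbook):

    import numpy as np

    point = np.array([3.0, -1.0, 2.0])   # a point in 3D
    I = np.eye(3)                         # diagonal 1s in a sea of 0s

    # Each output coordinate is a weighted sum of *every* input coordinate;
    # the identity gives weight 1 to the matching axis and 0 to everything
    # else, so the transformation is a no-op.
    assert np.array_equal(I @ point, point)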
by knzhou on 5/12/20, 6:31 AM
I'm really thankful to MIT OCW for putting his lectures out for free -- in fact, I think I'll go donate to them now.
by auggierose on 5/12/20, 8:20 AM
If you look at the order of topics in his book "Introduction to Linear Algebra", you will find the topic "Linear Transformations" way back in chapter 8! Even after the chapters on eigenvalue decomposition and singular value decomposition. But understanding that a matrix is just the representation of a linear transformation in a particular basis is probably the first and most important thing you should learn about matrices ...
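A minimal NumPy sketch of that point (my own toy example, not from the book): the same linear map, expressed in two different bases, gives two different matrices related by P^(-1) A P.

    import numpy as np

    A = np.array([[2.0, 1.0],
                  [0.0, 3.0]])        # a linear map, written in the standard basis

    P = np.array([[1.0, 1.0],
                  [0.0, 1.0]])        # columns of P = a new (non-standard) basis

    A_new = np.linalg.inv(P) @ A @ P  # the *same* map, written in the new basis

    v_new = np.array([1.0, 2.0])      # a vector's coordinates in the new basis
    v_std = P @ v_new                 # the same vector in standard coordinates

    # Applying the map in either coordinate system describes the same result.
    assert np.allclose(P @ (A_new @ v_new), A @ v_std)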
by roenxi on 5/12/20, 7:31 AM
by enitihas on 5/12/20, 9:26 AM
Link: https://link.springer.com/book/10.1007/978-3-319-11080-6
by crdrost on 5/12/20, 8:16 AM
But I do foresee some difficulties. One thing that I find really difficult, for example, is that I take undergrads who have had linear algebra, ask "what is the determinant?", and seldom get back the "best" conceptual answer: "the determinant is the product of the eigenvalues." This is math, so the best answer should not be the only one, but it should ideally be the most popular. I would consider it a failure if the most popular explanation of the fundamental theorem of calculus were not some variation of "integrals undo derivatives and vice versa," and I don't see this approach solving that.
Furthermore, there is a lot of focus from day one on this CR decomposition, which serves to say that a linear transform from R^m to R^n might map to a subspace of R^n with smaller dimension r < min(m, n). While in some sense this is true, it is itself quite "unphysical": if a matrix contains noisy entries, then it will be degenerate in this way only with probability zero. (You need perfect noise cancellation to get degeneracies, which amounts to a sort of neglected underlying conserved quantity pushing back on you and demanding to be conserved.) In that sense the CR decomposition is kind of pointless, just working around some "perfect little counterexamples", so it seems weird to see someone hold it up as the most important thing.
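A quick numerical sanity check of both points, sketched with NumPy (my own illustration, not the commenter's): the determinant equals the product of the eigenvalues, and a random "noisy" matrix is full rank.

    import numpy as np

    rng = np.random.default_rng(0)
    A = rng.standard_normal((4, 4))      # a random (noisy) 4x4 matrix

    eigenvalues = np.linalg.eigvals(A)   # complex in general
    product = np.prod(eigenvalues)

    # For a real matrix, complex eigenvalues come in conjugate pairs,
    # so their product is real and equals the determinant (up to rounding).
    assert np.isclose(np.linalg.det(A), product.real)

    # And, as argued above, a noisy matrix is degenerate with probability zero:
    assert np.linalg.matrix_rank(A) == 4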
by brmgb on 5/12/20, 9:27 AM
My country's curriculum introduces linear algebra through group theory and vector spaces. Matrices come later.
by balls187 on 5/12/20, 4:36 PM
Some of the concepts made sense, especially solving for linear systems of equations.
Recently, I decided to brush up on my math skills via Youtube videos, and came across this series: https://www.youtube.com/channel/UCYO_jab_esuFRV4b17AJtAw
It explains Linear Algebra concepts using 2D and 3D vector manipulation, and the animations help me visualize the underlying maths.
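In a similar spirit, here is a rough matplotlib sketch (my own, not from the channel) showing what a single 2x2 matrix does to a grid of 2D points:

    import numpy as np
    import matplotlib.pyplot as plt

    shear = np.array([[1.0, 1.0],
                      [0.0, 1.0]])                 # an arbitrary 2D shear

    xs, ys = np.meshgrid(np.linspace(-2, 2, 9), np.linspace(-2, 2, 9))
    points = np.vstack([xs.ravel(), ys.ravel()])   # 2 x N grid of points
    moved = shear @ points                         # every point transformed at once

    plt.scatter(points[0], points[1], s=10, label="original grid")
    plt.scatter(moved[0], moved[1], s=10, label="after the shear")
    plt.gca().set_aspect("equal")
    plt.legend()
    plt.show()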
by srean on 5/12/20, 8:26 AM
In my time I picked up LA from Ben Noble, Halmos, and Axler, and the computational side of things from Golub & Van Loan.
by irl_zebra on 5/12/20, 1:03 PM
I started doing LA on Khan Academy, and checked out Linear Algebra Done Right. LADR was a little too far into the deep end for me; KA seemed to be good. One nice thing about KA is that when I didn't quite remember something (e.g., how exactly to multiply matrices), I could just go to an earlier pre-LA lesson, pick it up, and then go back to LA where I left off. I'm a few lessons in.
What do you all recommend for someone like me?
by praptak on 5/12/20, 8:48 AM
by tomahunt on 5/12/20, 7:51 AM
https://www.tandfonline.com/doi/abs/10.1080/00029890.2018.14...
by cashsterling on 5/14/20, 5:25 AM
I also really like the applied linear algebra book by Boyd and Vandenberghe: https://web.stanford.edu/~boyd/vmls/ (a free PDF is available on their website). There are Julia and Python code companions for the book, and lecture slides from both profs on their websites. Also check out their other books, many of which have free PDFs available.
I can also recommend Data-Driven Science and Engineering by Brunton and Kutz: http://databookuw.com/ There used to be a free preprint PDF of the book, but I can't find it now. The book is totally worth picking up... MATLAB and Python code are available. Steve Brunton's lectures on YouTube are pretty damn good and complement the book well: https://www.youtube.com/channel/UCm5mt-A4w61lknZ9lCsZtBw/fea...
Another really cool book is Algorithms for Optimization by Mykel Kochenderfer and Tim Wheeler: https://mitpress.mit.edu/books/algorithms-optimization. Julia code used in book.
by frequentnapper on 5/12/20, 7:30 AM
by sqlmonkey on 5/12/20, 7:55 AM
by chadcmulligan on 5/12/20, 11:24 AM
by jp0d on 5/12/20, 7:20 AM
by ivan_ah on 5/12/20, 6:35 PM
One of the interesting new ways of thinking in these lectures is the A = CR decomposition for any matrix A, where C is a matrix that contains a basis for the column space of A, while R contains the non-zero rows in RREF(A) — in other words a basis for the row space, see https://ocw.mit.edu/resources/res-18-010-a-2020-vision-of-li...
Example you can play with: https://live.sympy.org/?evaluate=C%20%3D%20Matrix(%5B%5B1%2C...
Thinking of A as CR might be a little intense as a first contact with linear algebra, but I think it contains the "essence" of what is going on, and could potentially set the stage for when these concepts are explained (normally much later in a linear algebra course). Also, I think the "A = CR picture" is a nice justification for where RREF(A) comes from... otherwise students always complain that the first few chapters on Gauss-Jordan elimination are "mind-numbing arithmetic" (which is kind of true...), but maybe if we presented the algorithm as "finding the CR decomposition, which will help you understand dozens of other concepts in the remainder of the course," it would motivate more people to learn about RREFs and the G-J algorithm.
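A minimal SymPy sketch of the decomposition (my own example matrix, not the one in the live.sympy.org link above):

    from sympy import Matrix

    A = Matrix([[1, 2, 3],
                [2, 4, 6],
                [1, 1, 1]])              # rank 2: the second row is twice the first

    rref_A, pivot_cols = A.rref()        # RREF plus the pivot column indices
    C = A[:, list(pivot_cols)]           # pivot columns of A: column-space basis
    R = rref_A[:len(pivot_cols), :]      # non-zero rows of RREF(A): row-space basis

    assert C * R == A                    # the decomposition reconstructs A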
by inshadows on 5/12/20, 9:47 AM
by synaesthesisx on 5/12/20, 8:00 AM
by glram on 5/12/20, 7:57 AM
On another note, he is such a nice guy. 10/10.
by abecode on 5/12/20, 6:56 PM
by vertak on 5/12/20, 12:05 PM
by penguin_booze on 5/12/20, 9:15 AM
by DreamScatter on 5/12/20, 10:16 PM
by hprotagonist on 5/12/20, 1:39 PM
That certainly got our attention. I’ve always found linear algebra to be kind of ... almost soothing.
by longtimegoogler on 5/12/20, 6:02 PM
That book discusses the actual algorithms used for computation. It is a bit more advanced, but amazingly clear.
by elAhmo on 5/12/20, 10:13 AM
I am aware of his course on OCW, but I'm wondering whether there is something more interactive and/or newer than those lectures, with similar quality.
by potta_coffee on 5/12/20, 6:54 AM
by anandrm on 5/12/20, 12:48 PM
by xchip on 5/12/20, 7:58 PM
by tomerbd on 5/12/20, 6:50 AM
by lcuff on 5/12/20, 7:57 AM