from Hacker News

Tensors

by bovem on 7/3/24, 3:19 PM with 2 comments

  • by WCSTombs on 7/3/24, 4:05 PM

    > In linear algebra, a tensor is an array of data expanding in multiple (or zero) independent dimensions. It is used to represent quantities/equations/functions with multiple components, for example, the equation 3x+2y=0 could be represented with the tensor [3 2 0] where each value in the tensor represents the different components of the equation.

    Sorry to be pedantic here, but in linear algebra, a tensor represents a multilinear operation, i.e., a function that satisfies

        F(c*x0 + y0, x1, ..., x{n}) = c*F(x0, x1, ..., x{n}) + F(y0, x1, ..., x{n})
    
    and so on for the other n arguments. The variables x{k} and y0 are vectors here, and c is a scalar. The multiple components come in when you choose bases for all the vector spaces involved there, just like the components of a vector or a matrix.
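    A minimal sketch of that property (assuming numpy; the matrix A and the function F are illustrative, not from the comment): a fixed matrix defines a bilinear map F(x, y) = xᵀAy, and we can check linearity in the first argument numerically.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # A fixed 3x3 matrix A defines a bilinear map F(x, y) = x^T A y,
    # i.e. a tensor with two vector arguments.
    A = rng.standard_normal((3, 3))

    def F(x, y):
        return x @ A @ y

    # Random vectors and a scalar to test multilinearity with.
    x0, y0, x1 = rng.standard_normal((3, 3))
    c = 2.5

    # Linearity in the first argument:
    #   F(c*x0 + y0, x1) == c*F(x0, x1) + F(y0, x1)
    lhs = F(c * x0 + y0, x1)
    rhs = c * F(x0, x1) + F(y0, x1)
    assert np.isclose(lhs, rhs)
    ```

    The same check, repeated for the second argument, confirms linearity there too; with more arguments you get the general multilinear case described above.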

    I'm sweeping some stuff under the rug here (like tensor products) because I don't want to go into a full discourse, but that one fact is like 90% of what all tensors do, and with it in hand, you can easily reach the other 10%.