A linear transformation is a function from one vector space to another that respects the underlying (linear) structure of each vector space. A linear transformation is also known as a linear operator or map. The codomain of the transformation may be the same as the domain, and when that happens, the transformation is known as an endomorphism or, if invertible, an automorphism. The two vector spaces must have the same underlying field.
The defining characteristic of a linear transformation \(T: V \to W\) is that, for any vectors \(\mathbf{u}\) and \(\mathbf{v}\) in \(V\) and scalars \(a\) and \(b\) of the underlying field,
\[T(a\mathbf{u} + b\mathbf{v}) = aT(\mathbf{u}) + bT(\mathbf{v}).\]
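This defining property can be checked numerically. The following sketch verifies the condition for sample vectors and scalars; the map `T` below is an arbitrary illustrative choice, not one taken from the text:

```python
import numpy as np

# An illustrative linear map T : R^2 -> R^2 (the entries are an arbitrary choice).
def T(v):
    return np.array([2 * v[0] + v[1], -v[0] + 3 * v[1]])

u = np.array([1.0, 2.0])
v = np.array([3.0, -1.0])
a, b = 4.0, -2.0

# The defining property: T(a*u + b*v) must equal a*T(u) + b*T(v).
lhs = T(a * u + b * v)
rhs = a * T(u) + b * T(v)
print(np.allclose(lhs, rhs))  # True
```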
Linear transformations are useful because they preserve the structure of a vector space. So, many qualitative assessments of a vector space that is the domain of a linear transformation may, under certain conditions, automatically hold in the image of the linear transformation. For instance, the structure immediately gives that the kernel of a linear transformation is a subspace (not just a subset) of the domain, and that the image is a subspace (not just a subset) of the codomain.
Many familiar functions can be seen as linear transformations in the proper setting. Change-of-basis transformations are linear, and most geometric operations, including rotations, reflections, and contractions/dilations, are linear transformations. Even more powerfully, linear algebra techniques can apply to certain very non-linear functions through either approximation by linear functions or reinterpretation as linear functions in unusual vector spaces. A comprehensive, grounded understanding of linear transformations reveals many connections between areas and objects of mathematics.
A common transformation in Euclidean geometry is rotation in a plane, about the origin. By considering Euclidean points as vectors in the vector space \(\mathbb{R}^2\), rotations can be viewed in a linear algebraic sense. A rotation of \(\mathbb{R}^2\) counterclockwise by angle \(\theta\) is given by
\[T(x, y) = (x\cos\theta - y\sin\theta,\; x\sin\theta + y\cos\theta), \quad \text{with matrix} \quad \begin{pmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{pmatrix}.\]
The linear transformation \(T\) goes from \(\mathbb{R}^2\) to \(\mathbb{R}^2\) and is given by the matrix shown above. Because this matrix is invertible for any value of \(\theta\), it follows that this linear transformation is in fact an automorphism. Since rotations can be "undone" by rotating in the opposite direction, this makes sense.
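As a quick numerical sketch, the rotation matrix can be constructed and its invertibility checked; the particular angle used here is an arbitrary choice:

```python
import numpy as np

def rotation(theta):
    """Matrix of the counterclockwise rotation of R^2 by the angle theta."""
    return np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]])

theta = 0.7  # arbitrary sample angle
R = rotation(theta)

# det R = cos^2(theta) + sin^2(theta) = 1, so R is invertible for every theta.
print(np.isclose(np.linalg.det(R), 1.0))                 # True

# "Undoing" the rotation: the inverse is rotation by -theta.
print(np.allclose(np.linalg.inv(R), rotation(-theta)))   # True
```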
Linear transformations are most commonly written in terms of matrix multiplication. A transformation \(T\) from an \(n\)-dimensional vector space \(V\) to an \(m\)-dimensional vector space \(W\) is given by an \(m \times n\) matrix \(A\). Note, however, that this requires choosing a basis for \(V\) and a basis for \(W\), while the linear transformation exists independent of basis. (That is, it can be expressed as a matrix for any selection of bases.)
For example, the linear transformation \(T\) from \(\mathbb{R}^2\) to \(\mathbb{R}^3\) defined by \(T(x, y) = (x + y,\; x - y,\; y)\) is given by the matrix
\[A = \begin{pmatrix} 1 & 1 \\ 1 & -1 \\ 0 & 1 \end{pmatrix}.\]
So, \(T\) can also be defined for vectors \((x, y) \in \mathbb{R}^2\) by the matrix product
\[T(x, y) = \begin{pmatrix} 1 & 1 \\ 1 & -1 \\ 0 & 1 \end{pmatrix}\begin{pmatrix} x \\ y \end{pmatrix}.\]
Note that the dimension of the initial vector space is the number of columns in the matrix, while the dimension of the target vector space is the number of rows in the matrix.
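This dimension count can be seen concretely with an illustrative \(3 \times 2\) matrix (the particular entries below are an arbitrary choice):

```python
import numpy as np

# An illustrative 3x2 matrix: 2 columns (domain R^2), 3 rows (target R^3).
A = np.array([[1,  1],
              [1, -1],
              [0,  1]])

v = np.array([2, 5])  # a vector in the 2-dimensional domain
w = A @ v             # its image, a vector in the 3-dimensional target space

print(A.shape)        # (3, 2): rows = target dimension, columns = domain dimension
print(len(w))         # 3
```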
Linear transformations also exist in infinite-dimensional vector spaces, and some of them can also be written as matrices, using the slight abuse of notation known as infinite matrices. However, the concept of linear transformations exists independent of matrices; matrices simply provide a nice framework for finite computations.
A linear transformation is surjective if every vector in its codomain is in its image. Equivalently, for an \(m \times n\) matrix, at least one \(m \times m\) minor of the matrix is invertible; that is, the matrix has rank \(m\). It is injective if every vector in its image is the image of only one vector in its domain. Equivalently, at least one \(n \times n\) minor of the matrix is invertible; that is, the matrix has rank \(n\).
Is the linear transformation \(T(x, y, z) = (x + y,\; y + z)\), from \(\mathbb{R}^3\) to \(\mathbb{R}^2\), injective? Is it surjective?
For a vector \((x, y, z) \in \mathbb{R}^3\), this can be written as
\[T(x, y, z) = \begin{pmatrix} 1 & 1 & 0 \\ 0 & 1 & 1 \end{pmatrix}\begin{pmatrix} x \\ y \\ z \end{pmatrix}.\]
This is a \(2 \times 3\) matrix, so \(T\) is surjective because the \(2 \times 2\) minor \(\begin{pmatrix} 1 & 1 \\ 0 & 1 \end{pmatrix}\) has determinant \(1\) and therefore is invertible (since the determinant is nonzero). However, there are no \(3 \times 3\) minors, so it is not injective.
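In computational practice, the minor criterion is usually checked via the rank of the matrix. The sketch below uses an illustrative \(2 \times 3\) matrix:

```python
import numpy as np

# An illustrative 2x3 matrix representing a map T : R^3 -> R^2.
A = np.array([[1, 1, 0],
              [0, 1, 1]])
m, n = A.shape
rank = np.linalg.matrix_rank(A)

# Surjective iff rank equals the number of rows (an invertible m x m minor exists);
# injective iff rank equals the number of columns (an invertible n x n minor exists).
print(rank == m)  # True: T is surjective
print(rank == n)  # False: T is not injective
```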
A linear transformation \(T: V \to W\) between two vector spaces of equal dimension (finite or infinite) is invertible if there exists a linear transformation \(T^{-1}\) such that \(T\big(T^{-1}(\mathbf{w})\big) = \mathbf{w}\) and \(T^{-1}\big(T(\mathbf{v})\big) = \mathbf{v}\) for any vectors \(\mathbf{w} \in W\) and \(\mathbf{v} \in V\). For finite-dimensional vector spaces, a linear transformation is invertible if and only if its matrix is invertible.
Note that a linear transformation must be between vector spaces of equal dimension in order to be invertible. To see why, consider the linear transformation \(T(x, y) = x\) from \(\mathbb{R}^2\) to \(\mathbb{R}\). This linear transformation has a right inverse \(S(x) = (x, 0)\). That is, \(T(S(x)) = x\) for all \(x \in \mathbb{R}\). However, it has no left inverse, since there is no map \(U: \mathbb{R} \to \mathbb{R}^2\) such that \(U(T(\mathbf{v})) = \mathbf{v}\) for all \(\mathbf{v} \in \mathbb{R}^2\). This follows from facts about the rank of the matrix of \(T\): a \(1 \times 2\) matrix has rank at most \(1 < 2\), so \(T\) cannot be injective.
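A dimension mismatch of this kind can be sketched concretely. The projection map below is an illustrative choice of a transformation with a right inverse but no left inverse:

```python
import numpy as np

# T : R^2 -> R, projection onto the first coordinate, as a 1x2 matrix.
T = np.array([[1, 0]])

# S : R -> R^2, a right inverse of T, as a 2x1 matrix.
S = np.array([[1],
              [0]])

# T(S(x)) = x for all x: S is a right inverse of T.
print((T @ S == np.eye(1)).all())   # True

# S(T(v)) is not v in general: no left inverse can exist, since a
# 1x2 matrix has rank at most 1, so T cannot be injective.
print((S @ T == np.eye(2)).all())   # False
```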
Which of the following is/are invertible linear transformations?
- is the transformation that takes to .
- is the transformation that takes to .
- \(V\) is the vector space of all sequences of real numbers (vector addition creates a new sequence from the component-wise sums of the previous two). \(T\) is the "right shift" transformation that takes a sequence \((a_1, a_2, a_3, \ldots)\) and returns the sequence \((b_1, b_2, b_3, \ldots)\) satisfying \(b_1 = 0\) and \(b_{n+1} = a_n\) for all \(n \ge 1\).
Assume \(V\) is a vector space over the complex numbers.
A linear transformation can take many forms, depending on the vector space in question.
Consider the vector space \(P_n\) of polynomials of degree at most \(n\). By noting there are \(n + 1\) coefficients in any such polynomial, in some sense the equality \(P_n \cong \mathbb{R}^{n+1}\) holds. However, there is a natural linear transformation on this vector space, differentiation, that satisfies
\[D(x^k) = kx^{k-1}.\]
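Taking differentiation as the transformation (a standard example on this space), it can be written as a matrix acting on coefficient vectors. The following is a sketch for degree at most \(3\):

```python
import numpy as np

n = 3  # polynomials of degree at most 3, coefficients ordered [c0, c1, c2, c3]

# The derivative operator as an (n+1) x (n+1) matrix: it sends x^k to k*x^(k-1).
D = np.zeros((n + 1, n + 1))
for k in range(1, n + 1):
    D[k - 1, k] = k

# p(x) = 2 + 3x + 5x^2 + x^3, so p'(x) = 3 + 10x + 3x^2.
p = np.array([2, 3, 5, 1])
print(D @ p)  # coefficients of p': 3, 10, 3, 0
```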
A linear transformation from vector space \(V\) to vector space \(W\) is determined entirely by the images of the basis vectors of \(V\). This allows for more concise representations of linear transformations, and it provides a linear algebraic explanation for the relation between linear transformations and matrices (the matrix's columns record the images of the basis vectors).
Let \(V\) and \(W\) be vector spaces over the same field, and let \(\{\mathbf{e}_i\}\) be a set of basis vectors of \(V\). Then, for any function \(f\) from this basis to \(W\), there is a unique linear transformation \(T: V \to W\) such that \(T(\mathbf{e}_i) = f(\mathbf{e}_i)\) for each \(i\). Furthermore, the span of \(\{f(\mathbf{e}_i)\}\) is equal to the image of \(T\).
In other words, a linear transformation can be created from any function (no matter how "non-linear" in appearance) on the basis vectors. The behavior of basis vectors entirely determines the linear transformation.
The proof follows from the fact that any element of \(V\) is expressible as a linear combination of basis elements and that there is only one possible such linear combination.
With this mentality, change of basis can be used to rewrite the matrix for a linear transformation in terms of any basis. This is particularly helpful for endomorphisms (linear transformations from a vector space to itself).
However, the linear transformation itself remains unchanged, independent of basis choice. That is, no matter what the choice of basis, all the qualities of a linear transformation remain unchanged: injectivity, surjectivity, invertibility, diagonalizability, etc.
We can also establish a bijection between the linear transformations from an \(n\)-dimensional space \(V\) to an \(m\)-dimensional space \(W\) and the \(m \times n\) matrices. Let \(T\) be such a transformation, and fix the bases \(\mathcal{B} = \{\mathbf{v}_1, \ldots, \mathbf{v}_n\}\) for \(V\) and \(\mathcal{C} = \{\mathbf{w}_1, \ldots, \mathbf{w}_m\}\) for \(W\). Then we can describe the effect of \(T\) on each basis vector as follows:
\[T(\mathbf{v}_j) = \sum_{i=1}^{m} a_{ij}\mathbf{w}_i.\]
Define the \(m \times n\) matrix \(A = (a_{ij})\) to be the matrix of transformation of \(T\) in the bases \(\mathcal{B}, \mathcal{C}\). The \(j\)th column of \(A\) describes the effect of \(T\) on the \(j\)th basis vector of \(V\), and from the previous ideas, we can now describe the effect of \(T\) on any vector in \(V\), using coordinates, via matrix multiplication. Conversely, any \(m \times n\) matrix \(A\), multiplied on the right by coordinate vectors relative to \(\mathcal{B}\), defines a linear transformation from an \(n\)-dimensional space into an \(m\)-dimensional space, by the definition of matrix multiplication. Since the set of linear transformations from \(V\) to \(W\), denoted \(\mathcal{L}(V, W)\), is itself a vector space, fixing bases establishes a bijection from \(\mathcal{L}(V, W)\) to the set of all \(m \times n\) matrices.
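This correspondence can be sketched numerically: the matrix is assembled column by column from the images of the standard basis vectors, and matrix multiplication then reproduces the transformation in coordinates. The map `T` below is an arbitrary illustrative choice:

```python
import numpy as np

# An illustrative linear map T : R^3 -> R^2 (an arbitrary choice).
def T(v):
    return np.array([v[0] + v[1], v[1] - 2 * v[2]])

n = 3
# Column j of A is T applied to the j-th standard basis vector of R^3.
A = np.column_stack([T(e) for e in np.eye(n)])

v = np.array([1.0, 4.0, 2.0])
print(np.allclose(A @ v, T(v)))  # True: the matrix reproduces T in coordinates
```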
Let \(T\) be a linear transformation on a finite-dimensional vector space. Then, for some choice of basis, \(T\) is represented by a matrix \(A\), so that \(T(\mathbf{v}) = A\mathbf{v}\). Which properties of the matrix \(A\) remain unchanged regardless of basis?