Vector Space
Vector spaces are mathematical objects that abstractly capture the geometry and algebra of linear equations. They are the central objects of study in linear algebra.
The archetypical example of a vector space is the Euclidean space $\mathbb{R}^n$. In this space, vectors are $n$-tuples of real numbers; for example, a vector in $\mathbb{R}^3$ is $(1, 2, 3)$. These vectors have an addition operation defined on them, where one adds coordinate-wise:
\[ (a_1, \ldots, a_n) + (b_1, \ldots, b_n) = (a_1 + b_1, \ldots, a_n + b_n). \]
For a numerical example, consider the equation $(1, 2) + (3, 4) = (4, 6)$, which holds in $\mathbb{R}^2$. There is also a notion of scalar multiplication; given a real number $c$, one may scale a vector by $c$ as follows:
\[ c \cdot (a_1, \ldots, a_n) = (c a_1, \ldots, c a_n). \]
For instance, in $\mathbb{R}^2$ one has $3 \cdot (1, 2) = (3, 6)$.
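These coordinate-wise operations are easy to sketch in code. The following Python snippet (an illustrative sketch using plain tuples, not a linear algebra library) mirrors the definitions above:

```python
# Coordinate-wise vector addition and scalar multiplication in R^n,
# modeled with plain Python tuples.

def add(v, w):
    """Add two vectors of the same length coordinate-wise."""
    return tuple(a + b for a, b in zip(v, w))

def scale(c, v):
    """Scale each coordinate of the vector v by the number c."""
    return tuple(c * a for a in v)

print(add((1, 2), (3, 4)))  # prints (4, 6)
print(scale(3, (1, 2)))     # prints (3, 6)
```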
Abstract vector spaces generalize the example of $\mathbb{R}^n$. Roughly, a vector space is a set whose elements are called vectors, and these vectors can be added and scaled according to a set of axioms modeled on the properties of $\mathbb{R}^n$.
Vector spaces often arise as solution sets to various problems involving linearity, such as the set of solutions to a homogeneous system of linear equations and the set of solutions of a homogeneous linear differential equation.
Apart from their central role in linear algebra, vector spaces, equipped with some additional structure, appear frequently in other areas of mathematics. Prominent examples include Hilbert spaces and the use of vector spaces in representation theory.
Definition and Examples
Let $F$ be a field; for simplicity, one can assume $F = \mathbb{R}$ or $F = \mathbb{C}$.
A vector space over $F$ is an abelian group $(V, +)$ such that for every $a \in F$, there is a map $L_a : V \to V$ satisfying the following properties:
For any $a \in F$ and $v, w \in V$, we have $L_a(v + w) = L_a(v) + L_a(w)$. In other words, the maps $L_a$ are group homomorphisms of $(V, +)$.
The map $L_1$ is just the identity map on $V$. That is, for any $v \in V$, one has $L_1(v) = v$.
Composition of the maps corresponds to multiplication in $F$: if $a, b \in F$, then $L_a \circ L_b = L_{ab}$.
Addition in $F$ commutes with addition in $V$: if $a, b \in F$ and $v \in V$, then $L_{a+b}(v) = L_a(v) + L_b(v)$.
Usually, one simply writes $av$ to denote $L_a(v)$. The map $L_a$ should be thought of as scaling a vector by a factor of $a$.
To better understand this definition, some examples are in order:
Verify that $\mathbb{R}^n$ is a vector space over $\mathbb{R}$ under the standard notions of vector addition and scalar multiplication.
First, we must show $\mathbb{R}^n$ is an abelian group under addition of vectors. Certainly the addition operation is associative and commutative, since addition in $\mathbb{R}$ is associative and commutative. There is an additive identity, the zero vector $(0, \ldots, 0)$. For any $v = (v_1, \ldots, v_n) \in \mathbb{R}^n$, an additive inverse exists, namely $-v = (-v_1, \ldots, -v_n)$. Thus, $\mathbb{R}^n$ is an abelian group.
Next, we check that scalar multiplication satisfies the given properties. We have $1 \cdot (v_1, \ldots, v_n) = (1 \cdot v_1, \ldots, 1 \cdot v_n) = (v_1, \ldots, v_n)$, as desired. Furthermore, for $a, b \in \mathbb{R}$,
\[ a\big(b(v_1, \ldots, v_n)\big) = a(b v_1, \ldots, b v_n) = (ab v_1, \ldots, ab v_n) = (ab)(v_1, \ldots, v_n), \]
and
\[ (a + b)(v_1, \ldots, v_n) = \big((a+b)v_1, \ldots, (a+b)v_n\big) = (a v_1, \ldots, a v_n) + (b v_1, \ldots, b v_n). \]
We conclude $\mathbb{R}^n$, with the given addition and scalar multiplication operations, forms a vector space.
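The axioms verified above can also be spot-checked numerically. The sketch below is illustrative only (random sampling is not a proof); it tests the scalar multiplication properties for $\mathbb{R}^3$ on random inputs:

```python
import random

# Spot-check the vector space axioms for R^3 on random samples.

def add(v, w):
    return tuple(a + b for a, b in zip(v, w))

def scale(c, v):
    return tuple(c * a for a in v)

def close(v, w, tol=1e-9):
    """Compare vectors up to floating-point rounding."""
    return all(abs(x - y) < tol for x, y in zip(v, w))

random.seed(0)
for _ in range(100):
    v = tuple(random.uniform(-10, 10) for _ in range(3))
    w = tuple(random.uniform(-10, 10) for _ in range(3))
    a, b = random.uniform(-5, 5), random.uniform(-5, 5)

    assert add(v, w) == add(w, v)                         # commutativity
    assert scale(1, v) == v                               # L_1 is the identity
    assert close(scale(a, scale(b, v)), scale(a * b, v))  # a(bv) = (ab)v
    assert close(scale(a + b, v), add(scale(a, v), scale(b, v)))  # (a+b)v = av + bv

print("all axiom spot-checks passed")
```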
Other examples of vector spaces include the following:
For any field $F$, the set of $n$-tuples of elements in $F$, denoted $F^n$, is a vector space over $F$. Vector addition and scalar multiplication are defined precisely as in the case of $\mathbb{R}^n$ above. Taking $F = \mathbb{Z}_2$, linear binary codes arise as vector subspaces of $\mathbb{Z}_2^n$.
Let $C(\mathbb{R})$ denote the set of continuous functions $f : \mathbb{R} \to \mathbb{R}$. This is a vector space over $\mathbb{R}$. Vector addition is defined by $(f + g)(x) = f(x) + g(x)$ and scalar multiplication by $(cf)(x) = c \cdot f(x)$. Similarly, $C^k(\mathbb{R})$, the space of $k$-times continuously differentiable functions, is a vector space over $\mathbb{R}$.
The set of polynomials with real coefficients of degree at most $n$, with the usual polynomial addition and multiplication by a real number, is a vector space.
$\mathbb{C}$ is a vector space over $\mathbb{R}$.
$\mathbb{R}$ is a vector space over $\mathbb{Q}$.
$M_{m \times n}(\mathbb{R})$, the space of matrices over $\mathbb{R}$ of order $m \times n$, with the usual matrix addition and with scalar multiplication by $c \in \mathbb{R}$ defined by multiplying each entry of a matrix by $c$.
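The function-space example can be made concrete in code. The sketch below (an illustration, not library code) implements pointwise addition and scaling of functions, the vector space operations on $C(\mathbb{R})$ described above:

```python
# Pointwise vector space operations on real-valued functions:
# (f + g)(x) = f(x) + g(x) and (c f)(x) = c * f(x).

def f_add(f, g):
    return lambda x: f(x) + g(x)

def f_scale(c, f):
    return lambda x: c * f(x)

square = lambda x: x * x
one = lambda x: 1.0

h = f_add(square, one)            # the function x^2 + 1
print(h(2.0))                     # prints 5.0
print(f_scale(3.0, square)(2.0))  # prints 12.0
```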
Main Concepts
A set of vectors $S$ in a vector space $V$ over $F$ is said to be linearly independent if no vector in $S$ can be expressed as a linear combination of the other vectors in $S$. One can define linear independence in an equivalent way:
A set $\{v_1, \ldots, v_n\} \subseteq V$ is linearly independent if $a_1 v_1 + \cdots + a_n v_n = 0$ with $a_1, \ldots, a_n \in F$ implies $a_1 = a_2 = \cdots = a_n = 0$.
The linear span of a subset $S \subseteq V$ is the set of all linear combinations of vectors in $S$.
A basis of a vector space $V$ is a linearly independent set whose linear span equals $V$. One of the theorems equivalent to the axiom of choice is that every vector space has a basis.
Having defined a mathematical object, it is natural to consider transformations which preserve its underlying structure. This gives rise to the concept of a linear transformation between vector spaces over the same field. The choice of bases of vector spaces $V$ and $W$ over the same field enables one to represent linear transformations between $V$ and $W$ through matrices.
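As a small illustration of representing a linear map by a matrix (a sketch assuming NumPy is available; the rotation example is our own choice), rotation by 90° in $\mathbb{R}^2$ in the standard basis is a $2 \times 2$ matrix, and applying the map is matrix-vector multiplication:

```python
import numpy as np

# Rotation by 90 degrees counterclockwise in R^2, written as a matrix
# in the standard basis. Applying the linear map is a matrix-vector product.
T = np.array([[0.0, -1.0],
              [1.0,  0.0]])

e1 = np.array([1.0, 0.0])
print(T @ e1)  # prints [0. 1.]: e1 is sent to e2

# Linearity: T(av + bw) = a T(v) + b T(w)
v, w = np.array([2.0, 3.0]), np.array([-1.0, 4.0])
assert np.allclose(T @ (2 * v + 5 * w), 2 * (T @ v) + 5 * (T @ w))
```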
Vectors in $\mathbb{R}^3$
The set of vectors $S = \{(1, 0, 0), (2, 0, 0)\}$ in $\mathbb{R}^3$ is not linearly independent: $(2, 0, 0) = 2 \cdot (1, 0, 0)$. Geometrically, the span of $S$ is the $x$-axis in $\mathbb{R}^3$.
The set $B = \{(1, 0, 0), (0, 1, 0), (0, 0, 1)\}$ is linearly independent and its span is $\mathbb{R}^3$. $B$ is a basis of $\mathbb{R}^3$.
Consider the set $T = \{(1, 0, 0), (0, 1, 0), (0, 0, 1), (1, 1, 1)\}$. Although the span of $T$ is $\mathbb{R}^3$, $T$ is not a basis of $\mathbb{R}^3$ since it is not linearly independent.
The set $\{(1, 1, 0), (0, 1, 1), (1, 0, 1)\}$ is a basis of $\mathbb{R}^3$.
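A practical way to test linear independence in $\mathbb{R}^n$ is to compute the rank of the matrix whose rows are the given vectors: the set is independent exactly when the rank equals the number of vectors. A sketch (assuming NumPy; the sample vectors are illustrative):

```python
import numpy as np

def is_independent(vectors):
    """Rows are linearly independent iff the rank of the stacked
    matrix equals the number of vectors."""
    A = np.array(vectors, dtype=float)
    return np.linalg.matrix_rank(A) == len(vectors)

print(is_independent([(1, 0, 0), (0, 1, 0), (0, 0, 1)]))  # prints True
print(is_independent([(1, 0, 0), (2, 0, 0)]))             # prints False
# Four vectors in R^3 can never be independent:
print(is_independent([(1, 0, 0), (0, 1, 0), (0, 0, 1), (1, 1, 1)]))  # prints False
```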
Subspaces
Given a vector space $V$, it is natural to consider properties of its subspaces. The following theorem provides a useful criterion for finding subsets of $V$ which are vector spaces with the structure inherited from $V$:
Let $W$ be a nonempty subset of a vector space $V$ over $F$, with the vector space operations from $V$ restricted to $W$. Then $W$ is a vector space if and only if the following properties hold:
- $u + v \in W$ for all $u, v \in W$
- $cv \in W$ for all $c \in F$ and $v \in W$
The set of solutions of the equation $ax + by + cz = d$ with $(a, b, c) \neq (0, 0, 0)$ is a plane in $\mathbb{R}^3$. When $d = 0$, the plane passes through the origin and forms a vector space as a subspace of $\mathbb{R}^3$.
However, planes which do not pass through the origin do not have the structure of a vector space when considered as subsets of $\mathbb{R}^3$. This shows the necessity of restricting to homogeneous systems of linear equations. This technical difficulty is resolved using the concept of affine spaces.
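The contrast between the two kinds of planes can be spot-checked numerically. In the sketch below, the specific planes $x + y + z = 0$ and $x + y + z = 1$ are illustrative choices:

```python
# The plane x + y + z = 0 is closed under addition and scaling,
# while the shifted plane x + y + z = 1 is not.

def on_plane(v, d):
    """Check whether v = (x, y, z) satisfies x + y + z = d."""
    return abs(sum(v) - d) < 1e-12

u, v = (1.0, -1.0, 0.0), (2.0, 3.0, -5.0)
assert on_plane(u, 0) and on_plane(v, 0)
assert on_plane(tuple(a + b for a, b in zip(u, v)), 0)  # sum stays on the plane
assert on_plane(tuple(4 * a for a in u), 0)             # scaling stays on the plane

p, q = (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)  # both on x + y + z = 1
s = tuple(a + b for a, b in zip(p, q))
assert not on_plane(s, 1)  # their sum lies on x + y + z = 2 instead
print("plane closure checks passed")
```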
The set of functions $f \in C^2(\mathbb{R})$ satisfying the differential equation $f'' + f = 0$ is a vector space.
Since the set of solutions is a nonempty subset of the vector space $C^2(\mathbb{R})$ ($f = 0$ solves the equation), it suffices to show that if $f$ and $g$ satisfy the given equation, then so do $f + g$ and $cf$ for all $c \in \mathbb{R}$. If $f$ and $g$ satisfy $f'' + f = 0$ and $g'' + g = 0$, then
\[ (f + g)'' + (f + g) = f'' + g'' + f + g = (f'' + f) + (g'' + g) = 0. \]
Similarly, $(cf)'' + cf = c f'' + c f = c(f'' + f) = 0$.
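This closure can also be checked numerically. The sketch below uses the concrete equation $f'' + f = 0$ (our illustrative choice; $\sin$ and $\cos$ are solutions) and approximates the second derivative by a central difference:

```python
import math

# Spot-check that solutions of f'' + f = 0 are closed under addition
# and scaling, approximating f'' with a central difference quotient.

def second_derivative(f, x, h=1e-4):
    return (f(x + h) - 2 * f(x) + f(x - h)) / (h * h)

f, g, c = math.sin, math.cos, 5.0
h_sum = lambda x: f(x) + g(x)   # f + g
h_scaled = lambda x: c * f(x)   # c * f

for x in (-2.0, 0.3, 1.7):
    assert abs(second_derivative(h_sum, x) + h_sum(x)) < 1e-5
    assert abs(second_derivative(h_scaled, x) + h_scaled(x)) < 1e-5

print("f + g and c*f satisfy f'' + f = 0 at the sample points")
```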
Additional Structure on Vector Spaces
Further Examples
An $n \times n$ matrix with entries in $\mathbb{R}$ with the property that the sum of entries along each row, column, and diagonal is constant and equals $s$ is called a matrix magic square of order $n$ with line-sum $s$.
An example of a matrix magic square of order $3$ with line-sum $15$ is the matrix
\[ \begin{pmatrix} 2 & 7 & 6 \\ 9 & 5 & 1 \\ 4 & 3 & 8 \end{pmatrix}, \]
which is itself a magic square. Note that for matrix magic squares we do not have a restriction that the entries of the matrix have to be distinct, so the all-ones matrix
\[ \begin{pmatrix} 1 & 1 & 1 \\ 1 & 1 & 1 \\ 1 & 1 & 1 \end{pmatrix} \]
is also a matrix magic square of order $3$, with line-sum $3$.
The space $MS_n$ of matrix magic squares of order $n$ having arbitrary line-sums forms a vector space as a subspace of the matrix vector space $M_{n \times n}(\mathbb{R})$:
- The zero matrix is an element of $MS_n$, with line-sum $0$.
- If $A, B \in MS_n$ with line-sums $s_A$ and $s_B$, then the matrix $A + B$ is a matrix magic square with line-sum $s_A + s_B$. Similarly, for $c \in \mathbb{R}$, the matrix $cA$ is a matrix magic square having line-sum $c\,s_A$.
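These closure properties can be verified in code for order $3$. The sketch below (assuming NumPy; the example matrices are standard choices) checks that sums and scalar multiples of magic squares remain magic:

```python
import numpy as np

# Closure checks for 3x3 matrix magic squares: all row sums, column
# sums, and both diagonal sums must be equal.

def line_sums(A):
    A = np.asarray(A, dtype=float)
    return [*A.sum(axis=1), *A.sum(axis=0), A.trace(), np.fliplr(A).trace()]

def is_magic(A):
    s = line_sums(A)
    return all(abs(x - s[0]) < 1e-9 for x in s)

A = np.array([[2, 7, 6], [9, 5, 1], [4, 3, 8]])  # line-sum 15
B = np.ones((3, 3))                              # line-sum 3

assert is_magic(A) and is_magic(B)
assert is_magic(np.zeros((3, 3)))  # the zero matrix
assert is_magic(A + B)             # line-sum 15 + 3 = 18
assert is_magic(2.5 * A)           # line-sum 2.5 * 15 = 37.5
print("magic square closure checks passed")
```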
Problems
Which of the following sets of polynomials are vector spaces over $\mathbb{R}$ with the usual polynomial addition and scalar multiplication?
A: All polynomials of the form $p(t) = a t^2 + b t$ with $a \in \mathbb{R}$ and $b \in \mathbb{R}$.
B: All polynomials of the form $p(t) = a t^2 + b t$ with $a \ge 0$ and $b \in \mathbb{R}$.
C: All polynomials of the form $p(t) = a t^2 + b t + 1$ with $a \in \mathbb{R}$ and $b \in \mathbb{R}$.
Let $V$ be a vector space over any field $F$.
A collection of vectors $v_1, \ldots, v_n \in V$ is called dependent if there exist scalars $a_1, \ldots, a_n \in F$ such that $a_1 v_1 + \cdots + a_n v_n = 0$ and at least one of the $a_i$'s is nonzero. Consequently, a collection of vectors is called independent if it is not dependent.
Note that $\mathbb{R}$ may be thought of as a vector space over the field of rational numbers $\mathbb{Q}$. With this vector space structure on $\mathbb{R}$, is the set $\{1, \sqrt{2}\}$ dependent or independent? What about the set $\{\sqrt{2}, \sqrt{8}\}$?
Let $V$ be a vector space over $\mathbb{R}$. A norm on $V$ is a function $\|\cdot\| : V \to \mathbb{R}$ satisfying the following properties:
- The norm is nonnegative: $\|v\| \ge 0$ for all $v \in V$, with equality if and only if $v = 0$.
- The norm scales with vectors: $\|cv\| = |c|\,\|v\|$ for all $c \in \mathbb{R}$ and $v \in V$.
- The norm satisfies the triangle inequality: $\|v + w\| \le \|v\| + \|w\|$ for all $v, w \in V$.
A vector space equipped with a norm is called, unsurprisingly, a normed vector space.
Suppose $(V, \|\cdot\|)$ is a normed vector space. For $v, w \in V$, define a function $d : V \times V \to \mathbb{R}$ by $d(v, w) = \|v - w\|$. One can verify that this function is a metric on $V$, giving $V$ the structure of a metric space.
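A short sketch of the induced metric, using the Euclidean norm on $\mathbb{R}^n$ as the norm (an illustrative choice):

```python
import math

# The metric induced by a norm: d(v, w) = ||v - w||.
# Here the norm is the Euclidean norm on R^n.

def norm(v):
    return math.sqrt(sum(x * x for x in v))

def dist(v, w):
    return norm(tuple(a - b for a, b in zip(v, w)))

v, w, u = (0.0, 0.0), (3.0, 4.0), (1.0, 1.0)
print(dist(v, w))  # prints 5.0

# The triangle inequality for the norm yields the triangle
# inequality for the induced metric:
assert dist(v, w) <= dist(v, u) + dist(u, w) + 1e-12
```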
If the metric space structure on $V$ induced by the norm is complete (i.e., Cauchy sequences converge), then $V$ is called a Banach space.
Consider the space $\ell^\infty$, consisting of bounded sequences of real numbers. This is a vector space over $\mathbb{R}$, using coordinate-wise addition and scalar multiplication. One may define a norm on $\ell^\infty$ by setting
\[ \|(x_n)\|_\infty = \sup_{n} |x_n|. \]
Under this norm, is $\ell^\infty$ a Banach space?