# Vector Space

**Vector spaces** are mathematical objects that abstractly capture the geometry and algebra of linear equations. They are the central objects of study in linear algebra.

The archetypical example of a vector space is the Euclidean space $\mathbb{R}^n$. In this space, *vectors* are $n$-tuples of real numbers; for example, a vector in $\mathbb{R}^2$ is $(3,4)$. These vectors have an addition operation defined on them, where one adds coordinate-wise: $(a_1, a_2, \ldots, a_n) + (b_1, b_2, \ldots, b_n) = (a_1 + b_1, a_2 + b_2, \ldots, a_n + b_n).$ For a numerical example, consider the equation $(1,-1) + (6,1) = (7,0)$, which holds in $\mathbb{R}^2$. There is also a notion of *scalar multiplication*; given a number $c\in \mathbb{R}$, one may scale a vector by $c$ as follows: $c\cdot (a_1, a_2, \ldots, a_n) = (c a_1, ca_2, \ldots, ca_n).$ For instance, in $\mathbb{R}^2$ one has $3\cdot (5,8) = (15,24)$.
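The coordinate-wise operations above are easy to spot-check with a short sketch (plain Python tuples standing in for vectors; the function names `vec_add` and `vec_scale` are illustrative, not standard):

```python
# Coordinate-wise addition and scalar multiplication in R^n,
# using plain tuples to mirror the definitions above.

def vec_add(u, v):
    """Add two vectors of R^n coordinate-wise."""
    return tuple(a + b for a, b in zip(u, v))

def vec_scale(c, v):
    """Scale a vector of R^n by the scalar c."""
    return tuple(c * a for a in v)

# The numerical examples from the text:
print(vec_add((1, -1), (6, 1)))   # (7, 0)
print(vec_scale(3, (5, 8)))       # (15, 24)
```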

Abstract vector spaces generalize the example of $\mathbb{R}^n$. Roughly, a vector space is a set whose elements are called *vectors*, and these vectors can be added and scaled according to a set of axioms modeled on properties of $\mathbb{R}^n$.

Vector spaces often arise as solution sets to various problems involving linearity, such as the set of solutions to homogeneous system of linear equations and the set of solutions of a homogeneous linear differential equation.

Apart from their central role in linear algebra, vector spaces, equipped with some additional structure, appear frequently in other areas of mathematics. Prominent examples include Hilbert spaces and the use of vector spaces in representation theory.


## Definition and Examples

Let $F$ be a field; for simplicity, one can assume $F = \mathbb{R}$ or $F = \mathbb{C}$.

A **vector space** over $F$ is an abelian group $(V,+)$ such that for every $c\in F$, there is a map $\phi_{c}: V \to V$ satisfying the following properties:

- For any $v, w \in V$ and $c\in F$, we have $\phi_{c} (v + w) = \phi_{c} (v) + \phi_{c} (w)$; in other words, each map $\phi_{c}$ is a group homomorphism of $V$.
- The map $\phi_{1}: V \to V$ is the identity map on $V$: for any $v\in V$, one has $\phi_{1} (v) = v$.
- Composition of the maps $\phi_{c}$ corresponds to multiplication in $F$: if $a, b \in F$, then $\phi_{a} \circ \phi_{b} = \phi_{ab}$.
- Addition in $F$ distributes over addition in $V$: if $a, b \in F$ and $v\in V$, then $\phi_{a+b} (v) = \phi_{a} (v) + \phi_{b} (v)$.

Usually, one simply writes $cv$ to denote $\phi_{c} (v)$. The map $\phi_{c}$ should be thought of as *scaling* the vector $v$ by a factor of $c$.

To better understand this definition, some examples are in order:

Verify that $\mathbb{R}^2$ is a vector space over $\mathbb{R}$ under the standard notions of vector addition and scalar multiplication.

First, we must show $\mathbb{R}^2$ is an abelian group under addition of vectors. Certainly the addition operation is associative and commutative, since addition in $\mathbb{R}$ is associative and commutative. There is an additive identity, the zero vector $(0,0)$. For any $(a,b) \in \mathbb{R}^2$, an additive inverse exists, namely $(-a,-b)$. Thus, $(\mathbb{R}^2, +)$ is an abelian group.

Next, we check that scalar multiplication $c(a,b) = (ca, cb)$ satisfies the given properties. We have $1(a,b) = (a,b)$, as desired. Furthermore,

$\begin{aligned} c\big((a,b) + (x,y)\big) &= c(a+x,b+y) \\ &= (ca+cx,cb+cy) \\ &= (ca, cb) + (cx, cy) \\ &= c(a,b) + c(x,y)\\\\ mn(a,b) &= (mna, mnb) \\ &= m(na,nb) \\ &= m\big(n(a,b)\big)\\\\ (c+d)(a,b) &= (ca+da, cb+db) \\ &= (ca, cb) + (da,db) \\ &= c(a,b) + d(a,b). \end{aligned}$

We conclude $\mathbb{R}^2$, with the given addition and scalar multiplication operations, forms a vector space. $_\square$

Other examples of vector spaces include the following:

For any field $F$, the set of $n$-tuples of elements in $F$, denoted $F^n$, is a vector space over $F$. Vector addition and scalar multiplication are defined precisely as in the case of $\mathbb{R}^n$ above. Taking $F=\mathbb{Z}_2$, linear binary codes arise as vector subspaces of $\mathbb{Z}_2^{n}$.

Let $V = C[0,1]$, the set of continuous functions $[0,1] \to \mathbb{R}$. This is a vector space over $\mathbb{R}$. Vector addition is defined by $(f+g)(x) = f(x) + g(x)$ and scalar multiplication by $(cf)(x) = cf(x).$ Similarly, $C^{k}(\mathbb{R})$, the space of $k$-times continuously differentiable functions $\mathbb{R} \to \mathbb{R}$, is a vector space over $\mathbb{R}$.
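The pointwise operations on function spaces can be modeled directly with closures (a minimal sketch; `f_add` and `f_scale` are illustrative names):

```python
# Pointwise operations on real-valued functions, modeling vector
# addition and scalar multiplication in C[0,1] or C^k(R).
import math

def f_add(f, g):
    """(f + g)(x) = f(x) + g(x)."""
    return lambda x: f(x) + g(x)

def f_scale(c, f):
    """(cf)(x) = c * f(x)."""
    return lambda x: c * f(x)

h = f_add(math.sin, math.cos)   # the function sin + cos
k = f_scale(2.0, math.exp)      # the function 2 * e^x

print(h(0.0))   # sin(0) + cos(0) = 1.0
print(k(0.0))   # 2 * e^0 = 2.0
```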

The set $P_{n} = \big\{\sum_{i=0}^{n} a_i x^{i} \big| a_{i}\in \mathbb{R} \ \forall i \big\}$ of polynomials with real coefficients of degree $\leq n$, with the usual polynomial addition and multiplication by a real number, is a vector space.
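Identifying a polynomial $\sum_{i=0}^{n} a_i x^i$ with its coefficient tuple $(a_0, \ldots, a_n)$ makes $P_n$ behave exactly like $\mathbb{R}^{n+1}$; a short sketch (hypothetical helper names):

```python
# P_n modeled as coefficient tuples (a_0, a_1, ..., a_n); addition and
# scaling act coefficient-wise, just like vectors in R^(n+1).

def poly_add(p, q):
    """Add polynomials coefficient-wise."""
    return tuple(a + b for a, b in zip(p, q))

def poly_scale(c, p):
    """Multiply a polynomial by the real number c."""
    return tuple(c * a for a in p)

# (1 + 2x + 3x^2) + (4 - 2x + x^2) = 5 + 4x^2
print(poly_add((1, 2, 3), (4, -2, 1)))   # (5, 0, 4)
print(poly_scale(2, (1, 0, -1)))         # (2, 0, -2)
```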

$\mathbb{C}$ is a vector space over $\mathbb{R}.$

$\mathbb{R}$ is a vector space over $\mathbb{Q}.$

$M_n(\mathbb{R})$, the space of $n \times n$ matrices over $\mathbb{R}$, is a vector space over $\mathbb{R}$, with matrix addition as vector addition and scalar multiplication by $c\in \mathbb{R}$ defined by multiplying each entry of a matrix by $c.$

## Main Concepts

A set $S = \{v_1,\ldots, v_k\}$ of vectors in a vector space $V$ over $F$ is said to be linearly independent if no vector in $S$ can be expressed as a linear combination of the other vectors in $S$. One can define linear independence in an equivalent way:

A set $\{v_1,\ldots,v_k\}\subseteq V$ is linearly independent if $a_1v_1 + a_2v_2 + \cdots + a_kv_k = 0$ with $a_i\in F \ \forall \ i = 1,\ldots,k$ implies $a_1=a_2=\cdots=a_k=0$.

The linear span of a subset $S\subseteq V$ is the set of all linear combinations of vectors in $S$.

A basis of a vector space $V$ is a linearly independent set whose linear span equals $V$. One of the theorems equivalent to the axiom of choice is that every vector space has a basis.

Having defined a mathematical object, it is natural to consider transformations which preserve its underlying structure. This gives rise to the concept of a linear transformation between vector spaces over the same field. A choice of bases for vector spaces $V,W$ over the same field $F$ makes it possible to represent linear transformations between $V$ and $W$ by matrices.

Vectors in $\mathbb{R}^2$

The set $S_1 = \{(1,0),(2,0)\}$ of vectors in $\mathbb{R}^2$ is not linearly independent: $(2,0) = 2\cdot(1,0)$. Geometrically, the span of $S_1$ is the $x$-axis in $\mathbb{R}^2$.

The set $S_2 = \{(1,0),(0,1)\}$ is linearly independent and its span is $\mathbb{R}^2$. $S_2$ is a basis of $\mathbb{R}^2$.

Now consider the set $S_3 = \{(1,0),(0,1),(1,1)\}$. Although the span of $S_3$ is $\mathbb{R}^2$, $S_3$ is not a basis of $\mathbb{R}^2$ since it is not linearly independent.
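These three examples can be checked by computing matrix ranks: a finite set of vectors is linearly independent exactly when the matrix whose rows are those vectors has rank equal to the number of vectors. A sketch using numpy (the helper name `is_independent` is illustrative):

```python
# Checking linear independence in R^2 via matrix rank.
import numpy as np

def is_independent(vectors):
    """True if the given vectors are linearly independent."""
    m = np.array(vectors, dtype=float)
    return np.linalg.matrix_rank(m) == len(vectors)

print(is_independent([(1, 0), (2, 0)]))          # False: S_1 is dependent
print(is_independent([(1, 0), (0, 1)]))          # True:  S_2 is a basis
print(is_independent([(1, 0), (0, 1), (1, 1)]))  # False: S_3 spans R^2 but is dependent
```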

The set $\{1,x,x^2,\ldots, x^n\}$ is a basis of $P_n$.

## Subspaces

Given a vector space $V,$ it is natural to consider its subspaces. The following theorem provides a useful criterion for identifying the subsets of $V$ that are vector spaces under the structure inherited from $V$:

Let $W\neq \emptyset$ be a subset of a vector space $V$ over $F$ with the vector space operations from $V$ restricted to $W$. Then $W$ is a vector space if and only if the following properties hold:

- $u+v\in W$ for all $u,v\in W$
- $cu\in W$ for all $u\in W, c\in F.$

The set of solutions $(x,y,z)\in\mathbb{R}^{3}$ of the equation $ax+by+cz + d = 0$, where $a,b,c,d\in \mathbb{R}$ and $a,b,c$ are not all zero, is a plane in $\mathbb{R}^{3}$. When $d=0$, the plane passes through the origin and forms a vector space as a subspace of $\mathbb{R}^{3}$.

However, planes that do not pass through the origin do not inherit the structure of a vector space as subsets of $\mathbb{R}^3$. This shows the necessity of the restriction to **homogeneous** systems of linear equations. This technical difficulty is resolved by the concept of an affine space.
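The contrast between the homogeneous and non-homogeneous cases can be spot-checked numerically (a minimal sketch; the particular coefficients $a=1$, $b=2$, $c=-1$ are an arbitrary illustrative choice):

```python
# Closure check for a plane through the origin, x + 2y - z = 0, in R^3.

def on_plane(pt, a=1, b=2, c=-1, d=0):
    """True if the point pt = (x, y, z) satisfies ax + by + cz + d = 0."""
    x, y, z = pt
    return a * x + b * y + c * z + d == 0

u = (1, 0, 1)    # 1 + 0 - 1 = 0, so u lies on the plane
v = (0, 1, 2)    # 0 + 2 - 2 = 0, so v lies on the plane
s = tuple(ui + vi for ui, vi in zip(u, v))   # u + v
w = tuple(5 * ui for ui in u)                # 5u
print(on_plane(s), on_plane(w))              # True True: closure holds

# Non-homogeneous case: the plane x + 2y - z = 1 misses the origin.
p, q = (1, 0, 0), (0, 1, 1)                  # both satisfy x + 2y - z = 1
r = tuple(pi + qi for pi, qi in zip(p, q))
print(on_plane(r, d=-1))                     # False: p + q leaves the plane
```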

The set of functions in $C^{2}(\mathbb{R})$ satisfying the differential equation $f''(x) = e^{x} f(x)$ is a vector space.

Since the set of solutions is a nonempty subset of the vector space $C^{2}(\mathbb{R})$ ($f(x) = 0$ solves the equation), it suffices to show that if $f_{1}$ and $f_{2}$ satisfy the given equation, then so do $f_{1}+ f_{2}$ and $cf_{1}$ for all $c\in\mathbb{R}$. If $f_1, f_2$ satisfy $f''(x) = e^{x} f(x)$, then

$\big(f_1(x) + f_2(x)\big)'' = f_1''(x) + f_2''(x) = e^{x}f_1(x) + e^{x}f_2(x) = e^{x} \big(f_1(x) + f_2(x)\big).$

Similarly, $\big(cf_1(x)\big)'' = cf_1''(x) = ce^{x}f_1(x)$. $_\square$


## Further Examples

An $n\times n$ matrix with entries in $\mathbb{R}$ with the property that the sum of entries along each row, each column, and both diagonals is constant and equals $a\in\mathbb{R}$ is called a **matrix magic square** of order $n$ with **line-sum** $a$. An example of a matrix magic square of order $3$ is the matrix $\begin{pmatrix} 8&1&6\\3&5&7\\4&9&2\end{pmatrix},$ which is itself a magic square. Note that for matrix magic squares we do not require the entries of the matrix to be distinct, so the matrix $\begin{pmatrix} 0&0&0\\0&0&0\\0&0&0\end{pmatrix}$ is also a matrix magic square of order $3$.

The space $M$ of matrix magic squares of order $n$ having arbitrary line-sums forms a vector space as a subspace of the matrix vector space $M_n(\mathbb{R})$:

- The zero matrix is an element of $M.$
- If $A,B\in M$ with line-sums $a$ and $b$, then the matrix $A+B$ is a matrix magic square with the line-sum $a+b$. Similarly, the matrix $cA$ is a matrix magic square having the line-sum $ca$.
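The closure argument above can be spot-checked with a short sketch (assuming numpy; the helpers `line_sums` and `is_magic` are illustrative names):

```python
# Verifying closure for matrix magic squares of order 3: the line sums
# (rows, columns, both diagonals) of A + B and cA remain constant.
import numpy as np

def line_sums(m):
    """All row, column, and diagonal sums of a square matrix."""
    return ([m[i, :].sum() for i in range(m.shape[0])]
            + [m[:, j].sum() for j in range(m.shape[1])]
            + [np.trace(m), np.trace(np.fliplr(m))])

def is_magic(m):
    """True if every line sum of m is the same."""
    sums = line_sums(m)
    return all(s == sums[0] for s in sums)

A = np.array([[8, 1, 6], [3, 5, 7], [4, 9, 2]])   # line-sum 15
B = np.zeros((3, 3), dtype=int)                   # line-sum 0

print(is_magic(A), is_magic(A + B), is_magic(3 * A))   # True True True
```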

## Problems

Which of the following sets of polynomials are vector spaces over $\mathbb{R}$ with the usual polynomial addition and scalar multiplication?

**A:** All polynomials of the form $p(x)=ax^2 + bx + c$ with $a,b,c\in\mathbb{R}$ and $a+b+c=1$.

**B:** All polynomials of the form $p(x) = ax^2 + bx + c$ with $a,b,c\in\mathbb{R}$ and $a+b+c=0$.

**C:** All polynomials of the form $p(x)=ax^2 + bx + c$ with $a,b,c\in\mathbb{R}$ and $p(1) = p(2)$.

Let $V$ be a vector space over any field $F$.

A collection of vectors $v_1, v_2, \ldots, v_n \in V$ is called *dependent* if there exist scalars $a_1, a_2, \ldots, a_n \in F$ such that $a_1 v_1 + \cdots + a_n v_n = 0$ and at least one of the $a_i$ is nonzero. A collection of vectors is called *independent* if it is **not** dependent.

Note that $\mathbb{R}$ may be thought of as a vector space over the field $\mathbb{Q}$ of rational numbers. With this vector space structure on $\mathbb{R}$, is the set $\big\{\sqrt{2}, \sqrt{3}, \sqrt{5}\big\} \subset \mathbb{R}$ dependent or independent? What about the set $\left\{\frac{\pi}{4}, \arctan\left(\frac{3}{2} \right), \arctan(2) \right\} \subset \mathbb{R}?$

Let $V$ be a vector space over $\mathbb{R}$. A *norm* on $V$ is a function $\|\cdot \| : V \to \mathbb{R}$ satisfying the following properties:

- The norm is nonnegative: $\|v\| \ge 0$ for all $v\in V$, with equality if and only if $v=0$.
- The norm scales with vectors: $\|cv\| = |c| \cdot \|v\|$ for all $v\in V$ and $c\in \mathbb{R}$.
- The norm satisfies the triangle inequality: $\|v+w\| \le \|v\| + \|w\|$ for all $v, w\in V$.

A vector space equipped with a norm is called, unsurprisingly, a *normed vector space*.

Suppose $V$ is a normed vector space. For $v,w \in V$, define a function $d: V\times V \to \mathbb{R}$ by $d(v,w) = \|v-w\|.$ One can verify that this function $d$ is a *metric* on $V$, giving $V$ the structure of a metric space.
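The induced metric is easy to sketch for the sup norm on finite tuples (a minimal illustration; `sup_norm` and `dist` are hypothetical names):

```python
# The metric d(v, w) = ||v - w|| induced by the sup norm on tuples.

def sup_norm(v):
    """The sup norm: the largest absolute value of a coordinate."""
    return max(abs(a) for a in v)

def dist(v, w):
    """d(v, w) = ||v - w||."""
    return sup_norm(tuple(a - b for a, b in zip(v, w)))

v, w = (1.0, -2.0, 0.5), (0.0, 1.0, 0.5)
print(dist(v, w))   # max(|1|, |-3|, |0|) = 3.0
print(dist(v, v))   # 0.0: the distance vanishes exactly on equal vectors

# Triangle inequality spot-check through a third point u:
u = (2.0, 2.0, 2.0)
print(dist(v, w) <= dist(v, u) + dist(u, w))   # True
```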

If the metric space structure on $V$ induced by the norm is *complete* (i.e., Cauchy sequences converge), then $V$ is called a **Banach space**.

Consider the space $\ell^{\infty} (\mathbb{R})$, consisting of bounded sequences of real numbers. This is a vector space over $\mathbb{R}$, using coordinate-wise addition and scalar multiplication. One may define a norm on $\ell^{\infty} (\mathbb{R})$ by setting $\|(a_1, a_2, a_3, \ldots)\| = \sup_{i\ge 1} |a_i|.$ Under this norm, is $\ell^{\infty} (\mathbb{R})$ a Banach space?