### Introduction to Linear Algebra

In the previous quiz, we started looking at an algorithm for solving systems of linear equations, called Gaussian elimination. In this quiz, we’ll take a deeper look at this algorithm, why it works, and how we can speed it up.

# The Full Story of Gaussian Elimination

To save ourselves some effort, let’s introduce some new notation. Instead of having to write $x, y, z$ (or whatever the variables are) multiple times, we can simplify things using matrix notation:

\begin{aligned} x + 2y + 3z &= 18 \\ 3x + 4y + z &= 12 \\ 2x - y + z &= 6 \end{aligned} \quad\implies\quad \begin{pmatrix}1&2&3\\3&4&1\\2&-1&1\end{pmatrix}\begin{pmatrix}x\\y\\z\end{pmatrix}=\begin{pmatrix}18\\12\\6\end{pmatrix}.

Here, the first block is the coefficients of the equations, one equation per row. The second block is the variables, one per row and listed in the same order as the coefficients. Finally, the last block is the RHS (short for right-hand side) of each equation, one per row. This lets us represent the whole system in just one equation!
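The matrix-variable product works row by row: each row of the coefficient block is dotted with the column of variables. As a quick illustration (not part of the quiz; this sketch assumes NumPy is available), multiplying the coefficient matrix by a column of candidate values computes the LHS of every equation at once, so a candidate solves the system exactly when the result matches the RHS column:

```python
import numpy as np

# Coefficient matrix and RHS column from the system above.
A = np.array([[1, 2, 3],
              [3, 4, 1],
              [2, -1, 1]])
b = np.array([18, 12, 6])

# Plug in a candidate column of values. A @ v takes the dot product
# of each row of A with v, producing one LHS value per equation.
v = np.array([1, 2, 3])
lhs = A @ v
print(lhs)  # [14 14  3] -- not equal to b, so v is not a solution
```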

Which column of values makes the above matrix equation true?

Note: All options contain the same numbers, but the order in which those numbers are listed is different and important!


As with systems, some matrix equations are equivalent, which is important because some matrix equations are easier to solve than others. As a result, we can often transform a difficult matrix system into one that is much easier to solve. Over the next three problems, we’ll see the three fundamental ways to do this, using elementary matrix operations.

The first type of equivalence is fairly simple: since each row of a matrix system represents an equation, nothing changes if we swap the rows around.
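This is easy to check numerically. In the sketch below (an illustration assuming NumPy; the coefficient matrix here happens to be invertible, so `linalg.solve` applies), swapping two rows of the coefficient matrix together with the matching RHS entries leaves the solution unchanged:

```python
import numpy as np

A = np.array([[1.0, -1.0, 1.0],
              [2.0, 3.0, -2.0],
              [-3.0, 3.0, 3.0]])
b = np.array([2.0, -1.0, 12.0])

# Swap rows 0 and 1 of A *and* the matching entries of b.
perm = [1, 0, 2]
A_swapped, b_swapped = A[perm], b[perm]

# Both systems have the same solution.
x = np.linalg.solve(A, b)
x_swapped = np.linalg.solve(A_swapped, b_swapped)
print(np.allclose(x, x_swapped))  # True
```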

Which of the following represents the same system as this matrix equation: $\begin{pmatrix}1&-1&1\\2&3&-2\\-3&3&3\end{pmatrix} \begin{pmatrix}x\\y\\z\end{pmatrix} = \begin{pmatrix}2\\-1\\12\end{pmatrix}?$

A. $\quad \begin{pmatrix}2&3&-2\\1&-1&1\\-3&3&3\end{pmatrix}\begin{pmatrix}x\\y\\z\end{pmatrix} = \begin{pmatrix}2\\-1\\12\end{pmatrix}$

B. $\quad \begin{pmatrix}-3&3&3\\2&3&-2\\1&-1&1\end{pmatrix}\begin{pmatrix}x\\y\\z\end{pmatrix} = \begin{pmatrix}2\\-1\\12\end{pmatrix}$

C. $\quad \begin{pmatrix}1&-1&1\\-3&3&3\\2&3&2\end{pmatrix}\begin{pmatrix}x\\y\\z\end{pmatrix} = \begin{pmatrix}2\\-1\\12\end{pmatrix}$

D. $\quad \begin{pmatrix}2&3&-2\\-3&3&3\\1&-1&1\end{pmatrix}\begin{pmatrix}x\\y\\z\end{pmatrix} = \begin{pmatrix}-1\\12\\2\end{pmatrix}$


Just as we can multiply both sides of an equation by the same amount, we can multiply any of the rows of the matrices by constants (as long as we do it to both sides).
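Again this is easy to verify numerically (an illustrative sketch assuming NumPy, with an invertible coefficient matrix): scaling one row of the coefficients together with the matching RHS entry leaves the solution unchanged.

```python
import numpy as np

A = np.array([[1.0, -1.0, 1.0],
              [2.0, 3.0, -3.0],
              [-3.0, 3.0, 3.0]])
b = np.array([2.0, -1.0, 12.0])

# Scale row 1 by 2 on *both* sides of the equation.
A_scaled, b_scaled = A.copy(), b.copy()
A_scaled[1] *= 2
b_scaled[1] *= 2

print(np.allclose(np.linalg.solve(A, b),
                  np.linalg.solve(A_scaled, b_scaled)))  # True
```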

Which of the following represents the same system as this matrix equation: $\begin{pmatrix}1&-1&1\\2&3&-3\\-3&3&3\end{pmatrix} \begin{pmatrix}x\\y\\z\end{pmatrix} = \begin{pmatrix}2\\-1\\12\end{pmatrix}?$

A. $\quad \begin{pmatrix}2&-2&2\\2&3&-3\\-3&3&3\end{pmatrix}\begin{pmatrix}x\\y\\z\end{pmatrix} = \begin{pmatrix}2\\-1\\12\end{pmatrix}$

B. $\quad \begin{pmatrix}1&-1&1\\4&6&-6\\-3&3&3\end{pmatrix}\begin{pmatrix}x\\y\\z\end{pmatrix} = \begin{pmatrix}2\\-1\\12\end{pmatrix}$

C. $\quad \begin{pmatrix}1&-1&1\\2&3&-3\\-6&6&6\end{pmatrix}\begin{pmatrix}x\\y\\z\end{pmatrix} = \begin{pmatrix}2\\-1\\12\end{pmatrix}$

D. $\quad \begin{pmatrix}2&-2&2\\4&6&-6\\-6&6&6\end{pmatrix}\begin{pmatrix}x\\y\\z\end{pmatrix} = \begin{pmatrix}4\\-2\\24\end{pmatrix}$


Finally, just as we can add two equations together, we can add one row of a matrix to another. As always, we have to be careful to add the corresponding RHS!
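The same kind of numerical check works here too (a sketch assuming NumPy, with an invertible coefficient matrix): adding one row to another, on the coefficients and the RHS alike, preserves the solution.

```python
import numpy as np

A = np.array([[1.0, -1.0, 1.0],
              [2.0, 3.0, -3.0],
              [-3.0, 3.0, 3.0]])
b = np.array([2.0, -1.0, 12.0])

# Add row 1 to row 2, on the coefficients *and* the RHS.
A_new, b_new = A.copy(), b.copy()
A_new[2] += A_new[1]
b_new[2] += b_new[1]

print(np.allclose(np.linalg.solve(A, b),
                  np.linalg.solve(A_new, b_new)))  # True
```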

Which of the following represents the same system as this matrix equation: $\begin{pmatrix}1&-1&1\\2&3&-3\\-3&3&3\end{pmatrix} \begin{pmatrix}x\\y\\z\end{pmatrix} = \begin{pmatrix}2\\-1\\12\end{pmatrix}?$

A. $\quad \begin{pmatrix}1&-1&1\\3&2&-2\\-3&3&3\end{pmatrix}\begin{pmatrix}x\\y\\z\end{pmatrix} = \begin{pmatrix}2\\-1\\14\end{pmatrix}$

B. $\quad \begin{pmatrix}1&-1&1\\2&3&-3\\-2&2&4\end{pmatrix}\begin{pmatrix}x\\y\\z\end{pmatrix} = \begin{pmatrix}2\\1\\12\end{pmatrix}$

C. $\quad \begin{pmatrix}1&-1&1\\2&3&-3\\-1&6&0\end{pmatrix}\begin{pmatrix}x\\y\\z\end{pmatrix} = \begin{pmatrix}2\\-1\\11\end{pmatrix}$

D. $\quad \begin{pmatrix}-2&2&4\\2&3&-3\\-3&3&3\end{pmatrix}\begin{pmatrix}x\\y\\z\end{pmatrix} = \begin{pmatrix}1\\-1\\12\end{pmatrix}$


Now we can finally restate Gaussian elimination in matrix form. First, let's simplify the notation even further by dropping the variables completely: \begin{aligned} x + 2y + 3z &= 6 \\ 3x + 4y + z &= 8 \\ 2x - y + z &= 2 \end{aligned} \quad\implies\quad \begin{pmatrix}1&2&3&6\\3&4&1&8\\2&-1&1&2\end{pmatrix}, where the last column represents the RHS of each equation. (This is called the augmented matrix of the system.) Now we follow this algorithm:

• Determine what's called a “pivot,” i.e. the top-leftmost nonzero number that isn’t in a row or column already used. (If necessary, scale the whole row to make the pivot entry 1.) Here the pivot is shown in blue.

\begin{aligned} \begin{pmatrix} \color{#3D99F6}{1}&2&3&6\\3&4&1&8\\2&-1&1&2 \end{pmatrix} \end{aligned}

• For each row $r$ below the pivot row $p,$ multiply the pivot row by the leading entry and subtract: $r \mapsto r- (\text{leading entry of}\ r) \cdot p$

\begin{aligned} r_{2} &\rightarrow r_{2}-3 r_{1} \\ r_{3} &\rightarrow r_{3}-2 r_{1} \end{aligned} \begin{aligned} \begin{pmatrix}\color{#3D99F6}{1}&2&3&6\\0&-2&-8&-10\\0&-5&-5&-10\end{pmatrix} \end{aligned}

Each row below the pivot row will be left with zeros in that column.

• Repeat until there are no more pivots. At this point, the matrix will have all 0s in the lower left triangle.

\begin{aligned} r_{2} & \rightarrow -\frac{1}{2} r_{2} \\ r_{3} & \rightarrow r_{3} + 5 r_{2} \end{aligned}

\begin{aligned} \begin{pmatrix}\color{#3D99F6}{1}&2&3&6\\0&\color{#3D99F6}{1}&4&5\\0&0&15&15\end{pmatrix} \end{aligned}

Now we need a pivot in $r_3$:

\begin{aligned} r_{3} \rightarrow \frac{1}{15} r_{3} \end{aligned}

\begin{aligned} \begin{pmatrix}\color{#3D99F6}{1}&2&3&6\\0&\color{#3D99F6}{1}&4&5\\0&0&\color{#3D99F6}{1}&1\end{pmatrix} \end{aligned}

Now let’s look at that last row: the equation it represents is simply $z = 1$. Back-substituting into the second row gives $y = 5 - 4z = 1$, and then the first row gives $x = 6 - 2y - 3z = 1$.
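The whole procedure above can be sketched in code. Here is a minimal forward elimination plus back-substitution on the augmented matrix (a teaching sketch, assuming NumPy and assuming every pivot encountered is nonzero; robust implementations also swap rows to choose large pivots for numerical stability):

```python
import numpy as np

def gaussian_elimination(aug):
    """Solve a square system given its augmented matrix [A | b].

    Assumes every pivot encountered is nonzero (no row swaps needed),
    which is enough for the worked example above.
    """
    M = aug.astype(float).copy()
    n = M.shape[0]

    # Forward elimination: reduce to row-echelon form with unit pivots.
    for p in range(n):
        M[p] /= M[p, p]                 # scale so the pivot entry is 1
        for r in range(p + 1, n):
            M[r] -= M[r, p] * M[p]      # clear the column below the pivot

    # Back-substitution: solve from the last equation upward.
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):
        x[i] = M[i, n] - M[i, i + 1:n] @ x[i + 1:]
    return x

aug = np.array([[1, 2, 3, 6],
                [3, 4, 1, 8],
                [2, -1, 1, 2]])
print(gaussian_elimination(aug))  # [1. 1. 1.]
```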


The “simple” matrices we produced above have a special name: row-echelon form. A matrix is in row-echelon form when it has the following two properties:

1. Every row containing a nonzero number is above all the rows that only have zeros.
2. Every row’s pivot (i.e. leftmost nonzero number) is to the right of the pivot in the row above it.

For example, $\begin{pmatrix}1&3&2&4&0\\0&0&8&2&1\\0&0&0&3&-1\end{pmatrix}$ is in row-echelon form. As we saw, these matrices are important because the systems they represent are easy to solve, and Gaussian elimination is useful because it produces precisely these matrices.
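The two defining properties translate directly into a short checker (an illustrative sketch assuming NumPy; the function name is ours, not standard):

```python
import numpy as np

def is_row_echelon(M):
    """Check the two defining properties of row-echelon form."""
    last_pivot = -1
    seen_zero_row = False
    for row in np.asarray(M):
        nonzero = np.nonzero(row)[0]
        if len(nonzero) == 0:
            seen_zero_row = True      # every later row must also be zero
            continue
        if seen_zero_row:
            return False              # nonzero row below an all-zero row
        pivot = nonzero[0]            # column of the leftmost nonzero entry
        if pivot <= last_pivot:
            return False              # pivot not strictly right of the one above
        last_pivot = pivot
    return True

print(is_row_echelon([[1, 3, 2, 4, 0],
                      [0, 0, 8, 2, 1],
                      [0, 0, 0, 3, -1]]))  # True
print(is_row_echelon([[1, 5, 4, 3],
                      [0, 0, 3, 4],
                      [0, 0, 12, 2]]))     # False: two pivots share a column
```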

Which of the following is a row-echelon form of this matrix: $\begin{pmatrix}1&5&4&3\\2&6&3&4\\1&2&6&2\end{pmatrix}?$

A. $\quad \begin{pmatrix}1&5&4&3\\0&-4&-5&-2\\0&-3&2&-1\end{pmatrix}$

B. $\quad \begin{pmatrix}1&5&4&3\\0&-4&-5&-2\\0&0&23&2\end{pmatrix}$

C. $\quad \begin{pmatrix}1&5&4&3\\0&-4&-5&-2\\0&0&0&12\end{pmatrix}$

D. $\quad \begin{pmatrix}1&5&4&3\\0&0&3&4\\0&0&12&2\end{pmatrix}$


We’ve now seen how Gaussian elimination provides solutions to matrix equations of the form $Ax = b,$ where $A$ is the matrix of coefficients, $x$ is the matrix of variables, and $b$ is the matrix of the right hand side (RHS).

As we know, not all systems have solutions. Is there a choice of $b$ that guarantees a solution for $x$ exists, regardless of what $A$ is?


In the previous problem, we saw that $Ax=0$ always has at least one solution, namely $x=0$. But in some cases, there might be more solutions as well. For instance, if $A = \begin{pmatrix}1&2&-3\\2&-2&-3\end{pmatrix},$ then $x = \begin{pmatrix}4\\1\\2\end{pmatrix}$ is a solution as well (try it!).
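The “try it!” is a one-line check (an illustration assuming NumPy):

```python
import numpy as np

A = np.array([[1, 2, -3],
              [2, -2, -3]])
x = np.array([4, 1, 2])

print(A @ x)  # [0 0] -- so x solves Ax = 0
```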

We'll use this example to see something very important about general solutions of linear systems. First, let's consider a system similar to the one above: $A x = \begin{pmatrix}1&2&-3\\2&-2&-3\end{pmatrix} \begin{pmatrix} x \\ y \\ z \end{pmatrix} = \begin{pmatrix}3\\0\end{pmatrix} = b.$

Which of the following is also a solution?
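The key fact this problem illustrates: if $x_p$ solves $Ax = b$ and $x_h$ solves $Ax = 0$, then $x_p + x_h$ also solves $Ax = b$, since $A(x_p + x_h) = Ax_p + Ax_h = b + 0 = b$. A numerical sketch (assuming NumPy; here $(1, 1, 0)$ is one particular solution, found by inspection):

```python
import numpy as np

A = np.array([[1, 2, -3],
              [2, -2, -3]])
b = np.array([3, 0])

x_p = np.array([1, 1, 0])   # a particular solution of Ax = b
x_h = np.array([4, 1, 2])   # the homogeneous solution from above (Ax = 0)

print(A @ x_p)          # [3 0] -- solves Ax = b
print(A @ (x_p + x_h))  # [3 0] -- still a solution after adding x_h
```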


In this chapter, we saw how Gaussian elimination translates naturally into the language of matrices, which provides a starting point for exploring them further. Indeed, linear algebra is all about solving the equation $Ax = b$.

We also saw the special importance of the case where $b = 0$, which leads naturally into the topic of vector spaces, as we will see in the next chapter!
