If there has been one recurring theme so far, it’s that more organized data can be valuable for certain types of computations, like searching for an element in a list. Specifically, we’ve noted that having a sorted list is generally more useful than having an unsorted list.

So, how can we sort a list? As with searching, we’ll start with a naive, intuitive approach (called **insertion sort**). In a later quiz, we’ll explore more efficient methods of sorting.

Suppose that we have a sorted array, \(A\), and a single element outside of it, \(x\). The natural question is, how can we place this element into our array while still keeping it sorted?

This operation is known as an insertion, and there are multiple ways to do it. However, in this chapter we will only concern ourselves with the method used in insertion sort, which uses the steps outlined below.

- Place \(x\) at the end of \(A\).
- Compare \(x\) to the element on its left. There are three possibilities:
  - If there is no element to the left, then we are done: \(x\) is the smallest element, and it is already at the start of the array.
  - If \(x\) is greater than or equal to that element, then we are done; \(x\) now sits after all of the smaller elements and before all of the larger ones, so the array is once again sorted.
  - If \(x\) is smaller, then it belongs before that element. Swap the two to move \(x\) one position to the left.
- Repeat step two until finished.

With each repetition of step two, we will either finish inserting, or reduce the number of elements ahead of \(x\) by one. Since we are also finished when no more elements remain in front of \(x\), repeating step two is guaranteed to successfully make an insertion.
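The steps above can be sketched in Python. This is a minimal illustration, not a prescribed implementation; the helper name `insert_last` is our own, and it assumes everything except the last element of the list is already sorted:

```python
def insert_last(a):
    """Insert the last element of `a` into the sorted rest of the list.

    Assumes a[:-1] is already sorted. The new element repeatedly swaps
    with its left neighbor until it reaches the left edge of the array
    or its left neighbor is no larger (the three cases in step two).
    """
    i = len(a) - 1                        # x starts at the end of the array
    while i > 0 and a[i] < a[i - 1]:      # stop at the left edge, or once a[i-1] <= x
        a[i], a[i - 1] = a[i - 1], a[i]   # swap x with the element on its left
        i -= 1
    return a

print(insert_last([1, 3, 5, 7, 4]))  # [1, 3, 4, 5, 7]
```

Each pass of the loop corresponds to one repetition of step two: either the element is in place and the loop ends, or it moves one position to the left.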

The full insertion sort algorithm works by dividing an array into two regions: a sorted region on the left and an unsorted region on the right. Then, by repeatedly inserting elements from the unsorted region into the sorted region, the algorithm eventually produces a fully sorted array. The full steps of this process for an array, \(A\), are shown below.

- Designate the leftmost element of \(A\) as the only element of the sorted side. This side is guaranteed to be sorted by default, since it now contains only one element.
- Insert the first element of the unsorted side into the sorted side, increasing the number of sorted elements by one.
- Repeat step two until there are no unsorted elements left.

Notice that this method doesn’t require us to create a new array to store the sorted values. All we have to do is keep track of how much of the original array is sorted. This makes insertion sort an in-place algorithm.
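The full algorithm can be sketched in a few lines of Python. Again, this is one possible rendering of the steps above (the name `insertion_sort` is ours): only an index tracking the boundary of the sorted region is needed, so the sort happens in place.

```python
def insertion_sort(a):
    """Sort the list `a` in place by growing a sorted region from the left."""
    for n in range(1, len(a)):            # a[:n] is sorted; insert a[n] next
        i = n
        while i > 0 and a[i] < a[i - 1]:  # bubble a[n] left into position
            a[i], a[i - 1] = a[i - 1], a[i]
            i -= 1
    return a

print(insertion_sort([31, 41, 59, 26, 53]))  # [26, 31, 41, 53, 59]
```

Note that the loop starts at index 1: the leftmost element alone forms the initial sorted region, exactly as in step one.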

To summarize, **Insertion sort** is a sorting algorithm that builds a final sorted array one element at a time. The following is a visual representation of how insertion sort works:

**Insertion Sort:**

We start from the left. At first there is one element (which is, trivially, already sorted).

Loosely speaking, for each subsequent element, we compare it to the elements on its left, moving right to left, until we find where it “fits” in the sorted portion of the array.

Since we are inserting the new element into the side of the array that is already sorted, the result will still be sorted.

After we’ve inserted every element, we have a sorted array!
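The walkthrough above can be made concrete by printing the array after each insertion. The input here is a hypothetical example of our own choosing; the sorted region grows from the left with every line:

```python
def insertion_sort_trace(a):
    """Insertion sort that prints the array after each insertion."""
    for n in range(1, len(a)):
        i = n
        while i > 0 and a[i] < a[i - 1]:
            a[i], a[i - 1] = a[i - 1], a[i]
            i -= 1
        print(a)

insertion_sort_trace([4, 2, 5, 1, 3])
# [2, 4, 5, 1, 3]
# [2, 4, 5, 1, 3]
# [1, 2, 4, 5, 3]
# [1, 2, 3, 4, 5]
```

Notice the second line is unchanged: the 5 was already larger than everything to its left, so its insertion took just one comparison.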

Suppose you had the following poker hand dealt to you, and decided to use insertion sort to sort it from smallest to largest:

Note that \(\mathbf{A>K>Q>J>2.}\) How many individual comparisons between cards would need to be made for the sort?

**Reminder:** First, the \(K\) and \(2\) are compared. Then, the \(A\) and \(K\) are compared, then the \(Q\) and \(A\), and so on.

In the worst case for a list with 10 distinct elements, how many comparisons are made?

In the best case for a list with 10 distinct elements, how many comparisons are made?

The worst case grows like \(n^2\) for large \(n,\) while the best case grows linearly (like \(n\)). Unfortunately for insertion sort, as we'll explore in the next quiz, typical inputs behave much more like the worst case than the best: the average case also grows like \(n^2.\)
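A quick empirical check of these two cases (using a comparison-counting function of our own, not part of the text): instrumenting the insertion loop so that every comparison with a left neighbor is tallied reproduces both extremes for a list of 10 distinct elements.

```python
def count_comparisons(a):
    """Count element-to-element comparisons made by insertion sort on `a`."""
    a = list(a)
    count = 0
    for n in range(1, len(a)):
        i = n
        while i > 0:
            count += 1               # one comparison with the left neighbor
            if a[i] >= a[i - 1]:
                break                # element is in place; insertion is done
            a[i], a[i - 1] = a[i - 1], a[i]
            i -= 1
    return count

print(count_comparisons(range(10, 0, -1)))  # reverse-sorted input: 45
print(count_comparisons(range(1, 11)))      # already-sorted input: 9
```

In the reverse-sorted case, inserting the \(k^\text{th}\) element takes \(k-1\) comparisons, giving \(1 + 2 + \dots + 9 = 45\) in total; in the sorted case, every insertion stops after a single comparison.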

Thankfully, we can do better. In later quizzes, we’ll dive into more complex, more efficient sorting algorithms; their worst case performance will increase like \(n \log n\) for large \(n.\)

Before we get there, we need to take a detour to start formalizing this language of computational complexity. How do we actually express how many computations an algorithm will use, and how do we meaningfully compare algorithms? We’ll explore this in the next quiz, on big O notation.
