
Insertion Sort


If there has been one recurring theme so far, it’s that more organized data can be valuable for certain types of computations, like searching for an element in a list. Specifically, we’ve noted that having a sorted list is generally more useful than having an unsorted list.

So, how can we sort a list? As with searching, we’ll start with a naive, intuitive approach (called insertion sort). In a later quiz, we’ll explore more efficient methods of sorting.

Insertion sort is a sorting algorithm that builds a final sorted array one element at a time. The following is a visual representation of how insertion sort works:

Insertion Sort:

  • We start from the left. At first there is one element (which is, trivially, already sorted).

  • For each subsequent element, we compare it with the elements to its left, moving right to left, until we find where it “fits” in the sorted portion of the array.

  • Since we are inserting the new element into the side of the array that is already sorted, the result will still be sorted.

  • After we’ve inserted every element, we have a sorted array!
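The steps above can be sketched in Python. This is a minimal illustrative implementation using the common in-place shifting formulation; the function name is our own choice:

```python
def insertion_sort(arr):
    """Sort arr in place by inserting each element into the sorted prefix."""
    for i in range(1, len(arr)):
        key = arr[i]  # the next element to insert
        j = i - 1
        # Compare key with each sorted element to its left, moving right
        # to left, shifting larger elements one slot rightward.
        while j >= 0 and arr[j] > key:
            arr[j + 1] = arr[j]
            j -= 1
        arr[j + 1] = key  # key now "fits" here; the prefix stays sorted
    return arr
```

For example, `insertion_sort([5, 2, 4, 1, 3])` returns `[1, 2, 3, 4, 5]`.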

Suppose you had the following poker hand dealt to you, and decided to use insertion sort to sort it from smallest to largest:

Note that \(\mathbf{A>K>Q>J>2.}\) How many individual comparisons between cards would need to be made for the sort?

Reminder: First, the \(K\) and \(2\) are compared. Then, the \(A\) and \(K\) are compared, then the \(Q\) and \(A\) and so on.

Ari wants to prove that insertion sort is slow. What order should he put the cards in to make sorting the cards in ascending order maximally inefficient?

In the worst case for a list with 10 distinct elements, how many comparisons are made?

In the best case for a list with 10 distinct elements, how many comparisons are made?

The worst case increases like \(n^2\) for large \(n,\) while the best case increases linearly (like \(n\)). Unfortunately for insertion sort, as we’ll explore in the next quiz, the worst case is much more likely, and the average case also increases like \(n^2.\)
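One way to verify these growth rates is to instrument insertion sort with a comparison counter. This sketch (the function name is our own) sorts a copy of the input and returns how many element-to-element comparisons were made:

```python
def count_comparisons(arr):
    """Run insertion sort on a copy of arr; return the comparison count."""
    a = list(arr)
    comparisons = 0
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0:
            comparisons += 1  # one comparison of key vs. a[j]
            if a[j] <= key:
                break  # key fits here; stop scanning leftward
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key
    return comparisons

# With 10 distinct elements:
# descending input (worst case): 1 + 2 + ... + 9 = 45 comparisons
# ascending input (best case):   9 comparisons (one per insertion)
```

Running it on reversed and already-sorted inputs of increasing length shows the quadratic versus linear growth described above.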

Thankfully, we can do better. In later quizzes, we’ll dive into more complex, more efficient sorting algorithms; their worst case performance will increase like \(n \log n\) for large \(n.\)

Before we get there, we need to take a detour to start formalizing this language of computational complexity. How do we actually express how many computations an algorithm will use, and how do we meaningfully compare algorithms? We’ll explore this in the next quiz, on big O notation.

