Hi everyone, here is my second post on computer science. Contrary to what I said before (sorry!), this will be about big-O notation, a very important concept in computer science.

Big-O notation is a way to describe how the running time of a program grows as its input grows, or how much memory it uses. We'll focus on time for now, but the same ideas apply to memory and other resources.

The "big-O" of the run time of a program is usually denoted as $O(f(n))$, where $f(n)$ is some function of $n$. For example, if a program runs in big-O of $n$ squared, we would write that as $O(n^2)$.

Usually, $n$ denotes the size of the input to the program, and the function counts the number of operations the computer executes while running the program. The run time is analyzed as $n$ approaches infinity, so if the number of operations is a sum of multiple terms, only the fastest-growing term is kept. Also, any constant coefficients are ignored.
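For instance (the numbers here are just made up for illustration), a program that performs $3n^2 + 5n + 7$ operations simplifies like this:

```latex
3n^2 + 5n + 7 \;\longrightarrow\; 3n^2 \;\longrightarrow\; O(n^2)
% keep the fastest-growing term, then drop its coefficient
```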

For example, here are some possible big-O categories from fastest to slowest, along with some programs that might fall under each category:

$O(1)$: This is the best big-O category. It means that no matter how large the input is, the program always takes about the same amount of time. One example is a program that takes in a list and returns the first element of the list.
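As a minimal sketch of this (function name made up for illustration):

```python
def first_element(lst):
    # One operation regardless of how long the list is: O(1).
    return lst[0]

print(first_element([7, 3, 9]))  # -> 7
```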

$O(\log(n))$: This is a pretty nice big-O category. It grows much slower than $O(n)$. For example, binary search on a sorted list takes $O(\log(n))$ time. Note that the base of the logarithm is not included, since all logarithms differ only by a factor of a constant.
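Here is a sketch of binary search in Python (one common way to write it; the variable names are my own):

```python
def binary_search(sorted_list, target):
    """Return the index of target in sorted_list, or -1 if absent.

    Each iteration halves the remaining search range, so at most
    about log2(n) iterations run: O(log(n)).
    """
    lo, hi = 0, len(sorted_list) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_list[mid] == target:
            return mid
        elif sorted_list[mid] < target:
            lo = mid + 1   # target must be in the right half
        else:
            hi = mid - 1   # target must be in the left half
    return -1

print(binary_search([1, 3, 5, 7, 9], 7))  # -> 3
```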

$O(n)$: This big-O category usually results when a program iterates over data one or more times. For example, a program to find the maximum value in a list usually falls under this category, as it loops over the list once keeping track of the maximum so far.
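A sketch of that maximum-finding program (assuming the list is nonempty):

```python
def find_max(lst):
    # One pass over the list, tracking the largest value seen: O(n).
    best = lst[0]
    for x in lst[1:]:
        if x > best:
            best = x
    return best

print(find_max([2, 9, 4]))  # -> 9
```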

$O(n\log(n))$: Included because often-used sorting algorithms, such as mergesort and quicksort, fall under this category.
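A rough sketch of mergesort, to show where the $n\log(n)$ comes from: there are about $\log_2(n)$ levels of splitting, and each level does $O(n)$ work merging.

```python
def merge_sort(lst):
    # Base case: a list of 0 or 1 elements is already sorted.
    if len(lst) <= 1:
        return lst
    mid = len(lst) // 2
    left = merge_sort(lst[:mid])
    right = merge_sort(lst[mid:])
    # Merge the two sorted halves in O(n) time.
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

print(merge_sort([5, 2, 8, 1]))  # -> [1, 2, 5, 8]
```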

$O(n^2)$: This category usually arises when a program has a loop within a loop, each running over the input: $n$ iterations of an inner loop for each of $n$ iterations of an outer loop gives about $n^2$ operations.
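One example of such a nested loop (my own illustration): checking a list for duplicates by comparing every pair of elements.

```python
def has_duplicate(lst):
    # For each element, scan everything after it: roughly
    # n * n / 2 comparisons in the worst case, which is O(n^2).
    for i in range(len(lst)):
        for j in range(i + 1, len(lst)):
            if lst[i] == lst[j]:
                return True
    return False

print(has_duplicate([1, 2, 3, 2]))  # -> True
```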

$O(a^n)$: This is a very slow big-O category. Usually, a program in this category tries every combination of $n$ values, each of which can be any of $a$ possibilities, for $a^n$ combinations in total.
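A sketch of that kind of enumeration (function name made up for illustration), using `itertools.product` to generate every combination:

```python
from itertools import product

def all_settings(n, values):
    # Each of n positions can take any of len(values) values, so
    # there are a**n combinations in total: O(a^n).
    return list(product(values, repeat=n))

# 2 possible values, 3 positions -> 2**3 = 8 combinations
print(len(all_settings(3, [0, 1])))  # -> 8
```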

$O(n!)$: This is one of the worst big-O categories. Usually, a program in this category works with all permutations of a list.

There are many more possible categories, including products of the above, but the above are very common.

When solving computer science problems, big-O notation can be a valuable tool to figure out whether an approach is feasible or not. Here is an example of using it:

Write a program that prints all permutations of the first $n$ positive integers that are in strictly increasing order.

Okay, perhaps the program is really easy to write, since the only such permutation is the obvious one: $1, 2, \ldots, n$ itself. However, let us consider another approach: going through every permutation and checking whether its elements are in increasing order. There are $n!$ permutations, and checking each one takes up to $n-1$ comparisons, so this approach runs in $O(n \times n!)$ time. This is infeasible, as $n!$ grows much too fast: even $n = 15$ gives over a trillion permutations.
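The brute-force approach can be sketched like this (kept small, since it really is this slow):

```python
from itertools import permutations

def increasing_permutations(n):
    # Brute force: examine all n! permutations of 1..n and keep the
    # strictly increasing ones -- factorial time, infeasible for
    # anything but tiny n.
    result = []
    for perm in permutations(range(1, n + 1)):
        if all(perm[i] < perm[i + 1] for i in range(n - 1)):
            result.append(perm)
    return result

print(increasing_permutations(4))  # -> [(1, 2, 3, 4)]
```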

Here is another problem:

Pseudocode:

```
function sum(list):
    sum = 0
    for each element in list:
        current = element
        add current to sum
    sum = sum * 1
    return sum
```

Find the big-O category for this program.

We can quickly see that this program finds the sum of the elements in a list. It may not be the most efficient, but it works.
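Here is the pseudocode translated into Python as a sketch, keeping its structure (including the redundant multiplication) so the operation count below matches:

```python
def list_sum(lst):
    # Direct translation of the pseudocode above.
    total = 0
    for element in lst:
        current = element         # one operation per element
        total = total + current   # one operation per element
    total = total * 1             # redundant, but kept from the pseudocode
    return total

print(list_sum([1, 2, 3]))  # -> 6
```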

First, it sets sum to 0, which takes one operation. Then, it iterates over the list, performing two operations per element, for $2n$ operations in total. Finally, it performs two more operations: the multiplication and the return.

Overall, the total time is $1+n\times 2+2=2n+3$. The fastest growing term is $n$, and we ignore the coefficient of 2, so the big-O category is $O(n)$.

One last note: while the big-O category can help in analyzing algorithms, it isn't the only thing to consider. Programs within the same category can still be significantly faster or slower than one another. Also, don't forget to consider memory usage as well as time.

Problems:

Think about big-O notation and try to categorize some programs you write.

I'll put up some problems asking for the big-O time category of some programs soon.

Thanks for reading! I hope you use big-O notation to analyze algorithms in the future.

