Kolmogorov Complexity
The Kolmogorov Complexity of an object or algorithm is the length of its shortest specification. In some sense, it can be thought of as algorithmic entropy: the amount of information contained in the object.
This is a screenshot from the game .kkrieger. Given those graphics, the game must be at least a few hundred megabytes, right? Well, it is just 96 kilobytes.
Informal Introduction
Consider the strings
11111111111111111111111111111111
and
4c1j5b2p0cv4w1x8rx2y39umgw5q85s7.
The most obvious way to describe the first string is to say that it contains 32 1's. That description contains fewer characters than the string itself.
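To make the contrast concrete, here is a sketch in Python (our choice of language, standing in for any fixed description language):

```python
# The first string has a description much shorter than itself; for the
# second, no description shorter than the string itself is apparent.
s1 = "1" * 32                             # an 8-character expression
s2 = "4c1j5b2p0cv4w1x8rx2y39umgw5q85s7"   # best description: itself

print(len('"1" * 32'), "characters describe a string of length", len(s1))
```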
On the other hand, the best way to describe the second string, even in a language like Haskell, is (probably) to just cite it verbatim.
This is why the second string is the more complex of the two.
The subtlety of describing things in a rich language such as English is familiar to anyone aware of the Berry Paradox.
The Berry Paradox claims that the complexity of every positive integer in English is at most eleven words: if that were not so, there would be a smallest positive integer not definable in under eleven words. But then that integer could be described as "the smallest positive integer not definable in under eleven words", a description of only ten words.
This can be viewed as one of the intuitive reasons for the uncomputability of Kolmogorov Complexity.
If you are interested in Code Golf, Procedural Generation, or Compression Theory, your field has something to do with Kolmogorov Complexity.
Definition
The Kolmogorov Complexity \(K_f(x)\) of a string \(x\), relative to a Turing Machine \(f\), is \[ K_f(x) = \min \{ |p| : f(p) = x \}, \] the length of the shortest program \(p\) that makes \(f\) output \(x\).
It appears that the complexity depends on \(f\) as well as on \(x\). This is, to some extent, unavoidable: the description really does depend on the description language.
However, if the Turing Machines are universal, the complexities of a string with respect to any two such machines differ by at most a constant.
This is known as the Invariance Theorem. The proof follows from the idea of encoding one Turing Machine, followed by a program for that machine. The first part (i.e., the encoded Turing Machine) does not depend upon the string and hence contributes only a constant.
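In symbols, the argument can be sketched as follows (the simulator notation \(s_{UV}\) is ours): if \(U\) and \(V\) are universal Turing Machines and \(s_{UV}\) is a program that makes \(U\) simulate \(V\), then for every string \(x\), \[ K_U(x) \leq K_V(x) + |s_{UV}|. \] Since \(|s_{UV}|\) does not depend on \(x\), and the symmetric bound holds by the same argument, \(|K_U(x) - K_V(x)| \leq c_{UV}\) for a constant \(c_{UV}\) independent of \(x\).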
Incompressibility and Randomness
A string \(x\) is incompressible if and only if \[K(x) \geq |x|, \] where \(|x|\) denotes the length of \(x\).
An incompressible string may also be called algorithmically random.
For all \(n\), there exist incompressible strings of length greater than or equal to \(n\).
The proof is a straightforward application of the pigeonhole principle.
There are \(2^n\) strings of \(n\) bits, but there are only \(2^n - 1\) descriptions of length less than \(n\).
This is because \[\sum_{i=0}^{n-1} 2^i = 2^n - 1. \]
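The count is easy to check in Python (the variable names are ours):

```python
# There are 2**n strings of n bits, but only 2**n - 1 descriptions
# of length less than n bits.
n = 8
num_strings = 2 ** n
num_short_descriptions = sum(2 ** i for i in range(n))  # 2^0 + ... + 2^(n-1)
print(num_strings, num_short_descriptions)  # prints: 256 255
```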
It is not very hard to see that a majority of strings are incompressible. However, we will soon see that incompressibility is undecidable, owing to the uncomputability of Kolmogorov Complexity.
A string \(x\) is \(c\text{-incompressible}\) if \[K(x) \geq |x| - c \] for some constant \(c\), where \(K(\cdot)\) stands for Kolmogorov Complexity.
In the language \(\Sigma^* \), where \(\Sigma = \{ 0, 1 \} \), what is the minimum possible number of strings of length \(8\) that are \(3\text{-incompressible}\)?
By this point, you should be able to see why your compression software compresses some files really well and others not so well, and why writing a good compression algorithm is hard, and interesting.
Uncomputability
We need to begin with a definition.
The complexity of a formal language is the length of the shortest program that lists all the strings in the language.
\(K(\cdot)\) is uncomputable.
No formal system of complexity \(n\) can prove that an object \(x\) has \(K(x) > n\).
Suppose not. In that case, the formal system would be able to prove that some object has complexity \(n' > n\). But this proof itself encodes a description of the object within \(n\) bits. Since \(n' > n\), this is a contradiction.
Since the complexity of any program is finite, it follows that for every program there is a string of higher complexity than the program, whose complexity the program cannot decide.
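This is essentially Chaitin's incompleteness theorem (a standard formulation, stated here for context): for a consistent formal system \(T\) describable in \(n\) bits, there is a constant \(L\) of order \(n\) such that \(T\) cannot prove \(K(x) > L\) for any specific string \(x\), even though only finitely many strings satisfy \(K(x) \leq L\).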
If this proof does not convince you, examine the following informal proof:
Suppose there is a function kolmogorov(x), whose definition is \(n\) bits long.
Now, we could define a program of length \(m\) bits like this (in Python-flavored pseudocode):

```python
def first_complex_string():
    for s in all_strings():        # enumerate strings in order of length
        if kolmogorov(s) > m + n:
            return s               # first string of complexity > m + n
```

This program is, by construction, supposed to output a string of complexity more than \(m+n\). Since the length of the program is exactly \(m+n\) bits, we have a contradiction: the program itself is a description of that string of length \(m+n\), so the string's complexity cannot exceed \(m+n\) after all.
Today is Alokananda's Birthday
As a tribute, Agnishom plans to generate a BF program that outputs the string Happy Birthday!
He sees that the following program would be the simplest for him to understand: Link
But then, after playing around a bit, he discovers that it is too verbose, and that the same output can be produced by a simpler program.
This inspires him to find the shortest BF program that prints Happy Birthday!
In general, he wants to write a (Python) program that outputs the shortest BF program that prints any given string.
Will he succeed?
Assume we are running BF on an infinite-cell memory and, furthermore, that our brains are not super-Turing.
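To see where the difficulty lies, consider the natural brute-force approach: run every candidate BF program under a step budget. The sketch below (bf_run is our own hypothetical helper, not part of the problem) can run any candidate, but no finite budget can certify that a still-running candidate will never print the target, which is exactly the halting problem.

```python
# A minimal BF interpreter with a step budget (input ',' omitted).
def bf_run(code, max_steps=100_000):
    tape, ptr, pc, out, steps = [0] * 30_000, 0, 0, [], 0
    jump, stack = {}, []                      # pre-match brackets
    for i, c in enumerate(code):
        if c == '[':
            stack.append(i)
        elif c == ']':
            j = stack.pop()
            jump[i], jump[j] = j, i
    while pc < len(code) and steps < max_steps:
        c = code[pc]
        if c == '+':
            tape[ptr] = (tape[ptr] + 1) % 256
        elif c == '-':
            tape[ptr] = (tape[ptr] - 1) % 256
        elif c == '>':
            ptr += 1
        elif c == '<':
            ptr -= 1
        elif c == '.':
            out.append(chr(tape[ptr]))
        elif c == '[' and tape[ptr] == 0:
            pc = jump[pc]                     # skip loop body
        elif c == ']' and tape[ptr] != 0:
            pc = jump[pc]                     # repeat loop body
        pc += 1
        steps += 1
    return ''.join(out)

# A looped candidate: ten passes adding 7 leave 70 in cell 1, i.e. 'F'.
print(bf_run("++++++++++[>+++++++<-]>."))  # prints F
```

Enumerating candidates by length and keeping the first one that prints the target only yields an upper bound on the shortest program; certifying minimality would require deciding that every shorter candidate never prints the target.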
If you have been following along, you should see that this is, in essence, a version of the Berry Paradox.
The philosophical question of whether humans can decide incompressibility or the halting problem remains open. If they could, that would mean the brain is an object of infinite complexity.
Incompressibility Proof Method
This is a generic proof technique that relies on the fact that there exist incompressible strings of every length. It is used to show that a property holds for a class of objects by showing that the contrary would imply the nonexistence of arbitrary-length random strings.
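Schematically, the template runs like this (our own phrasing): to prove a property \(P\), assume \(\neg P\), and show that the assumption yields a description of every sufficiently long string \(x\) with fewer than \(|x|\) bits, i.e. \[ \neg P \implies K(x) < |x| \] for all sufficiently long \(x\). This contradicts the existence of incompressible strings of every length, so \(P\) must hold.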
Prove that there are infinitely many primes