There was an interesting post on Reddit for generating a visual representation of the first 100,000 digits of π. I modified the code slightly to use Brilliant's color palette, and made a variation for producing τ. I ran this code on Ubuntu Linux, but it should work on any Unix-based platform (Linux, OS X, BSD, etc.), though you will have to figure out how to install the dependencies yourself.
This works by generating 100,000 digits of π, converting them to color pixels in a two-step process, and then rendering a final image. Here's the result:
If you'd like to do it yourself, you will need the basic command-line tools available on any Unix system, as well as:

- the `pi` utility, for generating digits of π quickly; you could generate them the way the original code does, but that was so slow it never finished running on my machine, and I gave up after ~30 minutes
- ImageMagick's `convert` utility, for producing the final PNG image
On Ubuntu, install dependencies like this:
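Assuming the dependencies are the `pi` digit generator and ImageMagick (which provides `convert`), the install command would be something like:

```shell
sudo apt-get install pi imagemagick
```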
Here's the original code, modified to use Brilliant's colors and the `pi` utility for faster execution:
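As a rough sketch of the pipeline (not the original script — the palette values and the 400×250 layout here are my own placeholders):

```shell
#!/bin/sh
# Sketch only. Assumes the `pi` utility is installed; the 10-color
# palette below is a placeholder (one "R G B" triple per digit 0-9),
# not Brilliant's actual palette.

DIGITS=100000

# 1. Generate the digits ("3.14159..."), strip the decimal point and
#    newlines, and keep exactly $DIGITS characters.
pi $((DIGITS + 2)) | tr -d '.\n' | head -c "$DIGITS" > digits.txt

# 2. Map each digit to its color, one "R G B" pixel per line.
fold -w1 digits.txt | awk '
BEGIN {
  split("27 27 27;204 48 48;236 145 38;250 206 66;106 180 82;" \
        "59 144 202;126 92 163;233 93 151;150 150 150;240 240 240", pal, ";")
}
{ print pal[$1 + 1] }' > pixels.txt

# 3. Add a plain-text PPM header (100,000 pixels laid out as 400x250)
#    and let ImageMagick render the PNG.
{ printf "P3\n400 250\n255\n"; cat pixels.txt; } > pi.ppm
convert pi.ppm pi.png
```

The PPM intermediate is just a convenience: it's a trivial text format to emit from a shell pipeline, and `convert` handles the compression.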
Use the `convert` utility to produce a PNG image from the raw pixel data.
To produce τ, modify the code slightly to multiply by 2 using an arbitrary-precision Unix calculator utility:
Pretty big difference between the two, no? =P
Can you figure out how to generate 1,000,000 digits of π or τ as a 1000×1000-pixel square image?