Why do computers like to calculate Pi?


Pi has always held a special place in my heart, and probably in yours as well. When people ask me to pick a number between 1 and 10, I always pick Pi (or sometimes the square root of two), which hasn't made me the life of many parties.

And there's an entire community of Pi lovers like me out there: people who are just fanatical about calculating Pi to trillions of digits. With just the first 40 digits of Pi, you could calculate the circumference of our galaxy with an error smaller than the size of a proton, so calculating Pi to trillions of digits is quite superfluous.
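
That claim actually holds up on the back of an envelope. Here's a quick sanity check in Python; the figures for the galaxy's diameter and the proton's radius are rough, round numbers I'm plugging in just to get the orders of magnitude:

```python
# Back-of-envelope check on the 40-digits claim.
galaxy_diameter_m = 1e21     # Milky Way diameter: roughly 100,000 light years
proton_radius_m = 8.4e-16    # proton charge radius, roughly

pi_truncation_error = 1e-39  # error from keeping only ~40 digits of Pi

# Circumference C = pi * d, so the error in C is d * (error in pi).
circumference_error_m = galaxy_diameter_m * pi_truncation_error

print(f"{circumference_error_m:.0e} m")         # ~1e-18 m
print(circumference_error_m < proton_radius_m)  # True, by about three orders of magnitude
```

So why then do people do it?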

Because they can.

Obviously it's a bit more complicated than that, but calculating Pi to trillions of digits requires advances in both the speed and the reliability of the machine. Think about it: if you had an error in the four billionth digit, there's no way the remaining trillions of digits would be accurate anymore. So there's a huge challenge in building a machine that can calculate Pi to trillions of digits quickly and reliably enough.
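
To make "calculating Pi to N digits" concrete, here's a minimal sketch of the Chudnovsky series, the formula behind most modern record attempts, using Python's built-in decimal module. It's a toy next to record-setting software, but it shows the basic shape of the computation:

```python
from decimal import Decimal, getcontext

def chudnovsky_pi(digits: int) -> Decimal:
    """Compute Pi to about `digits` decimal places with the Chudnovsky series."""
    getcontext().prec = digits + 10       # working precision plus guard digits
    C = 426880 * Decimal(10005).sqrt()
    M, L, X, K = 1, 13591409, 1, 6        # exact integer series state
    S = Decimal(L)                        # running sum, starting at the k = 0 term
    for i in range(1, digits // 14 + 2):  # each term adds ~14 correct digits
        M = M * (K**3 - 16 * K) // i**3   # exact integer recurrence, no rounding
        L += 545140134
        X *= -262537412640768000
        S += Decimal(M * L) / X
        K += 12
    pi = C / S
    getcontext().prec = digits + 1        # drop the guard digits
    return +pi                            # unary + rounds to the new precision

print(chudnovsky_pi(50))  # 3.14159265358979323846...
```

Records are also typically verified with an independent formula, such as a BBP-type digit extraction that spot-checks digits near the end, precisely because of that reliability problem.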

But that begs the question: why Pi? Why not some other number?

Here’s my answer in video form:
