# Testing out large integer arithmetic and Euclidean division in general

Even though there are multiple large integer arithmetic packages in the programming languages I usually write in (especially since .NET apparently introduced a standard one in version 4), I thought I’d test out writing one myself just to get a general feel for how a naive implementation I could come up with on the fly would perform. I represented my numbers as lists of digits and implemented the same algorithms you would use to do arithmetic by hand. I especially enjoyed seeing how long division performed in contrast to naive division, which is just iterated subtraction.
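The digit-list setup might look something like the following (a minimal sketch of my own, not the post’s actual code): numbers stored as base-10 digits, least significant first, with schoolbook addition and subtraction, and division done the naive way, by iterated subtraction.

```python
def to_digits(s):
    """'1234' -> [4, 3, 2, 1] (least significant digit first)."""
    return [int(c) for c in reversed(s)]

def from_digits(d):
    return ''.join(str(x) for x in reversed(d)).lstrip('0') or '0'

def cmp_digits(a, b):
    """-1, 0 or 1 depending on whether a <, = or > b."""
    if len(a) != len(b):
        return -1 if len(a) < len(b) else 1
    for x, y in zip(reversed(a), reversed(b)):
        if x != y:
            return -1 if x < y else 1
    return 0

def add(a, b):
    out, carry = [], 0
    for i in range(max(len(a), len(b))):
        s = (a[i] if i < len(a) else 0) + (b[i] if i < len(b) else 0) + carry
        out.append(s % 10)
        carry = s // 10
    if carry:
        out.append(carry)
    return out

def sub(a, b):
    """Schoolbook subtraction with borrowing; assumes a >= b."""
    out, borrow = [], 0
    for i in range(len(a)):
        d = a[i] - (b[i] if i < len(b) else 0) - borrow
        borrow = 1 if d < 0 else 0
        out.append(d + 10 * borrow)
    while len(out) > 1 and out[-1] == 0:  # strip leading zeros
        out.pop()
    return out

def divmod_naive(a, b):
    """Naive division: subtract b until what remains is smaller than b."""
    q = [0]
    while cmp_digits(a, b) >= 0:
        a = sub(a, b)
        q = add(q, [1])
    return q, a

q, r = divmod_naive(to_digits('100'), to_digits('7'))
print(from_digits(q), from_digits(r))  # 14 2
```

Long division is the same subtraction done one digit position at a time, which is what makes it so much faster than the loop above once the quotient gets large.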

It only took me a couple of hours, but I felt it was a nice exercise in formalizing my memorized algorithms for dealing with numbers and seeing them applied mechanically by a machine which does not care about what they might represent. I also tried completely abandoning native number handling and making the arithmetic symbolic, using characters such as ‘1’ and ‘2’ and preprogramming the basic building blocks as functions like Add(‘2’, ‘1’), but after a while it got a little absurd since I knew the characters were represented numerically by the programming language anyway. It reminded me of when a friend of mine at the university tried a similar thing a few years back using only functions all the way down, similar to a Peano-style set construction of the numbers, and beyond the elegance of the construction it wasn’t very effective or easy to work with.
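For what it’s worth, the symbolic variant can be sketched like this (a hypothetical reconstruction of the idea, not the code I actually wrote): digits are characters with successor and predecessor tables, and Add counts one operand down to ‘0’ while stepping the other up, wrapping past ‘9’ to produce a carry.

```python
DIGITS = '0123456789'
SUCC = dict(zip(DIGITS, DIGITS[1:]))  # '0' -> '1', ..., '8' -> '9'
PRED = dict(zip(DIGITS[1:], DIGITS))  # '9' -> '8', ..., '1' -> '0'

def Add(a, b):
    """Add two digit characters symbolically; returns (digit, carry)."""
    carry = '0'
    while b != '0':
        if a == '9':          # stepping past '9' wraps and sets the carry
            a, carry = '0', '1'
        else:
            a = SUCC[a]
        b = PRED[b]           # one step of b spent on one step of a
    return a, carry

print(Add('2', '1'))  # ('3', '0')
print(Add('9', '3'))  # ('2', '1')
```

Of course the lookup tables are themselves ordinary data indexed by the machine’s native numbers underneath, which is exactly where the absurdity creeps in.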

Anyways, as I said, I was just trying things out, and part of the final goal was to see how the Euclidean Algorithm would play out in finding the greatest common divisor of my thousand-plus digit (base-10) integers. There probably isn’t an algorithm which comes close to the EA in being both so simple and so effective at solving a problem, and I’m a little disappointed that it didn’t make it into the Swedish math curriculum in the latest revision. Sure, it’s not the most useful thing in the world to know, but there are just so many beautiful things you can do with the concept of coprime numbers once you’ve become comfortable with them, my favorite being designing the periods of Star Polygons and, by extension, Hypocycloids.
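The algorithm itself is just a handful of lines. Here it is sketched over Python’s native integers (my implementation ran the same loop over the digit lists), together with the star-polygon connection: a {n/k} star polygon visits n/gcd(n, k) of the n vertices before closing, so it traces every point exactly when n and k are coprime.

```python
def gcd(a, b):
    """Euclidean Algorithm: replace (a, b) by (b, a mod b) until b is 0."""
    while b:
        a, b = b, a % b
    return a

def star_period(n, k):
    """Number of vertices a {n/k} star polygon visits before it closes."""
    return n // gcd(n, k)

print(gcd(1071, 462))      # 21
print(star_period(12, 5))  # 12 -- coprime step, every vertex is hit
print(star_period(12, 4))  # 3  -- degenerates to a triangle
```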

I’m not too happy with the performance of the implementation I ended up with, but it worked well enough. I was able to find the gcd of two numbers with up to 8000 digits in around 2 to 3 minutes; that would probably have been much lower had I used better List types and not relied as much on unoptimized lookup and object cloning, but it’s a worst case, and it verified to me that anyone could make a functioning implementation in an afternoon, so it could be given as a lab exercise should I ever have reason to suggest one. Plus, just seeing the numbers rattle over a printout, with the remainders becoming smaller and smaller in every iteration, is beautiful regardless of speed.

In just playing around with this GCD algorithm I also came across the fact that when two numbers are chosen at random the probability of them being coprime is fairly high, and I recalled the rather elegant reasoning behind it: since the probability of a number having a prime factor $p$ is informally $1/p$, the probability that two numbers do not both have the factor $p$ is $1 - 1/p^2$, and so the total probability that no prime factor is shared by the two numbers is, in a sense, $(1 - 1/2^2)(1 - 1/3^2)(1 - 1/5^2) \cdots$, which is equivalent to good ol’ Basel’s problem

$P = \left (\prod_{p \in \mathbb P} \cfrac{1}{1 - 1/p^2} \right )^{-1} = \left ( \prod_{p \in \mathbb P} \sum_{k = 0}^\infty \cfrac{1}{p^{2k}} \right )^{-1} = \left ( \sum_{n = 1}^\infty \cfrac{1}{n^2} \right )^{-1} = \cfrac{6}{\pi^2}$

and where $6/\pi^2 \approx 0.61$.
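That figure is easy to check empirically (a quick sketch of my own, not part of the original experiment): sample random pairs and count how often their gcd is 1.

```python
import math
import random

random.seed(1)
trials = 100_000
# Count pairs of uniformly drawn integers whose gcd is exactly 1.
hits = sum(
    math.gcd(random.randrange(1, 10**9), random.randrange(1, 10**9)) == 1
    for _ in range(trials)
)
print(hits / trials)     # fluctuates around 6/pi^2
print(6 / math.pi ** 2)  # 0.6079...
```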