15 October 2007

Should everyone know number theory?

From Anarchaia: Why everyone should know number theory, by Minhyong Kim.

I'm not convinced, even if "everyone" means "all mathematicians". Kim's argument seems to be that everything mathematicians ever write down is a finite collection of data, and that this data can be encoded as a single very large number (say you enter it into a computer and read the resulting binary string of 0s and 1s as an integer). This reminds me of Gödel numbers (which assign a number to each logical statement in a certain formal language); writing down the rules that take you from one Gödel number representing a true statement to another is impossibly ugly.

However, I'm really just getting this from the title, which Kim eventually admits is an exaggeration. Kim later says:
However, there is an even more essential reason why most practicing mathematicians can get by with a rather naive understanding of numbers and might be better off doing so. This is because of the extremely useful geometric picture of the real and complex numbers. Much of the time, it is perfectly reasonable to view the complex numbers as a geometric plane, and base all other constructions upon that picture, oblivious to the fine structure of our objects, pretty much as one can do plenty of classical physics without worrying about the fact that the macroscopic objects we are considering arise from the complicated interaction of elementary particles. Now, my claim is that the role of a number theorist in mathematics is exactly analogous to the role of a particle theorist in physics.

If that's true, why does everyone need to know number theory? Still, there are times when it's useful. To take a simple example, consider the central binomial coefficients,
{2n \choose n} = {(2n)! \over n! n!},

or even more generally any binomial coefficient at all. The binomial coefficients have nice number-theoretic properties; for example, if you look at them modulo 2, and color the odd ones black and the even ones white, you get something that looks suspiciously like the Sierpiński triangle. But parity is fine structure: this particular result comes out of counting the factors of 2 in the numerator and the denominator and subtracting, and since the two counts are usually about equal, one has to be careful about cancellation. A lot of the time what one really cares about is approximately how large this coefficient is. For this sort of question it's useful to use Stirling's approximation,
n! = \sqrt{2\pi n} \left( {n \over e} \right)^n \left( 1 + {1 \over 12n} + O\left(n^{-2}\right) \right)

which allows us to determine lots of things about the sizes of quantities involving factorials -- binomial coefficients, for example -- without getting anywhere near close enough to determine their parity.
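The mod-2 pattern mentioned above is easy to see directly. Here's a quick sketch (my code, not Kim's) that prints the parity of the binomial coefficients row by row, with "#" for odd entries and "." for even ones; the odd entries trace out the Sierpiński triangle:

```python
import math

# Parity of C(n, k) for the first 16 rows of Pascal's triangle:
# '#' marks odd entries, '.' marks even ones. Centering each row
# makes the Sierpinski triangle pattern visible.
rows = []
for n in range(16):
    row = "".join("#" if math.comb(n, k) % 2 else "." for k in range(n + 1))
    rows.append(row)

for row in rows:
    print(row.center(31))
```

By Lucas's theorem, row n contains 2^s odd entries, where s is the number of 1s in the binary expansion of n; rows like n = 15 (binary 1111) are entirely odd.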

For example, it turns out that
{2n \choose n} = {4^n \over \sqrt{\pi n}} \left( 1 - {1 \over 8n} + O\left(n^{-2}\right) \right)

which is the sort of information that might be useful in, say, probability; if I have some number of objects which are counted by the central binomial coefficients, and I want to know that, say, the proportion of them having some particular property is very small, this is much more valuable than knowing, say, the complete prime factorization of a binomial coefficient.
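This asymptotic is easy to check numerically. Here's a small sketch (my code, not from the essay) comparing the exact central binomial coefficient against the approximation; the relative error should shrink like O(1/n^2):

```python
import math

# Compare C(2n, n) with the asymptotic 4^n / sqrt(pi*n) * (1 - 1/(8n)).
# The next term in the expansion is of order 1/n^2, so the relative
# error should drop by roughly a factor of 100 when n grows tenfold.
errors = {}
for n in [10, 100]:
    exact = math.comb(2 * n, n)
    approx = 4**n / math.sqrt(math.pi * n) * (1 - 1 / (8 * n))
    errors[n] = abs(approx - exact) / exact
    print(n, errors[n])
```

Already at n = 10 the approximation is good to better than one part in ten thousand, which is far more precision than most probabilistic estimates need.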

The essay does give some interesting examples, though -- the classical ones are things like doubling the cube and squaring the circle, and there are a bunch of modern examples of theorems which live in complex geometry but whose proofs require number theory. The paper builds up to a famous theorem of Matiyasevich, which says that "listable" sets of natural numbers (those for which we can write a computer program that outputs all the numbers in the set, and no others) and Diophantine sets (those of the form {n : p(n, x_1, ..., x_k) = 0 has a solution in natural numbers} for some polynomial p with integer coefficients) are the same. So if you buy the idea that all mathematics can be encoded as "things computers can do", then you essentially have to believe this number-theoretic statement. And it's hard not to believe that about computers. I have sometimes told my students that if they are attempting to solve a problem, they only know a small number of things, so they could just try all of them. The mathematical community as a whole knows a large number of things, but still a finite number; so if we wanted a computer to solve a mathematical problem, we could conceivably enter it in some suitable form into a computer, along with all the facts we know, and tell the computer to "try everything". This isn't practical, though, which is why we still have mathematicians.
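The easy half of Matiyasevich's theorem -- that every Diophantine set is listable -- really is just "try everything": search over all candidate witnesses. Here's a sketch (my example, not Kim's) using the Diophantine set of sums of two squares, i.e. {n : n = x^2 + y^2 has a solution}:

```python
# A Diophantine set is listable: to list its members up to a bound,
# brute-force all candidate witnesses (x, y) and record which n
# actually arise. Here the defining equation is n = x^2 + y^2.
def diophantine_listing(bound):
    """Return the sorted members n <= bound of {n : n = x^2 + y^2}."""
    found = set()
    for x in range(bound + 1):
        for y in range(bound + 1):
            n = x * x + y * y
            if n <= bound:
                found.add(n)
    return sorted(found)

print(diophantine_listing(30))
```

The hard direction -- that every listable set, however wild, is cut out by some polynomial -- is the deep content of the theorem.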

However, the polynomials corresponding to "simple" listable sets are not so nice; there's a polynomial of degree 25 in 26 variables whose positive values are exactly the primes. You can see it here. (Supposedly the number of variables can be reduced to 10, at the cost of a much larger degree; 26 is kind of cute, though, because it's the number of letters in our alphabet.)

Kim also gives brief outlines of how the four-color theorem could have been proven this way, and of a possible proof of the Poincaré conjecture by the same method (basically, it's enough to prove it for simplicial complexes, which we can enumerate, although there are some annoying problems -- for instance, we can't determine whether a group is trivial by looking at a presentation of it). Maybe these reductions aren't so trivial after all.
