03 December 2008

Logic as machine language

Gil Kalai mentions a metaphor I hadn't heard of before about the foundations of mathematics:
To borrow notions from computers, mathematical logic can be regarded as the “machine language” for mathematicians who usually use much higher languages and who do not worry about “compilation.”
Of course there would be analogues to the fact that certain computer languages are higher-level than others as well. To take an example dear to me, the theory of generating functions might be at a higher level than the various ad hoc combinatorial arguments it's often introduced to students as a replacement for. I don't want to press this metaphor too hard because it'll break -- I don't think there are analogues to particular computer languages. But feel free to disagree!
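As a toy illustration of the "higher-level" point (my example, not anything deep): the number of ways three dice sum to 10 is the coefficient of x^10 in (x + x^2 + ... + x^6)^3. A few lines of Python extract that coefficient by polynomial multiplication, and a brute-force enumeration plays the role of the ad hoc argument:

```python
from itertools import product

def poly_mul(p, q):
    """Multiply two polynomials given as coefficient lists."""
    out = [0] * (len(p) + len(q) - 1)
    for i, a in enumerate(p):
        for j, b in enumerate(q):
            out[i + j] += a * b
    return out

# Generating function for one die: x + x^2 + ... + x^6
die = [0, 1, 1, 1, 1, 1, 1]

# Three dice: coefficient of x^10 in (x + ... + x^6)^3
gf = poly_mul(poly_mul(die, die), die)
print(gf[10])  # 27

# Sanity check against the "ad hoc" enumeration
brute = sum(1 for r in product(range(1, 7), repeat=3) if sum(r) == 10)
print(gf[10] == brute)  # True
```

The generating-function version scales to questions where direct enumeration would be hopeless, which is exactly the sense in which it is a "higher-level language."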


Anonymous said...

I like it. Back in the late '80s when I taught a logic class, I put together a little program to compute a proof of the tautology (A→(B→C))→(B→(A→C)), a proof in this context meaning a sequence of formulas that are either axioms or follow from preceding formulas by an application of modus ponens. The resulting proof, after weeding out duplicates, is 31 formulas long. That such a trivial statement requires such a lengthy proof is already a good demonstration of the need for abstraction and higher level concepts. Of course, the work by Russell and Whitehead is the ultimate demonstration of same.
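[For the curious: checking that the formula really is a tautology takes only a truth-table sweep. This is a quick Python sketch, not the original late-'80s program, which searched for an actual axiomatic proof -- a much harder task.]

```python
from itertools import product

def implies(p: bool, q: bool) -> bool:
    """Material implication: p -> q."""
    return (not p) or q

def formula(a: bool, b: bool, c: bool) -> bool:
    # (A -> (B -> C)) -> (B -> (A -> C))
    return implies(implies(a, implies(b, c)),
                   implies(b, implies(a, c)))

# A tautology is true under every truth assignment.
is_tautology = all(formula(a, b, c)
                   for a, b, c in product([False, True], repeat=3))
print(is_tautology)  # True
```

The contrast is the point: eight rows of a truth table settle semantically what takes 31 formulas to establish syntactically from the axioms.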

Anonymous said...

A followup to my earlier comment: I don't actually think that most mathematicians (myself included) really think of what we do as based on first order logic and ZFC. It's just that we maintain an awareness that we might, in principle, reduce everything we do to those terms if we ever wished to. (Speaking as an analyst here ... I am sure not all mathematics actually happens inside ZFC, even in principle.)

Anonymous said...

One of the problems with machine-generated proofs is that they do derive everything from first principles. Another problem is that automatically figuring out which deductions are 'interesting' is very hard.

AgainstWords said...

Of course, most mathematicians do work in some formal system, if not ZFC. And even then, they may work in a higher-level language than the 'assembly language' of the foundations of their own field.

The idea that there could be a unified foundation in ZFC is perhaps a statement about 'intermediate languages' --- or, if we identify each 'assembly language' with the 'machine' it corresponds to, the statement that ZFC provides a broad foundation is akin to saying that ZFC's machine is adept at virtualizing the machines of many other branches of mathematics.

Anonymous said...

I don't exactly agree. Higher-level programming languages still offer the necessary primitives to be Turing complete, but they also come with better abstraction mechanisms, nicer syntax, etc. In other words, a higher-level language is another formal system, just a more pleasant one.

What mathematicians use day to day is not a full replacement formal system. Whenever you give a proof sketch, or don't completely formalize a proof, by analogy you're leaving the body of a function unwritten, trusting that a competent reader could easily fill it in.

The full analogy is the Curry-Howard correspondence, under which a proposition corresponds to a (static) type; omitting a proof then amounts to asserting that there probably exists at least one programming-language term (e.g. a function) inhabiting that type.
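[A sketch of the correspondence in Python, purely illustrative since Python's type hints aren't checked at runtime and don't enforce totality: the tautology from the first comment, (A→(B→C))→(B→(A→C)), becomes the type of a function that swaps argument order, and writing such a function is the "proof".]

```python
from typing import Callable, TypeVar

A = TypeVar("A")
B = TypeVar("B")
C = TypeVar("C")

# Under Curry-Howard, the proposition (A -> (B -> C)) -> (B -> (A -> C))
# becomes this function type; a total term of the type is a proof.
def swap(f: Callable[[A], Callable[[B], C]]) -> Callable[[B], Callable[[A], C]]:
    return lambda b: lambda a: f(a)(b)

# Instantiating A = int, B = str, C = str with a hypothetical example:
describe = lambda n: lambda s: f"{s}={n}"
print(swap(describe)("x")(3))  # x=3
```

The one-line body of `swap` is the whole proof, versus the 31-formula axiomatic derivation mentioned above -- the same gap between high- and low-level languages.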

pqnelson said...

I was just Googling this very subject!

Perhaps one should point to Russell and Whitehead's Principia as an example of when logic behaves like the "machine language of mathematics".

It does lead to an interesting question for people trying to construct automated theorem provers: why not bootstrap something "high-level"?

After all, we have high level languages which compile to machine code...why can't we do likewise for automated proof checking (or "theorem proving")?