28 September 2007

Nested radicals, smoothness, and simplification

I saw an expression involving a nested radical, namely

\phi = \sqrt{ 1 + \sqrt{ 1 + \sqrt{ 1 + \sqrt{ 1 + \cdots } } } }
.
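
To see why this is the golden ratio (a quick aside of my own, assuming the nested radical actually converges): the whole tail under the outermost square root is again \phi, so

\phi = \sqrt{ 1 + \phi } \;\Longrightarrow\; \phi^2 - \phi - 1 = 0 \;\Longrightarrow\; \phi = \frac{ 1 + \sqrt{5} }{ 2 }
,

where the positive root is taken since \phi > 0.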

(That is, write φ = (1 + φ)^{1/2} and solve for φ.) The Wikipedia article on nested radicals led me to Simplifying Square Roots of Square Roots by Denesting. The authors tell us that:

The term surd is used by TeX as the name for the symbol √. Maple has a function called surd that is similar to the nth root defined here; like all good mathematical terms, the precise definition depends upon the context. In general, a mathematical term that does not have several conflicting definitions is not important enough to be worth learning.

This reminds me of a couple things that happened in my class yesterday. First, I was defining what it means for a curve to be smooth; our definition was that the curve given by the vector function r(t) is smooth if r'(t) is continuous and never zero, except perhaps at the endpoints of the interval over which it's defined. (This makes smoothness a property of a parametrization, which is a bit counterintuitive. I suppose that one could define a curve -- as an abstract set of points -- to be smooth if it has a smooth parametrization. Although I haven't worked it out, I assume that if a curve has a smooth parametrization, the arc-length parametrization is smooth.) One of the students said "but the professor said 'smooth' means something else!" I'm not sure if the professor actually said "smooth means X" or if he said "some people think smooth means X", but it's a good point. (In particular, "smooth" often seems to mean that a function has infinitely many continuous derivatives.)
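
To make the "property of a parametrization" point concrete, here is a small sympy sketch (my own illustration; neither sympy nor these particular parametrizations appear in the article). Both vector functions below trace the parabola y = x^2, but only the first is smooth in the sense above; the second has r'(0) = (0, 0).

    import sympy as sp

    t = sp.symbols('t', real=True)

    def non_smooth_points(r):
        # Parameter values where every component of r'(t) vanishes at once,
        # i.e. where the "r'(t) is never zero" condition fails.
        zero_sets = [sp.solveset(sp.diff(comp, t), t, domain=sp.S.Reals)
                     for comp in r]
        common = zero_sets[0]
        for zs in zero_sets[1:]:
            common = common.intersect(zs)
        return common

    print(non_smooth_points([t, t**2]))     # EmptySet: smooth parametrization
    print(non_smooth_points([t**3, t**6]))  # {0}: same curve, not smooth at t = 0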

Second, the article is about using computer algebra systems to simplify expressions like
\sqrt{5 + 2 \sqrt{6}} = \sqrt{2} + \sqrt{3}

where the right-hand side is "simpler"; sometimes my students worry that they are not presenting their answer in the simplest form. While I'll accept any reasonably simple answer (unless the problem statement specifies a particular form), it is remarkably difficult to define what "simple" means.
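
(As an aside: modern computer algebra systems expose exactly this denesting step. Here is a one-liner using sympy, which is my choice of tool and not something the article discusses.)

    from sympy import sqrt, sqrtdenest

    print(sqrtdenest(sqrt(5 + 2*sqrt(6))))   # prints sqrt(2) + sqrt(3)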

One rule I have figured out, though, is that 4x - 4z - 8 = 0 should be simplified to x - z - 2 = 0 by dividing out the common factor. In general, given a polynomial with rational coefficients, one probably wants to multiply to clear out the denominators and then divide by any common integer factor of the new coefficients, so the resulting coefficients are relatively prime integers. The article addresses this sort of "canonicalization" in the context of nested radicals. I keep telling my students that they should keep that sort of thing in mind, especially since our tests will be mostly multiple-choice.
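
For what it's worth, that normalization is easy to spell out in code. Here is a short Python sketch (my own; the helper name canonicalize is made up): given the rational coefficients of a polynomial equation, clear the denominators and divide out any common factor, so the coefficients end up as relatively prime integers.

    from fractions import Fraction
    from math import gcd, lcm   # the multi-argument forms need Python 3.9+

    def canonicalize(coeffs):
        # Scale a list of rational coefficients so they become
        # relatively prime integers (assumes not all coefficients are zero).
        coeffs = [Fraction(c) for c in coeffs]
        m = lcm(*(c.denominator for c in coeffs))   # clear denominators
        ints = [int(c * m) for c in coeffs]
        g = gcd(*ints)                              # divide out common factor
        return [n // g for n in ints]

    print(canonicalize([4, -4, -8]))   # [1, -1, -2], i.e. x - z - 2 = 0
    print(canonicalize([Fraction(1, 2), Fraction(3, 4), -5]))   # [2, 3, -20]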

(Sometimes I'm tempted to define "simplest" as "requires the fewest symbols"... but how does one prove that some 100-character expression one has written can't be written in 99 characters? And how do you count something like "f(x, y) = (x+y)^{1/2} - (x-y)^{1/2}, where x = foo and y = bar"? ("foo" and "bar" are supposed to be very complicated expressions.) Do you plug foo and bar into the original equation and then count the characters, or do you count the actual characters that are between the quotation marks?)

11 comments:

Anonymous said...

Are you hyperventilating over your Phillies??

dimpase said...

In order to define a smooth curve (i.e. a smooth 1-dimensional manifold), just one parametrization (i.e. a 1-1 smooth map between an open set in R and an open subset of the curve) won't be enough. Think about a circle in R^2, say.

Anonymous said...

"Smooth" has a unique definition, from which all other descend. A function of any sort (say, a parametrized curve) is "smooth" if it has enough derivatives which are well-enough behaved for what I'm about to do to it.

Anonymous said...

I've a question for you to make sure I understand what's been said. If you have an n-dimensional thingie and it's smooth in less than n dimensions, then by John's definition it's smooth, yes?

Michael Lugo said...

M,

by John's definition, it depends on what you're about to do to it.

Theo said...

Smooth, in your sense of C^1, has a synthetic definition that doesn't depend on parameterization.

Let \gamma be a connected subset of the plane. It deserves to be called a _curve_ if at every point, there is an \epsilon-ball in the plane such that the points of \gamma within the epsilon ball, with the topology inherited from the plane, are homeomorphic to an open interval.

For us to call it a smooth curve, there should be a unique tangent line at each point.

What is a tangent line? Let x\in\gamma be a point, and \epsilon a small parameter; U = the epsilon ball around x. Let l be a line through x. The idea is to say that "the distance from l\cap U to \gamma\cap U is order \epsilon^2". In full:

\lim_{\epsilon\to 0} [ \sup_{y\in(\gamma\cap U)} \inf_{z\in(l\cap U)} d(z,y) ]/\epsilon = 0

should do it. Or, perhaps I should use a slightly stronger definition of distance; for instance the max of that \sup\inf and the \sup\inf with l and \gamma switched....

In any case, if you really believe in curves that are not parameterized, this should work.

Anonymous said...

My memory is that "smooth" was usually defined as "is a member of C-infinity". However, I did have several professors who would work with functions that were "reasonably smooth" (or some variation of that), which meant John's definition of "has enough continuous derivatives to not cause this proof any difficulties".

Anonymous said...

Flooey, C^\infty is definitely "enough derivatives" for pretty much anything. The only thing I haven't seen "smooth" used to sweep under the rug is analyticity. And actually being analytic can be a burden in real differential geometry.

Anonymous said...

Sorry about the 'm'; I hit the publish key too soon.

Anonymous said...

I think John's definition of smooth as "differentiable enough" might have some real meat to it. Unless you are really doing some kind of functional analysis over your manifolds, each proof will only involve a finite number of derivatives. In that sense, smooth (C^\infty) functions have the same sort of status as nonstandard infinitesimals, which exist because every first-order theorem in real analysis only involves a finite number of scales 1, 1/2, 1/3, 1/4, ... Now that I'm thinking about it, it seems strange that nonstandard numbers are considered exotic but smooth functions aren't!

Anonymous said...

Ramanujan made all sorts of whoopee with nested radicals, especially those that were infinitely so, like your expression for the golden ratio. Check out Hardy's collection of SR's papers and other works. These latter include his contributions to the problems page (not the Agony Aunt variety) of the Journal of Indian Mathematical Society, and provide early evidence of his mastery of the nested radical. Interestingly, Abdus Salam's first published work was a comment on one of these exercises in algebraic prestidigitation.