## 05 July 2009

### Problems that are hard for intermediate values of some parameter

In Clifford Henry Taubes' review of Monopoles and Three-Manifolds, by Peter Kronheimer and Tomasz Mrowka (citation information; article), near the end of the first paragraph Taubes mentions the problem of classifying compact manifolds with the homotopy type of the n-sphere. In every dimension there is exactly one, up to homeomorphism. The history of this problem is roughly as follows:

• n ≥ 5: Smale, 1960 (and Stallings at around the same time)

• n = 4: Freedman, 1982

• n = 3: Perelman, early 2000s (this is the Poincaré conjecture)

• n = 2: "follows from the Riemann mapping theorem"

• n = 1: "a nice exercise for an undergraduate"

So in low dimensions the problem is easy, or at least doesn't require "modern" apparatus; in high dimensions it's harder (Smale and Freedman both got Fields Medals); and in the "middle" dimensions, like 3, it's hardest of all, or at least took the longest. It's my understanding that it's pretty typical in geometry/topology for the three- or four-dimensional cases of a problem to be the most difficult.

Can you think of other problems (from any area of mathematics) that have a similar property -- that they're hardest for some medium-sized value of whatever a natural parameter for the problem is? (Yes, it's a vague question.)

Anonymous said...

Classifying exotic differentiable structures

Anonymous said...

I think the notion of dimension is really important here. The point is, if the dimension is too small, nothing can happen; if it's too big, anything can happen (and therefore there aren't as many interesting structures).

I'm not sure what other "natural parameters" would have that property. Smoothness maybe? Continuous functions on the one easy end, C^2-and-above on the other easy end, and not-quite-C^2 in the middle?

John said...

Donoho and Tanner have a recent paper discussing problems that get easier in higher dimensions.

Anonymous said...

Can you think of other problems (from any area of mathematics) that have a similar property -- that they're hardest for some medium-sized value of whatever a natural parameter for the problem is?

This comes up all the time with phase transitions. Typically there will be some physical parameter (temperature, density, etc.) where at both extremes the behavior is simple, but it's different at the two extremes, and the transition between them can be much more complicated. Determining precisely where the transition is, or what happens nearby, can be much harder than analyzing the behavior far from the transition.
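To make this concrete, here is a minimal site-percolation simulation in Python (everything below -- the grid sizes, probabilities, and trial counts -- is an illustrative choice of mine, not something from the comment above):

```python
import random

def percolates(grid):
    """Is there a top-to-bottom path of open sites (4-neighbor connectivity)?"""
    n = len(grid)
    stack = [(0, j) for j in range(n) if grid[0][j]]
    seen = set(stack)
    while stack:
        i, j = stack.pop()
        if i == n - 1:
            return True
        for ni, nj in ((i + 1, j), (i - 1, j), (i, j + 1), (i, j - 1)):
            if 0 <= ni < n and 0 <= nj < n and grid[ni][nj] and (ni, nj) not in seen:
                seen.add((ni, nj))
                stack.append((ni, nj))
    return False

def crossing_probability(n, p, trials=200, seed=0):
    """Estimate P(vertical crossing) for an n x n grid whose sites
    are open independently with probability p."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        grid = [[rng.random() < p for _ in range(n)] for _ in range(n)]
        hits += percolates(grid)
    return hits / trials

# Far below the square-lattice threshold (about 0.593) crossings are rare;
# far above it they are almost certain.  The narrow window around the
# threshold is where the hard behavior lives.
low = crossing_probability(20, 0.3)
high = crossing_probability(20, 0.8)
```

Sweeping p through the middle range shows the crossing probability jumping from near 0 to near 1 over a narrow window, which is exactly the "simple at both extremes, complicated at the transition" behavior described above.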

Suresh Venkatasubramanian said...

There's a neat Erdős problem: what's the maximum number of pairs of points that can be a unit distance apart, over all point sets of size n?

The problem is easy in 1D (the answer is linear in n) and in dimensions 4 and up (quadratic). Figuring out the right bound in 2D (currently n^{4/3}) and 3D (currently n^{3/2}) is a vexing problem in combinatorial geometry.
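For any particular finite point set the count itself is easy to compute by brute force; here is a small Python sketch (the example configurations are illustrative, not extremal ones from the literature):

```python
from itertools import combinations
from math import dist, isclose  # math.dist requires Python 3.8+

def unit_pairs(points, tol=1e-9):
    """Count pairs of points at Euclidean distance 1 (up to tol)."""
    return sum(1 for p, q in combinations(points, 2)
               if isclose(dist(p, q), 1.0, abs_tol=tol))

# Vertices of a unit square: the 4 sides have length 1, the diagonals sqrt(2).
square = [(0, 0), (1, 0), (0, 1), (1, 1)]
print(unit_pairs(square))  # 4

# An equilateral triangle gets 3 unit pairs from only 3 points.
triangle = [(0, 0), (1, 0), (0.5, 3 ** 0.5 / 2)]
print(unit_pairs(triangle))  # 3
```

The hard part of the Erdős problem is of course the maximization over all n-point sets, which brute force cannot touch.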

Mark Dominus said...

I'm not sure this is exactly what you're looking for, but the kissing number has been known in 8 and 24 dimensions for some time, was resolved in 4 dimensions only quite recently, and is still open for all n > 4 *except* 8 and 24.

Qiaochu Yuan said...

The kissing number problem is an interesting example. The reason dimensions 8 and 24 are solved is due to the existence of exceptional structures: the Leech lattice, the octonions, and so forth. On the other hand, one reason to single out dimensions 3 and 4 geometrically is the existence of exceptional structures in those dimensions: the dodecahedron, icosahedron, 24-cell, 120-cell, and 600-cell. And morally these (and other exceptional structures in mathematics) are closely related.

Perhaps exceptional structures recur in different branches of mathematics for the same underlying reason, related to this "intermediate parameter" idea: in low-parameter regimes one expects local obstructions, and in high-parameter regimes one expects global obstructions, or something like that. For example, any sufficiently small finite simple group must be abelian, and any sufficiently large finite simple group must be non-sporadic.

AA said...

Off the top of my head I can think of the n-body problem (which has been dealt with for the n = 2 and n = 3 cases but not truly for n > 3) and the small-world property emerging in some graphs.

This is in fact close to what an anonymous commenter posted above... it's a phase transition.

The beta network model of Watts and Strogatz can produce random networks ranging from an ordered lattice to a completely random structure, depending on a parameter p. When p = 0 you get a lattice network; when p = 1 you get a random network. Networks produced for intermediate values of p had not been studied until relatively recently (1998), and they led to the realisation of the small-world property through an investigation of the clustering coefficient and mean path length statistics.
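A pure-Python sketch of this construction and of the two statistics mentioned (the values of n, k, and p below are illustrative, and the rewiring step is a common simplified variant rather than the exact procedure of the 1998 paper):

```python
import random
from collections import deque

def watts_strogatz(n, k, p, seed=0):
    """Ring lattice on n nodes, each joined to its k nearest neighbors
    (k even), with each edge rewired independently with probability p."""
    rng = random.Random(seed)
    adj = {v: set() for v in range(n)}
    edges = [(v, (v + j) % n) for v in range(n) for j in range(1, k // 2 + 1)]
    for u, v in edges:
        if rng.random() < p:
            # Rewire: replace v with a random node not already adjacent to u.
            v = rng.choice([w for w in range(n) if w != u and w not in adj[u]])
        adj[u].add(v)
        adj[v].add(u)
    return adj

def clustering(adj):
    """Average local clustering coefficient."""
    total = 0.0
    for v, nbrs in adj.items():
        d = len(nbrs)
        if d < 2:
            continue
        links = sum(1 for u in nbrs for w in nbrs if u < w and w in adj[u])
        total += 2 * links / (d * (d - 1))
    return total / len(adj)

def mean_path_length(adj):
    """Mean shortest-path length over reachable pairs (BFS from each node)."""
    total, pairs = 0, 0
    for s in adj:
        distances = {s: 0}
        queue = deque([s])
        while queue:
            v = queue.popleft()
            for w in adj[v]:
                if w not in distances:
                    distances[w] = distances[v] + 1
                    queue.append(w)
        total += sum(distances.values())
        pairs += len(distances) - 1
    return total / pairs

# p = 0: high clustering, long paths.  Small intermediate p: clustering
# stays high while path lengths collapse -- the small-world regime.
lattice = watts_strogatz(100, 6, 0.0)
small_world = watts_strogatz(100, 6, 0.1)
```

Even at p = 0.1, the handful of rewired "shortcut" edges cuts the mean path length dramatically while barely denting the clustering coefficient, which is the small-world phenomenon in a nutshell.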

Flooey said...

Possibly the classification of all finite simple groups? My memory is that the early proofs centered around either classifying all finite simple groups of order less than some N, or classifying all finite simple groups of order more than some other N, and that the final proofs filled in the middle values until the two bounds met.

Qiaochu Yuan said...

While we're on the subject of the classification, Ars Mathematica has a post I think is relevant. JacquesC's comment about "room" is opposite, but I think complementary, to what I wrote. It all depends on whether you think the exceptional structures or the asymptotic structures are the interesting ones. (Litmus test: how do you feel about the exceptional outer automorphism of S_6?)

pciszek said...

I remember noticing this phenomenon in statistical mechanics; it was always easier to prove from first principles that various real-world phenomena would happen in 2 or 4 dimensions than in 3.

So then I see all this string theory stuff about how the universe started with 11 dimensions and quickly collapsed down to 3 of space and 1 of time. Is there any possible link here? That the universe had to stop collapsing at three dimensions because a three-dimensional universe is too complex?

Anonymous said...

pciszek: as I understand it, string theory has no good explanation for why we ended up with 3 spatial dimensions instead of 2 or 4 or some other number. It's just one of many (about 10^500) parameters of a theory that was originally supposed to have at most one parameter.

harrison said...

This is only tangentially related, since the problem isn't really much harder in intermediate dimensions, but of course dimensions 3 and 4 have exceptional regular polytopes. (Which I just now see Qiaochu already mentioned).

A natural example of a phase transition occurs, of course, in the theory of random graphs. For instance, if the number of edges is large compared to n log n, then the graph is almost certainly Hamiltonian, and if it's small compared to n log n, almost certainly not; in a range around n log n, though, not much can be said.
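For very small graphs the transition can be watched directly; here is a Python sketch (the vertex count, edge counts, and trial counts are arbitrary small values I chose so that the exponential backtracking search stays fast -- this is a toy experiment, not a method from the comment):

```python
import random
from itertools import combinations

def hamiltonian(adj, n):
    """Backtracking search for a Hamiltonian cycle through vertex 0."""
    if any(len(adj[v]) < 2 for v in range(n)):
        return False          # quick necessary condition: min degree 2
    def extend(path, used):
        v = path[-1]
        if len(path) == n:
            return 0 in adj[v]          # close the cycle
        return any(extend(path + [w], used | {w})
                   for w in adj[v] if w not in used)
    return extend([0], {0})

def hamiltonian_probability(n, m, trials=100, seed=0):
    """Estimate P(G(n, m) is Hamiltonian) by sampling random graphs
    with n vertices and m edges."""
    rng = random.Random(seed)
    all_edges = list(combinations(range(n), 2))
    hits = 0
    for _ in range(trials):
        adj = {v: set() for v in range(n)}
        for u, v in rng.sample(all_edges, m):
            adj[u].add(v)
            adj[v].add(u)
        hits += hamiltonian(adj, n)
    return hits / trials

sparse = hamiltonian_probability(12, 15)   # well below the threshold
dense = hamiltonian_probability(12, 50)    # well above it
```

Sampling edge counts between these two extremes shows the probability climbing from near 0 to near 1, and it is that intermediate window where the precise behavior is hard to pin down.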

Would the dichotomy between structure and randomness count as a "meta" example? In combinatorics, we often know how to deal with objects that are very structured and with objects that are "random," but you need powerful tools such as the regularity lemma to deal with objects that are neither fully structured nor (pseudo)random.

Aaron said...

Hehehehe... here's a problem that fits the letter of the challenge, if not the spirit.

Problem: Find a set with cardinality X.

Easy for X <= |N|, easy for X = |R|, hard in between. ;)

Unknown said...

Loop-erased random walk on Z^d is a good example. It's known what the scaling limit is for d = 2 (SLE_2), d ≥ 5 (Brownian motion), and d = 4 (Brownian motion with a scaling factor), but not for d = 3. The high-dimensional cases were handled by studying the intersections of random walks, but completely new techniques were required for d = 2, and completely new techniques will likely be required for d = 3.
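Chronological loop erasure is simple to implement; here is a sketch on Z^2 (the dimension and step count are illustrative -- the interesting open case d = 3 just needs a third coordinate):

```python
import random

def loop_erased_walk(steps, seed=0):
    """Run a simple random walk on Z^2 for `steps` steps, erasing each
    loop at the moment it is created: when the walk revisits a site
    already on the current path, everything after that site's first
    visit is deleted."""
    rng = random.Random(seed)
    path = [(0, 0)]
    index = {(0, 0): 0}          # site -> its position on the current path
    for _ in range(steps):
        x, y = path[-1]
        dx, dy = rng.choice(((1, 0), (-1, 0), (0, 1), (0, -1)))
        site = (x + dx, y + dy)
        if site in index:
            # Erase the loop: truncate back to the first visit of `site`.
            i = index[site]
            for erased in path[i + 1:]:
                del index[erased]
            del path[i + 1:]
        else:
            index[site] = len(path)
            path.append(site)
    return path

path = loop_erased_walk(10_000)
```

By construction the returned path is self-avoiding; the hard mathematics is in what this path looks like as the lattice spacing goes to zero.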