How many solutions does a typical quadratic equation have? Discuss.
(I'm deliberately leaving the problem vague, because it's more interesting that way.)
31 January 2009
28 January 2009
Fields medal birthdate quirk?
A comment at this post from the Secret Blogging Seminar, signed "estraven", said that since the Fields Medal is awarded every four years, and only to people under 40, birthdate modulo 4 is relevant.
There's an easy explanation, if true -- assuming that people get something done in their late thirties, a 39-year-old is more likely to have done Fields-worthy work than a 36-year-old. It's our version of the effect that Malcolm Gladwell talked about in Outliers: The Story of Success. There, he points out that the cutoff for most junior hockey leagues in Canada is January 1, and as a result players born earlier in the year are more likely to be selected for teams since they're older than their competition; thus they get better instruction and more practice, and as a result a surprisingly large proportion of players in the highest-level leagues are born early in the year. Something like two-thirds of players in the highest Canadian junior hockey leagues are born between January and April. (You wouldn't see this in the NHL, I suspect, because not all their players are from Canada.)
But I don't feel like doing the research to determine if birthdate modulo 4 actually has an effect on Fields Medal winning. If you feel like it, I'd like to know the results.
27 January 2009
Universality theory for cranks
From the geomblog, in 2004: a meta-proof of P = NP (or P ≠ NP). With very slight modifications this could be a meta-proof of the Riemann hypothesis, or any other outstanding open problem in mathematics, theoretical CS, theoretical physics, or other heavily mathematical fields. The cranks work in roughly the same way regardless of the specific question.
26 January 2009
Probabilistic fun with the n-sphere
On reddit: a link to the old chestnut that high-dimensional spheres are weird. If you consider the volume of the unit ball in R^n as a function of n, it increases up to n = 5 and then decreases. The volume is π^(n/2)/(n/2)!. (By the way, I generally use the factorial notation, not the Γ notation, even when the argument isn't an integer.)
But I hear you complaining that it doesn't make sense to compare volumes in different dimensions! Fair enough. Compare the volume of the unit ball in R^n to the cube circumscribing it, which has volume 2^n. Then the portion of the cube which is inside the ball is f(n) = π^(n/2)/((n/2)! 2^n). This is rapidly decreasing with n. For n = 2, it's π/4 -- the area of the unit disc is π, and it can be inscribed in a square of area 4. For n = 3, the unit ball has volume 4π/3 and it's inscribed in a cube of volume 8, so we get f(3) = π/6. But f(n) decreases superexponentially: f(10) is about 0.0025, and f(20) is about 25 in a billion.
I was surprised that it decayed that quickly -- I'd never bothered to work it out. But if you think about it probabilistically, it kind of makes sense. Namely, a random point in the unit n-cube can be identified with its coordinates (x_1, x_2, ..., x_n). It's in the unit n-ball if and only if the sum of the squares of those coordinates is less than 1. Let y_i = x_i^2 -- then y_i has mean 1/3 and variance 4/45. (That's calculus.) So the sum of the squares of the coordinates is a sum of n such independent random variables, and is thus itself a random variable with mean n/3 and variance 4n/45 -- no surprise, then, that for large n almost all of its mass is at values greater than 1. One could probably use large deviation inequalities to quantify this, but come on, it's 11 at night and I have real work to do.
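To see the decay concretely, here's a quick sketch (function names are mine) comparing the exact fraction π^(n/2)/((n/2)! 2^n) against a Monte Carlo estimate from random points in the cube:

```python
import math
import random

def exact_fraction(n):
    # pi^(n/2) / ((n/2)! * 2^n); math.gamma handles the half-integer factorial
    return math.pi ** (n / 2) / (math.gamma(n / 2 + 1) * 2 ** n)

def monte_carlo_fraction(n, trials=100_000, seed=0):
    # Fraction of random points of the cube [-1,1]^n landing inside the unit ball.
    rng = random.Random(seed)
    hits = sum(
        sum(rng.uniform(-1, 1) ** 2 for _ in range(n)) < 1
        for _ in range(trials)
    )
    return hits / trials

for n in (2, 3, 10, 20):
    print(n, exact_fraction(n), monte_carlo_fraction(n))
```

For n = 20 the Monte Carlo estimate is almost surely 0 with only 10^5 samples, which itself illustrates how fast the fraction decays.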
Obama's Erdos number
Does anybody know what Barack Obama's Erdos number is? (In particular, is it even defined?)
I learned a few days ago that if the links are people who know each other, my distance from Barack Obama is at most four -- Joe Biden went to high school with the father of a friend of mine. (Yes, this only proves that I'm at distance at most three from Biden; proving that I'm at distance at most four from Obama is left as an exercise for the reader.)
Then again, I'm not sure if Obama even has academic publications, or where I'd look to find out if he did. The relevant Wikipedia article says "he published no legal scholarship", citing this New York Times article, though.
25 January 2009
Existence of bijections?
I have two counting sequences that I know to be the same. That is, I have objects of type A and of type B, which each have a size, and I know that the number of A-objects of size n and the number of B-objects of size n are the same. But I am struggling to find a bijection between the things they count.
Are there results of the form "sure, the two counting sequences are the same, but there's no `natural' bijection between the corresponding sets?" (Obviously bijections will always exist between sets of the same size, so this would of course depend on the definition of the word "natural" -- perhaps a meaning analogous to the category-theoretic one is what I'm looking for, perhaps not.) I've never seen any, but most of the people I know are more interested in finding the bijections than in proving there aren't any.
Finite fields
From everything.com: what are finite fields, anyway? From this I learned that "number sets grow when a mathematician and an equation love each other very much." And it's a pretty good exposition of the construction of a finite field, too.
22 January 2009
This Week's Finds has a feed
Something that may not be well-known (I didn't know it): This Week's Finds in Mathematical Physics, John Baez' long-running series of pointers to things he found interesting and commentary thereon (so it's sort of a blog, except not) has an RSS feed. That's nice because I don't remember to check for new issues. (And no, it's not a weekly column; Baez writes it when he gets around to it, but when he does it's about things he found recently, so the name is still apt.)
Tall people are smarter?
Men are more intelligent than women because they're taller, say psychologists Satoshi Kanazawa and Diane J. Reyniers.
I don't feel like picking this one apart. Have fun.
Reverse bibliography
Something that would be useful: if I could flip through the bibliography of a book and see whether it cites papers I recognize, and the bibliography indicated where in the book each paper was cited. That way I could tell where the book uses work I'm already familiar with. (If I recall correctly, Graham, Knuth, and Patashnik do such a thing in their book Concrete Mathematics -- each entry in their bibliography lists the pages on which the cited work is referred to.)
This would be more useful for books than papers, because:
1. books are longer;
2. I'm more likely to be reading a book in dead-tree form. I usually read papers on-screen (since most of the papers I read are downloaded, and I don't print them out because I don't like drowning in paper), and most of the time the text is searchable.
Two to the sixty-fourth minus one
I noticed yesterday that the number 2^64 - 1 appears at least twice in the mathematical folklore.
The first place is in the Tower of Hanoi puzzle. This puzzle is as follows: you are given n disks, all of different sizes, with holes on them, and three posts. The disks are originally stacked on one of the posts in order of size, with the small disks on the top and the large disks on the bottom. Your job is to move the disks to one of the other posts, subject to the following constraints:
1. you can only move one disk at a time;
2. the disks on any post, at any time, must be in order of size.
This can be solved in 2^n - 1 moves. The solution is given by a nice recursion. To move n disks from post A to post B, first move the smallest n-1 disks from post A to post C. Then move the largest disk from post A to post B. Then move the smallest n-1 disks from post C to post B. Thus moving n disks takes twice as many moves as moving n-1 disks, plus 1. Moving a stack consisting of one disk of course takes one move; solving the recursion gives the answer.
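The recursion translates directly into code; a minimal sketch (function name mine) that returns the full list of moves:

```python
def hanoi(n, src="A", dst="B", spare="C"):
    """List the moves transferring n disks from src to dst."""
    if n == 0:
        return []
    return (hanoi(n - 1, src, spare, dst)     # clear the top n-1 disks out of the way
            + [(src, dst)]                    # move the largest disk
            + hanoi(n - 1, spare, dst, src))  # put the n-1 disks back on top

moves = hanoi(4)
print(len(moves))  # 2^4 - 1 = 15
```

The length of the move list satisfies exactly the recurrence above, so it comes out to 2^n - 1.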
There's a story that goes along with this. It involves some monks in some temple somewhere whose job is, basically, to do this for n = 64. When they finish the universe will end. If they can make one move per second, this takes 2^64 - 1 seconds, about six hundred billion years, which is suspiciously close to the actual expected lifespan of the universe for such a crude model...
The other place that 2^64 - 1 occurs is in the story about the invention of chess. Supposedly the ruler of the land where the inventor of chess lived really liked chess, and he offered the inventor a reward. The inventor of chess said that he wanted one grain of wheat placed on the first square of the chessboard, two on the second square, four on the third square, and so on, each square having double the number of grains of the square before. The total number of grains works out to be 2^64 - 1, which is a lot. Wikipedia has an article on this. A grain of wheat is about 50 milligrams; this works out to about 15,000 kilograms of wheat for every person who has ever lived. (I'm assuming sixty billion people have ever lived, which I have no evidence for but it's a number I've heard.) If we assume an average historical lifespan of 40 years, that's one kilogram per day per person. In other words, you could approximately feed everyone ever for their whole lives with this amount of wheat. It's a lot.
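The back-of-the-envelope arithmetic, under the same assumptions (50 mg per grain, sixty billion people, 40-year lifespans):

```python
grains = 2 ** 64 - 1
total_kg = grains * 50 / 1e6   # 50 mg per grain (assumed), converted to kg
people = 60e9                  # assumed number of people who have ever lived
lifespan_days = 40 * 365       # assumed average historical lifespan, in days

kg_per_person = total_kg / people
kg_per_person_per_day = kg_per_person / lifespan_days
print(round(kg_per_person), round(kg_per_person_per_day, 2))
```

This gives roughly 15,000 kg per person, or about a kilogram per person per day, matching the estimate above.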
The square root of 3 and mock theta functions. (No, they're not connected.)
Mark Dominus writes about why it really isn't so mysterious that Archimedes had the approximation √3 ≈ 265/153 (which is correct to four decimal places). Apparently historians of mathematics have been mystified by this; Dominus suggests that's because the historians themselves weren't so good at arithmetic. Many of these historians date from the late 19th and early 20th centuries, after mathematics had generally turned more abstract but before computers existed, so it's plausible. Dominus points out that tabulating n^2 and 3n^2 for the first few hundred integers would be enough. It might even be enough to go up to 100 or so, observing where n^2 and 3m^2 are close to each other (which gives an approximation √3 ≈ n/m), guess the pattern (it comes from the continued fraction of √3, but Archimedes didn't need to know that), and extrapolate. If I were a historian I'd have something serious and insightful to say about this.
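Here's the table-scanning idea as a sketch: for each denominator m, look at the n whose square first reaches 3m^2, and keep the pairs where n^2 and 3m^2 differ by at most 2. Each hit gives a good rational approximation n/m to √3.

```python
# For each m, find the n with n^2 closest to 3*m^2 from above (as one could
# do with a table of squares); where |n^2 - 3 m^2| <= 2, record n/m.
hits = []
for m in range(1, 200):
    n = 1
    while n * n < 3 * m * m - 2:
        n += 1
    for cand in (n, n + 1):
        if abs(cand * cand - 3 * m * m) <= 2:
            hits.append((cand, m))

print(hits[:8])
print((265, 153) in hits)
```

The hits reproduce the continued-fraction convergents of √3, including Archimedes' 265/153 (where 265^2 - 3·153^2 = -2).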
A more general question: if you're trying to work out the history of mathematics by examining the original sources, how important is it to be a good mathematician? I saw a lecture by George Andrews last week on Ramanujan's lost notebook; he and Bruce Berndt are working on an edited version of it (first volume, second volume, review of first volume in the October 2006 Bulletin of the AMS). Andrews happened to be looking through some papers at the library of Trinity College, Cambridge, when he came across these papers. The manuscript conventionally called "Ramanujan's lost notebook" consists of many pages of formulas and almost no words and is concerned with mock theta functions; Andrews claims that he would not have recognized the significance of what he was looking at had he not written a PhD thesis on mock theta functions.
Labels:
Andrews,
Archimedes,
history of mathematics,
Mark Dominus,
number theory
20 January 2009
Ladies and gentlemen, the 44th President of the United States of America
I will let a more eloquent man speak for me.
We will build the roads and bridges, the electric grids and digital lines that feed our commerce and bind us together. We will restore science to its rightful place, and wield technology's wonders to raise health care's quality and lower its cost. We will harness the sun and the winds and the soil to fuel our cars and run our factories. And we will transform our schools and colleges and universities to meet the demands of a new age. All this we can do. And all this we will do.
- Barack Obama, from his inaugural address.
19 January 2009
Back-to-back days off for some
Ken Jennings points out that today and tomorrow are both federal holidays (tomorrow only in the Washington, DC area), meaning that employees of the US federal government who work in the DC area will get the day off. Have there ever been two consecutive federal holidays before?
For those who don't know the schedule of holidays: the third Monday in January (that's today) is Martin Luther King, Jr. Day. January 20, in years with number one more than a multiple of four (that's tomorrow), is Inauguration Day. The reason that federal employees in the DC area get it off, if I understand correctly, is to keep the traffic down. (Not that it'll help tomorrow; from what I understand Washington will still be a mess.) As Ken Jennings points out, these days are consecutive if January 20 falls on a Sunday or a Tuesday.
Now, a presidential term is a whole number of weeks (208, to be exact) and five days long. (How do I know this? A common year is one day longer than a whole number of weeks; a leap year is two days longer than a whole number of weeks; thus a presidential term, consisting of three common years and a leap year, has five "extra" days.) So we can work backwards. The 2009 inauguration is on a Tuesday; the 2005 inauguration was on a Thursday. The 2001 inauguration was on a Saturday (which I could have told you anyway; I got in a car accident that day and remember the circumstances pretty well). 1997 was a Monday, 1993 was a Wednesday, 1989 was a Friday, and 1985 was a Sunday.
But Martin Luther King Day wasn't observed for the first time until 1986. So the answer is that Martin Luther King Day and the inauguration have never fallen on consecutive days. The pattern of when they do is kind of complicated, because leap years are periodic with period 400. But the 2013 inauguration falls on a Sunday; 2037 and 2041 are a Tuesday and Sunday, respectively; and most of the time these come in pairs; a Tuesday inauguration is followed by a Sunday inauguration. (The end of a century could break this pattern.)
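The weekday bookkeeping is easy to check with Python's datetime module (function names are mine):

```python
import datetime

def inauguration_weekday(year):
    # Inauguration Day: January 20, in years one more than a multiple of 4.
    return datetime.date(year, 1, 20).strftime("%A")

def mlk_day(year):
    # Martin Luther King, Jr. Day: the third Monday in January.
    d = datetime.date(year, 1, 1)
    d += datetime.timedelta((7 - d.weekday()) % 7)  # advance to the first Monday
    return d + datetime.timedelta(14)

for y in (1985, 1989, 1993, 1997, 2001, 2005, 2009, 2013):
    consecutive = (datetime.date(y, 1, 20) - mlk_day(y)).days in (-1, 1)
    print(y, inauguration_weekday(y), "consecutive" if consecutive else "")
```

This confirms the pattern: the two days are adjacent exactly when January 20 falls on a Sunday or a Tuesday, as in 2009 and 2013.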
However, the answer to Ken Jennings' actual question is yes, because of an obscure piece of trivia I just remembered: January 2, 2007 was a federal holiday, an official day of mourning for President Ford. (For some reason I remember checking my mail and being surprised there was none. This is strange, because there are lots of days where mail is delivered and I don't get any.) January 1, 2007, was of course New Year's Day. And December 31, 2006 was a Sunday, so there was actually no mail for three days.
Newton's method fractals
Simon Tatham has produced fractals derived from Newton-Raphson iteration which are quite interesting to look at.
The main idea here is that if you want to find a root of some function f, you start from a guess a_0; then compute a_1 = a_0 - f(a_0)/f'(a_0); geometrically this corresponds to replacing the graph of the function with its tangent line at (a_0, f(a_0)) and finding the root of the tangent line. Then starting from a_1 we find a_2, and so on. If you're already close to a root you'll get closer. But if you're far away from a root unexpected things can happen; the set of all starting points a_0 for which the sequence (a_0, a_1, a_2, a_3, ...) converges to a given root of f is fractal.
(I've mentioned this before, and so has John Armstrong, but Tatham's pictures are better.)
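A minimal sketch of the iteration (names mine); applied to z^3 - 1 in the complex plane, the root you converge to depends sensitively on the starting point, which is exactly what the fractal pictures color:

```python
def newton(f, fprime, a0, steps=60):
    # Iterate a -> a - f(a)/f'(a) starting from a0.
    a = a0
    for _ in range(steps):
        a = a - f(a) / fprime(a)
    return a

f = lambda z: z ** 3 - 1
fp = lambda z: 3 * z ** 2

print(newton(f, fp, 2.0))      # a real start converges to the real root 1
print(newton(f, fp, -1 + 1j))  # a complex start finds a complex cube root of 1
```

Coloring each starting point by which of the three cube roots of unity it reaches produces the familiar three-lobed Newton fractal.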
17 January 2009
Mathematics Illuminated
While at the laundromat today, I saw an episode of Mathematics Illuminated, about game theory. This is a series of 13 half-hour episodes on "major themes in the field of mathematics"; the game theory episode covered Nash equilibria, the prisoner's dilemma, evolutionarily stable strategies, etc. (I may be leaving out some things, because there was laundry-machine noise.) Their intended audience seems to be high school teachers and interested but perhaps mathematically unsophisticated adult learners.
It appears you can watch the whole series online. The main mathematician involved is Dan Rockmore of Dartmouth.
(And no, I don't know what channel it was on. Like I said, it wasn't my TV.)
15 January 2009
Sestinas
Best mathematical poem ever: S|{e,s,t,i,n,a}|, by Caleb Emmons. This is a sestina that explains what a sestina is, using the language of the symmetric group. (A sestina is a certain highly structured kind of poem. I won't say more. See the Wikipedia article on them if you have trouble with the poem.)
(I actually had a longer post about the mathematics of sestinas written, but the computer ate it. Oh well.)
The synonym-following game
An interesting random fact: form a graph where the vertices are words that the dictionary says have at least one synonym, and the edges are synonym pairs. Then the resulting graph has a giant component. In particular, if "the dictionary" is Merriam-Webster's dictionary, there are 23,279 words that have at least one synonym, and the resulting graph has a component of size 22,311. So if you've ever played that game of following synonyms in a dictionary and ending up at words that seem to have nothing to do with where you started, this is why. The graph also has a clustering coefficient of 0.7. The clustering coefficient is the probability that if we pick a vertex (word) u uniformly at random, and then pick two of its neighbors (synonyms) v and w uniformly at random, then v and w are neighbors (synonyms). It's not surprising this is high for the dictionary network; it seems consistent with synonyms being words that are "near" each other in some "semantic space". I'm also kind of curious whether the results differ between dictionaries; a dictionary that's less aggressive in declaring things "synonyms" might not show this behavior, and in particular I suspect there's a critical point at which small perturbations of aggressiveness lead to large perturbations in the size of the giant component.
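Both statistics are easy to compute for small graphs; here's a sketch in pure Python (the toy "dictionary" is my own invention) using breadth-first search for components and the standard average local clustering coefficient:

```python
from collections import deque

def component_sizes(adj):
    # Sizes of connected components, found by breadth-first search.
    seen, sizes = set(), []
    for start in adj:
        if start in seen:
            continue
        queue, size = deque([start]), 0
        seen.add(start)
        while queue:
            u = queue.popleft()
            size += 1
            for v in adj[u]:
                if v not in seen:
                    seen.add(v)
                    queue.append(v)
        sizes.append(size)
    return sorted(sizes, reverse=True)

def clustering_coefficient(adj):
    # Average, over vertices of degree >= 2, of the fraction of
    # neighbor pairs that are themselves adjacent.
    total, count = 0.0, 0
    for u in adj:
        nbrs = sorted(adj[u])
        d = len(nbrs)
        if d < 2:
            continue
        links = sum(1 for i in range(d) for j in range(i + 1, d)
                    if nbrs[j] in adj[nbrs[i]])
        total += 2 * links / (d * (d - 1))
        count += 1
    return total / count if count else 0.0

# A toy synonym graph: one triangle plus one isolated synonym pair.
synonyms = {
    "big": {"large", "huge"}, "large": {"big", "huge"}, "huge": {"big", "large"},
    "tiny": {"small"}, "small": {"tiny"},
}
print(component_sizes(synonyms))         # [3, 2]
print(clustering_coefficient(synonyms))  # 1.0
```

On the real Merriam-Webster data this would report the giant component of size 22,311 and a clustering coefficient near 0.7.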
I'm paraphrasing this from "Statistical mechanics of complex networks", by Reka Albert and Albert-Laszlo Barabasi (cond-mat/0106096); apparently it comes from an unpublished manuscript of Yook, Jeong, and Barabasi. (The article, from 2001, called it a "preprint" but I can't find anything with that set of authors that fits the description. Also, does anybody else find the habit of not including titles of articles in citations supremely annoying? There are actually two "preprints" by that three-author set cited in this article, both from 2001; these are distinguished only as "2001a" and "2001b".) If you actually point me to this paper (or a similar study done by someone else) I'll appreciate it and will publicly thank you.
13 January 2009
Mathematicians on roofs
I am a mathematician, and I would like to stand on your roof (video, 17 minutes). Ron Eglash talks about African fractals. It seems that fractals of one sort or another show up naturally in designs used both decoratively and functionally in various modern African cultures. (As far as I can tell from the video, Eglash is not claiming this is something unique about Africa. I'm curious if he addresses this question in his book, African Fractals: Modern Computing and Indigenous Design, which I have not yet read because I didn't know it existed until this morning.) The title of this post comes about because the best way to see how a village is designed is to stand on the roof of the tallest building.
12 January 2009
Sociology of mathematicians?
Has anybody seriously looked at mathematicians from an anthropological or sociological perspective? I was talking to some sociologists last night, and apparently various people coming from the disciplines that study people have looked at biologists in this way; this got me wondering.
09 January 2009
Meta-disclaimers
From Michael Filaseta's home page:
DISCLAIMER: "The views and opinions expressed in this page are strictly those of the page author. The contents of the page have not been reviewed or approved by the University of South Carolina."
DISCLAIMER OF DISCLAIMER: "The views and opinions expressed in the disclaimer above are strictly those of another entity different from the author of this page. Its presence on this page does not imply approval of its contents by the page author. The page author has however provided a link to a review of the above disclaimer."
Thanks, Kate!
Spiegelhalter on "risk literacy"
Probability lessons may teach children how to weigh life’s odds and be winners, from The Times (London), January 5. David Spiegelhalter, of Cambridge University, claims that people need a better understanding of probability and statistics -- not in order to do mathematics, but to help them out in everyday life. He describes what needs to be taught as "risk literacy".
I agree. Much of the curriculum in US schools (the article is about British schools, but from what I can tell this is true there as well) basically seems to be set up so that students will be prepared to learn calculus either at the end of high school or right at the beginning of college. But there's so much more than calculus. And so many of the numbers people run into every day are probabilities or statistics of some sort.
07 January 2009
Dictionaries -- not just for looking up words!
Robert W. Jernigan, A photographic view of cumulative distribution functions, Journal of Statistics Education Volume 16, Number 1 (2008). via reddit. Many dictionaries have little squares on the outside of the page, positioned according to the location of the words on that page in the alphabet. These are an approximation of the cumulative distribution function of the first letter of a randomly selected word. (I say an approximation; this would be exactly true if words with each first letter had equally long definitions on average.)
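The thumb-index squares can be simulated directly: tally first letters, then accumulate the fractions. Here is a minimal sketch with a tiny made-up word list (a real dictionary's word list would be substituted in practice):

```python
# Sketch: the squares on a dictionary's fore edge approximate the empirical
# CDF of the first letter of a randomly chosen word. Toy word list below is
# illustrative only.
from collections import Counter
from itertools import accumulate
from string import ascii_lowercase

words = ["apple", "axiom", "bear", "cat", "cut", "dog", "zebra"]
counts = Counter(w[0] for w in words)
total = len(words)

# Cumulative fraction of words whose first letter is <= each letter a..z.
cdf = dict(zip(ascii_lowercase,
               accumulate(counts.get(c, 0) / total for c in ascii_lowercase)))
print(round(cdf["b"], 3))  # → 0.429, i.e. 3 of the 7 words start with 'a' or 'b'
```

Plotting `cdf` against the alphabet would reproduce the staircase that the squares trace down the edge of the book; unequal definition lengths per letter are what make the physical version only an approximation.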
Jernigan also has a blog, statpics, which is "devoted to images that illustrate statistical ideas"; not surprisingly, he wrote about this.
06 January 2009
Hard Problems: documentary about the 2006 IMO
There is a documentary called Hard Problems: The Road to the World's Toughest Math Contest, available from amazon.com, which focuses on the 2006 United States team in the International Mathematical Olympiad. I was interested to learn of its existence. I'm inclined to view it as a cinematic analogue of Count Down: Six Kids Vie for Glory at the World's Toughest Math Competition, a book about the US team at the 2001 IMO which I rather liked. (Of course, it's possible that my soft spot for the book exists because I like to think I could have been on the 2001 IMO team.)
Pólya and Szegö take a dim view of authority
George Pólya and Gabor Szegö, in Problems and Theorems in Analysis I: Series, Integral Calculus, Theory of Functions, define the "maximum term" of a power series with all coefficients p_i positive,
p_0 + p_1 x + p_2 x^2 + ...,
as the term for which p_i x^i is largest. (Obviously this depends on x.) They then ask:
[Part 1, Problem] 120. [Prove that] the subscript of the maximum term increases as x increases. (One might consider this situation as somewhat unusual: in the course of successive changes the position of maximum importance is held by more and more capable individuals.)
Problem 119 is to prove that for an everywhere convergent power series which is not a polynomial, the subscript of the maximal term goes to ∞ with x. Pólya and Szegö offer no commentary.
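A quick numerical check (not, of course, a proof) is easy for a concrete series. Take exp(x) = sum of x^n/n!, so p_n = 1/n!; comparing terms via logarithms avoids overflow:

```python
# Numerical illustration of Problem 120 for exp(x) = sum_n x^n / n!:
# the subscript n(x) of the maximum term p_n x^n should be nondecreasing in x.
from math import log, lgamma

def max_term_index(x, nmax=200):
    # log of p_n x^n with p_n = 1/n! is n*log(x) - log(n!); compare those.
    return max(range(nmax + 1), key=lambda n: n * log(x) - lgamma(n + 1))

xs = (0.5, 1.5, 3.2, 9.7, 24.3)
indices = [max_term_index(x) for x in xs]
print(indices)  # nondecreasing; for exp the maximal subscript is floor(x)
assert indices == sorted(indices)
```

For this series the check can also be done by hand: the ratio of consecutive terms is x/n, which exceeds 1 exactly when n < x, so the maximum term sits at n = floor(x), visibly increasing with x.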
We're number one!
A ranking of 200 jobs in the United States: mathematician, actuary, and statistician are #1, #2, and #3. Lumberjack, dairy farmer, and taxi driver are #200, #199, and #198. ("Farmer" and "chauffeur" are a bit higher on the list.)
Their one-sentence description of what a mathematician does is "Applies mathematical theories and formulas to teach or solve problems in a business, educational, or industrial climate", which isn't too bad. The word "formulas" in there rubs me the wrong way, though; I don't like the implication that mathematics is all about formulas, and I tend to use "formulae" as the plural. Here's an explanation of the methodology. They don't give the scores they assigned to mathematicians on each of the many factors they take into account. But basically, it's a good job because we sit on our asses, don't have to deal with customers, and make good money. (Well, I suppose I make good money if you factor in the fact that I don't pay tuition...)
Many of their top jobs are what one might call "academic" jobs -- the top ten also includes statisticians, biologists, historians, and sociologists. I'm wondering whether it's really true that academic jobs are generally good, or whether their methodology consistently overrates such jobs.
Via Not Even Wrong (Peter Woit); see also the Wall Street Journal and reddit.
Edit, 5:56 pm: See also Cosmic Variance (which, as usual, has many commenters, some insightful), Rigorous Trivialities, Unapologetic Mathematician.
Edit, Friday, 9:47 am: See also The Accidental Mathematician and Junk Charts.
Edit, Tuesday 1/13, 12:05 pm: And Computational Complexity.
05 January 2009
Shadow lengths
From a Jeopardy! clue: to get vitamin D, walk around in the sunlight for at least twenty minutes a day -- but only when the sun is low in the sky. "Low in the sky" is defined as when your shadow is longer than you. That is, when the altitude of the sun is less than forty-five degrees.
The times of Muslim afternoon prayers are also defined in terms of simple arithmetic on shadow lengths.
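The arithmetic behind both rules is one line of trigonometry: an object of height h at solar altitude a casts a shadow of length h / tan(a), so shadow equals height exactly at 45 degrees. A small sketch (the heights are illustrative):

```python
# Shadow of an object of height h when the sun is at altitude a (degrees):
# shadow = h / tan(a). At a = 45 degrees, tan(a) = 1, so shadow = height;
# "your shadow is longer than you" means the sun is below 45 degrees.
from math import tan, radians

def shadow_length(height, altitude_deg):
    return height / tan(radians(altitude_deg))

h = 1.8  # a person 1.8 m tall (illustrative)
print(round(shadow_length(h, 45.0), 3))   # equals the height: 1.8
print(shadow_length(h, 30.0) > h)         # lower sun, longer shadow: True
```

For the prayer times, the commonly cited rule (as I understand it) starts Asr when an object's shadow equals its own length plus the shadow it cast at solar noon, which is the same computation with an offset.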
03 January 2009
Things I saw at ANALCO '09
1. a picture of a Rubik's cube subwoofer. (They're only making 200, so I could never afford one.) This was actually in the magazine in my hotel room, so it's really just a coincidence.
2. An actual copy of Flajolet and Sedgewick's book Analytic Combinatorics. (Cambridge University Press had a display table.) I was not the only person who didn't believe it was actually a book. (In fact, Sedgewick seemed a bit surprised.) It appears to be very nice as a physical object, and in particular is surprisingly un-brick-like for an 800-page book.
3. A bunch of talks, many of which were interesting. I may have more commentary on the ones I found particularly interesting once I read the associated papers.
Labels: conferences, Flajolet, Rubik's cube, Sedgewick
02 January 2009