The Yankees and Red Sox have played each other two thousand and something times, as the good folks on Fox told us, so I started to wonder: which baseball teams have played each other the most times? If I had to guess, it's Giants-Dodgers; Wikipedia agrees with me but gives no source. Apparently Cardinals-Cubs, Cardinals-Pirates, and Cubs-Pirates are all a close second. These are the ones you'd expect if you take into account when teams changed divisions, etc.
Anyway, while trying to find this out I found a blog post entitled Have Two Baseball Games Ever Played Out Identically?. The answer is no, but "identically" is defined a bit too strictly; a groundout to second and a groundout to shortstop, say, are counted as different. And the metric the author uses for the similarity of two games A and B is, I think, the number of values of n for which the nth plate appearance in games A and B had the same outcome. Intuitively, I think you'd want to line up innings with each other; two "most similar" games should at least have similar-looking line scores. What one wants, I think, is some notion of "edit distance" between games, and defining that is hardly trivial.
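For concreteness, here's a toy sketch in Python of both notions, under a made-up one-character-per-plate-appearance encoding (the encoding and function names are mine, purely for illustration): the linked post's position-by-position match count, next to a plain Levenshtein edit distance over the same outcome sequences.

```python
def match_count(a, b):
    """The linked post's metric (as I understand it): the number of
    positions n where the nth plate appearance of each game had the
    same outcome."""
    return sum(x == y for x, y in zip(a, b))

def edit_distance(a, b):
    """Plain Levenshtein distance between two outcome sequences, one
    candidate for a looser notion of similarity between games."""
    prev = list(range(len(b) + 1))
    for i, x in enumerate(a, 1):
        cur = [i]
        for j, y in enumerate(b, 1):
            cur.append(min(prev[j] + 1,              # deletion
                           cur[j - 1] + 1,           # insertion
                           prev[j - 1] + (x != y)))  # substitution
        prev = cur
    return prev[-1]

# Made-up encoding, one character per plate appearance:
# K = strikeout, G = groundout, F = flyout, S = single, W = walk, H = homer.
game_a = "KGFSGWFKH"
game_b = "W" + game_a   # the "same" game with one extra leadoff walk

print(match_count(game_a, game_b))    # 0: one insertion misaligns everything
print(edit_distance(game_a, game_b))  # 1: a single insertion
```

The point of the toy example: one inserted plate appearance destroys the position-by-position metric entirely, while an edit distance barely notices it. The hard part, which this sketch punts on, is deciding what the edit operations and their costs should be for real games, and how to respect inning boundaries.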
I've sort of poked at this before: in 2007 I asked what's the most common line score in connection with a promotion that MLB did for that year's all-star game.
There's a nice combinatorial/probabilistic question hiding here; I've seen results on, say, the probability that two randomly chosen permutations of [n] have the same cycle type, or the probability that two binary trees with n labelled nodes have the same shape. Baseball games are combinatorial structures, and I'm not just saying that to justify the fact that I'm probably going to spend six hours today watching baseball. (The Yankees and Red Sox are on TV now, the Phillies and Mets later.)
22 August 2009
15 August 2009
A number-theoretic tangent for your Saturday afternoon
You may know that 1001 has a simple prime factorization, namely 1001 = (7)(11)(13).
And of course 1000 has a simple prime factorization, namely 1000 = (2³)(5³).
The natural question to ask at this point, to me, is "how often does this happen?" Of course one has to define "this". One question is "how often are there two consecutive numbers that have all their prime factors less than or equal to 13?" There seem to be finitely many, and after I compiled a list I googled a few numbers from it and discovered that David Rusin had also done so. In case you're wondering why I conjecture there are finitely many: the number of integers less than n with all prime factors at most 13 is proportional to (log n)⁶. (The exponent 6 comes from the fact that there are six primes which are 13 or less.) So the "probability" (in the usual heuristic number-theoretic sense) that a given number n is "13-smooth" is the derivative of this, and thus of the order of (log n)⁵/n. The "probability" that two consecutive numbers are "13-smooth" is on the order of the square of this "probability", i. e. (log n)¹⁰/n², and the integral of that from, say, 2 to infinity converges. Nothing is special about 13 here; there should be finitely many pairs of consecutive "p-smooth numbers" where p is any prime.
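The list itself is easy to regenerate, if you want to check. Here's a minimal Python sketch (the names smooth_numbers and LIMIT are mine): enumerate the 13-smooth numbers up to a bound with a heap, then look for consecutive pairs.

```python
import heapq

def smooth_numbers(primes, limit):
    """Yield every primes-smooth number <= limit, in increasing order."""
    heap, seen = [1], {1}
    while heap:
        n = heapq.heappop(heap)
        yield n
        for p in primes:
            m = n * p
            if m <= limit and m not in seen:
                seen.add(m)
                heapq.heappush(heap, m)

LIMIT = 10**7
smooth = set(smooth_numbers([2, 3, 5, 7, 11, 13], LIMIT))
pairs = sorted((n, n + 1) for n in smooth if n + 1 in smooth)
print(len(pairs), pairs[-3:])
# This only certifies the list up to LIMIT; the heuristic argument
# above suggests, but of course does not prove, that it's complete.
```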
(Rusin's list, by the way, was something he compiled to illustrate how one might calculate logarithms; since 1000 ~ 1001, we can take common logs and get 3 ~ log(7) + log(11) + log(13). Given enough relations like this one can solve for the logarithms themselves.)
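To see the idea in action, here's a hedged sketch (the setup is mine, not Rusin's): treat a handful of the smooth pairs found above, which also appear in the list a bit further down, as approximate linear relations among the common logarithms of 2, 3, 5, 7, 11, and 13, pin down the scale with the exact relation log 2 + log 5 = 1, and solve the overdetermined system by least squares.

```python
import numpy as np

# Columns: log10 of 2, 3, 5, 7, 11, 13.  Each approximate row says
# "these two factorizations have (almost) equal logs".
A = np.array([
    [-4,  4, -1,  0,  0,  0],   # 81 = 3^4 ~ 80 = 2^4 * 5
    [ 5, -2, -2,  1,  0,  0],   # 224 = 2^5 * 7 ~ 225 = 3^2 * 5^2
    [ 0,  0,  0,  1,  1,  1],   # 1001 = 7*11*13 ~ 1000, and log 1000 = 3
    [ 2,  1, -1, -3,  1,  1],   # 1716 = 2^2 * 3 * 11 * 13 ~ 1715 = 5 * 7^3
    [-5,  3, -1,  1,  1, -1],   # 2079 = 3^3 * 7 * 11 ~ 2080 = 2^5 * 5 * 13
    [-5, -1, -2,  4,  0,  0],   # 2401 = 7^4 ~ 2400 = 2^5 * 3 * 5^2
    [ 1,  0,  1,  0,  0,  0],   # exact: log 2 + log 5 = log 10 = 1
], dtype=float)
b = np.array([0, 0, 3, 0, 0, 0, 1], dtype=float)

logs, *_ = np.linalg.lstsq(A, b, rcond=None)
for p, est in zip([2, 3, 5, 7, 11, 13], logs):
    print(f"log10({p:2}) ~ {est:.5f}  (true {np.log10(p):.5f})")
```

The estimates land close to the true values (the comparison is printed), which is the point of Rusin's illustration: every pair of consecutive smooth numbers is one nearly-free linear relation among the logarithms.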
Alternatively, we can define the "roughness" of n as (log p)/(log n), where p is the largest prime factor of n. I call it "roughness", not "smoothness", because if it's smaller, then the corresponding number is smoother, i. e. has comparatively small prime factors. This naturally compensates for the size of the number; if I recall correctly, the proportion of numbers less than n with roughness less than r approaches some limit (a function of r) as n goes to infinity. (That limit should be ρ(1/r), where ρ is the Dickman function; but I only dabble in analytic number theory, so I can't provide a source off the top of my head.)
It seems reasonable, then, that there should be a similar limit for pairs of consecutive numbers. The pairs of numbers that are "at least as round" as (1000, 1001) in this sense, i. e. with n and n+1 both of roughness at most (log 13)/(log 1001), are
(80, 81), (224, 225), (1000, 1001), (1715, 1716), (2079, 2080), (2400, 2401), ...
and it seems like about one in 230 pairs of consecutive numbers have this property (4291 in the first million). As is often the case, the interesting thing is that the limiting probability appears to exist and be neither zero nor one.
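That frequency is cheap to check with a largest-prime-factor sieve. A minimal sketch (the helper names are mine) that recounts the pairs below a million:

```python
from math import log

N = 10**6
# Sieve of largest prime factors: after the loop, lpf[m] is the
# largest prime factor of m (and lpf[p] = p exactly when p is prime).
lpf = list(range(N + 2))
for p in range(2, N + 2):
    if lpf[p] == p:                      # p is prime
        for m in range(2 * p, N + 2, p):
            lpf[m] = p

def roughness(n):
    return log(lpf[n]) / log(n)

threshold = log(13) / log(1001)
count = sum(1 for n in range(2, N)
            if roughness(n) <= threshold and roughness(n + 1) <= threshold)
print(count, N / count)   # the text above reports 4291, i.e. about 1 in 233
```

The sieve takes a few seconds in pure Python; the point is just that the empirical frequency is easy to reproduce.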
Postscript: I know some readers will be offended by my use of "probability" in this number-theoretic sense, because the prime factorization of a number is hardly a random object! But I'm a probabilist; this is how I think.
13 August 2009
Mathematicians in today's New York Times crossword
A clue from today's New York Times crossword puzzle: "Mathematician Post or Artin". (Four letters. If you don't know the answer, click on the links.)
The crossword blogs (here, here, here) think this was an unfair clue; this one says that "Neither [...] will be familiar to most solvers, or even to all mathematicians."
I got this with no problem. But it took me a moment, because the son of the Artin the clue was about was one of my professors as an undergrad. A few commenters here and there say they needed some of the crossing letters to decide which Artin the clue referred to.
04 August 2009
Student "entitlement" to grades
An interesting statistic from this article from the Madison (Wisconsin) Times: in fall 2008, the average grade in social work courses was 3.70 on a 4.0 scale, and the average grade in mathematics courses was 2.79. (The article doesn't indicate why these two departments were chosen, but I suspect they're at the extremes of the distribution.) This is a fact offered in an article about how students feel more "entitled" to high grades than in the past.
I don't want to comment on how students may or may not feel entitled to high grades. Most of the information I've seen indicates that this sort of entitlement is more common now than in the past; I haven't been teaching long enough to feel like I can comment intelligently on historical trends. (And I wouldn't want to include data from when I was taking classes, because my friends and I may or may not have been a good sample.)
From 3σ → left.
01 August 2009
When 2+2 = 5
From "you suck at craigslist:" 2+2 = 5. People selling things on craigslist who want $x for one of some item and more than $2x for two.
I'm pretty sure I once saw somebody selling T-shirts on the boardwalk in Atlantic City for "$3, 3 for $10." Or it might have been "$4, 2 for $10" or "$2, 4 for $10". In any case, it didn't make sense.
I'm sure there are cases where this sort of pricing structure actually makes sense, but I can't think of any. (I also wouldn't be surprised to learn that economists have a name for it.) Anyone want to try?