Over at Mike's Baseball Rants, Mike Carminati has analyzed what happens when a baseball team changes its manager in midseason: how does the team fare afterward?
What got my attention was the statement that on average, if a team changes its manager its winning percentage improves by .044 in the next season.
After all, we ought to expect some sort of regression to the mean. The average team which changes its manager is a .445 team. (Over 162 games, that's 72 wins and 90 losses.) I figure that the fact that the team wins less than half its games is some combination of lack of skill and bad luck; one expects that they'll do better next year. A team which is "naturally" a .500 team should do this poorly one year out of every fourteen or so.
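That one-in-fourteen figure can be checked with a straight binomial count. A sketch, treating each game as an independent coin flip (which ignores opponent strength, home field, and so on); the answer depends a little on whether a 72-win season itself counts as "this poorly":

```python
# How often does a "naturally" .500 team finish at .445 (72-90) or worse
# over a 162-game season?  Model each game as a fair coin flip.
from math import comb

GAMES = 162

# P(72 wins or fewer) and P(strictly fewer than 72 wins)
p_at_most_72 = sum(comb(GAMES, k) for k in range(73)) / 2**GAMES
p_below_72 = sum(comb(GAMES, k) for k in range(72)) / 2**GAMES

print(f"P(72 wins or fewer)   = {p_at_most_72:.3f} (about 1 in {1 / p_at_most_72:.0f})")
print(f"P(fewer than 72 wins) = {p_below_72:.3f} (about 1 in {1 / p_below_72:.0f})")
```

Either way the answer lands in the one-in-eleven to one-in-fifteen range, consistent with the rough figure above.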
And indeed, if we compare the winning percentage of a team in year N to the same team in year N+1, the winning percentage in year N+1 is given by, roughly,
PCT(N+1) = 0.8 * PCT(N) + 0.1
I'm being deliberate in only giving one significant figure here; depending on exactly which sample I use, the coefficients change quite a bit. But whichever sample I use, it seems like a team's record regresses one-fifth of the way back to the mean. A 71-91 team this year (ten games below .500) ought to be eight below .500, or 73-89, next year; a 111-51 team this year (thirty games above .500) ought to be 105-57 next year.
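The worked examples above can be reproduced by plugging records into the fitted line (a sketch, using the rough one-significant-figure coefficients 0.8 and 0.1):

```python
# Year-to-year regression sketch: PCT(N+1) = 0.8 * PCT(N) + 0.1,
# i.e. a team's record regresses one-fifth of the way back to .500.
GAMES = 162

def predict_next_pct(pct):
    """Predicted winning percentage in year N+1 given year N's percentage."""
    return 0.8 * pct + 0.1

for wins in (71, 111, 72):
    pct = wins / GAMES
    nxt = predict_next_pct(pct)
    next_wins = round(nxt * GAMES)
    print(f"{wins}-{GAMES - wins} ({pct:.3f}) -> "
          f"{next_wins}-{GAMES - next_wins} ({nxt:.3f})")
```

The 71-91 and 111-51 teams come out to exactly 73 and 105 expected wins, and the 72-90 (.445) team projects to .456.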
So you'd expect that .445 team this year to improve to .456 next year -- an improvement of .011, well short of the observed .044. There is therefore some real effect from changing the manager.
However, midseason replacements don't seem to have much of an effect. Carminati also computes that when a team changes managers in midseason, it does .007 better after the replacement than before.

It would be too much effort to compile how teams do on average in the first and second halves of the season, but I'd expect some sort of regression to the mean there too, and it's not immediately clear how it should compare with the regression between consecutive seasons. On the one hand, the half-season samples are smaller, so you'd expect a bit more volatility -- but does that mean our hypothetical .445 team from the first half should do better or worse than .456 in the second half? On the other hand, there's a lot more roster turnover during the offseason, simply because there's more time for it; I'd definitely expect the regression to be stronger between seasons than between halves of the same season, because a team which has done badly will shop around and get itself some talent, while a team which has done well is likely paying out too much and will sell off some talent.

So this .007 figure might not even be significant. I suspect a lot depends on the particular situation the team is in.
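One piece of this is easy to quantify: under the same coin-flip model, the random spread in winning percentage over an 81-game half-season is √2 times the spread over a full season, so half-season records are noisier to begin with. A sketch (this says nothing about roster turnover, only sample size):

```python
# Binomial standard deviation of winning percentage for a true-.500 team:
# sd = sqrt(p * (1 - p) / n) with p = 0.5.
from math import sqrt

def pct_sd(games, p=0.5):
    """Standard deviation of observed winning pct over `games` games."""
    return sqrt(p * (1 - p) / games)

print(f"full season (162 games): sd = {pct_sd(162):.3f}")  # about .039
print(f"half season  (81 games): sd = {pct_sd(81):.3f}")   # about .056
print(f"ratio: {pct_sd(81) / pct_sd(162):.3f}")            # sqrt(2), about 1.414
```

So a half-season record of .445 carries noticeably less information about a team's true strength than a full-season .445 does, which by itself argues for treating the .007 midseason figure cautiously.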