See the data tables from the 2006 census. These give the number of people whose personal income is in each interval of the form [2500N, 2500N+2499], for integer N.
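Concretely, an income of x dollars lands in the bin with index N = floor(x / 2500). A one-line sanity check in Python:

```python
def bin_index(income):
    """Index N of the interval [2500*N, 2500*N + 2499] containing this income."""
    return income // 2500

assert bin_index(27_500) == 11 and bin_index(29_999) == 11  # the bin discussed below
```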

One sees, for instance, that the number of people making between $27,500 and $29,999 (which is near the mode of the distribution) is less than *both* the number making $25,000 to $27,499 and the number making $30,000 to $32,499. Something similar occurs at all income levels: the number of people making between 2500N and 2500(N+1)-1 dollars is smaller when N is odd (so that the interval contains no multiple of 5000) than when N is even (so that it does).
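Here's a minimal Python sketch of that comparison; the bin counts below are made up for illustration (substitute the actual census table):

```python
# Hypothetical counts for the income bins [2500*N, 2500*N + 2499], N = 0, 1, 2, ...
# These numbers are invented for illustration; swap in the real census counts.
counts = [500, 420, 650, 540, 900, 610, 880, 590, 870, 560, 830, 520]

# Each odd-N bin contains no multiple of $5000; check whether it dips below
# both of its even-N neighbors.
for n in range(1, len(counts) - 1, 2):
    lower, upper = 2500 * n, 2500 * n + 2499
    dips = counts[n] < counts[n - 1] and counts[n] < counts[n + 1]
    print(f"[{lower:>6}, {upper:>6}]: {counts[n]}  dips below both neighbors: {dips}")
```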

Surprisingly, the effect occurs even at very low levels of earnings, where rounding is a much larger relative error. If you make $87,714 in a year, I can see rounding to $90,000 -- but is the person who makes $7,714 in a year really rounding to $10,000?

(I found this while trying to answer a question at Metafilter: "How many people in the United States make more than $10,000,000 per year?" I seem to recall reading somewhere that personal income roughly follows a power law in the tails, but I can't actually find a reference for this.)
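I can't vouch for the power-law claim, but if you had raw income samples, one standard way to probe a Pareto-like tail P(X > x) ~ x^(-alpha) is the Hill estimator. A self-contained sketch on synthetic data (everything here is simulated; none of it comes from the census tables):

```python
import math
import random

random.seed(0)

# Synthetic stand-in for incomes: a Pareto tail with true exponent alpha = 2.
# random.paretovariate(a) draws from P(X > x) = x**(-a) for x >= 1.
alpha_true = 2.0
incomes = [10_000 * random.paretovariate(alpha_true) for _ in range(100_000)]

def hill_estimator(sample, k):
    """Hill estimate of the tail exponent from the top k order statistics."""
    top = sorted(sample, reverse=True)[:k + 1]
    return k / sum(math.log(x / top[k]) for x in top[:k])

# For a genuine power-law tail, the estimate should be roughly stable in k.
for k in (100, 1000, 5000):
    print(f"k = {k:>5}: alpha estimate = {hill_estimator(incomes, k):.2f}")
```

The estimator is scale-invariant, so the $10,000 multiplier doesn't affect it; rough stability of the estimate across k is the usual informal check that a power law is plausible.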

There also seems to be a preference for multiples of $10,000 over multiples of $5,000 that are not multiples of $10,000. But I have work to do, so I'm not going to do the statistics.
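For whoever does want to do the statistics: since 2500 × 4 = 10,000, the bins containing a multiple of $10,000 are exactly those with N divisible by 4, while the bins containing an odd multiple of $5,000 (but no multiple of $10,000) are those with N ≡ 2 (mod 4). A rough comparison in Python, again with placeholder counts:

```python
# Placeholder counts for the bins [2500*N, 2500*N + 2499]; replace with real data.
counts = [500, 420, 650, 540, 900, 610, 880, 590, 870, 560, 830, 520]

# N % 4 == 0 bins contain a multiple of $10,000 (skip N = 0, which starts at $0);
# N % 4 == 2 bins contain an odd multiple of $5,000 but no multiple of $10,000.
tens = [c for n, c in enumerate(counts) if n % 4 == 0 and n > 0]
fives = [c for n, c in enumerate(counts) if n % 4 == 2]

print("mean count, bins with a $10,000 multiple:", sum(tens) / len(tens))
print("mean count, bins with only a $5,000 multiple:", sum(fives) / len(fives))
```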

## 3 comments:

It also may actually be the case that more people have salaries that are exact multiples of $5K, since many starting salaries are multiples of $5K.

Maybe it's really a test of whether or not people can do the math involved in "remember your income to the nearest $2500." A lot of us don't really know what we make, myself included at the moment.

This effect is actually noticeable in census data for ages (large bumps every 5 years) in places where it's not as customary to keep track of your age that closely.
