The standard deviation is a measure of how far values typically differ from the average (technically, the mean). A useful way to think of the standard deviation is to view it as the amount of difference above or below the average that is noticeable.

The typical textbook example is men's heights in the United States, where the average is around 5' 9" and the standard deviation is about 3 inches. A man slightly taller than the average, say 5' 10", won't stand out as tall, but a man 6' is just tall enough for people to refer to him as "the tall guy." It's the same with anything else that is normally distributed in a bell-shaped curve. IQs, for example, are like this. With an average IQ of 100 and a standard deviation of 15, an IQ of 115 is right about the point at which a person starts to stand out as smart.

An interesting feature of standard deviations is how few of them there are. Over two-thirds of a population (about 68 percent) falls within one standard deviation of the average, and about 95 percent within two. Only about one in a thousand is three or more standard deviations above the average, and about the same share below.
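These shares follow directly from the normal distribution's tail probabilities, which can be checked with a few lines of Python (a quick sketch using the standard `math.erf`-based cumulative distribution function):

```python
import math

def normal_cdf(z):
    """Standard normal cumulative distribution function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2)))

def share_within(k):
    """Fraction of a normal population within k standard deviations of the mean."""
    return normal_cdf(k) - normal_cdf(-k)

print(f"within 1 SD: {share_within(1):.1%}")        # about 68%
print(f"within 2 SD: {share_within(2):.1%}")        # about 95%
print(f"beyond +3 SD: {1 - normal_cdf(3):.4%}")     # about 1.3 per thousand
```

The last line is the share above three standard deviations; the share below is the same by symmetry.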

As a result, even professional basketball players are typically no more than three standard deviations taller than the average, and college players fewer still. Similarly, Nobel Prize-winning scientists often have IQs about three standard deviations above the average, while more ordinary tenured college professors come in at less than two.

Turning to economics, let's therefore assume that merit-based economic contributions vary more or less according to the same statistical rule. That is, over two-thirds of the population probably contributes economically within a range of one standard deviation above or below the average, 95 percent contributes within two standard deviations of the average, and only one in a thousand contributes three standard deviations above or below the average.

If income inequality corresponded to these presumed differences in economic contributions, what would the distribution of incomes look like?

To get this answer, we need to start with the average household income in the United States. Recent data show it to be around $73,000. Importantly, this is the average, or the mean, not the median. The median is the lower number (somewhere in the $50,000s) that is usually reported. That reporting makes sense, since the median better describes the typical household, while the mean is pulled up by a few very high earners. But if we want to know what the average would be before income is unequally distributed, the answer is around $73,000 a year.
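The gap between the two measures is easy to reproduce: a handful of very high earners pull the mean up while barely moving the median. A toy illustration (the incomes here are made up, not real data):

```python
from statistics import mean, median

# Nine modest household incomes plus one very high earner (hypothetical numbers)
incomes = [35_000, 42_000, 48_000, 50_000, 52_000,
           55_000, 60_000, 65_000, 70_000, 1_200_000]

print(f"median: ${median(incomes):,.0f}")   # describes the typical household
print(f"mean:   ${mean(incomes):,.0f}")     # inflated by the top earner
```

Here the median stays in the mid-$50,000s while the single top earner triples the mean.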

Unfortunately, we can't calculate a useful standard deviation from the income distribution in the United States. The difficulty is that while a standard deviation can be computed for any distribution, the two-thirds and 95 percent rules only hold for normal, bell-shaped ones, and the income distribution in the United States is an abnormally lopsided one. There are fancier statistics that might be used instead of the standard deviation, but there's also a workaround. It is to use the less lopsided distribution of housing prices and make the standard assumption that housing costs represent about a third or less of total household income.

The average price of a house in the United States is around $340,000, and the standard deviation is around $50,000. Suggestively, at the average income of $73,000, a household can just qualify for a mortgage of about $300,000, based upon the usual rule that housing should cost no more than a third of total income, and can therefore afford the average house (after saving up a down payment of roughly $40,000). So the averages line up.

Better still, by using the standard deviation of housing prices, we can also estimate the incomes required to buy houses more or less expensive than the average. Specifically, the income required to carry $50,000 worth of home mortgage is around $11,000 a year. Thus, we can estimate that a reasonable standard deviation of household incomes is around $11,000 a year.

It follows that a merit-based household income distribution would find two-thirds of the households taking in between $62,000 and $84,000 (one standard deviation), 95 percent between $51,000 and $95,000 (two standard deviations), and only one in a thousand living on either less than $40,000 or more than $106,000 (three standard deviations).
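Those bands come straight from the assumed mean and standard deviation, and are simple to verify (using the $73,000 mean and $11,000 standard deviation from above):

```python
mean_income = 73_000   # estimated mean household income
sd_income = 11_000     # estimated standard deviation of household income

for k in (1, 2, 3):
    low = mean_income - k * sd_income
    high = mean_income + k * sd_income
    print(f"{k} SD: ${low:,} to ${high:,}")
```

This prints the $62,000–$84,000, $51,000–$95,000, and $40,000–$106,000 ranges quoted above.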

Of course, these numbers would have to be adjusted for region, age, household size, tax rates, and so on. They are also only ballparks.

However, these ballpark numbers provide a proxy glimpse into the degree of income inequality that can be reasonably justified by differences in merit-based economic contributions. To depart too far from a pattern like this is to believe that the distribution of economic merit is the economic equivalent of a society in which one in every five adult men stands only four-and-a-half feet tall and has an IQ of 25, while another one in five adult men stands eight-and-a-half feet tall and has an IQ of 250.