Today’s post was shared by the US Labor Department and comes from blog.dol.gov

There has been a lot of discussion recently about whether job growth in the U.S. labor market has been concentrated in low-wage, middle-wage, or high-wage jobs. To get at the answer, let’s look at how the distribution of wages has changed over time, starting with the Great Recession.

Job losses during the Great Recession were profound, but they were not felt equally across the wage distribution. Figure A shows the average monthly change in employment between 2007 and 2009 by wage level. The blue bars show the actual change in employment at each wage level, and the purple line is a benchmark showing what the employment loss at each wage level would have been if job loss had been evenly distributed.* Notably, this means the purple line is a reflection of the 2007 wage distribution. In 2007, around half of workers earned $17 or less per hour, so it might be expected that around half of the jobs lost would be lost by workers who earned $17 or less per hour. But Figure A shows that workers who earned $17 or less per hour made up a disproportionately high share of the losses, since most of the blue bars for those wages extend far below the purple line. In particular, very low-wage jobs (in this case, jobs that pay $10 per hour or less) saw strongly disproportionate job loss, as did lower-middle-wage jobs (in this case, jobs that pay $13-$16 per hour). On the other hand, very high-wage jobs (jobs that pay around $50 per hour or more) saw job

[Click here to see the rest of this post]
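
The "evenly distributed" benchmark described above amounts to a simple proportional allocation: if a wage bin held a given share of all jobs in 2007, it is assigned that same share of total job losses. Here is a minimal sketch of that calculation; the wage bins, employment counts, and loss total below are invented for illustration and are not the data behind Figure A.

```python
# Illustrative sketch of the proportional "evenly distributed" benchmark.
# All numbers are hypothetical, not the data underlying Figure A.

# Hypothetical 2007 employment by hourly-wage bin (thousands of jobs)
employment_2007 = {
    "$10 or less": 30_000,
    "$10-$17": 40_000,
    "$17-$50": 55_000,
    "$50 or more": 15_000,
}

# Hypothetical total jobs lost between 2007 and 2009 (thousands)
total_job_loss = 8_000

total_employment = sum(employment_2007.values())

# Benchmark: allocate the total loss in proportion to each bin's 2007 share
# of employment -- what the purple line represents for each wage level.
benchmark_loss = {
    wage_bin: total_job_loss * jobs / total_employment
    for wage_bin, jobs in employment_2007.items()
}

for wage_bin, loss in benchmark_loss.items():
    share = employment_2007[wage_bin] / total_employment
    print(f"{wage_bin}: {share:.0%} of 2007 jobs -> benchmark loss of {loss:,.0f}k")
```

Actual losses above this benchmark in a bin indicate disproportionate job loss at that wage level; losses below it indicate the opposite.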