An extensive survey of decades of minimum-wage research, published by William Wascher of the Federal Reserve Board and me in a 2008 book titled "Minimum Wages," generally found a 1% to 2% reduction in teenage or very low-skill employment for each 10% minimum-wage increase.
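The arithmetic behind that finding is a simple elasticity calculation. The sketch below is illustrative only (the function name `predicted_employment_change` is mine, not from the book): an employment elasticity of -0.1 to -0.2 with respect to the minimum wage translates a 10% wage increase into a 1% to 2% employment decline.

```python
def predicted_employment_change(wage_increase_pct, elasticity):
    """Predicted % change in employment for a given % minimum-wage increase.

    Illustrative elasticity arithmetic; the -0.1 to -0.2 range is the
    estimate reported in the survey discussed above.
    """
    return elasticity * wage_increase_pct

# A 10% minimum-wage increase under each end of the estimated range:
for elasticity in (-0.1, -0.2):
    change = predicted_employment_change(10.0, elasticity)
    print(f"elasticity {elasticity}: {change:+.1f}% employment change")
# -> elasticity -0.1: -1.0% employment change
# -> elasticity -0.2: -2.0% employment change
```

This is the standard way such estimates are summarized: a constant elasticity applied proportionally, so a 5% increase would imply roughly half the employment effect.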
That has long been the view of most economists, although there are some outliers. In 1994 two Princeton economists, David Card (now at Berkeley) and Alan Krueger, published a study of changes in employment in fast-food restaurants in New Jersey and Pennsylvania after the minimum wage went up in New Jersey. The study not only failed to find employment losses in New Jersey but reported sharp employment gains. It has been widely cited by proponents of a higher minimum wage, even though further scrutiny showed that it was flawed. My work with William Wascher showed that the survey data collected were so inaccurate that they badly skewed the study's findings.
More recently, a 2010 study by Arindrajit Dube of the University of Massachusetts-Amherst, T. William Lester of the University of North Carolina at Chapel Hill, and Michael Reich of the University of California, Berkeley, found “no detectable employment losses from the kind of minimum wage increases we have seen in the United States.”
This study and others by the same research team, all of whom support a higher minimum wage, strongly contest the conclusion that minimum wages reduce low-skill employment. The problem, they say, is that state policy makers raise minimum wages in periods that happen to coincide with other negative shocks to low-skill labor markets, such as an economic downturn.