The minimum wage and “the end of work”
posted at 2:31 pm on February 22, 2014 by Jazz Shaw
Dr. James Joyner has an interesting analysis of two stories of economic recovery this weekend titled, “The End of Work?” In it, he looks at a recent study comparing how the United States fared in the period since the crash with Great Britain. The difference is startling.
Two rich economies, relatively similar in structure, reacted very differently to the global financial shock of late 2008. In America output sank sharply but then rebounded to new highs. Employment, by contrast, fell dramatically and has recovered much more slowly; it has yet to regain the pre-crisis peak. In Britain the trends were reversed; employment is setting new highs while output suffered an L-shaped recovery.
The key difference appears to be rates of inflation. Higher inflation in Britain reduced real wages. That, in turn, allowed firms to meet a given level of demand by using more workers less intensively—at lower productivities. In America, by contrast, lower inflation meant that real wages rose over the course of the recession and recovery. Some research suggests that firms respond to sticky real wages by wringing more output out of existing workers—raising productivity. Firms meet a given level of demand using fewer workers more intensively, resulting in a jobless recovery.
Joyner notes that differing approaches to industrial expansion also play a role.
Presumably, the process has been more pronounced in the United States than in Europe because of different public policy choices. Most notably, our tax system incentivizes capital investment, making replacing workers with technology less expensive, and also concentrates wealth since we tax it at much lower rates…
If this is right—and it certainly seems plausible—then we’re left with a choice between an economy that’s operating at maximum productivity and one that employs a maximum number of people. If so, then the solution is an unsettling one that has come up from time to time in the comments section here, offered by Michael Reynolds and others: an economy where the most talented work and subsidize an increasingly large pool of less talented people who don’t.
Most of this analysis makes a lot of sense, but I would take it one step further and ask whether the minimum wage debate doesn’t play directly into the effect being observed. With public policy, regulation and an uncertain environment for employers exerting downward pressure on hiring, the temptation is already there to invest in advanced technology to achieve the same productivity. Unlike the case of Great Britain, if the government drives up the cost of employing workers further, wouldn’t that incentive to automate become greater? Businesses have a target level of productivity they must achieve in a profitable manner. When labor costs are low, expensive technology isn’t as attractive, and they would naturally hire more workers as described above. But when you artificially drive up the cost of labor in an already challenging environment, the opposite effect would seem natural.
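The hiring-versus-automation tradeoff described above can be sketched as a simple break-even comparison. This is a toy illustration with made-up numbers (the wage levels echo figures common in the minimum wage debate; the machine costs are purely hypothetical assumptions), not data from the study Joyner cites:

```python
# Toy break-even model: at what hourly wage does automating a task become
# cheaper than staffing it? All dollar figures below are hypothetical
# illustrations chosen for the example, not real data.

def annual_labor_cost(wage_per_hour, hours_per_year=2000, overhead=1.25):
    """Fully loaded annual cost of one worker: wage plus payroll overhead."""
    return wage_per_hour * hours_per_year * overhead

def annual_automation_cost(capex=100_000, lifetime_years=5, upkeep=4000):
    """Amortized annual machine cost: purchase price spread over its
    useful life, plus yearly maintenance."""
    return capex / lifetime_years + upkeep

for wage in (7.25, 10.10, 15.00):
    labor = annual_labor_cost(wage)
    machine = annual_automation_cost()
    choice = "automate" if machine < labor else "hire"
    print(f"${wage:>5.2f}/hr: labor ${labor:,.0f} vs machine ${machine:,.0f} -> {choice}")
```

Under these assumed numbers, a worker at $7.25/hour is cheaper than the machine, but raising the wage floor to $10.10 or $15.00 flips the decision toward automation, which is exactly the incentive shift the paragraph describes.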
Food for thought. (If not for wage earners.)