Over the past century, we’ve had three broad labor regimes. The first, in the early 1900s, featured “unfettered labor markets,” as economic historian Price Fishback of the University of Arizona puts it. Competition set wages and working conditions. There was no federal unemployment insurance or union protection. Workers were fired if they offended bosses or the economy slumped; they quit if they thought they could do better. Turnover was high. Less than a third of manufacturing workers in 1913 had been at their current jobs for more than five years.
After World War II, labor relations became more regulated and administered — the second regime. The Wagner Act of 1935 gave workers the right to organize; decisions of the National War Labor Board also favored unions. By 1945, unions represented about a third of private-sector workers, up from 10 percent in 1929. Health insurance, pensions and job protections proliferated. Factory workers laid off during recessions could expect to be recalled when the economy recovered. Job security improved. By 1973, half of manufacturing workers had been at the same job for more than five years.
To avoid unionization and retain skilled workers, large non-union companies emulated these practices. Career jobs were often the norm. If you went to work for IBM at 25, you could expect to retire from IBM at 65. Fringe benefits expanded. Corporate America, unionized or not, created a private welfare state to protect millions from job and income loss.