This series is adapted from Growing Apart: A Political History of American Inequality, a resource developed for the Project on Inequality and the Common Good at the Institute for Policy Studies and inequality.org. It is presented in nine parts. The introduction laid out the basic dimensions of American inequality and examined some of the usual explanatory suspects. The political explanation for American inequality is developed through chapters looking in turn at labor relations, the minimum wage and labor standards, job-based benefits, social policy, taxes, financialization, executive pay, and macroeconomic policy. Previous installments in this series can be found here.
American inequality is driven not just by the uneven distribution of wages, but also by the uneven distribution of job-based benefits. More than any other country, the United States relies on private employment and private bargaining to deliver basic social benefits—including health coverage, retirement security, and paid leave. The results—on any basic measure of economic security—have been dismal.
Reliance on private benefits made some sense in a mid-century economy organized around lifetime “family wage” employment in large and stable firms. But even under these circumstances, benefits bypassed many workers. Their coverage was always uncertain (loss of a job meant loss of benefits) and often capricious (consider the health and pension plans that routinely evaporate in corporate restructuring). Good benefits followed good jobs, widening the gap between low-wage workers and everyone else. The expectation of private coverage undercut public programs—which were often structured as ways of supplementing job-based plans or mopping up around their edges. And, across the last generation, the logic of delivering social policy via private employment unraveled with the economy on which it was based.
A Short History of Job-Based Benefits
The growth of job-based benefits proceeded fitfully in the first half of the twentieth century; given widespread anxieties about competitive disadvantage, only a few firms in select industries were willing to experiment at first, and it took the parallel emergence of public social programs to shape the kinds of benefits we’re familiar with today. Consider the trajectory of health insurance. Aside from a few union and cooperative experiments in the 1930s, insurance against illness (or wages lost due to illness) was relatively rare. All of this changed in the 1940s, when the combination of a stark labor shortage and inflation-anxious wage controls pressed employers to experiment with non-wage benefits—including job-based health insurance. The federal government encouraged this with the Revenue Act of 1942, exempting employer contributions to their health plans from payroll and income taxes.
Although it seemed of little consequence at the time, this innovation came to dominate the logic and politics of American health care. In the early postwar years, labor, employers, insurers, and medical interests—each for their own reasons—championed job-based benefits as a source of general security and an alternative to public insurance. Through postwar bargaining, coverage under job-based plans broadened to spouses and dependents, and to a wider range of costs and services. Public programs, meanwhile, confined their attention to those too old, too young, or too categorically disadvantaged to rely on job-based coverage.
The reach of job-based health care plateaued at about 70 percent of the population in the early 1970s. After that, the pretension that employment-based benefits might serve as a surrogate for a national program was punctured by rising costs, declining coverage, and the steady erosion of benefits. The financing of health care was transformed as variations on health maintenance organizations and “managed care” displaced the older fee-for-service model. Employers chafed at the burden of health coverage and sought to redistribute the costs: some looked to political solutions that might spread the burden to their competitors or to other sectors of the economy, and others simply shuffled that burden onto the backs of their workers—or abandoned health care benefits altogether. In health care reform debates, the idea of subsidizing or mandating job-based coverage was gradually displaced by innovations in individual coverage—such as high-deductible plans, or the “individual mandate” embedded in the Affordable Care Act.
Steady losses in employer-sponsored coverage have been partially backfilled by public programs, but the categorical reach of these programs (children, the very poor) varies widely by state. In this sense, the unevenness of private coverage is magnified by uneven public coverage. The same states that have dug in against collective bargaining (and hence the extension of job-based health care) have also sustained the sparest public programs. Here the divide between North and South is especially pronounced. In Minnesota, for example, almost 70 percent are covered by job-based health plans, and adults (without dependent children) qualify for Medicaid when their incomes dip below 200 percent of the federal poverty threshold. In Mississippi, by contrast, barely half (54 percent) claim job-based coverage, and childless adults do not qualify for Medicaid until their incomes dip to a thin fraction (16 percent) of the federal poverty threshold.