Why policymakers calculate the cost of life and death, sickness and health
In 2015, according to the Centers for Disease Control and Prevention, some 33,091 people died of an opioid overdose. The final 2016 figure, there is little doubt, will be even higher. Last year, researchers at the CDC put the “societal” cost of the opioid epidemic at $78.5 billion for 2013. Part of that figure reflects spending on healthcare and on criminal justice related to the trade in opioids. But much of the $78.5 billion represents something less tangible: “lost productivity.” The researchers estimated that the lost future economic output of Americans affected by the epidemic—those who were disabled by opioid dependence, who died prematurely, or who were incarcerated—amounted to $41.9 billion a year. And in November, the White House made headlines by putting an even bigger price tag—$504 billion—on the opioid epidemic, this time adding a dollar value for each life lost: the so-called “value of a statistical life.”
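(A rough back-of-the-envelope calculation suggests how much the “value of a statistical life” inflates such totals. Assuming a value of roughly $10 million per life, a common figure in U.S. regulatory analysis and my assumption here rather than the White House’s exact input, the arithmetic runs: 33,091 deaths × $10 million per life ≈ $331 billion. Mortality valued this way dwarfs, on its own, the entire $78.5 billion “lost productivity” estimate.)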
The idea of putting a price on health—or life and death—may seem intuitive, even natural, in an age in which the human body is commonly conceived as a sort of investment. “When I’m at a country club or a party and people ask me what I do, I say I’m an asset manager,” said one concierge physician (annual fee: $40,000 and up) whose practice was recently profiled in the New York Times. “When they ask what asset, I point to their body.”
But this was not always so. The view of a human life as revenue-producing capital—the price of which can be computed by years and dollars of “lost productivity”—is a very new development, one that is at the core of historian Eli Cook’s groundbreaking new book. The Pricing of Progress: Economic Indicators and the Capitalization of American Life traces how health, lives, and land came to be seen as “income-generating investments,” a transformation that has not just shaped how we perceive the costs of catastrophes like the opioid epidemic (or how we market boutique medical care), but that also, Cook asserts, propelled the emergence of capitalism itself.
To be sure, plenty of pre-capitalist societies have relied on some of what we see as the trappings of capitalism: money, trade, markets, and even wage labor. But capitalism itself, Cook argues (like many historians and theorists of capitalism, going back at least to Marx), is something quite novel. “Markets, commodities, and consumer goods,” he writes, “while certainly necessary components, do not a capitalist society make.” For Cook, it is instead the rise of dividend-producing investments that defines capitalism. First land, and then people, came to be seen not merely as commodities to be bought and sold, but as capital, the price of which was increasingly tied not to some measure of innate value, or even to hours of labor committed, but to the promise of future profits.
In this telling, the story of capitalism begins not in smog-laden factory cities but on the farms of the English countryside, where, in the wake of the devastation of the Black Death, a new economy was born.
1976 was a pivotal year in the study of the history of capitalism. That year, in the journal Past and Present, the radical historian Robert Brenner published his seminal paper, “Agrarian Class Structure and Economic Development in Pre-Industrial Europe.” Brenner argued that the Black Death—responsible, according to one estimate, for some 50 million deaths, or 60 percent of Europe’s population, between 1346 and 1353—produced a profound labor shortage that, in turn, led to conflict between peasants and lords. In much of Western Europe that conflict brought serfdom to an end. But in England, Brenner asserts, peasants failed to establish themselves as a class of small freeholders, or at least did not remain one for long. Instead, they were gradually removed from their lands as landlords enclosed large swathes of the countryside. This created a landless proletariat, who had no choice but to work for wages on what had been their forefathers’ lands. Later they would man the sweatshops of industrial towns.
This is about where Cook begins his story. With the peasants pushed off their lands, the “pricing of progress and the capitalization of everyday life” began for the very first time. Lords had, of course, always squeezed the peasantry—whether by requisitioning their labor or by skimming a share of the crop. But for the most part this did not involve the exchange of cash. After the Middle Ages, this changed: Increasingly, lords rented enclosed lands in return for annual cash payments. To cover rental payments and maximize profits, rent-paying tenants needed to find ways to squeeze more output from the land. By the seventeenth century, land was being priced not by custom, but according to its predicted future annual revenues. This drove a predictable obsession with increasing agricultural productivity, which in turn revolutionized English agriculture and helped set the stage for the Industrial Revolution.
A seminal thinker in this new “capitalization” of land—and other things—was the polymath William Petty, a physician, political economist, and cartographer born in 1623. Petty was tasked with surveying the land Oliver Cromwell had confiscated in his conquest of Ireland. Cromwell planned to turn these seized lands over to his soldiers and other supporters. When Petty drew up his map, however, he added new information, designating parishes according to their potential future profitability. This allowed Cromwell to work out more accurately how much each soldier should receive. It also transformed Ireland, Cook writes, “from a physical place into a pecuniary reimbursement.” Petty, who had served in the army as physician-general, stood to gain from his innovation. Indeed, his choice Irish lands made him filthy rich, vaulting him “into the upper echelon of English society practically overnight.”
Petty developed the idea of “capitalization” in multiple arenas. He was the “founding father of GDP” (though he didn’t coin the term), which he computed by estimating the country’s total consumption of goods like housing and food; this, he reasoned correctly, should equal total economic production. He was perhaps also the first to conceive of humans as units of interest-producing capital (or “human capital,” although again, the term comes later). He estimated the total economic output of workers in England, and then that of the average worker. From this, he priced the life of an English laborer at £138, based on his or her annual output compounded over a 16-year period. He priced all sorts of events and activities. If English workers shortened their dinner times and skipped Friday night supper, he calculated, the productivity gains would amount to over £250,000 per month. The Plague of 1665 cost, by his reckoning, £7 million.
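(To make the laborer figure concrete: if we read “compounded over a 16-year period” loosely, as sixteen years of annual output, which is an interpretive gloss on my part since Petty’s exact method isn’t spelled out here, then £138 ÷ 16 ≈ £8.6 of output per laborer per year.)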
Cook’s book, however, mostly concerns capitalization in the United States, where British colonists had hoped to replicate “the burgeoning agrarian capitalism of Ireland and England” by enclosing land in the New World and transforming it into units of rent-producing capital. These early efforts were largely unsuccessful. There was simply too much open land and too few willing workers, Cook writes. Once indentured servants made money, they opted out of waged labor and bought their own farms. The means of production—in this case, land—wound up falling into the hands of a “relatively wide swath of households” (albeit overwhelmingly white ones, of course). As a result, colonial America remained—unlike England at the time—non-capitalist in its economic orientation, while being highly repressive of people who happened not to be white men.
The early American slave economy, Cook contends, was largely non-capitalist too (which is not to imply that it was any less monstrous). Enslaved human beings may always have been treated as a form of property (Aristotle described a slave as “a tool in charge of other tools”), but they were not always seen as wealth-generating capital. In the colonial era, most of the South produced tobacco, which can be cultivated on a much smaller scale than sugar or cotton. The slaves who worked tobacco plantations were, according to Cook, infrequently bought and sold. “Most white Americans viewed them … not as income-yielding, capitalized assets but rather as pieces of property whose direct use was rooted within the household mode of production.”
This was to change dramatically with the rise of cotton in the early nineteenth century. “I will give you an invariable rule,” Frederick Douglass proposed in 1846. “When cotton gets up in the market in England, the price of human flesh gets up in the United States.” Douglass’s observation was apt: In the nineteenth-century South, the prices of slaves became directly tied to the price of the commodity they produced, cotton, much as the price of enclosed land in England had become tied to rental income centuries earlier. Enslaved human beings, in other words, were increasingly seen, and treated, less as family property than as liquid financial investments, a transformation that contributed to the unleashing of a giant internal slave market. Priced according to their expected future wealth generation, enslaved individuals were rented, insured, advertised, and mortgaged; they became “mobile, productive assets,” a “dream investment,” under the reign of “King Cotton.”
At the same time, the arguments for and against slavery were being cast increasingly in economic terms. A central thesis of Cook’s book is that over the nineteenth century, progress was increasingly judged not through “moral statistics” but through “capitalizing” ones. While “moral statistics” take the measure of individual welfare—through figures on, for instance, mental suffering, impoverishment or imprisonment, and disability or death—“capitalizing” statistics measure economic costs, such as the price in dollars of “lost productivity.” Reformers increasingly relied, Cook argues, on the latter to advocate for social change.
In the 1830s and 1840s, moral statistics were wielded both to defend and to attack the institution of slavery. For instance, a racist Massachusetts physician named Edward Jarvis, relying on numbers from the 1840 U.S. Census, wrote a paper in the precursor of today’s New England Journal of Medicine contending that “insanity” rates were higher among blacks in the North than in the South, which he attributed to the alleged benefits of slavery for the slave’s mental health. Jarvis later found large incongruities in the underlying data and published a correction warning that his numbers were bunk; the damage, however, could not be undone, and his argument spread like wildfire in the South as a defense of slavery.
In contrast, by the 1850s and 1860s, slavery was less often debated on moral than on economic grounds, as Cook shows. For instance, Thomas Kettell’s influential book Southern Wealth and Northern Profits, published in 1860, cited soaring slave prices as evidence that the Southern economy, far from being backwards, had tremendous wealth-generating potential. (These rising prices would also ensure, he contended, that slaves would be treated well.) Those who critiqued slavery similarly based their arguments on economic statistics. They noted, for instance, that per capita economic output was higher in the North ($141 annually in Massachusetts) than in the South ($41 annually in South Carolina). Slavery, they concluded, was economically irrational.
This change in thinking corresponded, Cook emphasizes, with the growing capitalization of American economic life. In the North, capitalization was driven by the railroads, and in the South, by the rise of cotton slavery. The Civil War, Cook writes, was therefore less a war between a capitalist North and a pre-capitalist or feudal South than a war between two capitalist polities, undergirded by different types of capital. It was only the latter, of course, that had literally capitalized humanity itself.
Capitalization accelerated after the Civil War, perhaps reaching something of an apogee in the Progressive Era—especially in public health thought. “There is a growing tendency in modern times,” Dr. Ellice M. Alger, an eye surgeon with an interest in preventive ophthalmology, announced in a 1911 speech, “to consider the individual a mere unit in a great industrial organization…. Society, therefore, has a direct interest in the health of each of its units, because ill health not only increases cost but lessens productivity.”
This quote (which is taken from Hace Sorel Tishler’s Self-Reliance and Social Security, 1870–1917) exemplifies the kind of thinking that undergirded the first campaign for a system of semi-universal health insurance in the United States, launched that same decade. The campaign was led by the American Association for Labor Legislation (AALL), a progressive organization that advocated for the “conservation of human resources.” This motto speaks to the triumph of the capitalization mindset, however well-meaning its advocates were. Social insurance, including health insurance, was not only a moral or medical project aimed at improving the quality and length of life, but also a way to improve the productivity of our precious human resources.
The economist Irving Fisher, who took over as the head of the AALL during the final stage of its doomed health insurance campaign, seemed to epitomize this way of thinking. As Cook explores in his final chapter, Fisher, like Petty, was an expert at pricing things: He produced estimates of the price of public healthcare and of the cost of tuberculosis. He also priced the average American baby, based on its capacity to produce wealth over a lifetime—after subtracting, of course, the costs of rearing.
As easy as it might be to lampoon Fisher, it’s unlikely he believed human life could be reduced to a dollar amount. The reality is more complicated: What Fisher essentially seemed to believe, Cook writes, is that even though life amounts to much more than capital, it was only by describing it as such that one could persuade American elites to go along with major social welfare initiatives, like compulsory health insurance. “If you wanted to help people in a corporate capitalist society,” Cook writes, “you would have to price them and their progress.” This position is analogous to that of the writers of an earlier era who opposed slavery by arguing that it was—at the end of the day—economically irrational.
It’s a type of argument that many of us—myself included—often make in the policy world to this day, and that we are all very used to hearing: It just makes economic sense. In September, in a New Yorker piece titled “The Cost of the Opioid Crisis,” Sheelah Kolhatkar argued that President Trump should tackle the opioid crisis not merely because of the lives lost, but because of its economic cost to the nation—citing the $78.5 billion figure with which I began this essay. “If Trump were running the U.S. government like a business,” she writes, “as he often claims to be doing, then he would have made tackling an inefficiency of such scale a priority.”
We are accustomed to thinking, like Fisher, that this is how change is wrought in the real world—by convincing policy elites that this or that policy is economically rational. But as the many examples in Cook’s book demonstrate, arguments from economic rationality can obscure as much as they reveal. For if capitalism meant the transformation of land and lives into units of wealth-producing capital, it also meant the transformation of sickness and death into a currency of wealth-reducing decapitalization. And this poses a question: Wealth for whom?