The America We Need
How to make the nation more just, less fragile — and more free.
From some of its darkest hours, the United States has emerged stronger and more resilient.
Between May and July 1862, even as Confederate victories in Virginia raised doubts about the future of the Union, Congress and President Abraham Lincoln kept their eyes on the horizon, enacting three landmark laws that shaped the nation’s next chapter: The Homestead Act allowed western settlers to claim 160 acres of public land apiece; the Morrill Act provided land grants for states to fund universities; and the Pacific Railway Act underwrote the transcontinental railroad.
Nearly 75 years later, in the depths of the Great Depression, with jobs in short supply and many Americans reduced to waiting in bread lines, President Franklin Roosevelt proved similarly farsighted. He concluded the best way to revive and sustain prosperity was not merely to pump money into the economy but to rewrite the rules of the marketplace. “Liberty,” Roosevelt said at the Democratic Party’s convention in 1936, “requires opportunity to make a living — a living decent according to the standard of the time, a living which gives man not only enough to live by, but something to live for.” His administration, working with Congress, enshrined the right of workers to bargain collectively, imposed strict rules and regulators on the financial industry, and created Social Security to provide pensions for the elderly and disabled.
The coronavirus pandemic has laid bare once again the incomplete nature of the American project — the great distance between the realities of life and death in the United States and the values enunciated in its founding documents.
Over the past half century, the fabric of American democracy has been stretched thin. The nation has countenanced debilitating decay in its public institutions and a concentration of economic power not seen since the 1920s. While many Americans live without financial security or opportunity, a relative handful of families holds much of the nation’s wealth. Over the past decade, the wealth of the top 1 percent of households has surpassed the combined wealth of the bottom 80 percent.
The present crisis has revealed the United States as a nation in which professional basketball players could be rapidly tested for the coronavirus but health care workers were turned away; in which the affluent could retreat to the safety of second homes, relying on workers who can’t take paid sick leave to deliver food; in which children in lower-income households struggle to connect to the digital classrooms where their school lessons are now supposed to be delivered.
It is a nation in which local officials issuing stay-at-home orders must reckon with the cruel irony that hundreds of thousands of Americans do not have homes. Lacking private places, they must sleep in public spaces. Las Vegas painted rectangles on an asphalt parking lot to remind homeless residents to sleep six feet apart — an act that might as well have been a grim piece of performance art titled “The Least We Can Do.”
It is a nation in which enduring racial inequalities, in wealth and in health, are reflected in the pandemic’s death toll. In Michigan, where the coronavirus hit early and hard, African-Americans make up just 14 percent of the state’s population but 40 percent of the dead. Jason Hargrove, who kept driving a Detroit city bus as the virus spread, posted a Facebook video on March 21 complaining about a female passenger who coughed without covering her mouth. He said he had to keep working, to care for his family. In the video, he told his wife he’d take off his clothes in the front hall when he got home and get right in the shower, so that she stayed safe. Less than two weeks later, he was dead.
The federal government is providing temporary aid to less fortunate Americans, and few have objected to those emergency measures. But already some politicians are asserting that the extraordinary nature of the crisis does not warrant permanent changes in the social contract.
This misapprehends both the nature of crises in general and the particulars of the present emergency. The magnitude of a crisis is determined not just by the impact of the precipitating events but also by the fragility of the system it attacks. Our society was especially vulnerable to this pandemic because so many Americans lack the essential liberty to protect their own lives and the lives of their families.
This nation was ailing long before the coronavirus reached its shores.
A great divide separates affluent Americans, who fully enjoy the benefits of life in the wealthiest nation on earth, from the growing portion of the population whose lives lack stability or any real prospect of betterment.
The hedge-fund billionaire Kenneth Griffin paid $238 million last year for a New York apartment overlooking Central Park. He plans to stay there when he happens to be in town. Meanwhile, 10.9 million American families can barely afford an apartment. They spend more than half of their incomes on rent, and so they scrimp on food and health care. And on any given night, half a million Americans are homeless.
For those at the bottom, moreover, the chances of rising are in decline. By the time they reached 30, more than 90 percent of Americans born in 1940 were earning more than their parents had earned at the same age. But among those born in 1980, only half were earning more than their parents by the age of 30.
The erosion of the American dream is not a result of laziness or a talent drought. Rather, opportunity has slipped away. The economic ladder is harder to climb; real incomes have stagnated for decades even as the costs of housing, education and health care have increased. Many lower-income Americans are born into polluted, impoverished neighborhoods, with no decent jobs to be found.
“By 40, my parents owned a house, had a kid — me — and were both doing well in their careers,” said Melanie Martin-Leff, who works in marketing in Philadelphia. “I’m freelancing, renting, partnerless and childless.”
The inequalities of wealth have become inequalities of health. A middle-aged American in the top fifth of the income distribution can expect to live about 13 years longer than a person of the same age in the bottom fifth — an advantage that has more than doubled since 1980.
These changes have become harder to reverse because the distribution of political power is also increasingly unequal. Our system of democracy is under strain as those with wealth increasingly shape the course of policymaking, acting from self-interest and perhaps also because it has become harder for them to imagine life on the other side of the divide or to design policy in the common interest.
The wealthy are particularly successful in blocking changes they don’t like. The political scientists Martin Gilens of Princeton and Benjamin Page of Northwestern have calculated that between 1981 and 2002, policies supported by at least 80 percent of affluent voters passed into law about 45 percent of the time, while policies opposed by at least 80 percent of those voters passed into law just 18 percent of the time. Importantly, the views of poor and middle-class voters had little influence.
The fragility of our society and government is the product of deliberate decisions. The modern welfare state was constructed in three great waves of reform. These policies embodied a broad and muscular conception of liberty: that government should provide all Americans with the freedom that comes from a stable and prosperous life.
“We have come to a clear realization of the fact that true individual freedom cannot exist without economic security and independence,” Roosevelt told the nation in 1944.
The goal, of course, was never realized in full, but since the late 1960s, the federal government has largely abandoned the attempt. The defining trend in American public policy has been to diminish government’s role as a guarantor of personal liberty.
Advocates of a minimalist conception of government claim they too are defenders of liberty. But theirs is a narrow and negative definition of freedom: the freedom from civic duty, from mutual obligation, from taxation. This impoverished view of freedom has in practice protected wealth and privilege. It has perpetuated the nation’s defining racial inequalities and kept the poor trapped in poverty, and their children, and their children’s children.
One of the most important aspects of this retreat was the government’s role in constructing a new residential landscape of economically and racially segregated communities. The government built highways that carried white families to new suburban neighborhoods where minorities often were not allowed to live; it provided mortgage loans that minorities were not allowed to obtain; and even after explicit discrimination was declared illegal, single-family zoning laws continued to exclude low-income families, particularly minorities.
Policymakers tied funding for public services to the prosperity of the new communities, and the Supreme Court blessed the practice in a 1973 ruling, San Antonio Independent School District v. Rodriguez, that allowed differences in school funding based on differences in local property values. The effect was to substitute economic segregation for explicitly racial segregation.
The government similarly enabled growing divisions in the workplace. As the economy shifted from manufacturing to services, corporations — with the help of Congress and local lawmakers — successfully resisted the unionization of new jobs. And the government declined to replace organized labor as the protector of workers in burgeoning sectors like retail and health care.
Companies were not required to provide employees with basic benefits like paid leave, and they were given free rein to claim that many of their full-time workers were actually contractors. The purchasing power of the federal minimum wage has been falling since 1968.
A shift in corporate behavior also harmed workers. Many business leaders rallied around a narrow conception of corporate responsibility, arguing the sole obligation of a corporation was to maximize shareholder returns. Policymakers backed the shift, notably by writing that narrow definition into the laws of Delaware, where many large companies maintain official homes.
The results are clear enough: Executive pay has skyrocketed, and shareholders have enjoyed rising stock prices, at least until recently, while most workers are falling behind. If individual income had kept pace with overall economic growth since 1970, Americans in the bottom 90 percent of the income distribution would be making an extra $12,000 per year, on average. In effect, the extreme increase in inequality means every worker in the bottom 90 percent of the income distribution is sending an annual check for $12,000 to a worker in the top 10 percent.
The idealization of individual action in an open marketplace has had its mirror image in the denigration of collective action through government.
The United States does not guarantee the availability of affordable housing to its citizens, as do most developed nations. It does not guarantee reliable access to health care, as does virtually every other developed nation. The cost of a college education in the United States is among the highest in the developed world. And beyond the threadbare nature of the American safety net, the government has pulled back from investment in infrastructure, education and basic scientific research, the building blocks of future prosperity. It is not surprising many Americans have lost confidence in the government as a vehicle for achieving the things that we cannot achieve alone.
The nation’s hierarchies are starkly visible during periods of crisis. The coronavirus pandemic has necessitated extraordinary sacrifices, but the distribution of those sacrifices is profoundly unequal.
The wealthy and famous and politically powerful have laid first claim to the available lifeboats: Senators Richard Burr of North Carolina and Kelly Loeffler of Georgia secured their own fortunes by selling off stock holdings as the virus spread in January and February, even as they reassured the nation that everything was going to be OK; the billionaire David Geffen posted on Instagram that he planned to ride out the crisis on his 454-foot yacht, Rising Sun, adding, “I’m hoping everybody is staying safe”; large corporations lobbied successfully against a proposal to provide paid sick leave to every American worker, pleading they couldn’t afford the cost.
Less affluent Americans will bear the brunt in health and wealth. Already they suffer disproportionately from the diseases of labor like black lung and mesothelioma; the diseases of poverty like obesity and diabetes; and the opioid epidemic that has raged in the communities where opportunity is in short supply. By one estimate, these patterns of poor health mean those at the bottom of the income spectrum are twice as likely to die from Covid-19. Many are losing their jobs; those still working generally cannot do so from the safety of the living room couch. They risk death to obtain the necessities of life.
Children, relatively safe from the coronavirus itself, are in particular danger from the economic fallout. Public schools are one of the great equalizing forces in American life; the shift to online learning means existing inequalities matter more. Millions of children lack reliable internet access. The principal of a high school in Phoenix found three students huddled under a blanket outside the building on a rainy day, using the school’s wireless network to complete their required schoolwork because they could not log in from their homes.
And research shows that the impact of economic trauma in childhood is long-lasting. The children of parents who lose work, for example, end up earning less over their own lifetimes.
The crisis has also exposed the federal government’s lack of resources, competence and ambition. The government failed to contain the virus through a program of testing and targeted quarantines; it is struggling to provide states with the medical equipment necessary to help those who fall ill; and instead of moving more aggressively to contain the economic damage, the federal government has allowed companies to lay off millions of workers. The unemployment rate in the United States has most likely already reached the highest level since the Great Depression.
A major reason for the faltering response is a chimerical expectation that markets will perform the work of government. The White House has for the most part refused to mandate or coordinate production of critical medical supplies. Indeed, the federal government has bid against states for available supplies and encouraged states to bid against one another. It is an embrace of markets so extreme it might seem farcical if it weren’t resulting in unnecessary deaths.
Corporate action and philanthropy certainly have their places, particularly in the short term, given President Trump’s feckless leadership and the tattered condition of the government he heads. But they are poor substitutes for effective stewardship by public institutions. What America needs is a just and activist government. The nature of democracy is that we are together responsible for saving ourselves.
Americans need to recover the optimism that has so often lighted the path forward.
The crucible of a crisis provides the opportunity to forge a better society, but the crisis itself does not do the work. Crises expose problems, but they do not supply alternatives, let alone political will. Change requires ideas and leadership. Nations often pass through the same kinds of crises repeatedly, either unable to imagine a different path or unwilling to walk it.
The worst crises often occur under weak leadership; that is a big part of how an initial problem spirals out of control. Americans had every reason to despair of President James Buchanan’s ability to lead the nation through a civil war, or of President Herbert Hoover’s ability to lead the nation out of the Great Depression. Now, as then, the country is burdened with weak leadership — and it has a chance to replace that leadership, as it did in 1860 and 1932.
There is also a need for new ideas, and the revival of older ideas, about what the government owes the nation’s citizens, what corporations owe employees and what we owe one another.
The multi-trillion-dollar scale of the government’s response to the crisis, for all its flaws and inadequacies, offers a powerful reminder that there is no replacement for an activist state. The political scientist Francis Fukuyama has observed that the nations best weathering the coronavirus pandemic are those like Singapore and Germany, where there is broad trust in government — and where the state merits that confidence. A critical part of America’s post-crisis rebuilding project is to restore the effectiveness of the government and to rebuild public confidence in it.
A major investment in public health would be a fitting place to start.
The larger project, however, is to increase the resilience of American society. Generations of federal policymakers have prioritized the pursuit of economic growth with scant regard for stability or distribution. This moment demands a restoration of the national commitment to a richer conception of freedom: economic security and equality of opportunity. That’s why Times Opinion is publishing this project across the next two months, to envision how to turn the America we have into the America we need.
The purpose of the federal government, Lincoln wrote to Congress on July 4, 1861, was “to elevate the condition of men, to lift artificial burdens from all shoulders, and to give everyone an unfettered start and a fair chance in the race of life.” The Homestead Act in particular was a concrete step in that direction: 10 percent of all the land in the United States was ultimately distributed in 160-acre chunks. But Lincoln’s conception of “everyone” did not include everyone: The Homestead Act rested on the expropriation of Native American lands.
Roosevelt shared Lincoln’s vision of government, but industry had replaced agriculture as the wellspring of prosperity, so he focused on ensuring a more equitable distribution of the nation’s manufacturing output — although African-Americans were treated as second-class citizens in many New Deal programs.
The United States today is in need of new measures to stake all Americans in the modern economy.
To give Americans a fair chance in the race of life, the government must begin at birth. The United States must reclaim the core truth of the Supreme Court’s seminal decision in Brown v. Board of Education: So long as Americans are segregated, their opportunities can never be equal. One of the most important steps the United States can take to ensure all children have the opportunity to thrive is to bulldoze enduring patterns of racial and economic segregation. Zoning laws that limit residential development in the very areas where good jobs are most abundant are among the most important structural obstacles to a more integrated nation.
Over the course of this project, we will examine other ways to equalize opportunity early in life, and also to restore a healthier balance of power between employers and workers.
One of the clearest lessons of the pandemic is that many employers feel shockingly little obligation to protect the health and welfare of their workers, and workers have been left with little means to organize or resist. Amazon, one of the nation’s largest employers, fired a worker protesting safety conditions at the company’s warehouses on the Orwellian grounds that his protest was itself a safety hazard. A manager at a Uline call center instructed employees not to tell colleagues if they weren’t feeling well because it might cause “unnecessary panic.”
And the nation’s tattered social safety net is in desperate need of reinforcement. Americans need reliable access to health care. Americans need affordable options for child care and for the care of older members of their families, a growing crisis in an aging nation. No one, and especially not children, should ever go hungry. Everyone deserves a place to call home.
Just a little more than a decade ago, Americans lived through a very different kind of crisis — a financial collapse — that exposed similar fragilities in American society. The government’s response was inadequate. The recovery was still underway when the coronavirus arrived, and partly because recovery had come so slowly, America’s political leaders had failed to take advantage of the intervening years to prepare for the inevitability of fresh tests.
The nation cannot afford a repeat performance, particularly as other challenges to our society already loom, most of all the imperative to slow global warming.
The United States has a chance to emerge from this latest crisis as a stronger nation, more just, more free and more resilient. We must seize the opportunity.
https://www.nytimes.com/2020/04/09/opinion/coronavirus-inequality-america.html?action=click&module=Opinion&pgtype=Homepage
The U.S. Approach to Public Health: Neglect, Panic, Repeat
Time to give new life to an old idea: A strong public health system is the best guarantor of good health.
by Jeneen Interlandi - NYT - April 9, 2020
A once-in-a-century public health crisis is unfolding, and the richest country in the world is struggling to mount an effective response. Hospitals don’t have enough gowns or masks to protect doctors and nurses, nor enough intensive care beds to treat the surge of patients. Laboratories don’t have the equipment to diagnose cases quickly or in bulk, and state and local health departments across the country don’t have the manpower to track the disease’s spread. Perhaps worst of all, urgent messages about the importance of social distancing and the need for temporary shutdowns have been muddied by politics.
Nearly all of these problems might have been averted by a strong, national public health system, but in America, no such system exists.
It’s a state of affairs that belies the country’s long public health tradition. Before the turn of the previous century, when yellow fever, tuberculosis and other plagues ravaged the country’s largest cities at regular intervals, public health was generally accepted as a key component of the social contract. Even before scientists identified the microbes that cause such diseases, governments and individuals understood that a combination of leadership, planning and cooperation was needed to keep them at bay. Some of the nation’s oldest public health departments — in Boston, New York and Baltimore — were built on that premise.
By pushing infectious disease outbreaks to the margins, those health departments helped usher in what scientists refer to as the epidemiological transition: the remarkable leveling off of preventable deaths among children and working-age adults. That leveling off continued in the second half of the 20th century, as new federal laws ensured the protection of food, air and water from contamination, and national campaigns brought the scourges of nicotine addiction and sexually transmitted infections under control.
So great was the effect of these public health measures that by the time the century turned again, life expectancy in the United States had risen sharply, from less than 50 years to nearly 80. “Public health is the best bang for our collective buck,” Tom Frieden, a former director of the Centers for Disease Control and Prevention, told me. “It has consistently saved the most lives for the least amount of money.”
One would never guess as much today. Across the same century that saw so many public health victories, public health itself fell victim to larger forces.
“It was like a great forgetting took place,” Wendy Parmet, a public health law scholar at Northeastern University, told me. “As the memory of epidemics faded, individual rights became much more important than collective responsibility.” And as medicine grew more sophisticated, health began to be seen as purely a personal matter.
Health care spending grew by 52 percent in the past decade, while the budgets of local health departments shrank by as much as 24 percent, according to a 2019 report from the public health nonprofit Trust for America’s Health, and the C.D.C.’s budget remained flat. Today, public health claims just 3 cents of every health dollar spent in the country.
The results of that imbalance were apparent long before Covid-19 began its march across the globe. Local health departments eliminated more than 50,000 jobs — epidemiologists, laboratory technicians, public information specialists — between 2008 and 2017. That’s nearly 23 percent of their total work force.
Crucial programs — including ones that provide vaccinations, test for sexually transmitted infections and monitor local food and water supplies — have been trimmed or eliminated. As a result, several old public health foes have returned: Measles and syphilis are both resurgent, as is nicotine consumption among teenagers and the contamination of food and water with bacteria and lead.
Each of these crises has received its own flurry of outrage, but none of them have been enough to break what experts say is the nation’s default public health strategy: neglect, panic, repeat.
“We ignore the public health sector unless there’s a major catastrophe,” said Scott Becker, the head of the Association for Public Health Laboratories. “Then we throw a pile of money at the problem. Then we rescind that money as soon as the crisis abates.”
There is a better way.
Imagine a public health system in which all public health entities used the same cutting-edge technology in their laboratories and on their computers. This would include equipment that enables rapid diagnostic tests to be developed and deployed quickly in a crisis; web portals where data on disease spread, hospital capacity and high-risk communities can be logged and shared across the country; and user-friendly apps that enable private citizens to facilitate the efforts of epidemiologists.
The technology to create such a system already exists — it only has to be adapted and implemented.
That, of course, requires investment. In 2019, a consortium of public health organizations lobbied the federal government for $1 billion to help the nation’s public health system modernize its data infrastructure. They were granted $50 million. In the wake of Covid-19, that sum has been increased to $500 million. But much more is needed. There is a $5.4 billion gap between current public health spending and the cost of modernizing public health infrastructure, according to the Trust for America’s Health report.
However much money is ultimately allotted for this work, it will have to be deployed equitably, in high-income and low-income communities alike. Health departments everywhere are struggling to contain the Covid-19 pandemic, but that struggle is particularly acute in marginalized communities, where health is already fragile, public health departments are sometimes nonexistent and mistrust of officials tends to run high.
Early data from several states indicates that Hispanics and African-Americans already account for a disproportionately high number of coronavirus-related deaths, a finding that is both unsurprising and unacceptable. A better system would direct federal aid to where it’s needed most — and would work to eradicate legacies of injustice and abuse that mar the history of public health victories.
Of course, none of these changes will help if the underlying system is not grounded in and guided by rigorous, apolitical science. Public health agencies were created precisely because the decisions required to stop a pandemic in its tracks, or protect the nation’s food supply, or keep measles at bay were considered too difficult and too important to be swayed by politics.
The vision for public health reform is not especially complicated or expensive. But it is bold — and it will require boldness from every corner of the country.
Politicians will have to incorporate public health into their priorities; they might start by making “Public Health for All” as urgent a rallying cry as any concerning health insurance. Universal access to health care is a human right, but it will not protect us from the next pandemic — or clean water crisis, for that matter.
Captains of industry will have to commit acts of genuine altruism, because not all of the innovations needed to build a modern public health system will be clearly lucrative. If you’re making a fortune out of cornering the market on ventilators, for example, designing a cheaper, easier-to-make version of your product might sound like bad business. Likewise, developing vaccines and antibiotics may seem like a risky investment compared with the prospect of another million-dollar cancer drug. But when the next pandemic threat arrives, millions of lives — not to mention the entire global economy — may depend on exactly these things.
Mind-sets will have to change, too. A society that prizes individual liberty above all else is bound to treat health as a private matter. But if Covid-19 has taught us anything, it’s that our health and safety depend on collective action. That’s what public health is all about.
https://www.nytimes.com/2020/04/09/opinion/coronavirus-public-health-system-us.html?action=click&module=Top%20Stories&pgtype=Homepage
Medicare for Each of Us in the Age of the Coronavirus
The U.S. public—and increasingly the business community—is becoming acutely aware of the rising costs and inadequacies of our current for-profit system, particularly as the coronavirus epidemic unfolds. There is no other choice but Medicare for All.
by Peter Arno and Philip Caper - Common Dreams - April 3, 2020
Over the past two weeks, the explosive growth of the coronavirus pandemic has forced nearly 10 million Americans to file for unemployment benefits. Along with their jobs, many have lost their health insurance, if they had any to begin with. Aside from possibly spelling disaster for these newly unemployed workers and their families, this situation puts both the public health and economic wellbeing of our country at great risk. A clearer rationale for universal, affordable, lifetime health coverage as exemplified under a Medicare For All framework would be hard to find.
In this article we outline the need for a universal health plan, its historical context, and the obstacles raised by the medical-industrial complex that must be overcome.
There is a large elephant in the room in the national discussion of Medicare for All: the transformation of the US health care system’s core mission from the prevention, diagnosis, and treatment of illness—and the promotion of healing—to an approach dominated by large, publicly traded corporate entities dedicated to growing profitability and share price, that is, the business of medicine.
The problem is not that these corporate entities are doing something they shouldn’t. They are simply doing too much of what they were created to do—generate wealth for their owners. Unlike any other wealthy country, we let them do it. The dilemma of the US health care system is due not to a failure of capitalism or corporatism per se, but a failure to implement a public policy that adequately constrains their excesses.
"How is it that we spend more on health care than any other nation, yet have arrived at such a sorry state of affairs?"
Since the late 1970s, US public policy regarding health care has trended toward an increasing dependence on for-profit corporations and their accompanying reliance on the tools of the marketplace—such as competition, consolidation, marketing, and consumer choice—to expand access and assure quality in the provision of medical care.
This commercialized, commodified, and corporatized model is driving the US public’s demand for fundamental reform and has elevated the issue of health care to the top of the political agenda in the current presidential election campaign.
Costs have risen relentlessly, and the quality of and access to care for many Americans have deteriorated. The cultural changes accompanying these trends have affected every segment of the US health care system, including those that remain nominally not-for-profit. Excessive focus on health care as a business has had a destructive effect on both patients and caregivers, leading to increasing difficulties for many patients in accessing care and to anger, frustration, and burnout for many caregivers, especially those attempting to provide critical primary care.
As a result, the ranks of primary care providers have eroded, and that erosion continues. One of the major reasons for burnout in this group is the clash between its members’ professional ethics (put the patient first and “first do no harm”) and the profit-oriented demands of their corporate employers. Applying Band-Aids can’t cure the underlying causes of disease in medicine or public policy. Ignoring the underlying pathology in public policy, as in clinical medicine, is destined to fail.
Many of the symptoms of our dysfunctional health care system are not in dispute:
"The theology of the market and the strongly held—but mistaken—belief that the problems of US health care can be solved if only the market could be perfected have effectively obstructed the development of a rational, efficient, and humane national health care policy."The answer is that only in the United States has corporatism engulfed so much of medical care and come so close to dominating the doctor-patient relationship. Publicly traded, profit-driven entities—under constant pressure from Wall Street—control the financing and delivery of medical care in the US to an extent seen nowhere else in the world. For instance, seven investor-owned publicly traded health insurers now control almost a trillion dollars ($913 billion) of total national health care spending and cover half the US population. In 2019, their revenue increased by 31 percent, while their profits grew by 66 percent.
The corporatization of medical care may be the single most
distinguishing characteristic of the modern US health care system and
the one that has had the most profound impact on it since the early
1980s. The theology of the market and the strongly held—but
mistaken—belief that the problems of US health care can be solved if
only the market could be perfected have effectively obstructed the
development of a rational, efficient, and humane national health care
policy.
There are three main reasons to pursue a public policy that embraces genuine health care reform:
"The real struggle for a universal single-payer system in the US is not technical or economic but almost entirely political."The various “option” reform proposals will not simplify our confusing health care system nor will they lead to universal coverage. None have adequate means to restrain health care costs. So why go down this road? Is it too difficult for the US to guarantee everyone access to affordable care when every other developed country in the world has done so?
The stated reason put forth in favor of these mixed option approaches is that Americans want “choice.” But choice of what? We know with certainty from former insurance company executives such as Wendell Potter that the false “choice” meme polls well with the US public and was used to undermine the Clinton reform efforts more than 25 years ago. It is being widely used today to manipulate public opinion.
But choice in our current system is largely an illusion. In 2019, 67.8 million workers across the country separated from their job at some point during the year—either through layoffs, terminations, or switching jobs. This labor turnover data leaves little doubt that people with employer-sponsored insurance are losing their insurance constantly, as are their spouses and children. And even for those who stay at the same job, insurance coverage often changes. In 2019, more than half of all firms offering health benefits reported shopping for a new health plan and, among those, nearly 20 percent actually changed insurance carriers. Trading off choice of doctors or hospitals for choice of insurance companies is a bad bargain.
The other major objection to a universal single-payer program is cost. Yet, public financing for health care is not a matter of raising new money for health care but of reducing total health care outlays and distributing payments more equitably and efficiently. Nearly every credible study concludes that a single-payer universal framework, with all its increased benefits, would be less costly than the status quo, more effective in restraining future cost increases, and more popular with the public—as 50 years of experience with Medicare has demonstrated.
The status quo generates hundreds of billions of dollars in surplus and profits to private stakeholders, who need only spend a small portion (millions of dollars) to influence legislators, manipulate public opinion, distort the facts, and obfuscate the issues with multiple competing reform efforts.
The US public and increasingly the business community are becoming acutely aware of the rising costs and inadequacies of our current system, particularly as the current epidemic unfolds. It is the growing social movement, which rejects the false and misleading narratives, that will lead us to a universal single-payer system—truly the most effective way to reform our health care system for the benefit of the American people.
by Peter Arno and Philip Caper - Common Dreams - April 3, 2020
Over the past two weeks, the explosive growth of the coronavirus pandemic has forced nearly 10 million Americans to file for unemployment benefits. Along with their jobs, many have lost their health insurance, if they had any to begin with. Aside from possibly spelling disaster for these newly unemployed workers and their families, this situation puts both the public health and economic wellbeing of our country at great risk. A clearer rationale for universal, affordable, lifetime health coverage as exemplified under a Medicare For All framework would be hard to find.
In this article we outline the need for a universal health plan, its historical context, and the obstacles raised by the medical-industrial complex that must be overcome.
There is a large elephant in the room in the national discussion of Medicare for All: the transformation of the US health care system’s core mission from the prevention, diagnosis, and treatment of illness—and the promotion of healing—to an approach dominated by large, publicly traded corporate entities dedicated to growing profitability and share price, that is, the business of medicine.
The problem is not that these corporate entities are doing something they shouldn’t. They are simply doing too much of what they were created to do—generate wealth for their owners. Unlike any other wealthy country, we let them do it. The dilemma of the US health care system is due not to a failure of capitalism or corporatism per se, but a failure to implement a public policy that adequately constrains their excesses.
"How is it that we spend more on health care than any other nation, yet have arrived at such a sorry state of affairs?"
Since the late 1970s, US public policy regarding health care has trended toward an increasing dependence on for-profit corporations and their accompanying reliance on the tools of the marketplace—such as competition, consolidation, marketing, and consumer choice—to expand access and assure quality in the provision of medical care.
This commercialized, commodified, and corporatized model is driving the US public’s demand for fundamental reform and has elevated the issue of health care to the top of the political agenda in the current presidential election campaign.
Costs have risen relentlessly, and the quality of and access to care for many Americans has deteriorated. The cultural changes accompanying these trends have affected every segment of the US health care system, including those that remain nominally not-for-profit. Excessive focus on health care as a business has had a destructive effect on both patients and caregivers, leading to increasing difficulties for many patients in accessing care and to anger, frustration, and burnout for many caregivers, especially those attempting to provide critical primary care.
As a result, the ranks of primary care providers have eroded, and that erosion continues. One of the major reasons for burnout in this group is the clash between its members’ professional ethics (put the patient first and “first do no harm”) and the profit-oriented demands of their corporate employers. Applying Band-Aids can’t cure the underlying causes of disease in medicine or public policy. Ignoring the underlying pathology in public policy, as in clinical medicine, is destined to fail.
Many of the symptoms of our dysfunctional health care system are not in dispute:
- We pay more than twice as much per person on total health care spending and on prescription drugs in comparison to other developed countries. This spending totals nearly 18 percent of our economy.
- Between 2008 and 2018, premiums for employer-sponsored insurance plans increased 55 percent, twice as fast as workers’ earnings (26 percent). Over the same time period, the average health insurance deductible for covered workers increased by 212 percent.
- An average employer-sponsored family health insurance policy now exceeds $28,000 per year, with employers paying about $16,000 and employees paying about $12,000.
- Almost half (45 percent) of US adults ages 19 to 64, or more than 88 million people, were inadequately insured over the past year (either they were uninsured, had a gap in coverage, or were underinsured; that is, they had insurance all year but their out-of-pocket costs were so high that they frequently did not receive the care they needed).
- Compared to other developed countries, the US ranks near the bottom on a variety of health indicators including infant mortality, life expectancy, and preventable mortality.
"The theology of the market and the strongly held—but mistaken—belief that the problems of US health care can be solved if only the market could be perfected have effectively obstructed the development of a rational, efficient, and humane national health care policy."The answer is that only in the United States has corporatism engulfed so much of medical care and come so close to dominating the doctor-patient relationship. Publicly traded, profit-driven entities—under constant pressure from Wall Street—control the financing and delivery of medical care in the US to an extent seen nowhere else in the world. For instance, seven investor-owned publicly traded health insurers now control almost a trillion dollars ($913 billion) of total national health care spending and cover half the US population. In 2019, their revenue increased by 31 percent, while their profits grew by 66 percent.
There are three main reasons to pursue a public policy that embraces genuine health care reform:
- Saving lives: To simplify our complex and confusing health care system while providing universal affordable health care coverage;
- Affordability: To rein in the relentless rise in health care costs that are cannibalizing private and public budgets; and
- Improving quality: To eliminate profitability and share price as the dominant and all-consuming mission of the entities that provide health care services and products when that mission influences clinical decision making. Profitability should be the servant of any health care system’s mission, not its master as seems to be increasingly the case in the US.
What Is The Best Approach To Reform?
It is not an exaggeration to say that no reform other than publicly financed, single-payer universal health care will solve the problems of our health care system. This is true whether we are talking about a public option, a Medicare option, Medicare buy-in, Medicare extra, or any other half-measure. The main reason is the savings that are inherent only in a truly universal single-payer plan. Specifically, the administrative and bureaucratic savings gained by eliminating private insurers are the largest potential source of savings in a universal single-payer framework, yet all the “option” reforms listed above leave largely intact the tangle of wasteful, inefficient, and costly private commercial health insurers. The second largest source of savings comes through reducing the cost of prescription drugs by using the negotiating leverage of the federal government to bring down prices, as is done in most other developed countries. The ability, will, and policy tools (such as global budgeting) to restrain these and other costs in a single-payer framework are the key to reining in the relentless rise in health care expenditures and providing universal coverage.
The various “option” reform proposals will not simplify our confusing health care system, nor will they lead to universal coverage. None have adequate means to restrain health care costs. So why go down this road? Is it too difficult for the US to guarantee everyone access to affordable care when every other developed country in the world has done so?
The stated reason put forth in favor of these mixed option approaches is that Americans want “choice.” But choice of what? We know with certainty from former insurance company executives such as Wendell Potter that the false “choice” meme polls well with the US public and was used to undermine the Clinton reform efforts more than 25 years ago. It is being widely used today to manipulate public opinion.
But choice in our current system is largely an illusion. In 2019, 67.8 million workers across the country separated from their job at some point during the year—either through layoffs, terminations, or switching jobs. This labor turnover data leaves little doubt that people with employer-sponsored insurance are losing their insurance constantly, as are their spouses and children. And even for those who stay at the same job, insurance coverage often changes. In 2019, more than half of all firms offering health benefits reported shopping for a new health plan and, among those, nearly 20 percent actually changed insurance carriers. Trading off choice of doctors or hospitals for choice of insurance companies is a bad bargain.
The other major objection to a universal single-payer program is cost. Yet, public financing for health care is not a matter of raising new money for health care but of reducing total health care outlays and distributing payments more equitably and efficiently. Nearly every credible study concludes that a single-payer universal framework, with all its increased benefits, would be less costly than the status quo, more effective in restraining future cost increases, and more popular with the public—as 50 years of experience with Medicare has demonstrated.
The status quo generates hundreds of billions of dollars in surplus and profits to private stakeholders, who need only spend a small portion (millions of dollars) to influence legislators, manipulate public opinion, distort the facts, and obfuscate the issues with multiple competing reform efforts.
Conclusion
The real struggle for a universal single-payer system in the US is not technical or economic but almost entirely political. Retaining the status quo (for example, the Affordable Care Act) is the least disruptive course for the existing medical-industrial complex, and therefore the politically easiest route. Unfortunately, the status quo is disruptive to the lives of most Americans and the least effective route in attacking the underlying pathology of the US health care system—corporatism run amok. Following that route will do little more than kick the can down the road, which will require repeatedly revisiting the deficiencies in our health care system outlined above until we get it right.
The US public and increasingly the business community are becoming acutely aware of the rising costs and inadequacies of our current system, particularly as the current epidemic unfolds. It is the growing social movement, which rejects the false and misleading narratives, that will lead us to a universal single-payer system—truly the most effective way to reform our health care system for the benefit of the American people.
Our work is licensed under a Creative Commons Attribution-Share Alike 3.0 License. Feel free to republish and share widely.
https://www.commondreams.org/views/2020/04/03/medicare-each-us-age-coronavirus?cd-origin=rss&utm_term=AO&utm_campaign=Weekly%20Newsletter&utm_content=email&utm_source=Weekly%20Newsletter&utm_medium=Email
Why Private Equity Is Furious Over a Paper in a Dermatology Journal
by Katie Hafner - NYT - October 26, 2018
Early this month, a respected medical journal published a research paper on its website that analyzed the effects of a business trend roiling the field of dermatology: the rapid entrance of private equity firms into the specialty by buying and running practices around the country.
Eight days later, after an outcry from private equity executives and dermatologists associated with private equity firms, the editor of the publication removed the paper from the site. No reason was given.
Furor over the publication and subsequent removal of the article has deepened a rift in the field over what some see as the “corporatization” of dermatology and other areas of medicine.
The paper was published on the website of the Journal of the American Academy of Dermatology on Oct. 5, posted along with numerous other articles labeled “In Press Accepted Manuscript.” Most articles with this designation eventually appear in a print edition of the journal; some remain online.
Dr. Dirk Elston, the journal’s editor, said in an email that he replaced the article with a notice of “temporary removal” after receiving multiple calls and emails “expressing concerns about the accuracy of a few parts” of the article.
On Wednesday, nearly two weeks after removing the article, Dr. Elston told the authors they had a choice: They could correct “factual errors” or retract the paper.
The authors maintain that the article does not contain any factual errors and that several of the corrections requested had to do with protecting the reputation of the specialty and the leaders of the American Academy of Dermatology, the association that publishes the journal. Later on Wednesday, they submitted some revisions.
The article had gone through the standard editorial process of academic journals, undergoing multiple revisions based on feedback from peer-reviewers selected by the journal, before being accepted for publication. It presents data to support a conclusion that private equity firms acquire “outlier” practices — that is, practices that perform an unusually high number of well-reimbursed procedures and bill high amounts to Medicare.
“It was interesting when we ran the numbers and we were counting how many practices with billing outliers were being acquired by private equity,” said Dr. Joseph Francis, a dermatologist in Florida who is a co-author on the paper. “With every revision of the paper, that number kept increasing. So it didn’t seem like an anomaly.”
He added, “It wasn’t clear whether these investors realized that the high billing might point to anything irregular. They might have just seen that this was a practice with booming business.”
The paper also notes that many practices backed by private equity firms have opened or acquired labs to process pathology specimens, potentially another source of profit.
Among those who objected to the article was Dr. George Hruza, the incoming president of the American Academy of Dermatology. Dr. Hruza, whose one-year term as president begins in March, is a dermatologist in Chesterfield, Mo. In 2016 he sold his own dermatology practice to United Skin Specialists, a firm that manages dermatology practices and is backed by private equity. He currently serves on the board of directors of United Skin Specialists, which he said is an unpaid position.
Dr. Hruza is not named in the journal article, but he said he is easily identified by the authors’ reference to his pending presidency of the academy, and to United Skin Specialists.
In an interview, Dr. Hruza said he did not ask that the paper be taken down. He did, however, confirm that he expressed his concerns to Dr. Elston, the editor, after it was posted. Two days later, Dr. Elston removed the paper. A flurry of intense conversations ensued among Dr. Elston; Dr. Hruza; the current academy president, Dr. Suzanne Olbricht; a lawyer for the dermatology academy; and the paper’s authors.
Specifically, Dr. Hruza said, he objected to one of the paper’s conclusions: Influential dermatology leaders are being recruited to work for and promote dermatology practices backed by private-equity firms. “Implying motivation is a stretch,” he said. Dr. Hruza has asked for specific wording changes to that section of the paper.
Among the changes the editor of the journal asked the authors to make was the removal of identifiable references to influential dermatologists, including Dr. Hruza.
Interference with a scientific paper from within the ranks of a medical society is highly unusual, say experts in the medical publishing field. The sudden disappearance of the paper has others in the medical publishing world scratching their heads.
“The process of science requires that people be allowed to publish their data as long as it has been reviewed by peers who find it accurate in that moment,” said Dr. Mitchell Katz, president and chief executive of NYC Health & Hospitals and Deputy Editor of the journal JAMA Internal Medicine.
As for corrections, Dr. Katz added, “usually you would post a correct copy rather than removing a paper for days on end.”
Dr. Elston said others who objected to the article included Dr. Darrell Rigel, a prominent dermatologist in New York who is a former president of the academy and whose practice is now owned by Schweiger Dermatology, a private equity-backed practice. Dr. Rigel did not respond to requests for comment.
Dermatologists account for one percent of physicians in the United States, but 15 percent of recent private equity acquisitions of medical practices have involved dermatology practices. Other specialties that have attracted private equity investment include orthopedics, radiology, cardiology, urgent care, anesthesiology and ophthalmology.
One of the paper’s five supplemental tables lists 32 dermatology practices in the United States that have been formed or acquired by private equity firms. Many are large practices with dozens of physicians.
Shortly after the article disappeared from the journal’s website, a copy was posted to a Facebook group composed of some 3,500 dermatologists. Several participants in the Facebook group praised the article for shining a light on the effects of private equity on their specialty, and were outraged that the article had been withdrawn.
“If this were an article on psoriasis no one would be questioning it, but this was going to ruffle some feathers,” said Dr. Curtis Asbury, a dermatologist in Selbyville, Del., and an active participant in the Facebook group.
The lead author of the paper was Dr. Sailesh Konda, an assistant clinical professor of dermatology at the University of Florida College of Medicine. Dr. Konda, 34, said he first grew interested in the topic when several of his trainees went to work for private equity-backed practices and told him of clinical environments that emphasized profits at the expense of patient care. He said that over the past year he had given 16 talks around the country to medical residents and dermatology societies about private equity. Dr. Francis has joined him for some of the sessions.
Dr. Konda said he and his co-authors spent a year working on the paper. After the paper went out for review, he said, “we received constructive feedback.” Most of the comments, he said, were about maintaining a neutral tone.
“We strived to not use any polemic words, which could be interpreted as bias,” he said. “We decided to just deal with the facts, which would speak for themselves.”
This week a lawyer for Advanced Dermatology and Cosmetic Surgery, which is backed by private equity and is the largest dermatology practice in the United States, called the general counsel at the University of Florida, where two of the authors are employed, demanding specific changes to the paper. The general counsel for the university declined to comment. Dr. Matt Leavitt, the chief executive of Advanced Dermatology, did not respond to requests for a comment.
Dr. Konda says he plans to continue his research into private equity. “I am passionate about this topic,” he said. “I realize we live in a capitalist society and money is a driving force behind many decisions regardless of the industry. However, I believe there has to be a balance between profit and patient care.”
https://www.nytimes.com/2018/10/26/health/private-equity-dermatology.html?action=click&module=RelatedLinks&pgtype=Article
Editor's Note -
The preceding two posts remind one of the old medical bromide: "A healthy person is nothing but an under-diagnosed patient."
-SPC
Skin Cancers Rise, Along With Questionable Treatments
By Katie Hafner and
John Dalman had been in the waiting room at a Loxahatchee, Fla., dermatology clinic for less than 15 minutes when he turned to his wife and told her they needed to leave. Now.
“It was like a fight or flight impulse,” he said.
His face numbed for skin-cancer surgery, Mr. Dalman, 69, sat surrounded by a half-dozen other patients with bandages on their faces, scalps, necks, arms and legs. At a previous visit, a young physician assistant had taken 10 skin biopsies, which showed slow-growing, nonlethal cancerous lesions. Expecting to have the lesions simply scraped off at the next visit, he had instead been told he needed surgery on many of them, as well as a full course of radiation lasting many weeks.
The once sleepy field of dermatology is bustling these days, as baby boomers, who spent their youth largely unaware of the sun’s risk, hit old age. The number of skin cancer diagnoses in people over 65, along with corresponding biopsies and treatment, is soaring. But some in the specialty, as well as other medical experts, are beginning to question the necessity of aggressive screening and treatment, especially in frail, elderly patients, given that the majority of skin cancers are unlikely to be fatal.
“You can always do things,” said Dr. Charles A. Crecelius, a St. Louis geriatrician who has studied care of medically complex seniors. “But just because you can do it, does that mean you should do it?”
Mr. Dalman’s instinct to question his treatment plan was validated when he went to see a dermatologist in a different practice. The doctor dismissed radiation as unnecessary, removed many of the lesions with a scrape, applied small Band-Aids, and was finished in 30 minutes.
Dermatology — a specialty built not on flashy, leading edge medicine but on thousands of small, often banal procedures — has become increasingly lucrative in recent years. The annual dermatology services market in the United States, excluding cosmetic procedures, is nearly $11 billion and growing, according to IBISWorld, a market research firm. The business potential has attracted private equity firms, which are buying up dermatology practices around the country, and installing crews of lesser-trained practitioners — like the physician assistants who saw Mr. Dalman — to perform exams and procedures in even greater volume.
The vast majority of dermatologists care for patients with integrity and professionalism, and their work has played an essential role in the diagnosis of complex skin-related diseases, including melanoma, the most dangerous form of skin cancer, which is increasingly caught early.
But while melanoma is on the rise, it remains relatively uncommon. The incidence of basal and squamous cell carcinomas of the skin, which are rarely life-threatening, is 18 to 20 times higher than that of melanoma. Each year in the United States more than 5.4 million such cases are treated in more than 3.3 million people, a 250 percent rise since 1994.
The New York Times analyzed Medicare billing data for dermatology from 2012 through 2015, as well as a national database of medical services maintained by the American Medical Association that goes back more than a decade. Nearly all dermatologic procedures are performed on an outpatient, fee-for-service basis.
The Times analysis found a marked increase in the number of skin biopsies per Medicare beneficiary in the past decade; a sharp rise in the number of physician assistants, mostly unsupervised, performing dermatologic procedures; and large numbers of invasive dermatologic procedures performed on elderly patients near the end of life.
In 2015, the most recent year for which data was available, the number of skin biopsies performed on patients in the traditional Medicare Part B program had risen 55 percent from a decade earlier — despite a slight decrease in the program’s enrollment over all.
Skin cancers are more common in older people, which means Medicare pays for much of the treatment. In 2015, 5.9 million skin biopsies were performed on Medicare recipients.
More than 15 percent of the biopsies billed to Medicare that year were performed by physician assistants or nurse practitioners working independently. In 2005, almost none were, said Dr. Brett Coldiron, a former president of the American Academy of Dermatology, who has studied the use of clinicians who are not physicians in medical practices.
Dr. Coldiron, a dermatologist in Cincinnati, said he was skeptical of the growing use of such clinicians in the specialty. “Ads will say ‘See our dermatology providers,’” he said. “But what’s really going on is these practices, with all this private equity money behind them, hire a bunch of P.A.’s and nurses and stick them out in clinics on their own. And they’re acting like doctors.”
Dermatology on Wheels
Bedside Dermatology, a mobile practice in Michigan, sends clinicians to 72 nursing homes throughout the state for skin checks and treatment. Dr. Steven K. Grekin, a dermatologist, said that when he founded Bedside, many of the nursing home patients had not been examined by a dermatologist for several years.
“We were seeing a real unmet need,” he said.
In 2015, Bedside Dermatology’s traveling crews performed thousands of cryosurgeries — spraying liquid nitrogen on precancerous lesions with an instrument that resembles a blowtorch. Other spots on the nursing home patients’ skin were injected with steroids, or removed with minor surgery.
Examining the 2015 Medicare billing codes of three physician assistants and one nurse practitioner employed by Bedside Dermatology, The Times found that 75 percent of the patients they treated for various skin problems had been diagnosed with Alzheimer’s disease. Most of the lesions on these patients were very unlikely to be dangerous, experts said, and the patients might not even have been aware of them.
“Patients with a high level of disease burden still deserve and require treatment,” Dr. Grekin said. “If they are in pain, it should be treated. If they itch, they deserve relief.”
Dr. Eleni Linos, a dermatologist and epidemiologist at the University of California, San Francisco, who has argued against aggressive treatment of skin cancers other than melanomas in the frail elderly, said that if a lesion was bothering a patient, “of course we would recommend treatment.” However, she added, many such lesions are asymptomatic.
Dr. Linos added that physicians underestimate the side effects of skin cancer procedures. Complications such as poor wound healing, bleeding and infection are common in the months following treatment, especially among older patients with multiple other problems. About 27 percent report problems, her research has found.
“A procedure that is simple for a young healthy person may be a lot harder for someone who is very frail,” she said.
The work of Bedside Dermatology reflects a wider tendency to diagnose and treat patients for skin issues near the end of life. Arcadia Healthcare Solutions, a health analytics firm, analyzed dermatologic procedures done on 17,820 patients over age 65 in the last year of life, and found that skin biopsies and the freezing of precancerous lesions were performed frequently, often weeks before death.
Arcadia found that the same was true for Mohs surgery, a sophisticated procedure for basal and squamous cell skin cancers that involves slicing off a skin cancer in layers, with microscopic pathology performed each time a layer is excised until the growth has been entirely removed. Each layer taken is reimbursed separately.
In 2015, one out of every five Mohs procedures reimbursed by Medicare was performed on a patient 85 or older, The Times found.
Rise of Physician Assistants
Bedside Dermatology is owned by Advanced Dermatology and Cosmetic Surgery, the largest dermatology practice in the country, with a database of four million active or recently established patients. Last year, Harvest Partners, a private equity firm, invested a reported $600 million in the practice, known as ADCS.
ADCS has its headquarters in Maitland, Fla., in a sleek suite of offices and cubicles the size of a football field. One morning early this year, the buzz of corporate expansion was everywhere. A delivery crew wheeled in a stack of cubicle partitions. Employees at a large phone bank scheduled appointments around the country. A transition team was preparing to visit a newly acquired practice in Pennsylvania, and Dr. Matt Leavitt, ADCS’s founder and chief executive, was congratulating his director of business development on snagging a sought-after recruit.
In an email last week, Dr. Leavitt said the company currently has 192 physicians, but declined to confirm other numbers because ADCS is privately held. The company’s website advertises “180+ locations.” The website also lists 124 physician assistants. That is a 400 percent increase from 2008, according to web pages preserved by the Internet Archive’s Wayback Machine. ADCS offers a six-month fellowship program for physician assistants to provide additional training in dermatology.
“My number one goal would be to have people take skin cancer much more seriously than they have, especially baby boomers,” said Dr. Leavitt, a dermatologist. “And we’ve got to continue to work at getting better access for patients.”
While health care experts agree that access to care is of growing importance, there is an ongoing debate over whether practitioners who are not physicians are qualified to make diagnoses, identify skin cancers and decide when to perform biopsies — skills dermatologists acquire through extensive training — particularly among the elderly.
The frequency with which physician assistants and nurse practitioners take skin biopsies — compared with M.D.’s — was the subject of a 2015 study at the University of Wisconsin, Madison. Based on 1,102 biopsies from 743 patients, researchers found that physician assistants and nurse practitioners performed nearly six biopsies for every skin cancer found — more than twice the number performed by physicians.
Riley Wood, age 82, arrived one morning last February at an ADCS clinic in Heathrow, Florida, for a skin check with David Fitzmaurice, a physician assistant.
For Mr. Fitzmaurice, the exam was routine; Mr. Wood was one of a few dozen patients he sees each day. On the day a reporter observed him, Mr. Fitzmaurice moved quickly through the visits, many of which entailed procedures like biopsies and cryosurgery.
Mr. Wood had already had two other cancers — kidney and throat. Mr. Fitzmaurice decided Mr. Wood needed two biopsies — one on his scalp, for a suspected squamous cell carcinoma, and a second on his neck, for a spot that might be a melanoma.
The bleeding from the biopsy wound to Mr. Wood’s neck persisted for several minutes, leaving the patient worried and depleted.
“I don’t like needles,” said Mr. Wood, in a voice close to a whisper, adding that the word cancer frightened him. Still, Mr. Wood said, he usually goes with the recommendations of Mr. Fitzmaurice, whom he called “Dr. David.” “I like him. He’s very thorough and cordial.”
With Mr. Wood’s permission, a reporter photographed the area Mr. Fitzmaurice biopsied for a suspected melanoma, and sent the image to nine physician-dermatologists. A few dismissed the biopsied lesion as nothing, while others said it was hard to tell from the photograph. None said the spot had the telltale signs of melanoma.
Yet all nine dermatologists, with no prompting, pointed to an adjacent lesion that had gone unremarked by Mr. Fitzmaurice, saying it looked like a skin cancer that was not melanoma.
Two months later in a telephone interview, the reporter asked Dr. Leavitt about Mr. Fitzmaurice’s apparent oversight. Dr. Leavitt defended his employee, saying Mr. Fitzmaurice had probably seen the spot but his higher priority was the suspected melanoma.
The morning after the interview, Mr. Wood received a call from ADCS, telling him to come in for a second look. The spot Mr. Fitzmaurice biopsied for melanoma turned out to be benign. The one next to it, which Mr. Fitzmaurice did not flag, was in fact a squamous cell carcinoma in situ, Dr. Leavitt said in a follow-up email.
While Dr. Leavitt pointed out that “routine skin checks are a great way to catch potential problems early,” Dr. Coldiron said he was wary of clinicians who are not physicians doing basic skin checks, given the evidence that those often lead to unnecessary biopsies.
Arielle Rought, a physician assistant with ADCS who is in her late 20s, called skin checks “our bread and butter.” On the day a reporter visited, Ms. Rought biopsied a spot on a patient’s hand to rule out melanoma. Her supervising physician was standing out in the hall, yet she did not ask him to take a look. Asked why she had not called him into the room, she said she did not consider it necessary. The biopsy was negative.
In an emailed statement, the president of the American Academy of Dermatology, Dr. Henry W. Lim, said: “The AAD believes the optimum degree of dermatologic care is delivered when a board-certified physician dermatologist provides direct, on-site supervision to all non-dermatologist personnel.”
Ms. Rought said it was not unusual for a skin check to lead her to freeze as many as 30 precancerous lesions called actinic keratoses on a patient during a single visit. Actinic keratoses are called precancerous because they can sometimes turn into squamous cell carcinoma. Ms. Rought said her “rule of thumb” was that 20 percent of actinic keratoses progress to cancer.
While that might once have been the popular understanding, research now suggests otherwise. Dr. Martin A. Weinstock, a professor of dermatology and epidemiology at Brown University, reported in a 2009 study of men with a history of two or more skin cancers that were not melanomas that the risk of an actinic keratosis progressing to skin cancer was about 1 percent after a year, and 4 percent after four years. More than 50 percent of the lesions went away on their own.
Dr. Lim said the dermatology academy’s position is that actinic keratoses should be treated, as it is impossible to know which ones will turn into cancer, but some specialists are questioning whether that’s necessary.
The Doctor Is Not In
The experience of Mr. Dalman, the patient who fled the waiting room, began in January, when he made an appointment as a new patient at the clinic of Dr. Joseph Masessa, believing he would be seen by the dermatologist. Instead, he was seen by a young woman in a lab coat, who he assumed was a physician, though she did not identify herself as one. She biopsied 10 different lesions.
At his next visit in February, he was seen by another young woman, whom he also took to be a physician. As it turned out, both women were physician assistants.
The second physician assistant told Mr. Dalman that he would need radiation on basal cell carcinomas on his temple, shoulder and ear. He said he tried to argue with her, explaining that he’d had many similar lesions in the past that were removed with a simple scrape.
He said she countered that if she attempted to remove the lesion above his right eye, he might end up unable to blink that eye. And without superficial radiation on his ear, he was in danger of losing the entire ear. She said he would also need Mohs surgery on several of the basal cell carcinomas. She did not respond to requests from The New York Times to speak about the case.
Although Dr. Masessa signed Mr. Dalman’s chart, Mr. Dalman never met him. This could be because the clinic he went to, northwest of West Palm Beach, Fla., is one of more than a dozen clinics scattered across three states associated with Dr. Masessa, who is based in New Jersey but licensed in Florida. Supervision of physician assistants is required by state law. The Florida Department of Health website lists Dr. Masessa as supervising four physician assistants in the state.
Dr. Masessa did not respond to repeated requests for comment. An associate, who identified himself as Jeff Masessa, returned a call and asked for questions by email. Neither he nor Dr. Masessa responded to a detailed list of questions, despite repeated follow-up emails from The Times.
On the day of Mr. Dalman’s surgery, the same physician assistant injected a local anesthetic, then instructed Mr. Dalman to return to the waiting room, Mr. Dalman said.
Then something dawned on him. Since he had not laid eyes on a physician in several visits, he worried that the physician assistant would be doing the procedure. The prospect made him nervous and he decided to make a swift exit.
Mr. Dalman later went to see Dr. Joseph Francis, a dermatologist near West Palm Beach. Dr. Francis said there was no indication for superficial radiation, a treatment of which the American Academy of Dermatology has voiced skepticism. Moreover, Dr. Francis decided, many of the basal cell carcinomas could be scraped off.
Dr. Francis said he was shocked not only by the number of biopsies that had been taken at once, but also by the aggressive treatment proposed.
Moreover, when he reviewed Mr. Dalman’s records from Dr. Masessa’s clinic, he saw four skin exams documented over the four-month period. But when he examined the patient, Dr. Francis noticed a pigmented, asymmetrical spot slightly bigger than a pencil eraser on Mr. Dalman’s shoulder.
It turned out to be a malignant melanoma, not documented by the physician assistant. Dr. Francis removed it before it had a chance to spread.
https://www.nytimes.com/2017/11/20/health/dermatology-skin-cancer.html
The preceding two posts remind me of the old medical bromide: "A healthy person is nothing but an under-diagnosed patient."
-SPC
Pharmaceutical Profits and Public Health Are Not Incompatible
We need the capital and creativity of the private sector to take on the coronavirus.
By Daniel Hemel and Lisa Larrimore Ouellette
The rapid spread of the coronavirus has revived a decades-old debate over pharmaceutical policy, with both sides doubling down on long-held views. Advocates for broader drug access insist that pharmaceutical companies must not be allowed to reap large profits from Covid-19 vaccines and treatments. Free-market true believers — including officials in the Trump administration — argue that pharmaceutical businesses must be allowed to set prices beyond some patients’ reach.
This either-or choice was always a false framing. And as the Covid-19 crisis tragically illustrates, it’s a dangerous one too. Patient advocates need to acknowledge that pharmaceutical companies aren’t the enemy — the virus is. But it’s equally urgent for free-marketers to recognize that with government help, we can reward businesses for groundbreaking innovations without sacrificing poorer patients along the way.
The latest flare-up in this battle began even before the first recorded Covid-19 death in the United States. At a Feb. 26 hearing, Representative Jan Schakowsky, Democrat of Illinois, pressed Health and Human Services Secretary Alex Azar to pledge that any Covid-19 vaccine or treatment would “be affordable for anyone who needs it.” Mr. Azar refused, saying “we can’t control that price because we need the private sector to invest.”
Predictably, Mr. Azar’s statement set off fireworks on Capitol Hill. Congressional Democrats called on him to reverse his stance. Representative Schakowsky demanded that Mr. Azar “not allow any pharmaceutical manufacturer to set a price” for a Covid-19 vaccine that “would cause private insurers to raise premiums or further exacerbate the federal deficit.”
As the death toll from Covid-19 began to mount, the push to limit returns to pharmaceutical companies at the cutting edge of coronavirus research only intensified. Gilead Sciences, a California-based biotechnology company whose antiviral drug remdesivir has emerged as a potential Covid-19 treatment, soon became a target.
On March 23, the Food and Drug Administration granted Gilead’s request to designate remdesivir as an “orphan” drug. A drug qualifies as an orphan if it treats a disease affecting fewer than 200,000 people in the United States at the time of the application, even if the disease later becomes more widespread. The number of Covid-19 diagnoses in the United States was well below that threshold both when Gilead applied and when the F.D.A. designated the drug an orphan.
An orphan designation would have kept generic remdesivir off the market until 2027 unless the F.D.A. determined that Gilead could not meet demand for the drug. But remdesivir is covered by Gilead patents that do not expire until at least 2035, so this benefit is largely duplicative of what Gilead already enjoys under patent law. More immediately, an orphan designation would have allowed Gilead to claim tax credits for 25 percent of clinical trial expenses — a benefit potentially in the range of $40 million.
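As a rough check on where that $40 million estimate could come from, the trial spending it implies can be back-solved from the 25 percent credit rate. The implied trial-cost figure below is our inference, not a number from the article.

```python
# Rough arithmetic behind the orphan-drug tax benefit described above.
# The 25 percent credit rate is from the article; back-solving the implied
# trial spending from the "$40 million" estimate is our own illustration.
credit_rate = 0.25
estimated_credit = 40_000_000          # the article's "range of $40 million"

implied_trial_costs = estimated_credit / credit_rate
print(f"Implied qualifying trial expenses: ${implied_trial_costs:,.0f}")
# A $40 million credit at 25 percent implies about $160 million in trial costs.
```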
Although $40 million is a drop in the bucket compared with Covid-19’s social costs, politicians balked. Senator Bernie Sanders called Gilead’s application for orphan status “truly outrageous” — a day before he voted for a stimulus package with more than $50 billion in grants and low-interest loans to airlines. The consumer-rights advocacy group Public Citizen and 50 other organizations denounced Gilead’s use of “a loophole in the law to profiteer off a deadly pandemic.” The backlash quickly led Gilead to withdraw its request.
Ours is not the only country where Covid-19 has unleashed efforts to stamp out pharmaceutical profits. Last month, the Geneva-based Doctors Without Borders broadly called for “no patents or profiteering on drugs, tests or vaccines” for Covid-19. That campaign followed efforts by Canada, Israel, Germany and 33 members of the European Parliament to limit or override patents for drugs directed at the virus.
This doesn’t mean that governments must make a choice between ensuring patient access and encouraging drug development. With creative policymaking and political will, we can — and ought to — have both.
Governments can offer strong incentives to drugmakers while ensuring affordability by committing to patent buyouts for effective treatments. In a buyout, the government purchases the patents on a new drug — typically at a price that matches or exceeds what the patent holder otherwise would have earned — and then allows makers of generics to produce and sell low-cost versions. If, for example, clinical trials establish the efficacy of remdesivir in treating Covid-19, then the federal government should buy the U.S. rights to the drug from Gilead and give generic manufacturers free rein to ramp up production.
How much should the government pay? If remdesivir saves 10,000 American lives, then its value to our society — using traditional tools of cost-benefit analysis — would be as much as $100 billion. For a fraction of that sum, H.H.S. could buy the drug rights from Gilead and still leave the company with an eye-popping profit. Unfortunately, the $2 trillion Covid-19 stimulus package passed last month included only $11 billion that H.H.S. can use for patent buyouts, and the department will most likely need to draw down some of those funds for other purposes, like procuring diagnostic tests and purchasing other medical equipment. Mr. Azar’s department needs more money for patent buyouts.
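A minimal sketch of that cost-benefit arithmetic: the roughly $10 million value per statistical life is the figure U.S. regulatory agencies commonly use, and the 10 percent buyout scenario below is purely hypothetical, included only to show the orders of magnitude involved.

```python
# Sketch of the cost-benefit arithmetic behind the "$100 billion" figure.
# ~$10 million per statistical life is a standard U.S. agency value; the
# buyout scenario is illustrative, not a number proposed in the article.
lives_saved = 10_000
value_per_life = 10_000_000            # ~$10 million per statistical life

social_value = lives_saved * value_per_life
print(f"Estimated social value: ${social_value:,.0f}")      # $100 billion

buyout_price = 0.10 * social_value     # a hypothetical buyout at 10% of value
print(f"Hypothetical buyout at 10 percent: ${buyout_price:,.0f}")  # $10 billion
```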
Another time-tested tool for rewarding innovation while ensuring widespread access to new technologies is a “challenge prize.” We have proposed a prize for an effective coronavirus vaccine of $500 per vaccinated person, with the federal government footing the full bill. That almost certainly would make a Covid-19 vaccine profitable — potentially one of the most profitable drugs in history.
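For a rough sense of the prize's total cost, here is the arithmetic under a few assumed uptake scenarios. The $500 figure is from the proposal; the vaccination counts are our assumptions.

```python
# Illustrative total cost of the proposed $500-per-vaccination prize.
# The $500 figure is from the article; the uptake scenarios are assumptions.
prize_per_person = 500

for vaccinated in (100_000_000, 200_000_000, 330_000_000):
    total = prize_per_person * vaccinated
    print(f"{vaccinated:,} vaccinated -> ${total / 1e9:.0f} billion")
# At 200 million vaccinations, the prize would total about $100 billion.
```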
Patent buyouts and challenge prizes would of course add to the federal deficit — something that Representative Schakowsky, for one, said she was unwilling to do if it meant drugmakers would profit. But with Covid-19 already shutting down the economy and stealing thousands of lives, cutting costs on drugs directed at the disease is the very definition of penny-wise and pound-foolish. Worse yet, if we refuse to offer generous rewards for vaccines and treatments this time, we will find fewer pharmaceutical companies willing to invest in countering threats likely to emerge or return, such as the Zika virus, dengue fever and new types of influenza.
None of this is to suggest that the only way to spur innovation is to dangle large payouts in the faces of pharmaceutical businesses. Reputational incentives and altruistic inclinations will lead some companies to pursue Covid-19 cures. Scientists employed by government agencies and academic institutions will make major breakthroughs too.
But to contain Covid-19 now and sustain a pipeline of drugs directed at other infections with pandemic potential, we will almost certainly need to enlist the capital and creativity of the private sector. We don’t need to compromise patient access, but we will need to promise profits to businesses that develop effective vaccines and treatments. Among all the costs that we as a society will bear because of this virus and later ones, the payout to pharmaceutical companies will be a rounding error.
https://www.nytimes.com/2020/04/08/opinion/coronavirus-drug-company-profits.html