
An Overview of Insurance in the 20th Century


By PATRICK DALY, III
Your Voice Contributor

Over the course of this coming year, as businesses in the United States respond to the terms of the Affordable Care Act, a significant number of Americans will no longer find themselves provided with a health care plan maintained by the company for which they work. For better or for worse, these workers will be compelled to purchase insurance on the Obamacare public exchanges. Conservative estimates put that number at 30 million; in all likelihood, it will reach twice that. Given the complexity of the new law, for many the decision will not be an easy one.

Of course, the individual mandate aspect of the law has already led to over five million policy cancellations. Many of the Americans who have lost their medical insurance must now visit the exchanges and make difficult choices. As a Millennial, I am among that group. But the learning curve for me regarding the new health care law has not been as arduous as it has been for others. My understanding of the law has been enriched by my undergraduate studies; in preparation for a future medical career, I took courses with some relevance to the health care debate. I have also had countless conversations (and arguments) with friends and family about the law, and more often than not those conversations revolve around some financial aspect of it. What has struck me is that while people have gradually been trying to absorb the new law, some are not familiar with the events leading up to health care reform, especially the creation of health insurance in this country during the 20th century. I would like to use this space to briefly explain that little-known history, for it can lend vital context to the current debate.

Formal health care in America began in the late 19th century with a fairly dramatic rise in the number of doctors, who by today's standards would be considered general practitioners. This rise was in large part the result of the establishment of land-grant universities in this country following the Civil War. These institutions of higher learning trained and certified many medical students. Perhaps what is most relevant about this period is the common economic model then in use: patients chose their doctors and incurred the full cost of care. Since medical insurance as a formal practice had not yet emerged, medical treatment itself was fairly straightforward. For the most part, it was also fairly inexpensive. It is important to keep in mind that all the expensive medical devices, therapies and treatments of the last several decades had yet to be developed. Primitive by today's standards, the simple process of a patient paying his or her doctor was nonetheless an efficient and practical system for the times. Whether the procedure was as simple as setting a broken arm or as complicated as removing a tumor, in those early decades the patient paid the doctor directly for services rendered. Neither private insurance nor government subsidy was involved. Even if the illness required hospital care, the patient was responsible for the full cost of the stay.

With the onset of the Great Depression, however, hospitals began to suffer financially when patients were unable to pay. This led state legislatures to create an insurance system known as Blue Cross. As a provider-driven arrangement, the Blue Cross approach meant that doctors not only had full control over care but also sat on the boards that set the insurance standards. Over time, doctors came to like the Blue Cross system so much that they countenanced the rise of private insurance companies, whose main clients were businesses offering health care packages to their workers in exchange for lower wages. Ironically, workers at first objected to any intermediary that came between them and their doctor; they preferred to make their own health care decisions. But as America grew more prosperous following World War II, supply and demand led companies to offer more attractive health care packages, which workers came to accept.

The next phase of this story came during the latter half of the 1950s, when workers for the first time came to demand health insurance as a term of employment. The federal government, while still a spectator in the process, liked this business model for the same reason that proponents of Obamacare like the Affordable Care Act today: it would provide for Americans particularly susceptible to the vagaries of illness and disease. Back then, however, the stakes were much higher. People today who lack insurance stand to lose financially, and in some cases medically, but Medicaid and the emergency room still remain available to them. In the first half of the 20th century, by contrast, many people actually died of easily preventable illnesses for lack of the financial resources to pay for treatment.

During the 1950s, tax laws were designed to make health insurance more attractive, and local, state and federal governments created agencies to oversee the process. As the new tax policy shifted the burden of health insurance, the cost of health care for many American workers came to be subsidized by the government. Of course, the subsidizing programs that emerged in the 1960s, Medicare and Medicaid, were not entities with fixed costs. The cost of health care rose steadily, so that the percentage of federal spending on health care grew from 24 percent in 1960 to over 50 percent in 2013. The politicians responsible for raising these needed monies could not afford to have them perceived as direct taxation for medical services, as that would have carried a negative connotation for an anti-tax public. So the monies were raised through the shifting and disguising of tax receipts.

There also emerged at this time the medical-industrial complex, which designed all kinds of new technologies and pharmaceuticals that led to exciting cures and remedies. These developments, however, came with dramatically rising medical debt, which became a new reality. Until very recently, most Americans were largely indifferent to the dangers of this particular debt. That indifference rested on a fairly accurate perception that the relationship between doctor and patient remained intact and free of outside influence. So long as people were making their own medical decisions, they cared not that some of their federal tax monies went to subsidizing the poor and sick.

Even so, the elephant in the room remained the complete dissociation between the consumer (the patient) and the payer (the government). This arrangement has had any number of ramifications, not the least of which is that the medical-industrial complex has forged a myth of equilibrium, one in which the three major players (patient, doctor and insurance company) are all now working in concert. What the myth overlooks is that as actual medical treatments have become spectacular in terms of success, the medical-industrial complex has received equally spectacular payments for the devices and therapies behind that success. Sadly, although the United States began to reach the breaking point of sustainability in the late 1980s, word of the crisis did not enter the consciousness of most Americans. Only now, after five million Americans recently lost their health care coverage and many more stand to lose it this year, has the national debate on financing health care begun in earnest.

It is my belief that the debate will not subside any time soon. Let us pray that as it unfolds, a sensible bipartisan agreement will lead to insurance reforms that cover the poor and indigent while preserving the solemn relationship between patient and doctor.

Patrick Daly, III is a resident of Louisville. He is a graduate of the School of Arts and Letters at Indiana University Southeast. The above essay is based on the background section of his senior capstone research paper, the topic of which was the interactive medical theories of positivism, social constructivism, and postmodernism.