Your Health Care Is the Love Child of Science and Insurance

Justin Ford Kimball had to wonder if he was crazy for taking the job as vice president of Baylor Hospital in Dallas. It was 1929, and hours into the job, he realized that his new employer was about a month away from insolvency. Kimball and his team needed an infusion of cash, but where to get it?

Fortunately for all involved, Kimball, a lawyer by training, had previously been the superintendent of the Dallas school administration. During his tenure, the influenza pandemic of 1918 ravaged the world, killing more than 50 million people. More Americans died from the flu (675,000) than perished in World War I. While fewer than one thousand died in the Dallas area, the plague rattled a populace already fatigued by war.

Kimball had a knack for actuarial sciences—the mathematical and statistical analysis of risk—and after a bit of number crunching, he realized that if the Dallas teachers each contributed just $1 a month, they could “insure” themselves with an in-house disability benefit of $5 a day if they fell sick. The sick benefit plan was very popular, but most important, it taught him how to think about risk and disease.
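
To get a feel for the arithmetic Kimball was doing, here is a minimal back-of-envelope sketch in Python. The $1 monthly premium and $5 daily benefit come from the plan described above; the enrollment figure and illness rate below are hypothetical, chosen only to show how a pooled premium can cover individual payouts.

    # Back-of-envelope sketch of a pooled sick-benefit fund.
    # The $1 premium and $5 daily benefit are from the Dallas teachers' plan;
    # the enrollment figure and break-even illness rate are hypothetical.
    members = 1000              # hypothetical number of enrolled teachers
    premium = 1.00              # dollars contributed per member per month
    benefit_per_day = 5.00      # dollars paid out per sick day

    monthly_pool = members * premium                    # $1,000 collected each month
    sick_days_funded = monthly_pool / benefit_per_day   # 200 sick days the pool can cover

    # The fund breaks even as long as the average member claims fewer than
    # premium / benefit_per_day sick days per month (0.2 days, or about 2.4 days a year).
    break_even_days_per_member = premium / benefit_per_day

    print(f"Monthly pool: ${monthly_pool:,.0f}")
    print(f"Sick days the pool can fund each month: {sick_days_funded:,.0f}")
    print(f"Break-even sick days per member per month: {break_even_days_per_member:.2f}")

The point is not the particular numbers but the logic: as long as illness is rare and roughly predictable across a large group, a small fixed contribution can fund a benefit that no individual could reliably afford on their own.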

So, in 1929, Justin Ford Kimball urgently began digging into the hospital data. He was shocked to realize that one of the chief causes of Baylor’s financial woes was lack of payment from Dallas teachers. Kimball had an idea: why not investigate the illness data from his previous job and compare that to the financial data from his new job?

After analyzing the data, Kimball proposed that his old friends at the Dallas school administration participate in a “hospital prepayment program,” wherein teachers would contribute 50 cents a month in exchange for fully covered medical care at Baylor Hospital.

The paltry monthly subscription was a rough estimate of what it would take to keep the hospital afloat, predicated on a 75 percent participation rate. The precarious hospital was counting on teachers responding to an unprecedented sales pitch, and in one of history’s great ironies, the subscription was slated to go live on Oct. 29, 1929. “The Plan” needed participants to feel uneasy about their ability to pay hospital bills, and the collision of Black Tuesday with Justin Ford Kimball’s lifeline meant that teachers signed up in droves. The Plan worked, and soon employees from other Dallas industries joined.

Life and property insurance companies had always shied away from health care, but Kimball’s self-taught actuarial skills saved his hospital. In the ’30s, copycat plans sprang up around the country and were well received by Americans beleaguered by war, pandemic, and financial collapse. These plans were the only health insurance most Americans had ever heard of, which explains why insurance was linked to employment for decades, and why Congress was eventually inspired to create a post-retirement government health insurance program in 1965: Medicare.

In Minnesota, a local organization formed its own hospital prepayment program, initially calling it the “Blue Plan.” After some deliberation, a new symbol was chosen for the company; unsurprisingly, it was a blue cross. Blue Cross grew into a national powerhouse and the preeminent health insurance company in the country.

Within a decade of the founding of Blue Cross, the American Medical Association finally warmed to the idea of health insurance for physicians’ bills, and Blue Shield was born (remember, Blue Cross covered only hospital bills). In time, Blue Cross and Blue Shield merged, and it didn’t take long for the commercial insurance industry to begin offering its own health insurance products.

Advances in medical and surgical sciences were shifting the center of care from patients’ homes and small doctors’ offices to the hospital, coinciding with the disappearance of the “house call.” The ’30s saw a metamorphosis in the medical sciences, and for the first time hospitals stopped being places to go and die and became, instead, houses of healing.

From the beginning, health insurance subscriptions were tied to work, leaving the unemployed and elderly out in the cold. At first, this didn’t matter much: there was a dearth of effective treatments in the first part of the 20th century. There were no MRIs or CT scans, there were very few elective operations (like hip replacement, rotator cuff repair, or ACL reconstruction), and open-heart surgery had not yet been invented. But as surgery became more sophisticated, one revolutionary discovery changed the world.

Alexander Fleming—a young Scottish physician—worked as a bacteriologist at St. Mary’s Hospital in London during the first several decades of the 20th century. He didn’t have much to show for his research, but as summer turned to fall in 1928, Fleming returned to London from a holiday by the sea. When he arrived at his petite laboratory at St. Mary’s Hospital, a jumbled stack of Petri dishes was on a tabletop, including a dish that had fallen off its perch and lost its lid. The story goes that he glanced at the Petri dish and quickly did a double take—dozens of round spots of staphylococci carpeted the dish, but their spread was limited by a large island of white mold on one side of the dish. The blotch of mold had a surrounding beltway, a demilitarized zone of sorts, where there were no bacterial colonies and no fungus. Fleming muttered softly to himself, “That’s odd.”

Fleming pondered why the mold was able to hold the bacterial colonies at bay, and a question occurred to him: Was the mold making a substance that was deadly to the staphylococci? Fleming and his coworkers investigated the mold, called Penicillium, and labored to identify its bacteria-fighting molecule. Once isolated, Fleming named it “penicillin” and found that it was lethal to staphylococci and streptococci. Surprisingly, although penicillin was discovered in 1928, it did not become clinically available until 1941, owing to the finicky nature of cultivating the mold. Two desperate Oxford researchers, Howard Florey and Ernst Chain, unlocked the secret of Penicillium growth just in time for Allied use in World War II, and soon doctors around the world were administering penicillin and other newly discovered antibiotics.

The discovery of antibiotics supercharged doctors’ ability to care for patients. For centuries, physicians and surgeons had been draining abscesses and scooping pus out of chronically infected patients. During war, the chance of dying from a gunshot wound, even a minor one, was often as high as 80 percent, with patients dying in agonizing slow motion. But the arrival of antibiotics revolutionized the treatment of acute and chronic infections, including tuberculosis.

During America’s Civil War, no one believed in bacteria or germs; hospitals were small death houses; fewer than 1 percent of American doctors called themselves surgeons; X-rays were nonexistent; and surgery for any condition inside the abdomen or chest was an absolute impossibility. A century later, in the decade after World War II, advanced metal alloys were developed, plastics and polymers were invented, and transistors were created. Of course, the major uses for all of these materials were outside the human body, but you cannot make any modern medical device without them. The development of plastics, alloys, and transistors coincided with the modernization of hospitals, and with the advent of antibiotics, doctors could consider the previously unthinkable: implanting foreign material into their patients.

This is the Implant Revolution.

The grand transformation of health care overlapped the modernization of air travel, food processing, telephone communication, television broadcasting, and clothing manufacturing. As in our own time, every facet of the mid-20th-century world was changing dramatically, so odd combinations of cutting-edge technology existed alongside vestiges of the past. As New York was getting its second airport in 1938 (La Guardia; Newark was the first), the last dirt road was being paved in Manhattan.

The construction of hospitals had ground to a halt during World War II but surged with the passage of the Hill-Burton Act of 1946, a federal program for financing hospital construction. Between 1946 and 1960, 1,182 new hospitals were built in the U.S., and the expansion of gleaming new operating rooms and the explosion of private health insurance coincided with the arrival of antibiotics and modern materials.

The perfect storm of post-war expansion of the middle class, private health insurance, modern hospitals, and the implant revolution meant that patients had the ability to pay for health care at the same time that health care was becoming, for the first time, effective. And of course, effective health care is expensive.

The two greatest examples of effective (and expensive) health care are joint replacement operations and open-heart surgery. Up until the ’50s, if a patient suffered crippling joint arthritis or a lethal heart attack, the only care that existed was “comfort care.”

In the final stages of hip arthritis, patients experience an incapacitating level of pain, loss of motion, and weakness. The erosion of the cushioning cartilage, the formation of bone spurs, and the buildup of inflammatory fluid render the patient an invalid, incapable of working and robbed of independence.

In the ’40s, the only anti-inflammatory medicine available in the world was aspirin. Hard to believe, but “Take two aspirin and call me in the morning” was the sole medical treatment in the armamentarium of every doctor when it came to body aches and pain. Incredibly cheap, and pathetically ineffective. When my medical forefathers counseled middle-aged (and older) patients with advanced arthritis, a warm pat on the back, a shrug of the shoulders, and a handful of aspirin was the only response.

Today, surgical joint replacement is incredibly effective. Yes, infection and dislocation remain concerns in at least 1 percent of cases, but there is overwhelming evidence that replacing a joint is a life changer, alleviating pain, restoring function, and winning back independence. It is also very expensive. Instead of a dollar’s worth of pills, a joint replacement costs over $30,000. But it works.

Similarly, heart attack care until the ’60s was incredibly primitive. The arrival of antibiotics dramatically decreased the mortality rates of pneumonia and tuberculosis, and as Americans became increasingly addicted to cigarettes, heart disease became the leading cause of death by mid-century. (Fewer than 5 percent of Americans smoked in 1900, but 42 percent smoked by 1965.) Nitro tablets, aspirin, and clot-busting medicines were the only treatments available in the face of a sudden heart attack in the ’60s, but the invention of coronary artery bypass surgery in 1967 and of coronary angiography and stenting in the ’80s transformed cardiac care. Among patients older than 65, the in-hospital death rate following a heart attack fell from 38 percent in 1970 to 7 percent in 2010. The conclusion: if you can make it to the hospital while having a heart attack, your chance of living, in this country, is shockingly good. Again, it is significantly more expensive than letting a patient die or giving some pills and crossing your fingers… but it works.

In the year 2020, there will be 2 million joint replacement operations and 2 million heart operations. None of those operations would have been possible before the implant revolution. The expensive medical devices used during those operations would have been out of reach of most consumers before the creation of insurance, which raises the questions: Did the establishment of health insurance lead to the rise of surgical interventions? Did the ability to pay for expensive hospitalizations and costly devices fuel the construction of hospitals and launch the implant companies?

I believe the answer to those questions is an easy “yes.”

Which is not to say that the rise of surgical interventions is inappropriate, but our society has to decide whether we are willing to pay for less vulnerability in the face of disease, cancer, and trauma. It is unpopular to say in some circles, but it is a fact that Americans have greater access to elite health care, shorter waiting times, and better cancer outcomes than almost anywhere else in the world. But it comes at a price: higher insurance premiums, higher copays, and the sad truth of far too many uninsured Americans.

Differential pricing of drugs and medical devices in America versus other countries is outside the scope of this article (it certainly drives up the cost of health care in this country), but the simple fact remains that modern interventions are simultaneously effective and expensive. I’ll avoid the debate over how we should pay for all of this, but remember this: it’s not just the “greedy health insurance corporations” or “lazy bureaucratic workers” that make health care expensive; it’s costly because we use it, and we use it because it works.
