Our History

Our Role as a Champion for Essential Hospitals and the People and Communities They Serve

For more than four decades, America’s Essential Hospitals has been the foremost advocate in Washington, D.C., for hospitals that care for people who face financial and social barriers to care. We support our members’ safety net mission with advocacy, education, research, and leadership development. Our accomplishments in furthering health care access and equity are many: from disproportionate share hospital funding to laws that reduce drug costs for low-income patients to the recent fight for COVID-19 relief. Since 1981, America’s Essential Hospitals has been there for its members and their communities.

Roots as Public Hospitals


Ward K of Armory Square Hospital, Washington, D.C.

The American hospital as we know it today emerged over the course of about 60 years, beginning around the time of the Civil War. Physician-staffed hospitals, with professional nursing and specialized departments and services, were products of urbanization and economic expansion during the Second Industrial Revolution—together with massive immigration and rapid strides in medicine itself. About 1880, asepsis (sterilization) opened broad new horizons for surgeons. As physicians looked to the future with a new sense of hope, hospitals became symbolic of their new optimism and authority.

Each of today’s essential hospitals has a unique story, and many trace their origins to this historic period of growth. Some were established originally by states and counties, others by municipalities. Once established and staffed by trained physicians and nurses, public and not-for-profit hospitals became key components in a rapidly expanding medical culture.

By the 1920s, the hospital was a place where one could hope illness might be treated and even cured. Although some hospitals started to reduce their charitable role in favor of attracting an upper middle class clientele, public hospitals continued to operate with a commitment to treat the poor and a consequent fiscal challenge.

Before the 1920s, hospitals had operated without much money: physicians donated their time, and costs for nurses and staff tended to be low. Now, as doctors and surgeons began to be paid and nursing and staffing were professionalized, hospitals required significant funds for the first time. Many urban public hospitals recast themselves as major and, sometimes, highly regarded institutions, often establishing affiliations with universities and medical schools. At the same time, they remained committed to the mission of treating all, which left them ever more vulnerable in the marketplace.


Los Angeles County General Hospital, 1960.

The Great Depression challenged all hospitals, and decades before the term “safety net” came about, the public hospital began to play such a role on a de facto basis. Municipal hospitals were particularly busy during the 1930s. By one statistical analysis, public hospitals saw a 21 percent increase in patient load between 1929 and 1933, with an average occupancy rate of 90 percent in 1933. (The non-governmental hospital patient load declined by 12 percent during the same period, with an occupancy rate of just 55 percent in 1933.)

Increased funding for public hospitals was not forthcoming, however. A 1936 survey found that in New York, for example, some non-governmental hospitals that received subsidies for indigent care nonetheless restricted it; public hospitals had no such choice. Demographic shifts and rapid advances in scientific medicine determined the character of the public hospital after World War II. With a mature biochemistry, medicine had a growing authority, supported by the advent of penicillin for a host of diseases, streptomycin as a cure for tuberculosis, and the Salk vaccine for polio.

At the same time, with unprecedented prosperity and unemployment rates as low as 2 percent, millions of Americans moved from cities to suburbs, and their health needs were met largely by private insurance. For a time around 1950, the future of city hospitals seemed bright in various ways. Despite a persistent lack of funds, many were well-staffed and affiliated with medical and nursing schools.

But while they benefited from and helped to implement a multitude of midcentury medical advances, public hospitals were scarcely favored by the population shift from the urban center to the suburban sprawl. Within less than a generation, tax bases eroded in the large cities in which many public hospitals were located, leaving concentrations of poor and unemployed in their precincts, far outside the American dream. Outpatient visits to public hospitals skyrocketed, rising some 310 percent from 1944 to 1965.


The JPS Hospital, Fort Worth, Texas, in 1930

Predictions in the 1960s and even later forecast the demise of the public hospital, in large part because of the projected benefits the poor and elderly could receive from Medicaid and Medicare. Although the number of public beds declined and some hospitals eventually would close their doors for lack of funding, no such demise came to pass.

Medicaid from the start failed to cover at least one-quarter of the poorest patients, while Medicare never paid more than about half the costs of care for seniors; and those costs rapidly mounted in real-dollar terms. While the consumer price index rose 300 percent between 1960 and 1980, the per diem cost of hospital beds rose by 900 percent.

A clear need for “open door” hospitals with a high standard of care continued to exist and, in fact, to expand. The Commission on Public-General Hospitals, created in 1976, issued a report two years later that crystallized this conclusion: “The future of the urban public-general hospital . . . must be uniquely treated.”

The National Association of Public Hospitals (now America’s Essential Hospitals) was established as an umbrella advocacy group in 1980 to meet just that need. It opened for business the day before the inauguration of Ronald Reagan, and one of its early legislative victories was federal recognition of the disproportionate share hospital (DSH), one that serves more than its share of low-income patients. Over the years, DSH payments alone have brought public hospitals billions of dollars and remain a critical source of funding.

Another hallmark of the “safety net” hospital (as these institutions began to designate themselves) was a renewed commitment to the communities they served. Today, in addition to inpatient and outpatient services, safety net hospitals typically offer HIV/AIDS care, substance use disorder counseling, prenatal and obstetrics care, level I trauma care, and other specialized, lifesaving services. These services broaden the scope and reach of public hospitals while maintaining their basic mission.


Bellevue Hospital, New York.

Even as 20th-century medicine advanced, and despite the electrocardiogram and an improved understanding of basic cardiovascular physiology, quantifiable details about how the heart works, in both health and disease, were long in coming.

An extraordinary series of experiments by Dickinson Richards and André Cournand in the 1940s, just as heart disease began to spike in industrialized countries, brought unprecedented precision to the diagnosis of heart and circulatory disease. With a research program that adapted an old tool (the catheter) and a hypermodern instrument (the fluoroscope), Richards and Cournand were able to make unprecedented measurements of blood flow from within the heart itself. Viewing the heart, lungs, and pulmonary circulation as an integrated system, they developed descriptions of hemodynamics that led to a new taxonomy of heart disease.

Pioneering work at the pulmonary-coronary laboratory at Bellevue Hospital, now part of America’s Essential Hospitals member NYC Health + Hospitals, overcame long-held reservations about invasive procedures on the human heart. This work opened the way to a wide variety of diagnostic and therapeutic uses for catheterization, including angiography, angioplasty, and stent implantation. Cournand and Richards shared the 1956 Nobel Prize with Werner Forssmann.


First Hospital Ambulance, Bellevue Hospital Center, NY, 1869

When a sleek, horse-drawn ambulance made its debut at New York City’s Bellevue Hospital in 1869, tucked beneath the driver’s seat was a quart of brandy. There were tourniquets, sponges, bandages, splints, blankets, and, for unruly patients, a straitjacket. The driver cleared traffic ahead with an imperious gong, and a doctor bounced along in back.

Removable floor slats served as a stretcher. The first such service in the world was so innovative that it was soon imitated in major cities across the country and throughout Europe. These vehicles marked a clear milestone in hospital history, but they also testify to the strict limitations of medicine in the 1870s, an era in which tobacco was used to stave off infection and asepsis (sterilization) was tomorrow’s invention.

The ambulance used by Bellevue, now part of America’s Essential Hospitals member NYC Health + Hospitals, was the brainchild of Edward B. Dalton, a staff surgeon whose administrative skills won him an appointment as Inspector of the Army of the Potomac during the Civil War. Placed in charge of transport and care of the wounded, he created an efficient service for bringing casualties to field hospitals.

Returning to Bellevue after the war, Dalton recognized how a relatively lightweight vehicle (600–800 pounds) could be adapted to the streets of burgeoning New York City. In their first year of operation, Bellevue ambulances answered 1,401 calls. Two decades later, the service brought in nearly 4,400 patients. Not until 1924, a generation after the arrival of the automobile, did the last horses retire, turned out to pasture at an upstate farm.


Blood bank, John H. Stroger Jr. Hospital of Cook County.

Serving the industrial hub of the nation, Cook County Hospital in Chicago (now America’s Essential Hospitals member John H. Stroger, Jr. Hospital of Cook County) opened a huge new facility in 1916 and, by 1925, was treating some 42,000 patients annually. It was an institution, wrote a contemporary, “where the turnover is rapid and emergencies constantly at hand.”

The hospital’s director of therapeutics, Bernard Fantus, a highly respected figure in American medicine, in 1937 created what is generally recognized as the first hospital-based blood exchange facility in the United States. The concept was soon copied the world over, and the brutalities of the Second World War would only underscore its vital significance. Advances in knowledge and technology had made the blood bank a much-desired innovation ever since chemist and physician Karl Landsteiner discovered the ABO blood groups in 1900.

The value of extra blood for surgery was obvious, but collection and, especially, storage were problems. Discovery of anticoagulants and ways to store blood, at least briefly, enabled Britain to use “blood depots” in World War I; and by the early 1930s, the Soviet Union had developed a system of blood exchange.

Although not widely publicized, the Mayo Clinic effectively employed a blood bank in 1935. But the facility that opened at Cook County Hospital on March 15, 1937, became widely understood as the first hospital blood bank, and Fantus’ article in the Journal of the American Medical Association that same year, titled “The Therapy of the Cook County Hospital,” was highly influential.

“No one acquainted with the situation constantly arising in large general hospitals doubts its value,” wrote LeRoy Sloan, MD, in his 1940 obituary of Fantus. The model was adopted around the country and throughout the world.

Download Helpful Safety Net Publications