The Great Depression posed a challenge to hospitals, and, decades before the term "safety net" was coined, the public hospital began to play such a role on a de facto basis. Municipal hospitals were particularly busy during the 1930s. One statistical analysis found that between 1929 and 1933 public hospitals saw a 21 percent increase in patient load, with an average occupancy rate of 90 percent in 1933. (The patient load of non-governmental hospitals declined by 12 percent over the same period, with an occupancy rate of just 55 percent in 1933.)
Increased funding for public hospitals was not forthcoming, however. A 1936 survey of New York found, for example, that although not-for-profit hospitals were subsidized for indigent care, they restricted it; public hospitals had no such choice. Demographic shifts and rapid advances in scientific medicine determined the character of the public hospital after World War II. With the maturation of biochemistry, medicine gained growing authority, bolstered by the advent of penicillin for a host of infections, streptomycin as a treatment for tuberculosis, and the Salk vaccine for polio.
At the same time, amid unprecedented prosperity and unemployment rates as low as 2 percent, millions of Americans moved from cities to suburbs, where their health needs were met largely through private insurance. For a time around 1950, the future of city hospitals seemed bright. Despite a persistent lack of funds, many were well staffed and affiliated with medical and nursing schools. Yet although public hospitals benefited from, and helped to implement, the many mid-century medical advances, they were scarcely favored by the population shift from the urban center to the suburban sprawl. Within less than a generation, tax bases eroded in the large cities where many public hospitals were located, leaving concentrations of poor and unemployed residents in their precincts, far outside the American dream. Outpatient visits to public hospitals skyrocketed, rising some 310 percent from 1944 to 1965.