The American Healthcare System

Mallory Powers

Glen Allen High School


It has been estimated that approximately 45,000 deaths occur annually in the United States as a result of lack of access to health care (Harvard Medical School and Cambridge Health Alliance, 2009). The same study found that uninsured, working-age Americans have a 40 percent higher risk of death than their privately insured counterparts (2009). These findings underscore the alarming necessity of health insurance, a guarantee not granted to Americans but mandated in 67.5% of national constitutions worldwide (Schimmel, 2013). The roots of this dearth and denial of health care in America run deep, and they have delayed the implementation of the kind of comprehensive, inclusive healthcare system found in other developed nations.

The lineage of healthcare in the United States is intricate and dependent upon historical precedent. The struggle for universally subsidized healthcare began during the Progressive Era, when Theodore Roosevelt made healthcare reform a centerpiece of his 1912 presidential campaign (Orentlicher, 2012). The prospects of reform, however, were quickly derailed by the victory of Woodrow Wilson, who shifted the public's focus from healthcare to industrial workers' unions (Orentlicher, 2012). Germany's adoption of a national healthcare program further solidified America's refusal to address the issue, as fear of communism and fascism infiltrated the American psyche. This rejection of anything resembling socialism continued throughout the 1920s, undermining support for social programs. With governmental attempts to insure all citizens stymied, the burden of coverage shifted to the private sector. During the 1920s, Baylor University Hospital in Dallas, Texas decided to revolutionize health care and the way in which people paid for it (Blumberg & Davidson, 2009). The hospital collected a small monthly payment from people in the area and, in exchange, covered their hospital bills when they fell ill. This form of payment became the norm of insurance coverage and was later renamed Blue Cross. Thus began the modern system of health insurance.

The effort for government involvement in the healthcare system did not completely cease: in 1935, President Franklin Delano Roosevelt signed the Social Security Act. The bill initially included a provision for a national healthcare program, but it was promptly eliminated when Roosevelt faced intense opposition and lost the support of physicians (Orentlicher, 2012). Given the failure of the Social Security healthcare provision and the continuing success of the Blue Cross program, employers had an incentive to provide benefits for their workers. The Internal Revenue Service reinforced this arrangement with two rulings, one in 1943 and the other in 1945, that made employer-based health benefits tax-free, further distancing the federal government from providing insurance to its citizens (Blumberg & Davidson, 2009). The Truman Administration, from 1945 to 1953, renewed the fight for national health coverage, but the United States was in the midst of the Cold War and wary of programs suggesting sympathy for economic systems other than capitalism. Furthermore, Truman faced a Congress composed predominantly of conservative Republicans and southern Democrats, a seemingly insurmountable obstacle to passing comprehensive healthcare reform legislation (Orentlicher, 2012). America's Constitution does not guarantee health care for its people, as the constitutions of many developed nations do, but the fight for health care to be recognized as a right commenced in the 1950s and culminated in the creation of Medicare and Medicaid.

These social programs insure the elderly and the poor: Medicare covers the elderly, while Medicaid aids those with incomes at or below 138% of the federal poverty level (Glied & Ma, 2013). Neither program passed until 1965, under President Lyndon B. Johnson, an overhaul characteristic of his administration. This shift in the way health care is administered advanced healthcare from a privilege to a limited right, but it also fed the argument condemning those unable to pay for health care themselves. Two decades later, Congress passed the Emergency Medical Treatment and Active Labor Act (EMTALA), which states that hospital emergency departments cannot deny care to persons who are in active labor or experiencing an urgent need for care, regardless of citizenship, legal status, or ability to pay (Orentlicher, 2012). EMTALA, passed in 1986 under President Ronald Reagan, expanded the right to access health care, albeit far less sweepingly than other countries' extensive sets of healthcare-related rights.

In 1993, Bill Clinton took office and announced that his first initiative as president would be to reform the United States healthcare system. He cited 37 million uninsured Americans and promised a policy overhaul, led by Hillary Clinton, within 100 days (Pfiffner, n.d.). The 100-day deadline was not met, but the text of the bill was introduced to Congress on October 27 of that year (Pfiffner, n.d.). The bill met its demise for several reasons, among them the exclusion of Congress and special interest groups from the drafting process, policy missteps, and Republican attacks, but Clinton's crusade for a nationalized healthcare system renewed the national conversation about healthcare and its limitations.

The complicated nature of healthcare access and utilization in the United States most likely stems from the country's core ideology, a product of its Constitution (Orentlicher, 2012).

At the nation's inception, the Founders wrote a constitution whose success hinged upon the recognition of negative rights: boundaries the national government must not cross. While most other nations abide by constitutions or governing doctrines that enumerate positive rights, things the government must guarantee its people, the United States' balance of governmental restraint and assistance remains in constant flux, dictated by judicial rulings. No judicial precedent demands that the federal government provide health care for every citizen; as a result, America continues to operate as a negative-rights society, forestalling legislation that would define and guarantee health care as a basic, fundamental human right, the cornerstone of most developed nations' healthcare systems.

America's negative-rights structure impedes the healthcare system because the burden of providing health insurance falls on private, for-profit insurance companies, heightening the disparity between the United States and the countries of the Organization for Economic Cooperation and Development (OECD). France and Germany, for example, heavily regulate non-profit insurance companies as a means of incentivizing low prices and high accessibility. Furthermore, over 200 insurance companies cover more than 90% of Germany's population (Schimmel, 2013); in the United States, by contrast, a mere two insurance companies control 51% of the market in 45 states (Unite Here, 2014). Canada and Britain both have public, single-payer healthcare systems, and all residents of Britain are eligible for care under the National Health Service (Ragupathy, 2012), while not all United States residents have access to health insurance. Similarly, a 2013 study published by the National Association of Social Workers found that poor Canadian women diagnosed with colon cancer are more likely to receive surgery and chemotherapy, wait less time for care, and experience higher survival rates than their poor, uninsured American counterparts (Gorey, 2013).

There is an apparent lack of access to health care in the United States that is not present in countries that grant all of their citizens the ability to use their healthcare systems. In addition to effectively blocking some residents from receiving healthcare benefits, the United States also manages to spend more money per capita on healthcare than Australia, Canada, Denmark, France, Germany, Japan, the Netherlands, New Zealand, Norway, Sweden, Switzerland, and the United Kingdom (Squires, 2012). The health care administered in America is not of higher quality, just a higher price (Squires, 2012). Compared to the OECD median, the United States has fewer hospital beds and shorter hospital stays, yet healthcare accounts for 17.7% of its GDP (Kumar, 2013). Unsurprisingly, healthcare costs correlate positively with national income across nations, but if that trend held for America, the country would spend only about $4,500 per capita on health care rather than the current $8,000 (Squires, 2012).

There are several sources of this exorbitant and excessive spending, one of them the high cost of prescription medications in America. An analysis by Gerard Anderson, a professor at the Bloomberg School of Public Health at Johns Hopkins University, found that United States prices for the 30 most commonly prescribed drugs are one third higher than in Canada and Germany and more than double those in Australia, France, the Netherlands, New Zealand, and the U.K. Anderson also noted that the average American takes fewer prescription drugs than the average person in France or Canada. Ten percent of healthcare costs in the United States are attributed to drugs, yet Americans receive less for their money than people in any other country (Rosenthal, 2013). Two hundred fifty dollars buys 2 bottles of the nasal spray Rhinocort in America, while it affords 51 bottles in Romania (Rosenthal, 2013). Similarly, $250 buys 19 pills of the antibiotic Augmentin in America but 445 pills in Belgium (Rosenthal, 2013). Worst of all, $250 buys 51 pills of the gout medication Colcrys in America but 9,158 pills in Saudi Arabia (Rosenthal, 2013).
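To put these disparities on a common scale, the per-unit prices implied by the figures above can be computed directly. The short sketch below is purely illustrative: the quantities are the "$250 buys N units" figures quoted from Rosenthal (2013), while the variable and function structure is invented for this example and drawn from no cited source.

```python
# Illustrative per-unit price comparison using the "$250 buys N units"
# figures quoted above (Rosenthal, 2013). The arithmetic is a sketch,
# not an analysis from the cited reporting.

BUDGET = 250.0  # dollars

# drug -> {country: units that $250 buys}
QUANTITIES = {
    "Rhinocort (bottles)": {"United States": 2, "Romania": 51},
    "Augmentin (pills)": {"United States": 19, "Belgium": 445},
    "Colcrys (pills)": {"United States": 51, "Saudi Arabia": 9158},
}

for drug, by_country in QUANTITIES.items():
    us_unit_price = BUDGET / by_country["United States"]
    for country, units in by_country.items():
        if country == "United States":
            continue
        foreign_unit_price = BUDGET / units
        ratio = us_unit_price / foreign_unit_price
        print(f"{drug}: ${us_unit_price:.2f} per unit in the U.S. vs "
              f"${foreign_unit_price:.2f} in {country} (~{ratio:.0f}x)")
```

By this arithmetic, the American price per unit runs from roughly 26 times the Romanian price for Rhinocort to roughly 180 times the Saudi price for Colcrys.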

These costs burden more than those who purchase the drugs themselves; taxpayers also face the stark reality of high medication prices. The state of California spent $61 million on asthma medications through its Medicaid program in 2012, a cost that taxpayers help subsidize (Rosenthal, 2013). Citizens who have never been diagnosed with asthma are, in all likelihood, indirectly paying for inhalers, the bare reality of inordinate prescription costs and the burden they place on everyone in the United States.

The ability of pharmaceutical companies in the United States to bend laws in their favor has no parallel in other developed nations. In 2012, the pharmaceutical industry spent $250 million successfully lobbying the government for, essentially, preferential treatment. One result of these continual efforts is a federal prohibition on Medicare negotiating drug prices (Rosenthal, 2013). Moreover, prior to a 2013 Supreme Court decision, pharmaceutical companies were allowed to openly pay rival companies to keep their generic, and thus less expensive, medications off the market, a practice aptly nicknamed pay-for-delay (Wyatt, 2013). In a 5-to-3 decision, the Supreme Court ruled that the Federal Trade Commission may sue pharmaceutical companies that seek to eliminate competition in this way, a sweeping alteration to the pharmaceutical industry. The grip of high costs extends beyond the pharmaceutical industry itself, however, with restrictions placed upon the United States' Patient-Centered Outcomes Research Institute, the organization authorized by Congress to conduct research that patients and physicians use to make informed health decisions. The Institute is barred from factoring the costs of medications and treatments into its analyses and recommendations, a uniquely American restraint that aggravates already excessive prices and shifts the burden onto those who must pay for treatment (Rosenthal, 2013).

It is also hypothesized that the exorbitant rate of chronic disease in the United States (133 million people suffered from at least one chronic condition in 2004) has crippled the American healthcare system. The United States is best equipped to treat acute ailments, not long-term illnesses such as cancer, diabetes, hypertension, stroke, heart disease, asthma, and mental disorders, because the system is flooded with medical specialists instead of primary care physicians (Ameringer, 2012). European countries place particular emphasis on primary care physicians (internists, pediatricians, and family practitioners), while only 32% of doctors in the United States practice primary care (Ameringer, 2012). The United States has fewer general practitioners per 1,000 residents than Canada, France, Germany, Italy, Japan, and the United Kingdom. Susan Dentzer, a member of the Institute of Medicine, has commented that nations with strong primary care systems deliver better-quality health care at lower costs, yet the number of primary care physicians in America has grown a meager 4% since 1997 while the country's population has grown 20% over the same period (Tully, 2014). This trend led the Association of American Medical Colleges to estimate that the United States will face a shortage of roughly 46,000 primary care doctors by 2020, a medical crisis in its own right. The current supply of American physicians is out of balance, with too little supply of and too much demand for primary care. The steady decline in the number of primary care doctors drives up prices and eliminates competition, degrading the system for the consumer (Tully, 2014).

According to Merritt Hawkins, a physician search and consulting firm, a primary care physician in America earns an average of $185,000 annually, while medical specialists earn an average of $300,000 (Tully, 2014). This disparity is a result of fee-for-service payment, the structure to which the United States subscribes: a catalog of procedures, each with a corresponding code and price, dictates how much a doctor is paid. This presents a problem for primary care doctors, because the typical services they perform do not correspond cleanly with a code and dollar amount. A hip replacement, for example, is easier to label and reimburse than an internist's decision about the correct insulin dosage for a diabetic. This structural flaw discourages medical students from entering primary care (only one fourth of new doctors are general practitioners), a potentially disastrous trend for the American patient (Tully, 2014).

In an attempt to rid the healthcare system of these issues, the Patient Protection and Affordable Care Act was passed on March 23, 2010. This policy overhaul was the first drastic alteration to the healthcare system since the passage of Medicare and Medicaid in the 1960s, and the law has proven to be the major legislative accomplishment of the Obama Administration. While the Affordable Care Act contains roughly eight distinct provisions that alter the landscape of the healthcare system, none of them directly addresses the flaws of the pharmaceutical industry. Instead, the law focuses on increasing access to healthcare through expansions and standards. The largest, and arguably most controversial, component of the law is the individual mandate: the requirement that all United States citizens not already covered by some form of private insurance or federal assistance purchase health insurance. By broadening the insured pool, the federal government aims to reduce costs. If a person opts out of insurance mandated by the Affordable Care Act, the 2014 tax penalty is $95 or 1% of household income, whichever is higher (n.a., 2014). This fee rises to $325 or 2% in 2015, and to $695 or 2.5% in 2016 (n.a., 2014).
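Expressed as a computation, the penalty is simply the larger of a flat fee and a percentage of household income. The sketch below is a minimal illustration of the schedule just described; the function name is invented here, and the real penalty rules included further details (per-child amounts, family caps, and filing-threshold adjustments) that are omitted.

```python
# Minimal sketch of the individual mandate penalty schedule described
# above: a flat fee or a percentage of household income, whichever is
# higher. Omits per-child amounts, family caps, and other real-world
# adjustments.

PENALTY_SCHEDULE = {
    2014: (95.0, 0.01),    # $95 or 1% of household income
    2015: (325.0, 0.02),   # $325 or 2%
    2016: (695.0, 0.025),  # $695 or 2.5%
}

def individual_mandate_penalty(year: int, household_income: float) -> float:
    """Return the larger of the flat fee and the income-based amount."""
    flat_fee, income_rate = PENALTY_SCHEDULE[year]
    return max(flat_fee, income_rate * household_income)

# A household earning $60,000 owes the income-based amount in 2014
# ($600), since it exceeds the $95 flat fee.
print(individual_mandate_penalty(2014, 60_000))  # 600.0
print(individual_mandate_penalty(2016, 20_000))  # 695.0
```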

While the individual mandate has endured expansive and countless Republican attacks (notwithstanding the fact that the individual mandate was first developed and implemented in Massachusetts under Republican Governor Mitt Romney), a 5-to-4 Supreme Court decision in National Federation of Independent Business v. Sebelius (2012) ruled that the individual mandate falls within Congress's constitutional power to tax. Also included in the Affordable Care Act is the requirement that insurance companies may no longer discriminate against individuals with preexisting conditions by charging them higher prices or denying them coverage altogether. Additionally, the Affordable Care Act sets minimum standards for insurance plans and the services they must provide, so that plans deliver comprehensive care to those they cover. As a result, some plans may be discontinued for inadequacy, as defined by the federal government, in properly covering those they insure. Also outlined in the legislation is a more inclusive version of Medicaid, raising the eligibility threshold to incomes at 133% of the federal poverty level (Glied & Ma, 2013). This figure is often quoted as 138% because the Affordable Care Act also updates and streamlines the process for calculating income across states (n.a., 2014). The Affordable Care Act does not alter the federal-state relationship through which Medicaid operates, a point highlighted by the ruling in National Federation of Independent Business v. Sebelius, which found it unconstitutional for the national government to suspend states' existing Medicaid funding because they refused to expand their Medicaid programs. As a direct result of this decision, 21 states have refused to expand their Medicaid programs, even though the federal government will fully subsidize the expansions until 2022 and will pay 90% of their cost thereafter (Glied & Ma, 2013).

The expansions that have taken place, however, now cover 4.5 million new enrollees, bolstering the potency of the Affordable Care Act (Klein, 2014). In addition to expanding Medicaid eligibility, the Affordable Care Act attempts to stabilize, if not strengthen, the primary care system in the United States. Much of the blame for the system's instability lies with the fee-for-service methodology plaguing healthcare payment, and the Affordable Care Act is reforming Medicare payments from fee-for-service to an entirely different model: bundled payments. This consolidates countless itemized charges into a single payment based on the estimated cost of an episode of care. To address the primary care dearth in America, the Affordable Care Act also created Accountable Care Organizations (ACOs): large, integrated groups of physicians, practices, and hospitals that follow a set of requirements established by Medicare and work as a unit for the patient rather than as disjointed entities (Tully, 2014). The law further encourages doctors to become general practitioners: the 2014 budget allots an additional $5.23 billion over ten years to aid in the training of new doctors, while also creating 1,300 new residency spots for those interested in entering the primary care field (Tully, 2014).

The linkage between employers and health insurance is another focus of the Affordable Care Act, the target of the employer mandate. Similar to the tax penalty that individuals must pay for opting out of the healthcare market, employers must pay an employer shared responsibility payment on their federal tax returns if they fail to provide health insurance for all full-time employees. The mandate applies only to businesses with 50 or more employees: those with 100 or more must start providing benefits by 2015, while smaller businesses (50-99 employees) must do so by 2016 (n.a., 2014). The fee for refusing to provide health insurance is $2,000 annually per employee, with the first 30 full-time employees not counting toward the fee (n.a., 2014).
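The fee structure just described lends itself to a simple calculation: subtract the 30 exempt employees, then multiply by $2,000. The sketch below models only the figures quoted in this essay; the function name is invented for this example, and the actual regulation contains phase-in rules and coverage tests not captured here.

```python
# Sketch of the employer shared responsibility payment using the
# figures quoted above: $2,000 per full-time employee per year, the
# first 30 employees exempt, and no penalty below 50 employees.
# Real-world phase-in rules and coverage tests are omitted.

FEE_PER_EMPLOYEE = 2_000.0
EXEMPT_EMPLOYEES = 30
MANDATE_THRESHOLD = 50

def employer_penalty(full_time_employees: int) -> float:
    """Annual fee for a covered employer that offers no insurance."""
    if full_time_employees < MANDATE_THRESHOLD:
        return 0.0  # businesses under 50 employees are exempt
    billable = max(0, full_time_employees - EXEMPT_EMPLOYEES)
    return billable * FEE_PER_EMPLOYEE

print(employer_penalty(49))   # 0.0 (below the 50-employee threshold)
print(employer_penalty(120))  # 180000.0 = (120 - 30) * $2,000
```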

These requirements may seem drastic or unjust, but 96% of businesses in the United States are entirely unaffected by the mandate (n.a., 2014). In the end, the employer mandate impacts only the 0.2% of the nation's small businesses that will be required to provide health insurance for their employees or pay a tax penalty (n.a., 2014).

The nature and structure of the American healthcare system is complicated and messy. It is composed of historical accidents and inconvenient timing, all culminating in an inefficient, inadequate, arguably discriminatory entity that has become a political agenda instead of a public service. Not all hope is lost, though. Twenty years ago, even ten years ago, the prospect of sweeping healthcare reform was counted an impossibility, a cumbersome and futile hope discarded four decades prior. This reform, the Patient Protection and Affordable Care Act, will probably not be the absolute, definitive answer. Nor will it be the demise of the nation, as some like to suggest. It is a middle-ground piece of legislation that will prove beneficial in some realms and detrimental in others, but it is reform nonetheless. If it serves no other purpose, the Affordable Care Act renews the national dialogue about healthcare and its lackluster history, a vital step in the continuation of the (literal) life of the United States.
