
T

Topical aff must be legislation -- judicial action is not T


Establish means legislate -- courts only rule on established law
Websters 10 Webster's New World College Dictionary Copyright 2010 by
Wiley Publishing, Inc., Cleveland, Ohio.
Used by arrangement with John Wiley & Sons, Inc.
http://www.yourdictionary.com/establish
establish to order, ordain, or enact (a law, statute, etc.) permanently

Policy requires Congress


Koch 6 - Dudley W. Woodbridge Professor of Law, William and Mary School of Law.
B.A., University of Maryland, not that Charles Koch (Charles, FCC v. WNCN
LISTENERS GUILD: AN OLD-FASHIONED REMEDY FOR WHAT AILS CURRENT JUDICIAL
REVIEW LAW, Administrative Law Review vol 58, Hein Online)
Of these, Judge McGowan's opinion, in particular, provides a theoretically sound and useful framework. Judge
McGowan focused the Circuit's disagreement on the "reading of the [a]ct" in which judicial authority is dominant. 8
Thus, he selected the battleground advantageous to [BEGIN FOOTNOTE] 3. See FCC v. Sanders Bros. Radio Station,
309 U.S. 470, 475 (1940) (stating that Congress wished to allow broadcasters to compete and to succeed or fail
based on the ability to offer programs attractive to the public). 4. FCC v. WNCN Listeners Guild, 450 U.S. at 589. 5.
Id. at 591. In the broad sense, "policy" decisions are those that advance or protect some collective goals of the
community as opposed to those decisions that respect or secure some individual or group rights. See also Ronald
Dworkin, Hard Cases, 88 HARv. L. REV. 1057, 1058 (1975), reprinted in RONALD DWORKIN, TAKING RIGHTS
SERIOUSLY 81-130 (1977) (exploring the distinction between arguments of principle and policy); HENRY M. HART, JR.
& ALBERT M. SACKS, THE LEGAL PROCESS: BASIC PROBLEMS IN THE MAKING AND APPLICATION OF LAW 141 (William N. Eskridge, Jr. & Philip P. Frickey ed., 1994) ("A policy is simply a statement of objectives."). Here the term "policy" means such decisions assigned to the agency and policies made by legislators are embodied in the statutory language and hence are not "made" either by the agency or the courts, but are derived through the various techniques of statutory interpretation. 6. FCC v. WNCN Listeners Guild, 450 U.S. at 592-93. See, e.g., Ronald M. Levin, Identifying Questions of Law in Administrative Law, 74 GEO. L.J. 1 (1985) (scrutinizing the difference between questions of law and other questions, such as policy). 7. WNCN Listeners Guild v. FCC, 610 F.2d 838, 838 (D.C. Cir. 1979). 8. Id. at 842.

The Chevron doctrine makes no change in this fundamental principle. See, e.g., Great Plains Coop. v. CFTC, 205 F.3d 353, 356 (8th Cir. 2000) (using the Chevron opinion as
supporting the conclusion that "statutory interpretation is the province of the judiciary"); Antipova v. U.S. Att'y Gen.,
392 F.3d 1259, 1261 (11th Cir. 2004) (explaining that the court reviews "the agency's statutory interpretation of its
laws and regulations de novo .... However, we defer to the agency's interpretation if it is reasonable and does not
contradict the clear intent of Congress"). See generally 3 CHARLES H. KOCH, JR., ADMINISTRATIVE LAW AND
PRACTICE 12.32[1] (2d ed. 1997) (offering many more examples). [END FOOTNOTE] the court. He nonetheless
noted that an administrative decision under delegated policymaking authority would be subject only to hard look
review, which he properly characterizes: "[The Commission] must take a 'hard look' at the salient problems." 9 That
is, the court must assure that the agency took a hard look, not take a hard look itself. "Only [the Commission], and
not this court, has the expertise to formulate rules well-tailored to the intricacies of radio broadcasting, and the flexibility to adjust those rules to changing conditions .... And only it has the power to determine how to perform its regulatory function within the substantive and procedural bounds of applicable law." 10 In other words, the court
must assure that the agency is acting within its statutory authority and, once it determines the agency is acting
within delegated policymaking authority, the court is largely out of the picture. Upon crossing this boundary, the
judicial job is limited to assuring that the policy is not arbitrary by determining whether the agency took a hard
look. The basic review system is revealed as Judge McGowan continues: "[The prior case] represents, not a policy,
but rather the law of the land as enacted by Congress and interpreted by the Court...."" He properly noted that

this distinction not only implicates the allocation of decisionmaking


authority between a reviewing court and an agency, but between both and
Congress: This court has neither the expertise nor the constitutional
authority to make "policy" as the word is commonly understood .... That
role is reserved to the Congress , and, within the bounds of delegated authority, to the
Commission. But in matters of interpreting the "law" the final say is constitutionally committed to the judiciary . . . .
Although the distinction between law and policy is never clearcut, it is nonetheless a touchstone of the proper
relation between court and agency that we ignore at our peril.

The affirmative interpretation is bad for debate.


Limits and ground are necessary for negative preparation and
clash. We permit a good number of cases, but they make the
topic too big. The agent adds a whole new set of plans -- all
the district and circuit courts are possible agents. This is
compounded by issues of legal precedents and impacts having
nothing to do with emissions.

T
We meet - Congress brings the law into force
CI - Establish means to put into force
Webster, Merriam-Webster products and services are backed by the largest team
of professional dictionary editors and writers in America, and one of the largest in
the world, establish, no date, http://www.merriamwebster.com/dictionary/establish
Simple Definition of establish : to cause (someone or something) to be widely known and accepted : to put (someone or something) in a position, role, etc., that will last for a long time

Prefer this interpretation:


1) Neg ground - you get disads to the enforcement of the law; includes politics because enforcement has electoral consequences. CPs are in the lit; disad links are comparative to the status quo.
2) Aff ground - defending new programs isn't means tested; solvency advocates are weak.
3) CPP is core of the topic - it's the biggest regulatory attempt by the EPA; excluding it is bad for education and guarantees neg DA links are always non-unique.
Any interp that excludes the CPP is bad - core of the topic.
Gamboa, Suzanne, NBC News Senior writer covering Latinos and politics, 5
Questions: Latina Climate Scientist On Carbon Emissions Rule, June 29, 2015,
http://www.nbcnews.com/news/latino/5-questions-latina-scientist-carbon-emissionsrule-n403271
NBC: What is the big announcement on plant emissions from the president? Hernandez Hammer: The White House is releasing a Clean Power Plan that will start moving the U.S. toward a clean energy economy - it's the first ever restriction on carbon emissions. We've had other restrictions - but we haven't had any on carbon emissions and this is a huge step in limiting carbon

pollution. (Carbon dioxide is the primary greenhouse gas contributing to climate change.) The goal is that by 2030, the U.S. would
reduce carbon emissions from coal-fired plants by 32 percent. It's not only good for the U.S. but also, in terms of our position in the
world, it allows us to be leaders in encouraging other countries to take more steps toward clean energy

Prefer reasonability - competing interpretations is a race to the bottom; we should have to significantly alter negative link ground; our aff is the most core of the topic.

2NC

2NC Circumvention
Aff links to uncertain precedent - Roberts will undermine precedent
William D. Araiza, Law Prof @ Brooklyn, Summer 2012, PLAYING WELL WITH
OTHERS-BUT STILL WINNING, 46 Ga. L. Rev. 1059, ln
How can a judge undermine precedent while still following it ? This Essay considers
the methods by which Supreme Court Justices may weaken precedent without
explicitly overruling cases by strategically adopting an approach to stare
decisis that is less explicitly aggressive than their colleagues'. Adding to the literature of "stealth
overruling," this Essay considers examples of such methods from Chief Justice Roberts's first five years on the Supreme Court. These

examples indicate that Chief Justice Roberts knows how to engage in stealth overruling and, more broadly, how to use his colleagues' preferences to maintain a formal commitment to judicial humility while achieving jurisprudential change. As such, they reveal important insights about how Justices can operate strategically to achieve their preferences within both the opportunities and the confines
inherent in a multi-judge court. After five years, many have accused the Roberts Court of aggressively attacking precedent. No less a
figure than Justice O'Connor, whose retirement marked the effective start of that Court, has expressed concern about the Roberts
Court's willingness to overrule prior decisions. n1 Then-Judge Roberts's famous confirmation hearing analogy of judging to umpiring
n2 and his professed respect for stare decisis n3 make for a dramatic narrative in which a nominee piously describes a humble role
for judges but then, once safely confirmed, sets out with a wrecking ball. The charge may have merit, but a short essay is not the
vehicle to make that determination. Simply pointing to a few high-profile [*1061] overrulings, as critics sometimes do, proves little.
n4 Rather, an in-depth examination of the issue requires considering the situations where the overruling dog did not bark-that is,
where the Court could have overruled a prior case but declined to do so. n5 Such an investigation also calls for both historical
perspective and nuance. n6 Reaching interesting conclusions about the Roberts Court's treatment of stare decisis requires that we
identify a baseline of how previous Courts have treated that principle. If impressionistic conclusions based on a few dramatic
examples are enough to consider the charge proven, then the Rehnquist n7 and Warren n8 Courts are presumably guilty also.
Moreover, not all overrulings are created equal. Determining the extent of the Roberts Court's alleged disregard of precedent also
requires considering the importance of the precedents the Court has in fact rejected. Consider Justice White's dissent in INS v.
Chadha. n9 White characterized the majority's rejection of the legislative veto as effectively striking down hundreds of statutes and
eliminating a then-major feature of the modern administrative state. n10 Chadha was not a case where the Court overruled
precedent. Justice White's complaint about the far-reaching nature of the Court's decision, however, reminds us that identifying
judicial aggressiveness, whatever its form, requires [*1062] more than simply adding up the number of cases where the Court has
acted aggressively. n11 This Essay considers the Roberts Court and stare decisis from a different angle. It examines several methods

by which Chief Justice Roberts arguably has used the multi-judge nature of the Supreme Court to his advantage in undermining precedent without explicitly calling for its overruling. n12 These examples do not prove that the Court as a whole, or the Chief Justice in particular, is bent on undoing the work of prior Courts. Instead, they illustrate the ways in which a Justice can work within the formal confines of precedent to achieve fundamentally different results, either in the short or long term. n13 The methods described below depend in part on the distinction between the result a
court reaches in a case and the reasoning it employs. The nature of the Supreme Court as a multi-judge court makes this distinction
possible: often times, the Court may agree on a result but split sharply on its reasoning. n14 This opens up room for a creative
Justice to undermine precedent, even as the Justice expresses reasons that appear moderate-in particular, more moderate than
those who are more inclined to overrule explicitly. In so doing, the Justice may create the conditions for the ultimate rejection of that
precedent, even while publicly counseling restraint-indeed, even while voting to uphold that [*1063] precedent. n15 In short, this
Essay considers methods by which Justices can play well with others-both those that came before (via respect for stare decisis) and
current colleagues (by strategically positioning themselves among them)-and still achieve their ultimate goal. n16 This Essay

situates itself at the intersection of two ongoing debates about judicial behavior. The first examines the concept of stealth overruling - the practice of limiting or even eviscerating a precedent while ostensibly remaining faithful to it. n17 This phenomenon has become a major topic of scholarly discussion

during the last five years, n18, as scholars have identified and analyzed examples of the Roberts Court engaging in such conduct - conduct generally thought to have resulted from the replacement of a sometimes centrist Justice O'Connor with a more reliably
conservative Justice Alito. n19 The examples in this Essay illustrate instances where the Court or a plurality thereof arguably has
engaged in such conduct. n20 The lessons one can draw from these examples will help shape an understanding of the stealth
overruling phenomenon, and the extent to which the Roberts Court performs it. Second, this Essay engages the debate about the
implications of the Supreme Court's character as a collegial body. Scholars long have acknowledged that critiques of the Court must
account for its collegial nature rather than simply treating it as a purposive [*1064] individual. n21 This Essay contributes to that

debate by considering how Chief Justice Roberts may in certain cases strategically use his colleagues' calls for more explicit overruling of precedent as a tool in maintaining his and the Court's reputation as faithful to stare decisis while nevertheless pushing the law away from precedents.

Warming

D
Warming much slower than their impacts assume - their models are flawed and our authors use the newest and best science
volcanoes, solar forcing, natural variability
Fyfe et. al 16 [John, Canadian Centre for Climate Modelling and Analysis,
Environment and Climate Change university of Vancouver, Gerald Meehl, National
Center for Atmospheric Research, Boulder, Colorado, Matthew England, ARC Centre
of Excellence for Climate System Science, University of New South Wales, Michael
Mann, Department of Meteorology and Earth and Environmental Systems Institute,
Pennsylvania State University, Benjamin Santer, Program for Climate Model
Diagnosis and Intercomparison (PCMDI), Lawrence Livermore National Laboratory,
Gregory Flato, Canadian Centre for Climate Modelling and Analysis, Environment
and Climate Change Canada, University of Victoria, Ed Hawkins, National Centre for
Atmospheric Science, Department of Meteorology, University of Reading, Nathan
Gillet, Canadian Centre for Climate Modelling and Analysis, Environment and
Climate Change Canada, University of Victoria, Shang-Ping Xie, Scripps Institution of
Oceanography, University of California San Diego, Yu Kosaka, Research Center for
Advanced Science and Technology, University of Tokyo, Making sense of the early-2000s warming slowdown, Nature Journal March 2016, pg. 227-28]

Our results support previous findings of a reduced rate of surface warming over the 2001-2014 period - a period in which anthropogenic forcing increased at a relatively
constant rate. Recent research that has identified and corrected the errors and inhomogeneities in the surface air
temperature record4 is of high scientific value. Investigations have also identified non-climatic artefacts in
tropospheric temperatures inferred from radiosondes30 and satellites31, and important errors in ocean heat
uptake estimates25. Newly identified observational errors do not, however, negate

the existence of a real reduction in the surface warming rate in the early
twenty-first century relative to the 1970s-1990s. This reduction arises through the combined effects of internal decadal variability 11-18, volcanic 19,23 and solar activity, and decadal changes in anthropogenic aerosol forcing 32. The warming
slowdown has motivated substantial research into decadal climate variability and uncertainties in key external
forcings. As a result, the scientific community is now better able to explain temperature variations such as
those experienced during the early twenty-first century33, and perhaps even to make skilful predictions of such
fluctuations in the future. For example, climate model predictions initialized with recent observations indicate a
transition to a positive phase of the IPO with increased rates of global surface temperature warming (ref. 34, and

G.A. Meehl, A. Hu and H. Teng, manuscript in preparation). In summary, climate models did not (on average) reproduce the observed temperature trend over the early twenty-first century6, in spite of the continued increase in anthropogenic forcing.

This mismatch focused attention on a compelling science problem - a problem deserving of scientific scrutiny.

Based on our analysis, which relies on physical understanding of the key processes and forcings
involved, we find that the rate of warming over the early twenty-first century is slower than that of the
previous few decades. This slowdown is evident in time series of GMST and in the global mean
temperature of the lower troposphere. The magnitude and statistical significance of observed trends (and the
magnitude and significance of their differences relative to model expectations) depends on the start and end dates

of the intervals considered23. Research into the nature and causes of the slowdown has triggered improved understanding of observational biases, radiative forcing and internal variability. This has led to widespread recognition that modulation by internal variability is large enough to produce a significantly reduced rate of surface temperature increase for a decade or even more, particularly if internal variability is augmented by the externally driven cooling caused by a succession of volcanic eruptions. The legacy of this new understanding will certainly outlive the recent warming slowdown. This is particularly true in the embryonic field of decadal

climate prediction, where the challenge is to simulate how the combined effects of external forcing and internal
variability produce the time-evolving regional climate we will experience over the next ten years.

Adaptation innovation is limitless - their impact authors don't assume innovation
Indur Goklany 10, policy analyst for the Department of the Interior phd from
MSU, Population, Consumption, Carbon Emissions, and Human Well-Being in the
Age of Industrialization (Part IV - There Are No PAT Answers, or Why Neo-Malthusians Get It Wrong), April 26,
http://www.masterresource.org/2010/04/population-consumption-carbon-emissionsand-human-well-being-in-the-age-of-industrialization-part-iv-there-are-no-patanswers-or-why-neo-malthusians-get-it-wrong/
Neo-Malthusians believe that humanity is doomed unless it reins in population, affluence and technological change,
and the associated consumption of materials, energy and chemicals. But, as shown in the previous posts and elsewhere,

empirical data on virtually every objective indicator of human well-being


indicates that the state of humanity has never been better, despite unprecedented
levels of population, economic development, and new technologies . In fact, human
beings have never been longer lived, healthier, wealthier, more educated, freer, and
more equal than they are today. Why does the Neo-Malthusian worldview fail the
reality check? The fundamental reasons why their projections fail are because they
assume that population, affluence and technology - the three terms on the right hand side of the IPAT equation - are independent of each other. Equally importantly, they have
misunderstood the nature of each of these terms, and the nature of the
misunderstanding is essentially the same, namely, that contrary to their claims,
each of these factors - instead of making matters progressively worse - is, in the long run, necessary for solving whatever problems plague humanity. Compounding these
misunderstandings, environmentalists and Neo-Malthusians frequently conflate
human well-being with environmental well-being . While the latter influences the former, the
two arent the same. Few inside, and even fewer outside, rich countries would rank environmental indicators

among the most important indicators of human well-being except, possibly, access to safe water and sanitation.
These two environmental indicators also double as indicators of human well-being because they have a large and

direct bearing on human health. In any case, they are subsumed within life expectancy, which, as noted, is the single most important indicator of human well-being. The UNDP's Human
Development Index, for instance, uses three indicators life expectancy, per capita income and some combined
measure of education and literacy. None of these three are related to the environment. The disconnect between
environmental indicators and indicators of human well-being is further evidenced by the fact that over the last

century, the most critical indicators of human well-being - life expectancy, mortality rates, prevalence of hunger and malnutrition, literacy, education, child labor, or poverty - generally improved regardless of whether environmental indicators (e.g., levels of air and water pollution, loss of biodiversity) fluctuated up or down (see, e.g., the previous post and here). Moreover, fears that the world's population would continue to increase exponentially have failed to materialize. The world's population growth rate peaked in the late 1960s. Population increased by 10.6% from 1965-70, but only 6.0% from 2000-05. Many countries are now concerned that fewer
young people means that their social security systems are unsustainable.

Projections now suggest that

the worlds population may peak at around 9 billion around mid-century (see here). The
slowdown in the population growth rate, unanticipated by Neo-Malthusians, can be
attributed to the fact that population (P) is dependent on affluence (or the desire for
affluence) and technology (A and T in the IPAT equation). Empirical data show that as people
get wealthier or desire greater wealth for themselves or their offspring, they tend to
have fewer children. Cross-country data shows that the total fertility rate (TFR), which measures the
number of children per women of child-bearing age, drops as affluence (measured by GDP per capita)
increases (see Figure 1). Moreover, for any given level of affluence, TFR has generally
dropped over time because of changes in technology, and societal attitudes shaped
by the desire for economic development (see here). Most importantly, it is not, contrary to
Neo-Malthusian fears, doomed to rise inexorably , absent coercive policies. Neo-Malthusians
also overlook the fact that, in general, affluence, technology and human well-being
reinforce each other in a Cycle of Progress (Goklany 2007a, pp. 79-97). If existing
technologies are unable to reduce impacts or otherwise improve the quality of life,
wealth and human capital can be harnessed to improve existing technologies or
create new ones that will. HIV/AIDS is a case in point. The world was unprepared to deal with HIV/AIDS
when it first appeared. For practical purposes, it was a death sentence for anyone who got it. It took the wealth of
the most developed countries to harness the human capital to develop an understanding of the disease and devise
therapies. From 1995 to 2004, age-adjusted death rates due to HIV declined by over 70 percent in the US (USBC
2008). Rich countries now cope with it, and developing countries are benefiting from the technologies that the
former developed through the application of economic and human resources, and institutions at their disposal.
Moreover, both technology and affluence are necessary because while technology provides the methods to reduce

problems afflicting humanity, including environmental problems, affluence provides the means to research, develop and afford the necessary technologies. Not surprisingly, access to HIV

therapies is greater in developed countries than in developing countries. And in many developing countries access
would be even lower but for wealthy charities and governments from rich countries (Goklany 2007a, pp. 7997).
Because technology is largely based on accretion of knowledge, it ought to advance with time, independent of
affluence provided society is open to scientific and technological inquiry and does not squelch technological

change for whatever reason. Consequently, indicators of human well-being improve not only with affluence but also with time (a surrogate for technology). This is evident in Figure 1, which shows
TFR dropping with time for any specific level of GDP per capita. It is also illustrated in Figure 2 for life expectancy,

which shows that wealthier societies have higher average life expectancies, and that the entire life expectancy curve has been raised upward with the passage of time, a surrogate for technological change (broadly defined). Other indicators of human well-being - e.g., crop yield, food supplies per capita, access to safe water and sanitation, literacy, mortality - also improve with affluence and, separately, with time/technology (see here and here). This indicates that secular technological change and economic development, rather than making matters worse, have actually enhanced society's ability to solve its problems and advanced its quality of life. Moreover, population is not just a factor in consumption. It is the basis for human capital. No humans, no human capital. Humans are not just mouths, but also hands and brains. As famously noted by Julian Simon, they are the Ultimate Resource. This is something Neo-Malthusians have difficulty in comprehending.

Notably, a World Bank study, Where is the Wealth of Nations?, indicated that human capital and the value of
institutions constitute the largest share of wealth in virtually all countries. A population that is poor, with low
human capital, low affluence, and lacking in technological knowhow is more likely to have higher mortality rates,
and lower life expectancy than a population that is well educated, affluent and technologically sophisticated, no

matter what its size. These factors - human capital, affluence and technology - acting in concert over the long haul, have enabled technology for the most part to improve matters faster than any deterioration due to population, affluence (GDP per person) or their product (GDP). This has helped keep environmental damage in check (e.g., for cropland, a measure of habitat converted to human uses) or even reverse it (e.g., for water pollution, and indoor and

traditional outdoor air pollution), particularly in the richer countries. Note that since the product of population (P)
and affluence (A or GDP per capita) is equivalent to the GDP then according to the IPAT identity, which specifies that
I = P x A x T, the technology term (T) is by definition the impact (I) per GDP (see Part II in this series of posts). Ill
call this the impact intensity. If the impact is specified in terms of emissions, then the technology term is equivalent
to the emissions intensity, that is, emissions per GDP. Therefore the change in impact intensity (or emissions
intensity) over a specified period is a measure of technological change over that period. Since matters improve if
impact/emissions intensity drops, a negative sign in front of the change in impact intensity denotes that
technological change has reduced the impact. Table 1 shows estimates of the changes in impacts intensity, or
technological change, over the long term for a sample of environmental indicators for various time periods and
geographical aggregations. Additional results regarding technological change over different time periods and

countries are available from the original source (here). These results indicate that, in the long run, technological change has, more often than not, reduced impacts. The reduction in many cases is by an order of magnitude or more! Thus, notwithstanding plausible Neo-Malthusian arguments that technological change would eventually increase environmental impacts, historical data suggest that, in fact, technological change ultimately reduces impacts, provided technology

is not rejected through an inappropriate exercise of the precautionary principle or compromised via subsidies (which

usually flow from the general public to politically favored elements of society). To summarize, although population, affluence and technology can create some problems for humanity and the planet, they are also the agents for solving these very problems. In the IPAT equation, the dependence of the I term on the P, A and T terms is not fixed. It evolves over time. And the Neo-Malthusian mistake has been to assume that the relationship is fixed, or if it is not, then it changes for the worse. A corollary to this is that projections of future impacts spanning a few decades but which do not account for technological change as a function of time and affluence, more likely than not, will overestimate impacts, perhaps by orders of magnitude. In fact, this is one reason why many estimates of the future impacts of climate change are suspect, because most do not account for changes in adaptive capacity either due to secular technological change or increases in economic development (see here and here). Famously, Yogi Berra is

supposed to have said, Its tough to make predictions, especially about the future. Most analysts recognize this.
They know that just because one can explain and hindcast the past, it does not guarantee that one can forecast the

future. Neo-Malthusians, by contrast, cannot hindcast the past but are confident they can forecast the future. Finally, had the solutions they espouse been put into effect a couple of centuries ago,

most of us alive today would be dead and those who were not would be living poorer, shorter, and unhealthier lives,
constantly subject to the vagaries of nature, surviving from harvest to harvest, spending more of our time in
darkness because lighting would be a luxury, and our days in the drudgery of menial tasks because under their
skewed application of the precautionary principle (see here, here and here) fossil fuel consumption would be

severely curtailed, if not banned. Nor would the rest of nature necessarily be better off. First, lower reliance on fossil fuels would mean greater demand for fuelwood, and the forests would be denuded. Second, less fossil fuels also means less fertilizer and pesticides and, therefore, lower agricultural productivity. To compensate for lost productivity, more habitat would need to be converted to agricultural uses. But habitat conversion (including deforestation) - not climate change - is already the greatest threat to biodiversity!
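Analytic note on the IPAT arithmetic the Goklany evidence invokes (a minimal restatement of the identity described in the card itself, not additional evidence): because GDP is population times affluence (GDP per person), the technology term is simply impact per unit of GDP ("impact intensity"), and the percentage change in that intensity over a period is the card's measure of technological change - a negative change means technology reduced the impact.

\[
I = P \times A \times T, \qquad \text{GDP} = P \times A \;\Rightarrow\; T = \frac{I}{\text{GDP}}, \qquad
\Delta T_{t_0 \rightarrow t_1} = \frac{I_{t_1}/\text{GDP}_{t_1}}{I_{t_0}/\text{GDP}_{t_0}} - 1
\]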

China
Relations fail but no impact to hostility
Blackwill 2009 former US ambassador to India and US National Security
Council Deputy for Iraq, former dean of the Kennedy School of Government at
Harvard (Robert D., RAND, The Geopolitical Consequences of the World Economic
Recession - A Caution,
http://www.rand.org/pubs/occasional_papers/2009/RAND_OP275.pdf, WEA)
Alternatively, will the current world economic crisis change relations between China and the United States in a
much more positive and intimate direction, producing what some are calling a transcendent G-2? This seems

the United States and China have profoundly


different visions of Asian security. For Washington, maintaining U.S.
alliances in Asia is the hub of its concept of Asian security, whereas, for Beijing,
Americas alliance system is a destabilizing factor in Asian security and over time
should wither away. These opposing concepts will be an enduring source of
tension between the two sides. Second, these two countries systematically
prepare for war against one another, which is reflected in their military
doctrines, their weapons procurement and force modernization, and their deployments and military exercises.
As long as this is the case, it will provide a formidable psychological and material
barrier to much closer bilateral relations. Third, the United States is critical of
Chinas external resource acquisition policy, which Washington believes could threaten
both American economic and security interests in the developing world. Fourth, despite their deep
economic dependence on each other, U.S.-China economic relations are inherently
fragile. China sells too much to the United States and buys too little, and the
United States saves too little and borrows too much from China. This will inevitably lead to a
backlash in the United States and a Chinese preoccupation with the value of its American investments. Fifth,
Chinese environmental policy will be an increasing problem, both for U.S.
policymakers who are committed to bringing China fully into global efforts to reduce climate degradation and for
Chinese leaders who are just as determined to emphasize domestic economic growth
over international climate regimes. Sixth, China and the United States have wholly
different domestic political arrangements that make a sustained entente
difficult to manage. Americans continue to care about human rights in China, and Beijing resents what it
improbable for seven reasons. First,

regards as U.S. interference in its domestic affairs. This will be a drag on the bilateral relationship for the

foreseeable future. And seventh, any extended application by Washington of Chimerica, as Moritz Schularick of Berlin's Free University has called it,23 would so alarm America's Asian allies, beginning with Japan, that the United States would soon retreat from the concept.24 Nevertheless, these factors are unlikely to lead to a substantial downturn in U.S.-China bilateral ties. In addition to their economic interdependence, both nations have important reasons to keep their interaction more or less stable. As

Washington wants to concentrate on its many problems elsewhere in the world, especially in the Greater Middle
East, Beijing prefers to keep its focus on its domestic economic development and political stability. Neither wants
the bilateral relationship to get out of hand. In sum, a positive strategic breakthrough in the U.S.-China relationship
or a serious deterioration in bilateral interaction both seem doubtful in the period ahead. And the current economic

downturn will not essentially affect the abiding primary and constraining factors on the two sides. Therefore, the U.S.-China relationship in five years will probably look pretty much as it does today - part cooperation, part competition, part suspicion - unaffected by today's economic time of troubles, except in the increasingly unlikely event of a cross-strait crisis and confrontation.

Econ

2NC OV
GHG regs impose huge economic costs on manufacturing -- ensures offshoring and leakage -- turns the case and causes backlash against future mitigation strategies
Aldy, 16 (Public Policy Prof- JFK School of Government at Harvard University,
"Frameworks for Evaluating Policy Approaches to Address the Competitiveness
Concerns of Mitigating Greenhouse Gas Emissions." Resources for the Future
Discussion Paper (2016): 16-06)
Most public policies intended to mitigate greenhouse gas emissions impose
economic costs. Requiring automobile manufacturers to improve the fuel economy of the cars they sell
increases the costs of making new cars and translates into higher prices faced by consumers. Mandating
utilities to lower the carbon intensity of their power generation will cause them to
shift investment into higher-cost generating technologies, which in turn will
result in higher electricity rates. Setting a price on carbon for fossil fuels
throughout the economy will raise energy prices. The costs of these climate policies
may negatively affect domestic firms if their competitors do not face comparable emissions
regulation or taxation. In particular, energy intensive manufacturing industries have
expressed concerns that domestic climate change policy could impose adverse
competitiveness effects because it would raise their production costs
relative to those of their foreign competitors. To be more exact, the
competitiveness effect reflects the impacts of the differential in carbon prices or the
effective gap in the shadow price of carbon between two domestic climate
programs on those countries net imports. Thus firms operating under the higher carbon price
experience adverse competitiveness effects if their domestic or foreign market share declines. In turn, this could
result in lower production, job loss, and relocation of factories and related
operations to countries without a domestic climate policy (Jaffe et al. 2009). These
competitiveness effects have more than just economic consequences. The potential for relocating
emissions-intensive activities to unregulated countries would result in higher
emissions in these countries than they would have experienced otherwise. This
emissions leakage would undermine the environmental benefits of the domestic
climate policy and lower societal welfare . Moreover, implementing a public policy
that results in both job loss and lower-than-expected environmental
benefits could weaken public and political support for mitigating
greenhouse gas emissions. Policymakers have several options for addressing these competitiveness
risks. They could impose tariffs reflecting the embedded carbon emissions in imports, such that domestically
produced goods and their foreign competitors face a common carbon price (Weisbach 2015; Agan et al. 2015).
Climate policy could direct benefits to potentially vulnerable firms, such as through free allowance allocations in
cap-and-trade programs or targeted tax credits (Gray and Metcalf 2015; Aldy and Pizer 2009). Some northern
European carbon tax programs have explicitly exempted energy-intensive manufacturing from their carbon tax
(Aldy and Stavins 2012). Policymakers could work through multilateral negotiations to ensure that major trade
partners undertake comparable domestic emissions mitigation policies. They could take such multilateral
coordination a step further by linking domestic mitigation programs among trade partners, which could then yield a
common carbon price for businesses operating under all linked programs. These policy options, however, carry their
own risks. They may run afoul of current obligations under the World Trade Organization (Trachtman 2015). The
design of such policies may result in a loss in social welfare and limit the ability of the government to offset
potentially regressive impacts of pricing carbon. Competitiveness policies may also have important implications for

ongoing international climate negotiations. Finally, the choice and design of competitiveness policies may entail political risks that could also weaken support for the broader

domestic climate change policy program. In this paper, I elaborate in more detail the potential
competitiveness risks of a domestic carbon pricing policy, drawing from an extensive theoretical, modeling, and
statistical literature. I then examine the potential risks and pitfalls associated with policy responses intended to
address competitiveness. Based on this context, the paper concludes with a framework for considering the
economic, environmental, legal, diplomatic, and political factors at play in the design of policy approaches to
address the competitiveness concerns of climate change policy.

Turns the 1st advantage


Eaglen et al 12 Mackenzie Eaglen (American Enterprise Institute), Rebecca Grant (IRIS Research),
Robert P. Haffa (Haffa Defense Consulting) Michael O'Hanlon (The Brookings Institution), Peter W. Singer (The
Brookings Institution), Martin Sullivan (Commonwealth Consulting), and Barry Watts (Center for Strategic and
Budgetary Assessments). The Arsenal of Democracy and How to Preserve It: Key Issues in Defense Industrial
Policy. Brookings Institution. January 2012. https://www.brookings.edu/wpcontent/uploads/2016/06/0126_defense_industrial_base_ohanlon.pdf
Yet there are severe challenges that could result to the nations security interests even with 10 percent cutbacks.
Despite the likely potential of lesser resources, the demand side of the equation does not seem likely to grow easier.

The international security environment is challenging and complex . Chinas


economic, political and now military rise continues. Its direction is uncertain, but it
has already raised tension, especially in the South China Sea . Irans ambitions and
machinations remain foreboding, with its nuclear plans entering a new phase of both capability but also crisis.

North Korea is all the more uncertain with a leadership transition, but has a history
of brinkmanship and indeed even the occasional use of force against the South, not
to mention nuclear weapons-related activities that raise deep concern. And the
hopeful series of revolutions in the broader Arab world in 2011, while inspiring at
many levels, also seem likely to raise uncertainty in the broader Middle East.
Revolutions are inherently unpredictable and often messy geostrategic events. On top of these remain
commitments in Afghanistan and beyond and the frequent U.S. military role in
humanitarian disaster relief. Thus, there are broad challenges for American defense
planners as they try to address this challenging world with fewer available
resources. The current wave of defense cuts is also different than past defense budget reductions in their likely
industrial impact, as the U.S. defense industrial base is in a much different place than it was in the past. Defense
industrial issues are too often viewed through the lens of jobs and pet projects to protect in congressional districts.

But the overall health of the firms that supply the technologies our armed forces
utilize does have national security resonance . Qualitative superiority in weaponry
and other key military technology has become an essential element of American
military power in the modern era not only for winning wars but for deterring
them . That requires world-class scientific and manufacturing
capabilities which in turn can also generate civilian and military export
opportunities for the United States in a globalized marketplace.

UNQ Prices Low


Energy-intensive industry growth is increasing and electricity
prices are low now
Dyl 16 Katie Dyl, Curtin University, Ph.D., Geochemistry, UCLA, Industrial and
electric power sectors drive projected growth in U.S. natural gas use, U.S. Energy
Information Administration, May 26th, 2016,
http://www.eia.gov/todayinenergy/detail.cfm?id=26412
U.S. consumption of natural gas is projected to rise from 28 trillion cubic feet (Tcf) in
2015 to 34 Tcf in 2040, an average increase of about 1% annually, according to EIA's Annual Energy Outlook 2016

(AEO2016) Reference case. The industrial and electric power sectors make up 49% and 34% of this growth, respectively, while consumption growth in the residential, commercial, and transportation sectors is much lower. Much of this growth in natural gas consumption results from relatively low natural gas prices. In the AEO2016 Reference case, average annual U.S. natural gas

prices at the Henry Hub are expected to remain around or below $5.00 per million British thermal units (MMBtu) (in
2015 dollars) through 2040. The Henry Hub spot price averaged $2.62/MMBtu in 2015, the lowest annual average
price since 1995. Prices rise through 2020 in the AEO2016 Reference case projection as natural gas demand
increases, particularly for exports of liquefied natural gas (LNG). Currently, most U.S. natural gas exports are sent to
Mexico by pipeline, but LNG exports, including those from several facilities currently built or under construction,

account for most of the expected increases in total U.S. natural gas exports through 2020. The persistent, relatively low price of U.S. natural gas is the primary driver for increased natural gas consumption in the industrial sector. Energy-intensive industries and those that use natural gas as a feedstock, such as bulk chemicals, make up most of the increase in natural gas consumption. Low natural gas prices also support long-term consumption growth in the electric power sector. Natural gas use for power generation reached a record high in 2015 and is expected to be high in 2016 as well, likely surpassing coal

on an annual average basis. However, a relatively steep rise in natural gas prices through 2020 (rising 11% per
year) and rapid growth in renewable generation - spurred by renewable tax credits that were extended in 2015 - also contribute to a decline in power generation fueled by natural gas between 2016 and 2021. Throughout the 2020s and 2030s, electricity generation using natural gas increases again. Because natural gas-fired electricity generation produces fewer carbon dioxide emissions than coal-fired generation, natural gas is expected to play a large role in compliance with the Clean Power Plan for existing generation from fossil fuels, which takes effect in 2022. The electric power sector's

total consumption of natural gas from 2020 through 2030 is 6 Tcf greater in the AEO2016 Reference case than in a
case where the Clean Power Plan is not implemented (No CPP).
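A quick check of the growth arithmetic in the EIA card above (an illustrative back-of-the-envelope calculation, not taken from the source): growing from 28 Tcf in 2015 to 34 Tcf in 2040 implies a compound rate of

\[
\left(\frac{34}{28}\right)^{1/25} - 1 \approx 0.8\%\ \text{per year},
\]

which is consistent with the card's "about 1% annually" characterization.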

Nat gas boom solves electricity prices AND the warming advantage - extend Kreutzer - carbon tax disrupts this trajectory
Crawford 6/29 Jonathan Crawford, Bloomberg, Natural Gas Fills the Gap as
Coal Drops Out of U.S. Power Market, 2016,
http://www.bloomberg.com/news/articles/2016-06-29/natural-gas-fills-the-gap-ascoal-drops-out-of-u-s-power-market
Power prices have dropped by 40% in largest wholesale market
Coal burners producing power for 40 million homes were closed
Five years ago, opponents of newly proposed clean-air rules sounded dire warnings of blackouts and surging electricity prices if coal-burning plants were shuttered. Welcome to 2016. Instead of rising, the price of electricity in the nation's largest grid is now 40 percent lower than it was back then, even as a record 346 coal-burning units, producing enough electricity to supply 40 million homes, were retired. The
difference: Americas shale boom unleashed cheap and abundant natural
gas that burns more cleanly than coal. Youve seen the coal come out of the market and then youve seen a
response from industry to capitalize on that hole in the supply mix, said Ethan Paterno, a Denver-based energy

industry specialist with PA Consulting Group. The low gas prices are a big, big deal. The nation's emergence as the world's largest producer of natural gas has not only sped up the closure of coal-burning plants. It's also put the U.S. on a surer path to meeting an international accord to slash global warming pollutants and to

comply with a host of federal environmental mandates estimated to yield billions of dollars in health benefits. The
mid-Atlantic grid, which stretches from Maryland to Chicago, has been ground-zero for coal-plant shutdowns as the
generators compete with gas burners that have access to the cheapest supplies in the country. In that network, the
largest in the U.S., about 20,000 megawatts of gas-fired plants are projected to connect by mid-2019, said Paterno,
or enough generation to power for about 20 million homes. The coal closures were driven mostly by the U.S.
Environmental Protection Agencys pollution rule, which the agency said would cost $9.6 billion annually to
implement. The burning of cleaner burning also produces health benefits, including fewer heart attacks, sick days
and up to 11,000 fewer premature deaths annually, worth $37 billion to $90 billion each year, according to the EPA.

More gas-fired generation also helped the U.S. cut emissions of carbon
dioxide last year to 21 percent below 2005 levels . The country has set a target to cut
greenhouse gases by at least 26 percent below 2005 levels by 2025. Natural gas emits about half as much carbon
dioxide as coal when generating power. The pollution regulation, which requires plants to meet tighter emission
limits on mercury and other toxins that can be met with the installation of costly scrubbers, first came into force in
April 2015. The rule and cheap power prices resulted in the retirement of 13,000 megawatts of coal-fired generation
just last year. Thats just a slice of more than 36,000 megawatts of coal capacity that has been shuttered since
2011. Blackout Fears Power producers including Duke Energy Corp. and Luminant Generation Company LLC warned
in August 2011 that the coal closures would leave the nations grid at risk of price spikes and outages, while a U.S.
government study projected a boost in prices in parts of the coal-heavy Midwest. Senator James Inhofe, a
Republican from Oklahoma, said in August 2011 that the EPA was reckless in proposing a rule that threatened to put
a significant strain on the electric grid from the forced closure of hundreds of coal plants, and which would raise
electricity rates across the country. Wholesale power in PJM Interconnection LLCs benchmark West hub, which
includes deliveries to Washington, averaged $30.08 a megawatt-hour in the first quarter, down 41 percent from

$51.17 in 2011. Greater Efficiency: The transition was smoothed by other factors as well. Milder weather and technologies boosting efficiency, such as energy-conserving light bulbs and

refrigerators, have slowed demand. With consumption stagnant, cheap fuel is an incentive to build replacement
plants. Natural gas prices in Pennsylvania, the home of the most prolific shale reserve, plummeted to as low as 59
cents per million British thermal units last year, down from more than $14 in 2008. Gas production has more than

doubled since 2012. The shale boom and coal plant retirements have made the U.S. one of the top three locations for building power plants, along with the Middle East and South

Asia, according to General Electric Co.

UQ
Manufacturing is up - PMI index proves
Moutray 12/15 (Chad Moutray, Ph.D., Economics from Southern Illinois
University, is chief economist for the National Association of Manufacturers (NAM).
December 15, 2016 http://www.shopfloor.org/2016/12/markit-u-s-manufacturingoutput-in-december-grew-at-strongest-rate-since-march-2015/) swap
Markit: U.S. Manufacturing Output in December Grew at Strongest Rate Since March 2015. The Markit Flash U.S. Manufacturing PMI edged up from 54.1 in
November to 54.2 in December, a 21-month high . This mostly mirrored
assessments about new orders growth (up from 55.5 to 55.6), which also expanded at the
fastest pace over that time frame. Other indicators were mixed but encouraging. Employment
expanded at its highest rate in 18 months (up from 52.4 to 54.1), whereas output grew
modestly but pulled back a little in December (down from 56.0 to 55.1). On a more disappointing note, exports
slowed to a near crawl but were positive for the sixth time in the past seven months

(down from 51.0 to 50.3). Softer international demand, however, should not be surprising given the strong U.S.
dollar. Overall, this report provides some encouragement for manufacturers ,
many of whom have been rather cautious in their economic outlook for much of the past two years.

PMI has the best methodology - prefer it


IHS no date (IHS Markit, international financial services company. About PMI
data https://www.markiteconomics.com/Survey/Page.mvc/AboutPMIData) swap
Purchasing Managers Index (PMI) surveys have been developed in many countries to provide purchasing
professionals, business decision-makers and economic analysts with an accurate and timely set of data to help

better understand industry conditions. PMI data are based on monthly surveys of carefully selected companies. These provide an advance indication of what is really happening in the private sector economy by tracking variables such as output, new orders, stock levels, employment and prices across the manufacturing, construction, retail and service sectors. The PMI surveys are based on fact, not opinion, and are among the first indicators of economic conditions published each month. The data are collected using identical methods in all countries so that international comparisons may be made. The surveys achieve considerable press coverage on a regular basis and are widely used by purchasing professionals in the manufacturing sector, by senior management across the corporate sector and by economic analysts in financial institutions. In particular, central banks in many key economies use the data to help guide interest rate decisions. Key features of PMI data: provide reliable fact-based indicators as opposed to opinion or confidence-based indicators; are produced rapidly, and far faster than comparable official data series; are released on a monthly basis; cover almost all private sector economic activity in many countries (including the all-important service sectors); are not revised after publication; are produced using the same methodology in all countries where we operate, enabling accurate international comparisons. In many cases, the advantages offered by PMI data reflect deficiencies in official economic statistics, which include: 1. Infrequent publication - Many

government data series, such as gross domestic product (GDP), are published only quarterly. The PMI is published
monthly. 2. Delays in publication - A significant period of time often elapses before official data are published. The PMIs provide data sometimes several months ahead of official series. 3. Subject to revision after first publication - Even once the official data are published, they are frequently subject to substantial revision. Such
revisions mean it is hard to confidently make business decisions based only on these statistics. PMI data, in
contrast, are not revised after first publication (with the exception of very minor occasional changes to seasonal
adjustments). 4. Lack of comparability with equivalent measures used for other countries - Not all statistical bodies compile data using the same methodologies. Gross domestic product data
for the Eurozone are, for example, compiled using significantly different statistical techniques than equivalent data
for Japan. The above problems result in a situation where purchasing professionals and economic analysts are
attempting to monitor current industry conditions using data that are already out of date when published, which are
possibly revised substantially after first publication and which are difficult to interpret against other countries data.

The PMI surveys have therefore been designed to provide business


decision-makers with up-to-date, accurate and reliable data on which to
benchmark performance and base business strategy.
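Background note on how the headline number is constructed (an assumption based on standard diffusion-index practice; the exact formula is not stated in the IHS evidence): PMI readings such as the 54.2 in the Moutray evidence above are diffusion indices, so the relevant threshold is 50, not zero:

\[
\text{PMI} = P_{\text{improving}} + 0.5\,P_{\text{unchanged}}, \qquad \text{PMI} > 50 \Rightarrow \text{expansion}, \quad \text{PMI} < 50 \Rightarrow \text{contraction},
\]

where \(P\) is the percentage of surveyed purchasing managers giving each response.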

Manufacturing productivity is increasing - employment is irrelevant
Kincer 12/15 (H. Blair Kincer, CPA and partner at Novogradac and Company LLP,
specializing in market analysis and valuation advisory services (GoVal). U.S.
Manufacturing Growth Continues, but Jobs Are Trailing December 15, 2016
https://www.novoco.com/notes-from-novogradac/us-manufacturing-growthcontinues-jobs-are-trailing) swap
Whats not in question is that the significance of the U.S. role in global manufacturing is increasing in the 21st

century: America is ascendant in manufacturing. Output over Employment: For the past 40 years, the U.S. contribution to global manufacturing output remained constant at approximately 21 percent of overall world manufacturing output. However, American manufacturing employment declined significantly over this same period. The divergence between output and employment is the result of advancements in robotics and materials science that increased productivity, transforming manufacturing from a relatively labor-intensive industry to a much more capital-intensive industry. Another factor that contributed to

the decline in U.S. manufacturing employment is the outsourcing of production by American companies that shifted
operations overseas, where labor costs are lower. The following graph illustrates the divergence between
manufacturing output and employment. Note that shaded areas indicate recessionary periods.

After 2010, U.S. manufacturing employment began to increase for the first time in more than a decade, marking a new era. Labor economists pointed to the relatively balanced costs of labor across the world as a leading factor. Before the rapid expansion and refinement of technological capabilities in the late 1990s and the accelerated pace of globalization that accompanied it, foreign countries benefited from a comparative advantage in manufacturing by leveraging low labor costs. As global markets became more integrated over time, the foreign labor cost advantage eroded significantly. Furthermore, the United States enjoys relatively low costs for capital, raw materials and transportation. Significantly, the U.S. became the world's largest producer of oil in late 2014, surpassing Russia and Saudi Arabia and giving domestic manufacturers privileged access to this fundamental driver of growth. U.S. Moving Up in Competitiveness -- While productivity enhancements dislocated many American workers, those enhancements also increased the competitiveness of American manufacturing exports in the global marketplace. The accounting firm Deloitte publishes a Global Manufacturing Competitiveness Index, which ranks 40 nations based on a number of factors including labor cost/productivity, education, infrastructure, supplier networks, intellectual property protections and regulatory/environmental requirements.

In the 2016 version of this report, the U.S. ranked second behind China and ahead of Germany. The same report projects that by 2020 the U.S. will overtake China to become the world's most desirable country for manufacturing businesses. In particular, the increasingly vital role of proprietary and complex technology in production processes has raised the appeal of countries that provide strong intellectual property protection and educated work forces, rather than the lowest labor costs. This new dynamic tilted the advantage back toward developed nations, which tend to feature superior legal protections and skilled labor forces. The following tables illustrate the Deloitte rankings for 2016 and 2020 (projected).

UNQ Trump Growth


Economy Improving despite Protectionist ramblings from
Trump
Angela Monaghan, 11-14-2016, "US economy predicted to lead global growth,"
Guardian, https://www.theguardian.com/business/2016/nov/14/moodys-predicts-uslead-global-economic-growth-protectionism-trump, MoStateCEW
The US is expected to lead global growth higher over the next two years despite the growing threat posed by protectionist policies, Moody's Investors Service has warned. The rating agency predicted the US would be the fastest growing of the G7 leading industrial countries in 2017 and 2018, with short-term growth boosted by Donald Trump's plans to cut taxes and spend more on American infrastructure. US growth is expected to rise from 1.6% this year to 2.2% in 2017 and 2.1% in 2018. "While prolonged policy uncertainty could weigh on already weak investment growth, there could be an upside to growth from increased fiscal expenditure, especially infrastructure spending, and tax cuts," said Elena Duggar, associate managing director at Moody's. "A protectionist stance [from Trump] on trade and immigration would be detrimental in the medium term." We are living in a depression -- that's why Trump took the White House. Globally, 2016 is expected to be the slowest year for growth since the financial crisis, at 2.6%, before picking up to 2.9% in 2017 and 2018. Duggar said that threats to the outlook for the world economy included mounting anti-globalisation sentiment and fragility in the EU, with elections due in France, Germany and the Netherlands, and the Italian referendum on constitutional reform later this year. "With the unanticipated outcomes of the Brexit vote in the UK and the US presidential election, it has become evident that nationalistic and anti-globalisation sentiments are gaining traction globally. Going forward, there is likely to be an increased tendency toward protectionist economic policies in advanced economies. The risk of rising political discord and an increase in EU fragmentation has increased," she added. Britain's growth prospects will be hindered by lingering
uncertainty about a future outside the EU, Moody's said, lowering its forecasts for 2017 and 2018. While the UK is expected to be the fastest growing of the G7 economies in 2016, growth is expected to slow sharply to 1% in 2017 and 2018. "Uncertainty around the future of the economy outside the common market will dampen business investment spending and potentially consumption, particularly if businesses hold back on hiring," Duggar said. "On the other hand, monetary policy accommodation will support the economy, limiting the slowdown in growth." He's right -- the economy is sick, and businesses like Trump's are part of the disease. Moody's said the sharp fall in the value of the pound since the Brexit vote on 23 June was unlikely to be the major boost to UK exports predicted by some. A weak pound is, however, expected to push consumer price inflation higher, from 1% now to 2.2% in 2017 before falling again to 1.7% in 2018. This is lower than other economists, including those at the Bank of England, have predicted. Forecasters at the National Institute of Economic and Social Research (NIESR) believe UK inflation will reach almost 3% next year. Duggar said: "There is a high degree of uncertainty surrounding the UK's economic outlook, since it depends on the ultimate outcome of a multi-year trade negotiation process with the EU, and other economies. Our baseline growth scenario for the UK assumes that it would be able to negotiate a free trade agreement with the EU. However, a failure to negotiate could substantially worsen sentiment, triggering a material correction in asset prices, a house price downturn, and more substantial declines in investment and consumption spending. The rise in protectionist discourse globally could also prove to be a hurdle for the UK, and challenge long-term prospects."

A2 Green Tech
Only our studies take this effect into account
Robert Michaels and Robert Murphy, January 2009. Michaels is a professor of economics at
California State University and a senior fellow at the Institute for Energy Research. Murphy is director of the Institute
for Energy Research. Green Jobs: Fact or Fiction? Institute for Energy Research.
http://www.instituteforenergyresearch.org/green-jobs-fact-or-fiction/

Even if job creation per se is the goal, the studies fail to properly account for the job destruction that their recommendations would entail. For example, the Center for American Progress (CAP) study recommends a $100 billion expenditure to be financed through the sale of carbon allowances under a cap-and-trade program. CAP estimates that this fiscal stimulus will result in the creation of two million jobs [v]. Yet the CAP methodology treats the $100 billion as manna from heaven; it does not consider the direct and indirect adverse effects (including job destruction) of imposing higher costs on a wide array of energy-intensive industries and thereby raising prices for consumers.


Double counting of jobs and overly simplistic treatment of the labor market. The green studies critiqued in this report implicitly assume that there is a limitless pool of idle labor which can fill the new green slots created by government spending. Yet to the extent that some of the new green jobs are filled by workers who were previously employed, estimates of job creation are overstated, perhaps significantly so. In addition, the studies do not account for the rise in worker productivity over time. Thus their long-range forecasts of total jobs created by green programs are inflated, even on their own terms.

To its credit, CAP alludes to potential inflationary labor shortages from job creation [vi] due to its proposed program, but dismisses the concern as irrelevant for an economy in recession. The thinking is that the workers going into the new green jobs will simply reduce the unemployment rate, rather than siphoning talented people away from other industries. The CAP analysis ignores the fact that other industries, not favored by the green subsidies or mandates, would have been able to draw on the pool of unemployed workers as the economy recovers. With fewer workers seeking jobs, job creation in non-green sectors will be lower than it otherwise would have been. Moreover, some of the infrastructure plans will require a long time to implement and then reach completion. Their implementation over time could contribute to inflationary labor shortages once the current recession has passed.

b) Their evidence doesn't assume greening traditional industries --- our dataset is the most comprehensive
Jennifer Winter and Michal C Moore, October 2013. Moore is economist and Professor of Energy
Economics at The School of Public Policy at the University of Calgary. He is the former chief economist at the U.S.
National Renewable Energy Laboratory in Golden, Colorado. Winter is Research Associate in the Energy and Environmental
Policy at The School of Public Policy. The Green Jobs Fantasy: Why the Economic and Environmental Reality Can
Never Live Up To the Political Promise, The School of Public Policy. Volume 6, Issue 31.
http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2338234

ECO Canada27 notes that limiting the definition of green jobs or the green economy to businesses that produce green products or services -- the definition chosen by the OECD,28 Statistics Canada,29 Bezdek et al.30 and The Brookings Institution31 -- would exclude what could be considered green jobs in traditional industries, such as individuals implementing energy-efficiency programs at non-green places of employment. This reflects the difference between the output approach compared to the process approach for defining green jobs. The output approach identifies establishments that produce green goods and services and then counts the associated jobs; the process approach identifies establishments that use environmentally friendly production processes and practices and counts the associated jobs. The output approach identifies jobs related to producing a certain set of goods and services, such as solar panels. The process approach identifies activities and associated jobs that favorably impact the environment although the product or service produced is itself not green.32

Measurement and definition of the number of green jobs in the United States by the Bureau of Labor Statistics follows the output and the process approach. This is also the most comprehensive and data-driven, as the process approach captures occupations and jobs missed by the output approach. On the other hand, the definition used by Statistics Canada is to simply define green industries rather than green jobs -- firms operating in Canada that are involved in the production of environmental goods or the provision of environmental services.

c) Their evidence overestimates renewables growth


Gürcan Gülen, 2/3/2011. Senior Energy Economist, Center for Energy Economics, Bureau of Economic
Geology, The University of Texas at Austin. Defining, Measuring and Predicting Green Jobs, Copenhagen
Consensus Center. http://s3.documentcloud.org/documents/262778/ccc-green-jobs.pdf

In many studies, job creation estimates are based on green technology growth scenarios that are significantly more aggressive than commonly referenced forecasts by the EIA or International Energy Agency (IEA) among others. For example, Global Insight (2008) depends on some very aggressive growth assumptions for renewable power, far above official government forecasts. For example, according to the Annual Energy Outlook 2008 by the Energy Information Administration (EIA), renewable energy generation (including conventional hydro) will grow at an average rate of two percent per year and reach 12 percent of the total generation capacity by 2030 from nine percent in 2008.12 But the GI scenario is for renewable generation excluding conventional hydro (currently about three percent of the sector) to increase its share to 27 percent by 2028 (page 12).

Asmus (2008) targets 33 percent share for renewables in California by 2020.


In 2010, excluding conventional large hydro, 12-13 percent of electricity in California is
generated by renewables. Including large hydro, California gets more than 24 percent of its electricity
from renewable sources. Asmus (2008) appears to include large hydro in the 33 percent goal (Figure 1 on page 5)
but as mentioned before large hydro is controversial. The study does not advocate construction of large hydro but
rather focuses on wind, solar, geothermal and other green technologies to meet the target. In fact, the author
criticizes California Public Utilities Commission for its efforts to protect consumers against higher energy prices by capping the price of renewables generation relative to natural gas prices. As such, the study admits the higher cost of green technologies and that consumers, either through their energy or tax bills, will have to pay higher prices to achieve the 33 percent goal. However, there is no counterbalancing discussion of the negative impacts of these higher energy costs on the household consumption and business competitiveness.

Link
Massively drives up prices -- EPA agrees
Jarrett, MPSC former commissioner and energy attorney, 2016
(Terry, States are right to worry about clean power plan costs, 7-8,
http://thehill.com/blogs/pundits-blog/energy-environment/286906-states-are-rightto-worry-about-clean-power-plan-costs)
For starters,

the EIA says the plan will mean significantly higher prices

for residential and commercial electricity. They attribute this to higher transmission and
distribution costs coming at a time when electricity consumption will also grow slightly (in 2015-2040.) Interestingly, the EIA
projects that these higher electricity prices will actually reduce demand 2%
by 2030. Why? Because compliance actions and higher prices will force cash-strapped consumers to
adopt their own austerity measures. A key part of the CPP is the dismantling of coal-fired power in the U.S. As the EIA sees it, "Coal's share of total electricity generation, which was 50% in 2005 and 33% in 2015, falls to 21% in 2030 and to 18% in 2040." Coal power plants currently anchor America's base-load electricity generation, so it's understandable that their elimination would drive up prices. But is such a move justified? The EIA projects that renewable energy (solar and wind) will play a significant role in meeting electricity demand growth throughout most of the country. It's a bold gamble, since the EIA believes that renewables will account for 27% of total U.S. generation by 2040. But EIA data shows wind and solar power supplying only 5.6% of U.S. electricity generation in 2015. So, the jump to 27% will require significant investments. What's instructive is EIA data on Germany, where residential retail electric prices have risen, and are expected to keep rising, due to higher taxes and fees for renewable power. Overall, Germany's foray into green energy has driven the average residential electricity price to 35 cents/kWh, almost three times the U.S. average of 13 cents/kWh. Along with Denmark, Germany has some of the highest residential electricity prices in Europe.

Economic growth is stable based on energy and industrial output---new policy shocks tip the economy back into recession.
Samson 12/29 Adam Samson, reporter for the Financial Times, S&P 500 on
track for biggest recovery since the Great Recession, December 29th, 2016,
https://www.ft.com/content/1dc1a8ae-cc83-11e6-b8ce-b9c03770f8b1
Wall Street's most important equity barometer is on track for its biggest annual recovery since the beginning of the bull market, underscoring the sharp turn in investor sentiment following the worst start to a year on record. The S&P 500 index is poised for an 11 per cent gain in 2016, shaking off a brutal start that at its lowest point in February left it down 10.5 per cent for the year. Its 21.5 percentage point swing is the widest since 2009, the year Wall Street emerged from the bear market triggered by the financial crisis. "We have had a reset," said Peter Stournaras, a portfolio manager at BlackRock. The start of 2016 was dominated by fears that the US economy's recovery from the Great Recession, among the longest expansions since the mid-1800s, had lost momentum. Output growth cooled in the first quarter to an annual pace of just 0.8 per cent. Unease about the US was compounded by fears over China's economic growth, along with ructions in the Asian country's financial market. A sharp decline in the price of crude oil, to just above $26 a barrel, from $61 in mid-2015, deepened Wall Street's woes. But since then, the market has "undergone a transition from fears about recession to optimism about reflation," said David Lebovitz, a global market strategist with JPMorgan.

Corporate America is looking healthier -- the S&P 500 snapped a five-quarter profit recession in the third quarter with a 3.1 per cent year-on-year rise in earnings, FactSet data show. Meanwhile, the US economic growth rate picked up to 3.5 per cent in that period, the fastest since 2014. Wall Street analysts reckon S&P 500 profit growth will accelerate further in 2017, posting a year-on-year pace of 11.2 per cent in the first three months of the year, helped by a rebound in earnings among energy companies. The election of Donald Trump in the US election in November has catalysed the rally in US stocks amid speculation that the businessman's plans for a vast government spending programme, lower taxes and looser regulations will stoke higher economic growth, inflation and potentially corporate profits. Investors are also increasingly hopeful that the stronger economic backdrop will allow the Federal Reserve to continue normalising monetary policy after holding rates near historic lows since the financial crisis. US bank shares have been significant beneficiaries of the post-election march higher, with the sector up by almost a quarter for 2016. Energy shares, too, have made a stunning recovery, pushing the sector up by 25 per cent in the year to date, as the price of crude oil has moved to the $55 a barrel price range after Opec and other big exporters struck a deal aimed at narrowing the glut of supply on global markets. Despite the bout of optimism, strategists remain cautious over the outlook for next year, noting that it will be important to see what comes of Mr Trump's policy proposals, many of which need to be passed by Congress. "We've been told this really nice story but we need to see how it plays out," said Mr Lebovitz.

2NC Econ
Global economic governance institutions solve the terminal
impact
- Increased lending capacity
- IMF surveillance initiatives
- Increased social net capacity to prevent rally-around-the-flag decisions

IMF 16 IMF's Response to the Global Economic Crisis, March 22nd, 2016,
http://www.imf.org/en/About/Factsheets/Sheets/2016/07/27/15/19/Response-to-theGlobal-Economic-Crisis
Creating a crisis firewall. To meet ever increasing financing needs of countries
hit by the global financial crisis and to help strengthen global economic and
financial stability, the Fund greatly bolstered its lending capacity after the onset
of the global crisis in 2008. This was done by increasing quota subscriptions of member
countries -- the IMF's main source of financing -- and securing large borrowing
agreements. Stepping up crisis lending. The IMF overhauled its lending framework to make it better suited to
country needs, giving greater emphasis to crisis prevention, and streamlined program conditionality. Since the start
of the crisis, the IMF committed well over $700 billion in financing to its member countries. Helping the world's poorest. The IMF undertook an unprecedented reform of its policies toward low-income countries and quadrupled resources devoted to concessional lending. Sharpening IMF analysis and policy advice. The IMF provided risk analysis and policy advice to help member countries overcome the challenges and spillovers from the global economic crisis. It also implemented several major initiatives to strengthen and adapt surveillance to a more globalized and interconnected world, taking into account lessons learned from the crisis. Reforming the IMF's governance. To strengthen its legitimacy, in April 2008 and November 2010, the IMF agreed on wide-ranging governance reforms to reflect the increasing importance of emerging market countries. The reforms also ensured that smaller developing countries would retain their influence in the IMF. Creating a crisis firewall. Increasing the

financial resources available for IMF support to member countries was a key part
of the efforts to overcome the global financial crisis . In 2009 and 2010, members
provided additional financial resources to the Fund through bilateral borrowing agreements for about SDR 170
billion (about US$250 billion at current exchange rates). These resources were subsequently incorporated into
expanded New Arrangements to Borrow (NAB), increasing their size from SDR 34 billion to SDR 370 billion (about
$510 billion). In 2012, to respond to worsening global financial conditions, a number of members pledged to further
enhance IMF resources through a new round of bilateral borrowing. By the end of 2015, 35 agreements for a total of

about SDR 280 billion ($390 billion) were finalized. The 14th General Review of Quotas, approved in December 2010, doubled the IMF's permanent resources to SDR 477 billion (about $663 billion). The conditions for implementing the increases were met in January 2016.

Subsequently, the NAB credit arrangements were rolled back from SDR 370 billion to SDR 182 billion in conjunction
with the payments for the 14th Review quota increases, while remaining an important backstop to quota resources.
Currently, the Fund's total lending capacity (comprising quotas, the NAB, and the 2012 Borrowing Agreements after prudential balances) stands at about SDR 690 billion (about $950 billion). In addition to increasing the Fund's own lending capacity, in 2009, the membership agreed to make a general allocation of SDRs equivalent at the time to $250 billion, resulting in a near ten-fold increase in SDRs. This represented a significant increase in reserves for many countries, in particular low-income countries. Reforming the IMF's lending framework. To better support countries during the global economic crisis, the IMF beefed up its lending capacity and approved a major overhaul of how it lends money by offering higher and more frontloaded amounts and tailoring loan terms to countries' varying strengths and circumstances. Credit line for strong performers. The Flexible Credit Line (FCL), introduced in April 2009 and further enhanced in August 2010, is a lending tool for countries with very strong fundamentals that provides large and upfront access to IMF resources, mainly as a form of insurance for crisis prevention. There are no policy conditions to be met once a country has been approved for the credit line. Colombia, Mexico, and Poland have been provided combined access up to about $100 billion under the FCL (no drawings have been made under these arrangements). FCL approval has been found to lead to lower borrowing costs and increased room for policy maneuver. Access to liquidity on flexible terms. Heightened regional or global stress can affect countries that under normal circumstances would not likely be at risk of crisis. Providing rapid and adequate short-term liquidity to such crisis bystanders during periods of stress could bolster market confidence, limit contagion, and reduce the overall cost of crises. The Precautionary and Liquidity Line (PLL), which was established in 2011, is designed to meet the liquidity needs of member countries with sound

economic fundamentals but with some remaining vulnerabilities -- Macedonia and Morocco used the PLL. Reformed terms for IMF lending. Structural performance criteria were discontinued for all IMF loans, including for programs with low-income countries. Structural reforms continue to be part of IMF-supported programs, but have become more focused on areas critical to a country's recovery. Emphasis on social protection. The IMF helped governments protect and even increase social spending, including social assistance. In particular, the IMF promoted measures to increase spending on, and improve the targeting of, social safety net programs that can mitigate the impact of the crisis on the most vulnerable in society. Crisis Program Review. The IMF conducted several reviews to learn from Fund-supported programs that began after the 2008 global crisis. The reviews found that Fund-supported programs helped chart a path through the global financial crisis that avoided the counterfactual scenario many initially feared, involving a cataclysmic meltdown of the global economic system. Given the radical differences between the 2008 crisis and its predecessors, decisions were made amid significant uncertainty about shocks, transmission channels, and policy responses. Program outcomes helped inform the design of later programs, and contributed to broadening the array of feasible policies over time by strengthening frameworks and reducing the risk of contagion.

Chem
Chemical industry strong but faces challenges
Zacks Equity Research 8/9 (Zacks Equity Research, Chemical Industry
Stock Outlook - Aug 2016, Zacks, August 09, 2016,
https://www.zacks.com/commentary/88001/chemical-industry-stock-outlook---aug2016)
The chemical industry is still in gradual recovery mode from the trough of the Great Recession. Despite a spate of headwinds, the highly cyclical industry fared reasonably well in the first half of the year, helped by continued strength across automotive and housing markets -- two major end-use markets for chemicals. Chemical companies are increasingly looking for cost synergy opportunities and enhanced operational scale through consolidations amid nagging macroeconomic challenges. These companies also remain actively focused on increasing their reach in high-growth markets in a bid to cut their exposure to businesses that are struggling with depressed demand. Strategic measures including cost management and productivity improvement also remain the prime focus of these companies to stay afloat in a still-difficult global economic backdrop. Some industry-specific challenges, the Eurozone's feeble recovery and concerns over China's future growth remain sources of near-term uncertainties for the chemical industry. Moreover, chemical makers are also feeling the bite of weak demand in the energy markets amid a still depressed oil price environment. A strong dollar is also hurting U.S. chemical exports, reducing their attractiveness in overseas markets. Lingering weakness in China -- a key market for chemicals -- is expected to remain a major drag on the chemical industry in the short haul. A persistent credit crunch, overcapacity and weak infrastructure and manufacturing investment are hurting the world's second-largest economy. In addition, the outlook for the fertilizer and agricultural chemicals space remains cloudy due to sluggish economic conditions in certain developing markets, particularly Brazil. Notwithstanding these challenges, the chemical industry is expected to continue to recuperate through the balance of 2016, supported by continued strength in the light vehicles market, positive trends in the construction space and significant shale-linked capital investment. The U.S. chemical industry is poised for growth this year and the next despite several challenges including a strong dollar and a difficult oil price environment. According to the American Chemistry Council (ACC), an industry trade group, U.S. chemical production will rise 1.6% in 2016 and 3.7% in 2017. Barring production of the pharmaceuticals segment, output is expected to go up 2.7% this year and 4.1% in 2017. In particular, the trade group expects basic chemicals production to expand 3.1% in 2016 and 4.9% in 2017. Chemical production is also expected to increase across all regions of the country this year. The ACC envisions the U.S. chemical industry to continue to gather momentum over the next several years on the heels of new capital investments, capacity additions and feedstock cost advantage, and even transcend the nation's overall economic growth in the long term. The shale gas bounty and ample supply of natural gas liquids has been a huge driving force behind chemical investment on plants and equipment in the U.S. and have provided domestic petrochemicals producers a compelling cost advantage over their global counterparts.

The ACC expects this competitiveness to drive export demand and new capital investment in the country. The shale revolution has made the U.S. an attractive investment hotspot. Chemical makers including Dow Chemical (DOW - Analyst Report), LyondellBasell Industries (LYB - Analyst Report), BASF (BASFY - Snapshot Report), Eastman Chemical (EMN - Analyst Report), Celanese (CE - Analyst Report) and Westlake Chemical (WLK) are investing heavily on shale gas-linked projects to take advantage of abundant natural gas supplies which is expected to boost capacity and export over the next several years. The ACC expects domestic chemical industry capital spending to increase 10.4% in 2016 and 7.8% in 2017.
