
Topicality

Topicality 1NC- Oceans


A. Interpretation- the Earth's Oceans are the 5 major oceans
NALMS 14 North American Lake Management Society, WATER WORDS
GLOSSARY, http://www.nalms.org/home/publications/water-words-glossary/O.cmsx
OCEAN
Generally, the whole body of salt water which covers nearly three fourths of the
surface of the globe. The average depth of the ocean is estimated to be about
13,000 feet (3,960 meters); the greatest reported depth is 34,218 feet (10,430
meters), north of Mindanao in the Western Pacific Ocean. The ocean bottom is a
generally level or gently undulating plain, covered with a fine red or gray clay, or, in
certain regions, with ooze of organic origin. The water, whose composition is fairly
constant, contains on the average 3 percent of dissolved salts; of this solid portion,
common salt forms about 78 percent, magnesium salts 15-16 percent, calcium salts
4 percent, with smaller amounts of various other substances. The density of ocean
water is about 1.026 (relative to distilled water, or pure H2O). The oceans are
divided into the Atlantic, Pacific, Indian, Arctic, and Antarctic Oceans.

And, the federal government is the central government, distinguished from the states
OED 89 (Oxford English Dictionary, 2nd ed. XIX, p. 795)

b. Of or pertaining to the political unity so constituted, as distinguished from the separate states composing it.

B. Violation- the IOOS is not limited to USFG action- it includes state, regional, and private sectors
IOOS report to congress 13 [Official US IOOS report sent to congress. 2013, U.S. Integrated Ocean
Observing System (U.S. IOOS) 2013 Report to Congress,
http://www.ioos.noaa.gov/about/governance/ioos_report_congress2013.pdf //jweideman]

U.S. IOOS works with its eyes on the future. The successes of U.S. IOOS are achieved through
cooperation and coordination among Federal agencies, U.S. IOOS Regional Associations, State
and regional agencies, and the private sector. This cooperation and coordination requires a sound
governance and management structure. In 2011 and 2012, program milestones called for in U.S. IOOS legislation
were achieved, laying the groundwork for more success in the future. First, the U.S. IOOS Advisory Committee was
established. Second, the Independent Cost Estimate was delivered to Congress. As part of the estimate, each of the
11 U.S. IOOS Regional Associations completed 10-year build-out plans, describing services and products to address
local user needs and outlining key assets required to meet the Nation's greater ocean-observing needs.

And, the IOOS also applies to the Great Lakes


NOS and NOAA 14 [Federal Agency Name(s): National Ocean Service (NOS), National Oceanic and Atmospheric

Administration (NOAA), Department of Commerce Funding Opportunity Title: FY2014 Marine Sensor and Other Advanced Observing
Technologies Transition Project. ANNOUNCEMENT OF FEDERAL FUNDING OPPORTUNITY EXECUTIVE SUMMARY,
http://www.ioos.noaa.gov/funding/fy14ffo_msi_noaa_nos_ioos_2014_2003854.pdf //jweideman]

1. Marine Sensor Transition Topic: U.S. IOOS seeks to increase the rate that new or existing marine sensor
technologies are transitioned into operations mode in order to facilitate the efficient collection of ocean,
coastal and Great Lakes observations. The Marine Sensor Transition topic is focused on transitioning
marine sensors from research to operations mode to meet the demonstrated operational needs of end-users.

Letters of Intent (LOIs) are being solicited for this topic with particular emphasis on a) projects comprised
of multi-sector teams of partners, b) projects that will meet the demonstrated operational needs of end-users, and
c) sensors that are at or above TRL 6. Applicants with sensors for ocean acidification that are at or above TRL 6 are
also eligible to apply to this topic if they have strong commitments for operational transition

C. Voting issue for fairness and ground- extra topicality forces the neg to waste time debating T just to get back to square one, and it allows the aff to gain extra advantages, counterplan answers, and link turns to disads

2NC Impact-Education
Definitions are key to education about IOOS
IOOS report to congress 13

[Official US IOOS report sent to congress. 2013, U.S. Integrated Ocean Observing
System (U.S. IOOS) 2013 Report to Congress, http://www.ioos.noaa.gov/about/governance/ioos_report_congress2013.pdf
//jweideman]

The use of standard terms or vocabularies to describe ocean observations data is
critical to facilitating broad sharing and integration of data. Working closely with SECOORA and the
community-based Marine Metadata Interoperability Project, U.S. IOOS has published nine
recommended vocabularies over the past 12 months for review by the ocean observing community,
including lists for platforms, parameters, core variables, and biological terms. These efforts are helping lead the
ocean observing community towards significantly improved levels of consistency via an improved
semantic framework through which users can adopt recommended vocabularies or
convert their vocabularies to terms that are perhaps used more widely.

2NC Cards- Oceans


IOOS includes monitoring the Great Lakes
IOOS report to congress 13

[Official US IOOS report sent to congress. 2013, U.S. Integrated Ocean Observing
System (U.S. IOOS) 2013 Report to Congress, http://www.ioos.noaa.gov/about/governance/ioos_report_congress2013.pdf
//jweideman]

The IOOC recognizes that U.S. IOOS must be responsive to environmental crises while
maintaining the regular long-term ocean observation infrastructure required to
support operational oceanography and climate research. As a source of our Nation's ocean
data and products, U.S. IOOS often serves as a resource for the development of targeted
applications for a specific location or sector. At the same time, U.S. IOOS organizes data
from across regions and sectors to foster the national and international application of local
data and products broadly across oceans, coasts, and Great Lakes. Events over the
last few years, including Hurricane Sandy and the Deepwater Horizon oil spill have
awakened U.S. communities to the value and necessity of timely ocean information.
IOOC commends U.S. IOOS for responsive and capable support to the Nation in
these events in addition to diverse everyday support to the Nation's maritime
economy. We have much more work to do to build and organize the ocean-observing infrastructure of the
Nation and look forward to working with Congress on this continuing challenge.

Ocean exploration is distinct from Great Lakes observation


COR 01 ~ Committee On Resources, OCEAN EXPLORATION AND COASTAL AND
OCEAN OBSERVING SYSTEMS, Science Serial No. 107-26; Resources Serial No. 107-47, Accessed 7/3/14 //RJ
On a summer day, our eyes and ears can sense an approaching thunderstorm. Our
senses are extended by radar and satellites to detect advancing storm systems. Our
senses are being extended yet again to anticipate changing states affecting coasts
and oceans, our environment, and our climate. To truly understand the consequences of our actions on the environment and the environment's impact on us,
data obtained through ocean exploration, coastal observations, and ocean
observations will be critical.
Coastal observations include observations in the Nation's ports, bays,
estuaries, Great Lakes, the waters of the EEZ, and adjacent land cover.
Some of the properties measured in coastal zones, such as temperature and
currents, are the same as those measured in the larger, basin-scale ocean
observation systems. However, the users and applications of those data can be
quite different. For those properties that are similar, there should be a consistent
plan for deployment in the coastal and open ocean systems so that coastal
observations represent a nested hierarchy of observations collected at higher
resolution than those from the open ocean.

Oceans are only the 5 major bodies of water- landlocked and adjacent lakes and rivers are excluded.
Rosenberg 14 ~ Matt Rosenberg, Master's in Geography from CSU, Names for
Water Bodies,
http://geography.about.com/od/physicalgeography/a/waterbodies.htm, accessed
7/3/14 //RJ

Water bodies are described by a plethora of different names in English - rivers,
streams, ponds, bays, gulfs, and seas, to name a few. Many of these terms'
definitions overlap and thus become confusing when one attempts to pigeon-hole a
type of water body. Read on to find out the similarities (and differences) between
terms used to describe water bodies.
We'll begin with the different forms of flowing water. The smallest water channels
are often called brooks but creeks are often larger than brooks but may either be
permanent or intermittent. Creeks are also sometimes known as streams but the
word stream is quite a generic term for any body of flowing water. Streams can be
intermittent or permanent and can be on the surface of the earth, underground, or
even within an ocean (such as the Gulf Stream).
A river is a large stream that flows over land. It is often a perennial water body and
usually flows in a specific channel, with a considerable volume of water. The world's
shortest river, the D River, in Oregon, is only 120 feet long and connects Devil's
Lake directly to the Pacific Ocean.
A pond is a small lake, most often in a natural depression. Like a stream, the word
lake is quite a generic term - it refers to any accumulation of water surrounded by
land - although it is often of a considerable size. A very large lake that contains salt
water, is known as a sea (except the Sea of Galilee, which is actually a freshwater
lake).
A sea can also be attached to, or even part of, an ocean. For example, the Caspian
Sea is a large saline lake surrounded by land, the Mediterranean Sea is attached to
the Atlantic Ocean, and the Sargasso Sea is a portion of the Atlantic Ocean,
surrounded by water.
Oceans are the ultimate bodies of water and refers to the five oceans - Atlantic, Pacific, Arctic, Indian, and Southern. The equator divides the Atlantic
Ocean and Pacific Oceans into the North and South Atlantic Ocean and the North
and South Pacific Ocean.

The plan explodes ground- includes the Great Lakes

NOS and NOAA 14

[Federal Agency Name(s): National Ocean Service (NOS), National Oceanic and Atmospheric
Administration (NOAA), Department of Commerce Funding Opportunity Title: FY2014 Marine Sensor and Other Advanced Observing
Technologies Transition Project. ANNOUNCEMENT OF FEDERAL FUNDING OPPORTUNITY EXECUTIVE SUMMARY,
http://www.ioos.noaa.gov/funding/fy14ffo_msi_noaa_nos_ioos_2014_2003854.pdf //jweideman]

1. Marine Sensor Transition Topic: U.S. IOOS seeks to increase the rate that new or existing marine sensor
technologies are transitioned into operations mode in order to facilitate the efficient collection of ocean,
coastal and Great Lakes observations. The Marine Sensor Transition topic is focused on transitioning
marine sensors from research to operations mode to meet the demonstrated operational needs of end-users.

Letters of Intent (LOIs) are being solicited for this topic with particular emphasis on a) projects comprised
of multi-sector teams of partners, b) projects that will meet the demonstrated operational needs of end-users, and
c) sensors that are at or above TRL 6. Applicants with sensors for ocean acidification that are at or above TRL 6 are
also eligible to apply to this topic if they have strong commitments for operational transition

2NC Cards- USFG


The data sharing components are the critical part of IOOS- they can't say they just don't do the extra-topical parts
IOOS report to congress 13 [Official US IOOS report sent to congress. 2013, U.S. Integrated Ocean
Observing System (U.S. IOOS) 2013 Report to Congress,
http://www.ioos.noaa.gov/about/governance/ioos_report_congress2013.pdf //jweideman]

Observations are of little value if they cannot be found, accessed, and transformed
into useful products. The U.S. IOOS Data Management and Communications
subsystem, or DMAC, is the central operational infrastructure for assessing,
disseminating, and integrating existing and future ocean observations data. As a
core functional component for U.S. IOOS, establishing DMAC capabilities continues
to be a principal focus for the program and a primary responsibility of the U.S. IOOS
Program Office in NOAA. Importance and Objectives of DMAC Although DMAC implementation
remains a work in progress, a fully implemented DMAC subsystem will be capable of
delivering real-time, delayed-mode, and historical data. The data will include in situ
and remotely sensed physical, chemical, and biological observations as well as
model-generated outputs, including forecasts, to U.S. IOOS users and of delivering
all forms of data to and from secure archive facilities. Achieving this requires a
governance framework for recommending and promoting standards and policies to
be implemented by data providers across the U.S. IOOS enterprise , to provide seamless
long-term preservation and reuse of data across regional and national boundaries and across disciplines. The
governance framework includes tools for data access, distribution, discovery, visualization, and analysis; standards
for metadata, vocabularies, and quality control and quality assurance; and procedures for the entire ocean data life
cycle. The DMAC design must be responsive to user needs and it must, at a minimum, make data and products
discoverable and accessible, and provide essential metadata regarding sources, methods, and quality. The overall

DMAC objectives are for U.S. IOOS data providers to develop and maintain capabilities to: Deliver
accurate and timely ocean observations and model outputs to a range of
consumers, including government, academic, private sector users, and the general
public, using specifications common across all providers; Deploy the information
system components (including infrastructure and relevant personnel) for full life-cycle management of observations, from collection to product creation, public delivery, system
documentation, and archiving; Establish robust data exchange responsive to variable customer requirements as
well as routine feedback, which is not tightly bound to a specific application of the data or particular end-user
decision support tool.

U.S. IOOS data providers therefore are being encouraged to address the
following DMAC-specific objectives: A standards-based foundation for DMAC capabilities: U.S.
IOOS partners must clearly demonstrate how they will ensure the establishment and
maintenance of a standards-based approach for delivering their ocean observations
data and associated products to users through local, regional and global/international data networks. Exposure of
and access to coastal ocean observations: U.S. IOOS partners must describe how they will ensure coastal
ocean observations are exposed to users via a service-oriented architecture and
recommended data services that will ensure increased data interoperability,
including the use of improved metadata and uniform quality-control methods.

Certification and governance of U.S. IOOS data and products: U.S. IOOS partners must present a description of how
they will participate in establishing an effective U.S. IOOS governance process for data certification standards and
compliance procedures. This objective is part of an overall accreditation process which includes the other U.S. IOOS
subsystems (observing, modeling and analysis, and governance)

Federal Government means the United States government


Blacks Law 99 (Dictionary, Seventh Edition, p.703)

The U.S. government - also termed national government

National government, not states or localities


Blacks Law 99 (Dictionary, Seventh Edition, p.703)
A national government that exercises some degree of control over smaller political
units that have surrendered some degree of power in exchange for the right to
participate in national political matters

Central government
AHD 92 (American Heritage Dictionary of the English Language, p. 647)
federal 3. Of or relating to the central government of a federation as distinct from the governments of its member units.

IOOS includes state, regional, and private sectors


IOOS report to congress 13

[Official US IOOS report sent to congress. 2013, U.S. Integrated Ocean Observing
System (U.S. IOOS) 2013 Report to Congress, http://www.ioos.noaa.gov/about/governance/ioos_report_congress2013.pdf
//jweideman]

U.S. IOOS works with its eyes on the future. The successes of U.S. IOOS are achieved through
cooperation and coordination among Federal agencies, U.S. IOOS Regional Associations, State
and regional agencies, and the private sector. This cooperation and coordination requires a sound
governance and management structure. In 2011 and 2012, program milestones called for in U.S. IOOS legislation
were achieved, laying the groundwork for more success in the future. First, the U.S. IOOS Advisory Committee was
established. Second, the Independent Cost Estimate was delivered to Congress. As part of the estimate, each of the
11 U.S. IOOS Regional Associations completed 10-year build-out plans, describing services and products to address
local user needs and outlining key assets required to meet the Nation's greater ocean-observing needs.

Disads

Politics

1NC Politics Link


Causes fights- anti-environmentalism
Farr 2013
Sam, Member of the House of Representatives (D-CA) Chair of House Oceans
Caucus, Review & Forecast: Collaboration Helps to Understand And Adapt to Ocean,
Climate Changes, http://sea-technology.com/pdf/st_0113.pdf
The past year in Washington, D.C., was rife with infighting between the two political
parties. On issue after issue, the opposing sides were unable to reach a compromise on meaningful legislation
for the American people. This division was most noticeable when discussing the fate of
our oceans. The widening chasm between the two political parties resulted in
divergent paths for ocean policy: one with President Barack Obama pushing the National
Ocean Policy forward, and the other with U.S. House Republicans undermining those
efforts with opposing votes and funding cuts. Marine, Environmental Policy Regression The
112th Congress was called the "most anti-environmental Congress in history" in a
report published by House Democrats and has been credited for undermining the major environmental legislation
of the past 40 years. After the Tea Party landslide in the congressional elections of 2010,
conservatives on Capitol Hill began to flex their muscles to roll back environmental
protections. Since taking power in January 2011, House Republicans held roughly 300 votes to undermine basic
environmental protections that have existed for decades. To put that in perspective, that was almost one in every
five votes held in Congress during the past two years. These were votes to allow additional oil and gas drilling in
coastal waters, while simultaneously limiting the environmental review process for offshore drilling sites. There
were repeal attempts to undermine the Clean Water Act and to roll back protections for threatened fish and other
marine species. There were also attempts to block measures to address climate
changes, ignoring the consequences of inaction, such as sea level rise and ocean
acidification.

2NC Obama PC Link


Costs political capital, requires Obama to push
Farr 2013
Sam, Member of the House of Representatives (D-CA) Chair of House Oceans
Caucus, Review & Forecast: Collaboration Helps to Understand And Adapt to Ocean,
Climate Changes, http://sea-technology.com/pdf/st_0113.pdf
But while the tide of sound ocean policy was retreating in Congress, it was rising on
the other end of Pennsylvania Avenue. The National Ocean Policy Understanding the
need for a federal policy to address the oceans after the Deepwater Horizon
disaster, President Obama used an executive order in 2010 to create the National
Ocean Policy. Modeled after the Oceans-21 legislation I championed in Congress,
this order meant that the U.S. would have a comprehensive plan to guide our stewardship of the oceans, coasts and Great Lakes. Despite push back from the House of
Representatives, the president continued implementing his National Ocean Policy
this past year through existing law, prioritizing ocean stewardship and promoting
interagency collaboration to address the challenges facing our marine
environment. For instance, the National Oceans Council created the www.data.gov/ocean
portal, which went live just before the start of 2012 and has grown
throughout the year to provide access to data and information about the oceans
and coasts. The U.S. Geological Survey and other National Ocean Council agencies
have been working with federal and state partners to identify the data sets needed
by ocean users, such as fisheries and living marine resources data, elevation and
shoreline information, and ocean observations.

2NC NOAA Link


Congress hates increased funding for NOAA programs---the
plan would be a fight
Andrew Jensen 12, Peninsula Clarion, Apr 27 2012, Congress takes another ax to
NOAA budget, http://peninsulaclarion.com/news/2012-04-27/congress-takes-another-ax-to-noaa-budget
Frustrated senators from coastal states are wielding the power of the purse to rein in
the National Oceanic and Atmospheric Administration and refocus the agency's priorities on its core missions. During recent
appropriations subcommittee hearings April 17, Sen. Lisa Murkowski ensured no funds would be provided
in fiscal year 2013 for coastal marine spatial planning, a key component of President Barack Obama's
National Ocean Policy. Murkowski also pushed for an additional $3 million for regional fishery management councils and secured $15 million for the
National Ocean Policy. Murkowski also pushed for an additional $3 million for regional fishery management councils and secured $15 million for the
Pacific Salmon Treaty that was in line to be cut by NOAA's proposed budget (for $65 million total). On April 24, the full Senate Appropriations Committee
approved the Commerce Department budget with language inserted by Sen. John Kerry, D-Mass., and Sen. Olympia Snowe, R-Maine, into NOAA's budget
that would transfer $119 million currently unrestricted funds and require they be used for stock assessments, surveys and monitoring, cooperative
research and fisheries grants. The $119 million is derived from Saltonstall-Kennedy funds, which are levies collected on seafood imports by the
Department of Agriculture. Thirty percent of the import levies are transferred to NOAA annually, and without Kerry's language there are no restrictions on
how NOAA may use the funds.

In a Congress defined by fierce partisanship, no federal agency has drawn as much fire from both parties as NOAA and its Administrator Jane Lubchenco. Sen. Scott
Brown, R-Mass., has repeatedly demanded accountability for NOAA Office of Law Enforcement abuses uncovered by the Commerce Department Inspector
General that included the use of fishermen's fines to purchase a luxury boat that was only used for joyriding around Puget Sound. There is currently
another Inspector General investigation under way into the regional fishery management council rulemaking process that was requested last August by
Massachusetts Reps. John Tierney and Barney Frank, both Democrats. In July 2010, both Frank and Tierney called for Lubchenco to step down, a
remarkable statement for members of Obama's party to make about one of his top appointments. Frank introduced companion legislation to Kerry's in
the House earlier this year, where it should sail through in a body that has repeatedly stripped out tens of millions in budget requests for catch share
programs. Catch share programs are Lubchenco's favored policy for fisheries management and have been widely panned after implementation in New
England in 2010 resulted in massive consolidation of the groundfish catch onto the largest fishing vessels. Another New England crisis this year with Gulf
of Maine cod also drove Kerry's action after a two-year old stock assessment was revised sharply downward and threatened to close down the fishery.
Unlike many fisheries in Alaska such as pollock, crab and halibut, there are not annual stock assessment surveys around the country. Without a new stock
assessment for Gulf of Maine cod, the 2013 season will be in jeopardy. "I applaud Senator Kerry for his leadership on this issue and for making sure that
this funding is used for its intended purpose - to help the fishing industry, not to cover NOAA's administrative overhead," Frank said in a statement. "We
are at a critical juncture at which we absolutely must provide more funding for cooperative fisheries science so we can base management policies on
sound data, and we should make good use of the world-class institutions in the Bay State which have special expertise in this area." Alaska's Sen. Mark
Begich and Murkowski, as well as Rep. Don Young have also denounced the National Ocean Policy as particularly misguided, not only for diverting core
funding in a time of tightening budgets but for creating a massive new bureaucracy that threatens to overlap existing authorities for the regional fishery
management councils and local governments. The first 92 pages of the draft policy released Jan. 12 call for more than 50 actions, nine priorities, a new
National Ocean Council, nine Regional Planning Bodies tasked with creating Coastal Marine Spatial Plans, several interagency committees and taskforces,
pilot projects, training in ecosystem-based management for federal employees, new water quality standards and the incorporation of the policy into
regulatory and permitting decisions. Some of the action items call for the involvement of as many as 27 federal agencies. Another requires high-quality
marine waters to be identified and new or modified water quality and monitoring protocols to be established. Young hosted a field hearing of the House
Natural Resources Committee in Anchorage April 3 where he blasted the administration for refusing to explain exactly how it is paying for implementing
the National Ocean Policy. "This National Ocean Policy is a bad idea," Young said. "It will create more uncertainty for businesses and will limit job growth.
It will also compound the potential for litigation by groups that oppose human activities. To make matters worse, the administration refuses to tell
Congress how much money it will be diverting from other uses to fund this new policy." Natural Resources Committee Chairman Doc Hastings, R-Wash.,
sent a letter House Appropriations Committee Chairman Hal Rogers asking that every appropriations bill expressly prohibit any funds to be used for
implementing the National Ocean Policy. Another letter was sent April 12 to Rogers by more than 80 stakeholder groups from the Gulf of Mexico to the
Bering Sea echoing the call to ban all federal funds for use in the policy implementation. "The risk of unintended economic and societal consequences
remains high, due in part to the unprecedented geographic scale under which the policy is to be established," the stakeholder letter states. "Concerns are
further heightened because the policy has already been cited as justification in a federal decision restricting access to certain areas for commercial
activity."

Congress refused to fund some $27 million in budget requests for NOAA in fiscal year 2012 to implement the National Ocean Policy, but the administration released its draft implementation policy in January anyway.

Russian Oil

Russian Oil DA 1NC


Russia's economy is at a tipping point- it's holding on because
of oil
The Economist 14

[The Economist is an economic news source. May 3, 2014,
The crisis in Ukraine is hurting an already weakening economy, http://www.economist.com/news/finance-and-economics/21601536-crisis-ukraine-hurting-already-weakening-economy-tipping-scales //jweideman]

WESTERN measures against Russia - asset freezes and visa restrictions aimed at
people and firms close to Vladimir Putin - may be pinpricks, but the crisis in Ukraine
has already taken its toll on Russia's economy and financial markets. Capital flight in the first
three months of 2014 is thought to exceed $60 billion. The stockmarket is down by 20% since the start of the
year and the rouble has dropped by 8% against the dollar. Worries about the devaluation feeding through
to consumer prices have prompted the central bank to yank up interest rates, from
5.5% at the start of March to 7.5%. The IMF reckons the economy is in recession; this week it cut its growth
forecast for 2014 from 1.3% to 0.2%. Despite these upsets, Russia appears to hold strong
economic as well as military cards. It provides 24% of the European Union's gas and
30% of its oil. Its grip on Ukraine's gas and oil consumption is tighter still. That makes it
hard for the West to design sanctions that do not backfire.

Satellite mapping finds oil- it increases production capabilities


Short 11

[Nicholas M. Short, Sr is a geologist who received degrees in that field from St. Louis University (B.S.), Washington
University (M.A.), and the Massachusetts Institute of Technology (Ph.D.); he also spent a year in graduate studies in the geosciences
at The Pennsylvania State University. In his early post-graduate career, he worked for Gulf Research & Development Co., the
Lawrence Livermore Laboratory, and the University of Houston. During the 1960s he specialized in the effects of underground
nuclear explosions and asteroidal impacts on rocks (shock metamorphism), and was one of the original Principal Investigators of the
Apollo 11 and 12 moon rocks. 2011, Finding Oil and Gas from Space https://apollomapping.com/wp-content/user_uploads/2011/11/NASA_Remote_Sensing_Tutorial_Oil_and_Gas.pdf //jweideman]

If precious metals are not your forte, then try the petroleum industry. Exploration for oil and gas has always
depended on surface maps of rock types and structures that point directly to, or at least
hint at, subsurface conditions favorable to accumulating oil and gas. Thus, looking at
surfaces from satellites is a practical, cost-effective way to produce appropriate
maps. But verifying the presence of hydrocarbons below surface requires two
essential steps: 1) doing geophysical surveys; and 2) drilling into the subsurface to
actually detect and extract oil or gas or both. This Tutorial website sponsored by the Society of Exploration
Geophysicists is a simplified summary of the basics of hydrocarbon exploration. Oil and gas result from the decay of organisms - mostly marine plants (especially microscopic algae and similar free-floating vegetation) and small animals such as fish - that are
buried in muds that convert to shale. Heating through burial and pressure from the overlying later sediments help in the process.
(Coal forms from decay of buried plants that occur mainly in swamps and lagoons which are eventually buried by younger
sediments.) The decaying liquids and gases from petroleum source beds, dominantly shales after muds convert to hard rock,
migrate from their sources to become trapped in a variety of structural or stratigraphic conditions shown in this illustration: The oil
and gas must migrate from deeper source beds into suitable reservoir rocks. These are usually porous sandstones, but limestones
with solution cavities and even fractured igneous or metamorphic rocks can contain openings into which the petroleum products
accumulate. An essential condition: the reservoir rocks must be surrounded (at least above) by
impermeable (refers to minimal ability to allow flow through any openings - pores or
fractures) rock, most commonly shales. The oil and gas, generally confined under some pressure, will escape

to the surface - either naturally when the trap is intersected by downward moving erosional surfaces or by being penetrated by a
drill. If pressure is high the oil and/or gas moves of its own accord to the surface but if pressure is initially low or drops over time,
pumping is required.

Exploration for new petroleum sources begins with a search for
surface manifestations of suitable traps (but many times these are hidden by burial and other factors govern
the decision to explore). Mapping of surface conditions begins with reconnaissance, and if
that indicates the presence of hydrocarbons, then detailed mapping begins. Originally,
both of these maps required field work. Often, the mapping job became easier by using aerial photos. After the mapping, much of
the more intensive exploration depends on geophysical methods (principally, seismic) that can give 3-D constructions of subsurface
structural and stratigraphic traps for the hydrocarbons. Then, the potential traps are sampled by exploratory drilling and their
properties measured.

Remote sensing from satellites or aircraft strives to find one or more
indicators of surface anomalies. This diagram sets the framework for the approach used; this is the so-called
microseepage model, which leads to specific geochemical anomalies:

New US oil production collapses Russia's economy


Woodhill 14

[Louis, Contributor at Forbes, entrepreneur and investor. 3/3/14, It's Time To Drive Russia Bankrupt Again
http://www.forbes.com/sites/louiswoodhill/2014/03/03/its-time-to-drive-russia-bankrupt-again/ //jweideman]
http://www.forbes.com/sites/louiswoodhill/2014/03/03/its-time-to-drive-russia-bankrupt-again/ //jweideman]

The high oil prices of 1980 were not real, and Reagan knew it. They were being caused by the weakness of
the U.S. dollar, which had lost 94% of its value in terms of gold between 1969 and 1980. Reagan immediately
decontrolled U.S. oil prices, to unleash the supply side of the U.S. economy. Even more
importantly, Reagan backed Federal Reserve Chairman Paul Volcker's campaign to strengthen and stabilize the U.S. dollar. By the
end of Reagan's two terms in office, real oil prices had plunged to $27.88/bbl. As
Russia does today, the old USSR depended upon oil exports for most of its foreign
exchange earnings, and much of its government revenue. The 68% reduction in real
oil prices during the Reagan years drove the USSR bankrupt. In May 1990, Gorbachev called
German Chancellor Helmut Kohl and begged him for a loan of $12 billion to stave off financial disaster. Kohl advanced only $3 billion.
By August of 1990, Gorbachev was back, pleading for more loans. In December 1991, the Soviet Union collapsed.
President Bill Clinton's strong dollar policy (implemented via Federal Reserve Vice-Chairman Wayne Angell's secret commodity
price rule system) kept real oil prices low during the 1990s, despite rising world oil demand. Real crude oil prices during Clinton's
time in office averaged only $27.16/bbl. At real oil price levels like this, Russia is financially
incapable of causing much trouble. It was George W. Bush and Barack Obama's
feckless weak dollar policy that let the Russian geopolitical genie out of the bottle.

From the end of 2000 to the end of 2013, the gold value of the dollar fell by 77%, and real oil prices tripled, to $111.76/bbl. It is
these artificially high oil prices that are fueling Putin's mischief machine. The Russian government has
approved a 2014 budget calling for revenues of $409.6 billion, spending of $419.6
billion, and a deficit of $10.0 billion, or 0.4% of expected GDP of $2.5 trillion. Unlike
the U.S., which has deep financial markets and prints the world's reserve currency,
Russia cannot run large fiscal deficits without creating hyperinflation. Given that
Russia expects to get about half of its revenue from taxes on its oil and gas
industry, it is clear that it would not take much of a decline in world oil prices to
create financial difficulties for Russia. Assuming year-end 2013 prices for crude oil ($111.76/bbl) and natural
gas ($66.00/FOE* bbl), the total revenue of Russia's petroleum industry is $662.3 billion (26.5% of GDP), and Russia's oil and gas
export earnings are $362.2 billion, or 14.5% of GDP. Obviously, a decline in world oil prices would cause
the Russian economy and the Russian government significant financial pain.

Nuclear war
Filger 9

[Sheldon, founder of Global Economic Crisis, The Huffington Post, 5.10.9, http://www.huffingtonpost.com/sheldonfilger/russian-economy-faces-dis_b_201147.html // jweideman]

In Russia, historically, economic health and political stability are intertwined to a
degree that is rarely encountered in other major industrialized economies. It was the economic
stagnation of the former Soviet Union that led to its political downfall. Similarly,
Medvedev and Putin, both intimately acquainted with their nation's history, are unquestionably alarmed at the
prospect that Russia's economic crisis will endanger the nation's political stability,
achieved at great cost after years of chaos following the demise of the Soviet Union. Already, strikes and protests are occurring
among rank and file workers facing unemployment or non-payment of their salaries. Recent polling demonstrates that the once
supreme popularity ratings of Putin and Medvedev are eroding rapidly. Beyond the political elites are the
financial oligarchs, who have been forced to deleverage, even unloading their
yachts and executive jets in a desperate attempt to raise cash. Should the
Russian economy deteriorate to the point where economic collapse is not out of
the question, the impact will go far beyond the obvious accelerant such an outcome would
be for the Global Economic Crisis. There is a geopolitical dimension that is even more relevant than the economic
context. Despite its economic vulnerabilities and perceived decline from superpower
status, Russia remains one of only two nations on earth with a nuclear arsenal of
sufficient scope and capability to destroy the world as we know it. For that reason, it is not
only President Medvedev and Prime Minister Putin who will be lying awake at nights over the prospect that a national economic
crisis can transform itself into a virulent and destabilizing social and political upheaval. It just may be possible that U.S.

UQ: econ
Russia's economy is weak, but growth is coming
ITAR-TASS 7/3

[Russian news agency. 7/3/14, Russia's economic growth to accelerate to over 3% in 2017,
http://en.itar-tass.com/economy/738849 //jweideman]

The Russian government expects the national economy to grow , despite the unfavorable
economic scenario for this year, Prime Minister Dmitry Medvedev said on Thursday. Medvedev chaired a
government meeting on Thursday to discuss the federal budget parameters and
state program financing until 2017. This scenario (of Russias social and economic
development) unfortunately envisages a general economic deterioration and slower
economic growth this year. Then, the growth is expected to accelerate gradually to 2%
next year and to over 3% in 2017, Medvedev said. The 2015-2017 federal budget will set
aside over 2.7 trillion rubles ($78 billion) for the payment of wages and salaries, the
premier said, adding the government's social obligations were a priority in the forthcoming budget period. More than a half
of federal budget funds are spent through the mechanism of state programs. In
2015, the government intends to implement 46 targeted state programs to the
amount of over 1 trillion rubles (about $30 billion) in budget financing, the premier said. The same
budget financing principle will now be used by regional and local governments,
including Russia's two new constituent entities - the Republic of Crimea and the city
of Sevastopol, the premier said.

Russia will avoid recession


AFP 7/10

[Agence France-Presse is an international news agency headquartered in Paris. It is the oldest news agency in the
world and one of the largest. 7/10/14, Russia avoids recession in second quarter
https://au.news.yahoo.com/thewest/business/world/a/24425122/russia-avoids-recession-in-second-quarter/ //jweideman]

Moscow (AFP) - Russia managed to avoid sliding into a recession in the second quarter, its
deputy economy minister said Wednesday, citing preliminary estimates showing zero growth for the three
months ending June. The emerging giant was widely expected to post a technical recession
with two consecutive quarters of contraction, after massive capital flight over the
uncertainty caused by the Ukraine crisis. "We were expecting a possible technical recession in the second
quarter," Russia's deputy economy minister Andrei Klepach said, according to Interfax
news agency. But it now "appears that we will avoid recession, our preliminary
forecast is one of zero growth after adjusting for seasonal variations," he said.
Official data is due to be published by the statistics institute during the summer,
although no specific date has been given. Russia's growth slowed down sharply in the first quarter as investors, fearing the impact
of Western sanctions over Moscow's annexation of Crimea, withdrew billions worth of their assets. In the first three months of the
year, output contracted by 0.3 percent compared to the previous quarter. Klepach estimated that on an annual
basis, Russia's annual growth would better the 0.5 percent previously forecast. "The
situation is unstable, but we think... that the trend would be positive despite
everything," he said, adding however that it would be a "slight recovery rather than solid growth". The government has
until now not issued any quarter to quarter growth forecast. Economy minister Alexei Ulyukayev on Monday issued a year-on-year
forecast, saying output expanded by 1.2 percent in the second quarter of the year.

Russia is experiencing growth


AFP 7/7

[Agence France-Presse is an international news agency headquartered in Paris. It is the oldest news agency in the
world and one of the largest. 7/7/14, Russian output expands 1.2% in second quarter: economy minister
http://www.globalpost.com/dispatch/news/afp/140707/russian-output-expands-12-second-quarter-economy-minister-0 //jweideman]

Russia's economy minister said Monday that output expanded by 1.2 percent in the
second quarter compared to the same period last year , a preliminary figure that is "slightly better"
than expected despite the Ukraine crisis. "The results are slightly better than we predicted, with the
emphasis on 'slightly'," economy minister Alexei Ulyukayev said. He added that the "refined" official figure by Russia's statistics
agency will be released later. The IMF said last month that Russia is already in recession, while the central bank said growth in 2014
is likely to slow to just 0.4 percent. Ulyukayev said that second quarter figure, up from 0.9
percent in the first quarter and 0.2 percent better than his ministry's expectations,
was likely bolstered by industrial growth. "This year industry is becoming the main
locomotive of growth," Ulyukayev told President Vladimir Putin in a meeting, citing industrial growth reaching 2.5
percent, up from 1.8 percent in the first quarter.

UQ AT: sanctions
There won't be more sanctions
Al Jazeera 14

[Al Jazeera also known as Aljazeera and JSC (Jazeera Satellite Channel), is a Doha-based broadcaster owned
by the Al Jazeera Media Network, June 5 2014. G7 holds off from further Russia sanctions
http://www.aljazeera.com/news/europe/2014/06/g7-hold-off-from-further-russia-sanctions-20146545044590707.html //jweideman]

Leaders of the G7 group of industrial nations have limited themselves to warning
Russia of another round of sanctions as they urged President Vladimir Putin to stop
destabilising Ukraine. On the first day of the group's summit in Brussels on Wednesday, the bloc said that Putin must
pull Russian troops back from his country's border with Ukraine and stem the pro-Russian uprising in the east of the country. "Actions
to destabilise eastern Ukraine are unacceptable and must stop," said the group in a joint statement after the meeting. "We stand
ready to intensify targeted sanctions and to implement significant additional restrictive measures to impose further costs on Russia
should events so require." However, Angela Merkel, the German chancellor, said that further
sanctions would take effect only if there had been "no progress whatsoever". Al Jazeera's
James Bays, reporting from Brussels, said: "The G7 leaders are talking here again about tougher sanctions on Russia. But
they are talking about the same tougher sanctions they have been talking about for
months now." Meanwhile, Putin reached out a hand towards dialogue, despite being banned from
the summit following Russia's annexation of Crimea in March, saying that he was ready to meet Ukraine's president-elect Petro
Poroshenko and US President Barack Obama.

U-Production decline
Oil production is declining
Summers 14

[Dave, author for EconoMonitor. July 7, 2014, Oil Production Numbers Keep Going Down
http://www.economonitor.com/blog/2014/07/oil-production-numbers-keep-going-down/ //jweideman]

One problem with defining a peak in global oil production is that it is only really evident sometime after the event, when one can
look in the rear view mirror and see the transition from a growing oil supply to one that is now declining.
Before that relatively absolute point, there will likely come a time when global
supply can no longer match the global demand for oil that exists at that price. We
are beginning to approach the latter of these two conditions, with the former being increasingly
probable in the none-too-distant future. Rising prices continually change this latter condition, and may initially disguise the arrival of
the peak, but it is becoming inevitable. Over the past two years there has been a steady growth in demand, which OPEC expects to
continue at around the 1 mbd range, as has been the recent pattern. The challenge, on a global scale, has been to identify where
the matching growth in supply will come from, given the declining production from older oilfields and the decline rate of most of the
horizontal fracked wells in shale. At present the United States is sitting with folk being relatively
complacent, anticipating that global oil supplies will remain sufficient, and that the
availability of enough oil in the global market to supply that reducing volume of oil
that the US cannot produce for itself will continue to exist. Increasingly over the next couple of
years this is going to turn out to have created a false sense of security, and led to decisions on energy that will not easily be
reversed. Consider that the Canadians have now decided to build their Pipeline to the Pacific. The Northern Gateway pipeline that
Enbridge will build from the oil sands to the port of Kitimat.

Oil production is decreasing off the coast


Larino 14

[Jennifer, Staff writer for NOLA.com | The Times-Picayune covering energy, banking/finance and general business
news in the greater New Orleans area. 7/8/14, Oil, gas production declining in Gulf of Mexico federal waters, report says
http://www.nola.com/business/index.ssf/2014/07/oil_gas_production_in_federal.html //jweideman]

Oil and gas found off the coast of Louisiana and other Gulf Coast states made up almost
one quarter of all fossil fuel production on federal lands in 2013, reinforcing the region's role as
a driving force in the U.S. energy industry, according to updated government data. But a closer look at the
numbers shows the region's oil and gas production has been in steady decline for
much of the past decade. A new U.S. Energy Information Administration report
shows federal waters in the Gulf of Mexico in 2013 accounted for 23 percent of the
16.85 trillion British thermal units (Btu) of fossil fuels produced on land and water
owned by the federal government. That was more than any other state or region aside from Wyoming, which has
seen strong natural gas production in recent years. The report did not include data on oil and gas production on private lands, which
makes up most production in many onshore oil and gas fields, including the Haynesville Shale in northwest Louisiana. But
production in the offshore gulf has also fallen every year since 2003. According to
the report, total fossil fuel production in the region is less than half of what it was a
decade ago, down 49 percent from 7.57 trillion Btu in 2003 to 3.86 trillion in 2013. The report notes that the region has seen
a sharp decline in natural gas production as older offshore fields dry out and more companies invest in newer gas finds
onshore, where hydraulic fracturing has led to a boom in production. Natural gas production in the offshore gulf was down 74 percent
from 2003 to 2013. The region's oil production has declined, though less drastically. The offshore gulf produced about 447 million
barrels of oil in 2013, down from a high of 584 million barrels in 2010. Still, the region accounted for 69 percent of all the crude oil
produced on all federal lands and waters last year.

Offshore well exploration is ineffective and expensive now


API 14 [The American Petroleum Institute, commonly referred to as API, is the largest U.S trade association for the oil and
natural gas industry. January 2014, Offshore Access to Oil and Gas http://www.api.org/policy-and-issues/policyitems/exploration/~/media/Files/Oil-and-Natural-Gas/Offshore/OffshoreAccess-primer-highres.pdf//jweideman]
Sometimes when a lease is not producing, critics claim it is idle. Much more often than not, non-producing leases are not idle at
all; they are under geological evaluation or in development and could become an important source of domestic supply.

Companies purchase leases hoping they will hold enough oil or natural gas to
benefit consumers and become economically viable for production. Companies can
spend millions of dollars to purchase a lease and then explore and develop it, only
to find that it does not contain oil and natural gas in commercial quantities. It is not
unusual for a company to spend in excess of $100 million only to drill a dry hole.
The reason is that a company usually only has limited knowledge of resource
potential when it buys a lease. Only after the lease is acquired will the company be in a position to evaluate it,
usually with a very costly seismic survey followed by an exploration well. If a company does not find oil or natural gas in commercial
quantities, the company hands the lease back to the government, incurs the loss of invested money and moves on to more
promising leases. If a company finds resources in commercial quantities, it will produce the lease. But there sometimes can be
delays - often as long as ten years - for environmental and engineering studies, to acquire permits, to install production facilities
(or platforms for offshore leases) and to build the necessary infrastructure to bring the resources to market. Litigation, landowner
disputes and regulatory hurdles also can delay the process.

Oil is getting harder to find


Spak 11 [Kevin, staff writer for newser. 2/16/11, Exxon: Oil Becoming Hard to Find
http://www.newser.com/story/112166/exxon-oil-becoming-hard-to-find.html //jweideman]

Exxon Mobil Corp. is having trouble finding more oil, it revealed yesterday in its
earnings call. The company said it was depleting its reserves, replacing only 95 out
of every 100 barrels it pumps. Though it tried to put a positive spin on things by saying it had made up for the
shortfall by bolstering its natural gas supplies, experts tell the Wall Street Journal that's probably a
shift born of grim necessity. Oil companies across the board are drifting toward
natural gas, because oil is harder and harder to come by , according to the Journal. Most accessible
fields are already tapped out, while new fields are either technically or politically difficult to exploit. The good old days are gone and
not to be repeated, says one analyst. Natural gas is not going to give you the same punch as oil.

2NC Link Wall


Satellites prevent rig-destroying eddies- empirically solves production
ESA 6 [European Space Agency, 1 March 2006, SATELLITE DATA USED TO WARN OIL INDUSTRY OF POTENTIALLY DANGEROUS
EDDY, http://www.esa.int/Our_Activities/Observing_the_Earth/Satellite_data_used_to_warn_oil_industry_of_potentially_dangerous_eddy //jweideman]

Ocean FOCUS began issuing forecasts on 16 February 2006 just in time to warn oil
production operators of a new warm eddy that has formed in the oil and gas-producing region of the Gulf of Mexico. These eddies, similar to underwater
hurricanes, spin off the Loop Current - an intrusion of warm surface water that flows northward from the
Caribbean Sea through the Yucatan Strait from the Gulf Stream - and can cause extensive and costly damage
to underwater equipment due to the extensive deep water oil production activities in the
region. The Ocean FOCUS service is a unique service that provides ocean current forecasts to the offshore oil
production industry to give prior warning of the arrival of eddies. The service is
based on a combination of state-of-the-art ocean models and satellite
measurements. Oil companies require early warning of these eddies in order to
minimise loss of production, optimise deep water drilling activities and prevent damage to critical equipment. The
Loop Current and eddies shedding from it pose two types of problems for underwater production systems: direct force and induced
vibrations, which create more stress than direct force and results in higher levels of fatigue and structural failure.
The impact of these eddies can be very costly in terms of downtime in production and
exploration and damage to sub sea components.

IOOS leads to more effective production


NOAA 13 [U.S. Department of Commerce www.ioos.noaa.gov May 2013 National Oceanic and Atmospheric Administration
Integrated Ocean Observing System (IOOS) Program. May 2013, IOOS: Partnerships Serving Lives and Livelihoods
http://www.ioos.noaa.gov/communications/handouts/partnerships_lives_livelihoods_flyer.pdf //jweideman]

IOOS supplies critical information about our Nation's waters. Scientists working to understand
climate change, governments adapting to changes in the Arctic, municipalities monitoring local water quality, industries
understanding coastal and marine spatial planning - all have the same need:
reliable and timely access to data
and information that informs decision making. Improving Lives and Livelihoods IOOS enhances our
economy. Improved information allows offshore oil and gas platform and coastal
operators, municipal planners, and those with interests in the coastal zone to
minimize impacts of natural hazards, sea level rise, and flooding. This information
improves marine forecasts so mariners can optimize shipping routes, saving time
and reducing fuel expenses - translating into cost savings for consumers. IOOS benefits our
safety and environment. A network of water quality monitoring buoys on the Great Lakes makes beaches safer by detecting and
predicting the presence of bacteria (E. coli). Mariners use IOOS wave and surface current data to navigate ships safely under
bridges and in narrow channels.

Satellites are effective at finding new oil wells


DTU 9

[Technical University of Denmark (DTU) 2/27/9, New Oil Deposits Can Be Identified Through Satellite Images
http://www.sciencedaily.com/releases/2009/02/090226110812.htm //jweideman]

A new map of the Earth's gravitational force based on satellite measurements
makes it much less resource intensive to find new oil deposits. The map will be particularly
useful as the ice melts in the oil-rich Arctic regions. Ole Baltazar, senior scientist at the National Space
Institute, Technical University of Denmark (DTU Space), headed the development of
the map. The US company Fugro, one of the world's leading oil exploration companies, is one of the companies that have
already made use of the gravitational map. The company has now initiated a research partnership with DTU Space. Ole
Baltazar's gravitational map is the most precise and has the widest coverage to date, says Li Xiong, Vice President and Head
Geophysicist with Fugro. On account of its high resolution and accuracy, the map is particularly useful in
coastal areas, where the majority of the oil is located. Satellite measurements
result in high precision. Ole Baltazar's map shows variations in gravitational force across
the surface of the Earth and knowledge about these small variations is a valuable
tool in oil exploration. Subterranean oil deposits are encapsulated in relatively light
materials such as limestone and clay and because these materials are light, they
have less gravitational force than the surrounding materials. Ole Baltazar's map is based on
satellite measurements and has a hitherto unseen level of detail and accuracy. With this map in your hands, it is, therefore,
easier to find new deposits of oil underground. Climate change is revealing new sea regions The
gravitational map from DTU Space is unique on account of its resolution of only 2 km and the fact that it covers both land and sea
regions. Oil companies use the map in the first phases of oil exploration. Previously, interesting areas were typically selected using
protracted, expensive measurements from planes or ships. The interesting areas appear clearly on the
map and the companies can, therefore, plan their exploration much more efficiently.

The map will also be worth its weight in gold when the ice in the Arctic seriously begins to melt, revealing large sea regions where it
is suspected that there are large deposits of oil underground. With our map, the companies can more quickly start to drill for oil in
the right places without first having to go through a resource-intensive exploration process, explains Ole Baltazar. Based on height measurements instead of direct gravitation measurements: The success of the gravitational map is due in large part to the fact that
it is not based on direct gravitation measurements but on observations of the height of the sea, which reflects the gravitation.

Height measurements have the advantage that it is possible to determine the


gravitational field very locally and thus make a gravitational map with a resolution
of a few km. For comparison, the resolution of satellite measurements of gravitational force is typically around 200 km.
Satellite gravitation measurements are used, for example, to explore conditions in the deeper strata of the Earth, but are not well
suited to our purposes, Ole Baltazar explains.

IL-Prices k2 Russia's economy


High oil prices are key to political and economic stability in
Russia
Ioffe 12

[Julia, Foreign Policy's Moscow correspondent. 6/12/12, What will it take to push Russians over the edge
http://www.foreignpolicy.com/articles/2012/06/12/powder_keg?page=0,1 //jweideman]

There is also the economic factor to consider. The Russian economy is currently
growing at a relatively healthy 3.5 percent, but it's useful to recall the whopping
growth rates Russia was posting just a few years ago. In 2007, the year before the world financial
crisis hit Russia, Russia's GDP growth topped 8 percent. It had been growing at that pace,
buoyed by soaring commodity prices, for almost a decade, and it was not accidental that this
was the decade in which Putin made his pact with the people: You get financial and
consumer comforts, and we get political power. It's hard to maintain such a pact
when the goodies stop flowing. Which brings us to the looming issue of the Russian budget deficit. To keep
the people happy and out of politics, the Russian government has promised a lot of
things to a lot of people. (Putin's campaign promises alone are estimated by the
Russian Central Bank to cost at least $170 billion.) To balance its budget with such
magnanimity, Russia needs high oil prices, to the point where last month, the
Ministry of Economic Development announced that an $80 barrel of oil would be a
"crisis." Keeping in mind that oil is now about $98 a barrel, and that Russia used to be able to balance its budgets just fine with
oil at a fraction of the price, this doesn't look too good for Putin. Factor in the worsening European crisis -- Europe is still Russia's
biggest energy customer -- and the fact that the state has put off unpopular but increasingly necessary reforms, like raising utility
prices, and you find yourself looking at a powder keg.

Russia's economy and regime is oil dependent - reverse causal


Schuman 12

[Michael, is an American author and journalist who specializes in Asian economics, politics and history. He is
currently the Asia business correspondent for TIME Magazine. July 5 2012, Why Vladimir Putin Needs Higher Oil Prices
http://business.time.com/2012/07/05/why-vladimir-putin-needs-higher-oil-prices/ //jweideman]

Falling oil prices make just about everyone happy. For strapped consumers in struggling developed nations, lower oil prices mean a smaller payout at the pump, freeing up room in strained wallets to spend on other things and boosting economic growth. In the developing world, lower oil prices mean reduced inflationary pressures, which will give central bankers more room to stimulate sagging growth. With the global economy still climbing out of the 2008 financial crisis, policymakers around the world can welcome lower oil prices as a rare piece of helpful news. But Vladimir Putin is not one of them. The economy that the Russian President has built not only runs on oil, but runs on oil priced extremely high. Falling oil prices means rising problems for Russia - both for the strength of its economic performance, and possibly, the strength of Putin himself. Despite the fact that Russia has been labeled one of the world's most promising emerging markets, often mentioned in the same breath as China and India, the Russian economy is actually quite different from the others. While India gains growth benefits from an expanding population, Russia, like much of Europe, is aging; while economists fret over China's excessive dependence on investment, Russia badly needs more of it. Most of all, Russia is little more than an oil state in disguise. The country is the largest producer of oil in the world (yes, bigger even than Saudi Arabia), and Russia's dependence on crude has been increasing. About a decade ago, oil and gas accounted for less than half of Russia's exports; in recent years, that share has risen to two-thirds. Most of all, oil provides more than half of the federal government's revenues. What's more, the economic model Putin has designed in Russia relies heavily not just on oil, but high oil prices. Oil lubricates the Russian economy by making possible the increases in government largesse that have fueled Russian consumption. Budget spending reached 23.6% of GDP in the first quarter of 2012, up from 15.2% four years earlier. What that means is Putin requires a higher oil price to meet his spending requirements today than he did just a few years ago. Research firm Capital Economics figures that the government budget balanced at an oil price of $55 a barrel in 2008, but that now it balances at close to $120. Oil prices today have fallen far below that, with Brent near $100 and U.S. crude less than $90. The farther oil prices fall, the more pressure is placed on Putin's budget, and the harder it is for him to keep spreading oil wealth to the greater population through the government. With a large swath of the populace angered by his re-election to the nation's presidency in March, and protests erupting on the streets of Moscow, Putin can ill-afford a significant blow to the economy, or his ability to use government resources to firm up his popularity. That's why Putin hasn't been scaling back even as oil prices fall. His government is earmarking $40 billion to support the economy, if necessary, over the next two years. He does have financial wiggle room, even with oil prices falling. Moscow has wisely stashed away petrodollars into a rainy day fund it can tap to fill its budget needs. But Putin doesn't have the flexibility he used to have. The fund has shrunk, from almost 8% of GDP in 2008 to a touch more than 3% today. The package, says Capital Economics, simply highlights the weaknesses of Russia's economy: This cuts to the heart of a problem we have highlighted before - namely that Russia is now much more dependent on high and rising oil prices than in the past ... The fact that the share of permanent spending (e.g. on salaries and pensions) has increased ... creates additional problems should oil prices drop back (and is also a concern from the perspective of medium-term growth) ... The present growth model looks unsustainable unless oil prices remain at or above $120pb.

IL-Production lowers prices


Production lowers oil prices
Leonardo 12

[Maugeri, Leonardo. Global Oil Production is Surging: Implications for Prices, Geopolitics, and the
Environment. Policy Brief, Belfer Center for Science and International Affairs, Harvard Kennedy School, June
2012. http://belfercenter.ksg.harvard.edu/files/maugeri_policybrief.pdf //jweideman]

Oil Prices May Collapse. Contrary to prevailing wisdom that increasing global demand for oil will increase prices, the report finds oil production capacity is growing at such an unprecedented level that supply might outpace consumption. When the glut of oil hits the market, it could trigger a collapse in oil prices. While the age of "cheap oil" may be ending, it is still uncertain what the future level of oil prices might be. Technology may turn today's expensive oil into tomorrow's cheap oil. The oil market will remain highly volatile until 2015 and prone to extreme movements in opposite directions, representing a challenge for investors. After 2015, however, most of the oil exploration and development projects analyzed in the report will advance significantly and contribute to a shoring up of the world's production capacity. This could provoke overproduction and lead to a significant, steady dip of oil prices, unless oil demand were to grow at a sustained yearly rate of at least 1.6 percent through 2020. Shifting Market Has Geopolitical Consequences. The United States could conceivably produce up to 65 percent of its oil consumption needs domestically, and import the remainder from North American sources and thus dramatically affect the debate around dependence on foreign oil. However the reality will not change much, since there is one global oil market in which all countries are interdependent. A global oil market tempers the meaningfulness of self-sufficiency, and Canada, Venezuela, and Brazil may decide to export their oil and gas production to non-U.S. markets purely for commercial reasons. However, considering the recent political focus on U.S. energy security, even the spirit of oil self-sufficiency could have profound implications for domestic energy policy and foreign policy. While the unique conditions for the shale boom in the United States cannot be easily replicated in other parts of the world in the short-term, there are unknown and untapped resources around the globe and the results of future exploration development could be surprising. This combined with China's increasing influence in the Middle East oil realm will continue to alter the geopolitics of energy landscape for many decades.

New production decreases global prices


Tallet et al 14

[Harry Vidas, Martin Tallett, Tom O'Connor, David Freyman, William Pepper, Briana Adams, Thu Nguyen, Robert Hugman, Alanna Bock; ICF International and EnSys Energy. March 31, 2014. The Impacts of U.S. Crude Oil Exports on Domestic Crude Production, GDP, Employment, Trade, and Consumer Costs http://www.api.org/~/media/Files/Policy/LNG-Exports/LNG-primer/APICrude-Exports-Study-by-ICF-3-31-2014.pdf //jweideman]

The increase in U.S. crude production accompanied by a relaxation of crude export constraints would tend to increase the overall global supply of crude oil, thus pulling downward pressure on global oil prices. Although the U.S. is the second largest oil producer in the world and could soon be the largest by 2015, according to the International Energy Agency (IEA), the price impact of crude exports is determined by the incremental production, rather than total production. For this study, ICF used Brent crude as proxy for the global crude price as affected by forces of global crude supply and demand. The impact of lifting crude exports on Brent prices, as shown in Exhibit 4-19, is relatively small, about $0.05 to $0.60/bbl in the Low-Differential Scenario and about $0.25 to $1.05/bbl in the High-Differential Scenario. It should be noted that Brent prices are affected by various factors such as emerging supply sources, OPEC responses to increasing U.S. and Canadian production, and geopolitical events. Changes in any of these factors could mean actual Brent prices would deviate significantly from our forecasts. However, in general, higher global production leads to lower crude prices, all other factors being equal. Allowing crude exports results in a Brent average price of $103.85/bbl over the 2015-2035 period, down $0.35/bbl from the Low-Differential Scenario without exports and down $0.75/bbl from the High-Differential Scenario without exports. While global crude prices drop, domestic crude prices gain strength when exports are allowed because lifting the restrictions helps relieve the U.S. crude oversupply situation and allows U.S. crudes to fully compete and achieve pricing in international markets close to those of similar crude types. Exhibit 4-20 shows WTI prices increase to an average of $98.95/bbl over the 2015-2035 period in the Low-Differential and High-Differential With Exports Cases as opposed to $96.70/bbl in the Low-Differential No Exports Case and $94.95/bbl in the High-Differential No Exports Case. The range of increase related to allowing exports is $2.25 to $4.00/bbl averaged over 2015 to 2035.

Production decreases prices


Alquist and Guenette 13

[Ron Alquist and Justin-Damien Guénette work for the Bank of Canada. 2013, A Blessing in Disguise: The Implications of High Global Oil Prices for the North American Market, http://www.bankofcanada.ca/wpcontent/uploads/2013/07/wp2013-23.pdf //jweideman]

The presence of these unconventional sources of oil throughout the world and the ability to recover
them makes a large expansion in the physical production of oil a possibility. Recent
estimates suggest that about 3.2 trillion barrels of unconventional crude oil, including up to 240 billion barrels of tight oil, are
available worldwide (IEA 2012a). By 2035, about 14 per cent of oil production will consist of unconventional oil, an increase of 9
percentage points.

The potential for unconventional oil extraction around the world has led some
oil industry analysts to describe scenarios in which the world experiences an oil glut
and a decline in oil prices over the medium term (Maugeri 2012).

Impact-Proliferation
Russia's economy is key to stop proliferation
Bukharin 3 [Oleg, August, he is affiliated with Princeton University and received his Ph.D. in physics from the Moscow
Institute of Physics and Technology. The Future of Russia: The Nuclear Factor,
http://www.princeton.edu/~lisd/publications/wp_russiaseries_bukharin.pdf,//jweideman]
There are presently no definite answers about the future of the nuclear security agenda in Russia. The Russian nuclear legacy - its nuclear forces, the nuclear-weapons production and power-generation complex, huge stocks of nuclear-useable highly enriched uranium and plutonium, and environmental clean-up problems - is not going to go away anytime soon. What is clear is that nuclear security and proliferation risks will be high as long as there remain their underlying causes: the oversized and underfunded nuclear complex, the economic turmoil, and wide-spread crime and corruption. The magnitude of the problem, however, could vary significantly depending on Russia's progress in downsizing of its nuclear weapons complex; its ability to maintain core competence in the nuclear field; availability of funding for the nuclear industry and safeguards and security programs; political commitment by the Russian government to improve nuclear security; and international cooperation. Economically-prosperous Russia, the rule of law, and a smaller, safer and more secure nuclear complex would make nuclear risks manageable. An integration of the Russian nuclear complex into the

world's nuclear industry, increased transparency of nuclear operations, and cooperative nuclear security relations with the United States and other
western countries are also essential to reducing nuclear dangers and preventing catastrophic terrorism.

Proliferation is the worst - it destabilizes hegemony and all
other nations and escalates to nuclear war - deterrence
doesn't account for irrational actors
Maass 10 Richard Maass, working for his Ph.D. in political science at Notre Dame
University, and currently teaches classes there on International Relations. 2010
Nuclear Proliferation and Declining U.S. Hegemony
http://www.hamilton.edu/documents//levitt-center/Maass_article.pdf
On August 29, 1949, the Soviet Union successfully tested its first nuclear fission
bomb, signaling the end of U.S. hegemony in the international arena. On September
11th, 2001, the world's single most powerful nation watched in awe as the very
symbols of its prosperity fell to rubble in the streets of New York City. The United
States undisputedly has a greater share of world power than any other country in
history (Brooks and Wolforth, 2008, pg. 2). Yet even a global hegemon is ultimately fallible and
vulnerable to rash acts of violence as it conducts itself in a rational manner and
assumes the same from other states. Conventional strategic thought and military action no longer prevail in an
era of increased globalization. Developing states and irrational actors play increasingly
influential roles in the international arena. Beginning with the U.S.S.R. in 1949, nuclear
proliferation has exponentially increased states' relative military capabilities as well
as global levels of political instability. Through ideas such as nuclear peace theory,
liberal political scholars developed several models under which nuclear weapons not
only maintain but increase global tranquility. These philosophies assume rationality
on the part of political actors in an increasingly irrational world plagued by
terrorism, despotic totalitarianism, geo-political instability and failed international
institutionalism. Realistically, proliferation of nuclear [weapons] constitutes a
threat to international peace and security (UN Security Council, 2006, pg. 1). Nuclear security
threats arise in four forms: the threat of existing arsenals, the emergence of new
nuclear states, the collapse of international non-proliferation regimes and the rise of

nuclear terrorism. Due to their asymmetric destabilizing and equalizing effects,


nuclear weapons erode the unipolarity of the international system by balancing
political actors' relative military power and security. In the face of this inevitable nuclear proliferation
and its effects on relative power, the United States must accept a position of declining
hegemony. Despite nuclear proliferation's controversial nature, states continue to develop the technologies requisite for constructing nuclear weapons. What motivates men to create the most terrifying weapons ever created by human kind - unique in their destructive power and in their lack of direct military utility (Cirincione, 2007, pg. 47)? Why then do states pursue the controversial and costly path of proliferation? To states, nuclear weapons comprise a symbolic asset of strength and as a prerequisite for great power status (Cirincione, 2007, pg. 47). On a simplistic level, nuclear weapons make states feel more powerful, respected and influential in world politics. When it is in their best interest,

states develop nuclear capabilities to ensure their own sovereignty and to


potentially deter other states from attacking. According to realist thinkers, nuclear
weapons provide the ultimate security guarantor in an anarchic international
system (Cirincione, 2007, pg. 51). Proliferation optimists and rational deterrence theorists,
such as Kenneth Waltz, argue proliferation stabilizes international security and
promotes peace. Rational deterrence theory states that nations refrain from nuclear
conflict because of the extraordinarily high cost . Arguably the most powerful military technology ever
developed by man, nuclear weapons have only twice been deployed in actual conflict, due to the devastation they incur. Nuclear
weapons increase the potential damage of any given military conflict due to their immense destructive capabilities. Summarizing
rational deterrence framework, Waltz asserts states are deterred by the prospect of suffering severe damage and by their inability
to do much to limit it (Sagan and Waltz, 2003, pg 32). According to the rational deterrence framework, political actors refrain from
both conventional and nuclear conflict because of the unacceptably high costs. Ultimately an assumption, rational deterrence theory lacks any empirically tested evidence. Nuclear proliferation exponentially increases the possibility of non-proliferation regime collapse and nuclear conflict, reducing all states' relative power. Nuclear peace theory seems plausible, but like any mathematical model it may only marginally apply to world politics and the dynamics of nuclear proliferation, due to the fact that international security is not reducible to the theory of mathematical games (Bracken, 2002, pg. 403). Rather, the spread of nuclear weapons exponentially decreases the stability of regional and global politics by intensifying regional rivalries and political tensions, both of which may potentially catalyze a nuclear catastrophe. Frustrated with a lack of results through conventional conflict, desperate states may look to nuclear arsenals as a source of absolute resolution for any given conflict. The use of nuclear weapons, even in a limited theater, could plausibly trigger chain reactions rippling across the globe. With their interests and sovereignty threatened, other nuclear states will eventually use their own weapons in an effort to ensure national security. President Kennedy warned of the

danger of nuclear proliferation in 1963: I ask you to stop and think for a moment what it would mean to have nuclear weapons in so many hands, in the hands of countries ... there would be no rest for anyone then, no stability, no real security ... there would only be the increased chance of accidental war, and an increased necessity for the great powers to involve themselves in what otherwise would be local conflicts (Cirincione, 2007, pg. 103). Proliferation decreases the relative security of all states not only through the possibility of direct conflict, but also by threatening foreign and domestic interests. As the sole international hegemon, the U.S. seeks to use its power to insure its security and influence international politics in a way that reflects its own interests and values (Huntington, 1993, pg. 70). In addition to creating a direct security threat, further proliferation jeopardizes the United States' ability to project its primacy and promote its interests internationally.

Impact-Relations
US-Russia tensions high now
Labott 14

[Elise Labott, CNN Foreign Affairs Reporter. 3/11/14, Ukraine impasse stirs U.S.-Russia tensions
http://www.cnn.com/2014/03/10/world/europe/ukraine-us-russia-tensions/ //jweideman]

Tensions between the United States and Russia over the crisis in Crimea have
exploded into an open row as Russia rejects U.S. diplomatic efforts to solve the
impasse. Russian Foreign Minister Sergey Lavrov said Monday that U.S. Secretary of State John Kerry postponed a face-to-face meeting with Russian President Vladimir Putin to discuss American proposals, which Moscow has effectively rejected, on solving the crisis. The meeting, which Russia said was supposed to happen Monday, would have marked the highest-level contact between the two countries since Russian troops took up positions in Crimea, and would have come ahead of Sunday's potentially explosive vote on whether Crimea should split from Ukraine and join Russia. But Kerry told Lavrov he needed to know Moscow would engage seriously on a diplomatic solution before meeting with the Russian leader. He also wanted to see an end to Russia's "provocative steps" before traveling to Russia. Relations between Russia and the West have grown increasingly tense since Russian soldiers seized effective control of the pro-Russian region. The United States and other European powers have threatened possible sanctions in response to Russia's moves, but Moscow has shown little sign of backing down.

AT: Dutch disease


Russia doesn't have Dutch disease, but oil still matters
Adomanis 12

[Mark, contributor to Forbes, specializes in Russian economics. 6/22/12, Is Russia Suffering From Dutch
Disease? http://www.forbes.com/sites/markadomanis/2012/06/22/is-russia-suffering-from-dutch-disease///jweideman]

Part of the reason that Russia is not experiencing Dutch Disease (which is something you would normally expect in a country that has earned such an enormous pile of money from selling oil and natural gas) is that the world economy has been in turmoil for most of the past 4 years: there has been a flight to quality in safe assets and currencies which has surely worked to weaken the ruble and depress its value. The new normal is actually a pretty bizarre state of affairs, and is characterized by any number of things, such as negative real interest rates on German Bunds and US treasuries, that ten years ago would have seemed impossible. Russia's economy faces an awful lot of risks, and its overdependence on natural resources is extremely dangerous, particularly at a time that global growth is slamming to a halt. Buckley is right that Russia needs to diversify, and that its government will find this process to be an extremely difficult and complicated one. But, at the present time, one of the very few economic risks that Russia doesn't face is Dutch Disease: its currency isn't overvalued and, if anything, is actually trending lower against the main reserve currencies.

No Russian Dutch disease


Dobrynskaya and Turkisch 9

[Victoria Dobrynskaya & Edouard Turkisch, CEPII. September 2009, Is Russia sick with the Dutch disease? http://www.cepii.com/PDF_PUB/wp/2009/wp200920.pdf //jweideman]

The fear that the Russian economy may become too dependent on the energy
sector and not sufficiently diversified has influenced monetary policy over the last
ten years. This policy was aimed at preventing the nominal appreciation of the rouble in order to maintain industrial
competitiveness. In this paper, using Rosstat and CHELEM databases, we study whether Russia
suffered the Dutch disease in 1999-2007. We do find some symptoms of it in Russia:
there was a strong real appreciation of the rouble, real wages increased,
employment decreased in manufacturing industries and rose in services sector.
However, there was no sign of a de-industrialisation, what contradicts the theory of
the Dutch disease. Indeed, industrial production increased significantly. Furthermore, the symptoms present
in Russia can be the consequences of other factors than the existence of natural
resources. The appreciation of the rouble in real terms came partly from the Balassa-Samuelson effect. The quick development
of the services was partly due to the fact that services were not put forward during the Soviet Union times. The outflow of labour
from the manufacturing industries resulted in inflow of labour in services sector rather than in the energy sector.

AT: China turn


The US and China cooperate over energy
ECP 13

[US China Energy Cooperation Project. 2013-11-20, U.S.-China Energy Cooperation Program (ECP) and Chinese
Institute of Electronics (CIE) Launch Joint U.S.-China Green Data Center Industrial Initiative,
http://www.uschinaecp.org/en/News/NewsDetail.aspx?nid=f74ebbdf-f9be-418a-8147-a12a7e31088b&cid=d952ba0f-3ba2-43728b26-ef635b67d638 //JWEIDEMAN]

On November 20, 2013, with support of the Chinese Ministry of Industry and
Information Technology (MIIT) and the U.S. Trade and Development Agency (USTDA),
the U.S.-China Energy Cooperation Program (ECP) and Chinese Institute of
Electronics (CIE) jointly launched the U.S.-China Green Data Center Industry
Partnership at the US China Green Data Center Workshop at the Xiyuan Hotel in Beijing. The
key related issues of the development of the energy efficient/green data center
sector, which includes: market, needs, opportunities, challenges, technology
solutions and best practices, evaluation methods and etc, have been addressed and discussed at
the workshop. At the workshop, the two sides on behalf of the participating U.S. and Chinese
industries, signed a MEMORANDUM OF UNDERSTANDING For Cooperation on Green
Data Center to promote US-China Industry cooperation in China's green data center sector. Senior
officials from the Department of Energy Efficiency and Resources Utilization of MIIT, USTDA, Foreign Commercial Service and the U.S.
Department of Energy China Office of the U.S. Embassy in Beijing witnessed the signing ceremony. Industry experts from Intel, UTC,
Caterpillar, Cisco, Emerson, Fuxing Xiaocheng, NewCloud, Neusoft, Huawei, Inspur, ZTE attended the workshop. The three-year joint program between ECP and CIE aims to provide valuable reference and living best practices for green data center development in China through deeply cooperation between both US and China industries. Specifically, these include: 1. Green data center technical guideline, technology catalogue, and green data center best practice portfolio. 2. Green data center related incentive plan, business model, monitoring and evaluation method and system and etc. 3. Green data center energy saving demonstration projects, needs assessment, and industry entry study etc. 4. Capacity building: green data center expert committee establishment, practical training, and certificate training and study tour.

The US and China are committed to cooperation over energy


Department of Energy 11

[United States Department of Energy. January 2011, U.S.-China Clean Energy Cooperation: A Progress Report by the U.S. Department of Energy http://www.us-chinacerc.org/pdfs/US_China_Clean_Energy_Progress_Report.pdf //jweideman]

The United States and the People's Republic of China have worked together on
science and technology for more than 30 years. Under the Science and Technology
Cooperation Agreement of 1979, signed soon after normalization of diplomatic relations, our two
countries have cooperated in a diverse range of fields, including basic research in
physics and chemistry, earth and atmospheric sciences, a variety of energy-related
areas, environmental management, agriculture, fisheries, civil industrial technology,
geology, health, and natural disaster planning. More recently, in the face of
emerging global challenges such as energy security and climate change, the United
States and China entered into a new phase of mutually beneficial cooperation. In
June 2008, the U.S.-China Ten Year Framework for Cooperation on Energy and the
Environment was created and today it includes action plans for cooperation on
energy efficiency, electricity, transportation, air, water, wetlands, nature reserves and protected areas. In November 2009,
President Barack Obama and President Hu Jintao announced seven new U.S.-China clean energy initiatives during their Beijing summit. In doing so, the leaders of the world's two largest energy producers and consumers affirmed the importance of the transition to a clean and low-carbon economy - and the vast opportunities for citizens of both countries in that transition. The following joint initiatives were announced in November 2009: U.S.-China Clean Energy Research Center. Scientists and engineers from both countries are working together to develop clean energy technologies, initially focusing on building energy efficiency, clean coal and clean vehicles. Both countries are contributing equally to $150 million in financial support from public and private sources over five years.
Electric Vehicles Initiative. This initiative includes the joint development of standards for charging plugs and testing protocols of
batteries and other devices, demonstration projects in paired cities to collect and share data on charging patterns and consumer
preferences, joint development of technical roadmaps, and public education projects. Energy Efficiency Action Plan. Both
governments are working together with the private sector to develop energy efficient building codes and rating systems, benchmark
industrial energy efficiency, train building inspectors and energy efficiency auditors for industrial facilities, harmonize test
procedures and performance metrics for energy-efficient consumer products, and exchange best practices in energy efficiency labeling systems. Energy innovation in one country accelerates clean energy deployment in all countries. And the combined research expertise and market size of the U.S. and China provide an unprecedented opportunity to develop clean energy solutions that will reduce pollution and improve energy security while enhancing economic growth globally.

AT: Oil shocks


Oil shocks don't collapse the economy
Fels, Pradhan, and Andreopoulos 11

[Joachim Fels, Manoj Pradhan, and Spyros Andreopoulos write for Morgan Stanley, an economic institution. March 4, 2011. This Time Is Different
http://www.morganstanley.com/views/gef/archive/2011/20110304-Fri.html //jweideman]

Will elevated oil prices generate a sharp slowdown? The recent run-up in oil prices
has investors worried that a sharp slowdown in global growth lies ahead. Worse,
with the oil market increasingly pricing the risk that the shock may be permanent - the Dec 2012 futures contract trades at around US$109 per barrel - there is even talk of the possibility of a
global recession. We think not, for two broad reasons. First, oil prices constitute
wealth redistribution, not wealth destruction. And second, this time is different: as
things stand, we think that the current oil shock is unlikely to harm growth much.
Higher oil prices redistribute wealth rather than destroying it. From a global
perspective, higher oil prices do not mean that wealth is destroyed. Rather, it is
redistributed - from net oil importers to net exporters (see below on The Effects of Oil Price Shocks).
While this wealth transfer can be substantial, much of it is, over time, recycled back into net oil-importer economies: Oil exporters will spend some of their newly gained oil wealth on imports from net importers;
hence, increased import demand replaces some of the domestic demand lost due to
the wealth transfer; Oil exporters will also purchase assets in net importer
economies, providing support for asset markets there. Even so, oil shocks have
always been redistributive - yet arguably they have done considerable damage to the global economy in the past.
The 1970s, when oil price spikes preceded two successive recessions, are held up as the prime examples of the harm that oil can do
to the economy. So, it is tempting to look at the effects of previous oil shocks in order to infer the consequences of the current one.
We would urge caution, however, as all else is not equal. Many things are different this time, so a given increase in the price of oil

Nature of the Shock: It is worth bearing


in mind that the increase in the oil price since September has been due to a mixture
of demand and supply influences. Much of the rise in the price of Brent since September from a level of about
should be less harmful now than it would have been in the 1970s: 1.

US$80 per barrel to about US$100 was due to demand, in our view, with the oil market repricing the global growth and inflation
trajectory. It is only the recent rise, due to events in the Middle East, that constitutes a supply shock. That is, supply has accounted for less than half of the oil price increase since September. Why does the distinction matter? Oil demand shocks due to strength in the real economy are endogenous; they are unlikely to derail growth. The price of oil merely acts like an elastic leash on the dog that is the global economy. If the dog surges ahead too quickly, the constricting effect of the leash will make sure it slows down - but it will not stop. Between 2003 and 2007, the real price of oil roughly doubled, with little evident harm to the global economy. But exogenous, supply-induced shocks tend to be stagflationary in nature.
In the dog metaphor, an oil supply shock could act as a jerk on the leash, which could bring the dog to a halt. 2. Initial Conditions
This recovery is still in relatively early stages and thus more robust than a late-cycle recovery. This is partly because personal
savings rates are higher early in the cycle than late in the cycle, providing more of a cushion for consumer spending. Also, corporate
profit margins are usually higher early on in the cycle than late in the cycle, and companies can thus more easily absorb the cost
push from higher oil prices. Most importantly, global monetary conditions are very loose, providing a cushion to the real economy. 3. Structure of the Economy: The oil intensity of global GDP is lower. That is, less oil is used today to produce a dollar of GDP. Hence, a given percentage increase in the inflation-adjusted price of oil will do less damage to the economy now. Labour markets are more flexible: The effect of an oil supply shock on employment and output is larger, the less real wages

adjust in response. More flexible labour markets make real wages more responsive to supply shocks, thus dampening the effects on
production and employment. Monetary policy is more credible: A more credible central bank needs to act less, all else equal, to
achieve a given objective. In the case of an oil shock, the implication is that monetary policy needs to do less harm to the real
economy to achieve a given dampening effect on inflation. 4. Policy Response in Net Importing Economies: 'Rational Inaction' Means
Central Banks Will Keep Rates on Hold The policy response to an oil shock is an important determinant of the overall economic
effect. If central banks were to tighten in response to inflation, the dampening effect of the initial oil shock on the economy would be
amplified. We have just argued, in very general terms, that a more credible monetary policy requires less of a response to achieve a
desired effect. In this particular case, however, we do not expect much of a response in the first place: in our view, most central
banks will not tighten policy aggressively in response to the recent surge in oil prices. This is likely to be true for both DM and EM
monetary authorities. Again, this reaction - or rather, the lack of one - is rational: it is virtually impossible to forecast whether the
current supply worries will abate or not, so a 'wait and see' stance makes sense. 5. Oil Exporters' Behaviour May Be Materially
Different Oil exporters will likely spend more of the wealth transfer than usual: With the risk of social unrest increasing,

governments will be inclined to increase spending and transfers in order to maintain stability. This suggests that a larger part of the
initial wealth transfer from net exporters to net importers will be reversed, over time, through goods imports of the former from the
latter. At current prices, oil producers will probably want to make up a meaningful part of the Libyan production shortfall over the
medium term - not least because doing so will generate additional revenue with which social stability can be bought. Our commodity
strategy team thinks that there is enough spare capacity to do so (see Crude Oil: MENA Turm-OIL, February 25, 2011).

Farms DA

1NC
Water pollution from runoff on US farms has avoided
regulation so far but continued exemption is not guaranteed
David C. Roberts 9, assistant professor in the Department of Agribusiness and

Applied Economics at North Dakota State University; Christopher D. Clark, associate


professor in the Department of Agricultural Economics at the University of
Tennessee; and Burton C. English, William M. Park and Roland K. Roberts, professors
in the Department of Agricultural Economics at the University of Tennessee,
Estimating Annualized Riparian Buffer Costs for the Harpeth River Watershed,
Appl. Econ. Perspect. Pol. (Winter 2009) 31 (4): 894-913, oxford journals
Since the passage of the Clean Water Act (CWA), largely unregulated nonpoint
sources (NPS) of pollution have contributed to an increasing share of the nation's water quality
problems (Copeland). While the amount of agricultural land in the United States has remained relatively constant over the past thirty years,
agricultural production has increased because of technological advances and increasingly fertilizer-intensive agricultural practices. These fertilizer-intensive practices have been a major contributor to a threefold increase in the nitrate load in the Mississippi River entering the Gulf of Mexico (Greenhalgh and Sauer). When the states and territories surveyed the nation's rivers and streams, they found 39% of the segments surveyed impaired for one or more uses. Agriculture was listed as a contributor to 48% of these impairments - more than twice as much as any other source (USEPA 2002).
Reducing environmental externalities associated with agriculture poses a number of difficult policy challenges. In the context of water quality,

NPS

runoff from agricultural lands has proven to be relatively resistant to


regulation. For example, the CWA imposes effluent limitations on point sources but not on NPS and explicitly exempts agricultural stormwater discharges from the definition of point sources (Houck). The void created by the absence of federal regulation has been filled with a variety of voluntary programs (Houck). More recently, it has been filled by the promotion of policies that would allow point

sources subject to the CWA's effluent limitations to satisfy some of these restrictions by leasing effluent reductions from NPS. This has come to be known as water quality trading (USEPA 2003b, 1996). States, attracted by the possibility of reducing pollution abatement costs, have implemented a number of water quality trading (WQT) programs (Breetz et al.). States have also stepped into the void by enacting statutes that address NPS pollution by requiring landowners to observe mandatory setback or buffer requirements along waterways. These statutes tend to be limited in their application to specific activities or waterways, or both. For example, while the Georgia Erosion and Sedimentation Control Act prohibits land-disturbing activities within 7.6 meters (25 feet) of the banks of any state water, it exempts a fairly extensive list of activities from the prohibition, including agricultural practices, forestry land

management practices, dairy operations, livestock and poultry management practices, construction of farm buildings (Georgia Department of Natural
Resources). North Carolina has adopted regulations designed to protect a 15.2-meter (50-feet) riparian buffer along waterways in the Neuse and Tar-Pamlico River Basins (North Carolina Department of Environment and Natural Resources). Virginia's Chesapeake Bay Preservation Act provides for 30.4-meter (100-feet) buffer areas along specifically designated Resource Protection Areas, but allows encroachments into buffer areas for agricultural and silvicultural activities under certain conditions (Chesapeake Bay Local Assistance Board). Riparian buffer strips - areas of trees, shrubs, or other vegetation along surface water bodies - have also played a prominent role in federal and state voluntary programs to reduce NPS pollution. For example,
the Conservation Reserve Program has supported the use of buffer strips since 1988 (Prato and Shi) and funds made available to states through the CWA's
Section 319 program are often used to subsidize buffer strip installation (Nakao and Sohngen). Buffer strips are proving an attractive policy option because
of their effectiveness in intercepting and removing nutrients, sediment, organic matter, and other pollutants before these pollutants enter surface water
and because of the other environmental benefits they provide, including improved terrestrial and aquatic habitat, flood control, stream bank stabilization, and esthetics (Qiu, Prato, and Boehm). NPS pollution has typically avoided the type of regulation imposed on point sources. However, its prominent role in the impairment of surface and ground water quality means that water quality cannot be meaningfully improved without addressing agricultural NPS pollutant loadings. The possibility that the costs of abating pollution from NPS are lower than costs of abating pollution from point sources (Faeth) may also make NPS an attractive target. In any event, buffer strips are proving to be a relatively popular means of controlling NPS discharges. Although agriculture has often been excluded from regulation, there is no guarantee that the agricultural sector will continue to enjoy this treatment. For example, North Carolina's Department of Water Quality drafted, but did not adopt, regulations that would have required all crop farms adjacent to streams to install buffer strips (Schwabe). In the absence of regulation of agricultural NPS, it seems likely

that alternatives to regulation will continue, if not increase. Many of these will promote the installation of buffer strips through subsidies of one form or

another. For example, the Conservation Reserve Enhancement Program, which subsidizes conservation-oriented practices by landowners, has established
a goal of installing 157,000 miles of riparian buffers, filter strips, and wetlands in states in the Chesapeake Bay watershed (Bonham, Bosch, and Pease).

IOOS leads to new policies to prevent harmful algal blooms--results in stricter regulation on non-point sources of nutrient
pollution like agriculture
Dr. Donald Anderson 8, Senior Scientist in the Biology Department of the Woods
Hole Oceanographic Institution and Director of the U.S. National Office for Marine
Biotoxins and Harmful Algal Blooms, PhD in Aquatic Sciences from MIT, Written
testimony presented to the House Committee on Science and Technology,
Subcommittee on Energy and Environment, July 10 2008,
http://www.whoi.edu/page.do?pid=8915&tid=282&cid=46007
These are but a few of the advances in understanding that have accrued from ECOHAB regional funding. Equally important are the discoveries that provide management tools to reduce the impacts of HABs on coastal resources. Management options for dealing with the impacts of HABs include reducing their incidence and extent (prevention), stopping or containing blooms (control), and minimizing impacts (mitigation). Where possible, it is preferable to prevent HABs rather than to treat their symptoms. Since increased pollution and nutrient loading may enhance the growth of some HAB species, these events may be prevented by reducing pollution inputs to coastal waters, particularly industrial, agricultural, and domestic effluents high in plant nutrients. This is especially important in shallow, poorly flushed coastal waters that are most susceptible to nutrient-related algal problems. As mentioned above, research on the links between certain HABs and nutrients has highlighted the importance of non-point sources of nutrients (e.g., from agricultural activities, fossil-fuel combustion, and animal feeding operations). The most effective HAB management tools are monitoring programs that involve sampling and testing of wild or cultured seafood products directly from the natural environment, as this allows unequivocal tracking of toxins to their site of origin and targeted regulatory action.

Numerous monitoring programs of this type have been established in U.S. coastal waters, typically by state agencies. This monitoring has become quite
expensive, however, due to the proliferation of toxins and potentially affected resources. States are faced with flat or declining budgets and yet need to
monitor for a growing list of HAB toxins and potentially affected fisheries resources. Technologies are thus urgently needed to facilitate the detection and
characterization of HAB cells and blooms. One very useful technology that has been developed through recent HAB research relies on species- or strain-specific probes that can be used to label only the HAB cells of interest so they can then be detected visually, electronically, or chemically. Progress has
been rapid and probes of several different types are now available for many of the harmful algae, along with techniques for their application in the rapid
and accurate identification, enumeration, and isolation of individual species. One example of the direct application of this technology in operational HAB
monitoring is for the New York and New Jersey brown tide organism, Aureococcus anophagefferens. The causative organism is so small and non-descript
that it is virtually impossible to identify and count cells using traditional microscopic techniques. Antibody probes were developed that bind only to A.
anophagefferens cells, and these are now used routinely in monitoring programs run by state and local authorities, greatly improving counting time and
accuracy. These probes are being incorporated into a variety of different assay systems, including some that can be mounted on buoys and left
unattended while they robotically sample the water and test for HAB cells. Clustered with other instruments that measure the physical, chemical, and
optical characteristics of the water column, information can be collected and used to make algal forecasts of impending toxicity. These instruments are
taking advantage of advances in ocean optics, as well as the new molecular and analytical methodologies that allow the toxic cells or chemicals (such as HAB toxins) to be detected with great sensitivity and specificity. A clear need has been identified for improved instrumentation for HAB cell and toxin detection, and additional resources are needed in this regard. This can be accomplished during development of the Integrated Ocean Observing System (IOOS) for U.S. coastal waters, and through a targeted research program on HAB prevention, control, and mitigation (see below). These are needed if we are to achieve our vision of future HAB monitoring and management programs - an integrated system that includes arrays of moored instruments as sentinels along the U.S. coastline, detecting HABs as they develop and radioing the information to resource managers. Just as in weather forecasting, this information can be assimilated into numerical models to improve forecast accuracy.

That wrecks farm productivity without reducing nutrient


pollution
Carl Shaffer 14, President of the Pennsylvania Farm Bureau, TESTIMONY TO THE
HOUSE COMMITTEE ON TRANSPORTATION AND INFRASTRUCTURE SUBCOMMITTEE
ON WATER RESOURCES AND ENVIRONMENT REGARDING NUTRIENT TRADING AND
WATER QUALITY, March 25 2014,
http://transportation.house.gov/uploadedfiles/2014-03-25-shaffer.pdf
Fifth, the underlying assumption that it is easy and inexpensive for farmers and nonpoint
sources to reduce nutrient loading is a myth. Concept of Water Quality Trading Farm Bureau policy supports
the concept of water quality trading; implicit in that is the notion that participation by farmers is voluntary and that the system reflects the realities of

agriculture. Farm Bureau has a long history of supporting market-based approaches to improving the environment. We have also encouraged states to
include trading in their toolbox to help implement state water quality programs because trading and offsets can reduce costs associated with achieving
environmental improvements. Even with that history of support, however, farmers and ranchers remain skeptical of trading programs in general and
those associated with water quality specifically, and for good reason. Farmers grow things and understand that nutrient enrichment is a predictable
outcome of all human activities not just farming and ranching activities. Farmers understand that agricultural activities like those in virtually every
other part of life, such as shopping malls, golf courses, residential areas to name just a few can affect the amount of nutrients that reach our waters. The
fact is, each and every one of us plays a role in water quality; we all contribute to nutrient loading, either directly or indirectly, through the food we consume, the products we purchase, and the way we live our lives. Unfortunately, EPA's environmental strategies too often focus more on affixing blame for problems or regulating some activity or person, rather than finding solutions that recognize and seek balance. EPA's toolbox is both limited and dominated by an approach that focuses

heavily on pollution prevention and reduction based on the concept of polluter pays. For conventional pollutants, this approach has resulted in costly

controls and restrictive permits on point sources. At the same time, there is an ongoing misperception that agriculture chronically over-applies nutrients. Nutrients, however, are not conventional pollutants - they are a combination of pollutants from point sources and pollution from nonpoint sources. The fact is, nutrients are critical for optimal productivity in agriculture
even though farmers and ranchers are striving for the best possible ecosystem
function. Managing nutrients is extremely complicated because there is not one
practice, technology or approach that can optimize nutrient utilization throughout
the environment. Therefore, we need policy options that are balanced. We must develop solutions that optimize outcomes. We all want: 1)
safe, affordable and abundant food, fiber and fuel; 2) vibrant and growing communities with jobs and expanding economic activity; and 3) fishable and
swimmable waters. The challenges presented by trading and offset programs are the complex interplay of economic scenarios that could play out over time when such programs are taken to their logical conclusions. For example, if regulatory offsets are required for any new development or
for expanding economic activity, one would expect a regulatory offsets process to trade low-value economic activity for high-value activity. In real life,
however, such a program would not be likely to require an old home to be torn down before a new home could be constructed. Likewise, the construction
and operation of a new manufacturing facility and the jobs inherent to that economic activity would not likely come at the expense of other high-value economic activity. But trading programs will allow tradeoffs and the result will undoubtedly be a shift in development activities out of lower value areas, likely rural areas and farmland, into high value urban areas. The downside of such an offsets program can be represented by simple math. For example, within an urban area, land

suitable for building a factory could be valued at $100,000 or more per acre, while land in the same geographic area suitable to produce corn or soybeans
could be valued at $10,000 per acre. In a market-based system, it would appear to be only rational to extinguish the environmental externalities
generated by the farmland to offset the externalities associated with the higher value economic activity of manufacturing. While this may be an extreme
example, the reality is that the nation has never used water quality as a mechanism to cap or, in some cases like the Chesapeake Bay, reduce economic activity. The long-run reality for farmers and ranchers would be that, over time, rural areas will have fewer and fewer means to sustain themselves. Trading and Offsets are Creatures of

State Statutes The Clean Water Act leaves the task of controlling water pollution largely to the states; it expressly recognizes, preserves and protects
the primary responsibilities and rights of States to prevent, reduce, and eliminate pollution [and] to plan the development and use of land and water
resources. Authorized federal involvement in state actions is carefully limited. Under no circumstances does the act authorize EPA to assume state
responsibility to develop a planning process or a Total Maximum Daily Load (TMDL) implementation plan. It is within these contexts that trading programs
are often contemplated. As such, states may implement trading and offsets programs established under state laws. In addition, states retain the flexibility
to choose both if and how to use trading in the implementation of state water quality programs. Nutrient Standards May Not be Affordable or Attainable
Without Trading Optimizing nitrogen and phosphorus utilizations through trading may hold potential, but there are significant scientific, market and
regulatory challenges. First, from a scientific standpoint, there is no direct relationship between agricultural nutrient management practices and nutrient
loss. If the relationship were direct, trading would be straightforward, transparent and enable orderly operations of markets. Second, under the Clean
Water Act, states establish and EPA approves water quality standards and criteria. States are currently feeling pressure from EPA to adopt default numeric
nutrient standards and criteria based on the level of nutrients found in pristine waters. Such an approach holds the prospect of establishing standards that force states to adopt costly control measures that, in the end, are not realistically attainable. If EPA is successful, cities, agriculture and other sources of nutrients will incur significant regulatory costs without any guarantee that water quality improvements will match the required level of investment. Restrictive state

standards that are not based on reference waters can be unachievable and require costly control and management measures. EPA and States Are
Imposing Barriers for Markets and Trading Achieving the environmental and economic goals of point source - nonpoint source (PS-NPS) water quality
trading depends on having understandable rules that clearly define what is being traded and the parameters of the exchange. Trading rules and
procedures establish who can trade, what is traded (credit definition), duration of a credit, baseline requirements (for calculating credits), accepted
procedures for calculating credits, how the trade occurs, trading ratios, verification, liability rules, and enforcement procedures. In theory, trading
assumes market participants have full information about the cost and effectiveness of their nutrient reduction options and can instantly and, at little-to-no cost, obtain information on credit market prices and quantities. However, in the real world people are faced with limited time, resources, skills and
acquaintance with markets. Complex rules and inadequate institutional design can result in poor buyer or seller participation, coordination failures and
lack of desired outcomes. (Shortle, 2013). In fact, ex-post assessments of PS-NPS water quality trading programs already in existence have generally
been negative about their performance. Most have seen little or no trading activity, with the expected role for nonpoint sources unrealized. A number of
reasons have been presented including a lack of trading partners (due to limited regional scale or underlying economics), inadequate regulatory
incentives, uncertainty about trading rules and practice performance, excessively high PS-NPS trading ratios (increasing the cost of nonpoint credits), legal
and regulatory obstacles (including liability concerns), high transaction costs, and participant unfamiliarity and inexperience. Pennsylvania's experience with water quality trading illustrates a number of the challenges I have mentioned. For example, the rules underlying Pennsylvania's nutrient credit trading program, created in large part in response to an EPA mandate to reduce pollution in the Chesapeake Bay watershed, are the product of a multi-year stakeholder negotiation process that was codified in regulation in 2010. However, shortly thereafter, EPA announced that it would undertake a review of the offset and trading program in each Chesapeake Bay jurisdiction. EPA's assessment included questions about whether or not Pennsylvania's agricultural trading baseline met the requirements of the TMDL, in spite of the fact that trades had already taken place under the program rules in place at the time. Further, the assessment included EPA's expectations that Pennsylvania would demonstrate that the existing baseline was sufficient to meet the TMDL, or otherwise make necessary adjustments to the baseline acceptable to EPA. In response to EPA's review, Pennsylvania has since proposed a number of possible changes to its trading program that have raised serious questions among existing and potential credit generators and users about what the rules governing the market for credits will look like going forward. Specifically, many are concerned about what happens to non-point source credit generators, primarily farmers, who have generated and sold credits under Pennsylvania's existing program, and who may have long-term
commitments to provide credits for years into the future. The uncertainty is not conducive to sustaining a successful, transparent, long-term water quality trading program. The Myths – It's Neither Easy Nor Inexpensive: It is often assumed that agriculture can supply credits less expensively than other nonpoint and point sources. Whether or not this is true depends heavily on the trading rules and procedures described previously. Baseline requirements represent one trading rule that has an important impact on agriculture's ability to be the low-price supplier of credits.
Baseline requirements establish the level of stewardship farmers and ranchers perform on a parcel of land before they are eligible to participate in the
trading program and actually produce credits for sale. Any abatement necessary to meet the baseline cannot be sold as credits, but is instead credited to
meeting the load allocation for agriculture. When baselines are more stringent than current practices, a farmer would only be willing to create and sell
credits if the expected credit price were high enough to cover the cost of meeting the baseline plus the cost of any measures taken to produce additional
abatement. This increases the cost of supplying credits, and reduces the amount of credits purchased by point sources. Current research suggests that
concerns about baseline requirements are well founded. Stephenson et al. (2010) found that when the baseline is more stringent than current practices,
agricultural credit costs for nitrogen can surpass costs per pound (for marginal abatement) for point sources because the baseline has claimed the lowest-cost pollutant reductions. Ghosh et al. (2011) found that Pennsylvania's baseline requirements significantly increased the cost of entering a trading
program, making it unlikely that nonpoint sources that could reduce nutrient losses for the lowest unit costs would enter the market. Wisconsin has
expressed concern that EPA's approach to defining baselines could obstruct agricultural sources' participation in trading programs and possibly impede
water quality improvements (Kramer, 2003). The impact of baseline requirements is a crucial matter and fundamental to the successful operation of any
trading program, though its impact is not unique. Any trading rule or requirement that is incorrectly developed can have similar effects: fewer nonpoint
source credits purchased by point sources, and total abatement costs for regulated sources higher than they could have been. As a regulatory agency,
EPA appears to have difficulty appreciating the realities of how markets function. The agency is not necessarily tasked with creating private markets and most people would probably agree that the agency has difficulty appreciating the realities of how real markets function. As a result, environmental markets are suffering from a significant credibility crisis. This ultimately results in skeptical farmers and ranchers who then take a cautious approach to nutrient trading. Regarding the cost of reducing nutrient loads, if it were easy and inexpensive for farmers and ranchers to reduce nutrient loadings, they would have already figured out a way to capture the benefit associated with incremental nutrients lost to the environment. Farmers today use some of the most advanced technology in the world to optimize their productivity. From precision application using 4R nutrient stewardship to GPS technology, farmers and ranchers are committed to improving their production efficiencies, a fact that allows them in turn to reduce their environmental footprint. 4R nutrient stewardship is an
effective concept that allows a farmer to use the right fertilizer source, at the right rate, at the right time and with the right placement to optimize nutrient
utilization, while precision agriculture is a farming system that uses technology to allow closer, more site-specific management of the factors affecting
crop production. For example, in precision agriculture, utilizing GPS and yield monitors, farmers can measure their output more precisely by matching
yield data with the location in the field. Special computer-driven equipment can change the rate at which fertilizers, seeds, plant health products and other
inputs are used, based on the needs of the soil and crop in a particular portion of a field. Farmers have embraced precision agriculture and the 4R
philosophy because it is an innovative and science-based approach that enhances environmental protection, expands production, increases farmer
profitability and improves sustainability. Conclusion: Your constituents want affordable and abundant food, fiber and fuel, and the members of Farm Bureau want the chance to provide them. Farmers are concerned about the environment. As technology evolves so do farmers. We take advantage of technology, new practices and programs in order to not only provide safe, affordable and abundant food, fiber and fuel, but also to protect our land, water and air resources.

Extinction
Richard Lugar 2k, Chairman of the Senate Foreign Relations Committee and Member/Former Chair of the Senate Agriculture Committee, calls for a new green revolution to combat global warming and reduce world instability, http://www.unep.org/OurPlanet/imgversn/143/lugar.html
In a world confronted by global terrorism, turmoil in the Middle East, burgeoning nuclear threats and other crises, it is easy to lose
sight of the long-range challenges. But we do so at our peril. One of the most daunting of them is meeting the worlds need for food
and energy in this century. At stake is not only preventing starvation and saving the environment, but also world peace and security.

History tells us that states may go to war over access to resources, and that poverty and famine have often bred fanaticism and terrorism. Working to feed the world will minimize factors that contribute to global instability and the proliferation of weapons of mass destruction. With the world population expected to grow from 6 billion people today to
9 billion by mid-century, the demand for affordable food will increase well beyond current
international production levels. People in rapidly developing nations will have the means greatly to improve their
standard of living and caloric intake. Inevitably, that means eating more meat. This will raise demand for feed grain at the same
time that the growing world population will need vastly more basic food to eat. Complicating a solution to this problem is a dynamic that must be better understood in the West: developing countries often use limited arable land to expand cities to house their growing populations. As good land disappears, people destroy timber resources and even rainforests as they try to create more arable land to feed themselves. The long-term environmental consequences could be disastrous for the entire globe. Productivity revolution: To meet the expected demand for food over the next 50 years, we in the United States will have to grow roughly three times more food on the land we have. That's a tall order. My farm in Marion County, Indiana, for example, yields on average 8.3 to 8.6 tonnes of corn per hectare, typical for a farm in central
Indiana. To triple our production by 2050, we will have to produce an annual average of 25 tonnes per hectare. Can we possibly

boost output that much? Well, it's been done before. Advances in the use of fertilizer and water, improved machinery and better tilling techniques combined to generate a threefold increase in yields since 1935; on our farm back then, my dad produced 2.8 to 3
tonnes per hectare. Much US agriculture has seen similar increases. But of course there is no guarantee that we can achieve those
results again. Given the urgency of expanding food production to meet world demand, we must invest much more in scientific
research and target that money toward projects that promise to have significant national and global impact. For the United States,
that will mean a major shift in the way we conduct and fund agricultural science. Fundamental research will generate the
innovations that will be necessary to feed the world. The United States can take a leading position in a productivity revolution. And our success at increasing food production may play a decisive humanitarian role in the survival of billions of people and the health of our planet.

AT Political Opposition
Their evidence overestimates the farm lobby's clout---the farm bill proves it's too weak to influence policies
Ron Nixon 13, NY Times, July 3 2013, Farm Bill Defeat Shows Agriculture's Waning Power, http://www.nytimes.com/2013/07/03/us/politics/farm-bill-defeat-shows-agricultures-waning-power.html
WASHINGTON – The startling failure of the farm bill last month reflects the declining clout of the farm lobby and the once-powerful committees that have jurisdiction over agriculture policy, economists and political scientists said this week. Although a number of factors contributed to the defeat of the bill – including Speaker John A. Boehner's failure to rally enough Republican support and Democratic opposition to $20 billion in cuts to the food stamps program – analysts said the 234 to 195 vote also illustrated the shift in the American population and political power to more urban areas. "There are a small number of Congressional districts where farming continues to carry much sway," said Vincent H. Smith, a professor of agricultural economics at Montana State University. "Especially in the House, the farm lobby has been substantially weakened." For much of American history, the agriculture sectors wielded tremendous political power. Farm groups were able to get key farm legislation passed by rallying millions of farmers in nearly every Congressional district. Influential farm state legislators like Representative Jamie L. Whitten of Mississippi, a Democrat who was chairman of the Appropriations Committee and its subcommittee on agriculture, brought billions in agriculture financing to their states and fought off attempts to cut subsidy programs despite pressure from both liberals and conservatives. Mr. Whitten died in 1995 after 53 years in Congress. But as Americans have moved to the cities and suburbs, farmers and lawmakers representing districts largely dependent on agriculture have seen their political muscle steadily decline. Just 2.2 million people now work in farming in the United States, or about 2.5 percent of the total work force. Farming now accounts for about 1 percent of gross national product, down from a high of about 9 percent in 1950. Only 40 lawmakers represent largely farming districts, according to research by Mr. Smith in 2006. He said that number was probably smaller today.

Political opposition is insufficient to stop environmental regulation---greenhouse gas regulations prove
Jonathan H. Adler 13, Prof of Law and Director of the Center for Business Law &
Regulation at Case Western Reserve University School of Law, Oct 3 2013,
Conservatives and Environmental Regulation,
http://www.volokh.com/2013/10/03/conservatives-environmental-regulation/
Anti-regulatory rhetoric may be pervasive, but federal environmental regulation has continued to expand, under Democratic and Republican presidents alike. Anti-regulatory conservatives have been able to stem the tide of regulatory initiatives, but only for a time. The failure to develop and advance non-regulatory alternatives to environmental problems has compromised efforts to constrain the EPA's regulatory authority. There are plenty of Americans who are suspicious of federal regulation, but they nonetheless prefer federal environmental regulation to no environmental protection at all. The failure of anti-regulatory conservatism is on display in the current fight over federal regulation of greenhouse gases. House Republicans oppose such regulation (and for good reason), but they have not put forward any alternatives (and many refuse to accept that climate change could be a problem). Nonetheless, federal regulation of GHGs marches on. The EPA just proposed another set of rules last month. Blanket opposition to federal GHG regulation failed to prevent (or even appreciably slow) such regulation, and time is running out. As GHG emission controls get imposed, and companies invest in the necessary control technologies, the political support for constraining EPA authority over GHGs will decline. Utilities may not want to give up coal or install costly carbon capture technologies, but once they've made these investments they will hardly want to see the regulations removed. If these rules are not stopped soon, it will be too late. This is a reason even those who refuse to accept the likelihood of climate change should consider alternatives to command and control regulation. Shouting "No" is insufficient to ensure success.

AT Labor Shortages
Farm labor shortages are a myth that repeatedly fails to pan out
John Carney 12, CNBC, More Data on The Phony Farm Labor Crisis, 30 Aug 2012, http://www.cnbc.com/id/48847903
It's become something of an annual tradition. Every summer, newspapers around the country roll out stories of a labor shortage on farms. The fruit is going to rot in the orchards, crops will go unpicked, agricultural communities will be devastated unless something is done, the stories predict. Here's a pretty typical version of the story, as told by the San Francisco Chronicle: But the American Farm Bureau Federation and its California chapter believe there is plenty of reason to worry. "There have been instances in which growers had to disc up whole crops because they didn't have the workforce to harvest," said Kristi Boswell, the Farm Bureau's director of congressional relations. She points to Georgia, where whole tomato fields were plowed under last year. "The workforce has been decreasing in the last two to three years, but last year it was drastic." And farmers are saying this year is even worse. In recent years, we've seen just how resilient this narrative really is. You'd think the idea of a labor shortage in the midst of an ongoing job drought would be laughed out of town. (Read more: When Are You Retiring?...How Does Never Sound?) But somehow the story keeps getting told and taken seriously. According to the Farm Labor Shortage Mythologists, American citizens are just too lazy and would rather go jobless than work on a farm. Here's how the Chronicle puts it: Growers have tried to hire more domestic workers and had hoped that with unemployment rates so high, they'd find local labor. "But domestic workers don't stick around to finish the job," Boswell said. "It's labor-intensive and often involves hand-picking in grueling conditions." Earlier this summer, as these stories started up once again, I tried to introduce some sanity into the discussion by pointing out that the dire farm labor shortage forecasts of 2011 had absolutely flunked the test of reality. As it turned out, 2011 was one of the best years on record for American farms. One of the criticisms of my reading of the farm data was that the data about national farm income might have been concealing more local crises. Perhaps the national numbers had come in well despite the alleged farm crisis, because fields of soy and corn in the Midwest had done well, while the apples of Washington, peanuts of Georgia, and California fruits and vegetables went unpicked. Not quite. State-by-state data released by the Department of Agriculture Wednesday has pretty much destroyed that critique. Let's start with California. The San Francisco Chronicle had warned: Farmers across California are experiencing the same problem: Seasonal workers who have been coming for decades to help with the harvest, planting and pruning have dropped off in recent years. With immigration crackdowns, an aging Mexican population, drug wars at the border and a weakened job market in the United States, the flow of migrants has stopped and may actually have reversed, according to the Pew Hispanic Center, a nonprofit, nonpartisan public policy research firm that has been studying the trend. So what happened? Farm profits, what the Ag Department calls net cash income, in California rose from $11.1 billion in 2010 to $16.1 billion in 2011, an eye-popping 45 percent growth. In Washington, the apple harvest was going to be devastated by a labor shortage. Farm profits instead rose from $1.98 billion to $3.14 billion, a 58 percent rise. Georgia was going to have a rancid harvest due to its farm labor shortage, according to The Georgia Report. I guess those peanuts picked themselves, because farm profits in the state rose from $2.5 billion to $2.6 billion. The mythological Arizona farm labor shortage was supposedly destroying its farm sector. Somehow or another, farm profits rose from $734 million to $1.3 billion. Not every state saw profits boom. Farms in Arkansas, Delaware, Hawaii, Louisiana, New Hampshire, North Carolina, Rhode Island and South Carolina did less well in 2011 than 2010. Alabama's farms saw net cash income fall off a cliff, from $1.2 billion to $773 million. But clearly this national epidemic of farm labor shortages just never happened.

Farmers can just raise wages to attract more workers
John Carney 13, CNBC, 5/24/13, Farmers Solve Labor Shortage by Raising Pay, http://www.thedailybeast.com/articles/2013/05/24/famers-solve-labor-shortage-by-raising-pay.html
Farm owners have responded to fears of a labor shortage by actually raising wages by a little bit. The Department of Agriculture reports: Farm operators paid their hired workers an average wage of $11.91 per hour during the April 2013 reference week, up 4 percent from a year earlier. Field workers received an average of $10.92 per hour, up 4 percent from a year earlier. Livestock workers earned $11.46, up 51 cents. The field and livestock worker combined wage rate, at $11.10 per hour, was up 48 cents from a year earlier. Hired laborers worked an average of 40.3 hours during the April 2013 reference week, compared with 39.2 hours a year earlier. Maybe someone should tell the Partnership for a New American Economy about this. It's just crazy enough that it may work! By the way, don't worry about this bankrupting the great American farmer. Farm profits are expected to rise by more than 13 percent this year, to more than double what they were as recently as 2009.

AT Brazil Solves
Brazil doesn't solve---lack of infrastructure
Alistair Stewart 13, South America Correspondent for DTN/The Progressive Farmer, 8/6/13, Brazilian Farming's Biggest Problem, http://www.dtnprogressivefarmer.com/dtnag/view/blog/getBlog.do;jsessionid=2F48BC2D3422FCE86B5624EA3DE307B4.agfreejvm1?blogHandle=southamerica&blogEntryId=8a82c0bc3e43976e014055b03e641466
Everybody knows the biggest problem facing Brazilian agribusiness. It's logistics. When the cost of transporting soybeans from the fields in central Mato Grosso to a China-bound ship reaches 40% of the commodity's value, you have a serious problem. Brazil more or less stopped heavy infrastructure investment in the 1970s. So when grain production began to ramp in the center-west, in areas thousands of miles from the sea, during the 1990s and 2000s, the roads, rail and port facilities eventually became overrun. The situation gradually deteriorated before descending into chaos over the last year. The freight cost from Sorriso (center-north Mato Grosso) to Paranagua port rose 50% this season,
reaching $3.19 per bushel at the peak of the soybean harvest, while ships were waiting for up to 70 days to load beans at
Paranagua. With demurrage costs (basically ship rental) at $15,000 to $20,000, that wait can be very expensive. "While we don't
resolve the logistics questions, we are inviting other players to enter into the game," said Julio Toledo Piza, chief executive at BrasilAgro, a corporate farm, at an industry conference to discuss logistics in Sao Paulo Monday. Brazil is the only major grain producer with substantial arable land available to exploit and so is in a great position to feed a growing world population over the next thirty years, as long as it works out a cheap way of delivering the grain, feed and meat. If not, alternative producers in South America, Africa and Asia will grow.
Jorge Karl, director of OCEPAR, the farm coops association in Parana state. Governors have belatedly woken up to the logistical
chaos that surrounds them and the federal government has recently given greater priority to infrastructure development. As a
result, a series of plans to improve infrastructure have started moving forward and progress has been made on key grain corridors,
such as the North-South railway, which will eventually connect the new eastern Cerrado soybean fields to ports in the North. "The
situation is vastly improved ... We hope that in six to seven years we can clear the logistics deficit," said Bernardo Figueiredo, head of EPL, the government's new infrastructure plan overseer. But while more attention is now being paid to the infrastructural shortcomings, farmer leaders point out that chronic delays are still the norm.

For example, farmers have been waiting for asphalt along the BR163 highway, which connects central Mato Grosso with northern
Amazon ports, for a decade. Work has progressed over the last two years, leading many to believe the Transport Ministry forecast
that it will be ready in 2014. However, just this week the Ministry admitted that the asphalt may only be completed in 2015, or
after. Similarly, an East-West railway that will connect the soy and cotton fields of western Bahia to the sea is due to be complete in
2015, but work is yet to start on many stretches and, realistically, beans will flow along this route in 2018 at the earliest. Many other projects have been similarly delayed or suspended due to problems with implementation. Faced with spiraling freight costs, the patience of farm leaders has been wearing thin for a while. "We

are told that things are improving but we can't wait. The inability of the government to deliver projects on time is unacceptable,"
according to Carlos Favaro, president of the Brazilian Soybean and Corn Growers Association (APROSOJA-MT). None of the major
new grain infrastructure projects will be ready for upcoming 2013-14 season, and with soybean area set to grow by 5% or more, the

logistical chaos could deepen next year. "We are going to have to muddle through next season," said Luis
Carlos Correa Carvalho, president of the Brazilian Agribusiness Association (ABAG). "Realistically, the situation is only likely to
improve in 2016-17," he added. The high logistics costs will likely slow growth in Brazilian grain production over the next couple of years, but there remains so much pent-up demand for logistics in Brazil that any new

export corridor will be inundated as soon as it opens. Speakers at the conference agreed that the fault for the current situation lies
with government incompetence, as there are ample funds to invest. "Credit is everywhere. The money isn't the problem. The
government has to allow it to be invested," said Ingo Ploger, president of the Business Council of Latin America (CEAL). So why is
government so sluggish? For years, the problem was a lack of political will. Simply, farm infrastructure was not a vote winner. More
recently, the realization that Brazil needs farm revenues to underpin the foreign accounts led President Dilma Rousseff to prioritize infrastructure. However, after 30 years of neglect, the know-how and systems to implement big rail, road and port projects just aren't there, explained EPL's Figueiredo. "We suffer because plans drawn up are often of poor quality and environmental and other authorities don't have expertise in assessing them in a timely manner. That's a big reason why processes are delayed," he explained to the conference. Meanwhile, corruption remains rife in Brazil's construction industry. As a result, suspicion of graft is widespread. That means government auditors are quick to suspend projects when suspicions arise. Until these problems are solved, Brazil will continue to have production costs 10% above its competitors in the U.S. and Argentina, noted CEAL's Ploger.

US is the world's key food producer
Brown 12 (Lester R. Brown, The world is closer to a food crisis than most people realise, guardian.co.uk, Tuesday 24 July 2012 07.21 EDT; Brown is the president of the Earth Policy Institute and author of Full Planet, Empty Plates: The New Geopolitics of Food Scarcity, due to be published in October)
The United States is the leading producer and exporter of corn, the world's
feedgrain. At home, corn accounts for four-fifths of the US grain harvest. Internationally, the US
corn crop exceeds China's rice and wheat harvests combined. Among the big
three grains – corn, wheat, and rice – corn is now the leader, with production well above that of wheat and nearly double that of rice. The corn plant is as sensitive as it is productive. Thirsty and fast-growing, it is vulnerable to both extreme heat and drought. At elevated temperatures, the corn plant, which is
normally so productive, goes into thermal shock. As spring turned into summer, the thermometer began to rise
across the corn belt. In St Louis, Missouri, in the southern corn belt, the temperature in late June and early July
climbed to 100F or higher 10 days in a row. For the past several weeks, the corn belt has been blanketed with
dehydrating heat. Weekly drought maps published by the University of Nebraska show the drought-stricken area
spreading across more and more of the country until, by mid-July, it engulfed virtually the entire corn belt. Soil
moisture readings in the corn belt are now among the lowest ever recorded. While temperature, rainfall, and
drought serve as indirect indicators of crop growing conditions, each week the US Department of Agriculture
releases a report on the actual state of the corn crop. This year the early reports were promising. On 21 May, 77%
of the US corn crop was rated as good to excellent. The following week the share of the crop in this category
dropped to 72%. Over the next eight weeks, it dropped to 26%, one of the lowest ratings on record. The other 74%
is rated very poor to fair. And the crop is still deteriorating. Over a span of weeks, we have seen how the more
extreme weather events that come with climate change can affect food security. Since the beginning of June, corn prices have increased by nearly one half, reaching an all-time high on 19 July. Although the world was hoping for a good US harvest to replenish dangerously low grain stocks, this is no longer on the cards. World carryover stocks of grain will fall further at the end of this crop year, making the food situation even more precarious. Food prices, already elevated, will follow the price of corn upward, quite possibly to record highs. Not only is the current food situation deteriorating, but so is the global food system itself. We saw early signs of the unraveling in 2008 following an abrupt doubling of world grain prices. As world food prices climbed, exporting countries began restricting grain exports to keep their domestic food prices down. In response, governments of importing countries panicked. Some of them turned to buying or leasing land in other countries on which to produce food for themselves. Welcome to the new geopolitics of food scarcity. As food supplies tighten, we are moving into a new food era, one in which it is every country for itself. The world is in serious trouble on the food front. But there is little evidence that political leaders have yet grasped the magnitude of what is happening. The progress in reducing hunger in recent decades has been reversed. Unless we move quickly to adopt new population, energy, and water policies, the goal of eradicating hunger will remain just that. Time is running out. The world may be much closer to an unmanageable food shortage – replete with soaring food prices, spreading food unrest, and ultimately political instability – than most people realise.

Ext UQ
Nutrient runoff from farms is currently unregulated
Ken Kirk 12, Executive Director at the National Association of Clean Water
Agencies, JD from Georgetown University Law Center and Masters in Environmental
Law from GWU Law School, June 20 2012, Is the Clean Water Act Broken and Can
We Fix It? http://blog.nacwa.org/is-the-clean-water-act-broken-and-can-we-fix-it/
Much has been written and will continue to be written about the Clean Water Act this
year as we celebrate the Act's 40th anniversary. I don't think anyone would argue the fact that the Act has been great for our waterways or that we would be much worse off without it. But now we've entered a much more difficult and complex period in the history of the Clean Water Act. Federal money continues to dry up. Costs continue to increase. And requirements have become more stringent. We also have many new challenges to deal with including climate change, altered weather patterns, and population growth, to name just a few. And still we're struggling with how to address wet weather and related runoff, particularly as this relates to nutrients – arguably one of our biggest clean water issues right now. And yet, nonpoint sources remain unregulated. How has this been allowed to continue? Let's look back in time for a minute. Before the Clean Water Act, discharge was largely unregulated. We know this was not a good thing. Then, during the first 20 years of the Act, the focus was on wastewater treatment from domestic and industrial point sources and the broad use of secondary treatment to accomplish the Act's goals. I believe this was the right thing to do at the time. Unfortunately, this entire time, nonpoint sources, including agricultural operations, have completely avoided regulatory responsibility for their share of the water pollution problem. If we continue to ignore this very major source of water quality impairment, then it will come at great cost to all taxpayers.

Despite water regulations, agricultural runoff is not restricted


Neila Seaman 13, Director, Sierra Club Iowa Chapter, IOWA LEADERS STILL NOT
PROTECTING WATER QUALITY,
http://iowa.sierraclub.org/CAFOs/IALeadersStillNotProtectingWaterQuality.htm
There are two sources of water pollution. Point sources are cities, towns and industries that treat their wastewater before it's piped into a receiving water resource. They are heavily regulated. Non-point sources are primarily agricultural runoff from fertilizer and manure and there are basically no regulations on those sources. The point sources and non-point sources pointing their fingers at each other as the problem has created a huge logjam that has spanned decades. As a result, the point sources are now highly regulated and the non-point sources remain unregulated.

Ext Food Insecurity -> War


Food shortages cause nuclear world war 3
FDI 12, Future Directions International, a research institute providing strategic analysis of Australia's global interests; citing Lindsay Falvey, PhD in Agricultural Science and former Professor at the University of Melbourne's Institute of Land and Environment, Food and Water Insecurity: International Conflict Triggers & Potential Conflict Points, http://www.futuredirections.org.au/workshop-papers/537-international-conflict-triggers-and-potential-conflict-points-resulting-from-food-and-water-insecurity.html
There is a growing appreciation that the conflicts in the next century will most
likely be fought over a lack of resources. Yet, in a sense, this is not new. Researchers
point to the French and Russian revolutions as conflicts induced by a lack of food. More
recently, Germany's World War Two efforts are said to have been inspired, at least in
part, by its perceived need to gain access to more food. Yet the general sense among those that
attended FDI's recent workshops, was that the scale of the problem in the future could be
significantly greater as a result of population pressures, changing weather, urbanisation, migration, loss of arable
land and other farm inputs, and increased affluence in the developing world. In his book, Small Farmers Secure Food, Lindsay
Falvey, a participant in FDI's March 2012 workshop on the issue of food and conflict, clearly expresses the problem and why countries across the globe are starting to take note. He writes (p.36), if people are hungry, especially in cities, the state is not stable: riots, violence, breakdown of law and order and migration result. Hunger feeds anarchy. This view is also shared by Julian Cribb, who in his book, The Coming Famine, writes that if large
regions of the world run short of food, land or water in the decades that lie ahead, then wholesale,
bloody wars are liable to follow.

He continues: An increasingly credible scenario for World War 3 is not so much a confrontation of super powers and their allies, as a festering, self-perpetuating chain of resource conflicts. He also says: The wars of the 21st Century are less likely to be
global conflicts with sharply defined sides and huge armies, than a scrappy mass of failed states, rebellions, civil strife, insurgencies,
terrorism and genocides, sparked by bloody competition over dwindling resources. As another workshop participant put it, people
do not go to war to kill; they go to war over resources, either to protect or to gain the resources for themselves. Another observed that hunger results in passivity not conflict. Conflict is over resources, not because people are going hungry. A study by the International Peace Research Institute indicates that where food security is an issue, it is more likely to result in some form of conflict. Darfur, Rwanda, Eritrea and the Balkans experienced such wars. Governments, especially in developed countries, are increasingly aware of this phenomenon. The UK Ministry of Defence, the CIA, the US Center for Strategic and International Studies and the Oslo Peace Research Institute, all identify famine as a potential trigger for conflicts and possibly even nuclear war.

Ext IOOS Causes Regulation


Improved ocean observation would allow for regulation of
agricultural runoff
Mark J. Kaiser 4, PhD, professor and director of the Research & Development Division at the Center for Energy Studies at Louisiana State University, and Allan G. Pulsipher, Professor of Energy Policy in the Center for Energy Studies and Professor in the Department of Environmental Studies at Louisiana State University, PhD in Economics from Tulane University, The potential value of improved ocean observation systems in the Gulf of Mexico, Marine Policy 28 (2004) 469-489, Science Direct
Waterways draining into the GOM transport wastes from 75% of US farms and ranches,
80% of US cropland, hundreds of cities, and thousands of industries located upstream of the GOM coastal zone. Urban
and agricultural runoff contributes large quantities of pesticides, nutrients, and fecal coliform
bacteria. Activities that have contributed or are still contributing to the degradation of coastal water conditions along the Gulf
Coast include the petrochemical, industrial, agricultural, power plants, pulp and paper mills, fish processing, municipal wastewater treatment, maritime shipping, and dredging. Nonpoint sources are difficult to regulate and currently have the greatest impact on the GOM coastal water quality. Nonpoint pollutant sources include agriculture, forestry, urban runoff, marinas, recreational boating, and atmospheric deposition. One of the greatest concerns for GOM coastal water quality is an excess of nutrients which can lead to noxious algal blooms, decreased seagrasses, fish kills, and oxygen-depletion events. Improved ocean observation systems are expected to allow environmental activity in the region to be better understood and monitored.

Regulation is currently impossible because it's hard to determine the cause of blooms---the plan allows for attribution that makes regulations possible
David Caron 7, Professor of Biological Sciences at USC, PhD in Biological

Oceanography from MIT and Woods Hole Oceanographic Inst., and Burt Jones,
Adjunct Professor of Biological Sciences at USC, PhD in Biological Oceanography
from Duke, Making Use of Ocean Observing Systems: Applications to Marine
Protected Areas and Water Quality, Sept 25-26 2007,
http://calost.org/pdf/resources/workshops/OceanObserving_Report.pdf
Recent summaries of available information have indicated that coastal ecosystems have witnessed a general increase
in the occurrence and severity of harmful and toxic algal blooms. The coastline of California is no exception,
with powerful neurotoxins such as saxitoxin and domoic acid now commonly observed throughout the state. Numerous factors have been implicated as possible contributors to these coastal algal blooms. Harmful blooms can result from natural, seasonal supply of nutrients to coastal waters during upwelling and from anthropogenic inputs of nutrients in river discharges and land runoff. Unfortunately, quantifying the contribution of the many potential sources of nutrients that can support algal blooms is a daunting task because of the difficulty of maintaining continuous observations in the ocean. Harmful algal blooms are ephemeral events that can develop quickly and dissipate before their causes can be adequately characterized. Efforts by municipalities, counties and the state to provide responsible environmental stewardship of coastal waters are often thwarted by the lack of sufficient observational capabilities to document water quality, let alone
determine the cause(s) of adverse events such as harmful algal blooms. Partnerships are desperately needed between ocean observing programs, research/academic institutions, coastal managers and monitoring programs (including both government and nonprofit) to grapple with the increasing number of environmental influences on algal population growth and toxin production in coastal marine ecosystems. Ocean observing systems provide a tool for monitoring, understanding and ultimately predicting harmful algal blooms. Sophisticated sensing instruments installed on piers, deployed from ocean buoys, or carried by mobile surface and underwater vehicles provide a constant presence in the ocean that assists scientists to detect impending or emerging events, and guide their sampling effort. Aquatic sensors can provide information on salinity (which can identify sources of freshwater input to the coastal ocean), temperature (which yields information on the physical structure of the water column, a primary factor affecting algal growth in nature), chlorophyll fluorescence (which documents the biomass of algae in the water) and dissolved oxygen (which indicates biological activity and ecosystem health). A variety of ever-improving nutrient sensors can quantify specific substances that may stimulate algal growth (e.g., ammonium, nitrate, phosphate). In addition, new sensors that can detect specific microorganisms and/or toxins produced by these species are under development and eventually will increase the capabilities of ocean observing systems. These observing systems fill a fundamental gap in our ability to document harmful or toxic events, and aid our attempts to attribute these events to specific environmental causes.

Ext Regulations Hurt Farms


Even a small link is enough---farms operate on thin margins
that make the cost of regulation devastating
Daniel R. Mandelker 89, Stamper Professor of Law, Washington University in St.
Louis, June 1989, Controlling Nonpoint Source Water Pollution: Can It Be Done,
Chicago-Kent Law Review Vol. 65, Issue 2,
http://scholarship.kentlaw.iit.edu/cgi/viewcontent.cgi?article=2776&context=cklawreview
Another obstacle to controlling nonpoint pollution is that the nonpoint source may be
unable to internalize the cost of the control or pass it on to consumers. 53 This
problem particularly arises with controls on agricultural nonpoint sources. These controls can be expensive in an industry marked by thin margins and low profitability. Nor are farmers, as an unorganized production group, able to pass
the costs of these controls on to consumers. 54 In contrast, nonpoint source land use controls applied to
urban development may not present this problem. Urban developers may be able to pass the cost of these controls on to their
consumers,55 and local governments can use density bonuses to offset the cost of controls necessary to reduce nonpoint pollution.

Regulations would be ineffective and wreck the economic viability of farms
AFT 13, American Farmland Trust's Center for Agriculture in the Environment,
August 2013, Controlling Nutrient Runoff on Farms,
http://www.farmland.org/documents/FINAL-ControllingNutrientRunoffonFarms.pdf
Direct regulation of nutrient runoff from farms is highly unlikely in the United States (Williams 2002). The geographic dimensions make federally designed, nationally uniform technology-based performance and emissions standards difficult to implement without a marked increase in budgeting for individual farm permitting, monitoring and enforcement. Local variations in weather, soil salinity, and soil erosion potential, leaching potential, and freshwater availability present further challenges to an effective national regulatory regime. Variations in crop type, production practices, livestock type and concentration, use of irrigation, tillage practices, sediment runoff and fertilizer runoff all contribute to the difficulty of one-size-fits-all regulation. Social factors like proximity to metropolitan area, and surrounding land use also influence farm practices. EPA has noted that a program of this breadth would make it very difficult to implement and enforce regulations. The economic dimensions of agriculture also pose barriers to regulation. Agriculture in the United States has vast economic value, yet is dispersed widely across the country and by landowner. Faced with the rising costs of inputs and equipment, the farm industry is quickly consolidating. Increased environmental regulation of farms may reduce their economic viability due to compliance costs. And the political dimensions, mentioned earlier, that make regulation of agriculture difficult include a consolidated voting block, strong lobbying and political pressure.

The cost of complying would be prohibitive


John Leatherman 4, Associate Professor in the Dept of Agricultural Economics at
Kansas State, PhD from the University of Wisconsin, Craig Smith, Graduate Research
Assistant in the Dept of Agricultural Economics at Kansas State, and Jeffrey
Peterson, Assistant Professor in the Dept of Agricultural Economics at Kansas State,

Aug 19-20 2004, An Introduction to Water Quality Trading,


http://www.agmanager.info/events/risk_profit/2004/LeathermanPeterson.pdf
A significant share of the current water quality problems in Kansas stem from nonpoint
source pollution, such as urban storm water and agricultural runoff (KDHE, 2002). Applying the
same command and control regulatory approach to nonpoint source pollution would
be problematic for a number of reasons. The widely distributed nature of nonpoint source
pollution would make regulation and compliance costs frighteningly prohibitive. It's also likely there would be significant public opposition to broad regulation of the many activities contributing nonpoint source
pollution.

Counterplans

Europe

1NC Europe CP
Counterplan: The European Union should fully develop and
implement an integrated European Ocean Observation System.
Integrated European observation is key to broader data---tech is world class, just a question of implementation.
EMD 2014
REPORT FROM THE JOINT EUROGOOS/EMODNET/EMB/JRC WORKSHOP AT THE
EUROPEAN MARITIME DAY IN BREMEN, The importance of an integrated end-to-end
European Ocean Observing System: key message of EMD 2014
http://eurogoos.eu/2014/06/09/eoos-at-emd-bremen-2014/

Ocean observations are essential for marine science, operational services and
systematic assessment of the marine environmental status. All types of activities in
the marine environment require reliable data and information on the present and future
conditions in which they operate. Many maritime economic sectors (e.g. oil and gas exploration,
maritime transport, fisheries and aquaculture, maritime renewable energy) directly benefit from easily
accessible marine data and information in several ways: improved planning of operations, risk
minimization through increased safety, improved performance and overall reduced cost. Other activities, such as deep sea mining and marine biotechnology, also benefit from specialized deep-sea observations that were not feasible until recently. The complexity and high density of human activities in European seas and oceans result in a high demand for marine knowledge in the form of data, products and services to support marine and maritime activities in Europe, stressing the need for an integrated European approach to ocean observation and marine data management (Navigating the Future IV, European Marine Board 2013). While Europe already has a relatively mature ocean observing and data management infrastructure capability, this is largely fragmented and currently not addressing the needs of multiple stakeholders. Mechanisms for coordinating existing and planned ocean observations using a system approach are needed for more integrated, efficient and sustained observations under the framework of a European Ocean Observing System (EOOS) following international practice (systems developed by USA, Australia and Canada) and the call of the EurOCEAN 2010 Conference Declaration. The integration of different
national and local marine data systems into a coherent interconnected whole which
provides free access to observations and data, as pursued by the European Marine Observation and
Data Network (EMODnet) is of key importance for maritime sectors like fisheries, the
environment, transport, research, enterprise and industry. However, much work still needs to
be done in close collaboration with end-users, in particular industry, to further develop EMODnet
into a fully functional, fit for purpose gateway to European marine data and data products taking into account
requirements of multiple users. There is a need for science-industry partnerships to stimulate innovation and
develop a successful EOOS that will further enhance the contribution of marine observations to economic activities relevant for Blue Growth in Europe. Innovative technologies, developed in collaboration between research scientists and the industry, have given several solutions during the past years for more robust, multi-parametric and systematic observations. This, in turn, is leading to new and more reliable operational services that support a wide range of maritime economic activities: fisheries and aquaculture, offshore oil and gas, marine renewable energy, maritime transport, tourism etc. Other services address the sectors of marine safety, climate and weather applications, as well as marine environmental assessment. At the end of the marine observations, data-to-knowledge cycle, activities and tools are needed to create added value products for specific stakeholders, including the wider public, such as the European Atlas of the Seas which allows professionals,
students and anyone interested to explore Europe's seas and coasts, their environment, related human activities
and European policies. At the same time, it is critical to evaluate whether we are monitoring/observing what we
actually need. Regional assessments such as performed by the newly established EMODnet sea-basin checkpoints
could provide relevant information, among others to advise Member States about requirements for essential and
optimal observation capability.

Infrastructure for data sharing already exists


Paris and Schneider et al 2013

Jean-Daniel Paris, Nadine Schneider, Laboratoire des Sciences du Climat et de l'Environnement/IPSL, Data sharing across the Atlantic: Gap analysis and development of a common understanding, http://www.coopeus.eu/wp-content/uploads/2013/12/Carbon-data-sharing-gaps-06-12-2013-D31.pdf
The global need to develop large, cross-continental environmental datasets has been recognized. To address this issue, COOPEUS is a joint US National Science Foundation (NSF) and European Commission FP7 (in the frame of the European Strategy Forum on Research Infrastructures, ESFRI) supported project initiated in September 2012. Its main goal is creating a framework to develop interoperable e-infrastructures across several environmental and geoscience observatories in Europe and the US. The National Ecological Observatory Network (NEON in the US, www.neoninc.org) and the Integrated Carbon Observatory System (ICOS in the EU, http://www.icos-infrastructure.eu/) are two of these governmental supported observatories. Here, the data products from these two observatories are centered around greenhouse gas (GHG) concentration, carbon and energy flux observations, and the surface micrometeorology surrounding these measurements. The objective of COOPEUS is to coordinate the integration plans for these carbon observations between Europe and the US. Even though both ICOS and NEON have the ability to collaborate on effective issues, we fully recognize that this effort cannot be effectively accomplished without the engagement of many other partners, such as: National Oceanic and Atmospheric Administration's Global Monitoring Division (US NOAA GMD), the Group on Earth Observations (GEO, www.earthobservations.org) and the Group on Earth Observations System of Systems (GEOSS), World Meteorological Organization (WMO, www.wmo.int), the Belmont Forum (www.igfagcr.org), the NSF-supported EarthCube (earthcube.ning.com) and DataOne (www.dataone.org) projects, and a wide variety of regional-based flux networks (i.e., AmeriFlux, Fluxnet). Of course, as COOPEUS continues to advance, this list of partners is not exclusive and is expected to increase. COOPEUS aims to strengthen and complement these partnerships through a variety of governance mechanisms, some informal and others formalized (e.g. Memorandum of Understanding), tailored to each individual organizational governance structure. Several of these organizations (mentioned above) have a history of collaborating and sharing of data. In this historical context, we also have recognized what has worked, existing limitations, and what can be improved in terms of data sharing and interoperability. This COOPEUS work task is building upon these relationships and working history to strengthen these approaches and collaboration.

Yes Data Sharing


Europe will share data with the US---solves the aff
COOPEUS 2013
Project CoopEUS WP 4 Ocean Observation Deliverable D4.1- Fact Finding Report
http://www.coopeus.eu/wp-content/uploads/2013/11/D4_1_Facts_Finding_10_03_FIN.pdf
Sharing scientific data can be the "playing field" for an easy and profitable commencement of structured scientific collaboration while complementing respective expertise. This approach also meets the recent trend of EU and US funding agencies moving towards more openness and more sharing. The analysis and definition of the basic principles and requirements for a shared data policy can be a first step toward long-term transatlantic cooperation and can pave the way for data integration on a global scale, diminishing significant misalignment on data issues. Besides, the adoption and implementation of common data standards can facilitate the development of infrastructure and data infrastructures that will be largely interoperable. The international ICT collaborative frame for the interoperability of scientific data and infrastructures has to be leveraged to take advantage of continuous progress and specialized skills. Parallel initiatives and projects at global and continental scale have to be exploited with the aim of supporting the ongoing networking process of ocean scientists and ICT engineers and specialists - European (ODIP, EUDAT, iCORDI, GENESI-DEC, GEOwow, G-POD, TATOO) and worldwide (Research Data Alliance, DAITF, http://www.daitf.org/), to mention some. The present development status of the research infrastructures for ocean observations, OOI and EMSO, enables, as they are now, outlining data sharing practices and protocols across borders and disciplines, although with some limitations. Limiting factors for easy and extensive data use outside the respective data production frame and user community can be misalignment of vocabularies and ontology, metadata incompleteness and heterogeneities, and non-standardized QC/QA practices. These issues shall be the subject of deepening discussion in CoopEUS to achieve reciprocal understanding and smooth data interoperability and exchange.
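
To illustrate the vocabulary-misalignment problem the card identifies, here is a minimal sketch in which two observatories' local variable names are mapped onto one shared standard. The CF standard names are real conventions; the local field names, records, and mapping table are hypothetical illustrations, not anything specified by COOPEUS:

# Illustrative sketch of vocabulary alignment for data interoperability.
# The CF standard names are real; every local alias below is invented.
CF_ALIASES = {
    "sea_water_temperature": {"TEMP", "seaTemp", "water_temp_c"},
    "sea_water_salinity": {"PSAL", "salinity_psu"},
}

def to_cf(field_name: str) -> str:
    """Map a local variable name onto its CF standard name, if known."""
    for cf_name, aliases in CF_ALIASES.items():
        if field_name == cf_name or field_name in aliases:
            return cf_name
    raise KeyError(f"no CF mapping for {field_name!r}: vocabularies are misaligned")

record_us = {"TEMP": 11.2, "PSAL": 35.1}                   # hypothetical US feed
record_eu = {"water_temp_c": 11.3, "salinity_psu": 35.0}   # hypothetical EU feed
for record in (record_us, record_eu):
    print({to_cf(name): value for name, value in record.items()})

Both feeds print under the same standard names, which is the practical meaning of "largely interoperable" infrastructures; an unmapped name fails loudly, mirroring the limiting factors the card lists.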

EU Tech Good
European sensors are cutting-edge and real-time-no distinction between them and the aff.
COOPEUS 2013

Project CoopEUS WP 4 Ocean Observation Deliverable D4.1- Fact Finding Report


http://www.coopeus.eu/wpcontent/uploads/2013/11/D4_1_Facts_Finding_10_03_FIN.pdf
EMSO will pioneer in delivering multidisciplinary real-time data from the sea by providing data from the ocean surface through the water column to the benthos and sub-seafloor. It will be facilitated, in part, by advancements made on Eulerian (fixed) observatory infrastructures during the ESONET Network of Excellence and EuroSITES programs and the potential follow-on project Fixed-point Open Ocean Observatories (FixO3). And it will work alongside other key ESFRI infrastructures such as Euro-Argo, SIOS and EPOS. The implementation and management of EMSO infrastructure are based on a European transnational integration effort at the EU Member State government level and on the commitment of research institutions from thirteen countries. The long-term sustainability and the close interaction with the reference European scientific community, coordinated through the upcoming ESONET-Vision association, is ensured through the establishment of a European Consortium (ERIC). Science objectives guide observatory design and dictate the ability to collect data without employing traditional means, such as research vessels. However, the latter are intended to complement the EMSO network, which will be serviced by a combination of research vessel and ROV operations provided by EU Member States in a coordinated network established by EUROFLEETS. EMSO nodes include cabled and stand-alone observatories with moorings and benthic instruments, communicating in real time or in delayed mode, and serviced through periodic maintenance cruises. Mobile systems, such as ROVs and AUVs, will improve the potential of some of the observatories by expanding their spatial extent. EMSO nodes are conceived as an integration of local and regional seafloor and water-column in situ infrastructures equipped for both generic and science-oriented approaches (Ruhl et al., 2011). Generic sensor module: While not all scientific questions can be addressed by each individual infrastructure, a specific set of variables measured at all EMSO sites and depths is considered, including temperature, conductivity (salinity), pressure (depth), turbidity, dissolved oxygen, ocean currents, and passive acoustics. These generic sensors can be used to directly address a wide range of geo-hazard warning and scientific applications related to understanding natural and anthropogenic variation and the possible impacts of climate change. In the observatory setting, these data can then be relayed back to shore via seafloor cable (real time) or satellite telemetry (delayed time). The use of salinity and conductivity sensors spaced regularly along strings and additional ADCP coverage can capture themes related to ocean physics. These include understanding wind-driven and deep-ocean circulation, planetary waves, and interactions between the Benthic Boundary Layer and the seabed.
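
The "generic sensor module" can be pictured as one common record type carried at every node. The variable list comes from the card above; the record layout, field names, units, and values are our invention, not an EMSO specification (currents and passive acoustics are omitted for brevity):

# Sketch of an EMSO-style generic observation record; hypothetical layout.
from dataclasses import dataclass
from typing import Literal

@dataclass
class GenericObservation:
    site: str                     # e.g., an EMSO node identifier
    timestamp: str                # ISO 8601, UTC
    temperature_c: float
    conductivity_s_m: float       # salinity is derived from conductivity
    pressure_dbar: float          # proxy for depth
    turbidity_ntu: float
    dissolved_oxygen_ml_l: float
    telemetry: Literal["realtime_cable", "delayed_satellite"]

obs = GenericObservation(
    site="hypothetical-node-01", timestamp="2014-06-01T12:00:00Z",
    temperature_c=4.1, conductivity_s_m=3.3, pressure_dbar=2500.0,
    turbidity_ntu=0.2, dissolved_oxygen_ml_l=6.8,
    telemetry="realtime_cable",   # seafloor cable (real time) vs. satellite (delayed)
)
print(obs)

Because the same fields exist at every site and depth, cross-site comparisons and geo-hazard triggers can be written once against this record rather than per observatory.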

Solves Environment
Europe solves ecosystem management-academic, government, and industry partnerships
ESF 2014

European Science Foundation, Arctic 2050: Towards ecosystem-based management


in a changing Arctic Ocean, March 12 2014, http://www.esf.org/media-centre/extsingle-news/article/arctic-2050-towards-ecosystem-based-management-in-achanging-arctic-ocean-1011.html
About 150 scientists, policy makers and members of industry are gathering today at the 4th European Marine Board Forum in Brussels to discuss how best to manage the consequences of a changing Arctic Ocean for human health and well-being. The European Marine Board has convened this flagship event in collaboration with the European Polar Board, working in association with the European Science Foundation, in the knowledge that industry and science must work together to achieve sustainable management of resources such as fishing and oil and gas exploration while, at the same time, protecting and conserving the Arctic environment. Dramatic changes, largely attributed to anthropogenic activity, have taken place in the Arctic in recent decades. These changes include melting of glaciers and sea ice, altered oceanic current patterns, movement and accumulation of contaminants, and range shifts in many species. As a result of these changes the Arctic region is being transformed, with wide-ranging impacts and opportunities including the potential for ice-free shipping routes in the future, increased activity in oil and gas exploration, changes to Arctic fisheries and biodiversity, and impacts on residents' livelihoods. "At present we are unprepared for the environmental and societal implications of increased human access to the Arctic that will come with the receding ice," explains Professor Peter Haugan from the University of Bergen and vice-Chair of the European Marine Board. "We have not fully anticipated the consequences of an increase in activities like hydrocarbon exploration, mineral extraction, bioprospecting and pelagic and demersal fisheries." The 4th EMB Forum, recognized as an official ICARP III event, promotes the need for an ecosystem-based management approach in the Arctic Ocean, in order to adapt to and manage rapid environmental change and commercial exploitation, supporting a key recommendation of the recently published Arctic Biodiversity Assessment.[1] Moderated by David Shukman, BBC Science Editor, forum sessions include "Living with a Changing Arctic Ocean," "Utilizing and Managing Arctic Ocean Resources," and a session on Arctic Ocean observation, building on the European Marine Board's 2013 call for urgent action to increase our observational capacity across the entire Arctic Ocean (EMB, 2013).[2] Speakers will include industry representatives from Shell, the International Association of Oil and Gas Producers and the International Maritime Organisation. The forum provides a platform to address ecosystem-based management in the Arctic Ocean by stimulating dialogue across sectors to aid common understanding, collaborative actions and sustainability targets. Later today the forum will culminate with an open panel discussion on the roles of industry and science in achieving sustainable management of the Arctic Ocean.

Solves Acidification
Europe can develop acidification solutions-experimentation. Data is sufficient.
Riebesell 2009

Ulf, Professor of Marine Biogeochemistry and Head, Biological Oceanography, Leibniz Institute of Marine Sciences, "Experimental Approaches in Ocean Acidification Research," http://www.tos.org/oceanography/archive/22-4_gattuso.pdf
Progress in our ability to make reliable predictions of the impacts of ocean acidification on marine biota critically depends on our capability to conduct experiments that cover the relevant temporal and spatial scales. One of the greatest challenges in this context will be scaling up biotic responses at the cellular and organismal level to the ecosystem level and their parameterization in regional ecosystem and global biogeochemical models. EPOCA employs a suite of experimental approaches to assess marine biota's ocean acidification sensitivities, ranging from single-species culture experiments to field surveys of the distribution of potentially sensitive taxa in relation to seawater carbonate chemistry (Figure B5). Each of these approaches has its distinct strengths and weaknesses. Bottle and microcosm experiments allow for high replication of multiple CO2/pH treatments under well-controlled experimental conditions, thereby yielding high statistical power. However, they typically lack the genetic and species diversity, competitive interaction, and trophic complexity of natural systems, which complicates extrapolation of results to the real world. Field observations, on the other hand, cover the full range of biological interactions and environmental complexities, but they generally provide only a snapshot in time, with little or no information on the history of the observed biota and environmental conditions prior to sampling. The interpretation of field data in terms of dose/response relationships is often obscured by multiple environmental factors simultaneously varying in time and space and by the lack of replication.

Mesocosms, experimental enclosures that are designed to approximate natural conditions and that allow manipulation of environmental factors, provide a powerful tool to link small-scale single-species laboratory experiments with the observational and correlative approaches applied in field surveys. A mesocosm study has an advantage over standard laboratory tests in that it maintains a natural community under close to natural, self-sustaining conditions, taking into account relevant aspects of natural systems such as indirect effects, biological compensation and recovery, and ecosystem resilience. Replicate enclosed populations can be experimentally manipulated and the same populations can be sampled repeatedly over time. Further advantages of flexible-wall, in situ enclosures are that a large volume of water and most of its associated organisms can be captured with minimal disturbance. The mesocosm approach is therefore often considered the experimental ecosystem closest to the real world, without losing the advantage of reliable reference conditions and replication. To improve understanding of the underlying mechanisms of observed responses, which are often difficult to infer from mesocosm results, and to facilitate scaling up mesocosm results, large-scale enclosure experiments should be closely integrated with well-controlled laboratory experiments and modeling of ecosystem responses. Taking advantage of a recently developed mobile, flexible-wall mesocosm system, EPOCA will conduct a joint mesocosm CO2 perturbation experiment in the high Arctic, involving marine and atmospheric chemists, molecular and cell biologists, marine ecologists, and biogeochemists. A total of nine mesocosm units, 2 m in diameter and 15 m deep, each containing approximately 45,000 liters of water, will be deployed in Kongsfjord off Ny-Alesund on Svalbard. The carbonate chemistry of the enclosed water will initially be manipulated to cover a range of pCO2 levels from pre-industrial to projected year-2100 values (and possibly beyond) and will be allowed to float freely during the course of the experiment to mimic variations naturally occurring due to biological activity. The high level of scientific integration and cross-disciplinary collaboration of this study is expected to generate a comprehensive data set that lends itself to analyses of community-level ocean acidification sensitivities and ecosystem/biogeochemical model parameterizations.
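
As a worked illustration of the design described above: the nine mesocosms can be assigned evenly spaced pCO2 targets from a pre-industrial level to a projected year-2100 level. The endpoint values and even spacing are our illustrative choices; the card specifies only nine units covering that range:

# Hypothetical assignment of nine mesocosms across a pCO2 gradient from
# pre-industrial (~280 uatm) to a projected year-2100 level (~1000 uatm).
# The endpoints and even spacing are our illustrative choices; the card
# only says nine units covering that range.
N_UNITS = 9
PCO2_MIN, PCO2_MAX = 280.0, 1000.0   # uatm: pre-industrial to projected 2100

step = (PCO2_MAX - PCO2_MIN) / (N_UNITS - 1)
treatments = [round(PCO2_MIN + i * step) for i in range(N_UNITS)]
for unit, pco2 in enumerate(treatments, start=1):
    print(f"mesocosm {unit}: target pCO2 = {pco2} uatm")
# -> 280, 370, 460, ..., 1000 uatm across the nine enclosures

In the actual experiment the enclosed water then floats freely, so these would be initial targets rather than clamped setpoints.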

User Fees

1NC Commercialization CP
Text
The United States Federal Government should adopt a user fee system of
commercialization for the Integrated Ocean Observing System.

Observation One: Competition


"Its" is possessive-Commercialization is not USFG exploration or development
Nguyen and Kim 08 (Ngan L.T. Nguyen, graduate from the University of

Natural Sciences, Vietnam and pursuing Masters in Computer Sciences at the


University of Tokyo, and Jin-Dong Kim, Senior Research Associate at University of Manchester, Project Lecturer at University of Tokyo, PhD and Masters in Computer Science at Korea University, Department of Computer Science, University of Tokyo. 2008, Licensed under the Creative Commons Attribution-Noncommercial-Share Alike
3.0 Unported, Exploring Domain Differences for the Design of Pronoun Resolution
Systems for Biomedical Text, page 631.
http://www.nltg.brighton.ac.uk/home/Roger.Evans/private/coling2008/cdrom/PAPERS
/pdf/PAPERS079.pdf)
The combination of C_netype and P_semw features exploits the co-occurrence of the semantic type of the candidate antecedent and the context word, which appears in some relationship with the pronoun. This combination feature uses information similar to the semantic compatibility features proposed by Yang (Yang et al., 2005) and Bergsma (Bergsma and Lin, 2006). Depending on the pronoun type, the feature extractor decides which relationship is used. For example, the resolver successfully recognizes the antecedent of the pronoun "its" in this discourse: "HSF3 is constitutively expressed in the erythroblast cell line HD6, the lymphoblast cell line MSB, and embryo fibroblasts, and yet its DNA-binding activity is induced only upon exposure of HD6 cells to heat shock," because HSF3 was detected as a Protein entity, which has a strong association with the governing head noun "activity" of the pronoun. Another example is the correct anaphora link between "it" and "the viral protein" in the following sentence, which the other features failed to detect: "Tax, the viral protein, is thought to be crucial in the development of the disease, since it transforms healthy T cells in vitro and induces tumors in transgenic animals." The correct antecedent was recognized due to the bias given to the association of the Protein entity type and the governing verb "transform" of the pronoun. The experimental results show the contribution of the domain knowledge to pronoun resolution, and the potential for combining such knowledge with the syntactic features.

Parse features (parg): The combinations of the primitive features of grammatical roles significantly improved the performance of our resolver. The following examples show the correct anaphora links resulting from using the parse features: "By comparison, PMA is a very inefficient inducer of the jun gene family in Jurkat cells. Similar to its effect on the induction of AP1 by okadaic acid, PMA inhibits the induction of c-jun mRNA by okadaic acid." In this example, the possessive pronoun "its" in the second sentence corefers to "PMA," the subject of the preceding sentence. Among the combination features in this group, one noticeable feature is the combination of C_parg, Sdist, and P_type, which contains the association of the grammatical role of the candidate, the sentence-based distance, and the pronoun type. The idea of adding this combination is based on Centering theory (Walker et al., 1998), a theory of discourse successfully used in pronoun resolution. This simple feature shows the potential of encoding centering theory in machine learning features, based on the parse information.
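
The combined features the card describes can be pictured as simple pairings of primitive features on each (candidate, pronoun) pair. This is our sketch, not the authors' code; the feature names follow the card, and the example values come from its HSF3 and Tax sentences:

# Minimal sketch of the combined-feature idea: pairing the candidate
# antecedent's named-entity type (C_netype) with the pronoun's governing
# context word (P_semw). The extraction logic and values are illustrative.
def combine(c_netype: str, p_semw: str) -> str:
    """Build one combined feature string for a (candidate, pronoun) pair."""
    return f"C_netype={c_netype}|P_semw={p_semw}"

# "its DNA-binding activity": candidate HSF3 is a Protein entity and the
# pronoun's governing head noun is "activity"; "it transforms healthy T
# cells": governing verb "transform".
features = [combine("Protein", "activity"),
            combine("Protein", "transform")]
print(features)

A learner can then reward pairings like (Protein, activity) or (Protein, transform) that co-occur with true antecedents, which is how the resolver picks HSF3 and Tax in the card's examples.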

Observation Two: Net Benefits-The CP solves the case and avoids spending-related arguments-Creating a system of user fees incentivizes commercialization
Woll-OCEANS Conference-9

MTS/IEEE Biloxi - Marine Technology for Our Future: Global and Local Challenges
The role of commercialization in IOOS and the regional associations
http://ieeexplore.ieee.org.ezproxy.uky.edu/xpls/icp.jsp?arnumber=5422377#authors
Since its inception, IOOS has operated in much the same way as federal
government agencies traditionally have - set policy, identify requirements, prioritize
within those requirements, secure federal funding, and allocate the funding via
the RAs, which in turn fund the organizations that actually perform the required
tasks. The organizations funded by the RAs to perform the desired tasks have
included universities, research institutions, and private sector companies. This has
resulted in a loose organizational structure, in which federal (and sometimes state)
agencies provide policy, coordination, and funding; RAs and their member
organizations conduct the science, process data, and operate observing and other
systems; products and services are generated and provided (usually free of charge)
to end users. Fig. 1 illustrates this functional structure. Ideally, funding for these
programs follows the traditional trajectory: a large sum at the beginning to start
the program up, and then some fraction of that annually to support ongoing
Operations and Maintenance (O&M). Fig. 2 depicts this traditional funding profile.
While significant progress has been made recently, the unfortunate fact is that
funding for both initial start up costs and O&M have been less than anticipated. As a
result, worthy new projects remain on the sidelines, and viable observing assets
funded previously are beginning to be decommissioned or inactivated due to
inadequate O&M funding. In response to this situation, some have called for a
retreat from the IOOS organizational concept and a return to individual efforts by
member organizations. Others retain faith in the IOOS concept, but argue that the
enterprise simply needs to be more effective at articulating the requirements and
hazards of inaction to the appropriate members of Congress. This paper
proposes a third option - the embrace of commercialization as a program
objective for IOOS, the RAs, and the oceanography enterprise. The active
participation of universities and research institutions in IOOS and the RAs has been
a cornerstone of their success to date and is a logical extension of previous
practices in the academic arena. The avoidance of redundancy in effort and the
benefits of cooperation have long been recognized in establishing and operating
successful programs in the academic community, and participation in IOOS by
university and research institutions has been high. However, the participation
of private sector businesses has been less widespread. A partial list of
reasons would include the lack of a clear answer to why it is in a company's interest
to participate, a business environment that is generally more competitive than
cooperative, and a desire to protect proprietary information. Unfortunately, the
current structure and functionality of the RAs does little to address or
correct those concerns, and in many ways actually exacerbates them. The
linchpins to this proposal are the embrace of commercialization using private

sector businesses as an organizational goal and the imposition of


appropriate fees on the end users for the products and services they
receive. Aside from attracting more business members to IOOS and the RAs, the
imposition of fees would immediately establish a two-way market feedback
mechanism between the end users and the IOOS/RA community. This feedback
mechanism is the most effective means available to help identify those projects that
truly add value and address significant end user needs - if no one is willing to pay
for a particular product or service (or pay enough to cover the costs of generation),
then by definition it is not worth the money being spent on it.
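
The funding mechanics in the card can be made concrete with a back-of-the-envelope model. Every figure below is hypothetical; the card specifies only the structure - seed funding starts projects, user fees sustain O&M, and the federal O&M dollars that fees displace become seed money for new starts:

# Back-of-the-envelope user-fee model for the Woll proposal. All numbers
# are invented for illustration; the card only gives the structure.
annual_om_cost = 2_000_000     # $/yr to operate and maintain one observing product
paying_users = 400             # end users willing to pay for the product
cost_recovery_target = 1.0     # 1.0 = fees cover all O&M ("ideally all")

annual_fee = annual_om_cost * cost_recovery_target / paying_users
freed_federal_funds = annual_om_cost * cost_recovery_target
print(f"fee per user: ${annual_fee:,.0f}/yr")                       # $5,000/yr
print(f"federal O&M freed for new starts: ${freed_federal_funds:,.0f}/yr")

The market-feedback point falls out of the same arithmetic: if fewer users will pay than the break-even number, the product is, by the card's definition, not worth its O&M cost.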

A2: CP Illegitimate
---Literature supports the counterplan and makes it predictable. Our Woll evidence contrasts the plan with the CP, which proves it's germane and educational.
---Tests "its," which is a core source of negative ground on the topic.
---Net benefits check abuse and provide a germane policy warrant for voting negative. They shouldn't get to pick and choose which parts of the plan to defend.
---Doesn't waste the 1AC-The affirmative still sets the initial ground for the debate. The negative must still find some aspect of the affirmative plan with which to compete. The negative isn't obligated to run arguments that were preempted in the 1AC. It's fairer to let both teams partially determine ground for the debate.
---Don't trivialize the debate-If the difference between the plan and the counterplan is large enough to generate a net benefit, then it's worth debating. This argument isn't unique, as many affirmatives are only a small departure from the status quo.
---Risk-Reward-The affirmative generates offense from the way their plan would be implemented. That reward should be offset by allowing the negative to test the plan.
---Punishment doesn't fit the crime-The judge should evaluate theory like extra-topicality. The counterplan should be judged outside their jurisdiction and a lost option for the negative to advocate.

A2: Conditionality
---Real World-Policy makers do consider multiple options at once. Their argument guts one of the core elements of policy discussion.

---Best policy justifies-Multiple options make it more likely that the best policy will be found. The role of the judge is to endorse the best policy at the end of the round. If a conditional counterplan has been proven to be the best policy, it's perverse not to allow it to be endorsed.

---Education-Argument breadth has benefits. If depth were the only value, teams wouldn't be allowed to debate more than one advantage or disadvantage per round. Exploring the range of issues on a subject is also intellectually important.

---Time limits aren't an answer
A. Time is finite in debate. Running one argument inherently trades off with another.
B. Other arguments make this non-unique. Multiple topicality arguments, two-card disads, or kritiks equally distort time.
C. Creating time pressure and making time-based decisions is an inherent part of debate strategy. It's an acceptable part of all other debate arguments.

---Counterplans don't introduce unique complexity into the round. The counterplan may just be a minor alteration of the plan. Disadvantages also raise multiple issues.

---Permutations justify-Retaining the status quo as an option is reciprocal to the affirmative's ability to advocate the plan or permutation.

---Conditionality is reciprocal to the affirmative's ability to select a case. Since the affirmative selects the ground for the debate, they enjoy a huge preparation advantage. Allowing hypothetical negative arguments helps to defeat this edge.

---Advocacy concerns aren't decisive.
A. In the real world, policies are attacked from a variety of perspectives. In debate there is only one negative team, so to encompass the true range of potential counter-affirmative advocacy, multiple positions must be allowed.
B. Most debate practice isn't consistent with the advocacy paradigm. Strategic concessions by the affirmative and permutations allow the affirmative to advocate multiple positions.

---Not a voting issue. Emphasis on punishment incentivizes a race to the bottom, discouraging substantive debates.

2NC A2 Perm-Do the CP


---Severs "its"-Extend our Kim evidence-Commercialization is not USFG exploration or development. Severance permutations are illegitimate because no counterplan would compete if the 2AC could pick and choose which parts of the plan to defend.
---Severs "increase"-Exploration doesn't increase unless the private sector fills in via user fees. The CP doesn't fiat an expansion of exploration. We read solvency evidence that commercialization would naturally expand the program.
---Privatization is distinct from the federal government
Barbier 7 (Carl, US District Judge, TIEN VAN COA, ET AL VERSUS GREGORY
WILSON, ET AL CIVIL ACTION NO: 07-7464 SECTION: J(1) UNITED STATES DISTRICT
COURT FOR THE EASTERN DISTRICT OF LOUISIANA 2007 U.S. Dist. LEXIS 87653)
However, in their motion to remand, Plaintiffs argue that as an independent
contractor, P&J is not an employee of the federal government, and consequently
does not enjoy derivative immunity and cannot invoke the FTCA. Plaintiffs cite
United States v. New Mexico in support of the notion that private contractors,
whether prime or subcontractors, are not government employees nor are they
agents of the federal government. 455 U.S. 720, 102 S. Ct. 1373, 71 L. Ed. 2d 580
(1982). According to the Court, "[t]he congruence of professional interests between
the contractors and the Federal Government is not complete" because "the
contractors remained distinct entities pursuing private ends, and their actions
remained [*4] commercial activities carried on for profit." Id. at 740; see also Powell
v. U.S. Cartridge Co., 339 U.S. 497, 70 S. Ct. 755, 94 L. Ed. 1017 (1950).

---That explodes limits and allows thousands of private actors to become individual plan mechanisms. It destroys core generics like agent counterplans or politics and explodes the neg research burden. That's a key internal link to clash, which builds research skills and decision-making.
---Independently, the word "substantially" means that the government must play the main role.
CFR No Date (Code of Federal Regulations, Subpart 3.1 - Safeguards, http://www.acquisition.gov/far/html/Subpart%203_1.html)

(3) Participating substantially means that the official's involvement is of significance to the matter. Substantial participation requires more than official responsibility, knowledge, perfunctory involvement, or involvement on an administrative or peripheral issue. Participation may be substantial even though it is not determinative of the outcome of a particular matter. A finding of substantiality should be based not only on the effort devoted to a matter, but on the importance of the effort. While a series of peripheral involvements may be insubstantial, the single act of approving or participating in a critical step may be substantial. However, the review of procurement documents solely to determine compliance with regulatory, administrative, or budgetary procedures does not constitute substantial participation in a procurement.

A2: Perm-Do Both


Links to our spending-based net benefits, or it severs out of "its"

2NC Solvency
The CP solves the case
A. Economic incentives-User fees encourage the private sector to take on a larger role in financing IOOS-fills in current funding gaps-That's Woll from the 1NC
B. Re-allocation-Commercialization allows the government to spend scarce resources on other IOOS-related projects-solves better than the aff
Woll-OCEANS Conference-9
MTS/IEEE Biloxi - Marine Technology for Our Future: Global and Local Challenges
The role of commercialization in IOOS and the regional associations
http://ieeexplore.ieee.org.ezproxy.uky.edu/xpls/icp.jsp?arnumber=5422377#authors
A major side benefit to this construct is that it provides a second
significant source of funding for the enterprise. Obviously, end users are not
going to pay fees for a prospective product or service, so there will remain a need
for government funding to start projects up. However, the fees should be set at a
level that will provide a large portion (ideally all) of the O&M funding required
to sustain projects over time. This second source of funding has the net effect
of freeing up previous government O&M funds to be re-applied to start other new
projects - such that the government funding truly becomes a type of seed funding,
helping worthy projects get off the ground and turning them over to operational
organizations. Fig. 3 depicts how the funding profile could change over time under
this proposal.

C. This is the most effective funding strategy


Woll-OCEANS Conference-9

MTS/IEEE Biloxi - Marine Technology for Our Future: Global and Local Challenges
The role of commercialization in IOOS and the regional associations
http://ieeexplore.ieee.org.ezproxy.uky.edu/xpls/icp.jsp?arnumber=5422377#authors
SECTION IV: CONCLUSION
The IOOS program offers an unparalleled opportunity for all members of the
oceanography enterprise to take part in a collaborative effort - one that provides the
best possible stewardship of taxpayer dollars while helping advance practical
scientific knowledge about the world's oceans. With the breadth and scope of
challenges facing the ocean environment, such an approach is not only
prudent - it offers the only reasonable chance of success in the future
fiscal environment.

Quite simply, the federal government will not be able to fund every
worthy project, so it is incumbent on IOOS and the oceanography
enterprise to make the most of every dollar. That means cooperating to make
choices from an enterprise perspective, making the hard choices to eliminate once
promising projects that have not panned out as expected, and being willing to
challenge and change habits that have been successful in years past.

D. Their authors support the CP


U.S. IOOS SUMMIT REPORT 13

A New Decade for the Integrated Ocean Observing System


http://www.ioos.noaa.gov/about/governance/summit2012/usioos_summit_report.pdf
Step 8. Increase Advocacy. There is a need to develop and maintain advocacy for
U.S. IOOS. Challenge. U.S. IOOS is a line item in the NOAA budget, but it remains a
largely unfunded federal mandate. The existing approach of individual agency
ocean observing programs addressing agency-unique missions with uncoordinated agency budgets is inadequate. This fragmented Federal approach lies at the heart of
the U.S. IOOS challenge to thrive. Advocacy will develop naturally if stakeholders
and users are actively engaged and their requirements are being met. But a
proactive advocacy strategy is also needed. There are many users that have come
to rely on U.S. IOOS products without providing any support for the system. The
ocean observing community is not adept at turning that supportive relationship into
advocacy for continued or expanded funding. Significant, well-qualified human
resources are necessary to maintain effective user engagement. U.S. IOOS must
recognize the importance of this process and support implementation of a
user engagement infrastructure in order to meet the vision of U.S. IOOS
for the next decade.

A2: Obstacles to Commercialization


The CP's restructuring solves
Woll-OCEANS Conference-9
MTS/IEEE Biloxi - Marine Technology for Our Future: Global and Local Challenges
The role of commercialization in IOOS and the regional associations
http://ieeexplore.ieee.org.ezproxy.uky.edu/xpls/icp.jsp?arnumber=5422377#authors
SECTION III: ADDITIONAL CHANGES TO OPTIMIZE ENTERPRISE EFFORT
Commercializing the products and services provided to end users by IOOS and its
member organizations would provide a clear definition of successful projects - those
that can garner sustainment O&M funding from user fees. However, for the
enterprise to be most effective and most efficient, one further step must be
considered. Specifically, a realignment of organizational responsibilities to take
maximum advantage of the strengths of the differing types of member
organizations would yield additional benefits. There are four broad categories of
IOOS participants: government, academic (university/research institution), business,
and RA/IOOS staff. A summary of each of their strengths, weaknesses, and proposed
roles is provided in Table 1. These proposed roles are reflected graphically in Fig. 4,
which depicts a more structured and hierarchical organization than is currently in
place. In this type of structure, the government actors (federal and state) would
provide policy, coordination, and funding. The academic sector would focus on
science, research, and providing high quality review and technical advice to
regulators and other government actors. Businesses would focus on execution and
serve as the bridge to the end user. The IOOS and RA staffs would coordinate
between all of the levels, provide project oversight, and administer a data
repository. This is admittedly a significant realignment - one which would ask some
member organizations to forgo activities at which they have been very successful.
However, when approached from an enterprise perspective, the benefits become
clear. Table 2 summarizes some of the major cost benefit comparisons between the
existing organizational structure and that proposed in this paper.

Drones

Drones/Robots Solvency
T-REX AUVs are effective, high-tech, and solve data collection in all areas of the globe
McGann et al 8

[Conor McGann, Frederic Py, Kanna Rajan, Hans Thomas, Richard Henthorn, Rob McEwen, Monterey Bay
Aquarium Research Institute. 2008, A Deliberative Architecture for AUV Control, http://ieeexplore.ieee.org/stamp/stamp.jsp?
arnumber=04543343 PDF //jweideman]

Oceanography has traditionally relied on ship-based observations. These have recently been augmented by robotic platforms such as Autonomous Underwater Vehicles (AUVs) [1-3], which are untethered, powered mobile robots able to carry a range of payloads efficiently over large distances in the deep ocean. A common design relies on a modular tube-like structure with propulsion at the stern and various sensors, computers and batteries taking up the bulk of the tube (Fig. 1). AUVs have demonstrated their utility in oceanographic research in gathering time-series data by repeated water-column surveys [4] and detailed bathymetric maps of the ocean floor in areas of tectonic activity [5,6], and have performed hazardous under-ice missions [7]. Typically AUVs do not communicate with the support ship or shore while submerged and rely on limited stored battery packs while operating continuously for tens of hours. Current AUV control systems [8] are a variant of the behavior-based Subsumption architecture [9]. A behavior is a modular encapsulation of a specific control task and includes acquisition of a GPS fix, descent to a target depth, driving to a given waypoint, enforcement of a mission depth envelope, etc. An operator defines each plan as a collection of behaviors with specific start and end times as well as maximum durations, which are scripted a priori using simple mission planning tools. In practice, missions predominantly consist of sequential behaviors with duration and task-specific parameters, equivalent to a linear plan with limited flexibility in task duration. Such an approach becomes less effective as mission uncertainty increases. Further, the architecture offers no support to manage the potentially complex interactions that may result amongst behaviors, pushing a greater cognitive burden on behavior developers and mission planners. This paper describes an automated onboard planning system to generate robust mission plans using system state and desired goals. By capturing explicit interactions between behaviors as plan constraints in the domain model and using goal-oriented commanding, we expect this approach to reduce the cognitive burden on AUV operators. Our interest in the near term is to incorporate decision-making capability to deal with a range of dynamic and episodic ocean phenomena that cannot be observed with scripted plans. The remainder of this paper is laid out as follows. Section II lays out the architecture of our autonomy system, section III details the experimental results to date, related work follows in section IV with concluding remarks in section V.

II. THE T-REX ARCHITECTURE
T-REX (Teleo-Reactive EXecutive) is a goal-oriented system, with embedded automated planning [14,15] and adaptive execution. It encapsulates the long-standing notion of a sense-deliberate-act cycle in what is typically considered a hybrid architecture where sensing, planning and execution are interleaved. In order to make embedded planning scalable, the system enables the scope of deliberation to be partitioned functionally and temporally and ensures the current state of the agent is kept consistent and complete during execution. While T-REX was built for a specific underwater robotics application, the principles behind its design are applicable in any domain where deliberation and execution are intertwined. Fig. 2 shows a conceptual view of a Teleo-Reactive Agent. An agent is viewed as the coordinator of a set of concurrent control loops. Each control loop is embodied in a Teleo-Reactor (or reactor for short) that encapsulates all details of how to accomplish its control objectives. Arrows represent a messaging protocol for exchanging facts and goals between reactors: thin arrows represent observations of current state; thick arrows represent goals to be accomplished. Reactors are differentiated in three ways: functional scope, indicating the state variables of concern for deliberation and action; temporal scope, indicating the look-ahead window over which to deliberate; and timing requirements, the latency within which this component must deliberate for goals in its planning horizon. Fig. 2, for example, shows four different reactors. The Mission Manager provides high-level directives to satisfy the scientific and operational goals of the mission: its temporal scope is the whole mission, taking minutes to deliberate if necessary. The Navigator and Science Operator manage the execution of sub-goals generated by the Mission Manager. The temporal scope for both is on the order of a minute even as they differ in their functional scope. Each refines high-level directives into executable commands depending on current system state. The Science Operator is able to provide local directives to the Navigator. For example, if it detects an ocean front it can request the navigation mode to switch from a Yo-Yo pattern in the vertical plane to a Zig-Zag pattern in the horizontal plane, to have better coverage of the area. Deliberation may safely occur at a latency of 1 second for these reactors. The Executive provides an interface to a modified version of the existing AUV functional layer. It encapsulates access to commands and vehicle state variables. The Executive is reasonably approximated as having zero latency within the timing model of our application since it will accomplish a goal received with no measurable delay, or not at all; in other words it does not deliberate.
AUV tech provides effective mapping capabilities


Yoerger et al. 7

[Dana R. Yoerger, Michael Jakuba, and Albert M. Bradley (Woods Hole Oceanographic Institution), and Brian Bingham (Franklin W. Olin College of Engineering). 1/1/2007, Techniques for Deep Sea Near Bottom Survey Using
an Autonomous Underwater Vehicle, http://business.highbeam.com/437280/article-1G1-156721474/techniques-deep-sea-nearbottom-survey-using-autonomous //jweideman]

This paper reports navigation algorithms that enable an underwater vehicle to accomplish fully autonomous scientific surveys in the deep sea. These algorithms allow the vehicle to determine its position, to bottom-follow (maintain a constant height above seafloor terrain) and avoid obstacles, and to autonomously focus on the highest-value parts of a survey. Scientific exploration of the deep sea has traditionally been performed using inhabited submersibles, towed vehicles, and tethered remotely operated vehicles (ROVs). Autonomous underwater vehicles (AUVs) have begun to replace these vehicles for mapping and survey missions. Autonomous vehicles complement the capabilities of these existing systems, offering superior mapping capabilities, improved logistics, and improved utilization of the surface support vessel. AUVs are particularly well suited to systematic preplanned surveys using sonars, in situ chemical sensors, and cameras in the rugged deep-sea terrain that is the focus of many scientific expeditions. Inhabited submersibles and ROVs remain the only option for manipulation tasks such as sampling, deploying and recovering experiments on the seafloor, detailed inspection, and servicing subsea instruments; however, high-resolution maps from AUVs can facilitate these tasks.

Figure 1 shows the Autonomous Benthic Explorer (ABE), a 6,000 m autonomous underwater vehicle that our team has been developing and deploying for fine-scale quantitative survey and mapping of the seafloor. ABE can survey at constant depth or bottom-follow even in rugged terrain, and it can autonomously determine its position and drive tracklines with a precision on the order of several meters. ABE carries a variety of sensors, including scanning and multibeam sonars; a magnetometer; a digital still camera; two sets of pumped conductivity and temperature probes; an acoustic Doppler current profiler (ADCP); several chemical sensors for hydrothermal plume mapping; and occasional mission-specific instrumentation. ABE's shape and thruster placement allow it to maintain control over a wide range of speed, and to stop or back up if necessary to avoid obstacles. ABE descends to the seafloor with the aid of a descent weight. ABE glides in a controlled spiral trajectory to ensure that it reaches the desired starting point without consuming significant battery energy. After reaching the seafloor and performing a series of checks, ABE releases its descent weight to become neutrally buoyant and begins its survey. Throughout the dive, including descent, ABE uses acoustic long-baseline (LBL) transponder navigation and, when in range of the bottom (< 300 m), bottom-lock acoustic Doppler measurements to determine its position and velocity. A dive can consist of a mix of hydrothermal plume survey at constant depth, sonar and magnetics survey following the seafloor (at heights of 50-200 m), and digital photography (height of 5 m). ABE usually surveys until its batteries are depleted (between 15 and 30 hours depending on sensor payload and terrain). At the end of its dive, ABE releases its ascent weight to become positively buoyant and returns to the surface. The remainder of this report is organized as follows: Sect. 2 summarizes scientific survey tasks that have motivated our AUV work, Sect. 3 reports an algorithm for acoustic positioning, Sect. 4 reports methods for terrain-following and obstacle avoidance, Sect. 5 reports a technique for automated nested survey, and Sect. 6 presents a brief summary and conclusion.

2 Precisely Navigated, Coregistered AUV Surveys
Proximity to the seafloor, precise navigation, robust control, and coregistered sensors permit an AUV to characterize the seafloor and the near-bottom environment with complementary sensing modalities on the meter scale. This section summarizes scientific work in which ABE-derived bathymetric maps, magnetics maps, digital photos, and hydrographic maps have played critical enabling roles. Meter-scale bathymetric and magnetic maps made using ABE have provided geologists and geophysicists with new perspectives on important seafloor processes. Combined magnetics and bathymetric maps show crustal magnetization, which permits the age and thickness of lava flows to be determined. Combined maps have also been used to identify volcanic features such as lava flow units [1], delimit their fronts, and estimate their thicknesses [2, 3]. Meter-scale bathymetric maps show tectonic features such as faults with great clarity, even enabling them to be resolved into multiple components [4]. In other cases, these maps have revealed the relationship between tectonic features and morphology, such as volcanic domes [3] and hydrothermal vents [1]. ABE bathymetric maps have proved to be of sufficient detail and precision for one collaborator to reconstruct the tectonic history of a rift valley by computationally removing faults [5]. The result revealed a dome-like structure from which the valley evolved. On a recent cruise to the Atlantis Massif, detailed renderings of faults and the hydrothermal structures provided critical clues as to the mechanisms controlling the hydro-geology at the newly discovered Lost City hydrothermal vent site [6]. Digital photographs of the seafloor from ABE have provided details of lava flow types and effusion rates [3], sediment cover, and the distribution of benthic organisms. Water column data from ABE yields indications of hydrothermal plume activity and has been used to estimate heat flux from known hydrothermal vent sites, and to locate undiscovered sites on the seafloor. To estimate the heat flux from vent fields on the Juan de Fuca Ridge in the Northeast Pacific (47° 54.0' N, 129° 10.0' W) [7], ABE measured temperature, salinity, and three-axis water velocity while repeatedly executing a tight grid pattern above the field [8]. Recently ABE located and preliminarily characterized several previously unmapped hydrothermal sites on the Eastern Lau Spreading Center (ELSC) south of Tonga (21° 08.0' S, 175° 12.0' W) [9], and on the Southern Mid-Atlantic Ridge (SMAR) north of Ascension Island (7° 57.0' S, 14° 22.0' W) [10]. In each case, we started with clues provided by towed systems that indicated a vent site within several kilometers. ABE then executed a three-dive sequence [9, 10] of grid patterns at increasingly finer scales and increasingly close to the seafloor. To plan each dive, the scientific party carefully scrutinized the data from the previous dive along with any available ancillary data. These vent prospecting missions capitalized on ABE's ability to conduct precisely navigated surveys at scales of meters to kilometers, to operate over rugged terrain, and relied on nearly all of ABE's sensing modalities. Figure 2 shows tracklines from a sequence of dives designed to locate and survey a vent site on the ELSC along with a sampling of the variety of data products acquired and used to plan each stage of the dive sequence. ABE mapped plume activity (temperature, optical backscatter, and reduction-oxidization potential (eH) [11]) to pinpoint the locations of plumes emanating from the field, built fine-scale bathymetric maps of the vent fields and surrounding environment, and finally photographed the vent structures and animal populations. The remainder of this paper presents the underlying algorithms that enabled ABE to perform this work.
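
The long-baseline (LBL) navigation described above reduces, at its core, to trilateration: acoustic travel times to transponders at known seafloor positions are converted to ranges, and the vehicle solves for its own position. A minimal two-dimensional sketch of that principle, assuming an ideal sound-speed model (this is our illustration, not the paper's Sect. 3 algorithm):

# Minimal 2-D trilateration sketch of LBL acoustic positioning: ranges to
# transponders at known positions are intersected by least squares. This
# illustrates the principle only; the paper's actual algorithm (Sect. 3)
# is more sophisticated.
import math

def trilaterate(beacons, ranges):
    """Solve for (x, y) from >= 3 beacon positions and measured ranges by
    linearizing the range equations against the first beacon."""
    (x0, y0), r0 = beacons[0], ranges[0]
    rows, rhs = [], []
    for (xi, yi), ri in zip(beacons[1:], ranges[1:]):
        rows.append((2 * (xi - x0), 2 * (yi - y0)))
        rhs.append(r0**2 - ri**2 + xi**2 - x0**2 + yi**2 - y0**2)
    # Normal equations for the 2x2 least-squares problem.
    a11 = sum(a * a for a, _ in rows); a12 = sum(a * b for a, b in rows)
    a22 = sum(b * b for _, b in rows)
    b1 = sum(a * r for (a, _), r in zip(rows, rhs))
    b2 = sum(b * r for (_, b), r in zip(rows, rhs))
    det = a11 * a22 - a12 * a12
    return ((a22 * b1 - a12 * b2) / det, (a11 * b2 - a12 * b1) / det)

beacons = [(0.0, 0.0), (1000.0, 0.0), (0.0, 1000.0)]   # transponder positions (m)
truth = (400.0, 300.0)
ranges = [math.dist(truth, b) for b in beacons]         # from acoustic travel times
print(trilaterate(beacons, ranges))                     # ~ (400.0, 300.0)

A real implementation would also take depth from the pressure sensor, reject acoustic outliers, and fuse bottom-lock Doppler velocity between acoustic fixes, as the card describes.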

Inherency

SQ IOOS/Data solves
IOOS already did enough mapping
IOOS report to congress 13

[Official US IOOS report sent to congress. 2013, U.S. Integrated Ocean Observing
System (U.S. IOOS) 2013 Report to Congress, http://www.ioos.noaa.gov/about/governance/ioos_report_congress2013.pdf
//jweideman]

Gliders are used to monitor water currents, temperature, and biological information such as dissolved oxygen and nitrate. This information offers a more complete picture of what is happening at and below the ocean surface, and may allow scientists to detect trends that otherwise might have gone undetected. Gliders are assuming a prominent and growing role in ocean science due to their unique capabilities for collecting data safely and at relatively low cost in remote locations, both in deep water and at the surface. An advantage of gliders is that they can be quickly deployed to areas of greatest need. A National Glider Asset Map was deployed by the U.S. IOOS program in 2012 and will include all historic and current glider flights once it is completed. The map shown on the right includes data from glider missions since 2005 from Southern California (SCCOOS), the Pacific Northwest (NANOOS), Central and Northern California (CeNCOOS) and the Mid-Atlantic (MARACOOS) regional glider operations. The glider asset map can be viewed at: http://www.ioos.noaa.gov/observing/observing_assets/glider_asset_map.html.

Squo solves satellite and other environmental data


Rogers and Gulledge 10

[Will Rogers is a Research Assistant at the Center for a New American Security. Dr. Jay
Gulledge is a Non-Resident Senior Fellow at the Center for a New American Security and is the Senior Scientist and Science and
Impacts Program Director at the Pew Center on Global Climate Change. April 2010. Lost in Translation: Closing the Gap Between
Climate Science and National Security Policy http://www.cnas.org/files/documents/publications/Lost%20in
%20Translation_Code406_Web_0.pdf //jweideman]

The intelligence community continues to declassify one-meter resolution images taken from its satellite systems, giving climate scientists access to images 15 to 30 times sharper than the next-best systems controlled by NASA and commercial entities such as Google.25 These and a plethora of other advancements have produced a greater understanding of the Earth's climate system as well as the effects of human activities (or anthropogenic influences) on the climate system. The amount of observational data and output from climate models is growing quickly. For example, there are terabytes of highly credible climate change projections now available from the IPCC's Fourth Assessment Report that have never been examined in detail, particularly with regard to local and near-term projections, by decade, to the end of the century and beyond. The sheer volume of available but unanalyzed data creates the potential for many policy-relevant questions to be answered today, if only decision makers were aware of the data, knew how to access it and could make sense of it - and if more scientists understood the needs of decision makers and were motivated to provide it to them in a more useful form. In the future, as even more data become available, new efforts are emerging to handle the onslaught of information. NOAA is leading one public sector effort, the National Polar-orbiting Operational Environmental Satellite System (NPOESS), which will orbit the Earth every 100 minutes, "providing global coverage, monitoring environmental conditions, collecting, disseminating and processing data about the Earth's weather, atmosphere, oceans, land and near-space environment."26 The private sector has started to contribute to the flow of new information as well. For example, there are new public-private partnerships to advance climate science data collection and analysis with new satellite systems.27 Meanwhile, other private companies are embarking on similar solo endeavors, in part, in recognition of the likelihood that there will be a surge in the demand for collection and analysis of climate information. Given the proliferation of new tools (e.g., climate satellites and advanced computer models) and data acquisition systems, there will be no shortage of climate information (especially data related to present conditions and short-term trends). The question for the national security community is whether its unique needs will be met. Since the community has not traditionally based decisions on climate change projections or assessments, there are few processes in place to ensure that the necessary information will be available when it is needed and in a form that is useful.

Solvency

Satellites
Can't solve without new satellites
Parthemore and Rogers 11

[Christine Parthemore, Senior Advisor at the United States Department of Defense and Adjunct Professor, Security Studies Program at Johns Hopkins University; past Fellow at the Center for a New American Security and research assistant to Bob Woodward; educated at Georgetown University and The Ohio State University. Will Rogers works at CNAS. July 2011, Blinded: The Decline of U.S. Earth Monitoring Capabilities and Its Consequences for National Security, http://www.cnas.org/files/documents/publications/CNAS_Blinded_ParthemoreRogers_0.pdf //jweideman]

Networks of satellites, ground-based sensors and unmanned aerial vehicles - the assets America uses to monitor and understand environmental change and its consequences - are going dark. By 2016, only seven of NASA's current 13 earth monitoring satellites are expected to be operational, leaving a crucial information gap that will hinder national security planning.1 Meanwhile, efforts to prevent this capability gap have been plagued by budget cuts, launch failures, technical deficiencies, chronic delays and poor interagency coordination. Without the information that these assets provide, core U.S. foreign policy and national security interests will be at risk. The United States depends on satellite systems for managing the unconventional challenges of the 21st century in ways that are rarely acknowledged. This is particularly true for satellites that monitor climate change and other environmental trends, which, in the words of the Department of Defense's (DOD's) 2010 Quadrennial Defense Review, "will shape the operating environment, roles, and missions" of DOD and "may act as an accelerant of instability or conflict, placing a burden to respond on civilian institutions and militaries around the world."3 Examples abound of how climate change is shaping the strategic environment and of why the U.S. government needs continued access to earth monitoring data: The opening of the Arctic is requiring the U.S. Navy and Coast Guard to execute new missions in the High North, including more frequent search and rescue missions. The receding Himalayan glaciers and related reduced river flows to South Asia may shape the future relationship between India and Pakistan. Defense planners and diplomats will need to monitor changes in the glaciers that supply rivers in Pakistan in order to determine whether access to water will exacerbate existing military and diplomatic tensions between India and Pakistan - states that have longstanding grievances over how they share water. In the South China Sea, changing ocean conditions are altering fish migration, leading neighboring countries to compete over access to billions of dollars in fish resources; this situation could escalate into serious conflict in contested territorial waters. DOD and development agencies rely on earth monitoring systems to monitor urbanization, migration patterns and internal population displacement. Several government agencies also rely on earth monitoring capabilities to analyze compliance with deforestation and emissions measures in international climate change treaties, just as the government relies on space-based capabilities to monitor and verify compliance with non-proliferation treaties. Responding to environmental and climate change trends requires a steady stream of reliable information from earth monitoring satellites that is quickly becoming unavailable. Ideally, the U.S. government would replace its aging earth monitoring satellites. Yet the current political and fiscal environments constrain available resources, making it less likely that Congress will appropriate funds to wholly replace old systems. Given this reality, U.S. policymakers should use existing systems more efficiently, improve information sharing among interagency partners and leverage international partners' investments in their own systems in order to bolster U.S. climate and environmental data collection capabilities. The Capability Gap: Policymakers have known about the challenges stemming from America's declining earth monitoring capabilities for years. In 2005, a report by the National Research Council warned that America's "system of environmental satellites is at risk of collapse."1 Key U.S. government agencies, including the Office of Science and Technology Policy (OSTP) and the Government Accountability Office (GAO), have recently reiterated those warnings. According to an April 2010 report by the GAO, "gaps in coverage ranging from 1 to 11 years are expected beginning as soon as 2015" and "are expected to affect the continuity of important climate and space weather measurements, such as our understanding of how weather cycles impact global food production."4 These gaps will include key environmental and climate monitoring functions, from radar altimeters that measure changes in land and ocean surfaces (such as sea level rise and desertification) to aerosol polarimetry sensors that can measure and distinguish between sulfates, organic and black carbon and other atmospheric particles. "Meteorologists, oceanographers, and climatologists reported that these gaps will seriously impact ongoing and planned earth monitoring activities," according to the GAO.5 One recent interagency effort to close such gaps has fallen short. The National Polar-orbiting Operational Environmental Satellite System (NPOESS) was designed to translate climate and environmental data (including data from extensive existing databases) into products and analysis for DOD, NASA and the National Oceanic and Atmospheric Administration (NOAA). However, after long delays, cost overruns and inadequate coordination among the partners in the interagency working group, the project was split into two components (as an alternative to being cancelled completely); DOD and the civilian agencies are moving forward separately with their own projects in order to sustain the capabilities that NPOESS was intended to provide.

Tech can't solve - political will

Technology can't resolve governance problems.
Malone et al. 2014
Thomas C., University of Maryland Professor of Marine Biology, A global ocean observing system framework for sustainable development, Marine Policy 43 (2014) 262-272
As emphasized by Browman et al. [29], governance, not the science and technology behind ocean observing systems, is the weak link in the implementation chain for EBAs. Historically, governments have responded to changes in ecosystem states and their impacts in an ad hoc fashion focusing on sector by sector management (environmental protection and the management of fisheries, water, transportation, oil and gas extraction, wind energy, tourism and recreation, etc.) rather than on an integrated, holistic strategy for managing human uses of ecosystem services [4,26,27,34]. Limitations of this sector-based approach to governance are compounded by inherent complexities in managing a growing number of activities across different levels of government, across agencies (ministries) within governments, and (for multi-national regional entities such as the European Union) among governments. As a result, competition for funding among government agencies often inhibits needed collaborations and can result in policy choices that are detrimental to ecosystems and their services and dependent coastal communities. The limitations of sector-specific "stove pipe" approaches to the stewardship of ecosystem services have been recognized for decades and are reflected in national and international calls for EBAs to managing human uses and adapting to climate change [3,4,28,153]. An effective set of institutional responses to achieve sustainable development must address the reality that current governance arrangements are not well tuned to the challenges of formulating, implementing and improving EBAs that are guided by IEAs and the sustained provision of the required data on pressures, states and impacts. Conventional approaches to sustainable development incorporate current socio-political systems that limit the use of IEAs. Needed is the development of institutional frameworks that promote a shift from sector by sector management of human uses to a more holistic approach that considers the diversity of pressures on ecosystems, changes in ecosystem states and their impacts through integrated governance. Crosscutting stove pipes must involve a system of integrated ocean governance that oversees and enables efficient linkages among three interdependent and sustained activities: (1) establishment of an integrated coastal ocean observing system to monitor pressures, states and impacts simultaneously; (2) routine and repeated IEAs based on these observations; and (3) the use of IEAs to help guide the sustainable implementation of EBAs (Fig. 3). The flow of data and information among these activities must enable an iterative process of evaluating performance against objectives that leads to more effective observing systems and EBAs. The primary impediments to institutionalizing these activities interactively are stable funding, capacity building, international collaboration, and timely data exchange among countries. Stable funding for sustained observations is critical to the establishment of EBAs for sustainable development. Sustained capacity building is critical because most coastal waters are under the jurisdiction of developing countries and emerging economies. International collaboration is essential to establish priorities based on issues of mutual concern, to attract funding, to effect timely data exchange among coastal states for comparative ecosystem analyses, and to develop systems that are interoperable globally (common standards and protocols for comparative analyses of multidisciplinary data streams from sentinel sites). Of these, timely exchange of ecological data and information on pressures, states and impacts is likely to be most challenging and will require international agreements that are codified in the UN Convention on the Law of the Sea. Holistic, integrated responses have the potential to effectively address issues related to ecosystem services and human well-being simultaneously. If there is any chance of developing and maintaining effective EBAs for sustainable development (and the sustained and integrated observing systems needed to inform them), the process of integrated ocean governance must be overseen by a diversity of stakeholders. These include integrated coastal zone managers, marine spatial planners and managers of GEO BON, LME programs, Marine Protected Areas [151], Regional Seas Conventions [149], and international maritime operations [154]. In addition to these practitioners, important stakeholders include data providers (scientists and technicians involved in ocean observing and prediction systems), analysts (natural, social and political scientists and economists), and journalists all working for the common good. This body of stakeholders might be called the "Integrated Governance Forum for Sustaining Marine Ecosystem Services" and would function as a "Community of Practice" [155]. However, sustained engagement of such a diversity of stakeholders on a global scale to guide the establishment of EBAs on the spatial scales needed to sustain ecosystem services (Section 2) is not feasible. Developing COOS to provide the data and information needed to inform EBAs (Section 3) depends on enhancing the GCN locally and regionally based on priorities established by stakeholders and articulated in national and regional ocean policies [25]. The E.U. Marine Strategy Framework Directive [156] and the U.S. National Policy for the Stewardship of the Ocean, Our Coasts, and the Great Lakes [157] are important examples of emerging regional scale approaches to integrated ocean governance. How effective these policies will be remains to be seen.

Warming

Solvency

No Solvency-Satellites
Can't solve without new satellites
Parthemore and Rogers 11 (Christine Parthemore, Senior Advisor at United States
Department of Defense Adjunct Professor, Security Studies Program at Johns Hopkins University, Will Rogers, a
Research Assistant at the Center for a New American Security, Blinded: The Decline of U.S. Earth Monitoring
Capabilities and Its Consequences for National Security, Center for a New American Security, July 2011,
http://www.cnas.org/files/documents/publications/CNAS_Blinded_ParthemoreRogers_0.pdf)

Networks of satellites, ground-based sensors and unmanned aerial vehicles - the assets America uses to monitor and understand environmental change and its consequences - are going dark. By 2016, only seven of NASA's current 13 earth monitoring satellites are expected to be operational, leaving a crucial information gap that will hinder national security planning.1 Meanwhile, efforts to prevent this capability gap have been plagued by budget cuts, launch failures, technical deficiencies, chronic delays and poor interagency coordination. Without the information that these assets provide, core U.S. foreign policy and national security interests will be at risk. The United States depends on satellite systems for managing the unconventional challenges of the 21st century in ways that are rarely acknowledged. This is particularly true for satellites that monitor climate change and other environmental trends, which, in the words of the Department of Defense's (DOD's) 2010 Quadrennial Defense Review, "will shape the operating environment, roles, and missions" of DOD and "may act as an accelerant of instability or conflict, placing a burden to respond on civilian institutions and militaries around the world."3 Examples abound of how climate change is shaping the strategic environment and of why the U.S. government needs continued access to earth monitoring data: The opening of the Arctic is requiring the U.S. Navy and Coast Guard to execute new missions in the High North, including more frequent search and rescue missions. The receding Himalayan glaciers and related reduced river flows to South Asia may shape the future relationship between India and Pakistan. Defense planners and diplomats will need to monitor changes in the glaciers that supply rivers in Pakistan in order to determine whether access to water will exacerbate existing military and diplomatic tensions between India and Pakistan - states that have longstanding grievances over how they share water. In the South China Sea, changing ocean conditions are altering fish migration, leading neighboring countries to compete over access to billions of dollars in fish resources; this situation could escalate into serious conflict in contested territorial waters. DOD and development agencies rely on earth monitoring systems to monitor urbanization, migration patterns and internal population displacement. Several government agencies also rely on earth monitoring capabilities to analyze compliance with deforestation and emissions measures in international climate change treaties, just as the government relies on space-based capabilities to monitor and verify compliance with non-proliferation treaties.

Political Will
Trying to solve for warming does nothing. It's politically controversial and links to politics
Speath 12 (Ryu Speath, deputy editor at TheWeek.com, Why it's probably too late to roll back global
warming, The Week, December 5, 2012, http://theweek.com/article/index/237392/why-its-probably-too-late-to-rollback-global-warming)

Two degrees Celsius. According to scientists, that's the rise in global temperature, measured against pre-industrial times, that could spark some of the most catastrophic effects of global warming. Preventing the two-degree bump has been the goal of every international treaty designed to reduce greenhouse gas emissions, including a new one currently being hammered out at a United Nations summit in Doha, Qatar. But a new study published by the journal Nature Climate Change shows that it's incredibly unlikely that global warming can be limited to two degrees. According to the study, the world in 2011 "pumped nearly 38.2 billion tons of carbon dioxide into the air from the burning of fossil fuels such as coal and oil," says Seth Borenstein at The Associated Press: The total amounts to more than 2.4 million pounds (1.1 million kilograms) of carbon dioxide released into the air every second. Because emissions of the key greenhouse gas have been rising steadily and most carbon stays in the air for a century, it is not just unlikely but "rather optimistic" to think that the world can limit future temperature increases to 2 degrees Celsius (3.6 degrees Fahrenheit), said the study's lead author, Glen Peters at the Center for International Climate and Environmental Research in Oslo, Norway. What happens when the two-degree threshold is crossed? Most notably, that's when the polar ice caps will begin to melt, leading to a dangerous rise in sea levels. Furthermore, the world's hottest regions will be unable to grow food, setting the stage for mass hunger and global food inflation. The rise in temperature would also likely exacerbate or cause extreme weather events, such as hurricanes and droughts. There is a very small chance that the world could pull back from the brink. The U.N. could still limit warming to two degrees if it adopts a "radical plan," says Peters' group. According to a PricewaterhouseCoopers study, such a plan would entail cutting carbon emissions "by 5.1 percent every year from now to 2050, essentially slamming the brakes on growth starting right now," says Coral Davenport at The National Journal, "and keeping the freeze on for 37 years." However, the U.N. has set a deadline of ratifying a new treaty by 2015, and implementing it by 2020, which means the world is already eight years behind that pace. There are still major disagreements between the U.S. and China over whether the developed world, which industrialized first, should bear the bulk of the cost of reducing carbon emissions. And there is, of course, a large contingent of Americans who don't even believe climate change exists, putting any treaty's ratification at risk. Climate change is so politically toxic in America that Congress has prioritized the fiscal cliff over, no exaggeration, untold suffering and the end of the world as we know it. In other words, it isn't happening. And if that's not bad enough, keep in mind that the two-degree mark is just the beginning, says Davenport: Michael Oppenheimer, a professor of geosciences and international affairs at Princeton University and a member of the Nobel Prize-winning U.N. Intergovernmental Panel on Climate Change, says that a 2-degree rise is not itself that point, but rather the beginning of irreversible changes. "It starts to speed you toward a tipping point," he said. "It's driving toward a cliff at night with the headlights off. We don't know when we'll hit that cliff, but after 2 degrees, we're going faster, we have less control. After 3, 4, 5 degrees, you spiral out of control, you have even more irreversible change." Indeed, at the current emissions rate, the world is expected to broach the four-degree mark by 2100, at which point we can expect even worse environmental catastrophes. Some analysts say that the best possible scenario is preventing the Earth from warming up by three or four degrees. That means instead of focusing solely on preventing global warming, governments around the world should begin preparing for the major environmental upheavals, starting with protections for coastal cities.
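A back-of-envelope check of the PricewaterhouseCoopers figure quoted above shows why a 5.1 percent annual cut amounts to "slamming the brakes on growth" (a hedged sketch in LaTeX; the 37-year horizon is the card's, the compounding arithmetic is ours):

% Compounding a 5.1% annual emissions cut over the 37-year "freeze":
\[
E_{t+37} = E_t\,(1 - 0.051)^{37} \approx E_t\, e^{-0.0523 \times 37} \approx 0.14\,E_t
\]
% i.e., annual emissions in 2050 would have to fall to roughly 14% of today's,
% an ~86% cut, which is why the card treats the U.N. timetable as implausible.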

No Warming

No warming
Global warming theory is false - 5 warrants
Hawkins 14 (John Hawkins, runs Right Wing News and Linkiest. He's also the co-owner of The Looking Spoon. Additionally, he does weekly appearances on the #1-in-its-market Jaz McKay show, writes two weekly columns for Townhall and a column for PJ Media. Additionally, his work has also been published at the Washington Examiner, The Hill, TPNN, Hot Air, The Huffington Post and at Human Events, 5 Scientific Reasons That Global Warming Isn't Happening, Town Hall, February 18, 2014, http://townhall.com/columnists/johnhawkins/2014/02/18/5-scientific-reasons-that-global-warming-isnt-happening-n1796423/page/full)
How did global warming discussions end up hinging on what's happening with polar bears, unverifiable predictions of what will happen in a hundred years, and whether people are "climate deniers" or "global warming cultists?" If this is a scientific topic, why aren't we spending more time discussing the science involved? Why aren't we talking about the evidence and the actual data involved? Why aren't we looking at the predictions that were made and seeing if they match up to the results? If this is such an open and shut case, why are so many people who care about science skeptical? Many Americans have long since thought that the best scientific evidence available suggested that man wasn't causing any sort of global warming. However, now, we can go even further and suggest that the planet isn't warming at all. 1) There hasn't been any global warming since 1997: If nothing changes in the next year, we're going to have kids who graduate from high school who will have never seen any "global warming" during their lifetimes. That's right; the temperature of the planet has essentially been flat for 17 years. This isn't a controversial assertion either. Even the former Director of the Climate Research Unit (CRU) of the University of East Anglia, Phil Jones, admits that it's true. Since the planet was cooling from 1940-1975 and the upswing in temperature afterward only lasted 22 years, a 17 year pause is a big deal. It also begs an obvious question: How can we be experiencing global warming if there's no actual "global warming?" 2) There is no scientific consensus that global warming is occurring and caused by man: Questions are not decided by "consensus." In fact, many scientific theories that were once widely believed to be true were made irrelevant by new evidence. Just to name one of many, many examples, in the early seventies, scientists believed global cooling was occurring. However, once the planet started to warm up, they changed their minds. Yet, the primary "scientific" argument for global warming is that there is a "scientific consensus" that it's occurring. Setting aside the fact that's not a scientific argument, even if that ever was true (and it really wasn't), it's certainly not true anymore. Over 31,000 scientists have signed on to a petition saying humans aren't causing global warming. More than 1000 scientists signed on to another report saying there is no global warming at all. There are tens of thousands of well-educated, mainstream scientists who do not agree that global warming is occurring at all and people who share their opinion are taking a position grounded in science. 3) Arctic ice is up 50% since 2012: The loss of Arctic ice has been a big talking point for people who believe global warming is occurring. Some people have even predicted that all of the Arctic ice would melt by now because of global warming. Yet, Arctic ice is up 50% since 2012. How much Arctic ice really matters is an open question since the very limited evidence we have suggests that a few decades ago, there was less ice than there is today, but the same people who thought the drop in ice was noteworthy should at least agree that the increase is important as well. 4) Climate models showing global warming have been wrong over and over: These future projections of what global warming will do to the planet have been based on climate models. Essentially, scientists make assumptions about how much of an impact different factors will have; they guess how much of a change there will be and then they project changes over time. Unfortunately, almost all of these models showing huge temperature gains have turned out to be wrong. Former NASA scientist Dr. Roy Spencer says that climate models used by government agencies to create policies have failed miserably. Spencer analyzed 90 climate models against surface temperature and satellite temperature data, and found that more than 95 percent of the models have over-forecast the warming trend since 1979, whether we use their own surface temperature dataset (HadCRUT4), or our satellite dataset of lower tropospheric temperatures (UAH). There's an old saying in programming that goes, "Garbage in, garbage out." In other words, if the assumptions and data you put into the models are faulty, then the results will be worthless. If the climate models that show a dire impact because of global warming aren't reliable -- and they're not -- then the long term projections they make are meaningless. 5) Predictions about the impact of global warming have already been proven wrong: The debate over global warming has been going on long enough that we've had time to see whether some of the predictions people made about it have panned out in the real world. For example, Al Gore predicted all the Arctic ice would be gone by 2013. In 2005, the Independent ran an article saying that the Arctic had entered a death spiral. Scientists fear that the Arctic has now entered an irreversible phase of warming which will accelerate the loss of the polar sea ice that has helped to keep the climate stable for thousands of years.... The greatest fear is that the Arctic has reached a tipping point beyond which nothing can reverse the continual loss of sea ice and with it the massive land glaciers of Greenland, which will raise sea levels dramatically. Meanwhile, Arctic ice is up 50% since 2012. James Hansen of NASA fame predicted that the West Side Highway in New York would be under water by now because of global warming. Of course, the highway is still there. If the climate models and the predictions about global warming aren't even close to being correct, wouldn't it be more scientific to reject hasty action based on faulty data so that we can further study the issue and find out what's really going on?

No warming - history and scientific study prove. Scientists exaggerate studies to get research grants.
Jaworowski 08 (Prof. Zbigniew Jaworowski, Chairman of Scientific Council of the Central Laboratory for
Radiological Protection in Warsaw, Chair of UN Scientific Committee on the Effects of Atomic Radiation, Fear
Propaganda http://www.ourcivilisation.com/aginatur/cycles/chap3.htm)

Doomsayers preaching the horrors of warming are not troubled by the fact that in the Middle Ages, when for a few hundred years it was warmer than it is now, neither the Maldive atolls nor the Pacific archipelagos were flooded. Global oceanic levels have been rising for some hundreds or thousands of years (the causes of this phenomenon are not clear). In the last 100 years, this increase amounted to 10 cm to 20 cm, (24) but it does not seem to be accelerated by the 20th Century warming. It turns out that in warmer climates, there is more water that evaporates from the ocean (and subsequently falls as snow on the Greenland and Antarctic ice caps) than there is water that flows to the seas from melting glaciers. (17) Since the 1970s, the glaciers of the Arctic, Greenland, and the Antarctic have ceased to retreat, and have started to grow. On January 18, 2002, the journal Science published the results of satellite-borne radar and ice core studies performed by scientists from CalTech's Jet Propulsion Laboratory and the University of California at Santa Cruz. These results indicate that the Antarctic ice flow has been slowed, and sometimes even stopped, and that this has resulted in the thickening of the continental glacier at a rate of 26.8 billion tons a year. (25) In 1999, a Polish Academy of Sciences paper was prepared as a source material for a report titled "Forecast of the Defense Conditions for the Republic of Poland in 2001-2020." The paper implied that the increase of atmospheric precipitation by 23 percent in Poland, which was presumed to be caused by global warming, would be detrimental. (Imagine stating this in a country where 38 percent of the area suffers from permanent surface water deficit!) The same paper also deemed an extension of the vegetation period by 60 to 120 days as a disaster. Truly, a possibility of doubling the crop rotation, or even prolonging by four months the harvest of radishes, makes for a horrific vision in the minds of the authors of this paper. Newspapers continuously write about the increasing frequency and power of the storms. The facts, however, speak otherwise. I cite here only some few data from Poland, but there are plenty of data from all over the world. In Cracow, in 1896-1995, the number of storms with hail and precipitation exceeding 20 millimeters has decreased continuously, and after 1930, the number of all storms decreased. (26) In 1813 to 1994, the frequency and magnitude of floods of the Vistula River in Cracow not only did not increase but, since 1940, have significantly decreased. (27) Also, measurements in the Kolobrzeg Baltic Sea harbor indicate that the number of gales has not increased between 1901 and 1990. (28) Similar observations apply to the 20th Century hurricanes over the Atlantic Ocean (Figure 4) and worldwide.

Heat is currently being released into space - proves warming is actually reversing
Taylor 11 (James Taylor, managing editor of Environment & Climate News, a national monthly publication
devoted to sound science and free-market environmentalism. He is also senior fellow for The Heartland Institute,
focusing on energy and environment issues, New NASA Data Blow Gaping Hole In Global Warming Alarmism,
Forbes, July 27, 2011, http://www.forbes.com/sites/jamestaylor/2011/07/27/new-nasa-data-blow-gaping-hold-inglobal-warming-alarmism/)

NASA satellite data from the years 2000 through 2011 show the Earth's atmosphere is allowing far more heat to be released into space than alarmist computer models have predicted, reports a new study in the peer-reviewed science journal Remote Sensing. The study indicates far less future global warming will occur than United Nations computer models have predicted, and supports prior studies indicating increases in atmospheric carbon dioxide trap far less heat than alarmists have claimed. Study co-author Dr. Roy Spencer, a principal research scientist at the University of Alabama in Huntsville and U.S. Science Team Leader for the Advanced Microwave Scanning Radiometer flying on NASA's Aqua satellite, reports that real-world data from NASA's Terra satellite contradict multiple assumptions fed into alarmist computer models. "The satellite observations suggest there is much more energy lost to space during and after warming than the climate models show," Spencer said in a July 26 University of Alabama press release. "There is a huge discrepancy between the data and the forecasts that is especially big over the oceans." In addition to finding that far less heat is being trapped than alarmist computer models have predicted, the NASA satellite data show the atmosphere begins shedding heat into space long before United Nations computer models predicted.

No Consensus
97 percent consensus claims are false
Taylor 13 (James Taylor, writer for Forbes. 5/30/13, Global Warming Alarmists Caught Doctoring '97-Percent Consensus' Claims, http://www.forbes.com/sites/jamestaylor/2013/05/30/global-warming-alarmists-caught-doctoring-97-percent-consensus-claims/ //jweideman)

Global warming alarmists and their allies in the liberal media have been caught doctoring the results of a widely cited paper asserting there is a 97-percent scientific consensus regarding human-caused global warming. After taking a closer look at the paper, investigative journalists report the authors' claims of a 97-percent consensus relied on the authors misclassifying the papers of some of the world's most prominent global warming skeptics. At the same time, the authors deliberately presented a meaningless survey question so they could twist the responses to fit their own preconceived global warming alarmism. Global warming alarmist John Cook, founder of the misleadingly named blog site Skeptical Science, published a paper with several other global warming alarmists claiming they reviewed nearly 12,000 abstracts of studies published in the peer-reviewed climate literature. Cook reported that he and his colleagues found that 97 percent of the papers that expressed a position on human-caused global warming endorsed the consensus position that humans are causing global warming. As is the case with other surveys alleging an overwhelming scientific consensus on global warming, the question surveyed had absolutely nothing to do with the issues of contention between global warming alarmists and global warming skeptics. The question Cook and his alarmist colleagues surveyed was simply whether humans have caused some global warming. The question is meaningless regarding the global warming debate because most skeptics as well as most alarmists believe humans have caused some global warming. The issue of contention dividing alarmists and skeptics is whether humans are causing global warming of such negative severity as to constitute a crisis demanding concerted action.

Warming slow
Worst case scenario - warming will only be 1.5 degrees
Freitas 02 (C. R., Associate Prof. in Geography and Environmental Science @ U. Auckland, Bulletin of Canadian Petroleum Geology, Are observed changes in the concentration of carbon dioxide in the atmosphere really dangerous? 50:2, GeoScienceWorld)
In any analysis of CO2 it is important to differentiate between three quantities: 1) CO2 emissions, 2) atmospheric CO2 concentrations, and 3) greenhouse gas radiative forcing due to atmospheric CO2. As for the first, between 1980 and 2000 global CO2 emissions increased from 5.5 Gt C to about 6.5 Gt C, which amounts to an average annual increase of just over 1%. As regards the second, between 1980 and 2000 atmospheric CO2 concentrations increased by about 0.4 per cent per year. Concerning the third, between 1980 and 2000 greenhouse gas forcing increase due to CO2 has been about 0.25 W m-2 per decade (Hansen, 2000). Because of the logarithmic relationship between CO2 concentration and greenhouse gas forcing, even an exponential increase of atmospheric CO2 concentration translates into linear forcing and temperature increase; or, as CO2 gets higher, a constant annual increase of say 1.5 ppm has less and less effect on radiative forcing, as shown in Figure 3. Leaving aside for the moment the satellite temperature data and using the surface data set, between 1980 and 2000 there has been this linear increase of both CO2 greenhouse gas forcing and temperature. If one extrapolates the rate of observed atmospheric CO2 increase into the future, the observed atmospheric CO2 increase would only lead to a concentration of about 560 ppm in 2100, about double the concentration of the late 1800s. That assumes a continuing increase in the CO2 emission rate of about 1% per year, and a carbon cycle leading to atmospheric concentrations observed in the past. If one assumes, in addition, that the increase of surface temperatures in the last 20 years (about 0.3 C) is entirely due to the increase in greenhouse gas forcing of all greenhouse gases, not just CO2, that would translate into a temperature increase of about 1.5 C (or approximately 0.15 C per decade). Using the satellite data, the temperature increase is correspondingly lower. Based on this, the temperature increase over the next 100 years might be less than 1.5 C, as proposed in Figure 19.
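The logarithmic relationship the card relies on can be written out explicitly. A hedged sketch: the forcing expression and its 5.35 W m-2 coefficient are the commonly cited simplified formula (Myhre et al. 1998), an assumption on our part rather than a formula printed in the card:

% Simplified CO2 radiative forcing:
\[
\Delta F = 5.35 \, \ln\!\left(\frac{C}{C_0}\right) \ \mathrm{W\,m^{-2}}
\]
% Exponential concentration growth, C(t) = C_0 e^{kt}, therefore yields only
% linear forcing growth, \Delta F(t) = 5.35\,k\,t, which is the card's point.
% Doubling CO2 from the pre-industrial 280 ppm to the card's 560 ppm gives
\[
\Delta F = 5.35 \, \ln 2 \approx 3.7 \ \mathrm{W\,m^{-2}},
\]
% which, at the low sensitivity implied by the card's numbers (roughly
% 0.4 C per W m^{-2}), is on the order of the card's 1.5 C by 2100.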

Too late
Warming is too fast for adaptation
Willis et al 10 (K.J. Willis, R.M. Bailey, S.A. Bhagwat and H.J.B. Birks; Institute of Biodiversity at the James Martin 21st Century School, University of Oxford; School of Geography and the Environment, University of Oxford; Department of Biology, University of Bergen; Department of Zoology, University of Oxford. 2010, Biodiversity baselines, thresholds and resilience: testing predictions and assumptions using palaeoecological data, http://users.ugent.be/~everleye/Ecosystem%20Dynamics/papers%20assignment/Willis%20et%20al%20TREE%202010.pdf //jweideman)

Another critical question relating to future climate change is whether rates of predicted change will be too rapid for ecological processes to react and so prevent species and communities persisting. There is also the question of whether species will be able to migrate quickly enough to new locations with a suitable climate. Studies based on data from extant populations and modelling suggest that rapid rates of change could pose a serious threat for many species and communities unable to track climate space quickly enough, resulting in extensive extinctions [9,31]. But it is also known from fossil records that there have been numerous previous intervals of abrupt climate change [32,33]. What were the responses of past biodiversity to these previous intervals of rapid climate change? The existence of highly detailed evidence from ice-core records (Box 2) spanning the last full glacial cycle provides an ideal opportunity to examine biodiversity responses to rapid climate change. For example, ice-cores indicate that temperatures in mid to high latitudes oscillated repeatedly by more than 4 C on timescales of decades or less [34] (Box 2). Numerous records of biodiversity response from North America and Europe across this time-interval reflect ecological changes with a decadal resolution [35-37]. While they demonstrate clear evidence for rapid turnover of communities (e.g. Figure 2), novel assemblages, migrations and local extinctions, there is no evidence for the broad-scale extinctions predicted by models; rather there is strong evidence for persistence [25]. However, there is also evidence that some species expanded their range slowly or largely failed to expand from their refugia in response to this interval of rapid climate warming [38]. The challenge now is to determine which specific respect, in particular information on alternative stable states, rates of change, possible triggering mechanisms and systems that demonstrate resilience to thresholds. In a study from central Spain, for example, it has been demonstrated that over the past 9000 years, several threshold changes occurred, shifting from one stable forest type to another (pine to deciduous oak and then evergreen oak to pine) [52]. The trigger appears to have been a combination of at least two abiotic variables; in the first shift, an interval of higher precipitation combined with less evaporation and in the second, increased aridity combined with increased fire frequency. A similar 'double-trigger' was also found to be important in regime shifts along the south-east coastline of Madagascar [53], where a threshold shift from closed littoral forest to open Erica-dominated heathland occurred in response to the combined effects of aridity and sea-level rise. Neither of the variables occurring alone resulted in a shift but the combination of the two did.

Nothing can be done to stop global warming


Skeptical Science 13 (Skeptical Science, Global warming: Not reversible, but stoppable, Raw
Story, April 27, 2013, http://www.rawstory.com/rs/2013/04/27/global-warming-not-reversable-but-stoppable/)

Global warming is not reversible, but it is stoppable. Many people incorrectly assume that once we stop making greenhouse gas emissions, the CO2 will be drawn out of the air, the old equilibrium will be re-established and the climate of the planet will go back to the way it used to be; just like the way the acid rain problem was solved once scrubbers were put on smoke stacks, or the way lead pollution disappeared once we changed to unleaded gasoline. This misinterpretation can lead to complacency about the need to act now. In fact, global warming is, on human timescales, here forever. The truth is that the damage we have done and continue to do to the climate system cannot be undone. The second question reveals a different kind of misunderstanding: many mistakenly believe that the climate system is going to send more warming our way no matter what we choose to do. Taken to an extreme, that viewpoint can lead to a fatalistic approach, in which efforts to mitigate climate change by cutting emissions are seen as futile: we should instead begin planning for adaptation or, worse, start deliberately intervening through geoengineering. But this is wrong. The inertia is not in the physics of the climate system, but rather in the human economy. This is explained in a recent paper in Science Magazine (2013, paywalled but freely accessible here, scroll down to "Publications, 2013") by Damon Matthews and Susan Solomon: Irreversible Does Not Mean Unavoidable. Since the Industrial Revolution, CO2 from our burning of fossil fuels has been building up in the atmosphere. The concentration of CO2 is now approaching 400 parts per million (ppm), up from 280 ppm prior to 1800. If we were to stop all emissions immediately, the CO2 concentration would also start to decline immediately, with some of the gas continuing to be absorbed into the oceans and smaller amounts being taken up by carbon sinks on land. According to the models of the carbon cycle, the level of CO2 (the red line in Figure 1A) would have dropped to about 340 ppm by 2300, approximately the same level as it was in 1980. In the next 300 years, therefore, nature will have recouped the last 30 years of our emissions. So, does this mean that some of the climate change we have experienced so far would go into reverse, allowing, for example, the Arctic sea ice to freeze over again? Unfortunately, no. Today, because of the greenhouse gas build-up, there is more solar energy being trapped, which is warming the oceans, atmosphere, land and ice, a process that has been referred to as the Earth's energy imbalance. The energy flow will continue to be out of balance until the Earth warms up enough so that the amount of energy leaving the Earth matches the amount coming in. It takes time for the Earth to heat up, particularly the oceans, where approximately 90% of the thermal energy ends up. It just so happens that the delayed heating from this thermal inertia balances almost exactly with the drop in CO2 concentrations, meaning the temperature of the Earth would stay approximately constant from the minute we stopped adding more CO2, as shown in Figure 1C. There is bad news and good news in this. The bad news is that, once we have caused some warming, we can't go back, at least not without huge and probably unaffordable efforts to put the CO2 back into the ground, or by making risky interventions by scattering tons of sulphate particles into the upper atmosphere, to shade us from the Sun. The good news is that, once we stop emissions, further warming will immediately cease; we are not on an unstoppable path to oblivion. The future is not out of our hands. Global warming is stoppable, even if it is not reversible.
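The cancellation the card describes, ocean thermal inertia pushing temperature up while falling CO2 pulls forcing down, can be illustrated with a one-box energy-balance model. A minimal sketch, assuming toy parameter values (the feedback, heat capacity, and 400-to-340 ppm decay curve are all illustrative choices of ours, not numbers from Matthews and Solomon):

import math

ALPHA = 5.35      # W/m^2, simplified CO2 forcing coefficient (Myhre et al. 1998)
LAMBDA = 2.0      # W/m^2 per K, assumed climate feedback parameter
HEAT_CAP = 80.0   # W*yr/m^2 per K, assumed effective ocean heat capacity
C0 = 280.0        # ppm, pre-industrial CO2

def co2_after_cutoff(year):
    """Toy post-cutoff decay: ~400 ppm relaxing toward ~340 ppm, per Figure 1A."""
    return 340.0 + 60.0 * math.exp(-year / 150.0)

temp = 0.8        # K above pre-industrial when emissions stop (assumed)
for year in range(301):
    forcing = ALPHA * math.log(co2_after_cutoff(year) / C0)
    imbalance = forcing - LAMBDA * temp   # W/m^2 still heating the system
    temp += imbalance / HEAT_CAP          # Euler step, one-year increments
    if year % 100 == 0:
        print(f"year {year:3d}: CO2 = {co2_after_cutoff(year):5.1f} ppm, T = {temp:4.2f} K")

With these illustrative numbers the temperature drifts by only a few tenths of a degree over three centuries rather than continuing to climb: the unrealized warming in the pipeline and the declining forcing roughly offset, which is the card's "approximately constant" result.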

Too late to solve warming


Adam 08 (David Adam, environment correspondent for the Guardian between 2005 and 2010, before which
he was science correspondent for two years. He previously worked at the science journal Nature after a PhD in
chemical engineering, Too late? Why scientists say we should expect the worst, The Guardian, December 8, 2008,
http://www.theguardian.com/environment/2008/dec/09/poznan-copenhagen-global-warming-targets-climatechange)

Despite the political rhetoric, the scientific warnings, the media headlines and the corporate promises, he would say, carbon emissions were soaring way out of control - far above even the bleak scenarios considered by last year's report from the Intergovernmental Panel on Climate Change (IPCC) and the Stern review. The battle against dangerous climate change had been lost, and the world needed to prepare for things to get very, very bad. "As an academic I wanted to be told that it was a very good piece of work and that the conclusions were sound," Anderson said. "But as a human being I desperately wanted someone to point out a mistake, and to tell me we had got it completely wrong." Nobody did. The cream of the UK climate science community sat in stunned silence as Anderson pointed out that carbon emissions since 2000 have risen much faster than anyone thought possible, driven mainly by the coal-fuelled economic boom in the developing world. So much extra pollution is being pumped out, he said, that most of the climate targets debated by politicians and campaigners are fanciful at best, and "dangerously misguided" at worst. In the jargon used to count the steady accumulation of carbon dioxide in the Earth's thin layer of atmosphere, he said it was "improbable" that levels could now be restricted to 650 parts per million (ppm). The CO2 level is currently over 380ppm, up from 280ppm at the time of the industrial revolution, and it rises by more than 2ppm each year. The government's official position is that the world should aim to cap this rise at 450ppm.
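The card's own numbers make the timetable easy to check (a back-of-envelope sketch; the assumption that the 2 ppm-per-year growth rate merely holds steady, rather than accelerating as Anderson describes, is ours):

% Years until the government's 450 ppm cap is breached at the quoted rate:
\[
t_{450} \approx \frac{450 - 380}{2\ \mathrm{ppm/yr}} = 35\ \mathrm{years}
\]
% i.e., roughly 2043 from the article's 2008 vantage point, with no emissions
% growth at all, which is why Anderson treats even restricting levels to
% 650 ppm as "improbable."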

Even a cut in emissions won't stop global warming


Dye 12 (Lee Dye, science expert for ABC News, It May Be Too Late to Stop Global Warming, ABC News,
October 26, 2012, http://abcnews.go.com/Technology/late-stop-global-warming/story?id=17557814)

Here's a dark secret about the earth's changing climate that many scientists believe, but few seem eager to discuss: It's too late to stop global warming. Greenhouse gasses pumped into the planet's atmosphere will continue to grow even if the industrialized nations cut their emissions down to the bone. Furthermore, the severe measures that would have to be taken to make those reductions stand about the same chance as that proverbial snowball in hell. Two scientists who believe we are on the wrong track argue in the current issue of the journal Nature Climate Change that global warming is inevitable and it's time to switch our focus from trying to stop it to figuring out how we are going to deal with its consequences. "At present, governments' attempts to limit greenhouse-gas emissions through carbon cap-and-trade schemes and to promote renewable and sustainable energy sources are probably too late to arrest the inevitable trend of global warming," Jasper Knight of Wits University in Johannesburg, South Africa, and Stephan Harrison of the University of Exeter in England argue in their study. Those efforts, they continue, "have little relationship to the real world." What is clear, they contend, is a profound lack of understanding about how we are going to deal with the loss of huge land areas, including some entire island nations, and massive migrations as humans flee areas no longer suitable for sustaining life, the inundation of coastal properties around the world, and so on ... and on ... and on. That doesn't mean nations should stop trying to reduce their carbon emissions, because any reduction could lessen the consequences. But the cold fact is no matter what Europe and the United States and other "developed" nations do, it's not going to curb global climate change, according to one scientist who was once highly skeptical of the entire issue of global warming. "Call me a converted skeptic," physicist Richard A. Muller says in an op-ed piece published in the New York Times last July. Muller's latest book, "Energy for Future Presidents," attempts to poke holes in nearly everything we've been told about energy and climate change, except the fact that "humans are almost entirely the cause" of global warming. Those of us who live in the "developed" world initiated it. Those who live in the "developing" world will sustain it as they strive for a standard of living equal to ours. "As far as global warming is concerned, the developed world is becoming irrelevant," Muller insists in his book. We could set an example by curbing our emissions, and thus claim in the future that "it wasn't our fault," but about the only thing that could stop it would be a complete economic collapse in China and the rest of the world's developing countries.

We are already past the tipping point


Thomas 14 (Andrew Thomas, writer for the American Thinker, The Global Warming Tipping Point is Near,
American Thinker, January 2, 2014,
http://www.americanthinker.com/blog/2014/01/the_global_warming_tipping_point_is_near.html)

Anthropogenic Global Warming (AGW) theory has been dominant for the past three decades as absolute fact in the public mind. In the last several years, however, cracks in the fortress of "settled science" have appeared, and public opinion has begun to shift. Increasingly, alarmist predictions have failed to come to fruition. In 2004, NASA's chief scientist James Hansen authoritatively announced that there is only a ten-year window to act on AGW (presumably by transferring mass quantities of taxpayer funds to global warmist causes) before climate Armageddon destroys humanity. Well, that window has now shut tight, and AGW is AWOL. Al Gore, the high priest of AGW theory, has closed all of his Alliance for Climate Protection field offices, and laid off 90% of his staff. Contributions have all but dried up since 2008. Australia's conservative government has severely curtailed the country's climate change initiatives and is in the process of repealing its business-killing carbon tax. A group of German scientists predicts dramatic global cooling over the next 90 years toward a new "little ice age." Of course, even many "low information" folks have an awareness of the record increase in Arctic sea ice, as well as the current highly-publicized predicament of the cadre of wealthy global warmists stuck in record-high sea ice while on a cruise to the Antarctic to prove the absence of sea ice. Now the UN's Intergovernmental Panel on Climate Change (IPCC) has quietly downgraded their prediction for global warming for the next 30 years in the final draft of their landmark "Fifth Assessment Report." The effect of this is that they are tacitly admitting that the computer models they have religiously relied upon for decades as "proof" of AGW theory are dead wrong. The tipping point is near. I can smell it.

Warming will never be reversible - vote NEG on presumption that the AFF creates a world worse than the status quo
Romm 13 (Joe Romm, founding editor for Climate Change, The Dangerous Myth That Climate Change Is
Reversible, Climate Progress, March 17, 2013, http://thinkprogress.org/climate/2013/03/17/1731591/thedangerous-myth-that-climate-change-is-reversible/)

As a NOAA-led paper explained 4 years ago, climate change is largely irreversible for 1000 years. This notion that we can reverse climate change by cutting emissions is one of the most commonly held myths and one of the most dangerous, as explained in this 2007 MIT study, Understanding Public Complacency About Climate Change: Adults' mental models of climate change violate conservation of matter. The fact is that, as RealClimate has explained, we would need an immediate cut of around 60 to 70% globally and continued further cuts over time merely to stabilize atmospheric concentrations of CO2, and that would still leave us with a radiative imbalance that would lead to an additional 0.3 to 0.8 C warming over the 21st Century. And that assumes no major carbon cycle feedbacks kick in, which seems highly unlikely. We'd have to drop total global emissions to zero now and for the rest of the century just to lower concentrations enough to stop temperatures from rising. Again, even in this implausible scenario, we still aren't talking about reversing climate change, just stopping it or, more technically, stopping the temperature rise. The great ice sheets might well continue to disintegrate, albeit slowly.
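A hedged sketch of the stabilization arithmetic behind the 60 to 70 percent figure (the estimate that natural sinks currently absorb roughly half of annual emissions is a commonly cited approximation we are assuming; it is not stated in the card):

% Atmospheric CO2 grows by the gap between emissions E and sink uptake S:
\[
\frac{dC}{dt} = E - S, \qquad S \approx 0.5\,E_{\mathrm{today}}
\]
% Stabilizing concentrations (dC/dt = 0) therefore requires an immediate cut
% of roughly 50%; because sink uptake weakens once the ocean-atmosphere
% disequilibrium shrinks, the required cut deepens over time, consistent with
% the card's "60 to 70% ... and continued further cuts."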

Warming Natural
Scientific study and consensus validate that warming is a natural phenomenon
McClintock 09 (Ian C. McClintock, has held many positions in the NFF and in the NSW Farmers Association, where he currently serves on the Climate Change Task Force, the Business Economic & Trade Committee, and on the Executive Council. He has served on the NFF's Economic Committee, Conservation Committee, and as the NFF's representative on the National Farm Forestry Round Table, Proof that CO2 is not the Cause of the Current Global Warming, Lavoisier, June 2009, http://www.lavoisier.com.au/articles/greenhouse-science/climate-change/mcclintock-proofnotco2-2009.pdf)

A careful study of these Reports reveals that there is in fact no empirical evidence that supports the main theory of anthropogenic global warming (AGW) (this has now been changed to the all-inclusive terminology "climate change"2). Some 23 specialized computer models provide the only support for the IPCC theory. There is considerable uncertainty, as a result of a fundamental lack of knowledge about the causes of climate change, admitted by the IPCC: "There is still an incomplete physical understanding of many components of the climate system and their role in climate change. Key uncertainties include aspects of the roles played by clouds, the cryosphere, the oceans, land use and couplings between climate and biogeochemical cycles." Significantly, our Sun provides 99.9% of the energy that drives the world's climate systems and its varying Solar Wind protects us from galactic cosmic rays that have been shown to influence global cloud cover (and therefore temperature), and so this should be added to the list in the above quotation as this material is now largely ignored by the IPCC. It has been common in recent times to claim that there is a scientific consensus about the causes of climate change, however it has become evident that this is clearly not the case. Amongst many other examples there are now well over 31,000 scientists who have signed a petition disagreeing, on scientific grounds, with the IPCC's claims. Science is never settled. The object of this paper is to cite a number of examples that conclusively demonstrate and prove that the claims that CO2 is the principle driver of climate change are false. Once this fact is accepted, it makes the whole objective of attempting to reduce CO2 in the atmosphere a total waste of time, effort, financial and other resources, all of which can be better directed towards tackling real problems. PROOFS 1. LACK OF CORRELATION: There is no correlation between levels of CO2 in the atmosphere and global temperature, in short, medium or long term historic time scales. There is a scientific truism that says, "Correlation can not prove causation, but a failure to correlate can prove non-causation." If anthropogenic CO2 were the major climate driver (as claimed), warming would have continued to increase in synchronization with the increase in CO2 levels. (i) Short Term. The graph below indicates the present cooling period we are now experiencing. This shows the Hadley UK ground based (pink) and the satellite (blue) temperature record and the simultaneously increasing CO2 levels (green line): lack of correlation with CO2 increases and the inability to predict climate even in the very short term, let alone for 100 or more years into the future. With the Sun now entering a quiet phase, it is anticipated that this cooling trend is likely to continue until Sun activity increases. (ii) Medium Term. This graph shows the temperature recovery (warming) that has occurred since the last gasp of the Little Ice Age (LIA) in the mid 1800s. It shows three warmings and two coolings; in fact there are now three coolings, with details of the latest one (shown on p4) not included on this graph. Again there is no correlation with CO2 levels in the atmosphere. It is worth noting that the rate of increase in the latest warming period (1975-1998) is similar to the rate of increase in the previous two warmings (sloping parallel bars), despite significantly higher levels of CO2 being present during the last warming. There appears to be a clear cyclical signature evident in these temperature perturbations that is absent in the CO2 data. It is very obvious that other influences are dominating. (iii) Long Term. For the first 4 billion years of Earth history, the CO2 content of the atmosphere has been from 3 to 100 times the current level (Plimer, I. 2009), without any 'tipping point' leading to a runaway greenhouse effect, as predicted by the alarmists. The historical record indicates that CO2 levels have been significantly higher than they are now, being 25 times at 545 Ma (million years ago) (Veizer, J. et al 2000). The killer proof that CO2 does not drive climate is to be found during the Ordovician-Silurian (450-420 Ma) and the Jurassic-Cretaceous periods (151-132 Ma), when CO2 levels were greater than 4000 ppmv (parts per million by volume) and about 2000 ppmv respectively.4 If the IPCC theory is correct there should have been runaway greenhouse induced global warming during these periods, but instead there was glaciation. This unequivocally proves that CO2 does not drive climate; it can only be a minor player, swamped by other far more powerful influences. This proves three things. Firstly, CO2 did not, and could not, have caused the warmings. Secondly, rising global temperatures were responsible for increasing levels of both CO2 and CH4 (Methane) and also Nitrous Oxide (N2O), not shown. The physics of how this might occur are well established and beyond dispute. Thirdly, increased levels of CO2 did not inhibit in any way the subsequent fall in temperatures that regularly occurred, plunging the world into another ice age.

It occurs naturally
Ferrara 12 (Peter Ferrara, senior fellow for entitlement and budget policy at The Heartland Institute, a
senior fellow at the Social Security Institute, and the general counsel of the American Civil Rights Union, Sorry
Global Warming Alarmists, The Earth Is Cooling, The Heartland Institute, June 1, 2012,
http://blog.heartland.org/2012/06/sorry-global-warming-alarmists-the-earth-is-cooling/)

Check out the 20th century temperature record, and you will find that its up and
down pattern does not follow the industrial revolution's upward march of
atmospheric carbon dioxide (CO2), which is the supposed central culprit for man-caused
global warming (and has been much, much higher in the past). It follows instead the up and
down pattern of naturally caused climate cycles . For example, temperatures
dropped steadily from the late 1940s to the late 1970s. The popular press was even
talking about a coming ice age. Ice ages have cyclically occurred roughly every 10,000 years, with a
new one actually due around now. In the late 1970s, the natural cycles turned warm and
temperatures rose until the late 1990s, a trend that political and economic interests
have tried to milk mercilessly to their advantage. The incorruptible satellite
measured global atmospheric temperatures show less warming during this period
than the heavily manipulated land surface temperatures. Central to these natural
cycles is the Pacific Decadal Oscillation (PDO). Every 25 to 30 years the oceans undergo a
natural cycle where the colder water below churns to replace the warmer water at
the surface, and that affects global temperatures by the fractions of a degree we
have seen. The PDO was cold from the late 1940s to the late 1970s, and it was
warm from the late 1970s to the late 1990s, similar to the Atlantic Multidecadal Oscillation (AMO).
In 2000, the UN's IPCC predicted that global temperatures would rise by 1 degree Celsius by 2010. Was that based
on climate science, or political science to scare the public into accepting costly anti-industrial regulations and
taxes? Don Easterbrook, Professor Emeritus of Geology at Western Washington University, knew the answer. He
publicly predicted in 2000 that global temperatures would decline by 2010. He made that prediction because he
knew the PDO had turned cold in 1999, something the political scientists at the UN's IPCC did not know or did not
think significant.

Reducing emissions has no effect; empirics prove it


Bethel 05 (Tom Bethel, senior editor of The American Spectator, THE FALSE ALERT OF GLOBAL WARMING,
The American Spectator, May 2005, http://spectator.org/articles/55208/false-alert-global-warming)

But whether man-made carbon-dioxide emissions have caused measurable temperature increases over the last 30 years is debated. Carbon dioxide is itself a benign and essential substance, incidentally. Without it, plants would not grow, and without plant-life animals could not live. Any increase of carbon dioxide in the atmosphere causes plants, trees, and forests to grow more abundantly. It should be a treehugger's delight. The surface data suggest that man-made carbon dioxide has not in fact increased global temperatures. From 1940 to 1975, coal-fired plants emitted fumes with great abandon and without restraint by Greens. Yet the Earth cooled slightly in that time. And if man-made global warming is real, atmospheric as well as surface temperatures should have increased steadily. But they haven't. There was merely that one-time increase, possibly caused by a solar anomaly. In addition, an "urban heat island effect" has been identified. Build a tarmac runway near a weather station, and the nearby temperature readings will go up. Global warming became the focus of activism at the time of the Earth Summit in Rio, in 1992. Bush the elder signed a climate-change treaty, with signatories agreeing to reduce carbon dioxide emissions below 1990 levels. The details were worked out in Kyoto, Japan. But America was the principal target, everyone knew it, and Clinton didn't submit the treaty to the Senate for ratification. The 1990 date had been carefully chosen. Emissions in Germany and the Soviet Union were still high; Germany had just absorbed East Germany, then still using inefficient coal-fired plants. After they were modernized, Germany's emissions dropped, so the demand that they be reduced below 1990 levels had already been met and became an exercise in painless moralizing. The same was true for the Soviet Union. After its collapse, in 1991, economic activity fell by about one-third. As for France, most of its electricity comes from nuclear power, which has no global-warming effects but has been demonized for other reasons. If the enviros were serious about reducing carbon dioxide they would be urging us to build nuclear power plants, but that is not on their agenda. They want windmills (whether or not they kill golden eagles). Under the Kyoto Protocol, U.S. emissions would have to be cut so much that economic depression would have been the only certain outcome. We were expected to reduce energy use by about 35 percent within ten years, which might have meant eliminating one-third of all cars. You can see why the enviros fell in love with the idea.

Impact Defense

Adaptation
IPCC consensus proves we can adapt
Rodgers 14 (Paul Rodgers, contributor of general sciences for Forbes, Climate Change: We Can Adapt,
Says IPCC, Forbes, March 31, 2014, http://www.forbes.com/sites/paulrodgers/2014/03/31/climate-change-is-realbut-its-not-the-end-of-the-world-says-ipcc/)

Yet for the first time, the IPCC is offering a glimmer of hope. It acknowledges that some of the changes will be beneficial -- including higher crop yields in places like Canada, Europe and Central Asia -- and that in other cases, people will be able to adapt to them. "The really big breakthrough in this report is the new idea of thinking about managing climate change," said Dr Chris Field, the global ecology director at the Carnegie Institution in Washington and a co-chairman of the report. "We have a lot of the tools for dealing effectively with it. We just need to be smart about it. Climate-change adaptation is not an exotic agenda that has never been tried," said Field. "Governments, firms, and communities around the world are building experience with adaptation. This experience forms a starting point for bolder, more ambitious adaptations that will be important as climate and society continue to change." Adaptations could include better flood defences or building houses that can withstand tropical cyclones. Vicente Barros, another co-chairman, said: "Investments in better preparation can pay dividends both for the present and for the future."

Cutting emissions will not help. Our only hope is to adapt to rising
temperatures
Roach 13 (John Roach, contributing writer for NBC News, It's time to adapt to unstoppable global warming,
scientists say, NBC News, November 7, 2013, http://www.nbcnews.com/science/environment/its-time-adaptunstoppable-global-warming-scientists-say-f8C11554338)

Even if the world's 7 billion people magically stop burning fossil fuels and chopping down forests today, the greenhouse gases already emitted to the atmosphere will warm the planet by about 2 degrees Fahrenheit by the end of this century, according to scientists who are urging a focused scientific effort to help humanity adapt to the changing climate. And reality shows no sign of such a magic reduction in emissions. The amount of greenhouse gases in the atmosphere reached another new high in 2012, the World Meteorological Association announced Wednesday. In fact, concentrations of carbon dioxide, the most abundant planet-warming gas, grew faster last year than its average growth rate of the past decade. "The fact is, we are not making a lot of progress in reducing emissions," Richard Moss, a senior scientist with the Pacific Northwest National Laboratory's Joint Global Change Research Institute at the University of Maryland, told NBC News. "So it seems like we really do need to face up to the fact that there is change that we can no longer avoid and we have to figure out how to manage." Moss is the lead author of an article in Thursday's issue of Science calling for the development of a field of climate science that provides relevant, tangible information to decision makers who are tasked to protect people and cities increasingly battered by extreme weather, flooded by rising seas, and nourished with food grown on drought-prone lands. Science which focuses on adapting to climate change rather than just preventing it is nothing new. It's the need for more information that field of science can yield that's increasingly vital. Superstorm Sandy and the onslaught of similar catastrophic events bear the fingerprint of climate change. Growing evidence that more of the same is on the way brings a new urgency for information that people can actually use to prepare, survive, and even thrive on a changing planet, Moss explained. Hope lost? The push for adaptation science also represents a shift in climate policy circles away from an agenda focused solely on cutting greenhouse gas emissions to reduce the impact of climate change, according to Adam Fenech, a climate scientist at the University of Prince Edward Island in Canada who has studied adaptation science for about 15 years. He did not contribute to the article in Science. For most of this time, "people wouldn't even consider adaptation. It was a bad word. You couldn't bring it up," he told NBC News. "But now it has been accepted probably because people have either abandoned completely or abandoned faith in the international climate change negotiations." In addition, Fenech said, people are coming to grips with the fact that the world is committed to a certain degree of climate change no matter what is done to cut emissions. The thinking goes "we're going to have to live with it anyway," he explained. "And that's adaptation." The Science article does not advocate abandoning mitigation efforts, but rather elucidates the need for adaptation, noted Joe Casola, director of the science and impacts program at the Center for Climate and Energy Solutions, an environmental think tank in Arlington, Va. "I think that both of them are going to be required," he told NBC News. "The more mitigation we do, the less adaptation we'll need," added Casola, who was not involved with the Science article. "If we were to do no mitigation, then our adaptation investments now are probably going to be made worthless or very, very limited as we have a lot of changes to deal with in the climate system." Institute of adaptation: Key to making adaptation work is an acknowledgement that climate is just one factor planners are forced to wrestle with as they make decisions, noted Moss, whose lab is funded by the Department of Energy.

Only adaptation solves


IPCC 14 (IPCC, Adaptation to Global Warming - IPCC Report, Climate UU, 2014, http://climate.uuuno.org/topics/view/23692/)

Adaptation to climate change / global warming will be needed along with mitigation. Because
impacts are already being observed and because the ocean takes a long time to
heat up, mitigation - although crucial - will not be enough. Adaptation unfortunately will
not be a simple matter. The human race is no longer composed of small groups of hunter-gatherers, but
billions of people generally living in highly arranged societies with limited mobility. The worse the impacts
of global warming, the more difficult adaptation will be. We should start planning for
adaptation now, along with mitigation to try to lessen the impact of global warming.
However, adaptation should not be used as an excuse not to mitigate global
warming. The most complete compendium on adaptation is the 2014 IPCC Report , Vol
WGII (Impacts, Adaptation and Vulnerability).

Mitigation and adaptation are possible


Pearson 11 (Charles S. Pearson, Senior Adjunct Professor of International Economics and Environment at
the Diplomatic Academy of Vienna and Professor Emeritus at the School of Advanced International Studies (SAIS),
Johns Hopkins University, Washington, DC, Economics and the Challenge of Global Warming, pg. 106, 2011,
http://books.google.com/books?id=MDFQX2w7N5EC&pg=PA106&lpg=PA106&dq=%22global+warming%22+AND+
%22no+adaptation%22&source=bl&ots=Ki4SHTBz5&sig=NQs_hBA1qjfowQITf0GsEZeXC_g&hl=en&sa=X&ei=qNu9U8riMIqpyASvYCADA&ved=0CEAQ6AEwBA#v=onepage&q=no%20adaptation&f=false)

Bosello et al. (2010) also conclude that mitigation and adaptation are strategic complements. They fold into an optimization model three types of adaptation measures - anticipatory, reactive, and adaptation R&D - to sort out the relative contributions and timing of mitigation and adaptation expenditures in an optimal climate policy. The results appear to depend heavily on two factors: uncertainty regarding a catastrophic outcome, and the discount rate. The lower the discount rate and the greater the concern for catastrophic outcomes, the greater the role for mitigation relative to adaptation. These are not surprising because of the delay between expenditure and damages prevented with mitigation, and because adaptation copes poorly with catastrophic outcomes. They also conclude that expenditures should be tilted toward mitigation in the short run. These analyses are informative, but one should remember we are far from a world in which governments cooperate in a globally efficient mitigation effort, where all adaptation opportunities are recognized and acted on, and where allocations for mitigation and adaptation expenses are drawn from a single purse.

No Extinction
Warming won't lead to complete extinction
Green 11 (Roedy, PhD from British Columbia, Extinction of Man,
http://mindprod.com/environment/extinction.html//umich-mp)

Mankind is embarking on a strange ecological experiment. Over a couple of centuries, man is burning the carbon accumulated over millions of years by plants. The CO2 levels are now at the level of the Permian extinction. There have been two mass extinctions in earth history; the Permian, 230 million years ago, was the worst. 70% of all species were lost. It was caused by natural global warming when volcanoes released greenhouse gases. (The other extinction event more familiar to most people was the more recent KT Cretaceous-Tertiary Mass Extinction event, 65 million years ago. It was caused when an asteroid plunged into the earth at Chicxulub Mexico wiping out the dinosaurs and half of earth's species.) We are re-experiencing the same global warming conditions that triggered the more devastating Permian extinction, only this time it is man made. When it gets too hot, plants die. When it gets too hot and dry, massive fires ravage huge areas. When plants die, insects and herbivores die. When insects die, even heat-resistant plants don't get pollinated and die. Birds die without insects to eat. Carnivores die without herbivores to eat, all triggered by what seems so innocuous heat. Similarly, in the oceans, when they get just a few degrees too warm, corals expel their symbiotic algae and die soon thereafter. When coral reefs die, the fish that live on them die, triggering extinction chains. Satellites can chart the loss of vegetation over the planet. We are losing 4 species per hour, a rate on the same scale as the Permian and KT extinction events. Man has no ability to live without the support of other species. We are committing suicide and killing the family of life on earth along with us. The question is, will we wipe ourselves out along with the rest of the planet's ecology? Man is very adaptable. He will destroy his food supply on land and in the oceans as a result, but some people will survive. That is not complete extinction.

Their impacts are all empirically denied: past temperatures were substantially warmer than the present
Idso and Idso in 7 (Sherwood, Research Physicist @ US Water Conservation laboratory, and Craig,
President of Center for the Study of Carbon Dioxide and Global change and PhD in Geography, Carbon Dioxide and
Global Change: Separating Scientific Fact from Personal Opinion, 6-6,
http://www.co2science.org/education/reports/hansen/HansenTestimonyCritique.pdf)

In an attempt to depict earth's current temperature as being extremely high and, therefore, extremely dangerous, Hansen focuses almost exclusively on a single point of the earth's surface in the Western Equatorial Pacific, for which he and others (Hansen et al., 2006) compared modern sea surface temperatures (SSTs) with paleo-SSTs that were derived by Medina-Elizade and Lea (2005) from the Mg/Ca ratios of shells of the surface-dwelling planktonic foraminifer Globigerinoides ruber that they obtained from an ocean sediment core. In doing so, they concluded that this critical ocean region, and probably the planet as a whole [our italics], is approximately as warm now as at the Holocene maximum and within ~1°C of the maximum temperature of the past million years [our italics]. Is there any compelling reason to believe these claims of Hansen et al. about the entire planet? In a word, no, because there are a multitude of other single-point measurements that suggest something vastly different. Even in their own paper, Hansen et al. present data from the Indian Ocean that indicate, as best we can determine from their graph, that SSTs there were about 0.75°C warmer than they are currently some 125,000 years ago during the prior interglacial. Likewise, based on data obtained from the Vostok ice core in Antarctica, another of their graphs suggests that temperatures at that location some 125,000 years ago were about 1.8°C warmer than they are now; while data from two sites in the Eastern Equatorial Pacific indicate it was approximately 2.3 to 4.0°C warmer compared to the present at about that time. In fact, Petit et al.'s (1999) study of the Vostok ice core demonstrates that large periods of all four of the interglacials that preceded the Holocene were more than 2°C warmer than the peak warmth of the current interglacial. But we don't have to go nearly so far back in time to demonstrate the non-uniqueness of current temperatures. Of the five SST records that Hansen et al. display, three of them indicate the mid-Holocene was also warmer than it is today. Indeed, it has been known for many years that the central portion of the current interglacial was much warmer than its latter stages have been. To cite just a few examples of pertinent work conducted in the 1970s and 80s based on temperature reconstructions derived from studies of latitudinal displacements of terrestrial vegetation (Bernabo and Webb, 1977; Wijmstra, 1978; Davis et al., 1980; Ritchie et al., 1983; Overpeck, 1985) and vertical displacements of alpine plants (Kearney and Luckman, 1983) and mountain glaciers (Hope et al., 1976; Porter and Orombelli, 1985), we note it was concluded by Webb et al. (1987) and the many COHMAP Members (1988) that mean annual temperatures in the Midwestern United States were about 2°C greater than those of the past few decades (Bartlein et al., 1984; Webb, 1985), that summer temperatures in Europe were 2°C warmer (Huntley and Prentice, 1988), as they also were in New Guinea (Hope et al., 1976), and that temperatures in the Alps were as much as 4°C warmer (Porter and Orombelli, 1985; Huntley and Prentice, 1988). Likewise, temperatures in the Russian Far East are reported to have been from 2°C (Velitchko and Klimanov, 1990) to as much as 4-6°C (Korotky et al., 1988) higher than they were in the 1970s and 80s; while the mean annual temperature of the Kuroshio Current between 22 and 35°N was 6°C warmer (Taira, 1975). Also, the southern boundary of the Pacific boreal region was positioned some 700 to 800 km north of its present location (Lutaenko, 1993). But we needn't go back to even the mid-Holocene to encounter warmer-than-present temperatures, as the Medieval Warm Period, centered on about AD 1100, had lots of them. In fact, every single week since 1 Feb 2006, we have featured on our website (www.co2science.org) a different peer-reviewed scientific journal article that testifies to the existence of this several-centuries-long period of notable warmth, in a feature we call our Medieval Warm Period Record of the Week. Also, whenever it has been possible to make either a quantitative or qualitative comparison between the peak temperature of the Medieval Warm Period (MWP) and the peak temperature of the Current Warm Period (CWP), we have included those results in the appropriate quantitative or qualitative frequency distributions we have posted within this feature; and a quick perusal of these ever-growing databases (reproduced below as of 23 May 2007) indicates that, in the overwhelming majority of cases, the peak warmth of the Medieval Warm Period was significantly greater than the peak warmth of the Current Warm Period.

Global warming is nearing its end


Solomon 09 (Lawrence Solomon, writer for the Financial Post, The end is near; The media, polls and even
scientists suggest the global warming scare is all over but the shouting, National Post, October 3, 2009,
http://www.lexisnexis.com/hottopics/lnacademic/)

The great global warming scare is over -- it is well past its peak, very much a spent force, sputtering in fits and starts to a whimpering end. You may not know this yet. Or rather, you may know it but don't want to acknowledge it until everyone else does, and that won't happen until the press, much of which also knows it, formally acknowledges it. I know that the global warming scare is over but for the shouting because that's what the polls show, at least those in the U.S., where unlike Canada the public is polled extensively on global warming. Most Americans don't blame humans for climate change -- they consider global warming to be a natural phenomenon. Even when the polls showed the public believed man was responsible for global warming, the public didn't take the scare seriously. When asked to rank global warming's importance compared to numerous other concerns -- unemployment, trade, health care, poverty, crime, and education among them -- global warming came in dead last. Fewer than 1% chose global warming as scare-worthy. The informed members of the media read those polls and know the global warming scare is over, too. Andrew Revkin, The New York Times reporter entrusted with the global warming scare beat, has for months lamented "the public's waning interest in global warming." His colleague at the Washington Post, Andrew Freedman, does his best to revive public fear, and to get politicians to act, by urging experts to up their hype so that the press will have scarier material to run with. The experts do their best to give us the willies. This week they offered up plagues of locusts in China and a warning that the 2016 Olympics "could be the last for mankind" because "the earth has passed the point of no return." But the press has also begun to tire of Armageddon All-The-Time, and (I believe) to position itself for its inevitable attack on the doomsters. In an online article in June entitled "Massive Estimates of Death are in Vogue for Copenhagen," Richard Cable of the BBC, until then the most stalwart of scaremongers, rattled off the global warnings du jour -- they included a comparison of global warming to nuclear war and a report from the former Secretary General of the UN, Kofi Annan, to the effect that "every year climate change leaves over 300,000 people dead, 325-million people seriously affected, and economic losses of US $125-billion." Cable's conclusion: "The problem is that once you've sat up and paid attention enough to examine them a bit more closely, you find that the means by which the figures were arrived at isn't very compelling... The report contains so many extrapolations derived from guesswork based on estimates inferred from unsuitable data." The scientist-scare-mongers, seeing the diminishing returns that come of their escalating claims of catastrophe, also know their stock is falling. Until now, they have all toughed it out when the data disagreed with their findings -- as it does on every major climate issue, without exception. Some scientists, like Germany's Mojib Latif, have begun to break ranks. Frustrated by embarrassing questions about why the world hasn't seen any warming over the last decade, Latif, a tireless veteran of the public speaking circuits, now explains that global warming has paused, to resume in 2020 or perhaps 2030. "People understand what I'm saying but then basically wind up saying, 'We don't believe anything,'" he told The New York Times this week. And why should they believe anything that comes from the global warming camp? Not only has the globe not warmed over the last decade but the Arctic ice is returning, the Antarctic isn't shrinking, polar bear populations aren't diminishing, hurricanes aren't becoming more extreme. The only thing that's scary about the science is the frequency with which doomsayer data is hidden from public scrutiny, manipulated to mislead, or simply made up. None of this matters anymore, I recently heard at the Global Business Forum in Banff, where a fellow panelist from the Pew Centre on Global Climate Change told the audience that, while she couldn't dispute the claims I had made about the science being dubious, the rights and wrongs in the global warming debate are no longer relevant. "The train has left the station," she cheerily told the business audience, meaning that the debate is over, global warming regulations are coming in, and everyone in the room -- primarily business movers and shakers from Western Canada -- had better learn to adapt.

CO2 GOOD (AG DA)

1NC (USE AS CASE TURNS)


High levels of CO2 are key to environmental prosperity
Carlisle 01 (John Carlisle, national policy analyst, Carbon Dioxide is Good for the Environment, National
Policy Analysis, April 2001, http://www.nationalcenter.org/NPA334.html)

Carbon dioxide is good for the environment. That simple fact must be restated to counter environmentalists' baseless allegations that the accumulation of man-made carbon dioxide, produced by cars, power plants and other human activities, is causing dangerous global warming. Indeed, far from being a poisonous gas that will wreak havoc on the planet's ecosystem, carbon dioxide is arguably the Earth's best friend in that trees, wheat, peanuts, flowers, cotton and numerous other plants significantly benefit from increased levels of atmospheric carbon dioxide. Dr. Craig Idso of the Center for the Study of Carbon Dioxide and Global Change, one of the nation's leading carbon dioxide research centers, examined records of atmospheric carbon dioxide concentrations and air temperature over the last 250,000 years. There were three dramatic episodes of global warming that occurred at the end of the last three ice ages. Interestingly, temperatures started to rise during those warming periods well before the atmospheric carbon dioxide started to increase. In fact, the carbon dioxide levels did not begin to rise until 400 to 1,000 years after the planet began to warm. Concludes Dr. Idso, "Clearly, there is no way that these real-world observations can be construed to even hint at the possibility that a significant increase in atmospheric carbon dioxide will necessarily lead to any global warming."1 On the other hand, scientists have lots of evidence demonstrating that increased carbon dioxide levels leads to healthier plants. A team of scientists in Nevada conducted a five-year experiment in which they grew one group of ponderosa pine trees at the current carbon dioxide atmospheric level of about 360 parts per million (ppm) and another group of pines at 700 ppm. The doubled carbon dioxide level increased tree height by 43 percent and diameter by 24 percent. Similarly, a team of scientists from Virginia Tech University reported that growing loblolly pine trees in a greenhouse with a carbon dioxide concentration of 700 ppm increased average tree height 9 percent, diameter by 7 percent, needle biomass by 16 percent and root biomass by 33 percent.2 Increased atmospheric carbon dioxide doesn't just make a plant bigger. Carbon dioxide also makes plants more resistant to extreme weather conditions. In a study discussed in the journal Plant Ecology, a team of scientists subjected the Mojave Desert evergreen shrub to three different concentrations of carbon dioxide - the current level of 360 ppm and at 550 ppm and 700 ppm. The plants, which were being grown in simulated drought conditions, responded more favorably in the carbon dioxide-rich environments. Photosynthetic activity doubled in the 550 ppm environment and tripled at 700 ppm. Increased photosynthetic activity enables plants to withstand drought better.3 Likewise, a team of biologists grew seedlings of three yucca plants in cooler greenhouse environments at the 360 ppm and 700 ppm concentrations. The yucca plants exposed to the enhanced carbon dioxide concentration showed a greater resistance to the colder temperatures. Dr. Robert Balling, a climatologist at Arizona State University, notes that by making plants healthier and more resistant to extreme weather conditions, higher levels of atmospheric carbon dioxide expands the habitat of many plants, improves rangeland in semi-arid areas and enhances agricultural productivity in arid areas.4 Another benefit of enhanced atmospheric carbon dioxide is that it helps the tropical rainforests. Scientists from Venezuela and the United Kingdom grew several species of tropical trees and other plants in greenhouse conditions at carbon dioxide concentrations double the current level. The plants responded favorably, showing an increase in photosynthetic activity. The scientists concluded that, "In a future atmosphere with a higher carbon dioxide concentration, these species should be able to show a higher productivity than today."5 Another team of British and New Zealand researchers grew tropical trees for 119 days at elevated levels of atmospheric carbon dioxide. They found that the enriched carbon dioxide environment stimulated the trees' root growth by 23 percent. Expanded root systems help tropical trees by increasing their ability to absorb water and nutrients.6 Bigger trees, increased resistance to bad weather, improved agricultural productivity and a boon to rainforests are just some of the many benefits that carbon dioxide bestows on the environment. With little evidence that carbon dioxide triggers dangerous global warming but lots of evidence showing how carbon dioxide helps the environment, environmentalists should be extolling the virtues of this benign greenhouse gas.

CO2 good for the environment


Idso and Idso 10 (Keith Idso, Vice President of the Center for the Study of Carbon Dioxide and Global
Change, Craig Idso, founder, former president and current chairman of the board of the Center for the Study of
Carbon Dioxide and Global Change, Feeding the Future World, CO2 Science, September 29, 2010,
http://www.co2science.org/articles/V13/N39/EDIT.php)
In terms of confronting this daunting challenge, Zhu et al. say that "meeting future increases in demand will have to come from a near doubling of productivity on a land area basis," and they opine that "a large contribution will have to come from improved photosynthetic conversion efficiency," for which they estimate that "at least a 50% improvement will be required to double global production." The researchers' reason for focusing on photosynthetic conversion efficiency derives from the experimentally-observed facts that (1) increases in the atmosphere's CO2 concentration increase the photosynthetic rates of nearly all plants, and that (2) those rate increases generally lead to equivalent -- or only slightly smaller -- increases in plant productivity on a land area basis, thereby providing a solid foundation for their enthusiasm in this regard. In their review of the matter, however, they examine the prospects for boosting photosynthetic conversion efficiency in an entirely different way: by doing it genetically and without increasing the air's CO2 content. So what is the likelihood that their goal can be reached via this approach?

High levels of atmospheric CO2 spur plant growth


Idso and Idso 14 (Dr. Sherwood B. Idso, president of the Center for the Study of Carbon Dioxide and
Global Change, Dr. Keith E. Idso, Vice President of the Center for the Study of Carbon Dioxide and Global Change,
CO2 to the Rescue ... Again!, CO2 Science, 2014, http://www.co2science.org/articles/V5/N22/COM.php)

Atmospheric CO2 enrichment has long been known to help earth's plants withstand the debilitating effects of various environmental stresses, such as high temperature, excessive salinity levels and deleterious air pollution, as well as the negative consequences of certain resource limitations, such as less than optimal levels of light, water and nutrients (Idso and Idso, 1994). Now, in an important new study, Johnson et al. (2002) present evidence indicating that elevated levels of atmospheric CO2 do the same thing for soil microbes in the face of the enhanced receipt of solar ultraviolet-B radiation that would be expected to occur in response to a 15% depletion of the earth's stratospheric ozone layer. In addition, their study demonstrates that this phenomenon will likely have important consequences for soil carbon sequestration. Johnson et al. conducted their landmark work on experimental plots of subarctic heath located close to the Abisko Scientific Research Station in Swedish Lapland (68.35°N, 18.82°E). The plots they studied were composed of open canopies of Betula pubescens ssp. czerepanovii and dense dwarf-shrub layers containing scattered herbs and grasses. For a period of five years, the scientists exposed the plots to factorial combinations of UV-B radiation - ambient and that expected to result from a 15% stratospheric ozone depletion - and atmospheric CO2 concentration - ambient (around 365 ppm) and enriched (around 600 ppm) - after which they determined the amounts of microbial carbon (Cmic) and nitrogen (Nmic) in the soils of the plots. When the plots were exposed to the enhanced UV-B radiation level expected to result from a 15% depletion of the planet's stratospheric ozone layer, the researchers found that the amount of Cmic in the soil was reduced to only 37% of what it was at the ambient UV-B level when the air's CO2 content was maintained at the ambient concentration. When the UV-B increase was accompanied by the CO2 increase, however, not only was there not a decrease in Cmic, there was an actual increase of fully 37%. The story with respect to Nmic was both similar and different at one and the same time. In this case, when the plots were exposed to the enhanced level of UV-B radiation, the amount of Nmic in the soil experienced a 69% increase when the air's CO2 content was maintained at the ambient concentration. When the UV-B increase was accompanied by the CO2 increase, however, Nmic rose even more, experiencing a whopping 138% increase. These findings, in the words of Johnson et al., "may have far-reaching implications ... because the productivity of many semi-natural ecosystems is limited by N (Ellenberg, 1988)." Hence, the 138% increase in soil microbial N observed in this study to accompany a 15% reduction in stratospheric ozone and a concomitant 64% increase in atmospheric CO2 concentration (experienced in going from 365 ppm to 600 ppm) should do wonders in enhancing the input of plant litter to the soils of these ecosystems, which phenomenon represents the first half of the carbon sequestration process, i.e., the carbon input stage.

Biodiversity

Mapping Can't Solve

Can't solve: too many regional differences for one machine to cover
Rosenfeld 12 (Dr. Leslie Rosenfeld is currently a Research Associate Professor at the Naval Postgraduate School and an oceanographic
consultant. She has a Ph.D. in Physical Oceanography, and is a recognized expert on the California Current System, specializing in circulation over the
continental shelf. December 2012. Synthesis of Regional IOOS Build-out Plans for the Next Decade from the Integrated Ocean Observing System
Association.
http://www.ioosassociation.org/sites/nfra/files/documents/ioos_documents/regional/BOP%20Synthesis%20Final.pdf July 6, 2014.)

Types of hazards and needed products have many common elements throughout
the country but also unique regional differences. For example, the Gulf of Mexico,
the Caribbean and the southeastern U.S. experience more frequent and more
intense hurricanes than other regions of the country. Substantial improvements in
the NWS forecasts of storm intensity, track, and timing of passage are necessary
for timely evacuations of communities and offshore facilities in the path of the
storm, while avoiding evacuations that are unnecessary. Long-term plans for
these regions include close coordination of data products with the needs of
hurricane modelers, including providing information on ocean heat content via air-deployed sensors during hurricane approach and passage, and autonomous
underwater vehicles to monitor the water column. With the rapid loss of sea ice,
Arctic weather and ocean conditions are increasingly endangering Alaska Native
coastal communities. In a statewide assessment, flooding and erosion affects 184
out of 213 Native villages (GAO, 2003). This presents unique challenges in
forecasting and effectively communicating conditions for small communities in
isolated locations. These and many other types of regional differences must be
considered when tailoring and refining common information needs.

Regional differences just make it too hard for the aff to solve in its entirety
Rosenfeld 12 (Dr. Leslie Rosenfeld is currently a Research Associate Professor at the Naval Postgraduate School and an oceanographic
consultant. She has a Ph.D. in Physical Oceanography, and is a recognized expert on the California Current System, specializing in circulation over the
continental shelf. December 2012. Synthesis of Regional IOOS Build-out Plans for the Next Decade from the Integrated Ocean Observing System
Association.
http://www.ioosassociation.org/sites/nfra/files/documents/ioos_documents/regional/BOP%20Synthesis%20Final.pdf July 6, 2014.)

Water quality issues also exhibit a variety of regionally unique differences. For
example, water quality contamination in the Great Lakes is not only an issue for
wildlife, but also for 40 million consumers of public drinking water in the region. To
address this concern, GLOS will provide a decision support tool to track the
movement of drinking water contaminants, and will provide model output that
helps county officials manage drinking water intake systems to avoid
contamination. Tracking of water quality plumes and impacts also has unique
challenges in the Gulf of Mexico. Drainage from urban development, industries
and farmland comes into the Gulf from the enormous Mississippi River watershed,
covering 1,245,000 square miles and 41% of the 48 contiguous states of the U.S.
This drainage leads to hypoxic conditions (low dissolved oxygen content) in
summertime over the Texas-Louisiana shelf waters, and delivers large loads of
sediment and associated pollutants to nearshore environments such as recreational
waters and shellfish beds. Informed management decisions require a broad
distribution of platforms, sensors and derived products to track this plume and its
impacts through estuaries, nearshore and offshore habitats.

Mapping can't solve: too difficult to perfect and maintain


Rosenfeld 12 (Dr. Leslie Rosenfeld is currently a Research Associate Professor at the Naval Postgraduate School and an oceanographic
consultant. She has a Ph.D. in Physical Oceanography, and is a recognized expert on the California Current System, specializing in circulation over the
continental shelf. December 2012. Synthesis of Regional IOOS Build-out Plans for the Next Decade from the Integrated Ocean Observing System
Association.
http://www.ioosassociation.org/sites/nfra/files/documents/ioos_documents/regional/BOP%20Synthesis%20Final.pdf July 6, 2014.)

Effective utilization of the observing system requires translation of the data and
model outputs into targeted products and decision support tools for a variety of
users. The process of product development ultimately links management
decisions to desired products, to the information needed to produce the products,
to data and models required to create that information and finally to the essential
observing system requirements. Successful engagement with users and the
resultant product development is at the heart of ultimate success of the 10-year
build-out plan. The 27 key products identified as part of the build-out plans include
products that have already been successfully developed and utilized by one or
more RAs as well as many cases where development is either not underway or is in
the early stages. Successful completion of development for these 27 products for
all the regions, and completion of additional unique regional products will involve
multiple components, as outlined below. Sufficient funding needs to be identified to
carry the process for a given product need through to completion, to ensure
effective management and meeting of user expectations. Iterative two-way
engagement with users is required for effective product development, often based
on small group discussions with key decision makers and technical staff in the user
community. It may also include surveys, user advisory panels, targeted workshops,
and discussions at existing forums of the targeted users. This provides an
understanding of the users' objectives, the scope of the issue and the decision
processes they use. Those user decisions or management needs that could most
benefit from IOOS products can then be identified and prioritized, including
products involving data, processed information, visualizations, models and decision
support tools. Product development teams focused on specific priority needs
should be created, including RA staff, key partners and users. The RAs should act
as intermediaries that can translate between researchers and users, and should
stay engaged throughout the process. After fully understanding user needs, the
teams evaluate the variables, temporal and spatial scale and resolution, and data
quality required to meet those needs. They evaluate current coastal IOOS, or
other, products that could be used or modified, or new products that could be
developed to meet needs. This should include evaluation across all 11 RAs and
federal agencies to identify useful building blocks and approaches for products.
Identification of any gaps between the users' information requirements and current
coastal IOOS capabilities can then be identified, and, where possible, technical
means to fill them can be developed. This should include engaging partners who
may provide critical data or fulfill specific modeling needs, and pursuit of funding
sources to fill gaps, if not already obtained. In addition to addressing technical
issues, the team should identify any institutional issues among users that must be
overcome to fully utilize IOOS products, and where possible work with them to
develop approaches to overcome these barriers. Institutional issues include agency
and industry policies and practices, regulations and permit procedures,
perspectives and communication patterns, staff expertise, training and workload,
and/or governance structures that could impede full use of IOOS information in
decision-making. Addressing these issues will likely require engagement with
management representatives of the user agency or organization. Development of
the initial product includes software development to access and integrate data, and any necessary documentation and instructions, including description of the metadata. Testing and evaluation of the product must be conducted with a subset
of the user group, followed by training for a larger user group, along with
refinement of the product as needed. Product development also includes
evaluation of the most effective means to package and distribute the information,
e.g. through websites, e-mail notifications, use of existing channels already favored
by the user, reports, etc. Product release should be prefaced by an evaluation to
ensure that the product is ready to move from the developmental stage to full
operational use and broad implementation, and a notification system for the
targeted user group. Operational implementation of the product should include
ongoing engagement with the users to adapt and modify products as needed as
experience builds with product use, conditions or requirements change, or new
data becomes available. As is evident from the above list, development of
effective user products is a critical but costly and time-consuming endeavor, and
often includes work that is different from what is involved in the modeling and data
management subsystems. Staff needs for product development and user
engagement typically range from five to six full-time equivalents (FTEs) for each RA,
and involve multiple types of staff including technical, outreach and management
needed at various times during the activities outlined above. Meeting the need for
adequate product development funding and staff time, and achieving coordination
and synergism among the RAs in creating the common products will be an
essential component of success in implementing the build-out plan over the next
10 years.

SQ Solves-Mapping
Data insufficiency is being fixed in the status quo; the plan isn't needed
Piotroski 14 (Jon Piotroski writes for SciDev.Net, a science and development news site, where he has written numerous articles on oceanography, climate change, and other environmental issues. 03/20/14 from SciDev. Tidal wave of ocean data leaves scientists swamped.
http://www.scidev.net/global/data/news/tidal-wave-of-ocean-data-leaves-scientists-swamped.html July 3, 2014)

A lack of data curators and managers capable of cleaning up observational


measurements, particularly in less developed nations, is limiting the scale and
scope of ocean research, researchers have said on the sidelines of an oceans
science meeting. To deal with the challenge, the Ocean Sciences Meeting in Hawaii
last month (23-28 February) heard proposals for creating a comprehensive, quality
controlled database. This would help overcome limited scientific capacity and allow
under-resourced researchers to better participate in efforts to understand pressing
issues such as climate change, scientists said. The complexities involved in
transforming raw oceanographic data into a useable state means researchers with
limited data skills or financial resources are less able to meet the necessary
standard, said Stephen Diggs, a data manager at the Scripps Institution of
Oceanography, in the United States. The explosion of ocean data being collected
through modern methods exceeds scientists' capacity to deal with it, he told
SciDev.Net on the fringes of the conference. There definitely needs to be
workforce development to get more data curators and managers that come out of
their training ready to deal with this, he said. Interdisciplinary work that is crucial
for answering questions such as what effect climate change is having on marine
environments is especially sensitive to reliability issues, he added. This is because,
when faced with data sets from various disciplines, researchers often lack the
expertise to spot and correct all the biases created by the idiosyncrasies of different
regions and collection methods, he said. Providing researchers with rigorously
quality controlled data from outside their immediate field was the motivation behind
a proposed initiative, which called for scientists' support at a town hall meeting
during the conference. The International Quality-Controlled Ocean Database
(IQuOD) would combine automated, large-scale data cleaning done by computers
with finer-grained expert analysis. The goal would be to produce a definitive
database of the data from the 13 million locations with sea surface temperature records -- some dating back to the eighteenth century -- scattered across various institutions. According to Diggs, this would significantly improve on the oceanographic database hosted at the US National Oceanic and Atmospheric Administration's (NOAA's) National Oceanographic Data Center. Even though NOAA only conducts limited automated quality control, its repository of marine datasets is the world's most comprehensive.
oxygen concentrations in the water, could be added to the temperature data as the
IQuOD project progresses, said Diggs.

IOOS and other tech already solve


National Academy 11 (The National Academies provide a public service by working outside the framework of government to ensure independent advice on matters of science, technology, and medicine. They enlist committees of the nation's top scientists, engineers, and other experts, all of whom volunteer their time to study specific concerns. 08/24/2011. Ocean Exploration from the National Academies. http://oceanleadership.org/wp-content/uploads/2009/08/Ocean_Exploration.pdf July 6, 2014).

U.S. and international efforts have made significant progress in recent years toward establishing ocean observatories. The Integrated Ocean Observing System (IOOS), the U.S. contribution to the international Global Ocean Observing System, is designed as a large network that collects high-resolution data along the entire U.S. coast. This information supports a wide range of operational services, including weather forecasts, coastal warning systems, and the monitoring of algal blooms. The Ocean Observatories Initiative (OOI), a National Science Foundation program, will enable land-based exploration and monitoring of processes throughout the seafloor, water column, and overlying atmosphere by real-time, remote interactions with arrays of sensors, instruments, and autonomous vehicles. For the first time, scientists, educators and the public will have real-time access to the data and imagery being collected, so that they can learn about events such as underwater volcanic eruptions and anoxia events as they happen, even if there are no ships in the area.

[Figure caption: The world's first underwater cabled observatory to span an entire plate will be installed as part of a collaborative effort between the U.S. Ocean Observatories Initiative (OOI) and U.S. and Canadian institutions. Fiber-optic cables will extend high bandwidth and power to a network of hundreds of sensors across, above, and below the seafloor, allowing in situ, interactive monitoring of the ocean and seafloor for the next 20 to 30 years. Image courtesy University of Washington Center for Environmental Visualization.]

SQ Conservation Solves
Status Quo Solves -- US already implementing conservation programs
to prevent impacts
Tullo 14 (Michelle Tullo is a writer for various news magazines, in particular IPS. IPS is a reputable communication institution that specializes in
global news. U.S. Turns Attention to Ocean Conservation, Food Security from IPS (Inter Press Service) on 06/19/14. http://www.ipsnews.net/2014/06/u-sturns-attention-to-ocean-conservation-food-security/ July 7, 2014.)

A first-time U.S.-hosted summit on protecting the oceans has resulted in pledges


worth some 800 million dollars to be used for conservation efforts. During the
summit, held here in Washington, the administration of President Barack Obama
pledged to massively expand U.S.-protected parts of the southern Pacific Ocean. In
an effort to strengthen global food security, the president has also announced a
major push against illegal fishing and to create a national strategic plan for
aquaculture. "If we drain our resources, we won't just be squandering one of
humanity's greatest treasures, we'll be cutting off one of the world's leading
sources of food and economic growth, including for the United States," President
Obama said via video Tuesday morning. The Our Ocean conference, held Monday
and Tuesday at the U.S. State Department, brought together ministers, heads of
state, as well as civil society and private sector representatives from almost 90
countries. The summit, hosted by Secretary of State John Kerry, focused on
overfishing, pollution and ocean acidification, all of which threaten global food
security. In his opening remarks, Kerry noted that ocean conservation constitutes a
great necessity for food security. More than three billion people, 50 percent of
the people on this planet, in every corner of the world depend on fish as a
significant source of protein, he said. Proponents hope that many of the solutions
being used by U.S. scientists, policymakers and fishermen could serve to help
international communities. There is increasing demand for seafood with
diminished supply We need to find ways to make seafood sustainable to rich and
poor countries alike, Danielle Nierenberg, the president of FoodTank, a Washington
think tank, told IPS.

No Impact-Species
No species snowball
Roger A. Sedjo 2k, Sr. Fellow, Resources for the Future, Conserving Nature's Biodiversity: insights from biology, ethics & economics, eds. Van Kooten, Bulte and Sinclair, p. 114
As a critical input into the existence of humans and of life on earth, biodiversity obviously has a very high value (at least to humans). But, as with other resource questions, including public goods, biodiversity is not an either/or question, but rather a question of how much. Thus, we may argue as to how much biodiversity is desirable or is required for human life (threshold) and how much is desirable (insurance) and at what price, just as societies argue over the appropriate amount and cost of national defense. As discussed by Simpson, the value of water is small even though it is essential to human life, while diamonds are inessential but valuable to humans. The reason has to do with relative abundance and scarcity, with market value pertaining to the marginal unit. This water-diamond paradox can be applied to biodiversity. Although biological diversity is essential, a single species has only limited value, since the global system will continue to function without that species. Similarly, the value of a piece of biodiversity (e.g., 10 ha of tropical forest) is small to negligible since its contribution to the functioning of the global biodiversity is negligible. The global ecosystem can function with somewhat more or somewhat less biodiversity, since there have been larger amounts in times past and some losses in recent times. Therefore, in the absence of evidence to indicate that small habitat losses threaten the functioning of the global life support system, the value of these marginal habitats is negligible. The value question is that of how valuable to the life support function are species at the margin. While this, in principle, is an empirical question, in practice it is probably unknowable. However, thus far, biodiversity losses appear to have had little or no effect on the functioning of the earth's life support system, presumably due to the resiliency of the system, which perhaps is due to the redundancy found in the system. Through most of its existence, earth has had far less biological diversity. Thus, as in the water-diamond paradox, the value of the marginal unit of biodiversity appears to be very small.

No extinction
Easterbrook 2003
Gregg, senior fellow at the New Republic, "We're All Gonna Die!" http://www.wired.com/wired/archive/11.07/doomsday.html?pg=2&topic=&topic_set=
If we're talking about doomsday - the end of human civilization - many scenarios simply don't measure up. A single nuclear bomb ignited by terrorists, for example, would be awful beyond words, but life would go on. People and machines might converge in ways that you and I would find ghastly, but from the standpoint of the future, they would probably represent an adaptation. Environmental collapse might make parts of the globe unpleasant, but considering that the biosphere has survived ice ages, it wouldn't be the final curtain. Depression, which has become 10 times more prevalent in Western nations in the postwar era, might grow so widespread that vast numbers of people would refuse to get out of bed, a possibility that Petranek suggested in a doomsday talk at the Technology Entertainment Design conference in 2002. But Marcel Proust, as miserable as he was, wrote Remembrance of Things Past while lying in bed.

No Impact-Exaggerated
Environmental threats exaggerated
Gordon 95 - a professor of mineral economics at Pennsylvania State University
[Gordon, Richard, "Ecorealism Exposed," Regulation, 1995, http://www.cato.org/pubs/regulation/regv18n3/reg18n3-readings.html]
Easterbrook's argument is that although environmental problems deserve attention, the environmental movement has exaggerated the threats and ignored evidence of improvement. His discontent causes him to adopt and incessantly employ the pejoratively intended (and irritating) shorthand "enviros" to describe the leading environmental organizations and their admirers. He proposes-and overuses-an equally infelicitous alternative phrase, "ecorealism," that seems to mean that most environmental initiatives can be justified by more moderate arguments. Given the mass, range, and defects of the book, any review of reasonable length must be selective. Easterbrook's critique begins with an overview of environmentalism from a global perspective. He then turns to a much longer (almost 500-page) survey of many specific environmental issues. The overview section is a shorter, more devastating criticism, but it is also more speculative than the survey of specific issues. In essence, the overview argument is that human impacts on the environment are minor, easily correctable influences on a world affected by far more powerful forces. That is a more penetrating criticism than typically appears in works expressing skepticism about environmentalism. Easterbrook notes that mankind's effects on nature long predate industrialization or the white colonization of America, but still have had only minor impacts. We are then reminded of the vast, often highly destructive changes that occur naturally and the recuperative power of natural systems.

Prefer our evidence - Environmental apocalypse scenarios are always overblown and recent human advancements solve.
Ronald Bailey 2k, science correspondent, author of Earth Report 2000: Revisiting the True State of the Planet, former Brookes Fellow in Environmental Journalism at the Competitive Enterprise Institute, member of the Society of Environmental Journalists, adjunct scholar at the Cato Institute, May 2000, Reason Magazine, "Earth Day, Then and Now," http://reason.com/0005/fe.rb.earth.shtml

Earth Day 1970 provoked a torrent of apocalyptic predictions. "We have about five more years at the outside to do something," ecologist Kenneth Watt declared to a Swarthmore College audience on April 19, 1970. Harvard biologist George Wald estimated that "civilization will end within 15 or 30 years unless immediate action is taken against problems facing mankind." "We are in an environmental crisis which threatens the survival of this nation, and of the world as a suitable place of human habitation," wrote Washington University biologist Barry Commoner in the Earth Day issue of the scholarly journal Environment. The day after Earth Day, even the staid New York Times editorial page warned, "Man must stop pollution and conserve his resources, not merely to enhance existence but to save the race from intolerable deterioration and possible extinction." Very Apocalypse Now. Three decades later, of course, the world hasn't come to an end; if anything, the planet's ecological future has never looked so promising. With half a billion people suiting up around the globe for Earth Day 2000, now is a good time to look back on the predictions made at the first Earth Day and see how they've held up and what we can learn from them. The short answer: The prophets of doom were not simply wrong, but spectacularly wrong. More important, many contemporary environmental alarmists are similarly mistaken when they continue to insist that the Earth's future remains an eco-tragedy that has already entered its final act. Such doomsters not only fail to appreciate the huge environmental gains made over the past 30 years, they ignore the simple fact that increased wealth, population, and technological innovation don't degrade and destroy the environment. Rather, such developments preserve and enrich the environment. If it is impossible to predict fully the future, it is nonetheless possible to learn from the past. And the best lesson we can learn from revisiting the discourse surrounding the very first Earth Day is that passionate concern, however sincere, is no substitute for rational analysis.

No Impact-Resilient
No impact --- Ocean ecosystems are resilient
CO2 Science 2008
Marine Ecosystem Response to "Ocean Acidification" Due to Atmospheric CO2
Enrichment, Vogt, M., Steinke, M., Turner, S., Paulino, A., Meyerhofer, M., Riebesell,
U., LeQuere, C. and Liss, P. 2008. Dynamics of dimethylsulphoniopropionate and
dimethylsulphide under different CO2 concentrations during a mesocosm
experiment. Biogeosciences 5: 407-419,
http://www.co2science.org/articles/V11/N29/B2.php
Vogt et al. report that they detected no significant phytoplankton species shifts between treatments, and that "the ecosystem composition, bacterial and phytoplankton abundances and productivity, grazing rates and total grazer abundance and reproduction were not significantly affected by CO2 induced effects," citing in support of this statement the work of Riebesell et al. (2007), Riebesell et al. (2008), Egge et al. (2007), Paulino et al. (2007), Larsen et al. (2007), Suffrian et al. (2008) and Carotenuto et al. (2007). In addition, they say that "while DMS stayed elevated in the treatments with elevated CO2, we observed a steep decline in DMS concentration in the treatment with low CO2," i.e., the ambient CO2 treatment. What it means: With respect to their many findings, the eight researchers say their observations suggest that "the system under study was surprisingly resilient to abrupt and large pH changes," which is just the opposite of what the world's climate alarmists characteristically predict about CO2-induced "ocean acidification." And that may be why Vogt et al. described the marine ecosystem they studied as "surprisingly resilient" to such change: it may have been a little unexpected.

Nature sustains damage and recovers.


Easterbrook 95 Distinguished Fellow, Fulbright Foundation (Gregg, A Moment on the Earth)
Nature is not ending, nor is human damage to the environment unprecedented. Nature has repelled forces of a magnitude many times greater than the worst human malfeasance. Nature is not ponderously slow. It's just old. Old and slow are quite different concepts. That the living world can adjust with surprising alacrity is the reason nature has been able to get old. Most natural recoveries from ecological duress happen with amazing speed. Significant human tampering with the environment has been in progress for at least ten millennia and perhaps longer. If nature has been interacting with genus Homo for thousands of years, then the living things that made it to the present day may be ones whose genetic treasury renders them best suited to resist human mischief. This does not ensure any creature will continue to survive any clash with humankind. It does make survival more likely than doomsday orthodoxy asserts. If nature's adjustment to the human presence began thousands of years ago, perhaps it will soon be complete. Far from reeling helplessly before a human onslaught, nature may be on the verge of reasserting itself. Nature still rules much more of the Earth than does genus Homo. To the statistical majority of nature's creatures the arrival of men and women goes unnoticed.

No Impact-long TF
Their impact has a three hundred year timeframe
CO2 Science 2005
The Fate of Fish in a High-CO2 World, Ishimatsu, A., Hayashi, M., Lee, K.-S., Kikkawa,
T. and Kita, J. 2005. Physiological effects of fishes in a high-CO2 world. Journal of
Geophysical Research 110: 10.1029/2004JC002564,
http://www.co2science.org/articles/V8/N42/B3.php
Although this conclusion sounds dire indeed, it represents an egregious flight of the
imagination in terms of what could realistically be expected to happen anytime in
earth's future. Ishimatsu et al. report, for example, that "predicted future CO2
concentrations in the atmosphere are lower than the known lethal concentrations
for fish," noting that "the expected peak value is about 1.4 torr [just under 1850 ppm] around the
year 2300 according to Caldeira and Wickett (2003)." So just how far below the lethal CO2
concentration for fish is 1.4 torr? In the case of short-term exposures on the order of
a few days, the authors cite a number of studies that yield median lethal
concentrations ranging from 37 to 50 torr, which values are 26 and 36 times greater
than the maximum CO2 concentration expected some 300 years from now!
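A quick arithmetic check of the card's numbers (an editorial sketch, assuming only the standard conversion 1 atm = 760 torr, so a partial pressure in torr becomes a mixing ratio when divided by 760):

$$\frac{1.4\ \mathrm{torr}}{760\ \mathrm{torr}} \approx 1.84\times 10^{-3} \approx 1840\ \mathrm{ppm}, \qquad \frac{37}{1.4} \approx 26, \qquad \frac{50}{1.4} \approx 36$$

This reproduces the card's "just under 1850 ppm" figure and its 26-fold and 36-fold margins between the projected peak and the lethal CO2 concentrations for fish.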

No solvency-Past Tipping Point


We're past the tipping point.
AFP, Agence France Presse, September 15, 1999, "Outlook Grim For World's Environment Says UN," http://www.rense.com/earthchanges/grimoutlook_e.htm
The United Nations warned Wednesday that the world's environment was facing catastrophic damage as the new millennium nears, ranging from irreversible destruction to tropical rainforests to choking air pollution and a threat to the polar ice caps. In a lengthy report, the UN Environment Programme painted a grim tableau for the planet's citizens in the next millennium, saying time was fast running out to devise a policy of sustainable human development. "And for some fragile eco-systems and vulnerable species, it is already too late," warns the report, called GEO-2000. Tropical forest destruction has gone too far to prevent irreversible damage. "It would take many generations to replace the lost forests, and the cultures that have been lost with them can never be replaced," it warns. Many of the planet's species have already been lost or condemned to extinction because of the slow response times of both the environment and policy-makers; it is too late to preserve all the bio-diversity the planet had. Sounding the alarm, the UNEP said the planet now faced "full-scale emergencies" on several fronts, including these: -- it is probably too late to prevent global warming, a phenomenon whereby exhaust gases and other emissions will raise the temperature of the planet and wreak climate change. Indeed, many of the targets to reduce or stabilise emissions will not be met, the report says. -- urban air pollution problems are reaching crisis dimensions in the developing world's mega-cities, inflicting damage to the health of their inhabitants. -- the seas are being grossly over-exploited and even with strenuous efforts will take a long time to recover.

Too many alt causes to solve


Pynn 07 Journalist for the Vancouver Sun [Larry Pynn. "Global warming not biggest threat: expert." The Vancouver Sun. January 27, 2007. http://www.canada.com/vancouversun/news/story.html?id=6e2988da-31ab-4697-810d-7a008306d571]
The biggest immediate threat to the vast majority of endangered species in Canada is not global warming, but issues such as habitat loss, pollution and overexploitation. "We all worry about climate change, as we should, but it doesn't mean we shouldn't worry about protecting habitat," says James Grant, a biology professor at Concordia University in Montreal and co-author of a new report on threats to endangered species in Canada. "The really immediate causes right now for many species are things like farming, urbanization and habitat loss caused by the direct things we do." Research by Grant and his pupils shows the biggest threat is habitat loss at 84 percent, overexploitation 32 percent, native species interactions 31 percent, natural causes 27 percent, pollution 26 percent, and introduced species 22 percent. On average, species are threatened by at least two of the six categories. Human activities representing the biggest source of habitat loss and pollution are not industrial resource extraction, but agriculture at 46 per cent and urbanization at 44 per cent. "Farming is huge," Grant said in an interview. "The Prairies are one of the most affected habitats in the world. We've turned them into wheat fields." The southern Okanagan-Similkameen is another example, home to about one-third of species at risk in B.C. as well as a thriving agricultural industry, including vineyards, and increased urban development.

No Solvency-Overpopulation swamps
Population growth makes biodiversity loss inevitable
Gaston 5 [Kevin J. Gaston, Biodiversity and Macroecology Group, Department of Animal and Plant Sciences, University of Sheffield. Progress in Physical Geography 29, 2 (2005) pp. 239-247. "Biodiversity and extinction: species and people" http://www.epa.gov/ncer/biodiversity/pubs/ppg_vol29_239.pdf //jweideman]
The human population is predicted to grow by 2 to 4 billion people by 2050 (United Nations, 2001). While it took until about 1800 to attain a global population of 1 billion people, a medium projection is that it may take just 13 to 14 years to add another billion to the present total (Cohen, 2003). All else remaining equal, which it seldom does, a number of predictions would seem to follow from the work that has been conducted to date on the relationships between human densities and the numbers of native species, numbers or proportions of threatened species, and the numbers or proportions of introduced species. First, the spatial scale at which relationships between overall numbers of native species and human density become hump-shaped or at least gain marked negative phases seems likely to increase, even when species numbers and human density are mapped at a low spatial resolution. Increased human densification will mean that the maintenance and conservation of tracts of natural or semi-natural vegetation will become more difficult in areas of higher human density. Secondly, the numbers and proportions of threatened species in different areas will tend to increase. McKee et al. (2003) have used existing relationships between numbers of threatened species and numbers of people in different areas to predict the consequences for biodiversity of continued increases in the human population. They found that the number of threatened bird and mammal species across 114 continental nations is expected to increase in the average nation by 7% by 2020 and 14% by 2050, on the basis of human population growth alone. Such aggregate estimates provide no indication of precisely what this is likely to do for the overall proportion of species that are globally threatened with extinction, but these increases can only serve to increase the 12% of bird species and the 23% of mammals currently listed as such (IUCN, 2003). Likewise, the proportion of species that have become globally extinct will increase.

No Impact-Tech Solves
Tech solves the impact
Stossel 2007
John, investigative reporter for Fox News, "Environmental Alarmists Have It Backwards," http://www.realclearpolitics.com/articles/2007/04/how_about_economic_progress_da.html
Watching the media coverage, you'd think that the earth was in imminent danger -- that human life itself was on the verge of extinction. Technology is fingered as the perp. Nothing could be further from the truth. John Semmens of Arizona's Laissez Faire Institute points out that Earth Day misses an important point. In the April issue of The Freeman magazine, Semmens says the environmental movement overlooks how hospitable the earth has become -- thanks to technology. "The environmental alarmists have it backwards. If anything imperils the earth it is ignorant obstruction of science and progress. ... That technology provides the best option for serving human wants and conserving the environment should be evident in the progress made in environmental improvement in the United States. Virtually every measure shows that pollution is headed downward and that nature is making a comeback." (Carbon dioxide excepted, if it is really a pollutant.) Semmens describes his visit to historic Lexington and Concord in Massachusetts, an area "lush with trees and greenery." It wasn't always that way. In 1775, the land was cleared so it could be farmed. Today, technology makes farmers so efficient that only a fraction of the land is needed to produce much more food. As a result, "Massachusetts farmland has been allowed to revert back to forest." Human ingenuity and technology not only raised living standards, but also restored environmental amenities. How about a day to celebrate that?

No Impact-Acidification
No ocean acidification problem - it's natural
Idso et al 2009
Sherwood, founder and former President of the Center for the Study of Carbon Dioxide and Global Change and currently serves as Chairman of the Center's board of directors, "The Ocean Acidification Fiction," Volume 12, Number 22: 3 June 2009
There is considerable current concern that the ongoing rise in the air's CO2 content is causing a significant drop in the pH of the world's oceans in response to their absorption of a large fraction of each year's anthropogenic CO2 emissions. It has been estimated, for example, that the globe's seawater has been acidified (actually made less basic) by about 0.1 pH unit relative to what it was in pre-industrial times; and model calculations imply an additional 0.7-unit drop by the year 2300 (Caldeira and Wickett, 2003), which decline is hypothesized to cause great harm to calcifying marine life such as corals. But just how valid are these claims? Whenever the results of theoretical calculations are proposed as the basis for a crisis of some kind or other, it is always good to compare their predictions against what is known about the phenomenon in the real world. In the case of oceanic pH, for example, Liu et al. (2009) write in an important new paper that "the history of ocean pH variation during the current interglacial (Holocene) remains largely unknown," and that it "would provide critical insights on the possible impact of acidification on marine ecosystems." Hence, they set about to provide just such a context. Working with eighteen samples of fossil and modern Porites corals recovered from the South China Sea, the nine researchers employed 14C dating using the liquid scintillation counting method, along with positive thermal ionization mass spectrometry to generate high precision δ11B (boron) data, from which they reconstructed the paleo-pH record of the past 7000 years. As can be seen from that record, there is nothing unusual, unnatural or unprecedented about the two most recent pH values. They are neither the lowest of the record, nor is the decline rate that led to them the greatest of the record. Hence, there is no compelling reason to believe they were influenced in any way by the nearly 40% increase in the air's CO2 concentration that has occurred to date over the course of the Industrial Revolution. As for the prior portion of the record, Liu et al. note that there is also "no correlation between the atmospheric CO2 concentration record from Antarctica ice cores and δ11B-reconstructed paleo-pH over the mid-late Holocene up to the Industrial Revolution." Further enlightenment comes from the earlier work of Pelejero et al. (2005), who developed a more refined history of seawater pH spanning the period 1708-1988, based on δ11B data obtained from a massive Porites coral from Flinders Reef in the western Coral Sea of the southwestern Pacific. These researchers also found that "there is no notable trend toward lower δ11B values." Instead, they discovered that "the dominant feature of the coral δ11B record is a clear interdecadal oscillation of pH, with δ11B values ranging between 23 and 25 per mil (7.9 and 8.2 pH units)," which they say "is synchronous with the Interdecadal Pacific Oscillation." Going one step further, Pelejero et al. also compared their results with coral extension and calcification rates obtained by Lough and Barnes (1997) over the same 1708-1988 time period; and as best we can determine from their graphical representations of these two coral growth parameters, extension rates over the last 50 years of this period were about 12% greater than they were over the first 50 years, while calcification rates were approximately 13% greater over the last 50 years. Most recently, Wei et al. (2009) derived the pH history of Arlington Reef (off the north-east coast of Australia). As can be seen there, there was a ten-year pH minimum centered at about 1935 (which obviously was not CO2-induced) and a shorter more variable minimum at the end of the record (which also was not CO2-induced); and apart from these two non-CO2-related exceptions, the majority of the data once again fall within a band that exhibits no long-term trend, such as would be expected to have occurred if the gradual increase in atmospheric CO2 concentration since the inception of the Industrial Revolution were truly making the global ocean less basic. In light of these several diverse and independent assessments of the two major aspects of the ocean acidification hypothesis -- a CO2-induced decline in oceanic pH that leads to a concomitant decrease in coral growth rate -- it would appear that the catastrophe conjured up by the world's climate alarmists is but a wonderful work of fiction.

Seagrasses solve the impact - acidification refuge.
CO2 Science 2013
"Seagrasses Enable Nearby Corals to Withstand Ocean Acidification," v16 n10, http://www.co2science.org/articles/V16/N10/B2.php
Background: The authors state that although many people expect future ocean acidification (OA) due to rising atmospheric CO2 concentrations to reduce the calcification rates of marine organisms, they say we have little understanding of how OA will manifest itself within dynamic, real-world systems, because, as they correctly note, "natural CO2, alkalinity, and salinity gradients can significantly alter local carbonate chemistry, and thereby create a range of susceptibility for different ecosystems to OA." What was done: "To determine if photosynthetic CO2 uptake associated with seagrass beds has the potential to create OA refugia," as they describe it, Manzello et al. repeatedly measured carbonate chemistry across an inshore-to-offshore gradient in the upper, middle and lower Florida Reef Tract over a two-year period. What was learned: During times of heightened oceanic vegetative productivity, the five U.S. researchers found "there is a net uptake of total CO2 which increases aragonite saturation state (Ωarag) values on inshore patch reefs of the upper Florida Reef Tract," and they say that "these waters can exhibit greater Ωarag than what has been modeled for the tropical surface ocean during preindustrial times, with mean Ωarag values in spring equaling 4.69 ± 0.10." At the same time, however, they report that Ωarag values on offshore reefs "generally represent oceanic carbonate chemistries consistent with present day tropical surface ocean conditions." What it means: Manzello et al. hypothesize that the pattern described above "is caused by the photosynthetic uptake of total CO2 mainly by seagrasses and, to a lesser extent, macroalgae in the inshore waters of the Florida Reef Tract." And they therefore conclude that these inshore reef habitats are "potential acidification refugia that are defined not only in a spatial sense, but also in time, coinciding with seasonal productivity dynamics," which further implies that "coral reefs located within or immediately downstream of seagrass beds may find refuge from ocean acidification." And in further support of this conclusion, they cite the work of Palacios and Zimmerman (2007), which they describe as indicating that "seagrasses exposed to high-CO2 conditions for one year had increased reproduction, rhizome biomass, and vegetative growth of new shoots, which could represent a potential positive feedback to their ability to serve as ocean acidification refugia."

Adaptation will solve acidification


Knappenberger 10 (Chip Knappenberger is the assistant director of the Center for the Study of Science at the Cato Institute, and coordinates the scientific and outreach activities for the Center. He has over 20 years of experience in climate research and public outreach. January 6, 2010. "Ocean Acidification: Another Failing Scare Story" from Master Resource. http://www.masterresource.org/2010/01/ocean-acidification-another-failing-scare-story/ July 6, 2014.)

The folks over at the Science and Public Policy Institute have taken it upon themselves to look a bit more closely into the ocean acidification story and see just what the scientific literature has to say about how organisms are actually responding to changes in ocean pH rather than just talk about how they may respond. What SPPI finds is neither newsworthy nor catastrophic, simply that, as has always been the case, the aquatic organisms of the world's oceans are very adaptable and are able to respond to environmental changes in such a way as to avoid large-scale detrimental impacts.

Rising CO2 can't cause acidification - 4 warrants and history proves
SPPI 10 (The Science and Public Policy Institute provide research and educational materials dedicated to sound public policy based on sound science. Only through science and factual information, separating reality from rhetoric, can legislators develop beneficial policies without unintended consequences that might threaten the life, liberty, and prosperity of the citizenry. (SPPI). 01/05/10. "A New Propaganda Film by Natl. Resources Defense Council Fails the Acid Test of Real World Data" from SPPI. http://scienceandpublicpolicy.org/originals/acid_test.html July 6, 2014.)

First, because it has not done so before. During the Cambrian era, 550 million years ago, there was 20 times as much CO2 in the atmosphere as there is today: yet that is when the calcite corals first achieved algal symbiosis. During the Jurassic era, 175 million years ago, there was again 20 times as much CO2 as there is today: yet that is when the delicate aragonite corals first came into being. Secondly, "ocean acidification," as a notion, suffers from the same problem of scale as "global warming." Just as the doubling of CO2 concentration expected this century will scarcely change global mean surface temperature because there is so little CO2 in the atmosphere in the first place, so it will scarcely change the acid-base balance of the ocean, because there is already 70 times as much CO2 in solution in the oceans as there is in the atmosphere. Even if all of the additional CO2 we emit were to end up not in the atmosphere (where it might in theory cause a very little warming) but in the ocean (where it would cause none), the quantity of CO2 in the oceans would rise by little more than 1%, a trivial and entirely harmless change. Thirdly, to imagine that CO2 causes "ocean acidification" is to ignore the elementary chemistry of bicarbonate ions. Quantitatively, CO2 is only the seventh-largest of the substances in the oceans that could in theory alter the acid-base balance, so that in any event its effect on that balance would be minuscule. Qualitatively, however, CO2 is different from all the other substances in that it acts as the buffering mechanism for all of them, so that it does not itself alter the acid-base balance of the oceans at all. Fourthly, as Professor Ian Plimer points out in his excellent book Heaven and Earth (Quartet, London, 2009), the oceans slosh around over vast acreages of rock, and rocks are pronouncedly alkaline. Seen in a geological perspective, therefore, acidification of the oceans is impossible.
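The card's "little more than 1%" claim can be restated as a one-line bound (an editorial sketch that simply takes the card's 70:1 ocean-to-atmosphere CO2 inventory ratio at face value): moving an amount of CO2 equal to the entire present atmospheric inventory into the ocean would raise the oceanic total by only

$$\frac{1}{70} \approx 1.4\%$$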

SQ data solves acidification


Status quo efforts solve ocean acid data
OAP 12

[NOAA Ocean Acidification Program. March 12, "DATA COLLECTION AND MANAGEMENT," http://oceanacidification.noaa.gov/AreasofFocus/DataCollectionandManagement.aspx //jweideman]

OAP scientists collect a variety of data to understand changing ocean chemistry and its impacts on marine organisms and ecosystems. The National Oceanographic Data Center (NODC) serves as the NOAA Ocean Acidification data management focal point through its Ocean Acidification Data Stewardship (OADS) project. OA data will be archived at and available from an ocean acidification data stewardship system at NODC. Ocean acidification data can generally be classified as either physio-chemical or biological. Physio-chemical parameters include, among others, pCO2 (measurement of carbon dioxide gas both in the air and dissolved in seawater), pH, total alkalinity, total inorganic carbon, temperature, salinity, dissolved oxygen and current speed. Physio-chemical data from the field are collected either by remote observing methods (buoys, gliders) or through direct measurement from ships (hydrographic cruises or volunteer observing ships). Biological data from the field can be collected in similar ways, either by remote collection techniques or through direct measurement by scientists. For example, data about primary production (photosynthesizing activity) can be collected from buoys through measurement of chlorophyll a, nutrient levels and oxygen. Biologists have many techniques for collecting biological data in the field, both from ships and on shorelines. These collections can be instantaneous or from gear placed to collect organisms over time for later retrieval and analysis. During laboratory experiments on marine organisms, scientists can measure calcification rates, shell growth, behavior, otolith growth (for fish), metabolic rate, and reproduction, among other parameters. Original datasets from all OAP research and the papers which analyze the data will be available through the OA Program and NODC. The National Oceanographic Data Center (NODC) is serving as the NOAA OA data management focal point by providing online data discovery, access to NODC-hosted and distributed data sources, and long-term archival for a diverse range of OA data. The OAP and NODC will build a collaborative relationship with shared responsibilities among scientists, data managers, and NODC staff towards the implementation of an OA data stewardship system (OADS). CURRENT EFFORTS: In March 2012, the Ocean Acidification Program in collaboration with the University of Washington held an ocean acidification data management workshop in Seattle, WA which brought together a wide array of scientists and data managers from across the US. Given that successful and integrated OA data management requires looking beyond just the NOAA program, we convened researchers and program managers from across the spectrum of US-funded OA research. Representatives from NOAA, NSF, DOE, USGS and NASA attended, as did academic researchers from a range of institutions. The workshop generated both a "Declaration of Data Interdependence" and a longer Draft Integrated Data Management Plan.

No Impact-Coral Reefs
Coral Reefs Resilient as long as there is some Biodiversity - how much doesn't matter
Grimsditch and Salm 06 (Gabriel D. Grimsditch holds the post of Programme Officer for oceans and climate change for the UNEP Marine and Coastal Ecosystems Branch in Nairobi, Kenya. Before joining UNEP, Gabriel worked for the IUCN Global Marine Programme where he coordinated the IUCN Climate Change and Coral Reefs Working Group. Rod Salm has his dissertation in marine ecology. He has 35 years experience in international marine conservation and ecotourism. 2006. "Coral Reef Resilience and Resistance to Bleaching" from The World Conservation Union. http://icriforum.org/sites/default/files/2006-042.pdf July 6, 2014)

The main ecological factor that affects coral reef resilience to bleaching is a balanced biological and functional diversity (see Glossary) within the coral reef. It is essential to have a balanced ecological community with sufficient species interactions for coral reefs to recover from disturbances, and this applies not only to bleaching but to other disturbances as well (Nyström and Folke, 2001). An especially important functional group (see Glossary) for coral reef resilience is grazing animals, comprising herbivorous fish and sea urchins, among others. They enhance coral reef resilience by preventing phase shifts from coral-dominated reefs to algal-dominated reefs by keeping algal growth in check and allowing the settlement of slower-growing coral recruits rather than faster-growing algae. Their importance is highlighted in a classic example from Jamaica, where the overfishing of predators and competitors (herbivorous fish) of the black-spined sea urchin Diadema antillarum led to an explosion in its population, and thus to a reduction of the diversity within the herbivorous functional group. Consequently, grazing by D. antillarum became the primary mechanism for algal control and crucial to coral reef recovery after Hurricane Allen in 1981. However, in 1983-1984, a pathogen killed off 95-99% of its population and enabled a phase shift as macroalgae out-competed coral. The extent to which this phase shift is irreversible is still unclear (Nyström et al, 2000).

Adaptation and Resilience solve the coral reef scenario


Griffin 13 (Catherine Griffin is an environmental and science journalist. 10/30/13. "Coral Reefs Resist Climate Change with Genetic Adaptations" from Science World Report. http://www.scienceworldreport.com/articles/10579/20131030/coral-reefs-resist-climate-change-genetic-adaptations.htm July 9, 2014)
Adaptations from Science World Report. http://www.scienceworldreport.com/articles/10579/20131030/coral-reefs-resist-climate-change-geneticadaptations.htm July 9, 2014)

As our climate changes, coral reefs are becoming increasingly vulnerable to degradation. Now, though, scientists have discovered that corals may be more resilient than they thought. It turns out that coral reefs may be able to adapt to moderate climate warming, spelling new hope for those that hope to conserve these remarkable ecosystems. Coral reefs are some of the most biodiverse areas in our oceans. They host thousands of species and act as nurseries for larger fish. Yet as temperatures warm, corals suffer from a process known as coral "bleaching." This occurs when reef-building corals eject algae living inside their tissues. Since these algae supply the coral with most of its food, bleaching can be fatal and leave vast swathes of a reef lifeless after one of these events. Fortunately, though, it seems that the corals are adaptive. Scientists have discovered that corals possess a range of adaptations that can actually counteract the effects of warming. In order to come to these conclusions, though, the scientists used global sea surface temperature output from the NOAA/GFDL Earth System Model-2 for the pre-industrial period through 2100. This allowed them to project rates of coral bleaching. The initial results seemed to show that past temperature increases should have bleached reefs more often. The fact that they didn't seemed to indicate that corals are adapting.

Overfishing

Advantage Defense

1NC Regs Can't Solve


Literally no new regulations can solve overfishing - this card wrecks the advantage
Bratspies 09 (Rebecca M. Bratspies, Professor of Law at the CUNY School of Law in New York, has taught environmental law, Natural Resources Law, International Environmental Law, Property, Administrative Law, and Lawyering, J.D. cum laude from the University of Pennsylvania Law School and a B.A. in Biology from Wesleyan University, "Why the Free Market Can't Fix Overfishing," http://www.theatlantic.com/health/archive/2009/08/why-the-free-market-cant-fix-overfishing/22524/, 8/9/09, SM)

Catch shares" are the latest fix-all solution to the world's overfishing crisis, and that's
too bad. The idea was recently promoted by Gregg Easterbrook here in The Atlantic, and NOAA Administrator
Jane Lubchenco has committed her agency to "transitioning to catch shares as a solution to overfishing.

Although it's tempting to think that property rights will transform fishing into an
industry focused on long-term sustainability, catch-shares are really just a retread of the
same "markets will fix everything" thinking that has been thoroughly discredited. Catch-

shares allocate portions of the total catch within a fishery and give fishers a property-like right to a share of the fish.

The alternative is open-access, in which


anyone may catch fish until the total quota for the fishery is reached. Not surprisingly,
open-access fisheries often involve a mad scramble to capture as large a share of fish
as quickly as possible, the so-called "fisherman's dilemma." Catch-shares remove the incentive
to catch all the fish immediately, but the real management challenge is deciding how
many fish should be caught in the first place. Regardless of whether a fishery is open-access or
allocates catch-shares, it is only sustainable if the total level of fishing is low enough to
allow fish populations to regenerate. Unfortunately, total catch limits are often shortsighted. While setting the total allowable catch is theoretically a scientific decision,
fishery managers face intense pressure to interpret every ambiguity in favor of allowing more,
rather than less fishing. Sustainable levels of fishing fluctuate with environmental conditions: one
year it might be fine to catch 100 tons of fish, while the next year even 10 tons might be
too much. But the fishing industry depends on predictable levels of catch.
Uncertainties inherent to estimating the "right" level of fishing makes it hard for
managers to defend decisions to reduce fishing. That brings us to the real problem with
(These permits are freely tradable on the open market.)

fisheries: overcapacity. There are

simply

too many boats chasing too few

fish . The catch-share approach tries to solve this problem by creating a permit trading
market. The thinking is that the permits will consolidate in the hands of "rational and
efficient" producers who will voluntarily forego using a portion of their shares. That
won't happen. The recent financial crisis ought to give anyone pause about the
claim that markets inherently promote choices that are in everyone's long-term best
interest . Regulators have sharply defined territorial jurisdictions, but fish do
not cooperate by staying in one place . Fish move between countries' Exclusive
Economic Zones (waters under the effective control of a coastal state) and the high seas. Boats
on the high seas can undermine
Canada almost went to war over this in the 1990s).

of a single state,

most

a coastal state's rigorously

set catch limits

Even when a fishery is under

the

(Spain and

control

governments don't have the capacity to make sure

each boat takes only its allotted catch-share. Catch-shares also fail to
address bycatch , the dirty little secret of the fishing industry. By most estimates, at least 40
percent of every catch is discarded as bycatch --fish other than the target species, including at

least half a million endangered marine mammals and an unknown number of endangered sea turtles.

Catch-

shares will exacerbate this problem by creating a powerful incentive for


fishing boats to discard not only unwanted or uncommercial fish, but also
any fish potentially subject to someone else's share. Moreover, privatizing
the ocean through catch-shares has troubling social justice implications. Catch-shares
are typically awarded to boat owners on the basis of fishing capacity, rather than to each
fisher on a per capita basis. Those with big boats benefit richly, but their workers are shut
out of any ownership stake. This is the same "give to the rich" dynamic that distorts
privatization schemes around the world.

2NC Can't solve


Regulations on overfishing are ineffective
Boon 08 (Kristen E. Boon, Director of International Programs, specializes in public international law and international organization, Doctorate in law from Columbia Law School, a J.D. from New York University School of Law, M.A. in Political Science from McGill University, "OVERFISHING OF BLUEFIN TUNA: INCENTIVIZING INCLUSIVE SOLUTIONS," http://www.louisvillelawreview.org/sites/louisvillelawreview.org/files/pdfs/printcontent/52/1/Boon.pdf, 2008)
While my proposals would not preclude a property rights approach, this Article proceeded on the assumption that introducing property rights on the high seas would not be sufficient to address problems of scarcity or to inform institutional design. A final word justifying this presumption is in order. Advocates of property rights suggest that they would serve as a corrective force for the negative impacts of overfishing. Essentially, a property rights solution would involve the creation of a market for individual transferable quotas (ITQs). After a TAC has been set, either an RFMO like ICCAT could auction off quotas to interested rights holders, including corporations and states, or rights could be granted to individual fishermen directly who could then swap and sell their rights to others. Property rights to natural resources have never been introduced on the high seas, largely due to the difficulties of regulating species outside of national jurisdictions. When fisheries fall within one nation's enforcement jurisdiction, a public authority can institute a system of property rights based on catch or fishing effort. Thus a state can create, allocate, and enforce the allocation of property rights to fish on the basis of ITQs. This might involve setting a limit on the amount of fish that could be caught, and then auctioning off the fishing rights to those who want to purchase them. On the high seas, however, there is much more competition for the resource, and the role of property rights becomes considerably more complex. Moreover, the costs of implementing, maintaining, and enforcing a property rights solution becomes higher as the competition for resources increases.

Can't solve overfishing - other nations won't co-operate


McDermott 10 (Mat McDermott, writer about resource consumption for Treehugger, Masters from New York University's Center for Global Affairs on environmental and energy policy, "How Bad Is Overfishing & What Can We Do To Stop It?", http://www.treehugger.com/green-food/how-bad-is-overfishing-what-can-we-do-to-stop-it.html, August 16th 2010, Date Accessed: 7/3/14, SM)

Enforcement is Key Though... The crucial variable in all this is enforcement. Though nations can control territorial waters--if they choose to, Libya for example actively turns a blind eye to overfishing in its waters, as do a number of other nations--in the open ocean things become much more difficult. Even with scientifically determined quotas, marine reserves and catch share systems in place, if the rules are not enforced, it all falls apart. Going back to the bluefin example, if the quota is 15,000 tons a year but the actual take, due to utter lack of enforcement, is 60,000 tons, you simply can't manage the fishery well. And when, as is the case with certain fisheries, you have organized crime syndicates involved, it all gets that much more complicated.

Can't solve overfishing - other countries won't cooperate

The House Of Ocean 13 (The House of Ocean, "Big fishing nations that won't stop overfishing," http://houseofocean.org/2013/12/13/big-fishing-nations-that-wont-stop-overfishing/, 12/13/13 SM)
A recent Guardian article exposes some of the figures behind industrial tuna fishing in the Pacific. The article says that the US, China, South Korea, Japan, Indonesia and Taiwan are responsible for 80% of bigeye tuna caught each year. The remaining 20% is captured by vessels flagged to smaller fishing nations. Some of the smallest nations depend on their fisheries for basic survival. In 2012, 2.6m tonnes of tuna were extracted from the Pacific - 60% of the global total. Scientists are in agreement that tuna is being overfished at an alarming rate. Some species are practically on the brink, with bluefin tuna populations being currently just 4% of what they were before industrial fishing commenced. Yet, the organisation that has been entrusted by the international community to be the steward of tuna fisheries in the Pacific ocean, the Western and Central Pacific Fisheries Commission, has failed to protect the fish for yet another year. In spite of clear scientific advice regarding the need to reduce tuna quotas, the large fishing nations that currently haul the most, have point-blank refused to reduce their quota. Small Pacific nations have pointlessly warned of the consequences of overfishing - the big boys won't budge.

Unilateral fishing policies fail


Schlick 09 (Katharina Schlick, Fishery agreements in view of the South China Sea Conflict: A regional
cooperative approach to maritime resource conflicts and territorial disputes,
http://www.victoria.ac.nz/chinaresearchcentre/publications/papers/19-katharina-schlick.pdf, 2009 SM)

Overexploitation of marine living resources is a problem of both science and governance (Zha 2001: 577). As the example of Chinese domestic efforts in fisheries management show, many unilateral measures stay inefficient in view of the South China Sea ecosystem as a whole. On the one hand institutional weakness and poor implementation of regulations at the domestic level impairs any regional effort for sustainable resource management, on the other hand, diverging national, regional and international interests lead to different objectives and outcomes. The controversies over jurisdictional boundaries in many parts of the South China Sea and consequentially the absence of property rights over maritime living resources has created the image of an open access resource pool. In this situation, no one takes efforts to preserve the resource at a sustainable level as others will free-ride and enjoy the benefits from the resources at the expense of the others that put restrictions on their own use of the resource. The consequence is that the resource will gradually be overexploited and in view of fisheries, face danger of extinction. Given the migratory nature of many species of the sea, no single country would be able to manage or conserve these fish stocks (Wang 2001: 539-540). Conservation and management issues within the territorial waters and the high sea areas must therefore be compatible with each other.

Nuttall goes neg - can't solve overfishing

Nutall 06 (Nick Nuttall, Head of Media Services, United Nations Environment Program, "Overfishing: a threat to marine biodiversity," http://www.un.org/events/tenstories/06/story.asp?storyID=800, 2006, SM)
According to a Food and Agriculture Organization (FAO) estimate, over 70% of the world's fish species are either fully exploited or depleted. The dramatic increase of destructive fishing techniques worldwide destroys marine mammals and entire ecosystems. FAO reports that illegal, unreported and unregulated fishing worldwide appears to be increasing as fishermen seek to avoid stricter rules in many places in response to shrinking catches and declining fish stocks. Few, if any, developing countries and only a limited number of developed ones are on track to put into effect by this year the International Plan of Action to Prevent, Deter and Eliminate Unreported and Unregulated Fishing. Despite the fact that each region has its Regional Sea Conventions, and some 108 governments and the European Commission have adopted the UNEP Global Programme of Action for the Protection of the Marine Environment from Land-based Activities, oceans are cleared at twice the rate of forests.

AT: Food prices I/L


Food prices ARE increasing but because of biofuels
Wahlberg 08 (Katarina Wahlberg, Advisor on Social and Economic Policy Program Coordinator, M.A.
degree in political science from the University of Stockholm, Between Soaring Food Prices and Food Aid Shortage
https://www.globalpolicy.org/component/content/article/217/46194.html, March 3rd 2008, SM)

Food prices have soared because agricultural production has not kept up with the rising demand of cereals for food consumption, cattle feeding and biofuel production. For the first time in decades, worldwide scarcity of food is becoming a problem. Global cereal stocks are falling rapidly. Some predict that US wheat stocks will reach a 60-year low in 2008. Population growth in poor countries is boosting the grain demand for food consumption. But cereal demand for the feeding of cattle is increasing even more rapidly as consumers in both rich countries and fast growing economies are eating more dairy and meat. The most important factor behind the sudden spike in food prices, however, is the rapidly growing demand for biofuels, particularly in the EU and the US. Of total corn production, 12% is used to make biofuel, and that share is growing fast. Concerns about global climate change and soaring energy prices have boosted demand for biofuels. Until recently, few voices critical of biofuels were heard, but now an increasing number of policy makers and analysts strongly oppose converting food into fuel. In addition to directly threatening food security, there are alarming examples of how biofuel production causes environmental harm and speeds up global warming. US ethanol production uses large amounts of fuel, fertilizer, pesticides and water and most analysts consider its environmental impact quite negative. And in Indonesia, Malaysia and Brazil, companies have slashed thousands of hectares of rain forests to cultivate palm oil or sugarcane for biofuel production. According to the Food and Agricultural Organization (FAO), world production of cereal, vegetables, fruit, meat and dairy increased in 2007. But, prices will remain high or grow even further in the coming years, as production is not growing fast enough to keep up with rising demand. Production is increasing mainly in the US, the EU, China and India. Not counting China and India, cereal production in poor countries decreased, due in part to climate change related emergencies such as droughts and floods. In addition to a tight supply and demand situation, soaring petroleum prices contribute to higher food prices by raising costs of transportation, fertilizers, and fuel for farm machinery. Moreover, financial investors speculating in commodity prices aggravate prices and increase volatility in the market.

Impact Defense
Squo solves overfishing
NOAA 11 (National Oceanic and Atmospheric Administration, The Road to End Overfishing: 35 Years of
Magnuson Act, http://www.nmfs.noaa.gov/stories/2011/20110411roadendoverfishing.htm, 2011)
I want to acknowledge and highlight the 35th anniversary of the Magnuson-Stevens Fishery Conservation and Management Act. Simply called the Magnuson Act, this law, its regional framework and goal of sustainability, has proven to be a visionary force in natural resource management - both domestically and internationally. The Magnuson Act is, and will continue to be, a key driver for NOAA as we deliver on our nation's commitment to ocean stewardship, sustainable fisheries, and healthy marine ecosystems. Because of the Magnuson Act, the U.S. is on track to end overfishing in federally-managed fisheries, rebuild stocks, and ensure conservation and sustainable use of our ocean resources. Fisheries harvested in the United States are scientifically monitored, regionally managed and legally enforced under 10 strict national standards of sustainability. This anniversary year marks a critical turning point in the Act's history. By the end of 2011, we are on track to have an annual catch limit and accountability measures in place for all 528 federally-managed fish stocks and complexes. The dynamic, science-based management process envisioned by Congress is now in place, the rebuilding of our fisheries is underway, and we are beginning to see real benefits for fishermen, fishing communities and our commercial and recreational fishing industries.
and our commercial and recreational fishing industries.

Squo solves overfishing- many species are recovering


The Economist 08 (The Economist, "Grabbing it all," http://www.economist.com/node/12798494, December 30th 2008)

A variety of remedies have been tried, usually in combination. Thus regulations have been issued about the size and type of fish to be caught, the mesh of nets to be used, the number of days a month that boats may go to sea, the permissible weight of their catch and so on. In some countries fishermen are offered inducements to give up fishing altogether. Those that continue are, at least in theory, subject to monitoring both at sea and in port. Large areas are sometimes closed to fishing, to allow stocks to recover. Others have been designated as marine reserves akin to national parks. And some of the technology that fishermen use to find their prey is now used by inspectors to monitor the whereabouts of the hunters themselves. Most of these measures have helped, as the recovery of stocks in various places has shown. Striped bass and North Atlantic swordfish have returned along America's East Coast, for instance. Halibut have made a comeback in Alaska. Haddock, if not cod, have begun to recover in Georges Bank off Maine. And herring come and go off the coasts of Scotland. Those who doubt the value of government intervention have only to look at the waters off Somalia, a country that has been devoid of any government worth the name since 1991. The ensuing free-for-all has devastated the coastal stocks, ruining the livelihoods of local fishermen and encouraging them, it seems, to take up piracy instead.

Climate change proves marine bioD is resilient

Taylor 10 (James M. Taylor is a senior fellow of The Heartland Institute and managing editor of Environment and Climate News, "Ocean Acidification Scare Pushed at Copenhagen," http://www.heartland.org/publications/environment%20climate/article/26815/Ocean_Acidification_Scare_Pushed_at_Copenhagen.html, Feb 10)

With global temperatures continuing their decade-long decline and United Nations-sponsored global warming talks falling apart in Copenhagen, alarmists at the U.N. talks spent considerable time claiming carbon dioxide emissions will cause catastrophic ocean acidification, regardless of whether temperatures rise. The latest scientific data, however, show no such catastrophe is likely to occur. Food Supply Risk Claimed: The United Kingdom's environment secretary, Hilary Benn, initiated the Copenhagen ocean scare with a high-profile speech and numerous media interviews claiming ocean acidification threatens the world's food supply. "The fact is our seas absorb CO2. They absorb about a quarter of the total that we produce, but it is making our seas more acidic," said Benn in his speech. "If this continues as a problem, then it can affect the one billion people who depend on fish as their principle source of protein, and we have to feed another 2 to 3 billion people over the next 40 to 50 years." Benn's claim of oceans becoming "more acidic" is misleading, however. Water with a pH of 7.0 is considered neutral. pH values lower than 7.0 are considered acidic, while those higher than 7.0 are considered alkaline. The world's oceans have a pH of 8.1, making them alkaline, not acidic. Increasing carbon dioxide concentrations would make the oceans less alkaline but not acidic. Since human industrial activity first began emitting carbon dioxide into the atmosphere a little more than 200 years ago, the pH of the oceans has fallen merely 0.1, from 8.2 to 8.1. Following Benn's December 14 speech and public relations efforts, most of the world's major media outlets produced stories claiming ocean acidification is threatening the world's marine life. An Associated Press headline, for example, went so far as to call ocean acidification the "evil twin" of climate change. Studies Show CO2 Benefits: Numerous recent scientific studies show higher carbon dioxide levels in the world's oceans have the same beneficial effect on marine life as higher levels of atmospheric carbon dioxide have on terrestrial plant life. In a 2005 study published in the Journal of Geophysical Research, scientists examined trends in chlorophyll concentrations, critical building blocks in the oceanic food chain. The French and American scientists reported an overall increase of the world ocean average chlorophyll concentration by about 22 percent during the prior two decades of increasing carbon dioxide concentrations. In a 2006 study published in Global Change Biology, scientists observed higher CO2 levels are correlated with better growth conditions for oceanic life. The highest CO2 concentrations produced higher growth rates and biomass yields than the lower CO2 conditions. "Higher CO2 levels may well fuel subsequent primary production, phytoplankton blooms, and sustaining oceanic food-webs," the study concluded. Ocean Life Surprisingly Resilient: In a 2008 study published in Biogeosciences, scientists subjected marine organisms to varying concentrations of CO2, including abrupt changes of CO2 concentration. The ecosystems were surprisingly resilient to changes in atmospheric CO2, and "the ecosystem composition, bacterial and phytoplankton abundances and productivity, grazing rates and total grazer abundance and reproduction were not significantly affected by CO2-induced effects." In a 2009 study published in Proceedings of the National Academy of Sciences, scientists reported, "Sea star growth and feeding rates increased with water temperature from 5°C to 21°C. A doubling of current [CO2] also increased growth rates both with and without a concurrent temperature increase from 12°C to 15°C." Another False CO2 Scare: "Far too many predictions of CO2-induced catastrophes are treated by alarmists as sure to occur, when real-world observations show these doomsday scenarios to be highly unlikely or even virtual impossibilities," said Craig Idso, Ph.D., author of the 2009 book CO2, Global Warming and Coral Reefs. "The phenomenon of CO2-induced ocean acidification appears to be no different."
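Analytic note on the "merely 0.1" figure (standard chemistry, not part of the card): pH is a base-10 logarithmic scale,
$\mathrm{pH} = -\log_{10}[\mathrm{H}^+]$, so a fall from 8.2 to 8.1 corresponds to a hydrogen-ion concentration increase of
$10^{0.1} \approx 1.26$, roughly 26 percent. Teams reading this card should expect affirmatives to answer the small-sounding 0.1 number with exactly this conversion.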

Science Leadership

Tech Leadership High


U.S. manufacturing dominates the market; manufacturing innovation solves tech output
NIST 2009 National Institute of Standards and Technology, measurement standards laboratory, government agency, 2009, "The Facts About Modern Manufacturing," http://www.nist.gov/mep/upload/FINAL_NAM_REPORT_PAGES.pdf
The 8th edition of the Facts gives a snapshot of the state of U.S. manufacturing, and exhibits its strengths and challenges. The Facts clearly show that manufacturing continues to play a vital role in the U.S. economy. This edition illustrates that the quantity of manufactured goods produced in the United States has kept pace with overall economic growth since 1947, as both GDP and manufacturing have grown by about seven times (Figure 2). The United States still has the largest manufacturing sector in the world, and its market share (around 20 percent) has held steady for 30 years (Figure 13). One in six private sector jobs is still in or directly tied to manufacturing (Figure 8). Moreover, productivity growth is higher in manufacturing than in other sectors of the economy (Figure 25). Due largely to outstanding productivity growth, the prices of manufactured goods have declined since 1995 in contrast to inflation in most other sectors, with the result that manufacturers are contributing to a higher standard of living for U.S. consumers (Figure 12). Manufacturing still pays premium wages and benefits, and supports much more economic activity per dollar of production than other sectors (Figure 9). Another major indicator of the importance of manufacturing to the strength of the economy is its key role in driving innovation and technology. These are crucial components of a productivity-driven, global competitiveness agenda, and also help explain the steady rise in our standard of living. U.S. manufacturing accounts for 35 percent of value added in all of the world's high technology production, and enjoys a trade surplus in revenues from royalties from production processes and technology. U.S. inventors still account for more than one-half of all patents granted in the United States (Figure 26), and the nation outpaces its rivals in terms of industrial research and development. Technology has aided U.S. manufacturers to use less energy per unit or dollar of production (Figure 30) and to lead all other sectors in reducing CO2 emissions in the last two decades (Figure 29). Finally, the technology and advanced processes developed in manufacturing consistently spill over into productivity growth in the service and agriculture sectors. For this reason, it is important to consider innovation in terms of processes as well as technologies.

Tech leadership is inevitable and solves the advantage


Zakaria 2009 Fareed Zakaria, editor of Foreign Affairs, editor of Newsweek
International, host of CNN's Fareed Zakaria GPS, 11/14/09, Newsweek, "Can America
Still Innovate?", http://www.newsweek.com/id/222836
Government funding of basic research has been astonishingly productive. Over the past five decades it has led to the development of the Internet, lasers, global positioning satellites, magnetic resonance imaging, DNA sequencing, and hundreds of other technologies. Even when government was not the inventor, it was often the facilitator. One example: semiconductors. As a study by the Breakthrough Institute notes, after the microchip was invented in 1958 by an engineer at Texas Instruments, "the federal government bought virtually every microchip firms could produce." This was particularly true of the Air Force, which needed chips to guide the new Minuteman II missiles, and NASA, which required advanced chips for the on-board guidance computers on its Saturn rockets. "NASA bought so many [microchips] that manufacturers were able to achieve huge improvements in the production process -- so much so, in fact, that the price of the Apollo microchip fell from $1,000 per unit to between $20 and $30 per unit in the span of a couple years."

No Impact-Soft Power
Soft power doesn't solve; material decline turns it
Quinn 2011 Adam Quinn, Lecturer in International Studies at the University of Birmingham, previously worked at the University of Leicester and the University of Westminster, focuses on the role of national history and ideology in shaping US grand strategy, "The art of declining politely: Obama's prudent presidency and the waning of American power," International Affairs 87:4 (2011) 803-824, http://www.chathamhouse.org/sites/default/files/87_4quinn.pdf
Nevertheless, this qualification demands two further qualifications of its own. The first is that if we consider soft power as a national attribute then it is difficult to separate it with confidence from the economic and military dimensions of power. Is it really likely that America's ideological and cultural influence will endure undiminished in the absence of the platform of military and economic primacy upon which it has been constructed? It may be overstatement to suggest that, borrowing Marxist terminology, hard power represents the base and soft power mere superstructure. But one could plausibly argue that even America's non-coercive power and political appeal are inextricably entwined with the status conferred upon it by possession of a preponderance of material resources. While vestigial soft power may delay or mitigate the consequences of relative material decline, it is surely unrealistic to expect it to override them such as to allow the US to continue to exercise the same influence in a multipolar or non-polar world as it did in a unipolar one.

No Impact-Hegemony
Hegemony fails: policy has shifted from helping other nations to focusing on America's domestic interests; increasing wealth, military power, and influence don't solve
Kagan 5/26 Robert Kagan, PhD in American History, senior fellow at the Brookings Institution and a member of the Council on Foreign Relations, 5/26/14, "Superpowers don't get to retire," http://www.newrepublic.com/article/117859/allure-normalcy-what-america-still-owes-world
Almost 70 years ago, a new world order was born from the rubble of World War II, built by and around the power of the United States. Today that world order shows signs of cracking, and perhaps even collapsing. The Russia-Ukraine and Syria crises, and the world's tepid response, the general upheaval in the greater Middle East and North Africa, the growing nationalist and great-power tensions in East Asia, the worldwide advance of autocracy and retreat of democracy: taken individually, these problems are neither unprecedented nor unmanageable. But collectively they are a sign that something is changing, and perhaps more quickly than we may imagine. They may signal a transition into a different world order or into a world disorder of a kind not seen since the 1930s. If a breakdown in the world order that America made is occurring, it is not because America's power is declining; America's wealth, power, and potential influence remain adequate to meet the present challenges. It is not because the world has become more complex and intractable; the world has always been complex and intractable. And it is not simply war-weariness. Strangely enough, it is an intellectual problem, a question of identity and purpose. Many Americans and their political leaders in both parties, including President Obama, have either forgotten or rejected the assumptions that undergirded American foreign policy for the past seven decades. In particular, American foreign policy may be moving away from the sense of global responsibility that equated American interests with the interests of many others around the world and back toward the defense of narrower, more parochial national interests. This is sometimes called isolationism, but that is not the right word. It may be more correctly described as a search for normalcy. At the core of American unease is a desire to shed the unusual burdens of responsibility that previous generations of Americans took on in World War II and throughout the cold war and to return to being a more normal kind of nation, more attuned to its own needs and less to those of the wider world. If this is indeed what a majority of Americans seek today, then the current period of retrenchment will not be a temporary pause before an inevitable return to global activism. It will mark a new phase in the evolution of America's foreign policy. And because America's role in shaping the world order has been so unusually powerful and pervasive, it will also begin a new phase in the international system, one that promises not to be marginally different but radically different from what we have known these past 70 years. Unless Americans can be led back to an understanding of their enlightened self-interest, to see again how their fate is entangled with that of the world, then the prospects for a peaceful twenty-first century in which Americans and American principles can thrive will be bleak.

S&T Add on Neg


S&T policy is effective now; US commitments are legally binding
Dolan 2012 Bridget M. Dolan, PhD, Research Scholar for the American Association for the Advancement of Science, "Science and Technology Agreements as Tools for Science Diplomacy: A U.S. Case Study," Science & Diplomacy, Vol. 1, No. 4 (December 2012). http://www.sciencediplomacy.org/article/2012/science-and-technology-agreements-tools-for-science-diplomacy

A Formal and Legally Binding Agreement: International agreements to promote cooperation in scientific research and development can be bilateral or multilateral, government-wide or at the level of individual technical agencies (e.g., the National Science Foundation or the National Institutes of Health). The focus of this paper is on bilateral, government-wide agreements, also referred to as umbrella agreements, framework agreements, or simply S&T agreements. Scientific cooperation between the United States and other countries is undertaken using a variety of arrangements, from informal scientist-to-scientist collaborations to cooperation between research institutions to formal agreements between technical agencies. While S&T agreements are not necessary for these types of interactions, other nations often seek S&T agreements with the United States because they carry the weight of being legally binding and having been negotiated on behalf of the U.S. government. These agreements endeavor to establish a framework to foster international science collaboration while protecting intellectual property, establishing benefit sharing, and preventing taxation of research equipment. The contents of an S&T agreement usually include common features such as types of cooperative activities and ways to encourage access to facilities and personnel, as well as clarification that some information or equipment, such as those requiring protection for national security reasons, are not covered under the agreement. There are three areas where the agreement text often varies: (1) the preamble, which is not legally binding and is often used to highlight the public motivations behind the agreement; (2) the intellectual property rights annex, which delineates how the parties share and exploit intellectual property generated; and (3) the implementation plan, including whether to establish a joint committee that would meet regularly to review execution of the agreement.

They can't solve funding


Brown and Sarewitz 1998 George Brown Jr., former House representative, and Daniel Sarewitz, Co-Director, Consortium for Science, Policy & Outcomes, Associate Director, Fall 1998, Issues in Science and Technology, Volume 15, Issue 1, "U.S. failure in international scientific cooperation," http://www.freepatentsonline.com/article/Issues-in-ScienceTechnology/53435944.html
In August 1991, we traveled to Mexico to meet with policymakers and scientists about the establishment of a United States-Mexico science foundation devoted to supporting joint research on problems of mutual interest. We encountered enthusiasm and vision at every level, including an informal commitment by the Minister of Finance to match any U.S. contribution up to $20 million. At about this time, our article "Fiscal Alchemy: Transforming Debt into Research" (Issues, Fall 1991) sought to highlight three issues: 1) the pressing need for scientific partnerships between the United States and industrializing nations, 2) the mechanism of bilateral or multilateral foundations for funding such partnerships, and 3) the device of debt swaps for allowing debtor nations with limited foreign currency reserves to act as full partners in joint research ventures. We returned from our visit to Mexico flush with optimism about moving forward on all three fronts. Results, overall, have been disappointing. We had hoped that the debt-for-science concept would be adopted by philanthropic organizations and universities as a way to leverage the most bang for the research buck. This has not taken place. The complexity of negotiating debt swaps and the changing dynamics of the international economy may be inhibiting factors. But much more significant, in our view, is a general unwillingness in this nation to pursue substantive international scientific cooperation with industrializing and developing nations. Although the National Science Foundation and other agencies do fund U.S. scientists conducting research in the industrializing and developing world, this work does not support broader partnerships aimed at shared goals. Such partnerships can foster the local technological capacities that underlie economic growth and environmental stewardship; we also view them as key to successfully addressing a range of mutual problems, including transborder pollution, emerging diseases, and global climate change. Yet there is a conspicuous lack of attention to this approach at all levels of the administration; most important, the State Department continues to view scientific cooperation as a question of nothing more than diplomatic process. Incredibly, through 1995 (the latest year for which data are available) the United States has negotiated more than 800 bilateral and multilateral science and technology agreements (up from 668 in 1991), even though virtually none of these are backed by funding commitments. Nor is there any coordination among agencies regarding goals, implementation, redundancy, or follow-up. A report by the RAND Corporation, "International Cooperation in Research and Development," found little correlation between international agreements and actual research projects. Moreover, although there are few indications that these agreements have led to significant scientific partnerships with industrializing and developing nations, there is plenty of evidence that they support a healthy bureaucratic infrastructure in the U.S. government. We cannot help but think that a portion of the funds devoted to negotiating new agreements and maintaining existing ones might be better spent on cooperative science.

No warming and not anthropogenic


Ferrara, 2012 (Peter, Director of Entitlement and Budget Policy for the Heartland Institute, Senior Advisor for Entitlement Reform and Budget Policy at the National Tax Limitation Foundation, General Counsel for the American Civil Rights Union, and Senior Fellow at the National Center for Policy Analysis, served in the White House Office of Policy Development, graduate of Harvard College and Harvard Law School, 5/31/2012, "Sorry Global Warming Alarmists, The Earth Is Cooling," http://www.forbes.com/sites/peterferrara/2012/05/31/sorry-global-warming-alarmists-the-earth-is-cooling/)
Climate change itself is already in the process of definitively rebutting climate alarmists who think human use of fossil fuels is causing ultimately catastrophic global warming. That is because natural climate cycles have already turned from warming to cooling, global temperatures have already been declining for more than 10 years, and global temperatures will continue to decline for another two decades or more. That is one of the most interesting conclusions to come out of the seventh International Climate Change Conference sponsored by the Heartland Institute, held last week in Chicago. I attended, and served as one of the speakers, talking about "The Economic Implications of High Cost Energy." The conference featured serious natural science, contrary to the self-interested political science you hear from government financed global warming alarmists seeking to justify widely expanded regulatory and taxation powers for government bodies, or government body wannabees, such as the United Nations. See for yourself, as the conference speeches are online. What you will see are calm, dispassionate presentations by serious, pedigreed scientists discussing and explaining reams of data. In sharp contrast to these climate realists, the climate alarmists have long admitted that they cannot defend their theory that humans are causing catastrophic global warming in public debate. With the conference presentations online, let's see if the alarmists really do have any response. The Heartland Institute has effectively become the international headquarters of the climate realists, an analog to the UN's Intergovernmental Panel on Climate Change (IPCC). It has achieved that status through these international climate conferences, and the publication of its Climate Change Reconsidered volumes, produced in conjunction with the Nongovernmental International Panel on Climate Change (NIPCC). Those Climate Change Reconsidered volumes are an equivalently thorough scientific rebuttal to the irregular Assessment Reports of the UN's IPCC. You can ask any advocate of human caused catastrophic global warming what their response is to Climate Change Reconsidered. If they have none, they are not qualified to discuss the issue intelligently. Check out the 20th century temperature record, and you will find that its up and down pattern does not follow the industrial revolution's upward march of atmospheric carbon dioxide (CO2), which is the supposed central culprit for man caused global warming (and has been much, much higher in the past). It follows instead the up and down pattern of naturally caused climate cycles. For example, temperatures dropped steadily from the late 1940s to the late 1970s. The popular press was even talking about a coming ice age. Ice ages have cyclically occurred roughly every 10,000 years, with a new one actually due around now. In the late 1970s, the natural cycles turned warm and temperatures rose until the late 1990s, a trend that political and economic interests have tried to milk mercilessly to their advantage. The incorruptible satellite measured global atmospheric temperatures show less warming during this period than the heavily manipulated land surface temperatures. Central to these natural cycles is the Pacific Decadal Oscillation (PDO). Every 25 to 30 years the oceans undergo a natural cycle where the colder water below churns to replace the warmer water at the surface, and that affects global temperatures by the fractions of a degree we have seen. The PDO was cold from the late 1940s to the late 1970s, and it was warm from the late 1970s to the late 1990s, similar to the Atlantic Multidecadal Oscillation (AMO). In 2000, the UN's IPCC predicted that global temperatures would rise by 1 degree Celsius by 2010. Was that based on climate science, or political science to scare the public into accepting costly anti-industrial regulations and taxes? Don Easterbrook, Professor Emeritus of Geology at Western Washington University, knew the answer. He publicly predicted in 2000 that global temperatures would decline by 2010. He made that prediction because he knew the PDO had turned cold in 1999, something the political scientists at the UN's IPCC did not know or did not think significant. Well, the results are in, and the winner is ... Don Easterbrook. Easterbrook also spoke at the Heartland conference, with a presentation entitled "Are Forecasts of a 20-Year Cooling Trend Credible?" Watch that online and you will see how scientists are supposed to talk: cool, rational, logical analysis of the data, and full explanation of it. All I ever see from the global warming alarmists, by contrast, is political public relations, personal attacks, ad hominem arguments, and name calling, combined with admissions that they can't defend their views in public debate. Easterbrook shows that by 2010 the 2000 prediction of the IPCC was wrong by well over a degree, and the gap was widening. That's a big miss for a forecast just 10 years away, when the same folks expect us to take seriously their predictions for 100 years in the future. Howard Hayden, Professor of Physics Emeritus at the University of Connecticut, showed in his presentation at the conference that based on the historical record a doubling of CO2 could be expected to produce a 2 degree C temperature increase. Such a doubling would take most of this century, and the temperature impact of increased concentrations of CO2 declines logarithmically. You can see Hayden's presentation online as well. Because PDO cycles last 25 to 30 years, Easterbrook expects the cooling trend to continue for another 2 decades or so. Easterbrook, in fact, documents 40 such alternating periods of warming and cooling over the past 500 years, with similar data going back 15,000 years. He further expects the flipping of the ADO to add to the current downward trend. But that is not all. We are also currently experiencing a surprisingly long period with very low sunspot activity. That is associated in the earth's history with even lower, colder temperatures. The pattern was seen during a period known as the Dalton Minimum from 1790 to 1830, which saw temperature readings decline by 2 degrees in a 20 year period, and the noted Year Without A Summer in 1816 (which may have had other contributing short term causes). Even worse was the period known as the Maunder Minimum from 1645 to 1715, which saw only about 50 sunspots during one 30 year period within the cycle, compared to a typical 40,000 to 50,000 sunspots during such periods in modern times. The Maunder Minimum coincided with the coldest part of the Little Ice Age, which the earth suffered from about 1350 to 1850. The Maunder Minimum saw sharply reduced agricultural output, and widespread human suffering, disease and premature death. Such impacts of the sun on the earth's climate were discussed at the conference by astrophysicist and geoscientist Willie Soon, Nir J. Shaviv, of the Racah Institute of Physics in the Hebrew University of Jerusalem, and Sebastian Luning, co-author with leading German environmentalist Fritz Vahrenholt of The Cold Sun. Easterbrook suggests that the outstanding question is only how cold this present cold cycle will get. Will it be modest like the cooling from the late 1940s to late 1970s? Or will the paucity of sunspots drive us all the way down to the Dalton Minimum, or even the Maunder Minimum? He says it is impossible to know now. But based on experience, he will probably know before the UN and its politicized IPCC.
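Analytic note on Hayden's logarithmic point (standard radiative-forcing arithmetic, not from the card): if sensitivity is 2 degrees C per doubling, the claim can be written as
$\Delta T \approx 2 \log_2(C/C_0)\ ^{\circ}\mathrm{C}$,
so each successive doubling of concentration adds the same 2 degrees no matter how high $C$ already is. The commonly cited forcing relation underneath it is $\Delta F = 5.35 \ln(C/C_0)\ \mathrm{W/m^2}$, about $3.7\ \mathrm{W/m^2}$ per doubling.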

STEM Add on Neg


U.S. manufacturing dominates the market; manufacturing innovation solves tech output
NIST 2009 National Institute of Standards and Technology, measurement standards laboratory, government agency, 2009, "The Facts About Modern Manufacturing," http://www.nist.gov/mep/upload/FINAL_NAM_REPORT_PAGES.pdf
The 8th edition of the Facts gives a snapshot of the state of U.S. manufacturing, and exhibits its strengths and challenges. The Facts clearly show that manufacturing continues to play a vital role in the U.S. economy. This edition illustrates that the quantity of manufactured goods produced in the United States has kept pace with overall economic growth since 1947, as both GDP and manufacturing have grown by about seven times (Figure 2). The United States still has the largest manufacturing sector in the world, and its market share (around 20 percent) has held steady for 30 years (Figure 13). One in six private sector jobs is still in or directly tied to manufacturing (Figure 8). Moreover, productivity growth is higher in manufacturing than in other sectors of the economy (Figure 25). Due largely to outstanding productivity growth, the prices of manufactured goods have declined since 1995 in contrast to inflation in most other sectors, with the result that manufacturers are contributing to a higher standard of living for U.S. consumers (Figure 12). Manufacturing still pays premium wages and benefits, and supports much more economic activity per dollar of production than other sectors (Figure 9). Another major indicator of the importance of manufacturing to the strength of the economy is its key role in driving innovation and technology. These are crucial components of a productivity-driven, global competitiveness agenda, and also help explain the steady rise in our standard of living. U.S. manufacturing accounts for 35 percent of value added in all of the world's high technology production, and enjoys a trade surplus in revenues from royalties from production processes and technology. U.S. inventors still account for more than one-half of all patents granted in the United States (Figure 26), and the nation outpaces its rivals in terms of industrial research and development. Technology has aided U.S. manufacturers to use less energy per unit or dollar of production (Figure 30) and to lead all other sectors in reducing CO2 emissions in the last two decades (Figure 29). Finally, the technology and advanced processes developed in manufacturing consistently spill over into productivity growth in the service and agriculture sectors. For this reason, it is important to consider innovation in terms of processes as well as technologies.

Ocean policy is increasing now; the status quo solves the advantage
Babb-Brott 2013 Deerin Babb-Brott, former director of the National Ocean Council, 7/19/13, "A Guide for Regional Marine Planning," http://www.whitehouse.gov/blog/2013/07/19/guide-regional-marine-planning
the National Ocean Council released a Marine Planning Handbook to support


the efforts of regions that choose to engage marine industries, stakeholders, the
public, and government to advance their economic development and conservation
priorities. Each coastal region of the country has its own interests and ways of doing business, but all regions want
to support their marine economies and coastal communities. Voluntary marine
planning is a science-based tool that provides regionally tailored information that all
ocean interests can use to reduce conflicts, grow ocean industries, and support the
healthy natural resources that our economy and communities depend on . Federal, state
Today,

governments have a variety of roles and responsibilities when it comes to the


ocean, and make decisions every day that impact ocean resources, industries and coastal communities. Regions that choose to
and local

do marine planning are guaranteeing that the public and marine stakeholders will shape these decisions early on, promoting better
outcomes for everyone. Regions can define what they want to address and how they do so, in ways that reflect their unique

some components of planning like making sure the public and


are common to all regions. The Handbook provides
guidance on how regions can address their priorities through a bottom-up,
transparent, science-based process. The Handbook reflects the extensive public and stakeholder input received
interests and priorities. At the same time,
stakeholders have a chance to engage

in the development of the National Ocean Policy and its Implementation Plan. We will update it as needed to reflect the lessons
learned in regions and ensure it continues to be a useful guide for successful, collaborative planning.

New programs don't generate interest and STEM is inevitable


Spudis 2011 Paul Spudis, Senior Staff Scientist at the Lunar and Planetary
Institute, 2011, The Once and Future Moon Blog, Smithsonian Air and Space Blog,
5/14/11, Young Visitors Inspire Old Scientist,
http://blogs.airspacemag.com/moon/2011/05/young-visitors-inspire-old-scientist/

A perennial hand-wringing topic among policy geeks is America's decline in math and science proficiency. This sentiment has been expressed the entire 30 years I've worked on space science and exploration: new generations don't care about space, can't do math and science, can't think properly, and the country's going to hell in a hand basket. Complaint about the decline in our ability is something passed from one generation to the next. Today's youth are being corrupted by iPods, Facebook and hip-hop; when I was a kid, it was Frisbees, MAD magazine and the Beatles. There is a continuous stream of doom-laden reports outlining the decline of American youth in the fields of science, technology, engineering and math (called STEM). In this country, most Ph.D.s in science and technology are now foreign-born (these reports don't mention that most often, they stay here, adding to our technical knowledge base). Multiple factors are suggested as contributors to this decline, with the lack of an inspiring, exciting space program long believed to be important by many advocates. This meme, of long currency in space policy circles, has some flaws. Origins of the association between space exploration and science education go back to the days of Sputnik, the "ping" that shocked and alarmed the country. This event prompted loud public cries for somebody to do something about the educational decline of America's youth (corrupted then by '57 Chevys, hula-hoops and Elvis). Congress responded in the usual manner: they threw money at the problem. The National Defense Education Act of 1958 (interesting wording, that) created a huge infrastructure largely dependent upon federal funding that directly answered the Soviet challenge in space. The number of science graduates exploded over the next couple of decades, leading many to conclude that 1) the excitement generated by the Apollo program inspired these students to aspire to careers in science; 2) huge amounts of federal money can solve any problem. Although Apollo is now a distant memory (or for many, a distant, historical event that they've only read about in third-hand accounts), the confluence of space and education is taken as a given by many in the business. NASA spends a designated fraction of their budget on a process called EPO (Education and Public Outreach), designed to inform and inspire the next generation of scientists and explorers. As you might expect, these efforts range from the interesting and innovative to the embarrassing (though well intentioned). A perception has emerged that the problem lies not with the methodology, but with the product: because we are not doing anything in space that is exciting, we aren't producing quality scientists and engineers. This may well account for what sensible students with an eye toward putting food on the table after they graduate choose to study. Then too, perhaps there are too many in the field already. But with effort, excellence will find productive work; self-esteem and entitlement will not cut it in the long run, no matter what your field of endeavor. Recently, I had the opportunity to directly interact with students at two ends of the education pipeline and found the experience highly encouraging. In the first case, the father of a local second-grader asked if his son could visit and interview me. The boy had chosen to write a semester paper (in second grade??) about the Moon. The child was both well spoken and well informed. He asked relevant and very intelligent questions. What is the value of the Moon? What do we want to know about it and how do we find these things out? Can people live there? I found his questions and understanding of the problems and benefits of exploring the Moon to be at a very high level (much higher than many adult reporters who call me). Then he asked me an unexpected question: How fast does the Moon travel through space? After initially drawing a complete blank, I suggested that we find out together and went on to calculate it on the spot. We concluded that the Moon flies around the Earth at over 2200 miles per hour (much faster than he traveled down the freeway to visit me). He was delighted by this episode of science in action. I was delighted to be challenged by his understanding and his interest in the topic. Around the same time, a high school debate coach contacted me. He told me that next year's debate question is "Should humans explore space?" and asked if I could assist his team, as they were collecting information for their briefing books. Once again, I was pleasantly surprised by their level of knowledge and their understanding of complex issues. We reviewed the history of the space program, and why and how the current policy confusion has developed. These students were informed and sharp. They had already read and digested a great deal of information, drawing insight and conclusions about issues the space community is embroiled in. Their questions were both penetrating and logical, and sent a clear message of their desire to fully understand the technical and programmatic issues involved. What did I conclude from my encounter with a sample of today's youth? Mostly, that reports of the demise of our Republic are premature. These kids were smart and well informed. They could assimilate new information and apply it to other topics in clever ways. They had an enthusiasm for their subject that was both gratifying and surprising. And, they are interested in space, regardless of the current uninspiring nature of the program. Inspiration is great, but it's a highly personal factor and its impact and importance are difficult to measure. The current STEM/outreach process at NASA conflates excitement and inspiration, but they are two different things. The circus entertains us, but we find inspiration elsewhere. We need to focus on building a stable space program that will give us long-term benefits: a step-wise, incremental program that gradually increases the extent of our reach into space. Compared to the current policy chaos, it just might be inspirational too.
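Analytic note checking the 2200 mph figure (simple circular-orbit arithmetic using standard values for the Moon's mean orbital radius and sidereal period, not taken from the card):
$v = \frac{2\pi r}{T} = \frac{2\pi \times 238{,}900\ \mathrm{mi}}{27.3\ \mathrm{days} \times 24\ \mathrm{h/day}} \approx 2{,}290\ \mathrm{mph}$,
consistent with the "over 2200 miles per hour" answer Spudis and the student worked out.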

Growth is not a strong enough predictor of war to matter


Blackwill 2009 Robert Blackwill, former associate dean of the Kennedy School of Government and Deputy Assistant to the President and Deputy National Security Advisor for Strategic Planning, "The Geopolitical Consequences of the World Economic Recession -- A Caution," http://www.rand.org/pubs/occasional_papers/OP275.html
Did the economic slump lead to strategic amendments in the way Japan sees the world? No. Did it slow the pace of India's emergence as a rising great power? No. To the contrary, the new Congress-led government in New Delhi will accelerate that process. Did it alter Iran's apparent determination to acquire a nuclear capability or something close to it? No. Was it a prime cause of the recent domestic crisis and instability in Iran after its 2009 presidential election? No. Did it slow or accelerate the moderate Arab states' intent to move along the nuclear path? No. Did it affect North Korea's destabilizing nuclear calculations? No. Did it importantly weaken political reconciliation in Iraq? No, because there is almost none in any case. Did it slow the Middle East peace process? No, not least because prospects for progress on issues between Israel and the Palestinians are the most unpromising in 25 years. Did it substantially affect the enormous internal and international challenges associated with the growth of Jihadiism in Pakistan? No. But at the same time, it is important to stress that Pakistan, quite apart from the global recession, is the epicenter of global terrorism and now represents potentially the most dangerous international situation since the 1962 Cuban Missile Crisis. Did the global economic downturn systemically affect the future of Afghanistan? No. The fact that the United States is doing badly in the war in Afghanistan has nothing to do with the economic deterioration. As Henry Kissinger observes, "The conventional army loses if it does not win. The guerrilla wins if he does not lose." And NATO is not winning in Afghanistan. Did it change in a major way the future of the Mexican state? No. Did the downturn make Europe, because of its domestic politics, less willing and able over time to join the U.S. in effective alliance policies? No, there will likely be no basic variations in Europe's external policies and no serious evolution in transatlantic relations. As President Obama is experiencing regarding Europe, the problems with European publics in this regard are civilizational in character, not especially tied to this recession; in general, European publics do not wish their nations to take on foreign missions that entail the use of force and possible loss of life. Did the downturn slow further EU integration? Perhaps, at the margin, but in any case one has to watch closely to see if EU integration moves like a turtle or like a rock. And so forth. To be clear, there will inevitably be major challenges in the international situation in the next five years. In fact, this will be the most dangerous and chaotic global period since before the 1973 Middle East war. But it is not obvious that these disturbing developments will be primarily a result of the global economic problems. It is, of course, important to be alert to primary and enduring international discontinuities. If such a convulsive geopolitical event is out there, what is it? One that comes to mind is another catastrophic attack on the American homeland. Another is the collapse of Pakistan and the loss of government control of its nuclear arsenal to Islamic extremists. But again, neither of these two geopolitical calamities would be connected to the current economic decline. Some argue that, even though geopolitical changes resulting from the current global economic tribulations are not yet apparent, they are occurring beneath the surface of the international system and will become manifest in the years to come. In short, causality not perceptible now will become so. This subterranean argument is difficult to rebut. To test that hypothesis, the obvious analytical method is to seek tangible data that demonstrates that it is so. In short, show A, B, and/or C (in this case, geopolitical transformations caused by the world slump) to have occurred, thus substantiating the contention. One could then examine said postulated evidence and come to a judgment regarding its validity. To instead contend that, even though no such data can be adduced, the assertion, nevertheless, is true because of presently invisible occurrences seems more in the realm of religious conviction than rigorous analysis. But it is worth asking, as the magisterial American soldier/statesman George Marshall often did, "Why might I be wrong?" If the global economic numbers continue to decline next year and the year after, one must wonder whether any region would remain stable: whether China would maintain internal stability, whether the United States would continue as the pillar of international order, and whether the European Union would hold together. In that same vein, it is unclear today what effect, if any, the reckless financial lending and huge public debt that the United States is accumulating, as well as current massive governmental fiscal and monetary intervention in the American economy, will have on U.S. economic dynamism, entrepreneurial creativity, and, consequently, power projection over the very long term. One can only speculate on that issue at present, but it is certainly worth worrying about, and it is the most important "known unknown"27 regarding this subject.28 In addition, perhaps the Chinese Communist Party's grip on China is more fragile than posited here, and possibly Pakistan and Mexico are much more vulnerable to failed-state outcomes primarily because of the economic downturn than anticipated in this essay. While it seems unlikely that these worst-case scenarios will eventuate as a result of the world recession, they do illustrate again that crucial uncertainties in this analysis are the global downturn's length and severity and the long-term effects of the Obama Administration's policies on the U.S. economy. Finally, if not, why not? If the world is in the most severe international economic crisis since the 1930s, why is it not producing structural changes in the global order? A brief answer is that the transcendent geopolitical elements have not altered in substantial ways with regard to individual nations in the two years since the economic crisis began. What are those enduring geopolitical elements? For any given country, they include the following: Geographic location, topography, and climate. As Robert Kaplan puts it, "to embrace geography is not to accept it as an implacable force against which humankind is powerless. Rather, it serves to qualify human freedom and choice with a modest acceptance of fate."29 In this connection, see in particular the works of Sir Halford John Mackinder and his "The Geographical Pivot of History" (1904)30, and Alfred Thayer Mahan, The Influence of Sea Power upon History, 1660-1783 (1890).31 Demography: the size, birth rate, growth, density, ethnicity, literacy, religions, migration/emigration/assimilation/absorption, and industriousness of the population. The histories, foreign and defense policy tendencies, cultural determinants, and domestic politics of individual countries. The size and strength of the domestic economy. The quality and pace of technology. The presence of natural resources. The character, capabilities, and policies of neighboring states. For the countries that matter most in the global order, perhaps unsurprisingly, none of these decisive variables have changed very much since the global downturn began, except for nations' weaker economic performances. That single factor is not likely to trump all these other abiding geopolitical determinants and therefore produce international structural change. Moreover, the fundamental power relationships between and among the world's foremost countries have also not altered, nor have those nations' perceptions of their vital national interests and how best to promote and defend them. To sum up this pivotal concept, in the absence of war, revolution, or other extreme international or domestic disruptions, for nation-states, the powerful abiding conditions just listed do not evolve much except over the very long term, and thus neither do countries' strategic intent and core external policies, even, as today, in the face of world economic trials. This point was made earlier about Russia's enduring national security goals, which go back hundreds of years. Similarly, a Gulf monarch recently advised with respect to Iran not to fasten on the views of President Ahmadinejad or Supreme Leader Khamenei. Rather, he counseled that, to best understand contemporary Iranian policy, one should more usefully read the histories, objectives, and strategies of the Persian kings Cyrus, Darius, and Xerxes, who successively ruled a vast empire around 500 BC.32 The American filmmaker Orson Welles once opined that "To give an accurate description of what never happened is the proper occupation of the historian."33 Perhaps the same is occasionally true of pundits.

Disease

No Solvency
They can't solve disease; the anti-vaccination movement gives disease a foothold within communities, and it spreads from there
Kollipara 5/5 Puneet Kollipara, Journalist for the Washington Post, 5/5/2014, "How the Anti-Vaccine Movement is Endangering Lives," http://www.washingtonpost.com/blogs/wonkblog/wp/2014/05/05/how-the-anti-vaccine-movement-is-endangering-lives/

Infectious diseases that we normally think of as rare in the United States are making a comeback. In recent years, pertussis -- also known as whooping cough -- has returned to the headlines. A measles outbreak that struck a Texas megachurch community late last summer sickened 21 people. And just recently, at least 16 people got sick during a measles outbreak in Ohio. In fact, the Centers for Disease Control and Prevention recently reported 13 measles outbreaks so far in 2014 -- the most since 1996. The diseases are highly contagious, but they are also preventable; two of the many recommended childhood vaccinations protect against measles and pertussis. And measles had been considered effectively eliminated in the United States. What's going on, exactly? Here are some answers. Why are so many outbreaks happening? Although it's a complex problem, health officials say one key culprit is that more and more people are choosing not to get their kids vaccinated against these diseases. For instance, in California parents are increasingly seeking personal or religious exemptions from vaccination requirements for their kids to attend schools. Substandard vaccination rates create an opening for outbreaks, which often start when an unvaccinated person catches the disease while traveling abroad and spreads the illness to friends and family upon returning. But aren't overall vaccination rates really high? Nationally, yes, though it wasn't always this way. Before the 1990s, rates languished below 80 percent for most childhood vaccines at the time. In 1993, after the 1989-1991 measles outbreak, Congress enacted the Vaccines for Children (VFC) program to promote childhood vaccinations. CDC data show that vaccination rates are now above 90 percent for several routine vaccines, including the measles-mumps-rubella (MMR) and whooping cough vaccines. Public health officials target a 90 percent vaccination rate for most routine childhood vaccines. Experts say that a population has "herd immunity" when enough people are vaccinated to prevent a disease from taking hold. [Chart omitted: vaccination rates climbing after VFC's enactment.] If vaccination rates are high, why are we seeing so many outbreaks? That's because vaccination rates aren't geographically uniform. Public-health experts say that high non-vaccination or exemption rates can occur among pockets of people, particularly at the county or city level. And some research has found that outbreaks are far more likely to happen in these areas, such as during the recent whooping cough outbreaks in California. Why are people not vaccinating their kids? There are a number of factors at play. Many of the diseases we vaccinate against are so rare here now that the public's awareness of vaccination might have decreased. But the one reason that has most alarmed public-health experts lately has been the rise of the anti-vaccine movement. Groups and activists such as celebrity Jenny McCarthy have repeatedly claimed that vaccines cause autism. This vaccine-autism concern may be causing a drop in childhood vaccination rates in many communities, including in affluent, well-educated ones. Do vaccines cause autism? Science gives a resounding no. Anti-vaccine activists often hang their case on a study published in the British journal The Lancet in 1998. This study, which posited a link between the MMR vaccine and autism, was widely discredited by the scientific community and eventually retracted. But the anti-vaccine movement has still gained steam. Anti-vaccine activists have also raised concerns about vaccines made with the mercury-based preservative known as thimerosal, which they worry could cause brain damage in developing kids. It's true that vaccines once routinely contained thimerosal, which government officials recognized as generally safe. But this preservative has been phased out of nearly all vaccines as a precautionary measure. Anti-vaccine activists also worry that the CDC's recommended vaccination schedule could overwhelm infants' immune systems by packing too many doses into a short period of time. Although the number of vaccinations that kids receive now is higher than it used to be, the main ingredients in the vaccines have actually decreased in amount. Even if these ingredient amounts hadn't decreased, research has found no relationship between those amounts and autism risk. Vaccines do carry a risk of side effects, but they are usually minor. The CDC has concluded from reviewing the scientific evidence that there's no causal link between childhood vaccinations and autism.
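Analytic note on the 90 percent target (textbook herd-immunity arithmetic; the $R_0$ values are standard epidemiology estimates, not from the card): the critical vaccination fraction is $p_c = 1 - 1/R_0$. For measles, with $R_0 \approx 12$ to $18$, $p_c \approx 0.92$ to $0.94$, which is why even small unvaccinated pockets falling below that level can sustain outbreaks despite high national averages.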

No Solvency-Can't Contain
The aff can't solve; containment is impossible
Blancou et al 2005 Jean Blancou, former General Director of OIE, Bruno Chomel, Researcher for WHO/PAHO Collaborating Center on New and Emerging Zoonoses, Albino Belotto, Researcher for Veterinary Public Health Unit, Pan American Health Organization, François Meslin, Researcher for Strategy Development and Monitoring of Zoonoses, Food-borne Diseases and Kinetoplastidae (ZFK), Communicable Diseases Control Prevention and Eradication, "Emerging or re-emerging bacterial zoonoses: factors of emergence, surveillance and control," Vet. Res. 36, pg 507-522, http://hal.archives-ouvertes.fr/docs/00/90/29/77/PDF/hal-00902977.pdf
The main obstacles that are encountered in the control of bacterial zoonoses are the same as those opposed to the control of any infectious disease, that is, most often financial and human obstacles rather than technical limitations. The financial resources needed to efficiently fight against zoonotic agents are not available for all countries. Only the international community's financial support could, notably, allow developing countries to organize a proper control of zoonotic diseases, but it is rare that this is materialized as a financial gift, and mobilization of specific funds, even by well-known international organizations (such as WHO, FAO, OIE), is limited for such diseases. Due to all these difficulties, many sanitary authorities of these countries have given up the establishment of such prevention programs. Others manage, with a lot of perseverance, to elaborate complicated multilateral financial arrangements. This allows punctual projects to be realized, but rarely to establish the long-term prophylaxis plans that they really need. When financial and material problems are supposedly solved, human-related difficulties should not be underestimated. These difficulties can originate within the services in charge of applying the national prophylaxis plans, when these services are not themselves convinced of the good use of these plans, or when they do not seem to get specific benefits from it. The obstacles sometimes result from a lack of cooperation between specific professional categories, amongst which figure breeders, as well as livestock brokers or even veterinarians bothered by the application of certain programs of control or the limited incentive given by the health authorities for performing prophylaxis tasks. Finally, the obstacle to such plans may be caused by the active opposition of the public opinion to certain methods of control. This is notably the case for the hostility of some groups to the mass slaughtering of animals during epizootics, or to the use of vaccines issued from genetic engineering. By lack of an appropriate consensus, the control of some zoonotic diseases may simply be impossible in some countries.

No Extinction
---Super viruses won't cause extinction
(A.) Burnout.
Lafee 2009
Scott, Union-Tribune Staff Writer, "Viruses versus hosts: a battle as old as time," May 3rd, http://www.signonsandiego.com/news/2009/may/03/1n3virus01745-viruses-versus-hosts-battle-old-time/?uniontrib
Generally speaking, it's not in a virus's best interest to kill its host. Deadly viruses such as Ebola and SARS are self-limiting because they kill too effectively and quickly to spread widely. Flu viruses do kill, but they aren't considered especially deadly. The fatality rate of the 1918 Spanish flu pandemic was less than 2.5 percent, and most of those deaths are now attributed to secondary bacterial infections. The historic fatality rate for influenza pandemics is less than 0.1 percent. Humans make imperfect hosts for the nastiest flu viruses, Sette said. From the point of view of the virus, infecting humans can be a dead end. We sicken and die too soon.

(B.) Genetic diversity.


Sowell 2001
Thomas, Fellow @ Hoover Institution, Jewish World Review, The Dangers of Equality, 3-5,
http://www.jewishworldreview.com/cols/sowell030501.asp
People have different vulnerabilities and resistances to a variety of diseases. That is why one disease is unlikely to wipe out the human species, even in one place. An epidemic that sweeps through an area may leave some people dying like flies while others remain as healthy as horses.

(C.) Co-evolution.
Posner 2005
Richard, Judge, 7th Circuit Court of Appeals, Catastrophe: Risk and Response, pg. 22
AIDS illustrates the further point that despite the progress made by modern medicine in the diagnosis and treatment of diseases, developing a vaccine or cure for a new (or newly recognized or newly virulent) disease may be difficult, protracted, even impossible. Progress has been made in treating AIDS, but neither a cure nor a vaccine has yet been developed. And because the virus's mutation rate is high, the treatments may not work in the long run.7 Rapidly mutating viruses are difficult to vaccinate against, which is why there is no vaccine for the common cold and why flu vaccines provide only limited protection.8 Paradoxically, a treatment that is neither cure nor vaccine, but merely reduces the severity of a disease, may accelerate its spread by reducing the benefit from avoiding becoming infected. This is an important consideration with respect to AIDS, which is spread mainly by voluntary intimate contact with infected people. Yet the fact that Homo sapiens has managed to survive every disease to assail it in the 200,000 years or so of its existence is a source of genuine comfort, at least if the focus is on extinction events. There have been enormously destructive plagues, such as the Black Death, smallpox, and now AIDS, but none has come close to destroying the entire human race. There is a biological reason. Natural selection favors germs of limited lethality; they are fitter in an evolutionary sense because their genes are more likely to be spread if the germs do not kill their hosts too quickly. The AIDS virus is an example of a lethal virus, wholly natural, that by lying dormant yet infectious in its host for years maximizes its spread. Yet there is no danger that AIDS will destroy the entire human race. The likelihood of a natural pandemic that would cause the extinction of the human race is probably even less today than in the past (except in prehistoric times, when people lived in small, scattered bands, which would have limited the spread of disease), despite wider human contacts that make it more difficult to localize an infectious disease. The reason is improvements in medical science. But the comfort is a small one. Pandemics can still impose enormous losses and resist prevention and cure: the lesson of the AIDS pandemic. And there is always a first time.

No Impact- Zoonotic Disease
Zoonotic diseases are less threatening than other diseases; they are contained now
Torres 1999 Alfonso Torres, D.V.M., M.S., Ph.D., Deputy Administrator, USDA,
Animal and Plant Health Inspection Service, Veterinary Services, 2/6/99,
International Economic Considerations Concerning Agricultural Diseases and
Human Health Costs of Zoonotic Diseases, Annals of the New York Academy of
Sciences 894:80-82, http://onlinelibrary.wiley.com/doi/10.1111/j.1749-6632.1999.tb08047.x/abstract

Animal diseases can negatively affect the number and availability of animals, their productivity, or their appearance. A few centuries ago, animal diseases affected mostly individual owners or herdsmen, but did not have serious consequences on the larger community. A similar event today will not only have a negative impact on the animal owners, but more importantly, will significantly affect the general economy of the region, the entire nation, even a group of nations. The importance of animal diseases as an element affecting international trade of animals and animal products has reached its full impact level with the recent designation by the World Trade Organization (WTO) of the International Office of Epizootics (OIE) as the international agency in charge of establishing animal health standards upon which international commerce can institute restrictions to prevent the spread of animal diseases from one nation to another. It is important to point out that while the spread of human diseases around the world is due to the unrestricted movement of people across political boundaries, animal diseases are, for the most part, restricted to defined geographic areas of the world due to the implementation of animal importation requirements, quarantines, animal movement regulations, and by disease control measures that include mass vaccination campaigns and animal depopulation practices. A number of animal diseases have been eradicated from countries or even from continents around the world by aggressive, well-coordinated, long-term animal health campaigns. This is in contrast to the relatively few human diseases successfully eradicated from large areas of the world.

Warming DA turns Disease


Warming turns the disease advantage; climate change allows diseases to spread more effectively
NRDC 2011 Natural Resources Defense Council, environmental protection
group, 8/3/11, Infectious Diseases: Dengue Fever, West Nile Virus, and Lyme
Disease, http://www.nrdc.org/health/climate/disease.asp

While many infectious diseases were once all but eliminated from the United States, there's evidence that climate change is a factor that could help them expand their range and make a comeback. Mosquitoes capable of carrying and transmitting diseases like Dengue Fever, for example, now live in at least 28 states. As temperatures increase and rainfall patterns change - and summers become longer - these insects can remain active for longer seasons and in wider areas, greatly increasing the risk for people who live there. The same is true on a global scale: increases in heat, precipitation, and humidity can allow tropical and subtropical insects to move from regions where infectious diseases thrive into new places. This, coupled with increased international travel to and from all 50 states, means that the U.S. is increasingly at risk for becoming home to these new diseases. Nearly 4,000 cases of imported and locally-transmitted Dengue Fever were reported in the U.S. between 1995 and 2005, and that number rises to 10,000 when cases in the Texas-Mexico border region are included. In Florida, 28 locally-transmitted cases were reported in a 2009-2010 outbreak, the first there in more than 40 years. Dengue Fever, also known as "Breakbone Fever", is characterized by high fever, headaches, bone and joint aches, and a rash. Recurrent infection can lead to bleeding, seizures, and death. Lyme disease - transmitted primarily through bites from certain tick species - could expand throughout the United States and northward into Canada as temperatures warm, allowing ticks to move into new regions. West Nile virus, which first entered the U.S. in 1999, expanded rapidly westward across the country. By 2005, over 16,000 cases had been reported. Warmer temperatures, heavy rainfall and high humidity have reportedly increased the rate of human infection.

Sea Power

1NC- Sea Power Now


The navy is extremely powerful; the loss of information systems wouldn't hurt it
Mizokami 14(Kyle, June 6, The Five Most-Powerful Navies on the Planet, Kyle Mizokami is a writer based in San Francisco who has appeared in The Diplomat, Foreign Policy, War is Boring and The Daily Beast. In 2009 he cofounded the defense and security blog Japan Security Watch. The National Interest, http://nationalinterest.org/feature/the-five-most-powerful-navies-the-planet-10610)

First place on the list is no surprise: the United States Navy. The U.S. Navy has the
most ships by far of any navy worldwide. It also has the greatest diversity of
missions and the largest area of responsibility. No other navy has the global reach
of the U.S. Navy, which regularly operates in the Pacific, Atlantic and Indian
Oceans, as well as the Mediterranean, Persian Gulf and the Horn of Africa. The
U.S. Navy also forward deploys ships to Japan, Europe and the Persian Gulf. The
U.S. Navy has 288 battle force ships, of which typically a third are underway at any
given time. The U.S. Navy has 10 aircraft carriers, nine amphibious assault ships,
22 cruisers, 62 destroyers, 17 frigates and 72 submarines. In addition to ships, the
U.S. Navy has 3,700 aircraft, making it the second largest air force in the world. At
323,000 active and 109,000 reserve personnel, it is also the largest navy in terms of manpower. What makes the U.S. Navy stand out the most is its 10 aircraft carriers, more than the rest of the world put together. Not only are there more of them, they're also much bigger: a single Nimitz-class aircraft carrier can carry twice as
many planes (72) as the next largest foreign carrier. Unlike the air wings of other
countries, which typically concentrate on fighters, a typical U.S. carrier air wing is
a balanced package capable of air superiority, strike, reconnaissance, anti-submarine warfare and humanitarian assistance/disaster relief missions. The U.S. Navy's 31 amphibious ships make it the largest "gator" fleet in the world, capable
of transporting and landing on hostile beaches. The nine amphibious assault ships
of the Tarawa and Wasp classes can carry helicopters to ferry troops or act as
miniature aircraft carriers, equipped with AV-8B Harrier attack jets and soon F-35B
fighter-bombers. The U.S. Navy has 54 nuclear attack submarines, a mix of the Los
Angeles, Seawolf, and Virginia classes. The U.S. Navy is also responsible for the
United States' strategic nuclear deterrent at sea, with 14 Ohio-class ballistic
missile submarines equipped with a total of 336 Trident nuclear missiles. The USN
also has four Ohio-class submarines stripped of nuclear missiles and modified to
carry 154 Tomahawk land attack missiles. The U.S. Navy has the additional roles of
ballistic missile defense, space operations and humanitarian assistance/disaster
relief. As of October 2013, 29 cruisers and destroyers were capable of intercepting
ballistic missiles, with several forward deployed to Europe and Japan. It also
monitors space in support of U.S. military forces, tracking the satellites of potential
adversaries. Finally, the U.S. Navy's existing aircraft carriers and amphibious
vessels, plus the dedicated hospital ships USNS Mercy and USNS Comfort,
constitute a disaster relief capability that has been deployed in recent years to
Indonesia, Haiti, Japan and the Philippines.

2NC- Sea Power Now


US naval power is decades ahead of the nearest competitor; the plan doesn't matter
Bloomberg 14(April 9, China's Aircraft Carrier Is Nothing to Fear,
http://www.bloombergview.com/articles/2014-04-09/china-s-aircraft-carrier-is-nothing-tofear)

China's plans for a blue-water navy centered on aircraft carriers follow a course the U.S. Navy plotted more than half a century ago. This head start gives the U.S. an insurmountable lead in operational experience and military hardware. The Liaoning is a refurbished Soviet warship with a conventional (not nuclear) power plant and a tilted deck that severely limits the range and payload of its aircraft. It would be no match for any of the U.S. Navy's 11 nuclear-powered carriers, each carrying twice as many aircraft and state-of-the-art catapults for launching them. China's planned new carriers will be far more capable than the Liaoning -- but at a crippling price. The newest U.S. carrier will cost $13 billion. Add roughly 60 aircraft, 6,700 sailors, and other vessels to support and protect it, and a single U.S. carrier strike group costs roughly $6.5 million a day to operate. Such outlays are a questionable investment even for the U.S., let alone China. China's blue-water ambitions will drain a significant portion of its military budget to a sphere it cannot hope to dominate. Better still, the strategic aims that China can reasonably advance with a carrier-based navy are relatively benign. Such a force could protect China's energy lifeline of oil tankers stretching to the Middle East, which is occasionally threatened by pirates. It could more effectively respond to humanitarian disasters and, if the need arose, evacuate Chinese citizens working in far-flung locations. In addition, of course, China's government wants its blue-water navy for prestige, as a symbol of the country's rise to great-power status. That's fine, too -- so long as it poses no threat to U.S. naval dominance. When it comes to Chinese military power, aircraft carriers are the least of the world's concerns.

1NC- Information Dominance Now


Lockheed Martin solves data collection integration now
Lockheed Martin 14(June 30, Lockheed Martin Awarded Contract to Enhance U.S.
Navy C4ISR Collection and Dissemination Capabilities,
http://www.lockheedmartin.com/us/news/press-releases/2014/june/isgs-navyC4ISR0630.html)

Lockheed Martin [NYSE: LMT] will work to enhance how the Navy exchanges C4ISR data throughout the space, air, surface, subsurface, and unmanned sensor domains under a contract with Space and Naval Warfare Systems Center Pacific. This IDIQ contract has a ceiling value of $35 million over five years. "For the Navy, every platform is a sensor, and every sensor must be networked," said Dr. Rob Smith, vice president of C4ISR for Lockheed Martin Information Systems and Global Solutions. "We'll leverage our more than 30 years developing and fielding signals intelligence systems to increase the Navy's intelligence sharing capability across the full spectrum of maritime and littoral missions." Lockheed Martin co-developed the Navy's Distributed Information Operations-System, which addresses the Navy's need for network-centric intelligence to improve interoperability and enhance battlespace awareness. For that effort, Lockheed Martin connected disparate Navy signals intelligence systems, facilitating tactical data exchange and allowing commanders to better understand their operational environment. Building upon those capabilities, Lockheed Martin will continue to enhance the Navy's signals intelligence collection, data fusion, and intelligence processing and dissemination capabilities. This could involve integrating and deploying capabilities that monitor the status of all sensors registered in the network, then displaying the input from those sensors in support of real-time planning. Network integration of sensors will be designed to accomplish cross-cueing, cooperative sensing and, where feasible and prudent, automated target recognition or classification. The work scope for this contract also includes analyzing ways to enhance the Navy's use of Unmanned Aerial Vehicles (UAVs) for surface combatant land attacks.

Lockheed Martin solves information dominance


Tradingcharts.com 14(July 1, Lockheed Martin gets contract to enhance US Navy
C4ISR collection, dissemination capabilities,
http://futures.tradingcharts.com/news/futures/Lockheed_Martin_gets_contract_to_enhance_U
S_Navy_C4ISR_collection__dissemination_capabilities_216288980.html)

Lockheed Martin (NYSE: LMT) said it will work to enhance how the Navy exchanges
C4ISR data throughout the space, air, surface, subsurface, and unmanned sensor
domains under a contract with Space and Naval Warfare Systems Center Pacific.
This IDIQ contract has a ceiling value of USD35m over five years. Lockheed
Martin co-developed the Navy's Distributed Information Operations-System, which
addresses the Navy's need for network-centric intelligence to improve
interoperability and enhance battlespace awareness. For that effort, Lockheed
Martin connected disparate Navy signals intelligence systems facilitating tactical
data exchange and allowing commanders to better understand their operational
environment.

2NC- Information Dominance Now

Lockheed Martin solves any problems with sensors

Cheng 14(Joey, June 30, Navy awards $35M contract to boost C4ISR info sharing,
Defense Systems, Joey Cheng is a writer for Defense Systems Magazine,
http://defensesystems.com/articles/2014/06/30/spawar-dio-s-unpgrades-lockheed.aspx)

In an effort to improve information sharing from its multitude of sensors, the Navy
has awarded Lockheed Martin a contract to enhance the service's C4ISR data exchange capabilities. The indefinite delivery/indefinite quantity contract with the Space and Naval Warfare Systems Center Pacific has a ceiling of up to $35 million over five years, and would upgrade how the Navy's space, air, surface, subsurface and unmanned sensors would collect and disseminate data, according to a Lockheed Martin release. SPAWAR is the Navy's Information Dominance system
command, and is responsible for the development of communications and
information capabilities for warfighters. The Navy's Distributed Information
Operations-System, originally co-developed by Lockheed, was designed to improve
interoperability and enhance battlespace awareness through network-centric
intelligence, and connects a variety of signals intelligence systems for tactical data
exchange. Leveraging its experience with DIO-S, Lockheed may be working to
implement a monitoring system capable of checking the statuses of all of the
sensors registered in its network in the future. The same system would then
display the input from the sensors for real-time planning. Further integration of the
sensors would allow cross-cueing and cooperative sensing, as well as automated
target recognition. The scope of the contract also includes possible enhancements
of the Navy's use of unmanned aerial vehicles for ground attacks. Lockheed Martin
also is a co-developer of the Distributed Common Ground System, which seeks to integrate multiple intelligence, surveillance and reconnaissance (ISR) sensors from different services and agencies into a common network. The Navy's version, DCGS-N, is the service's primary ISR&T (the T is for Targeting) support system and
provides processing, exploitation, and dissemination capabilities for the
operational and tactical level. These systems are designed to allow modularity,
flexibility, and standardization for integrating data sources, transformation services,
and user interfaces, according to Lockheed Martin. "For the Navy, every platform is a sensor, and every sensor must be networked," said Dr. Rob Smith, vice president of C4ISR for Lockheed Martin Information Systems and Global Solutions. "We'll leverage our more than 30 years developing and fielding signals intelligence systems to increase the Navy's intelligence sharing capability across the full spectrum of maritime and littoral missions."

Current navy sensors solve


USNIDC 13(US Navy Information Dominance Corps, March, US Navy Information Dominance Roadmap 2013-2028,
http://defenseinnovationmarketplace.mil/resources/Information_Dominance_Roadmap_Mar
ch_2013.pdf)

A number of changes are programmed to improve the Navy's ability to manage, collect, process and disseminate essential information to Navy units afloat and ashore. Near-term BA capabilities to be addressed during this timeframe are as follows:
- Centralized fleet collection management (CM) and tasking to support both standing and ad hoc requirements across diverse intelligence disciplines will be developed and will include: integrated collection plans, a consolidated collection requirements database, and the means to visualize available collection assets;
- Manned, unmanned and other diverse platforms will be fielded to extend organic sensor capability and capacity; commercial space-based imaging and Automatic Information Systems (AIS) collection systems will proliferate, and an emergent capability to task, plan and direct organic and other sensors will be developed;
- The transformation to distributed network environments will begin to emerge; high-performance computing will better understand and predict the physical and virtual environments; automation of multiple intelligence source (multi-INT) data/information fusion and correlation will evolve and adopt common data models and standards; and service-based architectural frameworks will be developed to enhance information sharing;
- An emerging system-of-systems approach for providing a Common Operational Picture (COP) and a Common Maritime Picture (CMP) will begin to shape tactics, techniques and procedures to enhance multi-domain awareness, and enterprise services and visualization tools will be developed to help understand information, depict actions and trends in near real-time;
- Data sharing will be enhanced, enterprise solutions will be pursued for data purchasing, and cross-domain solutions will be developed to begin consolidating Top Secret and Secret networks into a single classified domain; MOCs will remain as the centers of gravity for Fleet BA coordination and will serve as central nodes for the C2 of Navy ISR assets and resultant in-theater analysis, and fleet commanders and TAOs will be able to establish and share a maritime COP that is periodically updated;
- Planning tools with Theater Security Cooperation capabilities and mission partner capability information sources will become better integrated;
- Improved BA training for all operators and watchstanders will be developed.

1NC- Information Overload
IOOS fails; it overloads the system with too much data
RAND 14(The RAND Corporation is a nonprofit institution that helps improve policy
and decisionmaking through research and analysis, Data Flood Helping the Navy Address
the Rising Tide of Sensor Information,
http://www.rand.org/content/dam/rand/pubs/research_reports/RR300/RR315/RAND_RR315.
pdf)

Despite the battle-tested value of ISR systems, however, the large amount of data they generate has become overwhelming to Navy analysts. As the Intelligence Science Board wrote in 2008, referring to the entire Department of Defense (DoD), "the number of images and signal intercepts are well beyond the capacity of the existing analyst community, so there are huge backlogs for translators and image interpreters, and much of the collected data are never reviewed." This is a good description of the Navy's big data challenge.

2NC- Information Overload
IOOS doesn't solve: information overload
RAND 14(The RAND Corporation is a nonprofit institution that helps improve policy
and decisionmaking through research and analysis, Data Flood Helping the Navy Address
the Rising Tide of Sensor Information,
http://www.rand.org/content/dam/rand/pubs/research_reports/RR300/RR315/RAND_RR315.
pdf)

U.S. Navy intelligence, surveillance, and reconnaissance (ISR) functions have become critical to U.S. national security over the last two decades. Within the Navy, there is a growing demand for ISR data from drones and other sources that provide situational awareness, which helps Navy vessels avoid collisions, pinpoint targets, and perform a host of other mission-critical tasks. The amount of data generated by ISR systems has, however, become overwhelming. All of the data collected by the Navy, and available from other sources, both government and commercial, are potentially useful, but processing them and deriving useful knowledge from them are severely taxing the analytical capabilities of the Navy's humans and networks. As the Navy acquires and fields new and additional sensors for collecting data, this "big data" challenge will continue to grow. Indeed, if the Navy continues to field sensors as planned but does not change the way it processes, exploits, and disseminates information, it will reach an "ISR tipping point" (the point at which intelligence analysts are no longer able to complete a minimum number of exploitation tasks within given time constraints) as soon as 2016.

IOOS would fail: too much data collection


Sarkar 14(Dibya, May 21, Navy must deal with rising tide of sensor data, writer for
GCN, http://gcn.com/Articles/2014/05/21/RAND-Navy-cloud.aspx?Page=1)

"We're only now at the point where we're starting to put up new UAVs with incredible sensors, so the problem is only going to get worse," said Isaac R. Porche III, a senior researcher with RAND and co-author of the report. Porche said the Navy had argued for more manpower to deal with the growing volumes of data, but budgetary pressures forced it to seek other options to improve efficiency. RAND, which was hired to do a quantitative assessment, looked at the imagery, video and audio that the Navy was collecting from UAVs, studying how long it took for its analysts to process data from several siloed databases using different desktop applications. The report, "Data Flood: Helping the Navy Address the Rising Tide of Sensor Information," concluded that the Navy may reach a tipping point as early as 2016, when analysts are no longer able to complete a minimum number of exploitation tasks within given time constraints. In other words, analysts will be much less effective than they are now in finding value in the exponentially increasing data if the Navy doesn't change the way it collects, processes, uses, and distributes that data.

IOOS doesn't solve: sensors collect too much data


Ungerleider 14(Neal, May 23, Neal Ungerleider covers science and technology for
Fast Company, Will Drones Make The U.S. Navy Migrate To The Cloud?,
http://www.fastcolabs.com/3031082/will-drones-make-the-us-navy-migrate-to-the-cloud)

The U.S. Navy loves sensors. The same gyrometers and motion detectors that fuel smartphones and Microsoft's Kinect also keep drones in the skies and aircraft carriers in sharp order. But the Navy--which is the size of a large global megacorporation and also saddled with a huge bureaucracy--also has problems dealing with the flood of data that sensors create. The RAND Corporation is arguing for something unorthodox to deal with that data influx: a cloud for the Navy (PDF). In a paper called, succinctly, "Data Flood," think tank analyst Isaac R. Porche argues that the United States Navy needs a private cloud service to cope with the influx of data created by sensors on unmanned aerial vehicles (UAVs) and other smart military devices. Porche and his coauthors argue that there's a precedent for creating a private cloud service for the military--the little-known fact that the NSA, CIA, and other American intelligence agencies are building their own cloud. In late October, Amazon Web Services reportedly sealed a $600 million, 10-year deal with the CIA to develop a custom cloud infrastructure for U.S. intelligence agents. Porche and his team argue that if the Navy doesn't adopt a cloud infrastructure, there will be a tipping point as early as 2016 where analysts will become less and less effective because bandwidth choke will prevent access to the information they need. "2016 is the year when the number of UAS (drone) systems will be high enough and generating enough sensor data/imagery to overwhelm human analysts," Porche told Co.Labs by email.

Increasing the amount of information available does not increase military effectiveness
USNI 9, US Naval Institute blog, September 2009, Information Dominance!? Say
WHAT? http://blog.usni.org/2009/09/18/information-dominance-say-what
Information Dominance? Who was the Legion-of-Merit wearing O5 or O6 who thought THAT one up? Do we actually think we dominate the information spectrum? Nothing against VADM Dorsett, but this type of terminology represents a badly flawed understanding of information as it pertains to warfighting. This seems like the Cult of Cebrowski. Network-Centric Warfare as a concept is found very much wanting where the rubber meets the road. It might work if the enemy we are fighting is the United States Navy, who has elevated information technology to a place on Olympus. But for the rest of the world, our obsession with a flattened information hierarchy is much more of a vulnerability than an advantage. That obsession treats information like a drug. The more we have, the more we want. We are convinced that the key gem of information that will clear away all the fog MUST be coming in the next data dump. Except the data dump is increasingly difficult to sift through to find the critical information, if we are even aware of what is critical in light of so much unfiltered and unverified gatherings. The effort required to produce pertinent and actionable intelligence, or even timely and worthwhile information, oftentimes suffers from such an approach to information and intelligence gathering. Without launching a long dissertation regarding NCW and the problems created by information inundation resulting from a sensor/shooter marriage across a massive battlefield, I believe such a term as Information Dominance pays short shrift to a less sophisticated enemy whose extensive low-tech networks of informants are capable of negating billions of dollars of maritime stealth technology (or finding the right merchant ship in a 400-mile long shipping lane) by using the eyeball, the messenger, the motor scooter, or the sandal. Such an enemy, I would argue, has a much clearer and simpler set of information requirements, and is far more skilled, more disciplined, and much more successful at meeting those requirements than we are. So, who is dominant in the information game? One could argue very effectively that it is not us. Terms such as Information Dominance ring of a grandiose bit of self-delusion that is part plan and part capability, with a large measure of wishful thinking. In the words of a certain retired Marine Corps General, words mean things. They also reveal a lot about who uses them, and how. The term Information Dominance gives me the willies. Can we call it something a little less presumptuous?

1NC- Sea Power Not K2 Deterrence


Naval power is irrelevant; there are no emerging threats
Gerson and Russell 11(Michael and Allison Lawler, Michael Gerson is a nationally
syndicated columnist who appears twice weekly in the Washington Post. Michael Gerson is
the Hastert Fellow at the J. Dennis Hastert Center for Economics, Government, and Public
Policy at Wheaton College in Illinois., Russell is also an assistant professor of Political
Science and International Studies at Merrimack College in North Andover, Massachusetts.
Russell holds a Ph.D. from the Fletcher School of Law and Diplomacy at Tufts University, an
M.A. in International Relations from American University in Washington, D.C., and a B.A. in
Political Science from Boston College., American Grand Strategy and Sea Power,
https://cna.org/sites/default/files/research/American%20Grand%20Strategy%20and
%20Seapower%202011%20Conference%20Report%20CNA.pdf)

According to Dr. John Mueller, the United States has been and will continue to be
substantially free from threats that require a great deal of military
preparedness. In his view, there are no major threats to U.S. security, and there
have been none since the end of the Second World War. During the Cold War, the
United States spent trillions of dollars to deter a direct military threat that did not
exist, since the Soviet Union had no intention of launching an unprovoked attack
on Europe or the United States. Despite the continued absence of significant
threats today, the United States is still engaged in a number of conflicts in an
effort to make the world look and act the way we want. In reality, however, most
modern security issues are not really military in nature; rather, they are policing
and diplomatic activities that do not require substantial U.S. military involvement.
While isolationism is not a viable policy, the United States does not need to use its
military to solve all of the problems in the world. Dr. Mueller argued that
the absence of war among developed countries since 1945 is the greatest single
development about war in history. The end of the Cold War ushered in a New
World Order, and from 1989 to 2000 the United States was engaged in what Dr.
Mueller called policing wars. There was very little domestic support for most of
these ventures, however, because there was a strong U.S. public aversion to
nation building, a low tolerance for casualties, and a lack of concrete political
gains from success.

Navy power is irrelevant; future conflicts will use ground forces
Till 6(Geoffrey, December, NAVAL TRANSFORMATION, GROUND FORCES, AND THE
EXPEDITIONARY IMPULSE: THE SEA-BASING DEBATE, GEOFFREY TILL is the Professor
of Maritime Studies at the Joint Services Command and Staff College,
http://www.strategicstudiesinstitute.army.mil/pdffiles/pub743.pdf)

Recent experience suggests that the lessons learned with such difficulty by the allies when they confronted the chaos ensuing from the sudden collapse of Nazi Germany in 1945 were major casualties of the Cold War. In Afghanistan and Iraq, the United States and its allies are being reminded painfully that stabilization and reconstruction require larger numbers of troops on the ground for much longer than preintervention planners might have thought necessary, and that the soldiers in question need much more than "mere" warfighting skills. To relearn these lessons will likely require a shift in the organizational cultures of the armed services. Navies and air forces around the world have drawn from this experience the obvious conclusion that future defense priorities in countries with similar interventionist aspirations are likely to reflect a growing relative emphasis on the provision of intelligently trained and responsive "boots on the ground." With resources being finite, defense expenditure on those aspects of air and naval forces whose function seems less than wholly related to this central aim seems likely to be limited.

2NC- Sea Power Not Key to Deterrence


Naval power not key; ground forces are sufficient
Rhodes 99(Edward, Spring, Naval War College Review, . . . From the Sea and Back Again: Naval Power in the Second American Century, Dr. Rhodes is associate professor of international relations and director of the Center for Global Security and Democracy at Rutgers University, Volume 42, No. 2, http://fas.org/man/dod-101/sys/ship/docs/art1sp9.htm#AUTHOR)

On the other hand, it also suggests the possibility that the Army is right and that if
forward presence is to matter it needs to be on the ground, that an offshore
presence of a potent but limited force, with only the implicit threat of surged ground
forces, is less likely to have an impact, at least if the potential aggressor has limited
goals. It also suggests the possibility that the symbolism of naval forward
presence, serving as a reminder of the full weight and power the United States
could ultimately bring to bear, may not be that important. In war, the argument
that forward naval forces operating with a littoral strategy can have an important
impact in the initial phases of the conflict, thereby preparing the ground for later
U.S. successes, is doubtless true. While true, however, it may well be relevant in
only a limited range of cases. Most potential conflicts or contingencies involve
adversaries who are too small for this effect to matter much. Short of a major
regional conflict (MRC), the superiority of U.S. military forces is sufficiently overwhelming that initial setbacks are not likely to be critically important. At the other extreme, in the case of a regional near-peer competitor, a Russia or a China, it is hard to imagine a littoral strategy having much of an impact: the amount of (nonnuclear) power that can be projected from the sea is trivial compared to the size of the adversary's society or military establishment.

Deterrence fails; multiple studies prove it


Rhodes 99(Edward, Spring, Naval War College Review, . . . From the Sea and Back Again: Naval Power in the Second American Century, Dr. Rhodes is associate professor of international relations and director of the Center for Global Security and Democracy at Rutgers University, Volume 42, No. 2, http://fas.org/man/dod-101/sys/ship/docs/art1sp9.htm#AUTHOR)

In crisis, the forward-deployed capacity to project power "from the sea" is touted as
having an immediate deterrent effect, that is, dissuading an adversary who is tentatively considering going to war from following through on that idea. Here we
do have some evidence; at very best, however, it must be regarded as offering
mixed support for the Navy's advocacy of a littoral approach. A variety of studies of
conventional deterrence have been undertaken.60 While the research questions,
underlying theoretical assumptions, and research methods have varied, several
general findings emerge. The principal one is that immediate extended deterrence
with conventional means, that is, using threats of conventional response to deter an adversary who is considering aggression against a third party, regularly fails,
even in cases where commitments are "clearly defined, repeatedly publicized and
defensible, and the committed [gives] every indication of its intentions to defend
them by force if necessary."61 Unlike nuclear deterrence, conventional deterrence does not appear to result in a robust, stable stalemate but in a fluid and competitive
strategic interaction that, at best, buys time during which underlying disputes or
antagonisms can be resolved. The possession of decisive conventional military
superiority and the visible demonstration of a resolve will not necessarily permit
the United States to deter attacks on friends and interests. There are three
reasons why immediate extended conventional deterrence is so problematic. First,
potential aggressors are sometimes so strongly motivated to challenge the status
quo that they are willing to run a high risk, or even the certainty, of paying the less-than-total costs of losing a war. Second, potential aggressors frequently conclude,
correctly or incorrectly, that they have developed a military option that has
politically or militarily "designed around" the deterrent threat. Third, there is
considerable evidence that, particularly when they are under severe domestic
stress, potential aggressors are unable to understand or respond rationally to
deterrent threats. "Wishful thinking" by leaders who find themselves caught in a
difficult situation appears to be an all-too-common pathology. Further, and more
germane to the issue of naval forward presence as a crisis deterrent tool, there is
some evidence that because of the general insensitivity of potential aggressors to
information, efforts to "signal" resolve through measures such as reinforcing or
redeploying forces have limited effectiveness. If force movements are large enough
to foreclose particular military options, they may forestall aggression. But as a
means of indicating resolve and convincing an aggressor of the credibility of
deterrent commitments, they do not generally appear to have an impact.

Naval power is irrelevant; it can't effectively project power


Rhodes 99(Edward, Spring, Naval War College Review, . . . From the Sea and Back Again: Naval Power in the Second American Century, Dr. Rhodes is associate professor of international relations and director of the Center for Global Security and Democracy at Rutgers University, Volume 42, No. 2, http://fas.org/man/dod-101/sys/ship/docs/art1sp9.htm#AUTHOR)

The first and most immediate danger is from competitors to the littoral strategy:
there are, as Army and Air Force voices have noted, a variety of ways besides
projecting power "from the sea" to support a liberal-internationalist foreign policy
and to fight a transoceanic-countermilitary war. While budgetary realities have
stimulated this strategic competition between the services and are likely to
continue to serve as the spur, it would be wrong to dismiss this challenge to the
littoral strategy as mere interservice rivalry or budgetary gamesmanship. Rather,
what has developed is a serious, if admittedly parochially grounded, intellectual
debate over alternative national military strategies, over alternative ways to use
America's military potential in support of "engagement and enlargement." While a
littoral naval strategy is consistent with a liberal-internationalist vision of national
security and a transoceanic-countermilitary image of war, it is not the only military
strategy of which that can be said, and the Army and Air Force have successfully
articulated alternative military strategies that call into question the need for
significant naval effort in the littorals. The second danger, linked to the first, is that
the Navy may be unable to develop a workable operational concept for putting the
littoral strategy into effect. Indeed, the Navy has found it remarkably difficult to
script a convincing story about precisely how a littoral strategy works; that is, the
Navy has had a hard time identifying what it is about naval operations in the

littorals that yields political-military leverage and what forces and activities are
therefore required. The failure of "Forward . . . from the Sea" to address the issue
of alternative force packages is illustrative in this regard: continued insistence that
carrier battle groups and amphibious ready groups are needed at all times in all
theaters reflects the conceptual and bureaucratic difficulty of determining the
actual requirements of a littoral strategy. Any decision to change deployment
patterns, mixes, or timetables would at least implicitly require a prioritization of
peacetime, crisis, and wartime duties; it would also represent a reallocation of
resources within the service. But without a clear understanding of the process by
which littoral operations generate the peacetime, crisis, and wartime outcomes
sought, the Navy will find it impossible to make the difficult tradeoffs demanded by
budgetary pressures. Indeed, as budgetary pressures, the need to moderate
personnel and operational tempos, and the need to modernize become greater, the
imperative for a clearer understanding of the relative value of (for example)
forward peacetime presence, forward peacetime presence by carriers and
amphibious forces, rapid crisis response, and massive wartime strike capacity will
increase. Ultimately the danger is that a littoral strategy will become unworkable
through an inability of the Navy to make the required tradeoffs, in which case it will
find itself with forces that are too small, too overstretched, too poorly maintained,
too poorly trained or manned, too obsolescent, or simply improperly configured to
meet what prove to be the essential demands of a littoral strategy. The third
danger, more basic and more beyond the control of the Navy than the first two, is
that the vision of warfare underlying the littoral strategy will be abandoned by the
nation. The DESERT STORM image of war as a transoceanic countermilitary
encounter is increasingly vulnerable, and as the elite and public begin to imagine
war in other, more traditional terms, the attractiveness and importance of projecting
power "from the sea" will become less apparent. To stay in harmony with national
leadership and national strategy, the Navy will be called upon to offer a revised
account of the utility of naval power.

Large militaries are irrelevant in deterrence--empirics


Mueller 11(John, Embracing threatlessness: Reassessing U.S. military spending,
John Mueller is Adjunct Professor of Political Science and Senior Research Scientist at the
Mershon Center for International Security Studies.,
https://cna.org/sites/default/files/research/American%20Grand%20Strategy%20and
%20Seapower%202011%20Conference%20Report%20CNA.pdf)

Over the course of the last several decades, the United States has variously sensed
threat from small countries led by people it found to be decidedly unpleasant. These rogue states (as they came to be called in the 1990s) were led by such devils du jour as Nasser, Sukarno, Castro, Qaddafi, Khomeini, Kim Il-Sung, and Saddam Hussein, all of whom have since faded into history's dustbin. Today, such alarmed focus is directed at teetering Iran, and at North Korea, the most pathetic state on the planet. Except in worst-case fantasies, however, neither country presents a threat of direct military aggression; Iran, in fact, has eschewed the practice for several centuries. Nonetheless, it might make some sense to maintain a capacity to institute containment and deterrence efforts carried out in formal or informal coalition with concerned neighboring countries, and there are quite a few of these in each case. However, neither country is militarily impressive and the
military requirements for effective containment are far from monumental and do

not necessarily need large forces-in-being. Moreover, the Iraq syndrome seems
already to be having its effect in this area. Despite nearly continuous concern
about Iranian nuclear developments, proposals to use military force to undercut
this progress have been persistently undercut. The Gulf War of 1991 is an
example of military force being successfully applied to deal with a rogue venture: the conquest by Saddam Hussein's Iraq of neighboring Kuwait. This experience
does not necessarily justify the maintenance of substantial military forces,
however. First, Iraq's invasion was rare to the point of being unique: it has been
the only case since World War II in which one UN country has invaded another
with the intention of incorporating it into its own territory. As such, the
experience seems to be much more of an aberration than a harbinger. Second, in
a case such as that, countries do not need to have a large force-in-being because
there is plenty of time to build a force should other measures to persuade the
attacker to withdraw, such as economic sanctions and diplomatic forays, fail. And
third, it certainly appears that Iraq's pathetic forces, lacking strategy, tactics, leadership, and morale, needed the large force thrown at them in 1991 to decide
to withdraw.18

Naval power fails


Till 6(Geoffrey, December, NAVAL TRANSFORMATION, GROUND FORCES, AND THE
EXPEDITIONARY IMPULSE: THE SEA-BASING DEBATE, GEOFFREY TILL is the Professor
of Maritime Studies at the Joint Services Command and Staff College,
http://www.strategicstudiesinstitute.army.mil/pdffiles/pub743.pdf)

Even so, at least three problems remain. First, there may be a tendency to focus
force protection too much on the sea lines of communication and not enough
on the security of sea ports, both of embarkation and arrival. There also is a
natural tendency to concentrate on the safe and timely arrival of soldiers and
their equipment in the theater, neglecting the dangers that, in these
asymmetrical days, may be posed to their collection and dispatch at home. In the
Iraq operation, Greenpeace attempted to interfere with the loading of military
supplies for the British forces at the military port of Marchwood in Southampton
Water. They were ineffective and harmless, but nonetheless represented a useful
reminder of the vulnerability at the supplier's end of the supply chain. Second,
many would doubt the permanence of the shift of naval priorities away from
oceanic sea control and towards its coastal force-protection variant. The
emergence of new maritime powers such as Japan and China, or the recovery of
Russia, might lead to a resurgence of peer competition and old-fashioned
maritime rivalry on the high seas. The U.S. Navy's current wariness about the
prospective maritime expansion of China later in the century may be used to
justify investment in more conventional forms of naval power. Third, the ability of
naval forces to maintain an operational posture against relatively unsophisticated
shore-based opposition can be exaggerated. The vulnerability of ships to coastal
mines, small, quiet diesel submarines, terrorists on jet skis, or radical weaponry
of the kind recently demonstrated by the Iranians, has to be taken seriously.24

1NC- Sea Power Not K2 Trade


Naval power is not key to trade; there are no credible threats
Mueller 11(John, Embracing threatlessness: Reassessing U.S. military spending,
John Mueller is Adjunct Professor of Political Science and Senior Research Scientist at the
Mershon Center for International Security Studies.,
https://cna.org/sites/default/files/research/American%20Grand%20Strategy%20and
%20Seapower%202011%20Conference%20Report%20CNA.pdf)

Particularly in an age of globalization and expanding world trade, many, especially


in the Navy, argue that a strong military force is needed to police what is
portentously labeled the global commons. However, there seems to be no
credible consequential threat, beyond those to worldwide shipping. There have
been a few attacks by pirates off Somalia, exacting costs in the hundreds of
millions of dollars a year from a multi-billion dollar industry which, surely, has the
capacity to defend itself from such nuisances, perhaps by the application on
decks of broken glass, a severe complication for barefoot predators. In the unlikely
event that the problem should become severe, it would not need forces-in-being;
it could be dealt with by newly formulated forces designed for the specific purpose.

2NC- Sea Power Not K2 Trade


Naval power not key to trade
Hultin & Blair 6(Jerry MacArthur HULTIN and Admiral Dennis BLAIR, Naval Power
and Globalization: The Next Twenty Years in the Pacific, Jerry M. Hultin is former
president of Polytechnic Institute of New York University. From 1997 to 2000 Mr. Hultin
served as Under Secretary of the Navy., Dennis Cutler Blair is the former United States
Director of National Intelligence and is a retired United States Navy admiral who was the
commander of U.S. forces in the Pacific region, http://engineering.nyu.edu/files/hultin
%20naval%20power.pdf)

Technology has fundamentally changed the relationship between national naval power and economic prosperity from the simplistic mercantilist tenets of Alfred
Thayer Mahan almost two hundred years ago. No longer does trade simply follow
the flag, no longer is a strong national Navy required to support strong national
trade relationships. With the economies of many countries interdependent in
sectors such as energy and raw materials, and with globalization increasing
interdependence in manufacturing and services, military security is becoming
increasingly a shared responsibility.

1NC- Alt Causes
Alt causes to navy power decline that IOOS can't solve
Hultin & Blair 6(Jerry MacArthur HULTIN and Admiral Dennis BLAIR, Naval Power
and Globalization: The Next Twenty Years in the Pacific, Jerry M. Hultin is former
president of Polytechnic Institute of New York University. From 1997 to 2000 Mr. Hultin
served as Under Secretary of the Navy., Dennis Cutler Blair is the former United States
Director of National Intelligence and is a retired United States Navy admiral who was the
commander of U.S. forces in the Pacific region, http://engineering.nyu.edu/files/hultin
%20naval%20power.pdf)

As successful as this transformation has been, there are new technological, geopolitical, and warfighting challenges on the horizon that require solutions. The three most noteworthy are missile defense; irregular warfare (peace operations, counter-insurgency warfare, stability operations, and military operations against non-state organizations using terrorist tactics); and the integration of all factors of political-military influence. First, missile defense: in the future, increasing numbers of littoral countries, organizations such as Hezbollah that are sponsored by countries, and even unsponsored militant organizations will field ballistic and cruise missiles up to intermediate ranges. These weapons will threaten deployed US naval expeditionary forces, military forces and civilian populations of U.S. allies, and ultimately the United States itself. Thus, it is imperative that the United States and its allies develop and deploy a highly effective capacity to defend against such attacks. There is not a single system solution to this threat; the Navy must continue developing more capable sensors, communications networks, offensive weapons to take out enemy missiles and their support systems, along with both hard-kill and soft-kill defensive systems that can defeat enemy missiles once they have been launched.

No uniqueness to undersea warfare: shortage of subs and Chinese military modernization
Eaglen and Rodeback 2010
Mackenzie Eaglen, Research Fellow for National Security Studies, Allison Center for Foreign Policy Studies, and Jon Rodeback, Editor, Research Publications, Heritage Foundation, Submarine Arms Race in the Pacific: The Chinese Challenge to U.S. Undersea Supremacy, http://www.heritage.org/research/reports/2010/02/submarine-arms-racein-the-pacific-the-chinese-challenge-to-us-undersea-supremacy
The People's Republic of China (PRC) is rapidly emerging as a regional naval power
and a potential global power, which "raises the prospect of intensifying security
competition in East Asia both between the United States and China and between
China and Japan."[2] Other Pacific countries in the region have also taken note of
the changing security environment as evidenced in particular by their planned
submarine acquisitions. Australia's military buildup includes doubling its submarine
fleet from six submarines to 12 larger, more capable submarines.[3] In addition,
"India, Malaysia, Pakistan, Indonesia, Singapore, Bangladesh and South Korea are
planning to acquire modern, conventional submarines."[4] Both Australia and India
have explicitly described their naval buildups as responses, at least in part, to
China's naval buildup. In contrast, the U.S. submarine fleet is projected to continue
shrinking through 2028, further limiting the U.S. ability to shape and influence

events in the Pacific. The U.S. attack submarine serves an important part in
establishing sea control and supremacy, and it is not interchangeable with other
assets. Its unique capabilities make it a force multiplier and allow it to "punch above
its weight." To protect U.S. interests in East Asia and the Pacific, and to support and
reassure U.S. allies, the U.S. must halt and then reverse the decline of its submarine
fleet as a critical component of a broader policy to maintain the military balance in
the Pacific.

Budget cuts destroy readiness


Ron Ault 13, Executive Assistant to the President for Metal Trades Department,
AFL-CIO, Navy Budget Cuts Sinking America's Military Readiness, Feb 15 2013,
http://www.metaltrades.org/index.cfm?
zone=/unionactive/view_blog_post.cfm&blogID=1390&postID=6618

While many in Washington are focused on the rapidly-approaching March 1st sequestration deadline, few seem to be paying much attention to the massive cuts that will go into effect this week across our military, cuts that will have an equally devastating impact on our economy and security. As the result of the continuing resolution (CR) that was agreed upon in the final hours of 2012, a deal that averted the so-called fiscal cliff, the U.S. Navy was forced to accept $6.3 billion in cuts for the remainder of the force's Fiscal Year 2013 (FY13) budget. If sequestration goes into effect, that number could grow to $10 billion. As the representative of 80,000 workers employed at U.S. shipyards and naval bases, workers who take great pride in the role they play in supporting our military, I would like to shed some light on the severity and senselessness of the Navy budget cuts and implore the 113th Congress to take prudent and expeditious action to avoid further Defense spending cuts in FY13. Lawmakers have already taken a hatchet to the Defense budget; we simply cannot sustain any further cuts this year. The $6.3 billion in Navy budget reductions cut too deeply. For example, Navy Fleet Commanders have been ordered to cancel 3rd and 4th Quarter surface ship maintenance. That's a very tall order considering our shipyards are already fully loaded with high priority work on extremely tight schedules to support fleet deployment. Once scheduled maintenance on a ship is missed, there is no capacity to make it up later without causing delays elsewhere. The well-oiled machine breaks down. With the cuts contained in the CR alone, it is conceivable that aircraft carriers, destroyers, nuclear submarines, and other Navy vessels would be tied up at piers awaiting critical repairs without the ability to move, much less support our national defense. If we allow our fleet to collect barnacles in harbor for six months, we would significantly disrupt our military's access to the ships they need to defend our country.

Specifically wrecks undersea dominance


Hugh Lessig 13, Daily Press, 9/12/13, U.S. Navy: Budget challenges threaten

submarine program, http://articles.dailypress.com/2013-09-12/news/dp-nws-forbessubmarine-hearing-20130912_1_u-s-navy-submarines-missile


The U.S. military touts its submarine program as an unqualified success, yet the fleet is expected to drop by nearly 30 percent in coming years, and one Hampton Roads lawmaker wants to pull money from outside the Navy's shipbuilding program to ease the pressure on one key program. Two Navy leaders on Thursday told a panel chaired by Rep. Randy Forbes, R-Chesapeake, that more budget cuts would dull the edge of the submarine program, where the U.S. enjoys a distinct advantage over other superpowers. Forbes chairs the House Armed Service's subcommittee on sea power. About 2,300 jobs at Newport News Shipbuilding are tied to the U.S. submarine program. The shipyard builds them in partnership with General Dynamics Electric Boat of Groton, Conn. The U.S. military has three types of nuclear-powered submarines. First are the smaller fast-attack submarines that fall primarily in two classes, the older Los Angeles class and the newer Virginia class. Last week, the Navy commissioned the newest Virginia class sub at Naval Station Norfolk. The second type are Ohio class ballistic missile submarines that roam the seas and provide a nuclear strike capability. The third type is an offshoot of the second: when the Cold War ended, the U.S. converted four of those ballistic missile submarines into guided cruise missile submarines. All three types are scheduled to drop, said Rear Adm. Richard P. Breckenridge and Rear Adm. David C. Johnson, who testified before the Forbes panel.

The huge national debt means cuts will continue---guarantees an ineffective navy
Seth Cropsey 10, senior fellow at the Hudson Institute, former Deputy

Undersecretary of the Navy, The US Navy in Distress, Strategic Analysis Vol. 34,
No. 1, January 2010, 35-45, http://navsci.berkeley.edu/ma20/Class%208/Class
%2012.%20Sea%20Power.%20%20Cropsey_US_Navy_In_Distress.pdf
The most tangible result is the continued withering of the US combat fleet, which today numbers about 280 ships. This is less than half the size achieved towards the end of the Reagan administration buildup and 33 ships short of what the Navy says it needs to fulfill today's commitments. Nothing suggests a substantive reversal. Most signs point to additional decline over the long term. Four years ago the Navy's projected fleet size had dwindled to 313 ships. There it has stayed . . . until May of this year when a senior Navy budget official, commenting on the proposed 2010 budget, suggested that the new Quadrennial Review now underway at the Defense Department will likely result in a smaller projected fleet size. Huge increases in current and projected national debt, and the vulnerability of the military budget without compelling events to help offset it, increase the chance that the nation's sea services will experience additional and perhaps drastic reductions. National indebtedness will grow from its current ratio of 40 per cent of GDP to 80 per cent of GDP in a decade. Servicing this will cripple the nation's ability to modernize and increase a powerful world-class fleet or drive us deeper into a yawning financial hole.

1NC- No War (Escalation)


No escalation to great power war--empirics
Mueller 11(John, Embracing threatlessness: Reassessing U.S. military spending,
John Mueller is Adjunct Professor of Political Science and Senior Research Scientist at the
Mershon Center for International Security Studies.,
https://cna.org/sites/default/files/research/American%20Grand%20Strategy%20and%20Seapower%202011%20Conference%20Report%20CNA.pdf)

A sensible place to begin the consideration is with an examination of the prospects


for a major war like World War II. Although there is no physical reason why such
a war cannot recur, it has become fairly commonplace to regard such wars as
obsolescent, if not completely obsolete.4 Leading or developed countries continue
to have disputes, but, reversing the course of several millennia, none seems likely
seriously to envision war as a sensible method for resolving any of these disputes.
Europe, once the most warlike of continents, has taken the lead in this. It was on
May 15, 1984, that the major countries of the developed world had managed to
remain at peace with each other for the longest continuous stretch since the days
of the Roman Empire.5 That rather amazing record has now been further
extended, and today one has to go back more than two millennia to find a longer
period in which the Rhine remained uncrossed by armies with hostile intent.6
"All historians agree," observed Leo Tolstoy in War and Peace in 1869, "that states express their conflicts in wars and that as a direct result of greater or lesser success in war the political strength of states and nations increases or decreases."7 Whatever historians may currently think, it certainly appears that this notion has become substantially obsolete. Prestige now comes from other factors, such as making economic progress and putting on a good Olympics. The Cold War did supply a set of crises and peripheral wars that engaged leading countries from time to time, and it was commonly envisioned that doom would inevitably emerge from the rivalry. Thus, political scientist Hans J. Morgenthau in 1979 said: "In my opinion the world is moving ineluctably towards a third world war -- a strategic nuclear war. I do not believe that anything can be done to prevent it. The international system is simply too unstable to survive for long."8 At about the same time, John Hackett penned his distinctly non-prescient book The Third World War: August 1985.9 Such anxieties obviously proved to be over-wrought, but to the degree that they were correctly focused on a potential cause of major war, that specific impetus no longer exists. World War III, then, continues to be the greatest nonevent in human history, and that happy condition seems very likely to continue. There have been wars throughout history, of course, but the remarkable absence of the species' worst expression for two-thirds of a century (and counting) strongly suggests that realities may have changed, and perhaps permanently. Accordingly it may be time to consider that spending a lot of money preparing for a conceivable eventuality or fantasy that is of ever-receding likelihood is a highly questionable undertaking.
[Footnotes: 4 For an early examination of this proposition, see John Mueller, Retreat from Doomsday: The Obsolescence of Major War (New York: Free Press, 1989; reprinted and updated edition, edupublisher.com, 2009). See also Fettweis, Dangerous Times. 5 Paul Schroeder, "Does Murphy's Law Apply to History?" Wilson Quarterly, New Year's 1985: 88. The previous record, he notes, was chalked up during the period from the end of the Napoleonic Wars in 1815 to the effective beginning of the Crimean War in 1854. The period between the conclusion of the Franco-Prussian War in 1871 and the outbreak of World War I in 1914 -- marred by a major war in Asia between Russia and Japan in 1904 -- was an even longer era of peace among major European countries. That record was broken on November 8, 1988. On some of these issues, see also Evan Luard, War in International Society (New Haven, CT: Yale University Press, 1986), 395-99; and James J. Sheehan, Where Have All the Soldiers Gone? The Transformation of Modern Europe (Boston: Houghton Mifflin, 2008). 6 Bradford de Long, "Let Us Give Thanks (Wacht am Rhein Department)," November 12, 2004, www.j-bradford-delong.net/movable_type/20042_arcives/000536.html. 7 (New York: Norton, 1966), 1145.]

1NC South China Sea


No South China Sea conflict- engagement will check miscalc and mistrust
Thayer, New South Wales emeritus professor, 2013
(Carlyle, Why China and the US won't go to war over the South China Sea, 5-13, http://www.eastasiaforum.org/2013/05/13/why-china-and-the-us-wont-go-to-warover-the-south-china-sea/, ldg)
China's increasing assertiveness in the South China Sea is challenging US primacy in the Asia Pacific. Even before Washington announced its official policy of rebalancing its force posture to the Asia Pacific, the United States had undertaken steps to strengthen its military posture by deploying more nuclear attack submarines to the region and negotiating arrangements with Australia to rotate Marines through Darwin. Since then, the United States has deployed Combat Littoral Ships to Singapore and is negotiating new arrangements for greater military access to the Philippines. But these developments do not presage armed conflict between China and the United States. The People's Liberation Army Navy has been circumspect in its involvement in South China Sea territorial disputes, and the United States has been careful to avoid being entrapped by regional allies in their territorial disputes with China. Armed conflict between China and the United States in the South China Sea appears unlikely. Another, more probable, scenario is that both countries will find a modus vivendi enabling them to collaborate to maintain security in the South China Sea. The Obama administration has repeatedly emphasised that its policy of rebalancing to Asia is not directed at containing China. For example, Admiral Samuel J. Locklear III, Commander of the US Pacific Command, recently stated, "there has also been criticism that the Rebalance is a strategy of containment. This is not the case -- it is a strategy of collaboration and cooperation." However, a review of past US-China military-to-military interaction indicates that an agreement to jointly manage security in the South China Sea is unlikely because of continuing strategic mistrust between the two countries. This is also because the currents of regionalism are growing stronger. As such, a third scenario is more likely than the previous two: that China and the United States will maintain a relationship of cooperation and friction. In this scenario, both countries work separately to secure their interests through multilateral institutions such as the East Asia Summit, the ASEAN Defence Ministers Meeting Plus and the Enlarged ASEAN Maritime Forum. But they also continue to engage each other on points of mutual interest. The Pentagon has consistently sought to keep channels of communication open with China through three established bilateral mechanisms: Defense Consultative Talks, the Military Maritime Consultative Agreement (MMCA), and the Defense Policy Coordination Talks. On the one hand, these multilateral mechanisms reveal very little about US-China military relations. Military-to-military contacts between the two countries have gone through repeated cycles of cooperation and suspension, meaning that it has not been possible to isolate purely military-to-military contacts from their political and strategic settings. On the other hand, the channels have accomplished the following: continuing exchange visits by high-level defence officials; regular Defense Consultation Talks; continuing working-level discussions under the MMCA; agreement on the 7-point consensus; and no serious naval incidents since the 2009 USNS Impeccable affair. They have also helped to ensure continuing exchange visits by senior military officers; the initiation of a Strategic Security Dialogue as part of the ministerial-level Strategic & Economic Dialogue process; agreement to hold meetings between coast guards; and agreement on a new working group to draft principles to establish a framework for military-to-military cooperation. So the bottom line is that, despite ongoing frictions in their relationship, the United States and China will continue engaging with each other. Both sides understand that military-to-military contacts are a critical component of bilateral engagement. Without such interaction there is a risk that mistrust between the two militaries could spill over and have a major negative impact on bilateral relations in general. But strategic mistrust will probably persist in the absence of greater transparency in military-to-military relations. In sum, Sino-American relations in the South China Sea are more likely to be characterised by cooperation and friction than a modus vivendi of collaboration or, in a worst-case scenario, armed conflict.

2NC South China Sea


No South China Sea conflict- China would never engage
Carlson, Cornell government professor, 2013

(Allen, China Keeps the Peace at Sea, 2-21,


http://www.foreignaffairs.com/articles/139024/allen-carlson/china-keeps-the-peaceat-sea?page=show, ldg)
The fundamentals of Deng's grand economic strategy are still revered in Beijing. But any war in the region would erode the hard-won, and precariously held, political capital that China has gained in the last several decades. It would also disrupt trade relations, complicate efforts to promote the yuan as an international currency, and send shock waves through the country's economic system at a time when it can ill afford them. There is thus little reason to think that China is readying for war with Japan. At the same time, the specter of rising Chinese nationalism, although often seen as a promoter of conflict, further limits the prospects for armed engagement. This is because Beijing will try to discourage nationalism if it fears it may lose control or be forced by popular sentiment to take an action it deems unwise. Ever since the Tiananmen Square massacre put questions about the Chinese Communist Party's right to govern before the population, successive generations of Chinese leaders have carefully negotiated a balance between promoting nationalist sentiment and preventing it from boiling over. In the process, they cemented the legitimacy of their rule. A war with Japan could easily upset that balance by inflaming nationalism that could blow back against China's leaders. Consider a hypothetical scenario in which a uniformed Chinese military member is killed during a firefight with Japanese soldiers. Regardless of the specific circumstances, the casualty would create a new martyr in China and, almost as quickly, catalyze popular protests against Japan. Demonstrators would call for blood, and if the government (fearing economic instability) did not extract enough, citizens would agitate against Beijing itself. Those in Zhongnanhai, the Chinese leadership compound in Beijing, would find themselves between a rock and a hard place. It is possible that Xi lost track of these basic facts during the fanfare of his rise to power and in the face of renewed Japanese assertiveness. It is also possible that the Chinese state is more rotten at the core than is understood. That is, party elites believe that a diversionary war is the only way to hold on to power -- damn the economic and social consequences. But Xi does not seem blind to the principles that have served Beijing so well over the last few decades. Indeed, although he recently warned unnamed others about infringing upon China's "national core interests" during a foreign policy speech to members of the Politburo, he also underscored China's commitment to "never pursue development at the cost of sacrificing other country's interests" and to never "benefit ourselves at others' expense or do harm to any neighbor." Of course, wars do happen -- and still could in the East China Sea. Should either side draw first blood through accident or an unexpected move, Sino-Japanese relations would be pushed into terrain that has not been charted since the middle of the last century. However, understanding that war would be a no-win situation, China has avoided rushing over the brink. This relative restraint seems to have surprised everyone. But it shouldn't. Beijing will continue to disagree with Tokyo over the sovereign status of the islands, and will not budge in its negotiating position over disputed territory. However, it cannot take the risk of going to war over a few rocks in the sea. On the contrary, in the coming months it will quietly seek a way to shelve the dispute in return for securing regional stability, facilitating economic development, and keeping a lid on the Pandora's box of rising nationalist sentiment. The ensuing peace, while unlikely to be deep, or especially conducive to improving Sino-Japanese relations, will be enduring.

China won't get into a naval conflict


Holmes, Naval War College strategy professor, 2012

(James, The Sino-Japanese Naval War of 2012, 8-20,


http://www.foreignpolicy.com/articles/2012/08/20/the_sino_japanese_naval_war_of_2012?page=0,1, DOA: 10-16-12, ldg)

Whoever forges sea, land, and air forces into the sharpest weapon of sea combat stands a good chance of prevailing. That could be Japan if its political and military leaders think creatively, procure the right hardware, and arrange it on the map for maximum effect. After all, Japan doesn't need to defeat China's military in order to win a showdown at sea, because it already holds the contested real estate; all it needs to do is deny China access. If Northeast Asian seas became a no-man's land but Japanese forces hung on, the political victory would be Tokyo's. Japan also enjoys the luxury of concentrating its forces at home, whereas the PLA Navy is dispersed into three fleets spread along China's lengthy coastline. Chinese commanders face a dilemma: If they concentrate forces to amass numerical superiority during hostilities with Japan, they risk leaving other interests uncovered. It would be hazardous for Beijing to leave, say, the South China Sea unguarded during a conflict in the northeast. And finally, Chinese leaders would be forced to consider how far a marine war would set back their sea-power project. China has staked its economic and diplomatic future in large part on a powerful oceangoing navy. In December 2006, President Hu Jintao ordered PLA commanders to construct a "powerful people's navy" that could defend the nation's maritime lifelines, in particular sea lanes that connect Indian Ocean energy exporters with users in China, at any time. That takes lots of ships. If it lost much of the fleet in a Sino-Japanese clash, even in a winning effort, Beijing could see its momentum toward world-power status reversed in an afternoon. Here's hoping China's political and military leaders understand all this. If so, the Great Sino-Japanese Naval War of 2012 won't be happening outside these pages.

1NC- No Taiwan War


No China-Taiwan conflict- Chinese democratization and economic interdependence
Mueller 11(John, Embracing threatlessness: Reassessing U.S. military spending,
John Mueller is Adjunct Professor of Political Science and Senior Research Scientist at the
Mershon Center for International Security Studies.,
https://cna.org/sites/default/files/research/American%20Grand%20Strategy%20and%20Seapower%202011%20Conference%20Report%20CNA.pdf)
China's oft-stated desire to incorporate (or re-incorporate) Taiwan into its territory and its apparent design on other offshore areas do create problems -- ones that world leaders elsewhere should sensibly keep their eyes on, and ones that could conceivably lead to an armed conflict to which American military forces might appear relevant. But it is also conceivable, and far more likely, that the whole problem will be worked out over the course of time without armed conflict. The Chinese strongly stress that their perspective on this issue is very long term and that they have a historic sense of patience. Indeed, if China eventually becomes a true democracy, Taiwan might even join up voluntarily or, failing that, some sort of legalistic face-saving agreement might eventually be worked out. Above all, China is increasingly becoming a trading state, in Richard Rosecrance's phrase.11 Its integration into the world economy and its increasing dependence on it for economic development and for the consequent acquiescent contentment of the Chinese people is likely to keep the country reasonable. Armed conflict over the issue would be extremely -- even overwhelmingly -- costly to the country, and, in particular, to the regime in charge, and Chinese leaders seem to realize this.

2NC- No Taiwan War


China would not attack Taiwan- it destroys its position globally and there is no certainty that it would be successful
Keck 13 (Zachary, December 24, Why China Won't Attack Taiwan, Zachary Keck is Associate Editor of The Diplomat, The Diplomat, http://thediplomat.com/2013/12/why-chinawont-attack-taiwan/)

Although the trend lines are undoubtedly working in China's favor, it is ultimately extremely unlikely that China will try to seize Taiwan by force. Furthermore, should it try to do this, it is unlikely to succeed. Even assuming China's military capabilities are great enough to prevent the U.S. from intervening, there are two forces that would likely be sufficient to deter China from invading Taiwan. The first and least important is the dramatic impact this would have on how countries in the region and around the world would view such a move. Globally, China seizing Taiwan would result in it being permanently viewed as a malicious nation. Regionally, China's invasion of Taiwan would diminish any lingering debate over how Beijing will use its growing power. Every regional power would see its own fate in Taiwan. Although Beijing would try to reassure countries by claiming that Taiwan was part of China already, and thus the operation was a domestic stability one, this narrative would be convincing to none of China's neighbors. Consequently, Beijing would face an environment in which each state was dedicated to cooperating with others to balance against Chinese power. But the more important deterrent for China would be the uncertainty of success.

No escalation to China-US war


Lowther 14 (William, February 20, Independence will not happen, William Lowther is a staff writer for the Taipei Times,
http://www.taipeitimes.com/News/taiwan/archives/2014/02/20/2003583939/1)

In a fight over Taiwan, Mearsheimer says, US policymakers would be reluctant to launch major strikes against Chinese forces in China, for fear it might precipitate nuclear escalation. "The US is not going to escalate to the nuclear level if Taiwan is being overrun by China. The stakes are not high enough to risk a general thermonuclear war. Taiwan is not Japan, or even South Korea," he says. He says Taiwan is an especially dangerous flashpoint, which could easily precipitate a Sino-US war that is not in the US interest. US policymakers understand that the fate of Taiwan is a matter of great concern to China and there is a reasonable chance US policymakers will eventually conclude that it makes good strategic sense to abandon Taiwan and allow China to coerce it into accepting unification, Mearsheimer says.

Homeland Security

Terrorism
Better detection fails- terrorists can hide their HEU from
radiation detectors.
de Rugy 05 (Veronique, senior research fellow at the Mercatus Center at George Mason
University, Is Port Security Spending Making Us Safer?,
http://cip.management.dal.ca/publications/Is%20Port%20Security%20Spending%20Making%20Us%20Safer.pdf)

Considering the extreme difficulty of interdicting drug smugglers, it seems that determined smugglers with a nuclear device would have little trouble circumventing the nation's border protection and control, particularly because they would be able to leverage the techniques used successfully by drug smugglers. Further lowering their probability of being caught is the fact that, according to a series of experts testifying before Congress in July 2005, terrorists could easily shield the highly enriched uranium and avoid detection from radiation detectors.34

Port security fails to solve terrorism

a. Other targets besides ports


Shie 04, Tamara Renee Shie, [former visiting fellow at the Pacific Forum CSIS in Honolulu,

research assistant at the Institute for National Strategic Studies (INSS) at the National Defense
University (NDU) in Washington, D.C.], Ships and Terrorists - Thinking Beyond Port Security,
http://csis.org/files/media/csis/pubs/pac0445a.pdf

Unfortunately, relying on these initiatives alone creates a false sense of security. They are inadequate to deter terrorists from pursuing many maritime targets. The principal limitation of these two initiatives is their specific focus on the security of major transshipment ports. Though these are essential to international trade, securing only these ports will not protect them or a region from terrorist attacks. First, the emphasis on upgrading the security of major ports neglects the fact that these represent only a single link in the transportation chain. A shipping container may pass through some 15 physical locations and some two dozen individuals and/or companies while traveling from departure point to destination. Because containers are only searched at the major port, there is no guarantee they cannot be waylaid en route after that point. Second, the CSI conducts security checks only on U.S.-bound containers. Therefore even if a tampered container arrives at a major port, if it is destined for a port other than the U.S., it is more likely to escape notice. Containers between the major ports of Singapore and Shenzhen or Pusan and Hong Kong are not subject to CSI requirements. Yet terrorist assaults on U.S. ships or interests can occur outside the U.S. Third, as major ports increase security, terrorists will look for other maritime targets or other means to target those ports. Terrorists are increasingly aiming at soft targets. Attacking maritime targets has never been particularly easy, often requiring a greater sophistication in planning, training, and coordination than those aimed at many land-based facilities. This is why maritime terrorism is rather rare, and why terrorists are less likely to attack a more secure major port. Yet in considering maritime terrorist threat scenarios -- using a ship to smuggle goods or weapons, sinking a vessel in a major shipping thoroughfare, using a ship as a weapon, or even targeting maritime vessels -- none require access to a major port or a shipping container to carry out a strike. There remain numerous small ports and small vessels not covered under the new security initiatives. The ISPS Code for instance only covers ships of 500 tons or more and port facilities that serve large international-bound vessels. The Code would not have protected the USS Cole.

b. Effective nuclear detection is impossible


Rugy 07, Veronique de Rugy, [senior research fellow at the Mercatus Center at George Mason

University], Is Port Security Funding Making Us Safer?, MIT Center for International Studies, November
2007, http://web.mit.edu/cis/pdf/Audit_11_07_derugy.pdf.

The Domestic Nuclear Detection Office (DNDO) received $535 million in 2007.14 DNDO's mission addresses a broad spectrum of radiological and nuclear protective measures, but is focused exclusively on domestic nuclear detection.15 The fundamental problem is that DNDO relies on radiation portal monitors that have been proven unable to detect shielded nuclear material -- essentially rendering them useless.16 Besides, even if the system could detect every dangerous item, it is ineffective unless the nuclear material is brought through the fixed ports of entry where the monitors are located. With thousands of miles of unguarded borders and no cost-effective way to address the issue, smugglers can easily find positions to bring illicit goods inside the country. Consider the country's long-standing War on Drugs and the inability to stop the flow of illegal drugs into the country.

c. Least cost-effective way


Rugy 07, Veronique de Rugy, [senior research fellow at the Mercatus Center at George Mason

University], Is Port Security Funding Making Us Safer?, MIT Center for International Studies, November
2007, http://web.mit.edu/cis/pdf/Audit_11_07_derugy.pdf.

A close look at port security allocation decisions indicates that spending occurs without regard for risk analysis, let alone cost-benefit analysis, leading to a large array of misallocated spending. For instance, why should the highest priorities -- preventing terrorists from acquiring nuclear devices and material -- receive less money than much less cost-effective policies such as nuclear detection in the ports or post-disaster response activities? Because it rests mainly on domestic detection of WMD in ports -- a task that it is not clear could be achieved -- the port security model offers almost no value to the nation.6 Even if we could seal our ports, America wouldn't be safe. The only effective way to prevent nuclear attacks is to deny terrorists access to weapons and material. Without nuclear materials there can be no nuclear bombs.

d. Vulnerability between stops


Jon D. Haveman and Howard J. Shatz 2006- Director of the Economy Program at the Public Policy Institute of California and Senior Economist; Professor, Pardee RAND Graduate School (Protecting the Nation's Seaports: Balancing Security and Cost, Pg 30-31, http://www.ppic.org/content/pubs/report/r_606jhr.pdf)
In Chapter 4, Stephen S. Cohen considers the security threat that the container creates for the maritime transportation system. Each day, tens of thousands of containers flow through U.S. ports, largely undisturbed in their trip from one part of the world to another. In general, containers are loaded and sealed, or perhaps only closed and not sealed, well inland of a port. They are then transferred by truck, or truck and train, to a seaport, where they are loaded onto a ship. Following the sea journey, they are transferred to another truck, and perhaps another train, for a further journey over land to the ultimate destination. Each container trip to the United States has, on average, 17 different stops, or points at which the container's journey temporarily halts.14 The adage "goods at rest are goods at risk" readily applies to the terrorist threat. The container will be at rest at any point in the journey that involves a change in mode of transportation. While at rest, the container is vulnerable to thieves and terrorists alike. Providing port security therefore involves closely scrutinizing activities not only at the port but at points all along the shipping chain. The truck driver picking up the container at the U.S. port, often poorly paid and possibly an illegal immigrant not well integrated into U.S. society, may himself represent a vulnerability in the system. The issue is not merely that something could be put in a container illicitly for an attack on the port where it is unloaded but that nuclear weapons or radiological material could be inserted, shipped to the United States, moved inland without inspection, and then unloaded into the hands of terrorists. These objects could then be transported for use in major population centers -- perhaps better targets than a port complex. Likewise, explosive material could be put in several containers and then detonated at or near port complexes around the same time, leading to a security reaction that could shut down the entire maritime transportation system until officials, and port workers and management, were certain the threat had passed. There is no way to completely inspect all of the millions of containers entering the United States. They are about as large as a full-size moving van and are often tightly packed. Inspecting each thoroughly would bring commerce to a halt, exactly the kind of reaction that terrorists hope to generate.

The risk of nuclear terrorism is vanishingly small -- terrorists


must succeed at each of twenty plus stages -- failing at one
means zero risk
Mueller, 10 [John, Woody Hayes Chair of National Security Studies at the Mershon Center for International Security Studies and a Professor of Political Science at The Ohio State University, A.B. from the University of Chicago, M.A. and Ph.D. -- UCLA, Atomic Obsession: Nuclear Alarmism from Hiroshima to Al-Qaeda, Oxford University Press]
Even those who decidedly disagree with such scary-sounding, if somewhat elusive, prognostications about nuclear terrorism often come out seeming like they more or less agree. In his Atomic Bazaar, William Langewiesche spends a great deal of time and effort assessing the process by means of which a terrorist group could come up with a bomb. Unlike Allison -- and, for that matter, the considerable bulk of accepted opinion -- he concludes that it "remains very, very unlikely. It's a possibility, but unlikely." Also: "The best information is that no one has gotten anywhere near this. I mean, if you look carefully and practically at this process, you see that it is an enormous undertaking full of risks for the would-be terrorists. And so far there is no public case, at least known, of any appreciable amount of weapons-grade HEU [highly enriched uranium] disappearing. And that's the first step. If you don't have that, you don't have anything." The first of these bold and unconventional declarations comes from a book discussion telecast in June 2007 on C-SPAN and the second from an interview on National Public Radio. Judgments in the book itself, however, while consistent with such conclusions, are expressed more ambiguously, even coyly: "at the extreme is the possibility, entirely real, that one or two nuclear weapons will pass into the hands of the new stateless guerrillas, the jihadists, who offer none of the retaliatory targets that have so far underlain the nuclear peace" or "if a would-be nuclear terrorist calculated the odds, he would have to admit that they are stacked against them," but they are "not impossible."5 The previous chapter arrayed a lengthy set of obstacles confronting the would-be atomic terrorist -- often making use in the process of Langewiesche's excellent reporting. Those who warn about the likelihood of a terrorist bomb contend that a terrorist group could, if often with great difficulty, surmount each obstacle -- that doing so in each case is, in Langewiesche's phrase, "not impossible."6 But it is vital to point out that, while it may be "not impossible" to surmount each individual step, the likelihood that a group could surmount a series of them could quickly approach impossibility. If the odds are "stacked against" the terrorists, what are they? Langewiesche's discussion, as well as other material, helps us evaluate the many ways such a quest -- in his words, "an enormous undertaking full of risks" -- could fail. The odds, indeed, are stacked against the terrorists, perhaps massively so. In fact, the likelihood a terrorist group will come up with an atomic bomb seems to be vanishingly small. ARRAYING THE BARRIERS Assuming terrorists have some desire for the bomb (an assumption questioned in the next chapter), fulfillment of that desire is obviously another matter. Even the very alarmed Matthew Bunn and Anthony Wier contend that the atomic terrorists' task "would clearly be among the most difficult types of attack to carry out" or "one of the most difficult missions a terrorist group could hope to try." But, stresses the CIA's George Tenet, a terrorist atomic bomb is "possible" or "not beyond the realm of possibility." In his excellent discussion of the issue, Michael Levi ably catalogues a wide array of difficulties confronting the would-be atomic terrorist, adroitly points out that "terrorists must succeed at every stage, but the defense needs to succeed only once," sensibly warns against preoccupation with worst-case scenarios, and pointedly formulates "Murphy's Law of Nuclear Terrorism: What can go wrong might go wrong." Nevertheless, he holds nuclear terrorism to be a "genuine possibility," and concludes that a good defensive strategy can merely "tilt the odds in our favor."7 Accordingly, it might be useful to take a stab at estimating just how "difficult" or "not impossible" the atomic terrorists' task, in aggregate, is -- that is, how far from the fringe of the "realm of possibility" it might be, how "genuine" the possibilities are, how tilted the odds actually are. After all, lots of things are "not impossible." It is "not impossible" that those legendary monkeys with typewriters could eventually output Shakespeare.8 Or it is "not impossible" -- that is, there is a "genuine possibility" -- that a colliding meteor or comet could destroy the earth, that Vladimir Putin or the British could decide one morning to launch a few nuclear weapons at Ohio, that an underwater volcano could erupt to cause a civilization-ending tidal wave, or that Osama bin Laden could convert to Judaism, declare himself to be the Messiah, and fly in a gaggle of mafioso hit men from Rome to have himself publicly crucified.9 As suggested, most discussions of atomic terrorism deal in a rather piecemeal fashion with the subject -- focusing separately on individual tasks such as procuring HEU or assembling a device or transporting it. However, as the Gilmore Commission, a special advisory panel to the president and Congress, stresses, setting off a nuclear device capable of producing mass destruction presents "Herculean challenges," requiring that a whole series of steps be accomplished: obtaining enough fissile material, designing a weapon "that will bring that mass together in a tiny fraction of a second" and figuring out some way to deliver the thing. And it emphasizes that these merely constitute "the minimum requirements." If each is not fully met, the result is not simply a less powerful weapon, but one that can't produce any significant nuclear yield at all or can't be delivered.10 Following this perspective, an approach that seems appropriate is to catalogue the barriers that must be overcome by a terrorist group in order to carry out the task of producing, transporting, and then successfully detonating an improvised nuclear device, an explosive that, as Allison acknowledges, would be "large, cumbersome, unsafe, unreliable, unpredictable, and inefficient." Table 13.1 attempts to do this, and it arrays some 20 of these -- all of which must be surmounted by the atomic aspirant. Actually, it would be quite possible to come up with a longer list: in the interests of keeping the catalogue of hurdles down to a reasonable number, some of the entries are actually collections of tasks and could be divided into two or three or more. For example, number 5 on the list requires that heisted highly enriched uranium be neither a scam nor part of a sting nor of inadequate quality due to insider incompetence, but this hurdle could as readily be rendered as three separate ones. In contemplating the task before them, would-be atomic terrorists effectively must go through an exercise that looks much like this. If and when they do so, they are likely to find the prospects daunting and accordingly uninspiring or even terminally dispiriting.
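A hedged arithmetic sketch of the compounding-barriers point above (the per-stage probabilities are purely illustrative assumptions, not figures from Mueller): if an attempt must clear $n = 20$ independent barriers and each barrier is cleared with probability $p$, the overall chance of success is

$$P(\text{success}) = p^{n}, \qquad \text{so} \quad 0.5^{20} \approx 9.5 \times 10^{-7} \quad \text{and} \quad 0.8^{20} \approx 0.012.$$

Even if every hurdle were a coin flip, the aggregate odds would be roughly one in a million; even at an optimistic 80 percent per stage, the attempt succeeds only about 1 percent of the time. This is the sense in which "not impossible" at each step can still aggregate to "vanishingly small" overall.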

Economy
Terrorist attacks have a minimal effect on the economy
a. Ports aren't key and empirics disprove
Jon D. Haveman and Howard J. Shatz 2006- Director of the Economy Program at the Public Policy Institute of California and Senior Economist; Professor, Pardee RAND Graduate School (Protecting the Nation's Seaports: Balancing Security and Cost, Pg 30-31, http://www.ppic.org/content/pubs/report/r_606jhr.pdf)

In Chapter 2, Edward E. Leamer and Christopher Thornberg argue that the actual costs of an attack on the Los Angeles-Long Beach port complex may not be as high as many fear. For example, if a port is closed, many shippers will reroute their shipments through other ports. In addition, displaced workers will seek alternative employment. As a result, the economy will adjust. Some output will be lost, but it may be so small in magnitude that it will not reveal itself in data that track national or even regional macroeconomic trends. The authors provide examples of other disruptions that might have caused severe economic damage but did not, such as the terrorist attacks of September 11, 2001. Consumer spending fell immediately after the attacks but then rebounded sharply at the end of 2001, growing at an unprecedented, seasonally adjusted annual rate of 7 percent. Likewise, although retail sales fell immediately after the attacks, they returned to trend in November, only two months later. Some sectors did suffer, most notably the airline industry, which had already been in deep trouble before the end of the technology boom in early 2001. But consumer spending actually increased, suggesting that people reallocated the money that they would have spent on airline travel to other forms of consumption. Similarly, the authors argue that other disruptions such as hurricanes, earthquakes, and even labor disputes at seaports did have immediate negative economic effects but that these effects dissipated quickly as the economy adjusted. The message in this is that most such disruptions lead to business being delayed rather than business being cancelled, which in turn results in much less economic harm than would be expected.

b. The economy's resilience and excess physical capacity checks back
Jon D. Haveman and Howard J. Shatz 2006- Director of the Economy Program at the Public Policy Institute of California and Senior Economist; Professor, Pardee RAND Graduate School (Protecting the Nation's Seaports: Balancing Security and Cost, Pg 67-69, http://www.ppic.org/content/pubs/report/r_606jhr.pdf)
A cursory look would seem to portend a dramatic, dangerous scenario, but a closer look at the facts suggests otherwise. From an input-output perspective, a wide variety of holes would be quickly created in the flow of production that would seem to lead to a very sharp downturn in economic activity. But our economy is not a mechanical system; it is an organic self-healing system, much like that of a human being: Large injuries take time to heal, but for the most part they do eventually heal. To continue the analogy, a port attack is only a cut on the arm -- quickly healed with little noticeable effect on the day-to-day functioning of the person. Although the ports of Los Angeles and Long Beach certainly represent a primary infrastructure target in the United States, a complete shutdown of the ports is highly unlikely as a direct result of some physical attack. There are two reasons for this: the sheer physical scale of the facilities and the large amount of excess physical capacity (as opposed to human capital capacity) currently in place. As shown in the port map on p. xxiii, the two facilities take up approximately 12 square miles of space in a six-by-four-mile area. The complex is broken into a number of separate yards, each completely controlled by a number of independent, competing major shipping lines, each of which have substantial investment in the physical cranes and equipment on their property. Some of these yards are on Terminal Island, connected to the mainland by three road bridges and a railroad; others are on the mainland itself. There are multiple access points into the area as the map shows, including two highways. Even if these roads were shut down, it would be relatively simple to construct a temporary bridge to the island, and although it might have some implications for the movement of ships, no yard would be effectively isolated.3 Conventional weapons would be able to damage, at best, only a small portion of the complex, and would be unable to isolate a substantial portion of the port given the multiple access routes into and out of the area. Even a so-called "dirty bomb" could cover only one or two square miles of area with radioactivity. Given the location on the water, winds would quickly blow most of the radioactive materials away, leaving even most of the initially affected area quickly reusable. The only known weapon that could take out an area of this size for an extended period of time would be a nuclear weapon. It seems more likely that the target of such a horrific device would be a densely populated area, not a port.

Terrorism doesn't affect growth


Haveman and Shatz 06 (Jon and Howard, Public Policy Institute of California,
Protecting the Nation's Seaports: Balancing Security and Cost,
http://www.ppic.org/content/pubs/report/R_606JHR.pdf)

September 11 Did Not Cause the 2001 Recession. There is a strong tendency to blame too many secondary effects on disasters. A good example of this phenomenon is found in the September 11 attacks on New York and Washington, D.C. In the days after the attacks, the rhetoric regarding the potential effect on the national economy was both loud and wrong. The theory proposed by many analysts and journalists was that psychologically fragile consumers in the United States would suffer a crisis and stop spending, driving the economy into a deeper recession. Support for this theory came from the first Gulf War, which supposedly caused a similar consumer crisis of confidence that in turn drove us into a recession in 1990. For example, the Wall Street Journal reported on September 13, 2001: "Past shocks to America's sense of security, such as the Oklahoma City bombing or the Gulf War, have prompted consumers to pull back temporarily on major purchases and other discretionary spending, said Richard Curtin, director of surveys of consumers at the University of Michigan. He expects a similar reaction now, which could mean a rough time in the next several weeks for the economy, which was already struggling with rising jobless rates and high consumer debt burdens. 'We were teetering on the edge, and this might well push us over,' said Mr. Curtin." This hypothesis ignores the facts and completely overstates the psychological fragility of American consumers. The 1990 recession was not caused by the first Gulf War at all. Residential investment and expenditures on consumer durables typically are leading indicators of the economy. When spending on these items begins to fall as a percentage of gross domestic product (GDP), this is a strong indication of an underlying weakness in the economy that will create a recession. Expenditures in these two sectors had dropped from 14 percent of GDP to below 12 percent of GDP in the three years preceding the 1990 downturn -- and before the Gulf War. There has never been such a drop that did not eventually lead to a recession, with one exception -- in 1967, when the economy was wobbling, appearing to be on the verge of recession, the sharp increase in spending for the Vietnam War propelled the economy forward. This was just the reverse of what Mr. Curtin suggested.

Economic growth causes extinction- we should allow the collapse to occur
Barry 08 [Dr. Glen Barry PhD in Land Resources from Wisconsin-Madison
University, January 14, 2008, Economic Collapse and Global Ecology,
http://www.countercurrents.org/barry140108.htm]

Bright greens take the continued existence of a habitable Earth with viable, sustainable populations of all species including humans as the ultimate truth and the meaning of life. Whether this is possible in a time of economic collapse is crucially dependent upon whether enough ecosystems and resources remain post collapse to allow humanity to recover and reconstitute sustainable, relocalized societies. It may be better for the Earth and humanity's future that economic collapse comes sooner rather than later, while more ecosystems and opportunities to return to nature's fold exist. Economic collapse will be deeply wrenching -- part Great Depression, part African famine. There will be starvation and civil strife, and a long period of suffering and turmoil. Many will be killed as balance returns to the Earth. Most people have forgotten how to grow food and that their identity is more than what they own. Yet there is some justice, in that those who have lived most lightly upon the land will have an easier time of it, even as those super-consumers living in massive cities finally learn where their food comes from and that ecology is the meaning of life. Economic collapse now means humanity and the Earth ultimately survive to prosper again. Human suffering -- already the norm for many, but hitting the currently materially affluent -- is inevitable given the degree to which the planet's carrying capacity has been exceeded. We are a couple decades at most away from societal strife of a much greater magnitude as the Earth's biosphere fails. Humanity can take the bitter medicine now, and recover while emerging better for it; or our total collapse can be a final, fatal death swoon.

Economic decline doesn't cause war


Tir 10 [Jaroslav Tir - Ph.D. in Political Science, University of Illinois at Urbana-Champaign and is an Associate Professor in the Department of International Affairs
at the University of Georgia, Territorial Diversion: Diversionary Theory of War and
Territorial Conflict, The Journal of Politics, 2010, Volume 72: 413-425]

Empirical support for the economic growth rate is much weaker. The finding that poor economic performance is associated with a higher likelihood of territorial conflict initiation is significant only in Models 3-4.14 The weak results are not altogether surprising given the findings from prior literature. In accordance with the insignificant relationships of Models 1-2 and 5-6, Ostrom and Job (1986), for example, note that the likelihood that a U.S. President will use force is uncertain, as the bad economy might create incentives both to divert the public's attention with a foreign adventure and to focus on solving the economic problem, thus reducing the inclination to act abroad. Similarly, Fordham (1998a, 1998b), DeRouen (1995), and Gowa (1998) find no relation between a poor economy and U.S. use of force. Furthermore, Leeds and Davis (1997) conclude that the conflict-initiating behavior of 18 industrialized democracies is unrelated to economic conditions, as do Pickering and Kisangani (2005) and Russett and Oneal (2001) in global studies. In contrast, and more in line with my findings of a significant relationship (in Models 3-4), Hess and Orphanides (1995), for example, argue that economic recessions are linked with forceful action by an incumbent U.S. president. Furthermore, Fordham's (2002) revision of Gowa's (1998) analysis shows some effect of a bad economy and DeRouen and Peake (2002) report that U.S. use of force diverts the public's attention from a poor economy. Among cross-national studies, Oneal and Russett (1997) report that slow growth increases the incidence of militarized disputes, as does Russett (1990) -- but only for the United States; slow growth does not affect the behavior of other countries. Kisangani and Pickering (2007) report some significant associations, but they are sensitive to model specification, while Tir and Jasinski (2008) find a clearer link between economic underperformance and increased attacks on domestic ethnic minorities. While none of these works has focused on territorial diversions, my own inconsistent findings for economic growth fit well with the mixed results reported in the literature.15 Hypothesis 1 thus receives strong support via the unpopularity variable but only weak support via the economic growth variable. These results suggest that embattled leaders are much more likely to respond with territorial diversions to direct signs of their unpopularity (e.g., strikes, protests, riots) than to general background conditions such as economic malaise. Presumably, protesters can be distracted via territorial diversions while fixing the economy would take a more concerted and prolonged policy effort. Bad economic conditions seem to motivate only the most serious, fatal territorial confrontations. This implies that leaders may be reserving the most high-profile and risky diversions for the times when they are the most desperate, that is when their power is threatened both by signs of discontent with their rule and by more systemic problems plaguing the country (i.e., an underperforming economy).

No impact to economic decline


D. Scott Bennett and Timothy Nordstrom, February 2000. Department of Political Science Professors at Pennsylvania State. "Foreign Policy Substitutability and Internal Economic Problems in Enduring Rivalries," Journal of Conflict Resolution, Ebsco.
In this analysis, we focus on using economic conditions to understand when rivalries are likely to escalate or end. Rivalries are an appropriate set of cases to use when examining substitutability both because leaders in rival states have clearly substitutable choices and because rivalries are a set of cases in which externalization is a particularly plausible policy option.7 In particular, when confronted with domestic problems, leaders in a rivalry have the clear alternatives of escalating the conflict with the rival to divert attention or to work to settle the rivalry as a means of freeing up a substantial amount of resources that can be directed toward solving internal problems. In the case of the diversion option, rivals provide logical, believable actors for leaders to target; the presence of a clear rival may offer unstable elites a particularly inviting target for hostile statements or actual conflict as necessary. The public and relevant elites already consider the rival a threat or else the rivalry would not have continued for an extended period; the presence of disputed issues also provides a casus belli with the rival that is always present. Rivals also may provide a target where the possible costs and risks of externalization are relatively controlled. If the goal is diversion, leaders will want to divert attention without provoking an actual (and expensive) war. Over the course of many confrontations, rival states may learn to anticipate response patterns, leading to safer disputes or at least to leaders believing that they can control the risks of conflict when they initiate a new confrontation. In sum, rivals provide good targets for domestically challenged political leaders. This leads to our first hypothesis, which is as follows: Hypothesis 1: Poor economic conditions lead to diversionary actions against the rival. Conflict settlement is also a distinct route to dealing with internal problems that leaders in rivalries may pursue when faced with internal problems. Military competition between states requires large amounts of resources, and rivals require even more attention. Leaders may choose to negotiate a settlement that ends a rivalry to free up important resources that may be reallocated to the domestic economy. In a guns versus butter world of economic trade-offs, when a state can no longer afford to pay the expenses associated with competition in a rivalry, it is quite rational for leaders to reduce costs by ending a rivalry. This gain (a peace dividend) could be achieved at any time by ending a rivalry. However, such a gain is likely to be most important and attractive to leaders when internal conditions are bad and the leader is seeking ways to alleviate active problems. Support for policy change away from continued rivalry is more likely to develop when the economic situation sours and elites and masses are looking for ways to improve a worsening situation. It is at these times that the pressure to cut military investment will be greatest and that state leaders will be forced to recognize the difficulty of continuing to pay for a rivalry. Among other things, this argument also encompasses the view that the cold war ended because the Union of Soviet Socialist Republics could no longer compete economically with the United States. Hypothesis 2: Poor economic conditions increase the probability of rivalry termination. Hypotheses 1 and 2 posit opposite behaviors in response to a single cause (internal economic problems). As such, they demand a research design that can account for substitutability between them.

Water

Water Wars-Solvency
Scarcity drives Middle Eastern cooperation
Sikimic 11

[Simona, author for The Daily Star, Lebanon News. Jan. 21, 2011, Water scarcity can be source for regional cooperation: report, http://www.dailystar.com.lb/News/Lebanon-News/2011/Jan-21/61157-water-scarcity-can-be-source-for-regionalcooperation-report.ashx#axzz36jugkuUs //jweideman]
BEIRUT: Water scarcity in the region can be channeled for a common good and used to reduce, rather than ignite conflict, an environmental report released Thursday has claimed. "The Blue Peace: Rethinking Middle East Water," launched at the Beirut-based Carnegie Middle East Center, proposes radical cooperation between the six concerned states: Lebanon, Israel, the Palestinian Territories, Jordan, Iraq and Turkey, and envisages the neighbors setting up a mutual monitoring system to guarantee collaboration and more equal partitioning of resources. "The social and economic development of nations depends on water availability in terms of quantity and quality," said Fadi Comair, the president of Mediterranean Network of River Basin Organizations. "It holds a major place on the diplomatic agenda of the [six] governments," Comair said. River flows in Turkey, Syria, Iraq, Lebanon and Jordan have been depleted by 50 to 90 percent in the last 50 years alone, while the vital Jordan River, which acts as a water source for five of the concerned countries, has decreased its discharge by over 90 percent from 1960, the report said. "This is a serious problem," said Sundeep Waslekar, the president of the India-based think tank Strategic Foresight Group, which coordinated the report's compilation. "Especially as water demand is rising and consumption has gone up from [an estimated] 10-15 percent 50 years ago to 37 percent now." With consumer requirements predicted to increase to 50-60 percent over the next decade, further pressure will be put on ever-dwindling supplies, he said. But hydrodiplomacy as outlined in the report has the potential to alleviate "conflicts on trans-boundary watercourse between riparian states [which] will intensify more and more, especially in the Middle East," Comair added. Some moves toward cooperation have already been made, especially between Syria and Lebanon, which have signed two important accords to partition resources from the Orontes and Nahr al-Kabir rivers.

Trade

No Solvency-Alt cause
Too many alt causes- trade is declining in the SQ
IEC 09 (The International Economy is a specialized quarterly magazine covering global financial policy, economic trends, and international trade.
Spring of 2009. Collapse in World Trade from The Magazine of International Economic Policy. http://www.internationaleconomy.com/TIE_Sp09_WorldTrade.pdf July 8, 2014)

The collapse of trade since the summer of 2008 has been absolutely terrifying, more so insofar as we lack an adequate understanding of its causes. Murky protectionism has played a role. Disruptions to the supply of trade credit from international banks in particular have negatively impacted some countries. The most important factor is probably the growth of global supply chains, which has magnified the impact of declining final demand on trade. When a U.S. household decides not to buy a $40,000 Cayenne sport utility vehicle from Germany, German exports to the United States go down by $40,000, but Slovakian exports to Germany go down by perhaps half that amount, since while the final assembly is done in Leipzig, the coachwork is done in Bratislava. All this said, it really is the case that we don't fully understand the relative importance of the effects. If it is any consolation, there are signs that trade will rise with recovery every bit as fast as it fell with the onset of the crisis.
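A brief worked version of the supply-chain magnification in the card above (using its own Cayenne numbers; the one-half Slovak share is the card's "perhaps half" estimate): one cancelled $40,000 final sale removes $40,000 of German exports plus about $20,000 of Slovak exports, so recorded gross trade falls by

$$\frac{\$40{,}000 + \$20{,}000}{\$40{,}000} = 1.5$$

dollars for every dollar of lost final demand, because the same value crosses borders more than once. This is why measured trade can collapse faster than GDP even with no new protectionism.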

Too many other reasons prohibit economic growth.


IEC 09 (The International Economy is a specialized quarterly magazine covering global financial policy, economic trends, and international trade. Spring of 2009. "Collapse in World Trade" from The Magazine of International Economic Policy. http://www.international-economy.com/TIE_Sp09_WorldTrade.pdf July 8, 2014)

There are several reasons why international trade has collapsed faster than global
GDP. First, the decline in manufacturing output has exceeded the decline in real
GDP. For example, U.S. industrial production in the first quarter of 2009 was 14
percent lower than its peak in the fourth quarter of 2007. In contrast, the level of
real GDP in the United States in the first quarter of 2009 was only 3.2 percent below
its peak. Demand for tradable goods is largely derived from demand for
manufactured goods, so the collapse in trade reflects the collapse in manufacturing.
Second, the decline in global trade has been exacerbated by business efforts to
conserve cash rather than build up inventories to prior levels. Trade flows are being
buffeted not only by a general decline in activity, but also by this temporary move
to lower inventories. Third, the decline in trade reflects the crisis in finance. Credit
plays a critical role in international trade, and the disruption in global credit markets
has restricted flows of credit needed to support trade. Finally, the boom in
commodities-related trade has been replaced by gloom. Prices for many
commodities are falling and ports are packed with imports (think Houston and oil-country tubular goods) for which there is currently limited demand. This new reality
is forcing even hyperactive exporters to cut back shipments dramatically.

No Trade War
Trade war won't happen.
Fletcher 2010
Ian, Senior Economist of the Coalition for a Prosperous America, former Research
Fellow at the U.S. Business and Industry Council, The Mythical Concept of Trade War,
April 2nd 2010, http://www.huffingtonpost.com/ian-fletcher/the-mythical-concept-of-t_b_523864.html
As Americans ponder how to get the U.S. out of its current trade mess, we are constantly warned to do nothing - like
impose a tariff to neutralize Chinese currency manipulation - that would trigger a "trade war." Supposedly, no
matter how bad our problems with our trading partners get, they are less bad than the spiraling catastrophe that would ensue if we walked a single inch away from our current policy of unilateral free trade. But the curious thing about the concept of trade war is that, unlike actual shooting war, it has no historical precedent. In fact, the reality is that there has never been a significant
trade war, "significant" in the sense of having done serious economic damage. All history records are minor
skirmishes at best. The standard example free traders give is that America's Smoot-Hawley
tariff of 1930 either caused the Great Depression or made it spread around the world. But this canard does not
survive serious examination, and has actually been denied by almost every
economist who has actually researched the question in depth--a group ranging from
Paul Krugman on the left to Milton Friedman on the right. The Depression's cause
was monetary. The Fed allowed the money supply to balloon during the late 1920s, piling up in the stock
market as a bubble. It then panicked, miscalculated, and let it collapse by a third by 1933, depriving the economy

of the liquidity it needed to breathe. Trade had nothing to do with it. As for the charge that Smoot caused the Depression to spread worldwide: it was too small a change to have plausibly so large an effect. For a start, it only applied to about one-third of America's trade: about 1.3 percent of our GDP. Our average tariff on dutiable goods went from 44.6 to 53.2 percent--not a terribly big jump. Tariffs were higher in almost every year from 1821 to 1914. Our tariff went up in 1861, 1864, 1890, and 1922 without producing global depressions, and the recessions of 1873 and 1893 managed to spread worldwide without tariff increases. Neither does the myth of a death spiral of retaliation by foreign nations hold water. According to the official State Department report on this question in 1931: "With the exception of discriminations in France, the extent of discrimination against American commerce is very slight... By far the largest number of countries do not discriminate against the commerce of the United States in any way." "Notorious" Smoot-Hawley is a deliberately fabricated myth, plain and simple. There is a basic
unresolved paradox at the bottom of the very concept of trade war. If, as free traders
insist, free trade is beneficial whether or not one's trading partners reciprocate, then
why would any rational nation start one, no matter how provoked? The only way to
explain this is to assume that major national governments like the Chinese and the U.S.--governments which, whatever bad things they may have done, have managed to hold nuclear weapons for decades without nuking each other over trivial spats--are not players of realpolitik, but schoolchildren.
When the moneymen in Beijing, Tokyo, Berlin, and the other nations currently running trade surpluses against the
U.S. start to ponder the financial realpolitik of exaggerated retaliation against the U.S. for any measures we may
employ to bring our trade back into balance, they will discover the advantage is with us, not them. Because they
are the ones with trade surpluses to lose, not us. So our position of weakness is actually a position of strength.

Supposedly, China can suddenly stop buying our Treasury Debt if we rock the boat.
But this would immediately reduce the value of the trillion or so they already hold--not to mention destroying, by making their hostility overt, the fragile (and desperately-tended) delusion
in the U.S. that America and China are still benign economic "partners" in a win-win economic relationship.

At the end of the day, China cannot force us to do anything economically that we don't choose to. America is still a
nuclear power. We can--an irresponsible but not impossible scenario--repudiate our debt to them (or stop paying the
interest) as the ultimate countermove to anything they might contemplate. More plausibly, we might simply restore

the tax on the interest on foreign-held bonds that was repealed in 1984 thanks to Treasury Secretary Donald Regan.

A certain amount of back-and-forth token retaliation (and loud squealing) is indeed likely if
America starts defending its interests in trade as diligently as our trading partners
have been defending theirs, but that's it. After all, the world trading system has
survived their trade barriers long enough without collapsing.

Won't be hot wars


Erixon and Sally, directors-ECIPE, 10 (Fredrik and Razeen, European Centre for International Political Economy, TRADE, GLOBALISATION AND EMERGING PROTECTIONISM SINCE THE CRISIS, http://www.ecipe.org/media/publication_pdfs/trade-globalisation-and-emerging-protectionism-since-the-crisis.pdf)

But in one other important respect the comparison with the 1930s is highly
misleading. Then, tit-for-tat trade protection rapidly followed the Wall Street Crash,
and the world splintered into warring trade blocs. This has not happened today, and it is unlikely to happen anytime soon. As we will discuss in the next section, crisis-related protectionism today is remarkably restrained.
Multilateral trade rules, international policy cooperation and market-led
globalisation provide defences against a headlong descent into 1930s-style
protectionism.

No retaliation
Erixon and Sally, directors-ECIPE, 10 (Fredrik and Razeen, European Centre
for International Political Economy, TRADE, GLOBALISATION AND EMERGING PROTECTIONISM
SINCE THE CRISIS, http://www.ecipe.org/media/publication_pdfs/trade-globalisation-and-emerging-protectionism-since-the-crisis.pdf)
Perhaps the biggest surprise is that the world has not hurtled into tit-for-tat protectionism. That spectre was real enough in late 2008, given the scale of growth contraction and deglobalisation. The good news is documented in WTO updates on new trade measures in 2009. New protectionist measures have appeared--what the WTO refers to as "policy slippage"--but they are remarkably mild. They affect a maximum of 1 per cent of world trade in goods, and protectionism in trade in services has not increased noticeably. New protectionism is concentrated in sectors that have long been protected: textiles, clothing, footwear, iron, steel, consumer electronics and agriculture. Trade remedies have increased, as they do in economic downturns. New antidumping (AD) investigations increased by 15 per cent from mid 2008 to mid 2009, and there has been a marked pickup in new investigations for safeguards and countervailing duties. Signs are that trade-defence filings and investigations are mounting quarter-by-quarter, which points to a sharp increase in imposed duties in 2010 and 2011. Nevertheless, new investigations affect a tiny share of world trade. For example, new AD investigations affect 0.4 per cent of the value of US and EU imports. Finally, up to one-third of new trade measures have been liberalising. These include tariff reductions, removal of export restrictions and FDI liberalisation in several developing countries.

No Impact-Trade doesn't solve war


Interdependence doesn't solve war
Layne 8 (Christopher Layne is a professor at Texas A&M University's George H. W. Bush School of Government and Public Service. He is author of The Peace of Illusions: American Grand Strategy from 1940 to the Present (Cornell University Press, 2006), and (with Bradley A. Thayer) American Empire: A Debate (Routledge, 2007). "China's Challenge to US Hegemony," http://acme.highpoint.edu/~msetzler/IR/IRreadingsbank/chinauscontain.ch08.6.pdf //Donnie)
Proponents of engagement have also argued that the United States can help foster political liberalization in China by integrating the country into the international economy and embedding it in the complex web of international institutional arrangements. A China so engaged, it is said, will have strong interests in cooperation and will not be inclined to pursue security competition with America or with its Asian neighbors. Engagement is a problematic strategy, however, because it rests on a shaky foundation. The conventional wisdom notwithstanding, there is little support in the historical record for the idea that economic interdependence leads to peace. After all, Europe never was more interdependent (not only economically but also, among the ruling elites, intellectually and culturally) than before World War I. It was famously predicted, on the eve of World War I, that the economic ties among Europe's great powers had ushered in an era in which war among them was unthinkable. Yet, as we know, the prospect of forgoing the economic gains of trade did not stop Europe's great powers from fighting a prolonged and devastating war. Beijing's actual foreign policy furnishes a concrete reason to be skeptical of the argument that interdependence leads to peace. China's behavior in the 1996 crisis with Taiwan (during which it conducted missile tests in waters surrounding the island in the run-up to Taiwan's presidential election) suggested it was not constrained by fears that its muscular foreign policy would adversely affect its overseas trade. Of course, during the past decade, China has been mindful of its stake in international trade and investment. But this does not vindicate the U.S. strategy of engagement. China's current policy reflects the fact that, for now, Beijing recognizes its strategic interest in preserving peace in East Asia. Stability in the region, and in Sino-American relations, allows China to become richer and to catch up to the United States in relative power. For a state in China's position vis-à-vis the United States, this is the optimal realpolitik strategy: buying time for its economy to grow so that the nation can openly balance against the United States militarily and establish its own regional hegemony in East Asia. Beijing is pursuing a peaceful policy today in order to strengthen itself to confront the United States tomorrow.
