2NC Impact-Education
Definitions are key to education about IOOS
IOOS report to congress 13
[Official US IOOS report sent to congress. 2013, U.S. Integrated Ocean Observing
System (U.S. IOOS) 2013 Report to Congress, http://www.ioos.noaa.gov/about/governance/ioos_report_congress2013.pdf
//jweideman]
The IOOC recognizes that U.S. IOOS must be responsive to environmental crises while maintaining the regular long-term ocean observation infrastructure required to support operational oceanography and climate research. As a source of our Nation's ocean data and products, U.S. IOOS often serves as a resource for the development of targeted applications for a specific location or sector. At the same time, U.S. IOOS organizes data from across regions and sectors to foster the national and international application of local data and products broadly across oceans, coasts, and Great Lakes. Events over the last few years, including Hurricane Sandy and the Deepwater Horizon oil spill, have awakened U.S. communities to the value and necessity of timely ocean information. IOOC commends U.S. IOOS for responsive and capable support to the Nation in these events in addition to diverse everyday support to the Nation's maritime economy. We have much more work to do to build and organize the ocean-observing infrastructure of the Nation and look forward to working with Congress on this continuing challenge.
[Federal Agency Name(s): National Ocean Service (NOS), National Oceanic and Atmospheric
Administration (NOAA), Department of Commerce Funding Opportunity Title: FY2014 Marine Sensor and Other Advanced Observing
Technologies Transition Project. ANNOUNCEMENT OF FEDERAL FUNDING OPPORTUNITY EXECUTIVE SUMMARY,
http://www.ioos.noaa.gov/funding/fy14ffo_msi_noaa_nos_ioos_2014_2003854.pdf //jweideman]
1. Marine Sensor Transition Topic: U.S. IOOS seeks to increase the rate at which new or existing marine sensor technologies are transitioned into operations mode in order to facilitate the efficient collection of ocean, coastal and Great Lakes observations. The Marine Sensor Transition topic is focused on transitioning marine sensors from research to operations mode to meet the demonstrated operational needs of end-users. Letters of Intent (LOIs) are being solicited for this topic with particular emphasis on a) projects comprised of multi-sector teams of partners, b) projects that will meet the demonstrated operational needs of end-users, and c) sensors that are at or above TRL 6. Applicants with sensors for ocean acidification that are at or above TRL 6 are also eligible to apply to this topic if they have strong commitments for operational transition.
Observations are of little value if they cannot be found, accessed, and transformed into useful products. The U.S. IOOS Data Management and Communications subsystem, or DMAC, is the central operational infrastructure for assessing, disseminating, and integrating existing and future ocean observations data. As a core functional component for U.S. IOOS, establishing DMAC capabilities continues to be a principal focus for the program and a primary responsibility of the U.S. IOOS Program Office in NOAA.
Importance and Objectives of DMAC
Although DMAC implementation remains a work in progress, a fully implemented DMAC subsystem will be capable of delivering real-time, delayed-mode, and historical data, including in situ and remotely sensed physical, chemical, and biological observations as well as model-generated outputs such as forecasts, to U.S. IOOS users, and of delivering all forms of data to and from secure archive facilities. Achieving this requires a governance framework for recommending and promoting standards and policies to be implemented by data providers across the U.S. IOOS enterprise, to provide seamless long-term preservation and reuse of data across regional and national boundaries and across disciplines. The governance framework includes tools for data access, distribution, discovery, visualization, and analysis; standards for metadata, vocabularies, and quality control and quality assurance; and procedures for the entire ocean data life cycle. The DMAC design must be responsive to user needs and it must, at a minimum, make data and products discoverable and accessible, and provide essential metadata regarding sources, methods, and quality. The overall DMAC objectives are for U.S. IOOS data providers to develop and maintain capabilities to:
- Deliver accurate and timely ocean observations and model outputs to a range of consumers, including government, academic, private sector users, and the general public, using specifications common across all providers
- Deploy the information system components (including infrastructure and relevant personnel) for full life-cycle management of observations, from collection to product creation, public delivery, system documentation, and archiving
- Establish robust data exchange responsive to variable customer requirements as well as routine feedback, which is not tightly bound to a specific application of the data or particular end-user
U.S. IOOS data providers therefore are being encouraged to address the following DMAC-specific objectives:
- A standards-based foundation for DMAC capabilities: U.S. IOOS partners must clearly demonstrate how they will ensure the establishment and maintenance of a standards-based approach for delivering their ocean observations data and associated products to users through local, regional and global/international data networks
- Exposure of and access to coastal ocean observations: U.S. IOOS partners must describe how they will ensure coastal ocean observations are exposed to users via a service-oriented architecture and recommended data services that will ensure increased data interoperability, including the use of improved metadata and uniform quality-control methods
- Certification and governance of U.S. IOOS data and products: U.S. IOOS partners must present a description of how they will participate in establishing an effective U.S. IOOS governance process for data certification standards and compliance procedures. This objective is part of an overall accreditation process which includes the other U.S. IOOS subsystems (observing, modeling and analysis, and governance)
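The standards-based, service-oriented access these DMAC objectives call for is typically exercised through RESTful query services. As a minimal sketch only, assuming a hypothetical ERDDAP-style endpoint (the base URL, dataset ID, and variable names below are invented for illustration and do not come from the announcement), a client might assemble a query like this:

```python
from urllib.parse import quote

def build_tabledap_query(base_url, dataset_id, variables, constraints):
    """Assemble an ERDDAP-style tabledap CSV query URL.

    variables   -- variable names to request, e.g. ["time", "latitude"]
    constraints -- constraint expressions, e.g. ["time>=2014-01-01T00:00:00Z"]
    """
    var_part = ",".join(variables)
    # Constraints are appended with a leading '&'; keep comparison operators
    # and the time separator unescaped, percent-encode everything else.
    cons_part = "".join("&" + quote(c, safe="=><:") for c in constraints)
    return f"{base_url}/tabledap/{dataset_id}.csv?{var_part}{cons_part}"

# Hypothetical dataset and variables, for illustration only.
url = build_tabledap_query(
    "https://example.ioos.example/erddap",
    "buoy_obs",
    ["time", "latitude", "longitude", "sea_water_temperature"],
    ["time>=2014-01-01T00:00:00Z"],
)
print(url)
```

The point of the sketch is the DMAC principle rather than any particular service: because every provider publishes against the same query convention and metadata standards, one client can discover and pull observations across all regions.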
Central government
AHD 92 (American Heritage Dictionary of the English Language, p. 647)
federal 3. Of or relating to the central government of a federation as distinct from the governments of its member units.
[Official US IOOS report sent to congress. 2013, U.S. Integrated Ocean Observing
System (U.S. IOOS) 2013 Report to Congress, http://www.ioos.noaa.gov/about/governance/ioos_report_congress2013.pdf
//jweideman]
U.S. IOOS works with its eyes on the future. The successes of U.S. IOOS are achieved through
cooperation and coordination among Federal agencies, U.S. IOOS Regional Associations, State
and regional agencies, and the private sector. This cooperation and coordination requires a sound
governance and management structure. In 2011 and 2012, program milestones called for in U.S. IOOS legislation
were achieved, laying the groundwork for more success in the future. First, the U.S. IOOS Advisory Committee was established. Second, the Independent Cost Estimate was delivered to Congress. As part of the estimate, each of the 11 U.S. IOOS Regional Associations completed 10-year build-out plans, describing services and products to address local user needs and outlining key assets required to meet the Nation's greater ocean-observing needs.
Disads
Politics
environmental protections that have existed for decades. To put that in perspective, that was almost one in every five votes held in Congress during the past two years. These were votes to allow additional oil and gas drilling in coastal waters, while simultaneously limiting the environmental review process for offshore drilling sites. There were repeal attempts to undermine the Clean Water Act and to roll back protections for threatened fish and other
National Ocean Policy. Murkowski also pushed for an additional $3 million for regional fishery management councils and secured $15 million for the
Pacific Salmon Treaty that was in line to be cut by NOAA's proposed budget (for $65 million total). On April 24, the full Senate Appropriations Committee
approved the Commerce Department budget with language inserted by Sen. John Kerry, D-Mass., and Sen. Olympia Snowe, R-Maine, into NOAA's budget
that would transfer $119 million currently unrestricted funds and require they be used for stock assessments, surveys and monitoring, cooperative
research and fisheries grants. The $119 million is derived from Saltonstall-Kennedy funds, which are levies collected on seafood imports by the
Department of Agriculture. Thirty percent of the import levies are transferred to NOAA annually, and without Kerry's language there are no restrictions on
how NOAA may use the funds.
has drawn as much fire from both parties as NOAA and its Administrator Jane Lubchenco. Sen. Scott
Brown, R-Mass., has repeatedly demanded accountability for NOAA Office of Law Enforcement abuses uncovered by the Commerce Department Inspector
General that included the use of fishermen's fines to purchase a luxury boat that was only used for joyriding around Puget Sound. There is currently
another Inspector General investigation under way into the regional fishery management council rulemaking process that was requested last August by
Massachusetts Reps. John Tierney and Barney Frank, both Democrats. In July 2010, both Frank and Tierney called for Lubchenco to step down, a
remarkable statement for members of Obama's party to make about one of his top appointments. Frank introduced companion legislation to Kerry's in
the House earlier this year, where it should sail through in a body that has repeatedly stripped out tens of millions in budget requests for catch share
programs. Catch share programs are Lubchenco's favored policy for fisheries management and have been widely panned after implementation in New
England in 2010 resulted in massive consolidation of the groundfish catch onto the largest fishing vessels. Another New England crisis this year with Gulf
of Maine cod also drove Kerry's action after a two-year old stock assessment was revised sharply downward and threatened to close down the fishery.
Unlike many fisheries in Alaska such as pollock, crab and halibut, there are not annual stock assessment surveys around the country. Without a new stock
assessment for Gulf of Maine cod, the 2013 season will be in jeopardy. "I applaud Senator Kerry for his leadership on this issue and for making sure that
this funding is used for its intended purpose - to help the fishing industry, not to cover NOAA's administrative overhead," Frank said in a statement. "We
are at a critical juncture at which we absolutely must provide more funding for cooperative fisheries science so we can base management policies on
sound data, and we should make good use of the world-class institutions in the Bay State which have special expertise in this area." Alaska's Sen. Mark
Begich and Murkowski, as well as Rep. Don Young have also denounced the National Ocean Policy as particularly misguided, not only for diverting core
funding in a time of tightening budgets but for creating a massive new bureaucracy that threatens to overlap existing authorities for the regional fishery
management councils and local governments. The first 92 pages of the draft policy released Jan. 12 call for more than 50 actions, nine priorities, a new
National Ocean Council, nine Regional Planning Bodies tasked with creating Coastal Marine Spatial Plans, several interagency committees and taskforces,
pilot projects, training in ecosystem-based management for federal employees, new water quality standards and the incorporation of the policy into
regulatory and permitting decisions. Some of the action items call for the involvement of as many as 27 federal agencies. Another requires high-quality
marine waters to be identified and new or modified water quality and monitoring protocols to be established. Young hosted a field hearing of the House
Natural Resources Committee in Anchorage April 3 where he blasted the administration for refusing to explain exactly how it is paying for implementing
the National Ocean Policy. "This National Ocean Policy is a bad idea," Young said. "It will create more uncertainty for businesses and will limit job growth.
It will also compound the potential for litigation by groups that oppose human activities. To make matters worse, the administration refuses to tell
Congress how much money it will be diverting from other uses to fund this new policy." Natural Resources Committee Chairman Doc Hastings, R-Wash.,
sent a letter to House Appropriations Committee Chairman Hal Rogers asking that every appropriations bill expressly prohibit any funds to be used for
implementing the National Ocean Policy. Another letter was sent April 12 to Rogers by more than 80 stakeholder groups from the Gulf of Mexico to the
Bering Sea echoing the call to ban all federal funds for use in the policy implementation. "The risk of unintended economic and societal consequences
remains high, due in part to the unprecedented geographic scale under which the policy is to be established," the stakeholder letter states. "Concerns are
further heightened because the policy has already been cited as justification in a federal decision restricting access to certain areas for commercial
activity."
to implement the National Ocean Policy, but the administration released its draft implementation policy
Russian Oil
[Nicholas M. Short, Sr is a geologist who received degrees in that field from St. Louis University (B.S.), Washington
University (M.A.), and the Massachusetts Institute of Technology (Ph.D.); he also spent a year in graduate studies in the geosciences
at The Pennsylvania State University. In his early post-graduate career, he worked for Gulf Research & Development Co., the
Lawrence Livermore Laboratory, and the University of Houston. During the 1960s he specialized in the effects of underground
nuclear explosions and asteroidal impacts on rocks (shock metamorphism), and was one of the original Principal Investigators of the
Apollo 11 and 12 moon rocks. 2011, Finding Oil and Gas from Space https://apollomapping.com/wpcontent/user_uploads/2011/11/NASA_Remote_Sensing_Tutorial_Oil_and_Gas.pdf //jweideman]
Geophysicists is a simplified summary of the basics of hydrocarbon exploration. Oil and gas result from the decay of organisms mostly marine plants (especially microscopic algae and similar free-floating vegetation) and small animals such as fish - that are
buried in muds that convert to shale. Heating through burial and pressure from the overlying later sediments help in the process.
(Coal forms from decay of buried plants that occur mainly in swamps and lagoons which are eventually buried by younger
sediments.). The decaying liquids and gases from petroleum source beds, dominantly shales after muds convert to hard rock,
migrate from their sources to become trapped in a variety of structural or stratigraphic conditions shown in this illustration: The oil
and gas must migrate from deeper source beds into suitable reservoir rocks. These are usually porous sandstones, but limestones
with solution cavities and even fractured igneous or metamorphic rocks can contain openings into which the petroleum products
to the surface - either naturally when the trap is intersected by downward moving erosional surfaces or by being penetrated by a
drill. If pressure is high the oil and/or gas moves of its own accord to the surface but if pressure is initially low or drops over time,
pumping is required.
indicators of surface anomalies. This diagram sets the framework for the approach used; this is the so-called
microseepage model, which leads to specific geochemical anomalies:
[Louis, Contributor at Forbes, entrepreneur and investor. 3/3/14, It's Time To Drive Russia Bankrupt Again
http://www.forbes.com/sites/louiswoodhill/2014/03/03/its-time-to-drive-russia-bankrupt-again/ //jweideman]
The high oil prices of 1980 were not real, and Reagan knew it. They were being caused by the weakness of
the U.S. dollar, which had lost 94% of its value in terms of gold between 1969 and 1980. Reagan immediately
decontrolled U.S. oil prices, to unleash the supply side of the U.S. economy . Even more
importantly, Reagan backed Federal Reserve Chairman Paul Volcker's campaign to strengthen and stabilize the U.S. dollar. By the end of Reagan's two terms in office, real oil prices had plunged to $27.88/bbl. As
Russia does today, the old USSR depended upon oil exports for most of its foreign
exchange earnings, and much of its government revenue. The 68% reduction in real
oil prices during the Reagan years drove the USSR bankrupt. In May 1990, Gorbachev called
German Chancellor Helmut Kohl and begged him for a loan of $12 billion to stave off financial disaster. Kohl advanced only $3 billion.
By August of 1990, Gorbachev was back, pleading for more loans. In December 1991, the Soviet Union collapsed .
President Bill Clinton's strong dollar policy (implemented via Federal Reserve Vice-Chairman Wayne Angell's secret commodity price rule system) kept real oil prices low during the 1990s, despite rising world oil demand. Real crude oil prices during Clinton's
From the end of 2000 to the end of 2013, the gold value of the dollar fell by 77%, and real oil prices tripled, to $111.76/bbl. It is
gas ($66.00/FOE* bbl) the total revenue of Russia's petroleum industry is $662.3 billion (26.5% of GDP), and Russia's oil and gas
Nuclear war
Filger 9
[Sheldon, founder of Global Economic Crisis, The Huffington Post, 5.10.9, http://www.huffingtonpost.com/sheldonfilger/russian-economy-faces-dis_b_201147.html // jweideman]
sufficient scope and capability to destroy the world as we know it. For that reason, it is not
only President Medvedev and Prime Minister Putin who will be lying awake at nights over the prospect that a national economic
crisis can transform itself into a virulent and destabilizing social and political upheaval. It just may be possible that U.S.
UQ: econ
Russia's economy is weak, but growth is coming
ITAR-TASS 7/3
[Russian news agency. 7/3/14, Russia's economic growth to accelerate to over 3% in 2017
http://en.itar-tass.com/economy/738849 //jweideman]
The Russian government expects the national economy to grow , despite the unfavorable
economic scenario for this year, Prime Minister Dmitry Medvedev said on Thursday. Medvedev chaired a
government meeting on Thursday to discuss the federal budget parameters and
state program financing until 2017. This scenario (of Russia's social and economic
development) unfortunately envisages a general economic deterioration and slower
economic growth this year. Then, the growth is expected to accelerate gradually to 2%
next year and to over 3% in 2017, Medvedev said. The 2015-2017 federal budget will set
aside over 2.7 trillion rubles ($78 billion) for the payment of wages and salaries, the
premier said, adding the governments social obligations were a priority in the forthcoming budget period. More than a half
of federal budget funds are spent through the mechanism of state programs. In
2015, the government intends to implement 46 targeted state programs to the
amount of over 1 trillion rubles (about $30 billion) in budget financing, the premier said. The same
budget financing principle will now be used by regional and local governments,
including Russia's two new constituent entities - the Republic of Crimea and the city
of Sevastopol, the premier said.
[Agence France-Presse is an international news agency headquartered in Paris. It is the oldest news agency in the
world and one of the largest. 7/10/14, Russia avoids recession in second quarter
https://au.news.yahoo.com/thewest/business/world/a/24425122/russia-avoids-recession-in-second-quarter/ //jweideman]
Moscow (AFP) - Russia managed to avoid sliding into a recession in the second quarter, its deputy economy minister said Wednesday, citing preliminary estimates showing zero growth for the three months ending June. The emerging giant was widely expected to post a technical recession with two consecutive quarters of contraction, after massive capital flight over the uncertainty caused by the Ukraine crisis. "We were expecting a possible technical recession in the second quarter," Russia's deputy economy minister Andrei Klepach said, according to Interfax news agency. But it now "appears that we will avoid recession, our preliminary forecast is one of zero growth after adjusting for seasonal variations," he said. Official data is due to be published by the statistics institute during the summer,
although no specific date has been given. Russia's growth slowed down sharply in the first quarter as investors, fearing the impact
of Western sanctions over Moscow's annexation of Crimea, withdrew billions worth of their assets. In the first three months of the
until now not issued any quarter to quarter growth forecast. Economy minister Alexei Ulyukayev on Monday issued a year-on-year
forecast, saying
[Agence France-Presse is an international news agency headquartered in Paris. It is the oldest news agency in the
world and one of the largest. 7/7/14, Russian output expands 1.2% in second quarter: economy minister
http://www.globalpost.com/dispatch/news/afp/140707/russian-output-expands-12-second-quarter-economy-minister-0 //jweideman]
Russia's economy minister said Monday that output expanded by 1.2 percent in the
second quarter compared to the same period last year , a preliminary figure that is "slightly better"
than expected despite the Ukraine crisis. "The results are slightly better than we predicted, with the
emphasis on 'slightly'," economy minister Alexei Ulyukayev said. He added that the "refined" official figure by Russia's statistics
agency will be released later. The IMF said last month that Russia is already in recession, while the central bank said growth in 2014
UQ AT: sanctions
There won't be more sanctions
Al Jazeera 14
[Al Jazeera also known as Aljazeera and JSC (Jazeera Satellite Channel), is a Doha-based broadcaster owned
by the Al Jazeera Media Network, June 5 2014. G7 holds off from further Russia sanctions
http://www.aljazeera.com/news/europe/2014/06/g7-hold-off-from-further-russia-sanctions-20146545044590707.html //jweideman]
the summit following Russia's annexation of Crimea in March, saying that he was ready to meet Ukraine's president-elect Petro
Poroshenko and US President Barack Obama.
U-Production decline
Oil production is declining
Summers 14
[Dave, author for economonitor, july 7 14, Oil Production Numbers Keep Going Down
http://www.economonitor.com/blog/2014/07/oil-production-numbers-keep-going-down/ //jweideman]
probable in the not-too-distant future. Rising prices continually change this latter condition, and may initially disguise the arrival of
the peak, but it is becoming inevitable. Over the past two years there has been a steady growth in demand, which OPEC expects to
continue at around the 1 mbd range, as has been the recent pattern. The challenge, on a global scale, has been to identify where
the matching growth in supply will come from, given the declining production from older oilfields and the decline rate of most of the
years this is going to turn out to have created a false sense of security, and led to decisions on energy that will not easily be
reversed. Consider that the Canadians have now decided to build their Pipeline to the Pacific. The Northern Gateway pipeline that
Enbridge will build from the oil sands to the port of Kitimat.
[Jennifer, Staff writer for NOLA.com | The Times-Picayune covering energy, banking/finance and general business
news in the greater New Orleans area. 7/8/14, Oil, gas production declining in Gulf of Mexico federal waters, report says
http://www.nola.com/business/index.ssf/2014/07/oil_gas_production_in_federal.html //jweideman]
Oil and gas found off the coast of Louisiana and other Gulf Coast states made up almost
one quarter of all fossil fuel production on federal lands in 2013, reinforcing the region's role as
a driving force in the U.S. energy industry, according to updated government data. But a closer look at the
numbers shows the region's oil and gas production has been in steady decline for
much of the past decade. A new U.S. Energy Information Administration report
shows federal waters in the Gulf of Mexico in 2013 accounted for 23 percent of the
16.85 trillion British thermal units (Btu) of fossil fuels produced on land and water
owned by the federal government. That was more than any other state or region aside from Wyoming, which has
seen strong natural gas production in recent years. The report did not include data on oil and gas production on private lands, which
But
production in the offshore gulf has also fallen every year since 2003. According to
the report, total fossil fuel production in the region is less than half of what it was a
decade ago, down 49 percent from 7.57 trillion Btu in 2003 to 3.86 trillion in 2013. The report notes that the region has seen a sharp decline in natural gas production as older offshore fields dry out and more companies invest in newer gas finds onshore, where hydraulic fracturing has led to a boom in production and makes up most production in many onshore oil and gas fields, including the Haynesville Shale in northwest Louisiana. Natural gas production in the offshore gulf was down 74 percent from 2003 to 2013. The region's oil production has declined, though less drastically. The offshore gulf produced about 447 million
barrels of oil in 2013, down from a high of 584 million barrels in 2010. Still, the region accounted for 69 percent of all the crude oil
produced on all federal lands and waters last year.
Companies purchase leases hoping they will hold enough oil or natural gas to
benefit consumers and become economically viable for production. Companies can
spend millions of dollars to purchase a lease and then explore and develop it, only
to find that it does not contain oil and natural gas in commercial quantities. It is not
unusual for a company to spend in excess of $100 million only to drill a dry hole.
The reason is that a company usually only has limited knowledge of resource
potential when it buys a lease. Only after the lease is acquired will the company be in a position to evaluate it,
usually with a very costly seismic survey followed by an exploration well. If a company does not find oil or natural gas in commercial
quantities, the company hands the lease back to the government, incurs the loss of invested money and moves on to more
promising leases. If a company finds resources in commercial quantities, it will produce the lease. But there sometimes can be
delays, often as long as ten years, for environmental and engineering studies, to acquire permits, to install production facilities
(or platforms for offshore leases) and to build the necessary infrastructure to bring the resources to market. Litigation, landowner
disputes and regulatory hurdles also can delay the process.
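The exploration gamble the passage describes reduces to a simple expected-value calculation; the success probability and payoffs below are invented round numbers for illustration, not industry figures:

```python
def expected_net_value(p_success, payoff_success, sunk_cost):
    """Expected net value of drilling a prospect: the lease and drilling
    cost is sunk whether or not the well finds commercial quantities."""
    return p_success * payoff_success - sunk_cost

# Illustrative only: a 20% chance that a $100M lease-and-drill program
# finds a field worth $700M in net revenue.
ev = expected_net_value(0.20, 700e6, 100e6)
print(f"expected net value: ${ev / 1e6:.0f}M")
```

On these made-up numbers the prospect is worth drilling despite the four-in-five chance of a total loss, which is why companies routinely absorb $100M dry holes and move on to more promising leases.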
Exxon Mobil Corp. is having trouble finding more oil, it revealed yesterday in its
earnings call. The company said it was depleting its reserves, replacing only 95 out
of every 100 barrels it pumps. Though it tried to put a positive spin on things by saying it had made up for the
shortfall by bolstering its natural gas supplies, experts tell the Wall Street Journal that's probably a
shift born of grim necessity. Oil companies across the board are drifting toward
natural gas, because oil is harder and harder to come by , according to the Journal. Most accessible
fields are already tapped out, while new fields are either technically or politically difficult to exploit. The good old days are gone and
not to be repeated, says one analyst. Natural gas is not going to give you the same punch as oil.
Ocean FOCUS began issuing forecasts on 16 February 2006, just in time to warn oil production operators of a new warm eddy that has formed in the oil- and gas-producing region of the Gulf of Mexico. These eddies, similar to underwater hurricanes, spin off the Loop Current, an intrusion of warm surface water that flows northward from the Caribbean Sea through the Yucatan Strait from the Gulf Stream, and can cause extensive and costly damage to underwater equipment due to the extensive deep water oil production activities in the region. The Ocean FOCUS service is a unique service that provides ocean current forecasts to the offshore oil
production industry to give prior warning of the arrival of eddies. The service is
based on a combination of state-of-the-art ocean models and satellite
measurements. Oil companies require early warning of these eddies in order to
minimise loss of production, optimise deep water drilling activities and prevent damage to critical equipment. The
Loop Current and eddies shedding from it pose two types of problems for underwater production systems: direct force and induced vibrations, which create more stress than direct force and result in higher levels of fatigue and structural failure. The impact of these eddies can be very costly in terms of downtime in production and exploration and damage to subsea components.
IOOS supplies critical information about our Nation's waters. Scientists working to understand climate change, governments adapting to changes in the Arctic, municipalities monitoring local water quality, industries understanding coastal and marine spatial planning all have the same need:
[Technical University of Denmark (DTU) 2/27/9, New Oil Deposits Can Be Identified Through Satellite Images
http://www.sciencedaily.com/releases/2009/02/090226110812.htm //jweideman]
Ole Baltazar's map shows variations in gravitational force across
the surface of the Earth, and knowledge about these small variations is a valuable
tool in oil exploration. Subterranean oil deposits are encapsulated in relatively light
materials such as limestone and clay and, because these materials are light, they
have less gravitational force than the surrounding materials. Ole Baltazar's map is based on
satellite measurements and has a hitherto unseen level of detail and accuracy. With this map in your hands, it is, therefore,
easier to find new deposits of oil underground. Climate change is revealing new sea regions. The
gravitational map from DTU Space is unique on account of its resolution of only 2 km and the fact that it covers both land and sea
regions. Oil companies use the map in the first phases of oil exploration. Previously, interesting areas were typically selected using
The map will also be worth its weight in gold when the ice in the Arctic seriously begins to melt, revealing large sea regions where it
is suspected that there are large deposits of oil underground. With our map, the companies can more quickly start to drill for oil in
the right places without first having to go through a resource-intensive exploration process, explains Ole Baltazar. Based on height
measurements instead of direct gravitation measurements: the success of the gravitational map is due in large part to the fact that
it is not based on direct gravitation measurements but on observations of the height of the sea, which reflects the gravitation.
[Julia, Foreign Policy's Moscow correspondent. 6/12/12, What will it take to push Russians over the edge
http://www.foreignpolicy.com/articles/2012/06/12/powder_keg?page=0,1 //jweideman]
There is also the economic factor to consider. The Russian economy is currently
growing at a relatively healthy 3.5 percent, but it's useful to recall the whopping
growth rates Russia was posting just a few years ago. In 2007, the year before the world financial
crisis hit Russia, Russia's GDP growth topped 8 percent. It had been growing at that pace,
buoyed by soaring commodity prices, for almost a decade, and it was not accidental that this
was the decade in which Putin made his pact with the people: You get financial and
consumer comforts, and we get political power. It's hard to maintain such a pact
when the goodies stop flowing. Which brings us to the looming issue of the Russian budget deficit. To keep
the people happy and out of politics, the Russian government has promised a lot of
things to a lot of people. (Putin's campaign promises alone are estimated by the
Russian Central Bank to cost at least $170 billion.) To balance its budget with such
magnanimity, Russia needs high oil prices, to the point where last month, the
Ministry of Economic Development announced that an $80 barrel of oil would be a
"crisis." Keeping in mind that oil is now about $98 a barrel, and that Russia used to be able to balance its budgets just fine with
oil at a fraction of the price, this doesn't look too good for Putin. Factor in the worsening European crisis -- Europe is still Russia's
biggest energy customer -- and the fact that the state has put off unpopular but increasingly necessary reforms, like raising utility
prices, and you find yourself looking at a powder keg.
[Michael, an American author and journalist who specializes in Asian economics, politics and history. He is
currently the Asia business correspondent for TIME Magazine. July 5 2012, Why Vladimir Putin Needs Higher Oil Prices
http://business.time.com/2012/07/05/why-vladimir-putin-needs-higher-oil-prices/ //jweideman]
Falling oil prices make just about everyone happy. For strapped consumers in struggling developed nations, lower
oil prices mean a smaller payout at the pump, freeing up room in strained wallets to spend on other things and boosting
economic growth. In the developing world, lower oil prices mean reduced inflationary pressures, which will give central bankers
more room to stimulate sagging growth. With the global economy still climbing out of the 2008 financial crisis, policymakers around
15.2% four years earlier. What that means is Putin requires a higher oil price to meet his
spending requirements today than he did just a few years ago. Research firm Capital Economics
figures that the government budget balanced at an oil price of $55 a barrel in 2008, but that now it balances at close to $120. Oil
prices today have fallen far below that, with Brent near $100 and U.S. crude less than $90. The farther oil prices fall, the more
pressure is placed on Putins budget, and the harder it is for him to keep spreading oil wealth to the greater population through the
Putin hasn't been scaling back even as oil prices fall. His government is earmarking $40 billion to support the economy, if necessary,
over the next two years. He does have financial wiggle room, even with oil prices falling. Moscow has wisely stashed away
a rainy day fund it can tap to fill its budget needs. But Putin doesn't have the
flexibility he used to have. The fund has shrunk, from almost 8% of GDP in 2008 to a
touch more than 3% today. The package, says Capital Economics, simply highlights the weaknesses of Russia's
economy: This cuts to the heart of a problem we have highlighted before, namely that Russia is now much more
dependent on high and rising oil prices than in the past. The fact that the share of
permanent spending (e.g. on salaries and pensions) has increased creates
additional problems should oil prices drop back (and is also a concern from the perspective of medium-
term growth). The present growth model looks unsustainable unless oil prices remain at or above $120pb.
[Maugeri, Leonardo. Global Oil Production is Surging: Implications for Prices, Geopolitics, and the
Environment. Policy Brief, Belfer Center for Science and International Affairs, Harvard Kennedy School, June
2012. http://belfercenter.ksg.harvard.edu/files/maugeri_policybrief.pdf //jweideman]
Oil Prices May Collapse. Contrary to prevailing wisdom that increasing global
demand for oil will increase prices, the report finds oil production capacity is
growing at such an unprecedented level that supply might outpace consumption.
When the glut of oil hits the market, it could trigger a collapse in oil prices. While the age
of "cheap oil" may be ending, it is still uncertain what the future level of oil prices might be. Technology may turn today's expensive
oil into tomorrow's cheap oil. The oil market will remain highly volatile until 2015 and prone to
extreme movements in opposite directions, representing a challenge for investors.
After 2015, however, most of the oil exploration and development projects analyzed in the report will advance significantly and
contribute to a shoring up of the world's production capacity. This could provoke overproduction and lead
to a significant, steady dip of oil prices, unless oil demand were to grow at a sustained yearly rate of at least
1.6 percent through 2020. Shifting Market Has Geopolitical Consequences. The United States could conceivably
produce up to 65 percent of its oil consumption needs domestically, and import the
remainder from North American sources, and thus dramatically affect the debate
around dependence on foreign oil. However, the reality will not change much, since there is one global oil market
in which all countries are interdependent. A global oil market tempers the meaningfulness of self- sufficiency, and Canada,
Venezuela, and Brazil may decide to export their oil and gas production to non- U.S. markets purely for commercial reasons.
However, considering the recent political focus on U.S. energy security, even the spirit
of oil self-sufficiency could have profound implications for domestic energy policy
and foreign policy. While the unique conditions for the shale boom in the United States cannot be easily replicated in
other parts of the world in the short-term, there are unknown and untapped resources around the globe and the results of future
exploration and development could be surprising. This, combined with China's increasing influence in the Middle East oil realm, will
continue to alter the geopolitics of the energy landscape for many decades.
[Harry Vidas, Martin Tallett, Tom O'Connor, David Freyman, William Pepper, Briana Adams, Thu Nguyen, Robert
Hugman, Alanna Bock; ICF International and EnSys Energy. March 31, 2014. The Impacts of U.S. Crude Oil Exports on Domestic Crude
Production, GDP, Employment, Trade, and Consumer Costs http://www.api.org/~/media/Files/Policy/LNG-Exports/LNG-primer/APICrude-Exports-Study-by-ICF-3-31-2014.pdf //jweideman]
The increase in U.S. crude production accompanied by a relaxation of crude export constraints would tend
to increase the overall global supply of crude oil, thus putting downward pressure on
global oil prices. Although the U.S. is the second largest oil producer in the world and could soon be the largest by 2015,
according to the International Energy Agency (IEA), the price impact of crude exports is determined
by the incremental production, rather than total production. For this study, ICF used Brent crude as the
proxy for the global crude price as affected by forces of global crude supply and demand. The impact of lifting crude exports on
Brent prices, as shown in Exhibit 4-19, is relatively small, about $0.05 to $0.60/bbl in the Low-Differential Scenario and about $0.25
geopolitical events. Changes in any of these factors could mean actual Brent prices would deviate significantly from our forecasts.
[Ron Alquist and Justin-Damien Guénette work for the Bank of Canada. 2013, A Blessing
in Disguise: The Implications of High Global Oil Prices for the North American Market, http://www.bankofcanada.ca/wpcontent/uploads/2013/07/wp2013-23.pdf //jweideman]
The presence of these unconventional sources of oil throughout the world and the ability to recover
them makes a large expansion in the physical production of oil a possibility. Recent
estimates suggest that about 3.2 trillion barrels of unconventional crude oil, including up to 240 billion barrels of tight oil, are
available worldwide (IEA 2012a). By 2035, about 14 per cent of oil production will consist of unconventional oil, an increase of 9
percentage points.
The potential for unconventional oil extraction around the world has led some
oil industry analysts to describe scenarios in which the world experiences an oil glut
and a decline in oil prices over the medium term (Maugeri 2012).
Impact-Proliferation
Russia's economy is key to stop proliferation
Bukharin 3 [Oleg, August, he is affiliated with Princeton University and received his Ph.D. in physics from the Moscow
Institute of Physics and Technology The Future of Russia: The Nuclear Factor,
http://www.princeton.edu/~lisd/publications/wp_russiaseries_bukharin.pdf,//jweideman]
There are presently no definite answers about the future of the nuclear security agenda in Russia. The Russian nuclear legacy: its nuclear forces, the
nuclear-weapons production and power-generation complex, huge stocks of nuclear-useable highly enriched uranium and plutonium, and environmental
world's nuclear industry, increased transparency of nuclear operations, and cooperative nuclear security relations with the United States and other
western countries are also essential to reducing nuclear dangers and preventing catastrophic terrorism.
level, nuclear weapons make states feel more powerful, respected and influential in world politics. When it is in their best interest,
danger of nuclear proliferation in 1963: I ask you to stop and think for a moment what it would mean to have nuclear weapons in so
many hands, in the hands of countries… there would be no rest for anyone then, no stability, no real security… there would only be
the increased chance of accidental war, and an increased necessity for the great powers to involve themselves in what otherwise
Impact-Relations
US-Russia tensions high now
Labott 14
[Elise Labott, CNN Foreign Affairs Reporter. 3/11/14, Ukraine impasse stirs U.S.-Russia tensions
http://www.cnn.com/2014/03/10/world/europe/ukraine-us-russia-tensions/ //jweideman]
Tensions between the United States and Russia over the crisis in Crimea have
exploded into an open row as Russia rejects U.S. diplomatic efforts to solve the
impasse. Russian Foreign Minister Sergey Lavrov said Monday that U.S. Secretary of State John Kerry postponed a face-to-face
meeting with Russian President Vladimir Putin to discuss American proposals, which Moscow has effectively rejected, on solving the
crisis. The meeting, which Russia said was supposed to happen Monday, would have
marked the highest-level contact between the two countries since Russian troops
took up positions in Crimea, and would have come ahead of Sunday's potentially
explosive vote on whether Crimea should split from Ukraine and join Russia. But Kerry
told Lavrov he needed to know Moscow would engage seriously on a diplomatic solution before meeting with the Russian leader.
He also wanted to see an end to Russia's "provocative steps" before traveling to
Russia. Relations between Russia and the West have grown increasingly tense since Russian soldiers seized effective
control of the pro-Russian region. The United States and other European powers
have threatened possible sanctions in response to Russia's moves, but Moscow has
shown little sign of backing down.
[Mark, contributor to Forbes, specializes in Russian economics. 6/22/12, Is Russia Suffering From Dutch
Disease? http://www.forbes.com/sites/markadomanis/2012/06/22/is-russia-suffering-from-dutch-disease///jweideman]
Part of the reason that Russia is not experiencing Dutch Disease (which is something you would
normally expect in a country that has earned such an enormous pile of money from selling oil and natural gas) is that the
world economy has been in turmoil for most of the past 4 years: there has been a
flight to quality in safe assets and currencies which has surely worked to
weaken the ruble and depress its value. The new normal is actually a pretty bizarre state of affairs, and is
characterized by any number of things, such as negative real interest rates on German Bunds and US treasuries, that ten years ago
would have seemed impossible.
Russia's economy faces an awful lot of risks, and its overdependence on natural resources is extremely dangerous, particularly at a time that global growth
is slamming to a halt. Buckley is right that Russia needs to diversify, and that its government will find this process to
be an extremely difficult and complicated one. But, at the present time, one of the very few
economic risks that Russia doesn't face is Dutch Disease: its currency isn't overvalued and, if
The fear that the Russian economy may become too dependent on the energy
sector and not sufficiently diversified has influenced monetary policy over the last
ten years. This policy was aimed at preventing the nominal appreciation of the rouble in order to maintain industrial
competitiveness. In this paper, using Rosstat and CHELEM databases, we study whether Russia
suffered the Dutch disease in 1999-2007. We do find some symptoms of it in Russia:
there was a strong real appreciation of the rouble, real wages increased,
employment decreased in manufacturing industries and rose in services sector.
However, there was no sign of a de-industrialisation, which contradicts the theory of
the Dutch disease. Indeed, industrial production increased significantly. Furthermore, the symptoms present
in Russia can be the consequences of factors other than the existence of natural
resources. The appreciation of the rouble in real terms came partly from the Balassa-Samuelson effect. The quick development
of the services sector was partly due to the fact that services were not put forward during the Soviet Union times. The outflow of labour
from the manufacturing industries resulted in an inflow of labour into the services sector rather than the energy sector.
[US China Energy Cooperation Project. 2013-11-20, U.S.-China Energy Cooperation Program (ECP) and Chinese
Institute of Electronics (CIE) Launch Joint U.S.-China Green Data Center Industrial Initiative,
http://www.uschinaecp.org/en/News/NewsDetail.aspx?nid=f74ebbdf-f9be-418a-8147-a12a7e31088b&cid=d952ba0f-3ba2-43728b26-ef635b67d638 //JWEIDEMAN]
On November 20, 2013, with support of the Chinese Ministry of Industry and
Information Technology (MIIT) and the U.S. Trade and Development Agency (USTDA),
the U.S.-China Energy Cooperation Program (ECP) and Chinese Institute of
Electronics (CIE) jointly launched the U.S.-China Green Data Center Industry
Partnership at the US China Green Data Center Workshop at the Xiyuan Hotel in Beijing. The
key issues in the development of the energy-efficient/green data center
sector, including market needs, opportunities, challenges, technology
solutions and best practices, and evaluation methods, were addressed and discussed at
the workshop. At the workshop, the two sides on behalf of the participating U.S. and Chinese
industries, signed a MEMORANDUM OF UNDERSTANDING For Cooperation on Green
Data Center to promote US-China Industry cooperation in Chinas green data center sector. Senior
officials from the Department of Energy Efficiency and Resources Utilization of MIIT, USTDA, Foreign Commercial Service and the U.S.
Department of Energy China Office of the U.S. Embassy in Beijing witnessed the signing ceremony. Industry experts from Intel, UTC,
Caterpillar, Cisco, Emerson, Fuxing Xiaocheng, NewCloud, Neusoft, Huawei, Inspur, ZTE attended the workshop. The three-year joint
program between ECP and CIE aims to provide valuable reference and living best practices
for green data center development in China through deep cooperation between
both US and China industries. Specifically, these include: 1. Green data center technical guideline, technology
catalogue, and green data center best practice portfolio. 2. Green data center related incentive plans, business models, and monitoring
and evaluation methods and systems. 3. Green data center energy saving demonstration projects, needs assessment, and
industry entry study. 4. Capacity building: green data center expert committee establishment, practical training, certificate
training, and study tours.
The United States and the Peoples Republic of China have worked together on
science and technology for more than 30 years. Under the Science and Technology
Cooperation Agreement of 1979, signed soon after normalization of diplomatic relations, our two
countries have cooperated in a diverse range of fields, including basic research in
physics and chemistry, earth and atmospheric sciences, a variety of energy-related
areas, environmental management, agriculture, fisheries, civil industrial technology,
geology, health, and natural disaster planning. More recently, in the face of
emerging global challenges such as energy security and climate change, the United
States and China entered into a new phase of mutually beneficial cooperation. In
June 2008, the U.S.-China Ten Year Framework for Cooperation on Energy and the
Environment was created and today it includes action plans for cooperation on
energy efficiency, electricity, transportation, air, water, wetlands, nature reserves and protected areas. In November 2009,
President Barack Obama and President Hu Jintao announced seven new U.S.- China
clean energy initiatives during their Beijing summit. In doing so, the leaders of the worlds two largest
energy producers and consumers affirmed the importance of the transition to a clean and low-carbon economy and the vast
opportunities for citizens of both countries in that transition. The following joint initiatives were announced in November 2009: U.S.-
$150 million in financial support from public and private sources over five years.
Electric Vehicles Initiative. This initiative includes the joint development of standards for charging plugs and testing protocols of
batteries and other devices, demonstration projects in paired cities to collect and share data on charging patterns and consumer
preferences, joint development of technical roadmaps, and public education projects. Energy Efficiency Action Plan. Both
governments are working together with the private sector to develop energy efficient building codes and rating systems, benchmark
industrial energy efficiency, train building inspectors and energy efficiency auditors for industrial facilities, harmonize test
procedures and performance metrics for energy-efficient consumer products, and exchange best practices in energy efficiency
opportunity to develop clean energy solutions that will reduce pollution and improve energy security while enhancing economic
growth globally.
Will elevated oil prices generate a sharp slowdown? The recent run-up in oil prices
has investors worried that a sharp slowdown in global growth lies ahead. Worse,
with the oil market increasingly pricing the risk that the shock may be permanent (the Dec 2012 futures contract trades at around US$109 per barrel), there is even talk of the possibility of a
global recession. We think not, for two broad reasons. First, oil prices constitute
wealth redistribution, not wealth destruction. And second, this time is different: as
things stand, we think that the current oil shock is unlikely to harm growth much.
Higher oil prices redistribute wealth rather than destroying it. From a global
perspective, higher oil prices do not mean that wealth is destroyed. Rather, it is
redistributed - from net oil importers to net exporters (see below on The Effects of Oil Price Shocks).
While this wealth transfer can be substantial, much of it is, over time, recycled back into net oil-importer economies: Oil exporters will spend some of their newly gained oil wealth on imports from net importers;
hence, increased import demand replaces some of the domestic demand lost due to
the wealth transfer; Oil exporters will also purchase assets in net importer
economies, providing support for asset markets there. Even so, oil shocks have
always been redistributive - yet arguably they have done considerable damage to the global economy in the past.
The 1970s, when oil price spikes preceded two successive recessions, are held up as the prime examples of the harm that oil can do
to the economy. So, it is tempting to look at the effects of previous oil shocks in order to infer the consequences of the current one.
We would urge caution, however, as all else is not equal. Many things are different this time, so a given increase in the price of oil
US$80 per barrel to about US$100 was due to demand, in our view, with the oil market repricing the global growth and inflation
trajectory. It is only the recent rise, due to events in the Middle East, that constitutes a supply shock. That is, supply has accounted
doubled, with little evident harm to the global economy. But exogenous, supply-induced shocks tend to be stagflationary in nature.
In the dog metaphor, an oil supply shock could act as a jerk on the leash, which could bring the dog to a halt. 2. Initial Conditions
This recovery is still in relatively early stages and thus more robust than a late-cycle recovery. This is partly because personal
savings rates are higher early in the cycle than late in the cycle, providing more of a cushion for consumer spending. Also, corporate
profit margins are usually higher early on in the cycle than late in the cycle, and companies can thus more easily absorb the cost
push from higher oil prices. Most importantly, global monetary conditions are very loose, providing a cushion to the real economy. 3.
Structure of the Economy: The oil intensity of global GDP is lower. That is, less oil is used
today to produce a dollar of GDP. Hence, a given percentage increase in the
inflation-adjusted price of oil will do less damage to the economy now. Labour
markets are more flexible: The effect of an oil supply shock on employment and output is larger, the less real wages
adjust in response. More flexible labour markets make real wages more responsive to supply shocks, thus dampening the effects on
production and employment. Monetary policy is more credible: A more credible central bank needs to act less, all else equal, to
achieve a given objective. In the case of an oil shock, the implication is that monetary policy needs to do less harm to the real
economy to achieve a given dampening effect on inflation. 4. Policy Response in Net Importing Economies: 'Rational Inaction' Means
Central Banks Will Keep Rates on Hold The policy response to an oil shock is an important determinant of the overall economic
effect. If central banks were to tighten in response to inflation, the dampening effect of the initial oil shock on the economy would be
amplified. We have just argued, in very general terms, that a more credible monetary policy requires less of a response to achieve a
desired effect. In this particular case, however, we do not expect much of a response in the first place: in our view, most central
banks will not tighten policy aggressively in response to the recent surge in oil prices. This is likely to be true for both DM and EM
monetary authorities. Again, this reaction - or rather, the lack of one - is rational: it is virtually impossible to forecast whether the
current supply worries will abate or not, so a 'wait and see' stance makes sense. 5. Oil Exporters' Behaviour May Be Materially
Different Oil exporters will likely spend more of the wealth transfer than usual: With the risk of social unrest increasing,
governments will be inclined to increase spending and transfers in order to maintain stability. This suggests that a larger part of the
initial wealth transfer from net exporters to net importers will be reversed, over time, through goods imports of the former from the
latter. At current prices, oil producers will probably want to make up a meaningful part of the Libyan production shortfall over the
medium term - not least because doing so will generate additional revenue with which social stability can be bought. Our commodity
strategy team thinks that there is enough spare capacity to do so (see Crude Oil: MENA Turm-OIL, February 25, 2011).
Farms DA
1NC
Water pollution from runoff on US farms has avoided
regulation so far but continued exemption is not guaranteed
David C. Roberts 9, assistant professor in the Department of Agribusiness and
NPS
sources subject to the CWA's effluent limitations to satisfy some of these restrictions by leasing effluent reductions from NPS. This has come to be known
as water quality trading (USEPA 2003b, 1996). States, attracted by the possibility of reducing pollution abatement costs, have implemented a number of
water quality trading (WQT) programs (Breetz et al.). States have also stepped into the void by enacting statutes that address
NPS pollution by requiring landowners to observe mandatory setback or buffer requirements along waterways. These statutes
tend to be limited in their application to specific activities or waterways, or both. For example, while the
Georgia Erosion and Sedimentation Control Act prohibits land-disturbing activities within 7.6 meters (25 feet) of the banks of any state water, it
exempts a fairly extensive list of activities from the prohibition, including agricultural practices, forestry land
management practices, dairy operations, livestock and poultry management practices, and construction of farm buildings (Georgia Department of Natural
Resources). North Carolina has adopted regulations designed to protect a 15.2-meter (50-feet) riparian buffer along waterways in the Neuse and Tar-Pamlico River Basins (North Carolina Department of Environment and Natural Resources). Virginia's Chesapeake Bay Preservation Act provides for 30.4-meter (100-feet) buffer areas along specifically designated Resource Protection Areas, but allows encroachments into buffer areas for agricultural and
silvicultural activities under certain conditions (Chesapeake Bay Local Assistance Board). Riparian buffer strips (areas of trees, shrubs, or other
vegetation along surface water bodies) have also played a prominent role in federal and state voluntary programs to reduce NPS pollution. For example,
the Conservation Reserve Program has supported the use of buffer strips since 1988 (Prato and Shi) and funds made available to states through the CWAs
Section 319 program are often used to subsidize buffer strip installation (Nakao and Sohngen). Buffer strips are proving an attractive policy option because
of their effectiveness in intercepting and removing nutrients, sediment, organic matter, and other pollutants before these pollutants enter surface water
and because of the other environmental benefits they provide, including improved terrestrial and aquatic habitat, flood control, stream bank stabilization,
that alternatives to regulation will continue, if not increase. Many of these will promote the installation of buffer strips through subsidies of one form or
another. For example, the Conservation Reserve Enhancement Program, which subsidizes conservation-oriented practices by landowners, has established
a goal of installing 157,000 miles of riparian buffers, filter strips, and wetlands in states in the Chesapeake Bay watershed (Bonham, Bosch, and Pease).
IOOS leads to new policies to prevent harmful algal blooms--results in stricter regulation on non-point sources of nutrient
pollution like agriculture
Dr. Donald Anderson 8, Senior Scientist in the Biology Department of the Woods
Hole Oceanographic Institution and Director of the U.S. National Office for Marine
Biotoxins and Harmful Algal Blooms, PhD in Aquatic Sciences from MIT, Written
testimony presented to the House Committee on Science and Technology,
Subcommittee on Energy and Environment, July 10 2008,
http://www.whoi.edu/page.do?pid=8915&tid=282&cid=46007
These are but a few of the advances in understanding that have accrued from ECOHAB regional funding. Equally important are the discoveries that
Numerous monitoring programs of this type have been established in U.S. coastal waters, typically by state agencies. This monitoring has become quite
expensive, however, due to the proliferation of toxins and potentially affected resources. States are faced with flat or declining budgets and yet need to
monitor for a growing list of HAB toxins and potentially affected fisheries resources. Technologies are thus urgently needed to facilitate the detection and
characterization of HAB cells and blooms. One very useful technology that has been developed through recent HAB research relies on species- or strainspecific probes that can be used to label only the HAB cells of interest so they can then be detected visually, electronically, or chemically. Progress has
been rapid and probes of several different types are now available for many of the harmful algae, along with techniques for their application in the rapid
and accurate identification, enumeration, and isolation of individual species. One example of the direct application of this technology in operational HAB
monitoring is for the New York and New Jersey brown tide organism, Aureococcus anophagefferens. The causative organism is so small and non-descript
that it is virtually impossible to identify and count cells using traditional microscopic techniques. Antibody probes were developed that bind only to A.
anophagefferens cells, and these are now used routinely in monitoring programs run by state and local authorities, greatly improving counting time and
accuracy. These probes are being incorporated into a variety of different assay systems, including some that can be mounted on buoys and left
unattended while they robotically sample the water and test for HAB cells. Clustered with other instruments that measure the physical, chemical, and
optical characteristics of the water column, information can be collected and used to make algal forecasts of impending toxicity. These instruments are
taking advantage of advances in ocean optics, as well as the new molecular and analytical methodologies that allow the toxic cells or chemicals (such as
includes arrays of moored instruments as sentinels along the U.S. coastline, detecting HABs as they develop and radioing the information to resource
managers. Just as in weather forecasting, this information can be assimilated into numerical models to improve forecast accuracy.
agriculture. Farm Bureau has a long history of supporting market-based approaches to improving the environment. We have also encouraged states to
include trading in their toolbox to help implement state water quality programs because trading and offsets can reduce costs associated with achieving
environmental improvements. Even with that history of support, however, farmers and ranchers remain skeptical of trading programs in general and
those associated with water quality specifically, and for good reason. Farmers grow things and understand that nutrient enrichment is a predictable
outcome of all human activities, not just farming and ranching activities. Farmers understand that agricultural activities, like those in virtually every
other part of life (shopping malls, golf courses, residential areas, to name just a few), can affect the amount of nutrients that reach our waters. The
fact is, each and every one of us plays a role in water quality; we all contribute to nutrient loading, either directly or indirectly, through the food we
heavily on pollution prevention and reduction based on the concept of polluter pays. For conventional pollutants, this approach has resulted in costly
sources. The fact is, nutrients are critical for optimal productivity in agriculture
even though farmers and ranchers are striving for the best possible ecosystem
function. Managing nutrients is extremely complicated because there is not one
practice, technology or approach that can optimize nutrient utilization throughout
the environment. Therefore, we need policy options that are balanced. We must develop solutions that optimize outcomes. We all want: 1)
safe, affordable and abundant food, fiber and fuel; 2) vibrant and growing communities with jobs and expanding economic activity; and 3) fishable and
swimmable waters. The challenges presented by trading and offset programs are the complex interplay of economic scenarios that could play out over
time when such programs are taken to their logical conclusions. For example, if regulatory offsets
are required for any new development or for expanding economic activity, one would expect a regulatory offsets process to trade low-value economic activity for high-value activity. In real life,
however, such a program would not be likely to require an old home to be torn down before a new home could be constructed. Likewise, the construction
and operation of a new manufacturing facility and the jobs inherent to that economic activity would not likely come at the expense of other high-value
suitable for building a factory could be valued at $100,000 or more per acre, while land in the same geographic area suitable to produce corn or soybeans
could be valued at $10,000 per acre. In a market-based system, it would appear to be only rational to extinguish the environmental externalities
generated by the farmland to offset the externalities associated with the higher value economic activity of manufacturing. While this may be an extreme
example, the reality is that the nation has never used water quality as a mechanism to cap or, in some cases like the Chesapeake Bay, reduce economic
activity. The long-run reality for farmers and ranchers would be that, over time, rural
areas will have fewer and fewer means to sustain themselves. Trading and Offsets are Creatures of
State Statutes The Clean Water Act leaves the task of controlling water pollution largely to the states; it expressly recognizes, preserves and protects
"the primary responsibilities and rights of States to prevent, reduce, and eliminate pollution [and] to plan the development and use of land and water
resources." Authorized federal involvement in state actions is carefully limited. Under no circumstances does the act authorize EPA to assume state
responsibility to develop a planning process or a Total Maximum Daily Load (TMDL) implementation plan. It is within these contexts that trading programs
are often contemplated. As such, states may implement trading and offsets programs established under state laws. In addition, states retain the flexibility
to choose both if and how to use trading in the implementation of state water quality programs. Nutrient Standards May Not be Affordable or Attainable
Without Trading Optimizing nitrogen and phosphorus utilizations through trading may hold potential, but there are significant scientific, market and
regulatory challenges. First, from a scientific standpoint, there is no direct relationship between agricultural nutrient management practices and nutrient
loss. If the relationship were direct, trading would be straightforward, transparent and enable orderly operations of markets. Second, under the Clean
Water Act, states establish and EPA approves water quality standards and criteria. States are currently feeling pressure from EPA to adopt default numeric
nutrient standards and criteria based on the level of nutrients found in pristine waters. Such an approach holds the prospect of establishing standards that
standards that are not based on reference waters can be unachievable and require costly control and management measures. EPA and States Are
Imposing Barriers for Markets and Trading Achieving the environmental and economic goals of point source - nonpoint source (PS-NPS) water quality
trading depends on having understandable rules that clearly define what is being traded and the parameters of the exchange. Trading rules and
procedures establish who can trade, what is traded (credit definition), duration of a credit, baseline requirements (for calculating credits), accepted
procedures for calculating credits, how the trade occurs, trading ratios, verification, liability rules, and enforcement procedures. In theory, trading
assumes market participants have full information about the cost and effectiveness of their nutrient reduction options and can instantly and, at little to no cost, obtain information on credit market prices and quantities. However, in the real world people are faced with limited time, resources, skills and
acquaintance with markets. Complex rules and inadequate institutional design can result in poor buyer or seller participation, coordination failures and
lack of desired outcomes. (Shortle, 2013). In fact, ex-post assessments of PS-NPS water quality trading programs already in existence have generally
been negative about their performance. Most have seen little or no trading activity, with the expected role for nonpoint sources unrealized. A number of
reasons have been presented including a lack of trading partners (due to limited regional scale or underlying economics), inadequate regulatory
incentives, uncertainty about trading rules and practice performance, excessively high PS-NPS trading ratios (increasing the cost of nonpoint credits), legal
and regulatory obstacles (including liability concerns), high transaction costs, and participant unfamiliarity and inexperience. Pennsylvania's experience
with water quality trading illustrates a number of the challenges I have mentioned. For example, the rules underlying Pennsylvania's nutrient credit
trading program, created in large part in response to an EPA mandate to reduce pollution in the Chesapeake Bay watershed, are the product of a multi-year stakeholder negotiation process that was codified in regulation in 2010. However, shortly thereafter, EPA announced that it would undertake a review
of the offset and trading program in each Chesapeake Bay jurisdiction. EPA's assessment included questions about whether or not Pennsylvania's
agricultural trading baseline met the requirements of the TMDL, in spite of the fact that trades had already taken place under the program rules in place at
the time. Further, the assessment included EPA's expectations that Pennsylvania would demonstrate that the existing baseline was sufficient to meet the
TMDL, or otherwise make necessary adjustments to the baseline acceptable to EPA. In response to EPA's review, Pennsylvania has since proposed a
number of possible changes to its trading program that have raised serious questions among existing and potential credit generators and users about
what the rules governing the market for credits will look like going forward. Specifically, many are concerned about what happens to non-point source
credit generators, primarily farmers, who have generated and sold credits under Pennsylvania's existing program, and who may have long-term
commitments to provide credits for years into the future. The uncertainty is not conducive to sustaining a successful, transparent, long-term water quality
credits less expensively than other nonpoint and point sources. Whether or not this is true depends heavily on the trading rules and procedures described
previously. Baseline requirements represent one trading rule that has an important impact on agriculture's ability to be the low-price supplier of credits.
Baseline requirements establish the level of stewardship farmers and ranchers perform on a parcel of land before they are eligible to participate in the
trading program and actually produce credits for sale. Any abatement necessary to meet the baseline cannot be sold as credits, but is instead credited to
meeting the load allocation for agriculture. When baselines are more stringent than current practices, a farmer would only be willing to create and sell
credits if the expected credit price were high enough to cover the cost of meeting the baseline plus the cost of any measures taken to produce additional
abatement. This increases the cost of supplying credits, and reduces the amount of credits purchased by point sources. Current research suggests that
concerns about baseline requirements are well founded. Stephenson et al. (2010) found that when the baseline is more stringent than current practices,
agricultural credit costs for nitrogen can surpass costs per pound (for marginal abatement) for point sources because the baseline has claimed the lowest-cost pollutant reductions. Ghosh et al. (2011) found that Pennsylvania's baseline requirements significantly increased the cost of entering a trading
program, making it unlikely that nonpoint sources that could reduce nutrient losses for the lowest unit costs would enter the market. Wisconsin has
expressed concern that EPA's approach to defining baselines could obstruct agricultural sources' participation in trading programs and possibly impede
water quality improvements (Kramer, 2003). The impact of baseline requirements is a crucial matter and fundamental to the successful operation of any
trading program, though its impact is not unique. Any trading rule or requirement that is incorrectly developed can have similar effects: fewer nonpoint
source credits purchased by point sources, and total abatement costs for regulated sources higher than they could have been. As a regulatory agency,
EPA appears to have difficulty appreciating the realities of how markets function. The agency is not necessarily tasked with creating private markets and
most people would probably agree that the agency has difficulty appreciating the realities of how real markets function. As a result, environmental
markets are suffering from a significant credibility crisis. This ultimately results in skeptical farmers and ranchers who then take a cautious approach to
way to capture the benefit associated with incremental nutrients lost to the
environment. Farmers today use some of the most advanced technology in the
world to optimize their productivity. From precision application using 4R nutrient stewardship to GPS technology,
farmers and ranchers are committed to improving their production efficiencies, a
fact that allows them in turn to reduce their environmental footprint. 4R nutrient stewardship is an
effective concept that allows a farmer to use the right fertilizer source, at the right rate, at the right time and with the right placement to optimize nutrient
utilization, while precision agriculture is a farming system that uses technology to allow closer, more site-specific management of the factors affecting
crop production. For example, in precision agriculture, utilizing GPS and yield monitors, farmers can measure their output more precisely by matching
yield data with the location in the field. Special computer-driven equipment can change the rate at which fertilizers, seeds, plant health products and other
inputs are used, based on the needs of the soil and crop in a particular portion of a field. Farmers have embraced precision agriculture and the 4R
philosophy because it is an innovative and science-based approach that enhances environmental protection, expands production, increases farmer
profitability and improves sustainability. Conclusion Your
food, fiber and fuel, and the members of Farm Bureau want the chance to provide them.
Farmers are concerned about the environment. As technology evolves so do farmers. We take advantage of technology, new practices and programs in
order to not only provide safe, affordable and abundant food, fiber and fuel, but also to protect our land, water and air resources.
Extinction
Richard Lugar 2k, Chairman of the Senate Foreign Relations Committee and
Member/Former Chair of the Senate Agriculture Committee, calls for a new green
revolution to combat global warming and reduce world instability,
http://www.unep.org/OurPlanet/imgversn/143/lugar.html
In a world confronted by global terrorism, turmoil in the Middle East, burgeoning nuclear threats and other crises, it is easy to lose
sight of the long-range challenges. But we do so at our peril. One of the most daunting of them is meeting the world's need for food
and energy in this century. At stake is not only preventing starvation and saving the environment, but also world peace and security.
History tells us that states may go to war over access to resources, and that poverty and famine have
often bred fanaticism and terrorism. Working to feed the world will minimize
farm in Marion County, Indiana, for example, yields on average 8.3 to 8.6 tonnes of corn per hectare, typical for a farm in central
Indiana. To triple our production by 2050, we will have to produce an annual average of 25 tonnes per hectare. Can we possibly
boost output that much? Well, it's been done before. Advances in the use of fertilizer and water, improved machinery and better
tilling techniques combined to generate a threefold increase in yields since 1935; on our farm back then, my dad produced 2.8 to 3
tonnes per hectare. Much US agriculture has seen similar increases. But of course there is no guarantee that we can achieve those
results again. Given the urgency of expanding food production to meet world demand, we must invest much more in scientific
research and target that money toward projects that promise to have significant national and global impact. For the United States,
that will mean a major shift in the way we conduct and fund agricultural science. Fundamental research will generate the
innovations that will be necessary to feed the world. The United States
revolution. And our success at
role in the survival of billions of people and the health of our planet.
AT Political Opposition
Their evidence overestimates the farm lobby's clout---the farm
bill proves it's too weak to influence policies
Ron Nixon 13, NY Times, July 3, 2013, Farm Bill Defeat Shows Agriculture's
Waning Power, http://www.nytimes.com/2013/07/03/us/politics/farm-bill-defeat-shows-agricultures-waning-power.html
WASHINGTON The startling failure of the farm bill last month reflects the declining clout of
the farm lobby
and the once-powerful committees that have jurisdiction over agriculture policy,
economists
this week. Although a number of factors contributed to the defeat of the bill
including Speaker John A. Boehner's failure to rally enough Republican support and Democratic opposition to $20 billion in cuts to
the food stamps program analysts said
the 234 to 195 vote also illustrated the shift in the American
population and political power to more urban areas. There are a small number of Congressional
districts where farming continues to carry much sway, said Vincent H. Smith, a professor of agricultural economics at Montana
State University. Especially
work in farming in the United States, or about 2.5 percent of the total work force. Farming now accounts for about 1 percent of
gross national product, down from a high of about 9 percent in 1950.
farming districts, according to research by Mr. Smith in 2006. He said that number was probably smaller today.
prevent (or even appreciably slow) such regulation, and time is running out. As GHG emission controls get imposed, and companies
invest in the necessary control technologies, the political support for constraining EPA authority over GHGs will decline. Utilities may not want to give up
coal or install costly carbon capture technologies, but once they've made these investments they will hardly want to see the regulations removed. If these
rules are not stopped soon, it will be too late. This is a reason even those who refuse to accept the likelihood of climate change should consider
alternatives to command and control regulation. Shouting No is insufficient to ensure success.
AT Labor Shortages
Farm labor shortages are a myth that repeatedly fails to pan
out
John Carney 12, CNBC, More Data on The Phony Farm Labor Crisis, 30 Aug
2012, http://www.cnbc.com/id/48847903
It's become something of an annual tradition. Every summer, newspapers around the country
roll out stories of a labor shortage on farms. The fruit is going to rot in the
orchards, crops will go unpicked, agricultural communities will be devastated unless
something is done, the stories predict. Here's a pretty typical version of the story, as told by the San Francisco Chronicle: But the American Farm Bureau
Federation and its California chapter believe there is plenty of reason to worry. "There have been instances in which growers had to disc up whole crops
because they didn't have the workforce to harvest," said Kristi Boswell, the Farm Bureau's director of congressional relations. She points to Georgia, where
whole tomato fields were plowed under last year. "The workforce has been decreasing in the last two to three years, but last year it was drastic." And
Chronicle puts it: Growers have tried to hire more domestic workers and had hoped that with unemployment rates so high, they'd find local labor. "But
domestic workers don't stick around to finish the job," Boswell said. "It's labor-intensive and often involves hand-picking in grueling conditions." Earlier
the farm data was that the data about national farm income might have been concealing more local crises. Perhaps the national numbers had come in well
despite the alleged farm crisis, because fields of soy and corn in the Midwest had done well, while the apples of Washington, peanuts of Georgia, and
aging Mexican population, drug wars at the border and a weakened job market in the United States, the flow of migrants has stopped and may actually
have reversed, according to the Pew Hispanic Center, a nonprofit, nonpartisan public policy research firm that has been studying the trend.
So what happened? Farm profits, what the Ag Department calls net cash income, in California rose from $11.1 billion
in 2010 to $16.1 billion in 2011, an eye-popping 45 percent growth. In Washington, the apple harvest was
going to be devastated by a labor shortage. Farm profits instead rose from $1.98 billion to $3.14 billion, a 58
percent rise.
Georgia was going to have a rancid harvest due to its farm labor shortage, according to The
Georgia Report. I guess those peanuts picked themselves, because farm profits in the state rose from $2.5 billion to $2.6 billion.
The mythological Arizona farm labor shortage was supposedly destroying its farm
sector. Somehow or another, farm profits rose from $734 million to $1.3 billion. Not every state saw
profits boom. Farms in Arkansas, Delaware, Hawaii, Louisiana, New Hampshire, North Carolina, Rhode Island and South Carolina did less well in 2011 than
2010. Alabama's farms saw net cash income fall off a cliff, from $1.2 billion to $773 million. But
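The percentage claims in this card can be sanity-checked with simple percent-change arithmetic. A minimal Python sketch, using only the dollar figures quoted in the passage (values in billions; the state labels are just for printing):

```python
# Sanity-check the year-over-year farm profit growth figures quoted above.
# Percent change = (new - old) / old * 100, using the 2010 and 2011 values
# (in billions of dollars) given in the card.
profits_bn = {
    "California": (11.1, 16.1),   # card claims "45 percent growth"
    "Washington": (1.98, 3.14),   # card claims "a 58 percent rise"
    "Georgia":    (2.5, 2.6),
    "Arizona":    (0.734, 1.3),
    "Alabama":    (1.2, 0.773),   # the one decline the card notes
}

for state, (y2010, y2011) in profits_bn.items():
    pct = (y2011 - y2010) / y2010 * 100
    print(f"{state}: {pct:+.1f}%")
```

California works out to about +45.0% and Washington to about +58.6%, consistent with the card's rounded figures; Alabama's decline computes to roughly -35.6%.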
http://www.thedailybeast.com/articles/2013/05/24/famers-solve-labor-shortage-by-raising-pay.html
Farm owners have responded to fears of a labor shortage by actually raising
wages by a little bit. The Department of Agriculture reports: Farm operators paid their hired
workers an average wage of $11.91 per hour during the April 2013 reference week, up 4 percent
from a year earlier. Field workers received an average of $10.92 per hour, up 4 percent from a year earlier.
Livestock workers earned $11.46, up 51 cents. The field and livestock worker combined wage rate, at $11.10 per
American farmer. Farm profits are expected to rise by more than 13 percent this
year, to more than double what they were as recently as 2009.
AT Brazil Solves
Brazil doesn't solve---lack of infrastructure
Alistair Stewart 13, South America Correspondent for DTN/The Progressive
Farmer, 8/6/13, Brazilian Farming's Biggest Problem,
http://www.dtnprogressivefarmer.com/dtnag/view/blog/getBlog.do;jsessionid=2F48BC2D3422FCE86B5624EA3DE307B4.agfreejvm1?blogHandle=southamerica&blogEntryId=8a82c0bc3e43976e014055b03e641466
Everybody knows the biggest problem facing Brazilian agribusiness. It's logistics. When the
cost of transporting soybeans from the fields in central Mato Grosso to a China-bound ship reach 40% of the commodity's value, you
over the last year. The freight cost from Sorriso (center-north Mato Grosso) to Paranagua port rose 50% this season,
reaching $3.19 per bushel at the peak of the soybean harvest, while ships were waiting for up to 70 days to load beans at
Paranagua. With demurrage costs (basically ship rental) at $15,000 to $20,000, that wait can be very expensive. "While we don't
resolve the logistics questions, we are inviting other players to enter into the game," said Julio Toledo Piza, chief executive at
works out a cheap way of delivering the grain, feed and meat. If not, alternative
producers in South America, Africa and Asia will grow. "This is a huge opportunity that we can't let slip through our grasp," said
Jorge Karl, director of OCEPAR, the farm coops association in Parana state. Governors have belatedly woken up to the logistical
chaos that surrounds them and the federal government has recently given greater priority to infrastructure development. As a
result, a series of plans to improve infrastructure have started moving forward and progress has been made on key grain corridors,
such as the North-South railway, which will eventually connect the new eastern Cerrado soybean fields to ports in the North. "The
situation is vastly improved ... We hope that in six to seven years we can clear the logistics deficit," said Bernardo Figueiredo, head
For example, farmers have been waiting for asphalt along the BR163 highway, which connects central Mato Grosso with northern
Amazon ports, for a decade. Work has progressed over the last two years, leading many to believe the Transport Ministry forecast
that it will be ready in 2014. However, just this week the Ministry admitted that the asphalt may only be completed in 2015, or
after. Similarly, an East-West railway that will connect the soy and cotton fields of western Bahia to the sea is due to be complete in
2015, but work is yet to start on many stretches and, realistically, beans will flow along this route in 2018 at the earliest. Many
are told that things are improving but we can't wait. The inability of the government to deliver projects on time is unacceptable,"
according to Carlos Favaro, president of the Brazilian Soybean and Corn Growers Association (APROSOJA-MT). None of the major
new grain infrastructure projects will be ready for upcoming 2013-14 season, and with soybean area set to grow by 5% or more, the
logistical chaos could deepen next year. "We are going to have to muddle through next season," said Luis
Carlos Correa Carvalho, president of the Brazilian Agribusiness Association (ABAG). "Realistically, the situation is only likely to
improve in 2016-17," he added. The
production
over the next couple of years, but there remains so much pent-up demand for logistics in Brazil that any new
export corridor will be inundated as soon as it opens. Speakers at the conference agreed that the fault for the current situation lies
with government incompetence, as there are ample funds to invest. "Credit is everywhere. The money isn't the problem. The
government has to allow it to be invested," said Ingo Ploger, president of the Business Council of Latin America (CEAL). So why is
government so sluggish? For years, the problem was a lack of political will. Simply, farm infrastructure was not a vote winner. More
recently, the realization that Brazil needs farm revenues to underpin the foreign accounts led President Dilma Rousseff to prioritize
drawn up are often of poor quality and environmental and other authorities don't have the expertise to assess them in a timely manner.
That's a big reason why processes are delayed," he explained to the conference. Meanwhile,
corruption remains
means government auditors are quick to suspend projects when suspicions arise. Until
these problems are solved, Brazil will continue to have production costs 10% above
its competitors in the U.S. and Argentina, noted CEAL's Ploger.
food situation even more precarious. Food prices, already elevated, will follow the price of corn upward, quite
Ext UQ
Nutrient runoff from farms is currently unregulated
Ken Kirk 12, Executive Director at the National Association of Clean Water
Agencies, JD from Georgetown University Law Center and Masters in Environmental
Law from GWU Law School, June 20 2012, Is the Clean Water Act Broken and Can
We Fix It? http://blog.nacwa.org/is-the-clean-water-act-broken-and-can-we-fix-it/
Much has been written and will continue to be written about the Clean Water Act this
year as we celebrate the Act's 40th anniversary. I don't think anyone would argue the fact that the Act has been
great for our waterways or that we would be much worse off without it. But now weve entered a much more
difficult and complex period in the history of the Clean Water Act. Federal money continues to dry up. Costs
continue to increase. And requirements have become more stringent. We also have many new challenges to deal
with including climate change, altered weather patterns, and population growth, to name just a few. And
still we're struggling with how to address wet weather and related runoff, particularly
as this relates to nutrients, arguably one of our biggest clean water issues right now. And yet, nonpoint
sources remain unregulated. How has this been allowed to continue? Let's look back in time for a
minute. Before the Clean Water Act, discharge was largely unregulated. We know this was
not a good thing. Then, during the first 20 years of the Act, the focus was on wastewater
treatment from domestic and industrial point sources and the broad use of secondary treatment
to accomplish the Acts goals. I believe this was the right thing to do at the time. Unfortunately, this entire
time, nonpoint sources, including agricultural operations, have completely
avoided regulatory responsibility for their share of the water pollution problem.
If we continue to ignore this very major source of water quality impairment, then it will come at great cost to all
taxpayers.
He continues:
for World War 3 is not so much a confrontation of super powers and their allies, as a festering, self-perpetuating chain of resource conflicts. He also says: The wars of the 21st Century are less likely to be
global conflicts with sharply defined sides and huge armies, than a scrappy mass of failed states, rebellions, civil strife, insurgencies,
terrorism and genocides, sparked by bloody competition over dwindling resources. As another workshop participant put it, people
do not go to war to kill; they go to war over resources, either to protect or to gain the resources for themselves. Another observed
that hunger results in passivity, not conflict. Conflict is over resources, not because people are going hungry. A study by
the International Peace Research Institute indicates that where food security is
Strategic and International Studies and the Oslo Peace Research Institute, all identify
famine as a potential trigger for conflicts and possibly even nuclear war.
Division at the Center for Energy Studies at Louisiana State University, PhD and
Allan G. Pulsipher, Professor of Energy Policy in the Center for Energy Studies and
Professor in the Department of Environmental Studies at Louisiana State University,
PhD in Economics from Tulane University, The potential value of improved ocean
observation systems in the Gulf of Mexico, Marine Policy 28 (2004) 469-489,
Science Direct
Waterways draining into the GOM transport wastes from 75% of US farms and ranches,
80% of US cropland, hundreds of cities, and thousands of industries located upstream of the GOM coastal zone. Urban
and agricultural runoff contributes large quantities of pesticides, nutrients , and fecal coliform
bacteria. Activities that have contributed or are still contributing to the degradation of coastal water conditions along the Gulf
Coast include the petrochemical, industrial, agricultural, power plants, pulp and paper mills, fish processing, municipal wastewater
the greatest concerns for GOM coastal water quality is an excess of nutrients, which
can lead to noxious algal blooms, decreased seagrasses, fish kills, and oxygen-depletion events.
Improved ocean
Oceanography from MIT and Woods Hole Oceanographic Inst., and Burt Jones,
Adjunct Professor of Biological Sciences at USC, PhD in Biological Oceanography
from Duke, Making Use of Ocean Observing Systems: Applications to Marine
Protected Areas and Water Quality, Sept 25-26 2007,
http://calost.org/pdf/resources/workshops/OceanObserving_Report.pdf
Recent summaries of available information have indicated that coastal ecosystems have witnessed a general increase
in the occurrence and severity of harmful and toxic algal blooms. The coastline of California is no exception,
with powerful neurotoxins such as saxitoxin and domoic acid now commonly observed throughout the state. Numerous factors have been implicated as possible contributors to these
Harmful blooms can result from natural, seasonal supply of nutrients to coastal waters during upwelling and from
anthropogenic inputs of nutrients in river discharges and land runoff. Unfortunately, quantifying
the contribution of the many potential sources of nutrients that can support algal blooms is a daunting
task because of the difficulty of maintaining continuous observations in the ocean.
Harmful algal blooms are ephemeral
events that can develop quickly and dissipate before their causes can be adequately characterized.
Efforts by municipalities, counties and the state to provide responsible environmental stewardship of coastal waters are often
thwarted by the lack of sufficient observational capabilities to document water quality, let alone
determine the cause(s) of adverse events such as harmful algal blooms. Partnerships are desperately needed between ocean observing programs,
research/academic institutions, coastal managers and monitoring programs (including both government and nonprofit) to grapple with the increasing number of environmental
influences. [Sensors] carried by mobile surface and underwater vehicles provide a constant presence in the ocean that assists scientists to detect impending or emerging events, and guide their sampling
effort. Aquatic sensors can provide information on salinity (which can identify sources of freshwater input to the coastal ocean), temperature (which yields information on the physical
structure of the water column, a primary factor affecting algal growth in nature), chlorophyll fluorescence (which documents the biomass of algae in the water) and dissolved oxygen
(which indicates biological activity and ecosystem health). A variety of ever-improving nutrient sensors can quantify specific substances that may stimulate algal growth (e.g.,
ammonium, nitrate, phosphate). In addition, new sensors that can detect specific microorganisms and/or toxins produced by these species are under development and eventually will
increase the capabilities of ocean observing systems. These [sensors will expand our]
ability to document harmful or toxic events, and aid our attempts to attribute these events to
specific environmental causes.
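The sensor suite described in this card (salinity, temperature, chlorophyll fluorescence, dissolved oxygen, nutrients) maps naturally onto a simple observation record and screening check. The sketch below is illustrative only: the record fields and threshold values are invented for the example, not taken from the workshop report.

```python
from dataclasses import dataclass

@dataclass
class WaterQualityObs:
    salinity_psu: float        # identifies freshwater input to the coastal ocean
    temperature_c: float       # physical structure of the water column
    chlorophyll_ug_l: float    # proxy for algal biomass
    dissolved_o2_mg_l: float   # biological activity / ecosystem health
    nitrate_umol_l: float      # one of several nutrient stimulants

def bloom_flags(obs, chl_high=20.0, o2_low=4.0, no3_high=10.0):
    """Screen a single observation for conditions worth a closer look.
    Threshold values are illustrative placeholders, not regulatory limits."""
    flags = []
    if obs.chlorophyll_ug_l >= chl_high:
        flags.append("high algal biomass")
    if obs.dissolved_o2_mg_l <= o2_low:
        flags.append("oxygen depletion")
    if obs.nitrate_umol_l >= no3_high:
        flags.append("elevated nutrients")
    return flags

obs = WaterQualityObs(33.1, 16.4, 27.5, 3.2, 14.0)
print(bloom_flags(obs))  # ['high algal biomass', 'oxygen depletion', 'elevated nutrients']
```

A continuous observing system would run a check like this over every incoming record, which is exactly the "constant presence" the card argues scripted shipboard sampling cannot provide.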
difficult to implement without a marked increase in budgeting for individual farm permitting, monitoring and
enforcement. Local variations in weather, soil salinity, soil erosion potential, leaching
potential, and freshwater availability present further challenges to an effective national
regulatory regime. Variations in crop type, production practices, livestock type and
concentration, use of irrigation, tillage practices, sediment runoff and fertilizer
runoff all contribute to the difficulty of one-size-fits-all regulation. Social factors like
proximity to metropolitan areas and surrounding land use also influence farm practices. EPA has noted that a program of
this breadth would be very difficult to implement and enforce. The
economic dimensions of agriculture also pose barriers to regulation. Agriculture
in the United States has vast economic value, yet is dispersed widely across the
country and by landowner. Faced with the rising costs of inputs and equipment, the farm
industry is quickly consolidating. Increased environmental regulation of farms
may reduce their economic viability due to compliance costs. And the political
dimensions, mentioned earlier, that make regulation of agriculture difficult include a consolidated voting block, strong lobbying and
political pressure.
Counterplans
Europe
1NC Europe CP
Counterplan: The European Union should fully develop and
implement an integrated European Ocean Observation System.
Integrated European observation is key to broader data; the tech
is world class, just a question of implementation.
EMD 2014
REPORT FROM THE JOINT EUROGOOS/EMODNET/EMB/JRC WORKSHOP AT THE
EUROPEAN MARITIME DAY IN BREMEN, The importance of an integrated end-to-end
European Ocean Observing System: key message of EMD 2014
http://eurogoos.eu/2014/06/09/eoos-at-emd-bremen-2014/
Ocean observations are essential for marine science, operational services and
systematic assessment of the marine environmental status. All types of activities in
the marine environment require reliable data and information on the present and future
conditions in which they operate. Many maritime economic sectors (e.g. oil and gas exploration,
maritime transport, fisheries and aquaculture, maritime renewable energy) directly benefit from easily
accessible marine data and information in several ways: improved planning of operations, risk
minimization through increased safety, improved performance, and overall reduced cost. Other activities, such as
deep sea mining and marine biotechnology, also benefit from specialized deep-sea observations that were not
marine observations, data to knowledge cycle, activities and tools are needed to create added value products for
specific stakeholders, including the wider public, such as the European Atlas of the Seas which allows professionals,
students and anyone interested to explore Europe's seas and coasts, their environment, related human activities
and European policies. At the same time, it is critical to evaluate whether we are monitoring/observing what we
actually need. Regional assessments, such as those performed by the newly established EMODnet sea-basin checkpoints,
could provide relevant information, among others to advise Member States about requirements for essential and
optimal observation capability.
EU Tech Good
European sensors are cutting edge and real time; no
distinction between them and the aff.
COOPEUS 2013
Solves Environment
Europe solves ecosystem management: academic,
government, industry partnerships.
ESF 2014
Dramatic changes, largely attributed to anthropogenic activity, have taken place in the Arctic in recent decades.
These changes include melting of glaciers and sea ice, altered oceanic current patterns, movement and
accumulation of contaminants and range shifts in many species. As a result of these changes the Arctic region is
being transformed, with wide-ranging impacts and opportunities including the potential for ice-free shipping routes
in the future, increased activity in oil and gas exploration, changes to Arctic fisheries and biodiversity, and impacts
on residents' livelihoods. "At present we are unprepared for the environmental and societal implications of
increased human access to the Arctic that will come with the receding ice," explains Professor Peter Haugan from
the University of Bergen and vice-Chair of the European Marine Board. We
panel discussion on the roles of industry and science in achieving sustainable management of the Arctic Ocean.
Solves Acidification
Europe can develop acidification solutions through experimentation.
Data is sufficient.
Riebesell 2009
Mesocosms, experimental enclosures that are designed to approximate natural conditions and that allow
manipulation of environmental factors, provide a powerful tool to link small-scale, single-species laboratory
experiments with the observational and correlative approaches applied in field surveys, which are hampered by
multiple factors simultaneously varying in time and space and by the lack of replication. A mesocosm study has an
advantage over standard laboratory tests in that it maintains a natural community under close to natural,
self-sustaining conditions, taking into account relevant aspects of natural systems such as indirect effects,
biological compensation and recovery, and ecosystem resilience. Replicate enclosed populations can be
experimentally manipulated, and the same populations can be [tracked over time, providing] reference conditions
and replication. To improve understanding of the underlying mechanisms of observed responses, which are often
difficult to infer from mesocosm results, and to facilitate scaling up mesocosm results, large-scale enclosure
experiments should be closely integrated with well-controlled laboratory experiments and modeling of ecosystem
responses. Taking advantage of a recently developed mobile, flexible-wall mesocosm system, EPOCA will conduct
a joint mesocosm CO2 perturbation experiment in the high Arctic, involving marine and atmospheric chemists,
molecular and cell biologists, marine ecologists, and biogeochemists. A total of nine mesocosm units, 2 m in
diameter and 15 m deep, each containing approximately 45,000 liters of water, will be deployed in Kongsfjord off
Ny-Alesund on Svalbard. The carbonate chemistry of the enclosed water will initially be manipulated to cover a
range of pCO2 levels from pre-industrial to projected year 2100 values (and possibly beyond) and will be allowed
to float freely during the course of the experiment to mimic variations naturally occurring due to biological activity.
The high level of scientific integration and cross-disciplinary collaboration of this study is expected to generate a
comprehensive data set that lends itself to analyses of community-level ocean acidification sensitivities and
ecosystem/biogeochemical model parameterizations.
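The treatment design described above, nine mesocosms spanning pre-industrial to projected year-2100 pCO2, can be sketched as a simple evenly spaced assignment. The endpoint values of 280 and 1000 µatm are illustrative assumptions for the sketch, not figures taken from the EPOCA design.

```python
def pco2_treatments(n_units=9, low=280.0, high=1000.0):
    """Evenly spaced target pCO2 levels (in µatm) across the mesocosm units.
    Endpoints are hypothetical stand-ins for 'pre-industrial' and 'year 2100'."""
    step = (high - low) / (n_units - 1)
    return [round(low + i * step, 1) for i in range(n_units)]

print(pco2_treatments())
# [280.0, 370.0, 460.0, 550.0, 640.0, 730.0, 820.0, 910.0, 1000.0]
```

Once set, the card notes the chemistry is then left to drift with biological activity, so these targets are initial conditions, not controlled setpoints.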
User Fees
1NC Commercialization CP
Text
The United States Federal Government should adopt a user fee system of
commercialization for the Integrated Ocean Observing System.
Woll-OCEANS Conference-9
MTS/IEEE Biloxi - Marine Technology for Our Future: Global and Local Challenges
The role of commercialization in IOOS and the regional associations
http://ieeexplore.ieee.org.ezproxy.uky.edu/xpls/icp.jsp?arnumber=5422377#authors
Since its inception, IOOS has operated in much the same way as federal
government agencies traditionally have - set policy, identify requirements, prioritize
within those requirements, secure federal funding, and allocate the funding via
the RAs, which in turn fund the organizations that actually perform the required
tasks. The organizations funded by the RAs to perform the desired tasks have
included universities, research institutions, and private sector companies. This has
resulted in a loose organizational structure, in which federal (and sometimes state)
agencies provide policy, coordination, and funding; RAs and their member
organizations conduct the science, process data, and operate observing and other
systems; products and services are generated and provided (usually free of charge)
to end users. Fig. 1 illustrates this functional structure. Ideally, funding for these
programs follows the traditional trajectory: a large sum at the beginning to start
the program up, and then some fraction of that annually to support ongoing
Operations and Maintenance (O&M). Fig. 2 depicts this traditional funding profile.
While significant progress has been made recently, the unfortunate fact is that
funding for both initial start-up costs and O&M has been less than anticipated. As a
result, worthy new projects remain on the sidelines, and viable observing assets
funded previously are beginning to be decommissioned or inactivated due to
inadequate O&M funding. In response to this situation, some have called for a
retreat from the IOOS organizational concept and a return to individual efforts by
member organizations. Others retain faith in the IOOS concept, but argue that the
enterprise simply needs to be more effective at articulating the requirements and
hazards of inaction to the appropriate members of Congress. This paper
proposes a third option - the embrace of commercialization as a program
objective for IOOS, the RAs, and the oceanography enterprise. The active
participation of universities and research institutions in IOOS and the RAs has been
a cornerstone of their success to date and is a logical extension of previous
practices in the academic arena. The avoidance of redundancy in effort and the
benefits of cooperation have long been recognized in establishing and operating
successful programs in the academic community, and participation in IOOS by
university and research institutions has been high. However, the participation
of private sector businesses has been less widespread. A partial list of
reasons would include the lack of a clear answer to why it is in a company's interest
to participate, a business environment that is generally more competitive than
cooperative, and a desire to protect proprietary information. Unfortunately, the
current structure and functionality of the RAs does little to address or
correct those concerns, and in many ways actually exacerbates them. The
linchpins to this proposal are the embrace of commercialization using private
A2: CP Illegitimate
---Literature supports the counterplan and makes it
predictable. Our Woll evidence contrasts the plan with the CP,
which proves it's germane and educational.
---Tests "its," which is a core source of negative ground on the
topic.
---Net benefits check abuse and provide a germane policy
warrant for voting negative. They shouldn't get to pick and
choose which parts of the plan to defend.
---Doesn't waste the 1AC-The affirmative still sets the
initial ground for the debate. The negative must still find some
aspect of the affirmative plan with which to compete. The
negative isn't obligated to run arguments that were
preempted in the 1AC. It's more fair to let both teams partially
determine ground for the debate.
---Don't trivialize the debate-If the difference between the plan
and the counterplan is large enough to generate a net benefit,
then it's worth debating. This argument isn't unique, as many
affirmatives are only a small departure from the status quo.
---Risk-Reward-The affirmative generates offense from the way
their plan would be implemented. That reward should be offset
by allowing the negative to test the plan.
---Punishment doesn't fit the crime-The judge should evaluate
theory like extra-topicality: rule the counterplan outside the
negative's jurisdiction and treat it as a lost option for the
negative to advocate, not a voting issue.
A2: Conditionality
---Real World-Policy makers do consider multiple options at
once. Their argument guts one of the core elements of policy discussion.
---Best policy justifies-Multiple options make it more likely that the best
policy will be found. The role of the judge is to endorse the best policy at the end of
the round. If a conditional counterplan has been proven to be the best policy, it's
perverse not to allow it to be endorsed.
---Education-Argument breadth has benefits. If depth were the only value, teams
wouldn't be allowed to debate more than one advantage or disadvantage per round.
Exploring the range of issues on a subject is also intellectually important.
A finding of substantiality
should be based not only on the effort devoted to a matter, but on the
importance of the effort. While a series of peripheral involvements may be
insubstantial, the single act of approving or participating in a critical step
may be substantial, even though it is not determinative of the outcome of a
particular matter. However, the review of procurement documents solely
to determine compliance with regulatory, administrative, or budgetary
procedures does not constitute substantial participation in a
procurement.
2NC Solvency
The CP solves the case
A. Economic incentives-User fees encourage the private sector
to take on a larger role in financing IOOS, filling in
current funding gaps-That's Woll from the 1NC.
B. Re-allocation-Commercialization allows the government to
spend scarce resources on other IOOS-related projects-solves
better than the aff.
Woll-OCEANS Conference-9
MTS/IEEE Biloxi - Marine Technology for Our Future: Global and Local Challenges
The role of commercialization in IOOS and the regional associations
http://ieeexplore.ieee.org.ezproxy.uky.edu/xpls/icp.jsp?arnumber=5422377#authors
A major side benefit to this construct is that it provides a second
significant source of funding for the enterprise. Obviously, end users are not
going to pay fees for a prospective product or service, so there will remain a need
for government funding to start projects up. However, the fees should be set at a
level that will provide a large portion (ideally all) of the O&M funding required
to sustain projects over time. This second source of funding has the net effect
of freeing up previous government O&M funds to be re-applied to start other new
projects - such that the government funding truly becomes a type of seed funding,
helping worthy projects get off the ground and turning them over to operational
organizations. Fig. 3 depicts how the funding profile could change over time under
this proposal.
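The funding profile Woll describes (a startup sum, then some fraction of it annually for O&M, with user fees absorbing part of the O&M burden) can be sketched numerically. Every figure below is invented purely for illustration; the card's Figs. 2 and 3 give no specific numbers.

```python
def funding_profile(years=10, startup=10.0, om_frac=0.2, fee_share=0.0):
    """Annual government outlay ($M, hypothetical): a startup sum in year 0,
    then O&M as a fraction of startup, with user fees covering fee_share of O&M."""
    om = startup * om_frac * (1.0 - fee_share)
    return [startup] + [round(om, 2)] * (years - 1)

traditional = funding_profile()                # government carries all O&M
commercial = funding_profile(fee_share=0.75)   # fees cover most O&M
print(traditional[1], commercial[1])           # 2.0 0.5
```

The gap between the two annual figures is the money the card says becomes "seed funding" available to start new projects.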
MTS/IEEE Biloxi - Marine Technology for Our Future: Global and Local Challenges
The role of commercialization in IOOS and the regional associations
http://ieeexplore.ieee.org.ezproxy.uky.edu/xpls/icp.jsp?arnumber=5422377#authors
SECTION IVCONCLUSION
The IOOS program offers an unparalleled opportunity for all members of the
oceanography enterprise to take part in a collaborative effort - one that provides the
best possible stewardship of taxpayer dollars while helping advance practical
scientific knowledge about the world's oceans. With the breadth and scope of
challenges facing the ocean environment, such an approach is not only
prudent - it offers the only reasonable chance of success in the future
fiscal environment.
Quite simply, the federal government will not be able to fund every
worthy project, so it is incumbent on IOOS and the oceanography
enterprise to make the most of every dollar. That means cooperating to make
choices from an enterprise perspective, making the hard choices to eliminate once
promising projects that have not panned out as expected, and being willing to
challenge and change habits that have been successful in years past.
Drones
Drones/Robots Solvency
T-REX AUVs are effective, high tech, and solve data collection
in all areas of the globe
McGann et al 8
[Conor McGann, Frederic Py, Kanna Rajan, Hans Thomas, Richard Henthorn, Rob McEwen, Monterey Bay
Aquarium Research Institute. 2008, A Deliberative Architecture for AUV Control, http://ieeexplore.ieee.org/stamp/stamp.jsp?
arnumber=04543343 PDF //jweideman]
AUV control systems [8] are a variant of the behavior-based Subsumption architecture [9]. A behavior is a modular encapsulation of
a specific control task and includes acquisition of a GPS fix, descent to a target depth, drive to a given waypoint, enforcement of a
mission depth envelope etc. An operator defines each plan as a collection of behaviors with specific start and end times as well as
maximum durations, which are scripted a priori using simple mission planning tools. In practice, missions predominantly consist of
sequential behaviors with duration and task specific parameters equivalent to a linear plan with limited flexibility in task duration.
Such an approach becomes less effective as mission uncertainty increases. Further, the architecture offers no support to manage
the potentially complex interactions that may result amongst behaviors, pushing a greater cognitive burden on behavior developers
deal with a range of dynamic and episodic ocean phenomena that cannot be observed with scripted plans. The remainder of this
paper is laid out as follows. Section II lays out the architecture of our autonomy system, section III details the experimental results to
intertwined. Fig. 2 shows a conceptual view of a Teleo-Reactive Agent. An agent is viewed as the coordinator of a set of concurrent
control loops. Each control loop is embodied in a Teleo-Reactor (or reactor for short) that encapsulates all details of how to
accomplish its control objectives. Arrows represent a messaging protocol for exchanging facts and goals between reactors: thin
arrows represent observations of current state; thick arrows represent goals to be accomplished. Reactors are differentiated in 3
ways: Functional scope: indicating the state variables of concern for deliberation and action. Temporal scope: indicating the lookahead window over which to deliberate. Timing requirements: the latency within which this component must deliberate for goals in
its planning horizon. Fig. 2, for example, shows four different reactors; the Mission Manager provides high-level directives to satisfy
the scientific and operational goals of the mission: its temporal scope is the whole mission, taking minutes to deliberate if necessary.
The Navigator and Science Operator manage the execution of sub-goals generated
by the Mission Manager. The temporal scope for both is in the order of a minute
even as they differ in their functional scope. Each refines high-level directives into executable commands
depending on current system state. The Science Operator is able to provide local directives to the Navigator. For example, if
it detects an ocean front it can request the navigation mode to switch from a Yo-Yo
pattern in the vertical plane to a Zig-Zag pattern in the horizontal plane, to have
better coverage of the area. Deliberation may safely occur at a latency of 1 second for these
reactors. The Executive provides an interface to a modified version of the existing AUV functional layer. It encapsulates access to
commands and vehicle state variables. The Executive is reasonably approximated as having zero latency within the timing model of
our application since it will accomplish a goal received with no measurable delay, or not at all; in other words it does not deliberate.
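The reactor hierarchy the card describes (Mission Manager, Navigator, Science Operator, Executive, each differing in functional scope, temporal scope, and latency) can be sketched as a coordinator that routes goals to the reactor owning each state variable and shares observations of current state. This is a minimal sketch of the coordination pattern only; the class and method names are illustrative, not drawn from the T-REX codebase.

```python
from dataclasses import dataclass, field

@dataclass
class Reactor:
    """One control loop: owns some state variables and refines goals for them."""
    name: str
    temporal_scope_s: float                  # look-ahead window for deliberation
    latency_s: float                         # deadline for producing a refinement
    owns: set = field(default_factory=set)   # functional scope (state variables)

    def refine(self, goal):
        """Refine a high-level goal into sub-goals; trivially tags the goal
        with this reactor's name to show where it was handled."""
        return [f"{self.name}:{goal}"]

class Agent:
    """Coordinator: routes each goal to the reactor owning its variable and
    keeps a shared view of observed state for all reactors."""
    def __init__(self, reactors):
        self.reactors = reactors
        self.observations = {}

    def post_observation(self, variable, value):
        self.observations[variable] = value

    def dispatch(self, variable, goal):
        for r in self.reactors:
            if variable in r.owns:
                return r.refine(goal)
        raise KeyError(f"no reactor owns {variable}")

# Hypothetical scopes mirroring the card: mission-level deliberation is slow
# and broad; navigation deliberation is fast and narrow.
mission = Reactor("MissionManager", temporal_scope_s=3600.0, latency_s=60.0, owns={"survey"})
navigator = Reactor("Navigator", temporal_scope_s=60.0, latency_s=1.0, owns={"waypoint"})
agent = Agent([mission, navigator])
agent.post_observation("depth", 120.0)
print(agent.dispatch("waypoint", "goto(36.8,-121.9)"))  # ['Navigator:goto(36.8,-121.9)']
```

The design point the card makes, that reactors differ only in scope and latency while sharing one goal/observation protocol, is what lets deliberative and reactive layers coexist in a single agent.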
[Dana R. Yoerger, Michael Jakuba, and Albert M. Bradley, Woods Hole Oceanographic Institution; Brian Bingham, Franklin W. Olin College of Engineering. 1/1/2007, Techniques for Deep Sea Near Bottom Survey Using
an Autonomous Underwater Vehicle, http://business.highbeam.com/437280/article-1G1-156721474/techniques-deep-sea-nearbottom-survey-using-autonomous //jweideman]
submersibles and ROVs remain the only option for manipulation tasks such as sampling, deploying and recovering
experiments on the seafloor, detailed inspection, and servicing subsea instruments; however, high resolution
maps from AUVs can facilitate these tasks. Figure 1 shows the Autonomous Benthic Explorer (ABE), a
6000 m autonomous underwater vehicle that our team has been developing and deploying for fine-scale quantitative
survey and mapping of the seafloor. ABE can survey at constant depth or bottom-follow even in rugged terrain, and
it can autonomously determine its position and drive track lines with a precision on the order of several meters.
ABE's shape and thruster placement allow it to maintain control over a wide range of speed, and to stop or back up if
necessary to avoid obstacles. ABE descends to the seafloor with the aid of a descent weight. ABE glides in a
controlled spiral trajectory to ensure that it reaches the desired starting point without consuming significant battery
energy. After reaching the seafloor and performing a series of checks, ABE releases its descent weight to become
neutrally buoyant and begins its survey. Throughout the dive, including descent, ABE uses acoustic long-baseline
(LBL) transponder navigation and, when in range of the bottom (< 300 m), bottom-lock acoustic Doppler
measurements to determine its position and velocity. A dive can consist of a mix of hydrothermal plume survey at
constant depth, sonar and magnetics survey following the seafloor (at heights of 50-200 m), and digital
photography (height of 5 m). ABE usually surveys until its batteries are depleted (between 15 and 30 hours
depending on sensor payload and terrain). At the end of its dive, ABE releases its ascent weight to become
positively buoyant and returns to the surface. The remainder of this report is organized as follows: Sect. 2
summarizes scientific survey tasks that have motivated our AUV work, Sect. 3 reports an algorithm for acoustic
positioning, Sect. 4 reports methods for terrain-following and obstacle avoidance, Sect. 5 reports a technique for
automated nested survey, and Sect. 6 presents a brief summary and conclusion. 2 Precisely Navigated, Coregistered AUV Surveys
Combined maps have also been used to identify volcanic features such as lava flow units [1], delimit their fronts, and estimate their thicknesses [2, 3].
Meter-scale bathymetric maps show tectonic features such as faults with great
clarity, even enabling them to be resolved into multiple components [4]. In other cases,
these maps have revealed the relationship between tectonic features and morphology, such as volcanic domes [3],
and hydrothermal vents [1]. ABE bathymetric maps have proved to be of sufficient detail and precision for one
collaborator to reconstruct the tectonic history of a rift valley by computationally removing faults [5]. The result
revealed a dome-like structure from which the valley evolved. On a recent cruise to the Atlantis Massif, detailed
renderings of faults and the hydrothermal structures provided critical clues as to the mechanisms controlling the
hydro-geology at the newly discovered Lost City hydrothermal vent site [6]. Digital photographs of the seafloor from
ABE have provided details of lava flow types and effusion rates [3], sediment cover, and the distribution of benthic
organisms. Water column data from ABE yields indications of hydrothermal plume activity and has been used to
estimate heat flux from known hydrothermal vent sites, and to locate undiscovered sites on the seafloor. To estimate
the heat flux from vent fields on the Juan de Fuca Ridge in the Northeast Pacific (47°54'N, 129°10'W) [7], ABE
measured temperature, salinity, and three-axis water velocity while repeatedly executing a tight grid pattern above
the field [8]. Recently ABE located and preliminarily characterized several previously unmapped hydrothermal sites
on the Eastern Lau Spreading Center (ELSC) south of Tonga (21°08'S, 175°12'W) [9]; and on the Southern Mid
Atlantic Ridge (SMAR) north of Ascension Island (7°57'S, 14°22'W) [10]. In each case, we started with clues
provided by towed systems that indicated a vent site within several kilometers. ABE then executed a three-dive
sequence [9, 10] of grid patterns at increasingly finer scales and increasingly close to the seafloor. To plan each dive,
the scientific party carefully scrutinized the data from the previous dive along with any available ancillary
data. These vent-prospecting missions capitalized on ABE's ability to conduct precisely navigated surveys at scales
O(m-km), to operate over rugged terrain, and relied on nearly all of ABE's sensing modalities. Figure 2 shows
tracklines from a sequence of dives designed to locate and survey a vent site on ELSC along with a sampling of the
variety of data products acquired and used to plan each stage of the dive sequence. ABE mapped plume activity
(temperature, optical backscatter, and reduction-oxidization potential (eH) [11]) to pinpoint the locations of plumes
emanating from the field, built fine-scale bathymetric maps of the vent fields and surrounding environment, and finally
photographed the vent structures and animal populations. The remainder of this paper presents the underlying
algorithms that enabled ABE to perform this work. This paper reports navigation algorithms that enable an
underwater vehicle to accomplish fully autonomous scientific surveys in the deep sea. These algorithms allow the
vehicle to determine its position, to bottom-follow (maintain a constant height above seafloor terrain) and avoid
obstacles, and to autonomously focus on the highest-value parts of a survey. Scientific exploration of the deep sea
has traditionally been performed using inhabited submersibles, towed vehicles, and tethered remotely operated
these existing systems, o ering superior mapping capabilities, improved logistics, and improved utilization of the
surface support vessel. AUVs are particularly well suited to systematic preplanned surveys using sonars, in situ
chemical sensors, and cameras in the rugged deep sea terrain that is the focus of many scienti c expeditions.
Inhabited submersibles and ROVs remain the only option for manipulation tasks such as sampling, deploying and
recovering experiments on the sea oor, detailed inspection, and servicing subsea instruments; however, high
resolution maps from AUVs can facilitate these tasks.Figure 1 shows the A utonomous
Benthic Explorer
(ABE), a 6000 m autonomous underwater vehicle that our team has been
developing and deploying for ne-scale quantitative survey and mapping of the
seafloor. ABE can survey at constant depth or bottom-follow even in rugged terrain, and it can autonomously
determine its position and drive tracklines with a precision on the order of several meters. ABE carries a variety of
sensors, including scanning and multibeam sonars; a magnetometer; a digital still camera; two sets of pumped
conductivity and temperature probes; an acoustic Doppler current pro ler (ADCP); several chemical sensors for
hydrothermal plume mapping; and occasional mission-speci c instrumentation. ABE's shape and thruster placement
allow it to maintain control over a wide range of speed, and to stop or back up if necessary to avoid obstacles. ABE
descends to the sea oor with the aid of a descent weight. ABE glides in a controlled spiral trajectory to ensure that it
reaches the desired starting point without consuming signi cant battery energy. After reaching the sea oor and
performing a series of checks, ABE releases its descent weight to become neutrally buoyant and begins its survey.
Throughout the dive, including descent, ABE uses acoustic long-baseline (LBL) transponder navigation and, when in
range of the bottom (< 300 m), bottom-lock acoustic Doppler measurements to determine its position and
velocity.A dive can consist of a mix of hydrothermal plume survey at constant depth, sonar and magnetics survey
following the sea oor (at heights of 50{ 200 m), and digital photography (height of 5 m). ABE usually surveys until
its batteries are depleted (between 15 and 30 hours depending on sensor payload and terrain). At the end of its
dive, ABE releases its ascent weight to become positively buoyant and returns to the surface. The remainder of this
report is organized as follows: Sect. 2 summarizes scienti c survey tasks that have motivated our AUV work, Sect. 3
reports an algorithm for acoustic positioning, Sect. 4 reports methods for terrainfollowing and obstacle avoidance,
Sect. 5 reports a technique for automated nested survey, and Sect. 6 presents a brief summary and conclusion.2
Precisely Navigated, Coregistered AUV Surveys. Proximity to the sea floor, precise navigation, robust control, and
coregistered sensors permit an AUV to characterize the sea floor and the near-bottom environment with
complementary sensing modalities on the meter scale. This section summarizes scientific work in which ABE-derived
bathymetric maps, magnetics maps, digital photos, and hydrographic maps have played critical enabling roles.
Meter-scale bathymetric and magnetic maps made using ABE have provided geologists and geophysicists with new
perspectives on important sea-floor processes. Combined magnetics and bathymetric maps show crustal
magnetization, which permits the age and thickness of lava flows to be determined. Combined maps have also been
used to identify volcanic features such as lava flow units [1], delimit their fronts, and estimate their thicknesses [2,
3]. Meter-scale bathymetric maps show tectonic features such as faults with great clarity, even enabling them to be
resolved into multiple components [4]. In other cases, these maps have revealed the relationship between tectonic
features and morphology, such as volcanic domes [3] and hydrothermal vents [1]. ABE bathymetric maps
have proved to be of sufficient detail and precision for one collaborator to
reconstruct the tectonic history of a rift valley by computationally removing faults
[5]. The result revealed a dome-like structure from which the valley evolved. On a recent cruise to the Atlantis
Massif, detailed renderings of faults and the hydrothermal structures provided critical clues as to the mechanisms
controlling the hydrogeology at the newly discovered Lost City hydrothermal vent site [6]. Digital photographs of
the sea floor from ABE have provided details of lava flow types and effusion rates [3], sediment cover, and the
distribution of benthic organisms. Water-column data from ABE yield indications of hydrothermal plume activity
and have been used to estimate heat flux from known hydrothermal vent sites, and to locate undiscovered sites on
the sea floor. To estimate the heat flux from vent fields on the Juan de Fuca Ridge in the Northeast Pacific (47° 54′ N, 129° 10′ W) [7], ABE measured temperature, salinity, and three-axis water velocity while repeatedly executing a tight
grid pattern above the field [8]. Recently, ABE located and preliminarily characterized several previously unmapped
hydrothermal sites on the Eastern Lau Spreading Center (ELSC) south of Tonga (21° 08′ S, 175° 12′ W) [9] and on
the Southern Mid-Atlantic Ridge (SMAR) north of Ascension Island (7° 57′ S, 14° 22′ W) [10]. In each case, we started
with clues provided by towed systems that indicated a vent site within several kilometers. ABE then executed a
three-dive sequence [9, 10] of grid patterns at increasingly fine scales and increasingly close to the sea floor. To plan
each dive, the scientific party carefully scrutinized the data from the previous dive along with any available ancillary
data. In the survey of a vent site on ELSC, a variety of data products were acquired and used to plan each
stage of the dive sequence: ABE mapped plume activity (temperature, optical backscatter, and reduction-oxidization potential (Eh) [11]) to pinpoint the locations of plumes emanating from the field, built fine-scale
bathymetric maps of the vent fields and surrounding environment, and finally photographed the vent structures and
animal populations. The remainder of this paper presents the underlying algorithms that enabled ABE to perform
this work.
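The heat-flux estimate described above (temperature, salinity, and three-axis velocity measured on a repeated grid above a vent field) amounts to integrating the vertical advective heat transport over the survey plane. A minimal sketch of that calculation follows; the function name, grid values, and seawater constants are illustrative assumptions, not taken from the cited studies.

```python
# Sketch of a vertical advective heat-flux estimate over a survey grid,
# in the spirit of the ABE surveys described above:
#   H = sum over cells of (rho * c_p * w * dT * dA)
# All names and numbers here are hypothetical.

RHO_SEAWATER = 1025.0  # kg/m^3, nominal deep-seawater density (assumed)
CP_SEAWATER = 3990.0   # J/(kg K), nominal specific heat capacity (assumed)

def heat_flux_watts(w_ms, dT_K, cell_area_m2):
    """Integrate rho * c_p * w * dT over grid cells.

    w_ms         -- vertical water velocity per cell (m/s)
    dT_K         -- temperature anomaly above ambient per cell (K)
    cell_area_m2 -- horizontal area of each grid cell (m^2)
    """
    return sum(RHO_SEAWATER * CP_SEAWATER * w * dt * cell_area_m2
               for w, dt in zip(w_ms, dT_K))

# Toy example: four 25 m^2 cells over a plume upwelling at 2 cm/s with a
# 0.05 K warm anomaly; the remaining cells are quiescent.
w = [0.02, 0.02, 0.02, 0.02, 0.0, 0.0]
dT = [0.05, 0.05, 0.05, 0.05, 0.0, 0.0]
print(heat_flux_watts(w, dT, 25.0))  # ~0.41 MW for this toy grid
```

In practice a survey would also have to account for lateral advection and plume bending in crossflow, which is part of why the ABE surveys repeated the grid pattern many times.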
Inherency
SQ IOOS/Data solves
IOOS already did enough mapping
IOOS report to congress 13
[Official US IOOS report sent to congress. 2013, U.S. Integrated Ocean Observing
System (U.S. IOOS) 2013 Report to Congress, http://www.ioos.noaa.gov/about/governance/ioos_report_congress2013.pdf
//jweideman]
Gliders are used to monitor water currents, temperature, and biological information
such as dissolved oxygen and nitrate. This information offers a more complete picture of what
is happening at and below the ocean surface, and may allow scientists to detect trends that otherwise might
have gone undetected. Gliders are assuming a prominent and growing role in ocean
science due to their unique capabilities for collecting data safely and at relatively low cost in remote
locations, both in deep water and at the surface. An advantage of gliders is that they can be quickly deployed to
[Will Rogers is a Research Assistant at the Center for a New American Security. Dr. Jay
Gulledge is a Non-Resident Senior Fellow at the Center for a New American Security and is the Senior Scientist and Science and
Impacts Program Director at the Pew Center on Global Climate Change. April 2010. Lost in Translation: Closing the Gap Between
Climate Science and National Security Policy http://www.cnas.org/files/documents/publications/Lost%20in
%20Translation_Code406_Web_0.pdf //jweideman]
recognition of the likelihood that there will be a surge in the demand for collection and analysis of climate information. Given the
proliferation of new tools (e.g., climate satellites and advanced computer models) and data acquisition systems, there will be no
shortage of climate information (especially data related to present conditions and short-term trends). The question for the national
security community is whether its unique needs will be met. Since the community has not traditionally based decisions on climate
change projections or assessments, there are few processes in place to ensure that the necessary information will be available when
it is needed and in a form that is useful.
Solvency
Satellites
Can't solve without new satellites
Parthemore and Rogers 11
In the South China Sea, changing ocean conditions are altering fish migration, leading
neighboring countries to compete over access to billions of dollars in fish
resources; this situation could escalate into serious conflict in contested territorial
waters. DOD and development agencies rely on earth monitoring systems to monitor
urbanization, migration patterns and internal population displacement, and diplomatic tensions between India and Pakistan - states that have longstanding grievances over how they share water.
Several government agencies also rely on earth monitoring capabilities to analyze compliance with deforestation and emissions measures in international climate change treaties, just as the
government relies on space-based capabilities to monitor and verify compliance with non-proliferation
treaties. Responding to environmental and climate change trends requires a steady
stream of reliable information from earth monitoring satellites that is quickly
becoming unavailable. Ideally, the U.S. government would replace its aging earth monitoring satellites. Yet the current
political and fiscal environments constrain available resources, making it less likely that Congress will appropriate funds to wholly
replace old systems. Given this reality, U.S. policymakers should use existing systems more efficiently, improve information sharing
among interagency partners and leverage international partners' investments in their own systems in order to bolster U.S. climate
and environmental data collection capabilities. The Capability Gap: Policymakers have known about the challenges stemming from
America's declining earth monitoring capabilities for years. In 2005,
Accountability Office (GAO), have recently reiterated those warnings. According to an April 2010 report by the GAO, "gaps in
coverage ranging from 1 to 11 years are expected beginning as soon as 2015" and "are expected to affect the continuity of
important climate and space weather measurements, such as our understanding of how weather cycles impact global food
production."4 These gaps will include key environmental and climate monitoring functions, from radar altimeters that measure
changes in land and ocean surfaces (such as sea level rise and desertification) to aerosol polarimetry sensors that can measure and
distinguish between sulfates, organic and black carbon and other atmospheric particles. "Meteorologists,
oceanographers, and climatologists reported that these gaps will seriously impact
ongoing and planned earth monitoring activities," according to the GAO.5 One recent interagency effort
to close such gaps has fallen short. The National Polar-orbiting Operational Environmental Satellite System (NPOESS) was designed
to translate climate and environmental data (including data from extensive existing databases) into products and analysis for DOD,
NASA and the National Oceanic and Atmospheric Administration (NOAA). However, after long delays, cost overruns and
inadequate coordination among the partners in the interagency working group, the project was split into two components (as an
alternative to being cancelled completely); DOD and the civilian agencies are moving forward separately with their own projects in
order to sustain the capabilities that NPOESS was intended to provide.
protection and the management of fisheries, water, transportation, oil and gas extraction, wind energy, tourism and
(for multinational regional entities such as the European Union) among governments. As a result,
competition for funding among government agencies often inhibits needed
collaborations and can result in policy choices that are detrimental to ecosystems
and their services and dependent coastal communities. The limitations of sector-specific 'stove-pipe' approaches to
the stewardship of ecosystem services have been recognized for decades and are reflected in national and
international calls for EBAs to managing human uses and adapting to climate change [3,4,28,153]. An effective set
provision of the required data on pressures, states and impacts. Conventional approaches to sustainable development
pressures, states and impacts simultaneously; (2) routine and repeated IEAs based on these observations; and (3)
the use of IEAs to help guide the sustainable implementation of EBAs (Fig. 3). The flow of data and information
among these activities must enable an iterative process of evaluating performance against objectives that leads to
more effective observing systems and EBAs.
Regional Seas Conventions [149], and international maritime operations [154]. In addition to these practitioners,
important stakeholders include data providers (scientists and technicians involved in ocean observing and prediction systems), analysts (natural, social and political scientists and economists), and journalists, all working for the
common good. This body of stakeholders might be called the "Integrated Governance Forum for Sustaining Marine
Ecosystem Services" and would function as a "Community of Practice" [155]. However, sustained engagement of
such a diversity of stakeholders on a global scale to guide the establishment of EBAs on the spatial scales needed
to sustain ecosystem services (Section 2) is not feasible. Developing GOOS to provide the data and information
needed to inform EBAs (Section 3) depends on enhancing the GCN locally and regionally based on priorities
established by stakeholders and articulated in national and regional ocean policies [25]. The E.U. Marine Strategy
Framework Directive [156] and the U.S. National Policy for the Stewardship of the Ocean, Our Coasts, and the Great
Lakes [157] are important examples of emerging regional-scale approaches to integrated ocean governance. How
effective these policies will be remains to be seen.
Warming
Solvency
No Solvency-Satellites
Can't solve without new satellites
Parthemore and Rogers 11 (Christine Parthemore, Senior Advisor at United States
Department of Defense, Adjunct Professor, Security Studies Program at Johns Hopkins University, Will Rogers, a
Research Assistant at the Center for a New American Security, Blinded: The Decline of U.S. Earth Monitoring
Capabilities and Its Consequences for National Security, Center for a New American Security, July 2011,
http://www.cnas.org/files/documents/publications/CNAS_Blinded_ParthemoreRogers_0.pdf)
Networks of satellites, ground-based sensors and unmanned aerial vehicles - the assets
America uses to monitor and understand environmental change and its consequences - are going dark. By
2016, only seven of NASA's current 13 earth monitoring satellites are expected to
be operational, leaving a crucial information gap that will hinder national security planning.1 Meanwhile,
efforts to prevent this capability gap have been plagued by budget cuts, launch
failures, technical deficiencies, chronic delays and poor interagency coordination.
Without the information that these assets provide, core U.S. foreign policy and
national security interests will be at risk. The United States depends on satellite
systems for managing the unconventional challenges of the 21st century in ways that are rarely
acknowledged. This is particularly true for satellites that monitor climate
change. In the South China Sea, changing ocean conditions are altering fish migration,
leading neighboring countries to compete over access to billions of dollars in fish
resources; this situation could escalate into serious conflict in contested territorial
waters. DOD and development agencies rely on earth monitoring systems to monitor urbanization, migration
patterns and internal population displacement, and diplomatic tensions between India and Pakistan - states
that have longstanding grievances over how they share water. Several government agencies also rely on earth
monitoring capabilities to analyze compliance with deforestation and emissions measures in
international climate change treaties, just as the government relies on space-based capabilities to monitor
and verify compliance with non-proliferation treaties.
Political Will
Trying to solve for warming does nothing. It's politically
controversial and links to politics
Spaeth 12 (Ryu Spaeth, deputy editor at TheWeek.com, Why it's probably too late to roll back global
warming, The Week, December 5, 2012, http://theweek.com/article/index/237392/why-its-probably-too-late-to-rollback-global-warming)
Two degrees Celsius. According to scientists, that's the rise in global temperature, measured
against pre-industrial times, that could spark some of the most catastrophic effects of global warming. Preventing
the two-degree bump has been the goal of every international treaty designed to reduce greenhouse gas emissions,
including a new one currently being hammered out at a United Nations summit in Doha, Qatar. But a new study
published by the journal Nature Climate Change shows that it's incredibly unlikely that global
warming can be limited to two degrees. According to the study, the world in 2011
"pumped nearly 38.2 billion tons of carbon dioxide into the air from the burning of
fossil fuels such as coal and oil," says Seth Borenstein at The Associated Press: The total
amounts to more than 2.4 million pounds (1.1 million kilograms) of carbon dioxide
released into the air every second. Because emissions of the key greenhouse gas
have been rising steadily and most carbon stays in the air for a century, it is not just
unlikely but "rather optimistic" to think that the world can limit future temperature
increases to 2 degrees Celsius (3.6 degrees Fahrenheit), said the study's lead author, Glen Peters at the
Center for International Climate and Environmental Research in Oslo, Norway. What happens when the
two-degree threshold is crossed? Most notably, that's when the polar ice caps will
begin to melt, leading to a dangerous rise in sea levels. Furthermore, the world's hottest regions
will be unable to grow food, setting the stage for mass hunger and global food
inflation. The rise in temperature would also likely exacerbate or cause extreme
weather events, such as hurricanes and droughts. There is a very small chance that
the world could pull back from the brink. The U.N. could still limit warming to two degrees if it adopts
a "radical plan," says Peters' group. According to a PricewaterhouseCoopers study, such a plan would entail cutting
carbon emissions "by 5.1 percent every year from now to 2050, essentially slamming the brakes on growth starting
right now," says Coral Davenport at The National Journal, "and keeping the freeze on for 37 years." However, the
U.N. has set a deadline of ratifying a new treaty by 2015, and implementing it by 2020, which means the world is
already eight years behind that pace. There are still major disagreements between the U.S. and China over whether
the developed world, which industrialized first, should bear the bulk of the cost of reducing carbon emissions. And
there is, of course, a large contingent of Americans who don't even believe climate
change exists, putting any treaty's ratification at risk. Climate change is so politically toxic in
America that Congress has prioritized the fiscal cliff over - no exaggeration -
untold suffering and the end of the world as we know it. In other words, it isn't
happening. And if that's not bad enough, keep in mind that the two-degree mark is
just the beginning, says Davenport: Michael Oppenheimer, a professor of geosciences and international
affairs at Princeton University and a member of the Nobel Prize-winning U.N. Intergovernmental Panel on Climate
Change, says that a 2-degree rise is not itself that point, but rather the beginning of
irreversible changes. "It starts to speed you toward a tipping point," he said. "It's like
driving toward a cliff at night with the headlights off. We don't know when we'll hit
that cliff, but after 2 degrees, we're going faster, we have less control. After 3, 4, 5
degrees, you spiral out of control, you have even more irreversible change." Indeed,
at the current emissions rate, the world is expected to broach the four-degree mark
by 2100, at which point we can expect even worse environmental catastrophes.
Some analysts say that the best possible scenario is preventing the Earth from warming up
by three or four degrees. That means instead of focusing solely on preventing global warming, governments
around the world should begin preparing for the major environmental upheavals,
starting with protections for coastal cities.
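The per-second figure and the PwC reduction path quoted in the card above can both be sanity-checked with a few lines of arithmetic. The short-ton assumption below is ours (the card does not name the unit), and it is what makes the pounds figure come out:

```python
# Sanity checks on two figures quoted in the card above.
# Assumption (ours): the 38.2 billion "tons" are short tons (2000 lb).

SECONDS_PER_YEAR = 365.25 * 24 * 3600   # ~3.156e7 s
LB_PER_SHORT_TON = 2000.0
KG_PER_LB = 0.45359237

# 38.2 billion short tons of CO2 per year, expressed per second.
lb_per_s = 38.2e9 * LB_PER_SHORT_TON / SECONDS_PER_YEAR
kg_per_s = lb_per_s * KG_PER_LB
print(round(lb_per_s / 1e6, 2), "million lb/s")  # ~2.42, matching "more than 2.4 million pounds"
print(round(kg_per_s / 1e6, 2), "million kg/s")  # ~1.10, matching "1.1 million kilograms"

# Cutting emissions 5.1% per year for 37 years (2013-2050) leaves:
remaining = (1 - 0.051) ** 37
print(round(remaining, 3))  # ~0.144, i.e. roughly an 86% total cut
```

The compounding in the last line is why the card calls the PwC path a "radical plan": a constant 5.1 percent annual cut eliminates about six-sevenths of emissions by 2050.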
No Warming
No warming
Global warming theory is false: 5 warrants
Hawkins 14 (John Hawkins, runs Right Wing News and Linkiest. He's also the co-owner of The Looking
Spoon. Additionally, he does weekly appearances on the #1-in-its-market Jaz McKay show, writes two weekly
columns for Townhall and a column for PJ Media. His work has also been published at the Washington
Examiner, The Hill, TPNN, Hot Air, The Huffington Post and at Human Events. 5 Scientific Reasons That Global
Warming Isn't Happening, Town Hall, February 18, 2014, http://townhall.com/columnists/johnhawkins/2014/02/18/5scientific-reasons-that-global-warming-isnt-happening-n1796423/page/full)
How did global warming discussions end up hinging on what's happening with polar bears, unverifiable predictions
of what will happen in a hundred years, and whether people are "climate deniers" or "global warming cultists?" If
this is a scientific topic, why aren't we spending more time discussing the science involved? Why aren't we talking
about the evidence and the actual data involved? Why aren't we looking at the predictions that were made and
signed on to another report saying there is no global warming at all. There are tens of thousands of well-educated,
mainstream scientists who do not agree that global warming is occurring at all and people who share their opinion
are taking a position grounded in science. 3) Arctic ice is up 50% since 2012: The loss of Arctic ice has
been a big talking point for people who believe global warming is occurring. Some people have even predicted that
all of the Arctic ice would melt by now because of global warming. Yet, Arctic ice is up 50% since 2012.
How much Arctic ice really matters is an open question since the very limited evidence
we have suggests that a few decades ago, there was less ice than there is today, but
the same people who thought the drop in ice was noteworthy should at least agree that the increase is important as
well. 4) Climate models showing global warming have been wrong over and over:
These future projections of what global warming will do to the planet have been
based on climate models. Essentially, scientists make assumptions about how much of an impact different
factors will have; they guess how much of a change there will be and then they project changes over time.
Unfortunately, almost all of these models showing huge temperature gains have turned out to be wrong. Former
NASA scientist Dr. Roy Spencer says that climate models used by government agencies to create policies have
failed miserably. Spencer analyzed 90 climate models against surface temperature and satellite temperature data,
lower tropospheric temperatures (UAH). There's an old saying in programming that goes, "Garbage in, garbage
out." In other words, if the assumptions and data you put into the models are faulty,
then the results will be worthless. If the climate models that show a dire impact
because of global warming aren't reliable -- and they're not -- then the long term
projections they make are meaningless. 5) Predictions about the impact of global
warming have already been proven wrong: The debate over global warming has
been going on long enough that we've had time to see whether some of the
predictions people made about it have panned out in the real world. For example, Al Gore
predicted all the Arctic ice would be gone by 2013. In 2005, the Independent ran an article saying that the Arctic had
entered a death spiral: Scientists fear that the Arctic has now entered an irreversible phase of warming which will
accelerate the loss of the polar sea ice that has helped to keep the climate stable for thousands of years.... The
greatest fear is that the Arctic has reached a tipping point beyond which nothing can
reverse the continual loss of sea ice and with it the massive land glaciers of Greenland, which will raise sea
levels dramatically. Meanwhile, Arctic ice is up 50% since 2012. James Hansen of NASA fame predicted
that the West Side Highway in New York would be under water by now because of global warming. Of course,
the highway is still there. If the climate models and the predictions about
global warming aren't even close to being correct, wouldn't it be more scientific to
reject hasty action based on faulty data so that we can further study the issue and
find out what's really going on?
On January 18, 2002, the journal Science published the results of satellite-borne radar and ice core studies
performed by scientists from CalTech's Jet Propulsion Laboratory and the University of California at Santa Cruz.
was prepared as a source material for a report titled "Forecast of the Defense Conditions for the Republic of Poland
in 2001-2020." The paper implied that the increase of atmospheric precipitation by 23 percent in Poland, which was
presumed to be caused by global warming, would be detrimental. (Imagine stating this in a country where 38
percent of the area suffers from permanent surface water deficit!) The same paper also deemed an extension of the
vegetation period by 60 to 120 days as a disaster. Truly, a possibility of doubling the crop rotation, or even
prolonging by four months the harvest of radishes, makes for a horrific vision in the minds of the authors of this
Also, measurements in the Kolobrzeg Baltic Sea harbor indicate that the number of gales has not increased
NASA satellite data from the years 2000 through 2011 show the Earth's atmosphere is
allowing far more heat to be released into space than alarmist computer models have predicted,
reports a new study in the peer-reviewed science journal Remote Sensing. The study indicates far less
future global warming will occur than United Nations computer models have
predicted, and supports prior studies indicating increases in atmospheric carbon
dioxide trap far less heat than alarmists have claimed. Study co-author Dr. Roy Spencer,
a principal research scientist at the University of Alabama in Huntsville and U.S. Science Team Leader for the
No Consensus
97 percent consensus claims are false
Taylor 13 (James Taylor, writer for Forbes, 5/30/13, Global Warming Alarmists Caught Doctoring '97-Percent Consensus' Claims,
http://www.forbes.com/sites/jamestaylor/2013/05/30/global-warming-alarmists-caught-doctoring-97-percent-consensus-claims/
//jweideman)
Global warming alarmists and their allies in the liberal media have been caught
doctoring the results of a widely cited paper asserting there is a 97-percent
scientific consensus regarding human-caused global warming. After taking a closer look at
the paper, investigative journalists report the authors' claims of a 97-percent consensus
relied on the authors misclassifying the papers of some of the world's most
prominent global warming skeptics. At the same time, the authors deliberately presented
a meaningless survey question so they could twist the responses to fit their own
preconceived global warming alarmism. Global warming alarmist John Cook, founder of the
misleadingly named blog site Skeptical Science, published a paper with several other global warming alarmists
claiming they reviewed nearly 12,000 abstracts of studies published in the peer-reviewed climate literature. Cook
reported that he and his colleagues found that 97 percent of the papers that expressed a position on human-caused
global warming endorsed the consensus position that humans are causing global warming. As is the case
with other surveys alleging an overwhelming scientific consensus on global
warming, the question surveyed had absolutely nothing to do with the issues of
contention between global warming alarmists and global warming skeptics. The
question Cook and his alarmist colleagues surveyed was simply whether humans
have caused some global warming. The question is meaningless regarding the
global warming debate because most skeptics as well as most alarmists believe
humans have caused some global warming. The issue of contention dividing
alarmists and skeptics is whether humans are causing global warming of such
negative severity as to constitute a crisis demanding concerted action.
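The denominator issue the card describes can be made concrete with the survey's own arithmetic. The counts below are the approximate figures published by Cook et al. (2013); treat them as illustrative rather than authoritative:

```python
# Illustrating the denominator effect in the Cook et al. (2013) abstract count.
# Counts are approximate published figures (assumption: quoted from memory).

total_abstracts = 11944
endorse = 3896     # abstracts endorsing human-caused warming
reject = 78        # abstracts rejecting it
uncertain = 40     # abstracts expressing uncertainty
no_position = total_abstracts - endorse - reject - uncertain

took_position = endorse + reject + uncertain
pct_of_position_takers = 100 * endorse / took_position
pct_of_all = 100 * endorse / total_abstracts

print(round(pct_of_position_takers, 1))  # ~97.1: the headline figure
print(round(pct_of_all, 1))              # ~32.6: endorsing share of ALL abstracts
print(round(100 * no_position / total_abstracts, 1))  # ~66.4 took no position
```

The point the card is making is visible in the two percentages: the 97-percent figure is computed only over the minority of abstracts that stated any position at all.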
Warming slow
Worst-case scenario: warming will only be 1.5 degrees
Freitas 2 (C. R., Associate Prof. in Geography and Environmental Science @ U. Auckland, Bulletin of Canadian
Petroleum Geology, Are observed changes in the concentration of carbon dioxide in the atmosphere really
dangerous? 50:2, GeoScienceWorld)
In any analysis of CO2 it is important to differentiate between three quantities: 1) CO2 emissions, 2) atmospheric
CO2 concentrations, and 3) greenhouse gas radiative forcing due to atmospheric CO2. As for the first, between
1980 and 2000 global CO2 emissions increased from 5.5 Gt C to about 6.5 Gt C, which amounts to an average
annual increase of just over 1%. As regards the second, between 1980 and 2000 atmospheric CO2 concentrations
increased by about 0.4 per cent per year. Concerning the third, between 1980 and 2000 greenhouse gas forcing
data and using the surface data set, between 1980 and 2000 there has been this linear increase of both CO2
carbon cycle leading to atmospheric concentrations observed in the past. If one assumes, in addition, that the
increase of surface temperatures in the last 20 years (about 0.3 °C) is entirely due to the increase in greenhouse
correspondingly lower. Based on this, the temperature increase over the next 100 years might be less than 1.5 °C,
as proposed in Figure 19.
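The growth rates the card quotes can be reproduced from the endpoints it gives (5.5 to 6.5 Gt C over 1980-2000). A short check, purely on the card's own numbers, shows the annual rate works out to roughly the ~1% per year figure cited:

```python
# Reproducing the emissions growth rate from the endpoints quoted in the
# card: 5.5 Gt C (1980) to 6.5 Gt C (2000).

start, end, years = 5.5, 6.5, 20

# Simple average rate relative to the starting value:
simple_rate = (end - start) / start / years
# Compound (exponential) annual growth rate:
compound_rate = (end / start) ** (1 / years) - 1

print(round(100 * simple_rate, 2))    # ~0.91 %/yr
print(round(100 * compound_rate, 2))  # ~0.84 %/yr, i.e. roughly 1%/yr as the card says
```

Either convention gives just under one percent per year, consistent in magnitude with the card's claim and with the separately quoted ~0.4%/yr growth in atmospheric concentrations (about half of emitted CO2 is taken up by sinks).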
Too late
Warming is too fast for adaptation
Willis et al 10 (K.J. Willis1,3,4 , R.M. Bailey2 , S.A. Bhagwat1,2 and H.J.B. Birks1,2,3 1 Institute of Biodiversity at the James
Martin 21st Century School, University of Oxford, South Parks Road, Oxford OX1 3PS, UK 2School of Geography and the Environment,
University of Oxford, Oxford OX1 3QY, UK 3Department of Biology, University of Bergen, Post Box 7803, N-5020 Bergen, Norway
4Department of Zoology, University of Oxford. 2010, Biodiversity baselines, thresholds and resilience: testing predictions and
assumptions using palaeoecological data http://users.ugent.be/~everleye/Ecosystem%20Dynamics/papers%20assignment/Willis
%20et%20al%20TREE%202010.pdf//jweideman)
by models; rather, there is strong evidence for persistence [25]. However, there is also evidence that some species
expanded their range slowly or largely failed to expand from their refugia in response to this interval of rapid
climate warming [38]. The challenge now is to determine which specific respect, in particular information on
alternative stable states, rates of change, possible triggering mechanisms and systems that demonstrate resilience
to thresholds. In a study from central Spain, for example, it has been demonstrated that over the past 9000 years,
several threshold changes occurred, shifting from one stable forest type to another (pine to deciduous oak and then
evergreen oak to pine) [52]. The trigger appears to have been a combination of at least two abiotic variables; in the
first shift, an interval of higher precipitation combined with less evaporation and in the second, increased aridity
combined with increased fire frequency. A similar 'double-trigger' was also found to be important in regime shifts
along the south-east coastline of Madagascar [53], where a threshold shift from closed littoral forest to open Erica-dominated heathland occurred in response to the combined effects of aridity and sea-level rise. Neither of the
variables occurring alone resulted in a shift but the combination of the two did.
Global warming is not reversible but it is stoppable. Many people incorrectly assume
that once we stop making greenhouse gas emissions, the CO2 will be drawn out of
the air, the old equilibrium will be re-established and the climate of the planet will go back to the way it used to
be; just like the way the acid rain problem was solved once scrubbers were put on smoke stacks, or the way lead
viewpoint can lead to a fatalistic approach, in which efforts to mitigate climate change by cutting emissions are
seen as futile: we should instead begin planning for adaptation or, worse, start deliberately intervening through
geoengineering. But this is wrong. The inertia is not in the physics of the climate system, but rather in the human
economy. This is explained in a recent paper in Science Magazine (2013, paywalled but freely accessible here, scroll
down to "Publications, 2013") by Damon Matthews and Susan Solomon: Irreversible Does Not Mean Unavoidable.
Since the Industrial Revolution, CO2 from our burning of fossil fuels has been
ends up. It just so happens that the delayed heating from this thermal inertia balances almost exactly with the drop
in CO2 concentrations, meaning the temperature of the Earth would stay approximately constant from the minute
making risky interventions by scattering tons of sulphate particles into the upper atmosphere, to shade us from the
Sun. The good news is that, once we stop emissions, further warming will immediately cease; we are not on an
Despite the political rhetoric, the scientific warnings, the media headlines and the corporate promises,
he would say, carbon emissions were soaring way out of control - far above even
the bleak scenarios considered by last year's report from the Intergovernmental Panel on
Climate Change (IPCC) and the Stern review. The battle against dangerous climate change had
been lost, and the world needed to prepare for things to get very, very bad. "As an
academic I wanted to be told that it was a very good piece of work and that the conclusions were sound," Anderson
rises by more than 2ppm each year. The government's official position is that the world should aim to cap this rise
at 450ppm.
Here's a dark secret about the earth's changing climate that many scientists believe, but few seem
eager to discuss: It's too late to stop global warming. Greenhouse gases pumped
into the planet's atmosphere will continue to grow even if the industrialized nations
cut their emissions down to the bone. Furthermore, the severe measures that would
have to be taken to make those reductions stand about the same chance as that
proverbial snowball in hell. Two scientists who believe we are on the wrong track argue in the current
issue of the journal Nature Climate Change that global warming is inevitable and it's time to
switch our focus from trying to stop it to figuring out how we are going to deal with
its consequences. "At present, governments' attempts to limit greenhouse-gas emissions through carbon
cap-and-trade schemes and to promote renewable and sustainable energy sources are probably too
late to arrest the inevitable trend of global warming," Jasper Knight of Wits University in
Johannesburg, South Africa, and Stephan Harrison of the University of Exeter in England argue in their study. Those
efforts, they continue, "have little relationship to the real world." What is clear, they contend, is a profound lack of
understanding about how we are going to deal with the loss of huge land areas, including some entire island
nations, and massive migrations as humans flee areas no longer suitable for sustaining life, the inundation of
coastal properties around the world, and so on ... and on ... and on. That doesn't mean nations should stop trying to
the entire issue of global warming. "Call me a converted skeptic," physicist Richard A. Muller says in an op-ed piece
published in the New York Times last July. Muller's latest book, "Energy for Future Presidents," attempts to poke
holes in nearly everything we've been told about energy and climate change, except the fact that " humans
are
almost entirely the cause" of global warming. Those of us who live in the
"developed" world initiated it. Those who live in the "developing" world will sustain
it as they strive for a standard of living equal to ours. "As far as global warming is
concerned, the developed world is becoming irrelevant ," Muller insists in his book. We could set
an example by curbing our emissions, and thus claim in the future that "it wasn't our fault," but about the only thing
that could stop it would be a complete economic collapse in China and the rest of the world's developing countries.
Anthropogenic Global Warming (AGW) theory has been dominant for the past three
decades as absolute fact in the public mind. In the last several years, however, cracks in the fortress of "settled
science" have appeared, and public opinion has begun to shift. Increasingly, alarmist predictions have failed to
tax. A group of German scientists predicts dramatic global cooling over the next 90 years toward a new "little ice
age." Of course, even many "low information" folks have an awareness of the record increase in Arctic sea ice, as
well as the current highly-publicized predicament of the cadre of wealthy global warmists stuck in record-high sea
ice while on a cruise to the Antarctic to prove the absence of sea ice. Now the UN's Intergovernmental Panel on
Climate Change (IPCC) has quietly downgraded their prediction for global warming for the next 30 years in the final
draft of their landmark "Fifth Assessment Report." The effect of this is that they are tacitly admitting that the
computer models they have religiously relied upon for decades as "proof" of AGW theory are dead wrong.
Warming Natural
Scientific study and consensus validate that warming is a natural
phenomenon
McClintock 09 (Ian C. McClintock, has held many positions in the NFF and in the NSW Farmers
Association where he currently serves on the Climate Change Task Force, the Business Economic & Trade
Committee, and on the Executive Council. He has served on the NFF's Economic Committee, Conservation
Committee, and as the NFF's representative on the National Farm Forestry Round Table, Proof that CO2 is not the
Cause of the Current Global Warming, Lavoisier, June 2009, http://www.lavoisier.com.au/articles/greenhousescience/climate-change/mcclintock-proofnotco2-2009.pdf)
A careful study of these Reports reveals that there is in fact no empirical evidence that
supports the main theory of anthropogenic global warming (AGW) (this has now been
changed to the all-inclusive terminology "climate change"). Some 23 specialized computer models provide the
have been shown to influence global cloud cover (and therefore temperature), and so this should be added to the
list in the above quotation as this material is now largely ignored by the IPCC. It has been common in recent times
to claim that there is a scientific consensus about the causes of climate change, however it has become evident
conclusively demonstrate and prove that the claims that CO2 is the principle driver of climate change are false.
This shows the Hadley UK ground-based (pink) and the satellite (blue) temperature record and the simultaneously
recovery (warming) that has occurred since the last gasp of the Little Ice Age (LIA) in the mid 1800s. It shows three
warmings and two coolings, in fact there are now three coolings with details of the latest one (shown on p4) not
levels of CO2 being present during the last warming. There appears to be a clear cyclical signature evident in these
they are now, being 25 times at 545 Ma (million years ago) (Veizer, J. et al. 2000). The killer proof that CO2
does not drive climate is to be found during the Ordovician- Silurian (450-420 Ma) and the Jurassic-Cretaceous
periods (151-132 Ma), when CO2 levels were greater than 4000 ppmv (parts per million by volume) and about 2000
and also Nitrous Oxide (N2O), not shown. The physics of how this might occur are well established and beyond
dispute. Thirdly, increased levels of CO2 did not inhibit in any way the subsequent fall
in temperatures that regularly occurred, plunging the world into another ice age.
It occurs naturally
Ferrara 12 (Peter Ferrara, senior fellow for entitlement and budget policy at The Heartland Institute, a
senior fellow at the Social Security Institute, and the general counsel of the American Civil Rights Union, Sorry
Global Warming Alarmists, The Earth Is Cooling, The Heartland Institute, June 1, 2012,
http://blog.heartland.org/2012/06/sorry-global-warming-alarmists-the-earth-is-cooling/)
Check out the 20th century temperature record, and you will find that its up-and-down
pattern does not follow the industrial revolution's upward march of
atmospheric carbon dioxide (CO2), which is the supposed central culprit for man-caused
global warming (and has been much, much higher in the past). It follows instead the up-and-down
pattern of naturally caused climate cycles. For example, temperatures
dropped steadily from the late 1940s to the late 1970s. The popular press was even
talking about a coming ice age. Ice ages have cyclically occurred roughly every 10,000 years, with a
new one actually due around now. In the late 1970s, the natural cycles turned warm and
temperatures rose until the late 1990s, a trend that political and economic interests
have tried to milk mercilessly to their advantage. The incorruptible satellite-measured
global atmospheric temperatures show less warming during this period
than the heavily manipulated land surface temperatures. Central to these natural
cycles is the Pacific Decadal Oscillation (PDO). Every 25 to 30 years the oceans undergo a
natural cycle where the colder water below churns to replace the warmer water at
the surface, and that affects global temperatures by the fractions of a degree we
have seen. The PDO was cold from the late 1940s to the late 1970s, and it was
warm from the late 1970s to the late 1990s, similar to the Atlantic Multidecadal Oscillation (AMO).
In 2000, the UN's IPCC predicted that global temperatures would rise by 1 degree Celsius by 2010. Was that based
on climate science, or political science to scare the public into accepting costly anti-industrial regulations and
taxes? Don Easterbrook, Professor Emeritus of Geology at Western Washington University, knew the answer. He
publicly predicted in 2000 that global temperatures would decline by 2010. He made that prediction because he
knew the PDO had turned cold in 1999, something the political scientists at the UN's IPCC did not know or did not
think significant.
with great abandon and without restraint by Greens. Yet the Earth cooled slightly in
that time. And if man-made global warming is real, atmospheric as well as surface
temperatures should have increased steadily. But they haven't. There was merely
that one-time increase, possibly caused by a solar anomaly. In addition, an "urban heat island
effect" has been identified. Build a tarmac runway near a weather station, and the nearby temperature readings will
Global warming became the focus of activism at the time of the Earth Summit
in Rio, in 1992. Bush the elder signed a climate-change treaty, with signatories agreeing to
reduce carbon dioxide emissions below 1990 levels. The details were worked out in Kyoto, Japan. But America
was the principal target, everyone knew it , and Clinton didn't submit the treaty to the Senate for
ratification. The 1990 date had been carefully chosen. Emissions in Germany and the Soviet Union
were still high; Germany had just absorbed East Germany, then still using inefficient
coal-fired plants. After they were modernized, Germany's emissions dropped , so the
go up.
demand that they be reduced below 1990 levels had already been met and became an exercise in painless
The same was true for the Soviet Union. After its collapse, in 1991,
economic activity fell by about one-third. As for France, most of its electricity comes
from nuclear power, which has no global-warming effects but has been demonized
for other reasons. If the enviros were serious about reducing carbon dioxide they
would be urging us to build nuclear power plants , but that is not on their agenda. They want
windmills (whether or not they kill golden eagles). Under the Kyoto Protocol, U.S. emissions would
have to be cut so much that economic depression would have been the only certain
outcome. We were expected to reduce energy use by about 35 percent within ten years, which might have
moralizing.
meant eliminating one-third of all cars. You can see why the enviros fell in love with the idea.
Impact Defense
Adaptation
IPCC consensus proves we can adapt
Rodgers 14 (Paul Rodgers, contributor of general sciences for Forbes, Climate Change: We Can Adapt,
Says IPCC, Forbes, March 31, 2014, http://www.forbes.com/sites/paulrodgers/2014/03/31/climate-change-is-realbut-its-not-the-end-of-the-world-says-ipcc/)
For the first time, the IPCC is offering a glimmer of hope. It acknowledges that
some of the changes will be beneficial including higher crop yields in places like Canada, Europe
and Central Asia and that in others cases, people will be able to adapt to them. The
really big breakthrough in this report is the new idea of thinking about managing
climate change, said Dr Chris Field, the global ecology director at the Carnegie Institution in Washington and
a co-chairman of the report. We have a lot of the tools for dealing effectively with it. We
just need to be smart about it. Climate-change adaptation is not an exotic agenda
that has never been tried, said Field. Governments, firms, and communities around
the world are building experience with adaptation. This experience forms a starting
point for bolder, more ambitious adaptations that will be important as climate and
society continue to change. Adaptations could include better flood defences or
building houses that can withstand tropical cyclones. Vicente Barros, another co-chairman, said:
Investments in better preparation can pay dividends both for the present and for
the future.
Cutting emissions will not help. Our only hope is to adapt to rising
temperatures
Roach 13 (John Roach, contributing writer for NBC News, It's time to adapt to unstoppable global warming,
scientists say, NBC News, November 7, 2013, http://www.nbcnews.com/science/environment/its-time-adaptunstoppable-global-warming-scientists-say-f8C11554338)
Even if the world's 7 billion people magically stop burning fossil fuels and chopping down
forests today, the greenhouse gases already emitted to the atmosphere will warm the
planet by about 2 degrees Fahrenheit by the end of this century, according to scientists who are urging a
focused scientific effort to help humanity adapt to the changing climate. And reality shows no sign of
such a magic reduction in emissions. The amount of greenhouse gases in the
atmosphere reached another new high in 2012, the World Meteorological Organization announced
Wednesday. In fact, concentrations of carbon dioxide, the most abundant planet warming
gas, grew faster last year than its average growth rate of the past decade. "The fact
is, we are not making a lot of progress in reducing emissions," Richard Moss, a senior
scientist with the Pacific Northwest National Laboratory's Joint Global Change Research Institute at the University of
Maryland, told NBC News. "So
just preventing it is nothing new. It's the need for more information that field of science can yield that's
increasingly vital. Superstorm Sandy and the onslaught of similar catastrophic events bear the fingerprint of climate
change. Growing evidence that more of the same is on the way brings a new urgency for information that people
can actually use to prepare, survive, and even thrive on a changing planet, Moss explained. Hope lost? The push
for adaptation science also represents a shift in climate policy circles away from an
agenda focused solely on cutting greenhouse gas emissions to reduce the impact of
climate change, according to Adam Fenech, a climate scientist at the University of Prince Edward Island in
Canada who has studied adaptation science for about 15 years. He did not contribute to the article in Science. For
Joe Casola, director of the science and impacts program at the Center for Climate and Energy Solutions, an
environmental think tank in Arlington, Va. "I think that both of them are going to be required," he told NBC News.
"The more mitigation we do, the less adaptation we'll need," added Casola, who was not involved with the Science
article. "If we were to do no mitigation, then our adaptation investments now are probably going to be made
worthless or very, very limited as we have a lot of changes to deal with in the climate system." Institute of
adaptation: Key to making adaptation work is an acknowledgement that climate is just one factor planners are
forced to wrestle with as they make decisions, noted Moss, whose lab is funded by the Department of Energy.
Adaptation to climate change / global warming will be needed along with mitigation. Because
impacts are already being observed and because the ocean takes a long time to
heat up, mitigation - although crucial - will not be enough. Adaptation unfortunately will
not be a simple matter. The human race is no longer composed of small groups of hunter-gatherers, but
billions of people generally living in highly arranged societies with limited mobility. The worse the impacts
of global warming, the more difficult adaptation will be. We should start planning for
adaptation now, along with mitigation to try to lessen the impact of global warming.
However, adaptation should not be used as an excuse not to mitigate global
warming. The most complete compendium on adaptation is the 2014 IPCC Report, Vol.
WGII (Impacts, Adaptation and Vulnerability).
Bosello et al. (2010) also conclude that mitigation and adaptation are strategic
complements. They fold into an optimization model three types of adaptation measures - anticipatory,
reactive, and adaptation R&D - to sort out the relative contributions and timing of mitigation and adaptation
expenditures in an optimal climate policy.
No Extinction
Warming won't lead to complete extinction
Green 11 (Roedy, Ph.D. from British Columbia, Extinction of Man,
http://mindprod.com/environment/extinction.html//umich-mp)
history, the Permian, 230 million years ago, was the worst. 70% of all species were lost. It was caused by natural
global warming when volcanoes released greenhouse gases. (The other extinction event more familiar to most
people was the more recent KT Cretaceous-Tertiary Mass Extinction event, 65 million years ago. It was caused when
an asteroid plunged into the earth at Chicxulub Mexico wiping out the dinosaurs and half of earth's species.) We
are re-experiencing the same global warming conditions that triggered the
more devastating Permian extinction, only this time it is man made. When it
gets too hot, plants die. When it gets too hot and dry, massive fires ravage huge areas. When plants die, insects
and herbivores die. When insects die, even heat-resistant plants don't get pollinated and die. Birds die without
insects to eat. Carnivores die without herbivores to eat, all triggered by what seems so innocuous heat. Similarly,
in the oceans, when they get just a few degrees too warm, corals expel their symbiotic algae and die soon
thereafter. When coral reefs die, the fish that live on them die, triggering extinction chains. Satellites can chart the
loss of vegetation over the planet. We are losing 4 species per hour, a rate on the same scale as the Permian and KT
extinction events. Man has no ability to live without the support of other species. We are committing suicide and
In an attempt to depict earth's current temperature as being extremely high and, therefore,
extremely dangerous, Hansen focuses almost exclusively on a single point of the earth's surface in the Western Equatorial Pacific,
for which he and others (Hansen et al., 2006) compared modern sea surface temperatures (SSTs) with paleo-SSTs that were derived by Medina-Elizade and
Lea (2005) from the Mg/Ca ratios of shells of the surface-dwelling planktonic foraminifer Globigerinoides ruber that they obtained from an ocean
sediment core. In doing so, they concluded that this critical ocean region, and probably the planet as a whole [our italics],
is approximately as warm now as at the Holocene maximum and within ~1°C of the maximum temperature of the past
million years [our italics]. Is there any compelling reason to believe these claims of Hansen et al.
about the entire planet? In a word, no, because there are a multitude of other single-point
measurements that suggest something vastly different. Even in their own paper, Hansen et al. present data
from the Indian Ocean that indicate, as best we can determine from their graph, that SSTs there were about 0.75°C warmer than they are currently some
125,000 years ago during the prior interglacial. Likewise, based on data obtained from the Vostok ice core in Antarctica, another of their graphs suggests
that temperatures at that location some 125,000 years ago were about 1.8°C warmer than they are now; while data from two sites in the Eastern
Equatorial Pacific indicate it was approximately 2.3 to 4.0°C warmer compared to the present at about that time. In fact, Petit et al.'s (1999) study of the
Vostok ice core demonstrates that large periods of all four of the interglacials that preceded the Holocene were more than 2°C warmer than the peak
been. To cite just a few examples of pertinent work conducted in the 1970s and 80s based on temperature reconstructions derived from studies of
latitudinal displacements of terrestrial vegetation (Bernabo and Webb, 1977; Wijmstra, 1978; Davis et al., 1980; Ritchie et al., 1983; Overpeck, 1985) and
vertical displacements of alpine plants (Kearney and Luckman, 1983) and mountain glaciers (Hope et al., 1976; Porter and Orombelli, 1985) we note it
was concluded by Webb et al. (1987) and the many COHMAP Members (1988) that mean annual temperatures in the Midwestern United States were about
2°C greater than those of the past few decades (Bartlein et al., 1984; Webb, 1985), that summer temperatures in Europe were 2°C warmer (Huntley and
Prentice, 1988) as they also were in New Guinea (Hope et al., 1976) and that temperatures in the Alps were as much as 4°C warmer (Porter and
Orombelli, 1985; Huntley and Prentice, 1988). Likewise, temperatures in the Russian Far East are reported to have been from 2°C (Velitchko and Klimanov,
1990) to as much as 4-6°C (Korotky et al., 1988) higher than they were in the 1970s and 80s; while the mean annual temperature of the Kuroshio Current
between 22 and 35°N was 6°C warmer (Taira, 1975). Also, the southern boundary of the Pacific boreal region was positioned some 700 to 800 km north of
its present location (Lutaenko, 1993). But we needn't go back to even the mid-Holocene to encounter warmer-than-present temperatures, as
the Medieval Warm Period, centered on about AD 1100, had lots of them. In fact, every single week since 1 Feb 2006, we have featured on
our website (www.co2science.org) a different peer-reviewed scientific journal article that testifies to the existence of this
several-centuries-long period of notable warmth, in a feature we call our Medieval Warm Period Record of the
Week. Also, whenever it has been possible to make either a quantitative or qualitative comparison between the peak temperature of the Medieval Warm
Period (MWP) and the peak temperature of the Current Warm Period (CWP), we have included those results in the appropriate quantitative or qualitative
The great global warming scare is over -- it is well past its peak, very much a spent force,
sputtering in fits and starts to a whimpering end. You may not know this yet. Or rather, you may know it but don't want to
acknowledge it until everyone else does, and that won't happen until the press, much of which also knows it, formally acknowledges
it. I know that the global warming scare is over but for the shouting because that's
what the polls show, at least those in the U.S., where unlike Canada the public is polled extensively on global warming.
Most Americans don't blame humans for climate change -- they consider global
warming to be a natural phenomenon. Even when the polls showed the public
believed man was responsible for global warming, the public didn't take the scare
seriously. When asked to rank global warming's importance compared to numerous
other concerns -- unemployment, trade, health care, poverty, crime, and education
among them -- global warming came in dead last. Fewer than 1% chose global
warming as scare-worthy. The informed members of the media read those polls and know the global warming scare is
over, too. Andrew Revkin, The New York Times reporter entrusted with the global warming scare beat, has for months lamented "the
public's waning interest in global warming." His colleague at the Washington Post, Andrew Freedman, does his best to revive public
fear, and to get politicians to act, by urging experts to up their hype so that the press will have scarier material to run with. The
entitled "Massive Estimates of Death are in Vogue for Copenhagen," Richard Cable of the BBC, until then the most stalwart of scaremongers, rattled off the global warnings du jour -they included a comparison of global warming to nuclear war and a report from the
former Secretary General of the UN, Kofi Annan, to the effect that "every
Arctic ice is returning, the Antarctic isn't shrinking, polar bear populations aren't
diminishing, hurricanes aren't becoming more extreme. The only thing that's scary about the science
is the frequency with which doomsayer data is hidden from public scrutiny, manipulated to mislead, or simply made up. None of this
matters anymore, I recently heard at the Global Business Forum in Banff, where a fellow panelist from the Pew Centre on Global
Climate Change told the audience that, while she couldn't dispute the claims I had made about the science being dubious, the rights
and wrongs in the global warming debate are no longer relevant. "The train has left the station," she cheerily told the business
audience, meaning that the debate is over, global warming regulations are coming in, and everyone in the room -- primarily
business movers and shakers from Western Canada --had better learn to adapt.
environmentalists' baseless allegations that the accumulation of man-made carbon dioxide, produced by cars,
power plants and other human activities, is causing dangerous global warming. Indeed, far from being a poisonous
episodes of global warming that occurred at the end of the last three ice ages. Interestingly, temperatures started
to rise during those warming periods well before the atmospheric carbon dioxide started to increase. In fact, the
carbon dioxide levels did not begin to rise until 400 to 1,000 years after the planet began to warm. Concludes
Dr. Idso, "Clearly, there is no way that these real-world observations can be construed to even hint at the possibility
that a significant increase in atmospheric carbon dioxide will necessarily lead to any global warming." On the other
conditions . In a study discussed in the journal Plant Ecology, a team of scientists subjected the Mojave
Desert evergreen shrub to three different concentrations of carbon dioxide - the current level of 360 ppm and at
biologists grew seedlings of three yucca plants in cooler greenhouse environments at the 360 ppm and 700 ppm
stimulated the trees' root growth by 23 percent. Expanded root systems help
tropical trees by increasing their ability to absorb water and nutrients.6 Bigger
trees, increased resistance to bad weather, improved agricultural productivity and a
boon to rainforests are just some of the many benefits that carbon dioxide bestows on
the environment. With little evidence that carbon dioxide triggers dangerous global warming but lots of
evidence showing how carbon dioxide helps the environment, environmentalists should be
extolling the virtues of this benign greenhouse gas.
future increases in
demand will have to come from a near doubling of productivity on a land area basis," and
they opine that "a large contribution will have to come from improved photosynthetic
conversion efficiency," for which they estimate that "at least a 50% improvement will
be required to double global production." The researchers' reason for focusing on
photosynthetic conversion efficiency derives from the experimentally-observed
facts that (1) increases in the atmosphere's CO2 concentration increase the
photosynthetic rates of nearly all plants, and that (2) those rate increases generally
lead to equivalent -- or only slightly smaller -- increases in plant productivity on a land area basis, thereby
providing a solid foundation for their enthusiasm in this regard. In their review of the matter, however,
they examine the prospects for boosting photosynthetic conversion efficiency in an
entirely different way: by doing it genetically and without increasing the air's
CO2 content.
So what is the likelihood that their goal can be reached via this approach?
Atmospheric CO2 enrichment has long been known to help earth's plants withstand the
debilitating effects of various environmental stresses, such as high temperature, excessive salinity levels and
deleterious air pollution, as well as the negative consequences of certain resource limitations, such as less than
their landmark work on experimental plots of subarctic heath located close to the Abisko Scientific Research Station
in Swedish Lapland (68.35N, 18.82E). The plots they studied were composed of open canopies of Betula
pubescens ssp. czerepanovii and dense dwarf-shrub layers containing scattered herbs and grasses. For a period
of five years, the scientists exposed the plots to factorial combinations of UV-B
radiation - ambient and that expected to result from a 15% stratospheric ozone depletion - and
atmospheric CO2 concentration - ambient (around 365 ppm) and enriched (around 600 ppm) - after
which they determined the amounts of microbial carbon (Cmic) and nitrogen (Nmic) in the soils of the plots. When
the plots were exposed to the enhanced UV-B radiation level expected to result from a 15% depletion of the planet's
stratospheric ozone layer, the researchers found that the amount of Cmic in the soil was reduced to only 37% of
what it was at the ambient UV-B level when the air's CO2 content was maintained at the ambient concentration.
When the UV-B increase was accompanied by the CO2 increase, however, not only was
there not a decrease in Cmic, there was an actual increase of fully 37%. The story with
These findings, in the words of Johnson et al., "may have far-reaching implications ... because the productivity of
many semi-natural ecosystems is limited by N (Ellenberg, 1988)." Hence, the 138% increase in soil microbial N
observed in this study to accompany a 15% reduction in stratospheric ozone and a concomitant 64% increase in
atmospheric CO2 concentration (experienced in going from 365 ppm to 600 ppm) should do wonders in enhancing
the input of plant litter to the soils of these ecosystems, which phenomenon represents the first half of the carbon
sequestration process, i.e., the carbon input stage.
Biodiversity
Types of hazards and needed products have many common elements throughout
the country but also unique regional differences. For example, the Gulf of Mexico,
the Caribbean and the southeastern U.S. experience more frequent and more
intense hurricanes than other regions of the country. Substantial improvements in
the NWS forecasts of storm intensity, track, and timing of passage are necessary
for timely evacuations of communities and offshore facilities in the path of the
storm, while avoiding evacuations that are unnecessary. Long-term plans for
these regions include close coordination of data products with the needs of
hurricane modelers, including providing information on ocean heat content via air-deployed sensors during hurricane approach and passage, and autonomous
underwater vehicles to monitor the water column. With the rapid loss of sea ice,
Arctic weather and ocean conditions are increasingly endangering Alaska Native
coastal communities. In a statewide assessment, flooding and erosion affect 184
out of 213 Native villages (GAO, 2003). This presents unique challenges in
forecasting and effectively communicating conditions for small communities in
isolated locations. These and many other types of regional differences must be
considered when tailoring and refining common information needs.
Regional Differences just make it too hard for the aff to solve in its
entirety
Rosenfeld 12 (Dr. Leslie Rosenfeld is currently a Research Associate Professor at the Naval Postgraduate School and an oceanographic
consultant. She has a Ph.D. in Physical Oceanography, and is a recognized expert on the California Current System, specializing in circulation over the
continental shelf. December 2012. Synthesis of Regional IOOS Build-out Plans for the Next Decade from the Integrated Ocean Observing System
Association.
http://www.ioosassociation.org/sites/nfra/files/documents/ioos_documents/regional/BOP%20Synthesis%20Final.pdf July 6, 2014.)
Water quality issues also exhibit a variety of regionally unique differences. For
example, water quality contamination in the Great Lakes is not only an issue for
wildlife, but also for 40 million consumers of public drinking water in the region. To
address this concern, GLOS will provide a decision support tool to track the
movement of drinking water contaminants, and will provide model output that
helps county officials manage drinking water intake systems to avoid
contamination. Tracking of water quality plumes and impacts also has unique
challenges in the Gulf of Mexico. Drainage from urban development, industries
and farmland comes into the Gulf from the enormous Mississippi River watershed,
covering 1,245,000 square miles and 41% of the 48 contiguous states of the U.S.
This drainage leads to hypoxic conditions (low dissolved oxygen content) in
summertime over the Texas-Louisiana shelf waters, and delivers large loads of
sediment and associated pollutants to nearshore environments such as recreational
waters and shellfish beds. Informed management decisions require a broad
distribution of platforms, sensors and derived products to track this plume and its
impacts through estuaries, nearshore and offshore habitats.
Effective utilization of the observing system requires translation of the data and
model outputs into targeted products and decision support tools for a variety of
users. The process of product development ultimately links management
decisions to desired products, to the information needed to produce the products,
to data and models required to create that information and finally to the essential
observing system requirements. Successful engagement with users and the
resultant product development is at the heart of ultimate success of the 10-year
build-out plan. The 27 key products identified as part of the build-out plans include
products that have already been successfully developed and utilized by one or
more RAs as well as many cases where development is either not underway or is in
the early stages. Successful completion of development for these 27 products for
all the regions, and completion of additional unique regional products will involve
multiple components, as outlined below. Sufficient funding needs to be identified to
carry the process for a given product need through to completion, to ensure
effective management and meeting of user expectations. Iterative two-way
engagement with users is required for effective product development, often based
on small group discussions with key decision makers and technical staff in the user
community. It may also include surveys, user advisory panels, targeted workshops,
and discussions at existing forums of the targeted users. This provides an
understanding of the users' objectives, the scope of the issue and the decision
processes they use. Those user decisions or management needs that could most
benefit from IOOS products can then be identified and prioritized, including
products involving data, processed information, visualizations, models and decision
support tools. Product development teams focused on specific priority needs
should be created, including RA staff, key partners and users. The RAs should act
as intermediaries that can translate between researchers and users, and should
stay engaged throughout the process. After fully understanding user needs, the
teams evaluate the variables, temporal and spatial scale and resolution, and data
quality required to meet those needs. They evaluate current coastal IOOS, or
other, products that could be used or modified, or new products that could be
developed to meet needs. This should include evaluation across all 11 RAs and
federal agencies to identify useful building blocks and approaches for products.
Gaps between the users' information requirements and current coastal IOOS
capabilities can then be identified, and, where possible, technical
means to fill them can be developed. This should include engaging partners who
may provide critical data or fulfill specific modeling needs, and pursuit of funding
sources to fill gaps, if not already obtained. In addition to addressing technical
issues, the team should identify any institutional issues among users that must be
overcome to fully utilize IOOS products, and where possible work with them to
develop approaches to overcome these barriers. Institutional issues include agency
and industry policies and practices, regulations and permit procedures,
perspectives and communication patterns, staff expertise, training and workload,
and/or governance structures that could impede full use of IOOS information in
decision-making. Addressing these issues will likely require engagement with
management representatives of the user agency or organization. Development of
the initial product includes software development to access and integrate data,
SQ Solves-Mapping
Data Insufficiency is being fixed in the Status Quo; the plan
isn't needed
Piotroski 14 (Jon Piotroski is known for his articles on the science development website. He has written numerous articles pertaining to
oceanography, climate change, and other environmental issues. 03/20/14 from SciDev. Tidal wave of ocean data leaves scientists swamped.
http://www.scidev.net/global/data/news/tidal-wave-of-ocean-data-leaves-scientists-swamped.html July 3, 2014)
independent advice on matters of science, technology, and medicine. They enlist committees of the nation's top scientists,
engineers, and other experts all of whom volunteer their time to study specific concerns. 08/24/2011. Ocean Exploration from the
National Academies. http://oceanleadership.org/wp-content/uploads/2009/08/Ocean_Exploration.pdf July 6, 2014).
U.S. and international efforts have made significant progress in recent years
toward establishing ocean observatories. The Integrated Ocean Observing System
(IOOS), the U.S. contribution to the international Global Ocean Observing System,
is designed as a large network that collects high-resolution data along the entire
U.S. coast. This information supports a wide range of operational services,
including weather forecasts, coastal warning systems, and the monitoring of algal
blooms. The Ocean Observatories Initiative (OOI), a National Science Foundation
program, will enable land-based exploration and monitoring of processes
throughout the seafloor, water column, and overlying atmosphere by real-time,
remote interactions with arrays of sensors, instruments, and autonomous
vehicles. The world's first underwater cabled observatory to span
an entire plate will be installed as part of a collaborative effort between the U.S.
Ocean Observatories Initiative (OOI) and U.S. and Canadian institutions. Fiber-optic
cables will extend high bandwidth and power to a network of hundreds of sensors
across, above, and below the seafloor, allowing in situ, interactive monitoring of
the ocean and seafloor for the next 20 to 30 years. Image courtesy University of
Washington Center for Environmental Visualization. Scientists, educators
and the public will have real-time access to the data and imagery being
collected, so that they can learn about events such as underwater volcanic
eruptions and anoxia events as they happen, even if there are no ships in the
area.
SQ Conservation Solves
Status Quo Solves -- US already implementing conservation programs
to prevent impacts
Tullo 14 (Michelle Tullo is a writer for various news magazines, in particular IPS. IPS is a reputable communication institution that specializes in
global news. U.S. Turns Attention to Ocean Conservation, Food Security from IPS (Inter Press Service) on 06/19/14. http://www.ipsnews.net/2014/06/u-sturns-attention-to-ocean-conservation-food-security/ July 7, 2014.)
No Impact-Species
No species snowball
Roger A Sedjo 2k, Sr. Fellow, Resources for the Future, Conserving Nature's
Biodiversity: insights from biology, ethics & economics, eds. Van Kooten, Bulte and
Sinclair, p 114
As a critical input into the existence of humans and of life on earth, biodiversity obviously has a very high value (at
least to humans). But, as with other resource questions, including public goods,
biodiversity is not an
question is that of how valuable to the life support function are species at the margin. While this, in principle, is an
diversity. Thus, as in the water-diamond paradox, the value of the marginal unit of biodiversity appears to be very
small.
No extinction
Easterbrook 2003
Gregg, senior fellow at the New Republic, We're All Gonna Die!
http://www.wired.com/wired/archive/11.07/doomsday.html?
pg=2&topic=&topic_set=
If we're talking about doomsday - the end of human civilization - many scenarios simply don't
measure up. A single nuclear bomb ignited by terrorists, for example, would be awful beyond words, but life
would go on. People and machines might converge in ways that you and I would find ghastly, but from the
Western nations in the postwar era, might grow so widespread that vast numbers of people would refuse to get out
of bed, a possibility that Petranek suggested in a doomsday talk at the Technology Entertainment Design
conference in 2002. But Marcel Proust, as miserable as he was, wrote Remembrance of Things Past while lying in
bed.
No Impact-Exaggerated
Environmental threats exaggerated
Gordon 95 - a professor of mineral economics at Pennsylvania State University
[Gordon, Richard, Ecorealism Exposed, Regulation, 1995,
http://www.cato.org/pubs/regulation/regv18n3/reg18n3-readings.html
Easterbrook's argument is that although environmental problems deserve attention,
the environmental movement has exaggerated the threats and ignored evidence of
improvement. His discontent causes him to adopt and incessantly employ the
pejoratively intended (and irritating) shorthand
"enviros" to describe the leading environmental organizations and their admirers. He proposes-and overuses-an
equally infelicitous alternative phrase, "ecorealism," that seems to mean that most environmental initiatives can be
justified by more moderate arguments. Given the mass, range, and defects of the book, any review of reasonable
length must be selective. Easterbrook's critique begins with an overview of environmentalism from a global
perspective. He then turns to a much longer (almost 500- page) survey of many specific environmental issues. The
overview section is a shorter, more devastating criticism, but it is also more speculative than the survey of specific
more penetrating criticism than typically appears in works expressing skepticism about environmentalism.
Earth Day 1970 provoked a torrent of apocalyptic predictions. "We have about five more years at the outside to do
something," ecologist Kenneth Watt declared to a Swarthmore College audience on April 19, 1970. Harvard biologist
George Wald estimated that "civilization will end within 15 or 30 years unless immediate action is taken against
problems facing mankind." "We are in an environmental crisis which threatens the survival of this nation, and of the
world as a suitable place of human habitation," wrote Washington University biologist Barry Commoner in the Earth
Day issue of the scholarly journal Environment. The day after Earth Day, even the staid New York Times editorial
page warned, "Man must stop pollution and conserve his resources, not merely to enhance existence but to save
the race from intolerable deterioration and possible extinction." Very Apocalypse Now. Three decades later, of
course, the world hasn't come to an end; if anything, the planet's ecological future has never looked so promising.
With half a billion people suiting up around the globe for Earth Day 2000, now is a good time to look back on the
predictions made at the first Earth Day and see how they've held up and what we can learn from them. The short
answer: The prophets of doom were not simply wrong, but spectacularly wrong. More
important, many contemporary environmental alarmists are similarly mistaken
when they continue to insist that the Earth's future remains an eco-tragedy that has
already entered its final act. Such doomsters not only fail to appreciate the huge
environmental gains made over the past 30 years, they ignore the simple fact that
increased wealth, population, and technological innovation don't degrade and
destroy the environment. Rather, such developments preserve and enrich the
environment. If it is impossible to predict fully the future, it is nonetheless possible
to learn from the past. And the best lesson we can learn from revisiting the discourse surrounding the very
first Earth Day is that passionate concern, however sincere, is no substitute for rational analysis.
No Impact-Resilient
No impact --- Ocean ecosystems are resilient
CO2 Science 2008
Marine Ecosystem Response to "Ocean Acidification" Due to Atmospheric CO2
Enrichment, Vogt, M., Steinke, M., Turner, S., Paulino, A., Meyerhofer, M., Riebesell,
U., LeQuere, C. and Liss, P. 2008. Dynamics of dimethylsulphoniopropionate and
dimethylsulphide under different CO2 concentrations during a mesocosm
experiment. Biogeosciences 5: 407-419,
http://www.co2science.org/articles/V11/N29/B2.php
Vogt et al. report that they detected no significant phytoplankton species shifts between treatments, and that " the
ecosystem composition, bacterial and phytoplankton abundances and productivity,
grazing rates and total grazer abundance and reproduction were not significantly
affected by CO2 induced effects," citing in support of this statement the work of Riebesell et al. (2007),
Riebesell et al. (2008), Egge et al. (2007), Paulino et al. (2007), Larsen et al. (2007), Suffrian et al. (2008) and
Carotenuto et al. (2007). In addition, they say that "while DMS stayed elevated in the treatments with elevated
CO2, we observed a steep decline in DMS concentration in the treatment with low CO2," i.e., the ambient CO2
millennia and perhaps longer. If nature has been interacting with genus Homo for thousands of years, then the
living things that made it to the present day may be ones whose genetic treasury renders them best suited to resist
human mischief. This does not ensure any creature will continue to survive any clash
with humankind. It does make survival more likely than doomsday orthodoxy asserts.
If nature's adjustment to the human presence began thousands of years ago, perhaps it will soon be complete. Far
from reeling helplessly before a human onslaught, nature may be on the verge of
reasserting itself. Nature still rules much more of the Earth than does genus Homo. To the statistical majority
No Impact-Long Timeframe
Their impact has a three hundred year timeframe
CO2 Science 2005
The Fate of Fish in a High-CO2 World, Ishimatsu, A., Hayashi, M., Lee, K.-S., Kikkawa,
T. and Kita, J. 2005. Physiological effects of fishes in a high-CO2 world. Journal of
Geophysical Research 110: 10.1029/2004JC002564,
http://www.co2science.org/articles/V8/N42/B3.php
Although this conclusion sounds dire indeed, it represents an egregious flight of the
imagination in terms of what could realistically be expected to happen anytime in
earth's future. Ishimatsu et al. report, for example, that "predicted future CO2
concentrations in the atmosphere are lower than the known lethal concentrations
for fish," noting that "the expected peak value is about 1.4 torr [just under 1850 ppm] around the
year 2300 according to Caldeira and Wickett (2003)." So just how far below the lethal CO2
concentration for fish is 1.4 torr? In the case of short-term exposures on the order of
a few days, the authors cite a number of studies that yield median lethal
concentrations ranging from 37 to 50 torr, which values are 26 and 36 times greater
than the maximum CO2 concentration expected some 300 years from now!
damage. It would take many generations to replace the lost forests, and the cultures that have been lost with
them can never be replaced, it warns. Many of the planet's species have already been lost or
condemned to extinction because of the slow response times of both the environment and
policy-makers; it is too late to preserve all the bio-diversity the planet had. Sounding the
alarm, the UNEP said the planet now faced full-scale emergencies on several fronts, including
these: -- it is probably too late to prevent global warming, a phenomenon whereby exhaust gases
and other emissions will raise the temperature of the planet and wreak climate change. Indeed, many of the
targets to reduce or stabilise emissions will not be met, the report says. -- urban air pollution
problems are reaching crisis dimensions in the developing world's mega-cities, inflicting damage to the health of
their inhabitants. -- the seas are being grossly over-exploited and even with strenuous
efforts will take a long time to recover.
No Solvency-Overpopulation swamps
Population growth makes biodiversity loss inevitable
Gaston 5 [Kevin J. Gaston Biodiversity and Macroecology Group, Department of
Animal and Plant Sciences, University of Sheffield. Progress in Physical Geography
29, 2 (2005) pp. 239247. Biodiversity and extinction: species and people
http://www.epa.gov/ncer/biodiversity/pubs/ppg_vol29_239.pdf //jweideman]
The human population is predicted to grow by 2 to 4 billion people by 2050 (United
Nations, 2001). While it took until about 1800 to attain a global population of 1 billion people, a medium
projection is that it may take just 13 to 14 years to add another billion to the present total (Cohen, 2003). All else remaining equal, which it seldom does, a number of predictions
would seem to follow from the work that has been conducted to date on the relationships between human densities and the numbers of native species, numbers or
proportions of threatened species, and the numbers or proportions of introduced
species. First, the spatial scale at which relationships between overall numbers of native species and human
density become hump-shaped or at least gain marked negative phases seems likely to increase, even when species
on the basis of human population growth alone. Such aggregate estimates provide no indication of precisely what
this is likely to do for the overall propor- tion of species that are globally threatened with extinction, but these
increases can only serve to increase the 12% of bird species and the 23% of mammals currently listed as such
(IUCN, 2003). Likewise, the proportion of species that have become globally extinct will
increase.
No Impact-Tech Solves
Tech solves the impact
Stossel 2007
John, Investigative reporter for Fox news, Environmental Alarmists Have It
Backwards
http://www.realclearpolitics.com/articles/2007/04/how_about_economic_progress_da
.html
Watching the media coverage, you'd think that the earth was in imminent danger -- that
human life itself was on the verge of extinction. Technology is fingered as the perp. Nothing
could be further from the truth. John Semmens of Arizona's Laissez Faire Institute points out that Earth
Day misses an important point. In the April issue of The Freeman magazine, Semmens says the environmental
movement overlooks how hospitable the earth has become -- thanks to technology. "The environmental
alarmists have it backwards. If anything imperils the earth it is ignorant obstruction
of science and progress. ... That technology provides the best option for serving
human wants and conserving the environment should be evident in the progress
made in environmental improvement in the United States. Virtually every measure
shows that pollution is headed downward and that nature is making a comeback."
(Carbon dioxide excepted, if it is really a pollutant.) Semmens describes his visit to historic Lexington and Concord
in Massachusetts, an area "lush with trees and greenery." It wasn't always that way. In 1775, the land was cleared
so it could be farmed. Today, technology makes farmers so efficient that only a fraction of the land is needed to
produce much more food. As a result, "Massachusetts farmland has been allowed to revert back to forest." Human
ingenuity and technology not only raised living standards, but also restored environmental amenities. How about a
day to celebrate that?
No Impact-Acidification
No ocean acidification problem its natural
Idso et al 2009
Sherwood, founder and former President of the Center for the Study of Carbon
Dioxide and Global Change and currently serves as Chairman of the Center's board
of directors, The Ocean Acidification Fiction Volume 12, Number 22: 3 June 2009
There is considerable current concern that the ongoing rise in the air's CO2 content
is causing a significant drop in the pH of the world's oceans in response to their absorption of a
large fraction of each year's anthropogenic CO2 emissions. It has been estimated, for example, that the globe's
seawater has been acidified (actually made less basic) by about 0.1 pH unit relative to what it was in pre-industrial
times; and model calculations imply an additional 0.7-unit drop by the year 2300 (Caldeira and Wickett, 2003),
which decline is hypothesized to cause great harm to calcifying marine life such as corals. But just how valid are
modern Porites corals recovered from the South China Sea, the nine researchers employed 14C dating using the
liquid scintillation counting method, along with positive thermal ionization mass spectrometry to generate high
precision δ11B (boron) data, from which they reconstructed the paleo-pH record of the past 7000 years that is
the Industrial Revolution. As for the prior portion of the record, Liu et al. note that there is also "no correlation
between the atmospheric CO2 concentration record from Antarctica ice cores and δ11B-reconstructed paleo-pH
over the mid-late Holocene up to the Industrial Revolution." Further enlightenment comes from the earlier work of
Pelejero et al. (2005), who developed a more refined history of seawater pH spanning the period 1708-1988
(depicted in the figure below), based on δ11B data obtained from a massive Porites coral from Flinders Reef in the
western Coral Sea of the southwestern Pacific. These researchers also found that "there is no notable trend
toward lower δ11B values." Instead, they discovered that "the dominant feature of
the coral δ11B record is a clear interdecadal oscillation of pH, with δ11B values ranging
between 23 and 25 per mil (7.9 and 8.2 pH units)," which they say "is synchronous with the
Interdecadal Pacific Oscillation." Going one step further, Pelejero et al. also compared their results with
coral extension and calcification rates obtained by Lough and Barnes (1997) over the same 1708-1988 time period;
and as best we can determine from their graphical representations of these two coral growth parameters, extension
rates over the last 50 years of this period were about 12% greater than they were over the first 50 years, while
calcification rates were approximately 13% greater over the last 50 years. Most recently, Wei et al. (2009) derived
the pH history of Arlington Reef (off the north-east coast of Australia) that is depicted in the figure below. As can be
it would appear that the catastrophe conjured up by the world's climate alarmists is
but a wonderful work of fiction.
susceptibility for different ecosystems to OA." What was done: "To determine if photosynthetic CO2 uptake
associated with seagrass beds has the potential to create OA refugia," as they describe it,
Manzello et al. repeatedly measured carbonate chemistry across an inshore-to-offshore gradient in the upper,
middle and lower Florida Reef Tract over a two-year period. What was learned: During times of heightened
oceanic vegetative productivity, the five U.S. researchers found "there is a net uptake of total
CO2 which increases aragonite saturation state (Ωarag) values on inshore patch reefs of the upper Florida Reef
Tract," and they say that "these waters can exhibit greater Ωarag than what has been modeled for the tropical
surface ocean during preindustrial times, with mean Ωarag values in spring equaling 4.69 ± 0.10." At the same
time, however, they report that Ωarag values on offshore reefs "generally represent oceanic carbonate chemistries
consistent with present day tropical surface ocean conditions." What it means: Manzello et al. hypothesize that the
pattern described above "is caused by the photosynthetic uptake of total CO2 mainly by
seagrasses and, to a lesser extent, macroalgae in the inshore waters of the Florida Reef Tract." And they
therefore conclude that these inshore reef habitats are "potential acidification
refugia that are defined not only in a spatial sense, but also in time, coinciding with seasonal productivity
dynamics," which further implies that "coral reefs located within or immediately downstream
of seagrass beds may find refuge from ocean acidification." And in further support of this
conclusion, they cite the work of Palacios and Zimmerman (2007), which they describe as indicating that
"seagrasses
and coordinates the scientific and outreach activities for the Center. He has over 20 years of experience in climate research and public outreach. January
6, 2010. Ocean Acidification: Another Failing Scare Story from Master Resource. http://www.masterresource.org/2010/01/ocean-acidification-anotherfailing-scare-story/ July 6, 2014.)
The folks over at the Science and Public Policy Institute have taken it upon
themselves to look a bit more closely into the ocean acidification story and see just
what the scientific literature has to say about how organisms are actually
responding to changes in ocean pH rather than just talk about how they may
respond. What SPPI finds is neither newsworthy nor catastrophic, simply that, as
has always been the case, the aquatic organisms of the world's oceans are very
adaptable and are able to respond to environmental changes in such a way as to
avoid large-scale detrimental impacts.
First, because it has not done so before. During the Cambrian era, 550 million years
ago, there was 20 times as much CO2 in the atmosphere as there is today: yet
that is when the calcite corals first achieved algal symbiosis. During the Jurassic
era, 175 million years ago, there was again 20 times as much CO2 as there is
today: yet that is when the delicate aragonite corals first came into being.
Secondly, ocean acidification, as a notion, suffers from the same problem of scale as
global warming. Just as the doubling of CO2 concentration expected this century
will scarcely change global mean surface temperature because there is so little
CO2 in the atmosphere in the first place, so it will scarcely change the acid-base
balance of the ocean, because there is already 70 times as much CO2 in solution
in the oceans as there is in the atmosphere. Even if all of the additional CO2 we
emit were to end up not in the atmosphere (where it might in theory cause a
very little warming) but in the ocean (where it would cause none), the quantity of
CO2 in the oceans would rise by little more than 1%, a trivial and entirely harmless
change. Thirdly, to imagine that CO2 causes ocean acidification is to ignore the
elementary chemistry of bicarbonate ions. Quantitatively, CO2 is only the seventh-largest of the substances in the oceans that could in theory alter the acid-base
balance, so that in any event its effect on that balance would be minuscule.
Qualitatively, however, CO2 is different from all the other substances in that it acts
as the buffering mechanism for all of them, so that it does not itself alter the acid-base balance of the oceans at all. Fourthly, as Professor Ian Plimer points out in
his excellent book Heaven and Earth (Quartet, London, 2009), the oceans slosh
around over vast acreages of rock, and rocks are pronouncedly alkaline. Seen in a
geological perspective, therefore, acidification of the oceans is impossible.
[NOAA ocean acidification program. March 12, DATA COLLECTION AND MANAGEMENT
http://oceanacidification.noaa.gov/AreasofFocus/DataCollectionandManagement.aspx//jweideman]
OAP scientists collect a variety of data to understand changing ocean chemistry and
its impacts on marine organisms and ecosystems. The National Oceanographic Data
Center (NODC) serves as the NOAA Ocean Acidification data management focal
point through its Ocean Acidification Data Stewardship (OADS) project. OA data will
be archived at and available from an ocean acidification data stewardship system at
NODC. Ocean acidification data can generally be classified as either physio-chemical or biological. Physio-chemical parameters
include, among others, pCO2 (measurement of carbon dioxide gas both in the air and dissolved in seawater), pH, total alkalinity,
shorelines. These collections can be instantaneous or from gear placed to collect organisms over time for later retrieval and
analysis. During laboratory experiments on marine organisms, scientists can measure calcification rates, shell growth, behavior,
otolith growth (for fish), metabolic rate, reproduction, among other parameters. Original datasets from all OAP research and the
No Impact-Coral Reefs
Coral Reefs Resilient as long as there is some Biodiversity
how much doesn't matter
Grimsditch and Salm 06 (Gabriel D Grimsditch holds the post of Programme Officer for oceans and climate change for
the UNEP Marine and Coastal Ecosystems Branch in Nairobi, Kenya. Before joining UNEP, Gabriel worked for the IUCN Global Marine Programme where he
coordinated the IUCN Climate Change and Coral Reefs Working Group. Rod Salm has his dissertation in marine ecology. He has 35 years experience in
international marine conservation and ecotourism. 2006. Coral Reef Resilience and Resistance to Bleaching from The World Conservation Union.
http://icriforum.org/sites/default/files/2006-042.pdf July 6, 2014)
The main ecological factor that affects coral reef resilience to bleaching is a
balanced biological and functional diversity (see Glossary) within the coral reef.
It is essential to have a balanced ecological community with sufficient species
interactions for coral reefs to recover from disturbances, and this applies not only
to bleaching but to other disturbances as well (Nyström and Folke, 2001). An
especially important functional group (see Glossary) for coral reef resilience is
grazing animals, comprising herbivorous fish and sea urchins, among others. They
enhance coral reef resilience by preventing phase shifts from coral-dominated
reefs to algal-dominated reefs by keeping algal growth in check and allowing the
settlement of slower-growing coral recruits rather than faster-growing algae. Their
importance is highlighted in a classic example from Jamaica, where the
overfishing of predators and competitors (herbivorous fish) of the black-spined sea
urchin Diadema antillarum led to an explosion in its population, and thus to a
reduction of the diversity within the herbivorous functional group. Consequently,
grazing by D. antillarum became the primary mechanism for algal control and
crucial to coral reef recovery after Hurricane Allen in 1981. However, in 1983-1984, a pathogen killed off 95-99% of its population and enabled a phase shift as
macroalgae out-competed coral. The extent to which this phase shift is
irreversible is still unclear (Nyström et al., 2000).
Overfishing
Advantage Defense
"Catch shares" are the latest fix-all solution to the world's overfishing crisis, and that's
too bad. The idea was recently promoted by Gregg Easterbrook here in The Atlantic, and NOAA Administrator
Jane Lubchenco has committed her agency to "transitioning to catch shares" as a solution to overfishing.
Although it's tempting to think that property rights will transform fishing into an
industry focused on long-term sustainability, catch-shares are really just a retread of the
same "markets will fix everything" thinking that has been thoroughly discredited. Catch-
shares allocate portions of the total catch within a fishery and give fishers a property-like right to a share of the fish.
simply
fish. The catch-share approach tries to solve this problem by creating a permit trading
market. The thinking is that the permits will consolidate in the hands of "rational and
efficient" producers who will voluntarily forego using a portion of their shares. That
won't happen. The recent financial crisis ought to give anyone pause about the
claim that markets inherently promote choices that are in everyone's long-term best
interest. Regulators have sharply defined territorial jurisdictions, but fish do
not cooperate by staying in one place. Fish move between countries' Exclusive
Economic Zones (waters under the effective control of a coastal state) and the high seas. Boats
on the high seas can undermine the most careful control of a single state (Spain and
Canada almost went to war over this in the 1990s), even if
each boat takes only its allotted catch-share. Catch-shares also fail to
address bycatch , the dirty little secret of the fishing industry. By most estimates, at least 40
percent of every catch is discarded as bycatch --fish other than the target species, including at
least half a million endangered marine mammals and an unknown number of endangered sea turtles.
introducing property rights on the high seas would not be sufficient to address
problems of scarcity or to inform institutional design. A final word justifying this presumption is in order.
Advocates of property rights suggest that they would serve as a corrective force for the
negative impacts of overfishing. Essentially, a property rights solution would involve the
creation of a market for individual transferable quotas (ITQs). After a TAC has been set, either an RFMO like
ICCAT could auction off quotas to interested rights holders, including corporations and states, or rights could
be granted to individual fishermen directly who could then swap and sell their rights to
others. Property rights to natural resources have never been introduced on the high
seas, largely due to the difficulties of regulating species outside of national
jurisdictions. When fisheries fall within one nation's enforcement jurisdiction, a public authority can institute a
system of property rights based on catch or fishing effort. Thus a state can create, allocate, and enforce the
allocation of property rights to fish on the basis of ITQs. This might involve setting a limit on the amount of fish that
could be caught, and then auctioning off the fishing rights to those who want to purchase them. On the high
seas, however, there is much more competition for the resource, and the role of property
rights becomes considerably more complex. Moreover, the costs of implementing,
maintaining, and enforcing a property rights solution become higher as the
competition for resources increases.
Enforcement is Key Though... The crucial variable in all this is enforcement. Though nations can control
territorial waters--if they choose to, Libya for example actively turns a blind eye to overfishing in
its waters, as do a number of other nations--in the open ocean things become much more difficult.
Even with scientifically determined quotas, marine reserves and catch share systems in place, if
the rules are not enforced, it all falls apart. Going back to the bluefin example, if the quota is 15,000
tons a year but the actual take, due to utter lack of enforcement, is 60,000 tons, you simply can't manage
the fishery well. And when, as is the case with certain fisheries, you have organized crime syndicates
involved, it all gets that much more complicated.
the US, China, South Korea, Japan, Indonesia and Taiwan are responsible for 80%
of bigeye tuna caught each year. The remaining 20% is captured by vessels flagged to smaller fishing
nations. Some of the smallest nations depend on their fisheries for basic survival. In 2012, 2.6m
tonnes of tuna were extracted from the Pacific, 60% of the global total. Scientists are in agreement
that tuna stocks are being depleted at an alarming rate. Some species are practically on the brink, with bluefin
tuna populations currently just 4% of what they were before industrial fishing commenced. Yet, the
organisation that has been entrusted by the international community to be the steward of
tuna fisheries in the Pacific ocean, the Western and Central Pacific Fisheries Commission, has failed to
protect the fish for yet another year. In spite of clear scientific advice regarding the need
to reduce tuna quotas, the large fishing nations that currently haul the most have point-blank
refused to reduce their quota. Small Pacific nations have pointlessly warned of the
Overexploitation of marine living resources is a problem of both science and governance (Zha
2001: 577). As the example of Chinese domestic efforts in fisheries management shows,
many unilateral measures stay inefficient in view of the South China Sea ecosystem as a whole. On the
one hand, institutional weakness and poor implementation of regulations at the
domestic level impair any regional effort for sustainable resource management; on
the other hand, diverging national, regional and international interests lead to different objectives and outcomes.
The controversies over jurisdictional boundaries in many parts of the South China Sea and consequentially the
absence of property rights over maritime living resources has created the image of an open access resource pool. In
the high sea areas must therefore be compatible with each other.
Food prices have soared because agricultural production has not kept up with the
rising demand of cereals for food consumption, cattle feeding and biofuel production. For the
first time in decades, worldwide scarcity of food is becoming a problem . Global cereal stocks are
falling rapidly. Some predict that US wheat stocks will reach a 60-year low in 2008. Population growth in poor
countries is boosting the grain demand for food consumption . But cereal demand for the feeding
of cattle is increasing even more rapidly as consumers in both rich countries and fast growing economies are eating
more dairy and meat. The most important factor behind the sudden spike in food prices,
however, is the rapidly growing demand for biofuels, particularly in the EU and the US. Of total
corn production, 12% is used to make biofuel, and that share is growing fast. Concerns about
global climate change and soaring energy prices have boosted demand for biofuels. Until
recently, few voices critical of biofuels were heard, but now an increasing number of policy makers and analysts
strongly oppose converting food into fuel. In addition to directly threatening food security, there are alarming
examples of how biofuel production causes environmental harm and speeds up global warming. US ethanol
production uses large amounts of fuel, fertilizer, pesticides and water, and most analysts consider its environmental
impact quite negative. And in Indonesia, Malaysia and Brazil, companies have slashed thousands of hectares of rain
forest. Financial investors speculating in commodity prices aggravate prices and increase volatility in the market.
Impact Defense
Squo solves overfishing
NOAA 11 (National Oceanic and Atmospheric Administration, The Road to End Overfishing: 35 Years of
Magnuson Act, http://www.nmfs.noaa.gov/stories/2011/20110411roadendoverfishing.htm, 2011)
I want to acknowledge and highlight the 35th anniversary of the Magnuson-Stevens Fishery Conservation and
Management Act. Simply called the Magnuson Act, this law, its regional framework and goal of sustainability, has
proven to be a visionary force in natural resource management - both domestically and internationally. The
Magnuson Act is, and will continue to be, a key driver for NOAA as we deliver on our nation's commitment to ocean
stewardship, sustainable fisheries, and healthy marine ecosystems. Because of the Magnuson Act, the U.S. is
on track to end overfishing in federally-managed fisheries, rebuild stocks, and ensure
conservation and sustainable use of our ocean resources. Fisheries harvested in the
United States are scientifically monitored, regionally managed and legally enforced
under 10 strict national standards of sustainability. This anniversary year marks a critical turning point
in the Act's history. By the end of 2011, we are on track to have an annual catch limit and
accountability measures in place for all 528 federally-managed fish stocks and
complexes. The dynamic, science-based management process envisioned by Congress is now in place, the
rebuilding of our fisheries is underway, and we are beginning to see real benefits for fishermen, fishing communities
and our commercial and recreational fishing industries.
sometimes closed to fishing, to allow stocks to recover. Others have been designated as marine reserves akin to
national parks. And some of the technology that fishermen use to find their prey is now used by
inspectors to monitor the whereabouts of the hunters themselves. Most of these measures
have helped, as the recovery of stocks in various places has shown. Striped bass and
North Atlantic swordfish have returned along America's East Coast, for instance. Halibut have
made a comeback in Alaska. Haddock, if not cod, have begun to recover in Georges Bank off
Maine. And herring come and go off the coasts of Scotland. Those who doubt the value of government intervention
have only to look at the waters off Somalia, a country that has been devoid of any government worth the name
since 1991. The ensuing free-for-all has devastated the coastal stocks, ruining the livelihoods of local fishermen and
encouraging them, it seems, to take up piracy instead.
said Craig Idso, Ph.D., author of the 2009 book CO2, Global
Warming and Coral Reefs. The phenomenon of CO2-induced
ocean acidification appears to be no different.
Science Leadership
a productivity-driven, global competitiveness agenda, and also help explain the steady rise in our standard of living.
U.S. manufacturing accounts for 35 percent of value added in all of the world's high
technology production, and enjoys a trade surplus in revenues from royalties from
production processes and technology. U.S. inventors still account for more than one-half of all patents granted in the United States (Figure 26), and the nation outpaces its rivals in
terms of industrial research and development. Technology has aided U.S. manufacturers to use less energy per unit
or dollar of production (Figure 30) and to lead all other sectors in reducing CO2 emissions in the last two decades.
For this reason, it is important to consider innovation in terms of processes as well as technologies.
Even when government was not the inventor, it was often the facilitator. One example: semiconductors. As a study
by the Breakthrough Institute notes, after the microchip was invented in 1958 by an engineer at Texas Instruments,
"the federal government bought virtually every microchip firms could produce." This
was particularly true of the Air Force, which needed chips to guide the new Minuteman II missiles, and NASA, which
required advanced chips for the on-board guidance computers on its Saturn rockets. " NASA
bought so many
[microchips] that manufacturers were able to achieve huge improvements in the
production process, so much so, in fact, that the price of the Apollo microchip fell
from $1,000 per unit to between $20 and $30 per unit in the span of a couple
years."
No Impact-Soft Power
Soft power doesn't solve everything, turns it
Quinn 2011 Adam Quinn, Lecturer in International Studies at the University of
Birmingham, previously worked at the University of Leicester and the University of
Westminster, focuses on the role of national history and ideology in shaping US
grand strategy, The art of declining politely: Obama's prudent presidency and the
waning of American power, International Affairs 87:4 (2011) 803-824
http://www.chathamhouse.org/sites/default/files/87_4quinn.pdf
Nevertheless, this qualification demands two further qualifications of its own. The first is that if we consider soft
power as a national attribute then it is difficult to separate it with confidence from
the economic and military dimensions of power. Is it really likely that America's ideological and cultural
influence will endure undiminished in the absence of the platform of military and economic primacy upon which it has been
constructed? It may be overstatement to suggest that, borrowing Marxist terminology, hard power represents the base and soft
No Impact-Hegemony
Hegemony fails: policy has shifted from helping other nations
to focusing on America's domestic interests; increasing
wealth, military power and influence don't solve
Kagan 5/26 Robert Kagan, PhD in American History, senior fellow at the
Brookings Institution and a member of the Council on Foreign Relations, 5/26/14,
Superpowers don't get to retire,
http://www.newrepublic.com/article/117859/allure-normalcy-what-america-stillowes-world
Almost 70 years ago, a new world order was born from the rubble of World War II,
built by and around the power of the United States. Today that world order shows
signs of cracking, and perhaps even collapsing. The Russia-Ukraine and Syria crises,
and the world's tepid response, the general upheaval in the greater Middle East and
North Africa, the growing nationalist and great-power tensions in East Asia, the
worldwide advance of autocracy and retreat of democracy: taken individually, these
problems are neither unprecedented nor unmanageable. But collectively they are a sign that
something is changing, and perhaps more quickly than we may imagine. They may
signal a transition into a different world order or into a world disorder of a kind not
seen since the 1930s. If a breakdown in the world order that America made is
occurring, it is not because America's power is declining; America's wealth, power,
and potential influence remain adequate to meet the present challenges. It is not
because the world has become more complex and intractable; the world has
always been complex and intractable. And it is not simply war-weariness. Strangely
enough, it is an intellectual problem, a question of identity and purpose. Many Americans
and their political leaders in both parties, including President Obama, have either
forgotten or rejected the assumptions that undergirded American foreign policy for
the past seven decades. In particular, American foreign policy may be moving away
from the sense of global responsibility that equated American interests with the
interests of many others around the world and back toward the defense of narrower,
more parochial national interests. This is sometimes called isolationism, but that is not the
right word. It may be more correctly described as a search for normalcy. At the core
of American unease is a desire to shed the unusual burdens of responsibility that
previous generations of Americans took on in World War II and throughout the cold
war and to return to being a more normal kind of nation, more attuned to its own needs and
less to those of the wider world. If this is indeed what a majority of Americans seek today, then the current
period of retrenchment will not be a temporary pause before an inevitable return to
global activism. It will mark a new phase in the evolution of America's foreign policy.
And because America's role in shaping the world order has been so unusually
powerful and pervasive, it will also begin a new phase in the international system,
one that promises not to be marginally different but radically different from what we
have known these past 70 years. Unless Americans can be led back to an understanding of their
enlightened self-interest, to see again how their fate is entangled with that of the world, then the prospects
for a peaceful twenty-first century in which Americans and American principles can
thrive will be bleak.
the Advancement of Science, "Science and Technology Agreements as Tools for Science
Diplomacy: A U.S. Case Study," Science & Diplomacy, Vol. 1, No. 4 (December 2012).
http://www.sciencediplomacy.org/article/2012/science-and-technology-agreements-tools-forscience-diplomacy.
National Institutes of Health). The focus of this paper is on bilateral, government-wide agreements, also referred to
as umbrella agreements, framework agreements, or simply S&T agreements. Scientific cooperation
between the United States and other countries is undertaken using a variety of
arrangements, from informal scientist-to-scientist collaborations to cooperation
between research institutions to formal agreements between technical agencies.
While S&T agreements are not necessary for these types of interactions, other
nations often seek S&T agreements with the United States because they carry the
weight of being legally binding and having been negotiated on behalf of the U.S.
government. These agreements endeavor to establish a framework to foster
international science collaboration while protecting intellectual property,
establishing benefit sharing, and preventing taxation of research equipment. The
contents of an S&T agreement usually include common features such as types of cooperative activities and ways to
encourage access to facilities and personnel, as well as clarification that some information or equipment, such as
those requiring protection for national security reasons, are not covered under the agreement. There are
three areas where the agreement text often varies: (1) the preamble, which is not legally binding
and is often used to highlight the public motivations behind the agreement; (2) the intellectual property
rights annex, which delineates how the parties share and exploit intellectual
property generated; and (3) the implementation plan, including whether to establish a joint committee that
International Climate Change Conference sponsored by the Heartland Institute, held last week in Chicago. I
attended, and served as one of the speakers, talking about "The Economic Implications of High Cost Energy." The
expanded regulatory and taxation powers for government bodies, or government body wannabees, such as the
response. The Heartland Institute has effectively become the international headquarters of the climate realists, an
analog to the UN's Intergovernmental Panel on Climate Change (IPCC). It has achieved that status through these
international climate conferences, and the publication of its Climate Change Reconsidered volumes, produced in
conjunction with the Nongovernmental International Panel on Climate Change (NIPCC). Those Climate Change
Reconsidered volumes are an equivalently thorough scientific rebuttal to the irregular Assessment Reports of the
UN's IPCC. You can ask any advocate of human-caused catastrophic global warming what their response is to
Climate Change Reconsidered. If they have none, they are not qualified to discuss the issue intelligently. Check out
the 20th century temperature record, and you will find that its up-and-down pattern does
not follow the industrial revolution's upward march of atmospheric carbon dioxide (CO2),
which is the supposed central culprit for man-caused global warming (and has been much, much higher in the past).
late 1990s, similar to the Atlantic Multidecadal Oscillation (AMO). In 2000, the UN's IPCC predicted that global
temperatures would rise by 1 degree Celsius by 2010. Was that based on climate science, or political science to
scare the public into accepting costly anti-industrial regulations and taxes? Don
Easterbrook, Professor Emeritus of Geology at Western Washington University, knew the answer. He publicly
predicted in 2000 that global temperatures would decline by 2010. He made that prediction
because he knew the PDO had turned cold in 1999, something the political scientists at the UN's IPCC did not know
or did not think significant. Well, the results are in, and the winner is ... Don Easterbrook.
Easterbrook also spoke at the Heartland conference, with a presentation entitled "Are Forecasts of a 20-Year Cooling
Trend Credible?" Watch that online and you will see how scientists are supposed to talk: cool, rational, logical
over a degree, and the gap was widening. That's a big miss for a forecast just 10 years away,
when the same folks expect us to take seriously their predictions for 100 years in the future. Howard Hayden,
Professor of Physics Emeritus at the University of Connecticut showed in his presentation at the conference that
based on the historical record a doubling of CO2 could be expected to produce a 2 degree C temperature increase.
Such a doubling would take most of this century, and the temperature impact of increased concentrations of CO2
declines logarithmically. You can see Hayden's presentation online as well. Because PDO cycles last 25 to 30 years,
Easterbrook expects the cooling trend to continue for another 2 decades or so.
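Hayden's logarithmic point above can be sketched numerically. This is an illustration only: the 2 degrees C per doubling figure is the one attributed to Hayden in the card, while the 280 ppm pre-industrial baseline is my assumed round number.

```python
import math

# Illustrative sketch of a logarithmic CO2-temperature relationship:
# each doubling of concentration adds the same fixed increment, so the
# warming per additional ppm shrinks as concentration rises.
SENSITIVITY_PER_DOUBLING = 2.0  # degrees C, figure attributed to Hayden
BASELINE_PPM = 280.0            # assumed pre-industrial concentration

def warming_from_co2(ppm: float) -> float:
    """Temperature change (degrees C) implied by a logarithmic response."""
    return SENSITIVITY_PER_DOUBLING * math.log(ppm / BASELINE_PPM, 2)

print(round(warming_from_co2(560), 6))   # one doubling  -> 2.0
print(round(warming_from_co2(1120), 6))  # two doublings -> 4.0, not 8.0
```

The second print is the whole point of the "declines logarithmically" claim: quadrupling the concentration only doubles, rather than quadruples, the implied temperature change.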
Easterbrook, in fact, documents 40 such alternating periods of warming and cooling
over the past 500 years, with similar data going back 15,000 years. He further expects the flipping of the
AMO to add to the current downward trend. But that is not all. We are also currently experiencing a
surprisingly long period with very low sunspot activity. That is associated in the earth's
history with even lower, colder temperatures. The pattern was seen during a period known as the
Dalton Minimum from 1790 to 1830, which saw temperature readings decline by 2 degrees in a 20 year period, and
the noted Year Without A Summer in 1816 (which may have had other contributing short term causes). Even worse
was the period known as the Maunder Minimum from 1645 to 1715, which saw only about 50 sunspots during one
30 year period within the cycle, compared to a typical 40,000 to 50,000 sunspots during such periods in modern
times. The Maunder Minimum coincided with the coldest part of the Little Ice Age, which the earth suffered from
about 1350 to 1850. The Maunder Minimum saw sharply reduced agricultural output, and widespread human
the cooling from the late 1940s to late 1970s? Or will the paucity of sunspots drive us all the way down to the
Dalton Minimum, or even the Maunder Minimum? He says it is impossible to know now. But based on experience, he
will probably know before the UN and its politicized IPCC.
do marine planning are guaranteeing that the public and marine stakeholders will shape these decisions early on, promoting better
outcomes for everyone. Regions can define what they want to address and how they do so, in ways that reflect their unique
in the development of the National Ocean Policy and its Implementation Plan. We will update it as needed to reflect the lessons
learned in regions and ensure it continues to be a useful guide for successful, collaborative planning.
A perennial hand-wringing topic among policy geeks is America's decline in math and science proficiency. This
sentiment has been expressed the entire 30 years I've worked on space science and exploration: new generations
don't care about space, can't do math and science, can't think properly and the country's going to hell in a hand
basket. Complaint about the decline in our ability is something passed from one generation to the next. Today's
youth are being corrupted by iPods, Facebook and hip-hop; when I was a kid, it was Frisbees, MAD magazine and the
STEM). In this country, most Ph.D.s in science and technology are now foreign-born (these reports don't mention
back to the days of Sputnik, the ping that shocked and alarmed the country. This event prompted loud public
cries for somebody to do something about the educational decline of America's youth (corrupted then by '57
Chevys, hula-hoops and Elvis). Congress responded in the usual manner: they threw money at the problem. The
National Defense Education Act of 1958 (interesting wording, that) created a huge infrastructure largely dependent
upon federal funding that directly answered the Soviet challenge in space. The number of science graduates
exploded over the next couple of decades, leading many to conclude that 1) the excitement generated by the
Apollo program inspired these students to aspire to careers in science; 2) huge amounts of federal money can solve
any problem. Although Apollo is now a distant memory (or for many, a distant, historical event), the confluence of space and education is taken
as a given by many in the business. NASA spends a designated fraction of their budget on a process
called EPO (Education and Public Outreach), designed to inform and inspire the next generation of scientists and
explorers. As you might expect, these efforts range from the interesting and innovative to the embarrassing
(though well intentioned). A perception has emerged that the problem lies not with the
methodology, but with the product: because we are not doing anything in space
that is exciting, we aren't producing quality scientists and engineers. This may well
account for what sensible students with an eye toward putting food on the table after they graduate choose to
study. Then too, perhaps there are too many in the field already. But with effort, excellence will find productive
work; self-esteem and entitlement will not cut it in the long run, no matter what your field of endeavor. Recently, I
had the opportunity to directly interact with students at two ends of the education
pipeline and found the experience highly encouraging. In the first case, the father of a local second-grader
asked if his son could visit and interview me. The boy had chosen to write a semester paper (in second grade??)
about the Moon. The child was both well spoken and well informed. He asked relevant and very
intelligent questions. What is the value of the Moon? What do we want to know about it and how do we find these
things out? Can people live there? I found his questions and understanding of the problems and benefits of
exploring the Moon to be at a very high level (much higher than many adult reporters who call me). Then he asked
me an unexpected question: How fast does the Moon travel through space? After initially drawing a complete blank,
I suggested that we find out together and went on to calculate it on the spot. We concluded that the Moon flies
around the Earth at over 2200 miles per hour (much faster than he traveled down the freeway to visit me). He was
delighted by this episode of science in action. I was delighted to be challenged by his understanding and his
building a stable space program that will give us long-term benefits a step-wise, incremental program that
gradually increases the extent of our reach into space. Compared to the current policy chaos, it just might be
inspirational too.
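The second-grader's Moon-speed calculation recounted above is easy to reproduce. A quick sketch, using standard round numbers I am assuming (about a 238,900-mile average orbital radius and a 27.3-day sidereal period, neither given in the article), lands right where the anecdote does: a bit over 2,200 miles per hour.

```python
import math

# Back-of-envelope orbital speed of the Moon, treating the orbit as a
# circle: speed = circumference / period.
radius_miles = 238_900    # assumed average Earth-Moon distance
period_hours = 27.3 * 24  # assumed sidereal orbital period

circumference = 2 * math.pi * radius_miles
speed_mph = circumference / period_hours

print(round(speed_mph))  # roughly 2,300 mph, i.e. "over 2200 miles per hour"
```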
of Government and Deputy Assistant to the President and Deputy National Security
Advisor for Strategic Planning, The Geopolitical Consequences of the World
Economic Recession: A Caution,
http://www.rand.org/pubs/occasional_papers/OP275.html
Did the economic slump lead to strategic amendments in the way Japan sees the world? No. Did
it slow the pace of India's emergence as a rising great power? No. To the contrary, the new Congress-led government in
New Delhi will accelerate that process. Did it alter Iran's apparent determination to acquire a nuclear capability or
something close to it? No. Was it a prime cause of the recent domestic crisis and instability in Iran after its 2009 presidential
election? No. Did it slow or
accelerate the moderate Arab states' intent to move along the nuclear path? No. Did it affect
North Korea's destabilizing nuclear calculations? No. Did it importantly weaken political reconciliation in Iraq? No, because
there is almost none in any case. Did it slow the Middle East peace process? No, not least because prospects for progress on issues
between Israel and the Palestinians are the most unpromising in 25 years. Did it substantially [affect]
international challenges associated with the growth of Jihadiism in Pakistan? No. But at the same time, it is important to stress
that Pakistan, quite apart from the global recession, is the epicenter of global terrorism and now represents potentially the most
dangerous international situation since the 1962 Cuban Missile Crisis. Did the global economic downturn systemically affect the
future of Afghanistan? No. The fact that the United States is doing badly in the war in Afghanistan has nothing to do with the
economic deterioration. As Henry Kissinger observes, "The conventional army loses if it does not win. The guerrilla wins if he does
not lose." And NATO is not winning in Afghanistan. Did it change in a major way the future of the Mexican state? No. Did the
downturn
make Europe, because of its domestic politics, less willing and able over time to join the U.S. in
effective alliance policies? No, there will likely be no basic variations in Europe's external policies and no serious evolution in
transatlantic relations. As President Obama is experiencing regarding Europe, the problems with European publics in this regard are
civilizational in character, not especially tied to this recession; in general, European publics do not wish their nations to take on
foreign missions that entail the use of force and possible loss of life. Did the downturn slow further EU integration? Perhaps, at the
margin, but in any case one has to watch closely to see if EU integration moves like a turtle or like a rock. And so forth.
To be
clear, there will inevitably be major challenges in the international situation in the next five
years. In fact, this will be the most dangerous and chaotic global period since before the 1973 Middle East war. But it is not
obvious that these disturbing developments will be primarily a result of the global
economic problems. It is, of course, important to be alert to primary and enduring international discontinuities. If
such a convulsive geopolitical event is out there, what is it? One that comes to mind is another
catastrophic attack on the American homeland. Another is the collapse of Pakistan and the loss of government control of its nuclear
arsenal to Islamic extremists. But again, neither of these two geopolitical calamities would be connected to the current economic
decline. Some argue that, even though geopolitical changes resulting from the current global economic
tribulations are not yet apparent, they are occurring beneath the surface of the international
system and will become manifest in the years to come. In short, causality not perceptible now will become so.
This subterranean argument is difficult to rebut. To test that hypothesis, the obvious analytical method is to seek tangible data that
demonstrates that it is so. In short, show A, B, and/or C (in this case, geopolitical transformations caused by the world slump) to
have occurred, thus substantiating the contention. One could then examine said postulated evidence and come to a judgment
asking, as the magisterial American soldier/statesman George Marshall often did, "Why might I be wrong?" If the global economic
numbers continue to decline next year and the year after, one must wonder whether any region would remain stable: whether
China would maintain internal stability, whether the United States would continue as the pillar of international order, and whether
the European Union would hold together. In that same vein, it is unclear today what effect, if any, the reckless financial lending and
huge public debt that the United States is accumulating, as well as current massive governmental fiscal and monetary intervention
in the American economy, will have on U.S. economic dynamism, entrepreneurial creativity, and, consequently, power projection
over the very long term. One can only speculate on that issue at present, but it is certainly worth worrying about, and it is the most
important "known unknown"27 regarding this subject.28 In addition, perhaps the Chinese Communist Party's grip on China is more
fragile than posited here, and possibly Pakistan and Mexico are much more vulnerable to failed-state outcomes primarily because of
Robert Kaplan puts it, to embrace geography is not to accept it as an implacable force against which humankind is powerless.
Rather, it serves to qualify human freedom and choice with a modest acceptance of fate.29 In this connection, see in particular the
works of Sir Halford John Mackinder and his The Geographical Pivot of History (1904)30, and Alfred Thayer Mahan, The Influence of
Sea Power upon History, 1660-1783 (1890).31 Demography: the size, birth rate, growth, density, ethnicity, literacy, religions,
migration/emigration/ assimilation/absorption, and industriousness of the population. The histories, foreign and defense policy
character, capabilities, and policies of neighboring states. For the countries that matter most in the global order, perhaps
none of these decisive variables have changed very much since the global
downturn began, except for nations' weaker economic performances. That single factor is not likely to
trump all these other abiding geopolitical determinants and therefore produce international
structural change. Moreover, the fundamental power relationships between and among the
world's foremost countries have also not altered, nor have those nations' perceptions of their vital
national interests and how best to promote and defend them. To sum up this pivotal concept: in the absence of war,
revolution, or other extreme international or domestic disruptions, for nation-states, the powerful abiding
conditions just listed do not evolve much except over the very long term, and thus, unsurprisingly, neither do
countries' strategic intent and core external policies, even, as today, in the face of world economic trials. This point was made
earlier about Russia's enduring national security goals, which go back hundreds of years. Similarly, a Gulf monarch recently advised
with respect to Iran: not to fasten on the views of President Ahmadinejad or Supreme Leader Khamenei. Rather, he counseled
that, to best understand contemporary Iranian policy, one should more usefully read the histories, objectives, and strategies of the
Persian kings Cyrus, Darius, and Xerxes, who successively ruled a vast empire around 500 BC.32 The American filmmaker Orson
Welles once opined that To
occupation
Disease
No Solvency
They can't solve disease: anti-vaccination movement gives
disease a foothold within communities and spreads from there
Kollipara 5/5 Puneet Kollipara, Journalist for the Washington Post, 5/5/2014,
How the Anti-Vaccine Movement is Endangering Lives
http://www.washingtonpost.com/blogs/wonkblog/wp/2014/05/05/how-the-antivaccine-movement-is-endangering-lives/
Infectious diseases that we normally think of as rare in the United States are making
a comeback. In recent years, pertussis -- also known as whooping cough -- has returned to the headlines. A measles outbreak
that struck a Texas megachurch community late last summer sickened 21 people. And just recently, at least 16 people got sick
effectively eliminated in the United States. What's going on, exactly? Here are some answers. Why are so many outbreaks
parents are increasingly seeking personal or religious exemptions from vaccination requirements for their kids to attend schools.
Substandard vaccination rates create an opening for outbreaks, which often start
when an unvaccinated person catches the disease while traveling abroad and
spreads the illness to friends and family upon returning. But aren't overall vaccination rates really
high? Nationally, yes, though it wasn't always this way. Before the 1990s, rates languished below 80 percent for most childhood
vaccines at the time. In 1993, after the 1989-1991 measles outbreak, Congress enacted the Vaccines for Children (VFC) program to
promote childhood vaccinations. CDC data show that vaccination rates are now above 90 percent for several routine vaccines,
including the measles-mumps-rubella (MMR) and whooping cough vaccines. Public health officials target a 90 percent vaccination
vaccination rates climbed after VFC's enactment: If vaccination rates are high, why are we seeing so many outbreaks? That's
California. Why are people not vaccinating their kids? There are a number of factors at play. Many of the diseases we vaccinate
against are so rare here now that the public's awareness of vaccination might have decreased. But the one reason that has most
gives a resounding no. Anti-vaccine activists often hang their case on a study published in the British journal The Lancet in 1998.
This study, which posited a link between the MMR vaccine and autism, was widely discredited by the scientific community and
that vaccines once routinely contained thimerosal, which government officials recognized as generally safe. But this preservative
has been phased out of nearly all vaccines as a precautionary measure. Anti-vaccine activists also worry that the CDC's
recommended vaccination schedule could overwhelm infants' immune systems by packing too many doses into a short period of
time. Although the number of vaccinations that kids receive now is higher than it used to be, the main ingredients in the vaccines
have actually decreased in amount. Even if these ingredient amounts hadn't decreased, research has found no relationship between
those amounts and autism risk. Vaccines do carry a risk of side effects, but they are usually minor. The CDC has concluded from
reviewing the scientific evidence that there's no causal link between childhood vaccinations and autism.
No Solvency: Can't contain
The aff can't solve; containment is impossible
Blancou et al 2005 Jean Blancou, former Director General of the OIE, Bruno
Chomel, Researcher for WHO/PAHO Collaborating Center on New and Emerging
Zoonoses, Albino Belotto, Researcher for Veterinary Public Health Unit, Pan
American Health Organization, François Meslin, Researcher for Strategy
Development and Monitoring of Zoonoses, Food-borne Diseases and Kinetoplastidae
(ZFK),
Communicable Diseases Control Prevention and Eradication Emerging or reemerging bacterial zoonoses: factors of emergence, surveillance and control, Vet.
Res. 36, pg 507-522, http://hal.archives-ouvertes.fr/docs/00/90/29/77/PDF/hal00902977.pdf
The main obstacles that are encountered in the control of bacterial zoonoses are the same as those opposed to the
control of any infectious disease, that is most often financial and human obstacles rather than technical
limitations. The financial resources needed to efficiently fight against zoonotic agents
are not available for all countries. Only the international community's financial
support could, notably, allow developing countries to organize a proper control of
zoonotic diseases, but it is rare that this is materialized as a financial gift, and
mobilization of specific funds, even by well-known international organizations (such as WHO, FAO,
OIE), is limited for such diseases. Due to all these difficulties, many sanitary authorities of these countries have
given up the establishment of such prevention programs. Others manage, with a lot of perseverance, to
elaborate complicated multilateral financial arrangements. This allows punctual projects to be realized, but rarely to
establish the long-term prophylaxis plans that they really need. When financial and material problems are
supposedly solved, human-related difficulties should not be underestimated. These
difficulties can originate within the services in charge of applying the national prophylaxis plans, when these
services are not themselves convinced of the good use of these plans, or when they do not seem to get specific
control of some zoonotic diseases may simply be impossible in some countries.
No Extinction
---Super viruses won't cause extinction
(A.) Burnout.
Lafee 2009
Scott, Union-Tribune Staff Writer, Viruses versus hosts: a battle as old as time, May 3rd,
http://www.signonsandiego.com/news/2009/may/03/1n3virus01745-viruses-versus-hosts-battle-old-time/?uniontrib
Generally speaking, it's not in a virus's best interest to kill its host. Deadly viruses such as Ebola and SARS are self-limiting
because
Flu viruses do kill, but they aren't considered especially deadly. The
fatality rate of the 1918 Spanish flu pandemic was less than 2.5 percent, and most of those deaths are now attributed to secondary bacterial
infections. The historic fatality rate for influenza pandemics is less than 0.1 percent. Humans make
imperfect hosts for the nastiest flu viruses, Sette said. From the point of view of the virus, infecting humans can be a
dead end. We sicken and die too soon.
(C.) Co-evolution.
Posner 2005
Richard, Judge, 7th Circuit court of Appeals, Catastrophe: Risk and Response, pg. 22
AIDS illustrates the further point that despite the progress made by modern medicine in the diagnosis and treatment of diseases, developing a
vaccine or cure for a new (or newly recognized or newly virulent) disease may be difficult, protracted, even impossible. Progress has been made
in treating AIDS, but neither a cure nor a vaccine has yet been developed. And because the virus's mutation rate is high, the treatments may not
work in the long run.7 Rapidly mutating viruses are difficult to vaccinate against, which is why there is no vaccine for the common cold and why
flu vaccines provide only limited protection.8 Paradoxically, a treatment that is neither cure nor vaccine, but merely reduces the severity of a
disease, may accelerate its spread by reducing the benefit from avoiding becoming infected. This is an important consideration with respect to
AIDS, which is spread mainly by voluntary intimate contact with infected people. Yet the fact that Homo sapiens has managed to survive every
disease to assail it in the 200,000 years or so of its existence is a source of genuine comfort, at least if the focus is on extinction events. There
have been enormously destructive plagues, such as the Black Death, smallpox, and now AIDS, but none has come close to destroying the entire
human race. There is a biological reason. Natural selection favors germs of limited lethality; they are fitter in an
evolutionary sense because their genes are more likely to be spread if the germs do not kill their hosts too
quickly. The AIDS virus is an example of a lethal virus, wholly natural, that by lying dormant yet infectious in its host for years maximizes its
spread. Yet there is no danger that AIDS will destroy the entire human race. The likelihood of a natural pandemic that would
cause the extinction of the human race is probably even less today than in the past (except in prehistoric times,
when people lived in small, scattered bands, which would have limited the spread of disease), despite wider human contacts that make it more
difficult to localize an infectious disease. The reason is improvements in medical science. But the comfort is a small one. Pandemics can still
impose enormous losses and resist prevention and cure: the lesson of the AIDS pandemic. And there is always a first time.
No Impact: Zoonotic disease
Zoonotic diseases are less threatening than others; they are
contained now
Torres 1999 Alfonso Torres, D.V.M., M.S., Ph.D., Deputy Administrator, USDA,
Animal Plant and Health Inspection Service, Veterinary Services, 2/6/99,
International Economic Considerations Concerning Agricultural Diseases and
Human Health Costs of Zoonotic Diseases, Annals of the New York Academy of
Sciences 894:80-82, http://onlinelibrary.wiley.com/doi/10.1111/j.17496632.1999.tb08047.x/abstract
Animal diseases can negatively affect the number and availability of animals, their productivity, or their
appearance. 1 A few centuries ago, animal diseases affected mostly individual owners or herdsmen, but did not
have serious consequences on the larger community. A similar event today will not only have a negative impact on
the animal owners, but more importantly, will significantly affect the general economy of the region, the entire
While many infectious diseases were once all but eliminated from the United States,
climate change is a factor that could help them expand their range
and make a comeback. Mosquitoes capable of carrying and transmitting diseases
like Dengue Fever, for example, now live in at least 28 states. As temperatures
increase and rainfall patterns change - and summers become longer - these insects
can remain active for longer seasons and in wider areas, greatly increasing the risk
for people who live there. The same is true on a global scale: increases in heat, precipitation,
and humidity can allow tropical and subtropical insects to move from regions where
infectious diseases thrive into new places. This, coupled with increased international travel to and
from all 50 states, means that the U.S. is increasingly at risk for becoming home to these
new diseases. Nearly 4,000 cases of imported and locally-transmitted Dengue Fever
were reported in the U.S. between 1995 and 2005, and that number rises to 10,000 when cases
in the Texas-Mexico border region are included. In Florida, 28 locally-transmitted cases were reported in a 2009-2010 outbreak, the first there in more than 40 years. Dengue Fever, also known as "Breakbone Fever", is
characterized by high fever, headaches, bone and joint aches, and a rash. Recurrent infection can lead to bleeding,
seizures, and death.
Lyme disease - transmitted primarily through bites from certain tick species - could
expand throughout the United States and northward into Canada, as temperatures warm,
allowing ticks to move into new regions. West Nile virus, which first entered the U.S. in 1999,
expanded rapidly westward across the country. By 2005, over 16,000 cases had
been reported. Warmer temperatures, heavy rainfall and high humidity have
reportedly increased the rate of human infection.
Sea Power
First place on the list is no surprise: the United States Navy. The U.S. Navy has the
most ships by far of any navy worldwide. It also has the greatest diversity of
missions and the largest area of responsibility. No other navy has the global reach
of the U.S. Navy, which regularly operates in the Pacific, Atlantic and Indian
Oceans, as well as the Mediterranean, Persian Gulf and the Horn of Africa. The
U.S. Navy also forward deploys ships to Japan, Europe and the Persian Gulf. The
U.S. Navy has 288 battle force ships, of which typically a third are underway at any
given time. The U.S. Navy has 10 aircraft carriers, nine amphibious assault ships,
22 cruisers, 62 destroyers, 17 frigates and 72 submarines. In addition to ships, the
U.S. Navy has 3,700 aircraft, making it the second largest air force in the world. At
323,000 active and 109,000 personnel, it is also the largest navy in terms of
manpower. What makes the U.S. Navy stand out the most is its 10 aircraft carriers,
more than the rest of the world put together. Not only are there more of them,
they're also much bigger: a single Nimitz-class aircraft carrier can carry twice as
many planes (72) as the next largest foreign carrier. Unlike the air wings of other
countries, which typically concentrate on fighters, a typical U.S. carrier air wing is
a balanced package capable of air superiority, strike, reconnaissance, anti-submarine warfare and humanitarian assistance/disaster relief missions. The U.S.
Navy's 31 amphibious ships make it the largest "gator" fleet in the world, capable
of transporting troops and landing them on hostile beaches. The nine amphibious assault ships
of the Tarawa and Wasp classes can carry helicopters to ferry troops or act as
miniature aircraft carriers, equipped with AV-8B Harrier attack jets and soon F-35B
fighter-bombers. The U.S. Navy has 54 nuclear attack submarines, a mix of the Los
Angeles, Seawolf, and Virginia classes. The U.S. Navy is also responsible for the
United States' strategic nuclear deterrent at sea, with 14 Ohio-class ballistic
missile submarines equipped with a total of 336 Trident nuclear missiles. The USN
also has four Ohio-class submarines stripped of nuclear missiles and modified to
carry 154 Tomahawk land attack missiles. The U.S. Navy has the additional roles of
ballistic missile defense, space operations and humanitarian assistance/disaster
relief. As of October 2013, 29 cruisers and destroyers were capable of intercepting
ballistic missiles, with several forward deployed to Europe and Japan. It also
monitors space in support of U.S. military forces, tracking the satellites of potential
adversaries. Finally, the U.S. Navy's existing aircraft carriers and amphibious
vessels, plus the dedicated hospital ships USNS Mercy and USNS Comfort,
constitute a disaster relief capability that has been deployed in recent years to
Indonesia, Haiti, Japan and the Philippines.
China's plans for a blue-water navy centered on aircraft carriers follow a course the
U.S. Navy plotted more than half a century ago. This head start gives the U.S. an
insurmountable lead in operational experience and military hardware. The
Liaoning is a refurbished Soviet warship with a conventional (not nuclear) power
plant and a tilted deck that severely limits the range and payload of its aircraft. It
would be no match for any of the U.S. Navy's 11 nuclear-powered carriers, each
carrying twice as many aircraft and state-of-the-art catapults for launching them.
China's planned new carriers will be far more capable than the Liaoning -- but at a
crippling price. The newest U.S. carrier will cost $13 billion. Add roughly 60
aircraft, 6,700 sailors, and other vessels to support and protect it, and a single U.S.
carrier strike group costs roughly $6.5 million a day to operate. Such outlays are a
questionable investment even for the U.S., let alone China. China's blue-water
ambitions will drain a significant portion of its military budget to a sphere it cannot
hope to dominate. Better still, the strategic aims that China can reasonably
advance with a carrier-based navy are relatively benign. Such a force could protect
China's energy lifeline of oil tankers stretching to the Middle East, which is
occasionally threatened by pirates. It could more effectively respond to
humanitarian disasters and, if the need arose, evacuate Chinese citizens working
in far-flung locations. In addition, of course, China's government wants its blue-water navy for prestige, as a symbol of the country's rise to great-power status.
That's fine, too -- so long as it poses no threat to U.S. naval dominance. When it
comes to Chinese military power, aircraft carriers are the least of the world's
concerns.
Lockheed Martin [NYSE: LMT] will work to enhance how the Navy exchanges C4ISR
data throughout the space, air, surface, subsurface, and unmanned sensor domains
under a contract with Space and Naval Warfare Systems Center Pacific. This IDIQ
contract has a ceiling value of $35 million over five years. "For the Navy, every
platform is a sensor, and every sensor must be networked," said Dr. Rob Smith,
vice president of C4ISR for Lockheed Martin Information Systems and Global
Solutions. "We'll leverage our more than 30 years developing and fielding signals
intelligence systems to increase the Navy's intelligence sharing capability across
the full spectrum of maritime and littoral missions." Lockheed Martin co-developed
the Navy's Distributed Information Operations-System, which addresses the Navy's
need for network-centric intelligence to improve interoperability and enhance
battlespace awareness. For that effort, Lockheed Martin connected disparate Navy
signals intelligence systems, facilitating tactical data exchange and allowing
commanders to better understand their operational environment. Building upon
those capabilities, Lockheed Martin will continue to enhance the Navy's signals
intelligence collection, data fusion, and intelligence processing and dissemination
capabilities. This could involve integrating and deploying capabilities that monitor
the status of all sensors registered in the network, then displaying the input from
those sensors in support of real-time planning. Network integration of sensors will
be designed to accomplish cross-cueing, cooperative sensing and, where feasible
and prudent, automated target recognition or classification. The work scope for this
contract also includes analyzing ways to enhance the Navy's use of Unmanned
Aerial Vehicles (UAVs) for surface combatant land attacks.
Cheng 14(Joey, June 30, Navy awards $35M contract to boost C4ISR info sharing,
Defense Systems, Joey Cheng is a writer for Defense Systems Magazine,
http://defensesystems.com/articles/2014/06/30/spawar-dio-s-unpgrades-lockheed.aspx)
In an effort to improve information sharing from its multitude of sensors, the Navy
has awarded Lockheed Martin a contract to enhance the service's C4ISR data
exchange capabilities. The indefinite delivery/indefinite quantity contract with the
Space and Naval Warfare Systems Center Pacific has a ceiling of up to $35 million
over five years, and would upgrade how the Navy's space, air, surface, subsurface
and unmanned sensors would collect and disseminate data, according to a
Lockheed Martin release. SPAWAR is the Navy's Information Dominance systems
command, and is responsible for the development of communications and
information capabilities for warfighters. The Navy's Distributed Information
Operations-System, originally co-developed by Lockheed, was designed to improve
interoperability and enhance battlespace awareness through network-centric
intelligence, and connects a variety of signals intelligence systems for tactical data
exchange. Leveraging its experience with DIO-S, Lockheed may be working to
implement a monitoring system capable of checking the statuses of all of the
sensors registered in its network in the future. The same system would then
display the input from the sensors for real-time planning. Further integration of the
sensors would allow cross-cueing and cooperative sensing, as well as automated
target recognition. The scope of the contract also includes possible enhancements
of the Navy's use of unmanned aerial vehicles for ground attacks. Lockheed Martin
also is a co-developer of the Distributed Common Ground System, which seeks to
integrate multiple intelligence, surveillance and reconnaissance (ISR) sensors from
different services and agencies into a common network. The Navy's version, DCGS-N, is the service's primary ISR&T (the T is for Targeting) support system and
provides processing, exploitation, and dissemination capabilities for the
operational and tactical level. These systems are designed to allow "modularity,
flexibility, and standardization" for integrating data sources, transformation services,
and user interfaces, according to Lockheed Martin. "For the Navy, every platform is
a sensor, and every sensor must be networked," said Dr. Rob Smith, vice president
of C4ISR for Lockheed Martin Information Systems and Global Solutions. "We'll
leverage our more than 30 years developing and fielding signals intelligence
systems to increase the Navy's intelligence sharing capability across the full
spectrum of maritime and littoral missions."
will be fielded to extend organic sensor capability and capacity; commercial space-based imaging and Automatic Information Systems (AIS) collection systems will
proliferate, and an emergent capability to task, plan and direct organic and
Planning tools with Theater Security Cooperation capabilities and mission partner
capability information sources will become better integrated; Improved BA
training for all operators and watchstanders will be developed.
1NC: Information Overload
IOOS fails: overloads the system with too much data
RAND 14(The RAND Corporation is a nonprofit institution that helps improve policy
and decisionmaking through research and analysis, Data Flood: Helping the Navy Address
the Rising Tide of Sensor Information,
http://www.rand.org/content/dam/rand/pubs/research_reports/RR300/RR315/RAND_RR315.
pdf)
Despite the battle-tested value of ISR systems, however, the large amount
of data they generate has become overwhelming to Navy analysts. As the
Intelligence Science Board wrote in 2008, referring to the entire
Department of Defense (DoD), "the number of images and signal intercepts are
well beyond the capacity of the existing analyst community, so there are huge
backlogs for translators and image interpreters, and much of the collected data
are never reviewed." This is a good description of the Navy's big data
challenge.
2NC: Information Overload
IOOS doesn't solve: information overload
RAND 14(The RAND Corporation is a nonprofit institution that helps improve policy
and decisionmaking through research and analysis, Data Flood: Helping the Navy Address
the Rising Tide of Sensor Information,
http://www.rand.org/content/dam/rand/pubs/research_reports/RR300/RR315/RAND_RR315.
pdf)
but does not change the way it processes, exploits, and disseminates
information, it will reach an "ISR tipping point," the point at which intelligence
analysts are no longer able to complete a minimum number of exploitation tasks.
"We're only now at the point where we're starting to put up new UAVs with
incredible sensors, so the problem is only going to get worse," said Isaac R. Porche
III, a senior researcher with RAND and co-author of the report. Porche said the
Navy had argued for more manpower to deal with the growing volumes of data, but
budgetary pressures forced it to seek other options to improve efficiency. RAND,
which was hired to do a quantitative assessment, looked at the imagery, video and
audio that the Navy was collecting from UAVs, studying how long it took for its
analysts to process data from several siloed databases using different desktop
applications. The report, Data Flood: Helping the Navy Address the Rising Tide of
Sensor Information, concluded that the Navy may reach a tipping point as early as
2016, when analysts are no longer able to complete a minimum number of
exploitation tasks within given time constraints. In other words, analysts will be
much less effective than they are now in finding value in the exponentially
increasing data if the Navy doesn't change the way it collects, processes, uses, and
distributes that data.
The U.S. Navy loves sensors. The same gyrometers and motion detectors that fuel
smartphones and Microsoft's Kinect also keep drones in the skies and aircraft
carriers in sharp order. But the Navy--which is the size of a large global
megacorporation and also saddled with a huge bureaucracy--also has problems
dealing with the flood of data that sensors create. The RAND Corporation is arguing
for something unorthodox to deal with that data influx: a cloud for the Navy
(PDF). What do you think? In a paper called, succinctly, Data Flood, think tank
analyst Isaac R. Porche argues that the United States Navy needs a private cloud
service to cope with the influx of data created by sensors on unmanned aerial
vehicles (UAVs) and other smart military devices. Porche and his coauthors argue
that there's a precedent for creating a private cloud service for the military--the
little-known fact that the NSA, CIA, and other American intelligence agencies are
building their own cloud. In late October, Amazon Web Services reportedly sealed a
$600 million, 10-year deal with the CIA to develop a custom cloud infrastructure for
U.S. intelligence agents. Porche and his team argue that if the Navy doesn't adopt
a cloud infrastructure, there will be a tipping point as early as 2016 where analysts
will become less and less effective because bandwidth choke will prevent access to
the information they need. "2016 is the year when the number of UAS (drone)
systems will be high enough and generating enough sensor data/imagery to
overwhelm human analysts," Porche told Co.Labs by email.
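The "tipping point" argument above is at bottom a capacity-versus-growth calculation: collection grows exponentially while the analyst pool's throughput stays roughly flat. A minimal sketch of that reasoning, with every number an illustrative assumption rather than a figure from the RAND report:

```python
# Hypothetical back-of-the-envelope model of an "ISR tipping point":
# the first year in which daily sensor data outgrows a fixed analyst
# workforce. All inputs below are illustrative assumptions, not figures
# drawn from the RAND report.

def tipping_point_year(start_year, data_tb_per_day, growth_rate,
                       analysts, tb_per_analyst_per_day):
    """Return the first year when daily data exceeds total analyst
    exploitation capacity, or None if it never does within 30 years."""
    capacity = analysts * tb_per_analyst_per_day  # constant throughput
    for offset in range(30):
        if data_tb_per_day > capacity:
            return start_year + offset
        data_tb_per_day *= (1 + growth_rate)  # compounding growth in collection
    return None

# Illustrative inputs: 10 TB/day in 2013, growing 40% a year, against
# 500 analysts each able to exploit ~0.05 TB/day (capacity = 25 TB/day).
print(tipping_point_year(2013, 10.0, 0.40, 500, 0.05))  # -> 2016
```

With these assumed inputs the crossover lands in 2016, which is why the argument is so sensitive to the growth rate: adding analysts shifts the crossover by only a year or two, while the exponential term dominates.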
even aware of what is critical in light of so much unfiltered and unverified gatherings. The effort required to produce
pertinent and actionable intelligence, or even timely and worthwhile information, oftentimes suffers from such an
approach to information and intelligence gathering. Without launching a long dissertation regarding NCW and the problems created by
information inundation resulting from a sensor/shooter marriage across a massive battlefield, I believe such a term as "Information
Dominance" pays short shrift to a less sophisticated enemy whose extensive low-tech networks of informants are capable of negating billions of dollars of maritime stealth
technology (or finding the right merchant ship in a 400-mile long shipping lane) by using the eyeball, the messenger, the motor scooter, or the
sandal. Such an enemy, I would argue, has a much clearer and simpler set of information
requirements, and is far more skilled, more disciplined, and much more successful at meeting those
requirements than we are. So, who is dominant in the information game? One could argue very effectively that it is not us. Terms such as
"Information Dominance" ring of a grandiose bit of self-delusion that is part plan and part capability, with a large
measure of wishful thinking. In the words of a certain retired Marine Corps General, words mean things. They also reveal a lot about who uses them, and
how. The term "Information Dominance" gives me the willies. Can we call it something a little less presumptuous?
According to Dr. John Mueller, the United States has been and will continue to be
substantially free from threats that require a great deal of military
preparedness. In his view, there are no major threats to U.S. security, and there
have been none since the end of the Second World War. During the Cold War, the
United States spent trillions of dollars to deter a direct military threat that did not
exist, since the Soviet Union had no intention of launching an unprovoked attack
on Europe or the United States. Despite the continued absence of significant
threats today, the United States is still engaged in a number of conflicts in an
effort to make the world look and act the way we want. In reality, however, most
modern security issues are not really military in nature; rather, they are policing
and diplomatic activities that do not require substantial U.S. military involvement.
While isolationism is not a viable policy, the United States does not need to use its
military to solve all of the problems in the world. Dr. Mueller argued that
the absence of war among developed countries since 1945 is the greatest single
development about war in history. The end of the Cold War ushered in a New
World Order, and from 1989 to 2000 the United States was engaged in what Dr.
Mueller called policing wars. There was very little domestic support for most of
these ventures, however, because there was a strong U.S. public aversion to
nation building, a low tolerance for casualties, and a lack of concrete political
gains from success.
Recent experience suggests that the lessons learned with such difficulty by the
allies when they confronted the chaos ensuing from the sudden collapse of Nazi
Germany in 1945 were major casualties of the Cold War. In Afghanistan and Iraq,
the United States and its allies are being reminded painfully that stabilization and
reconstruction require larger numbers of troops on the ground for much longer
than preintervention planners might have thought necessary, and that the
soldiers in question need much more than "mere" warfighting skills. To relearn
these lessons will likely require a shift in the organizational cultures of the armed
services? Navies and air forces around the world have drawn from this experience
the obvious conclusion that future defense priorities in countries with similar
interventionist aspirations are likely to reflect a growing relative emphasis on the
provision of intelligently trained and responsive "boots on the ground." With
resources being finite, defense expenditure on those aspects of air and naval
forces whose function seems less than wholly related to this central aim seem
likely to be limited.3
On the other hand, it also suggests the possibility that the Army is right and that if
forward presence is to matter it needs to be on the ground, that an offshore
presence of a potent but limited force, with only the implicit threat of surged ground
forces, is less likely to have an impact, at least if the potential aggressor has limited
goals. It also suggests the possibility that the symbolism of naval forward
presence, serving as a reminder of the full weight and power the United States
could ultimately bring to bear, may not be that important. In war, the argument
that forward naval forces operating with a littoral strategy can have an important
impact in the initial phases of the conflict, thereby preparing the ground for later
U.S. successes, is doubtless true. While true, however, it may well be relevant in
only a limited range of cases. Most potential conflicts or contingencies involve
adversaries who are too small for this effect to matter much. Short of a major
regional conflict (MRC), the superiority of U.S. military forces is sufficiently overwhelming that initial setbacks are not likely to be critically important. At the other
extreme, in the case of a regional near-peer competitor - a Russia or a China - it is
hard to imagine a littoral strategy having much of an impact: the amount of
(nonnuclear) power that can be projected from the sea is trivial compared to the size
of the adversary's society or military establishment.
In crisis, the forward-deployed capacity to project power "from the sea" is touted as
having an immediate deterrent effect - that is, dissuading an adversary who is
tentatively considering going to war from following through on that idea. Here we
do have some evidence; at very best, however, it must be regarded as offering
mixed support for the Navy's advocacy of a littoral approach. A variety of studies of
conventional deterrence have been undertaken.60 While the research questions,
underlying theoretical assumptions, and research methods have varied, several
general findings emerge. The principal one is that immediate extended deterrence
with conventional means (that is, using threats of conventional response to deter
an adversary who is considering aggression against a third party) regularly fails,
even in cases where commitments are "clearly defined, repeatedly publicized and
defensible, and the committed [gives] every indication of its intentions to defend
them by force if necessary."61 Unlike nuclear deterrence, conventional deterrence
does not appear to result in a robust, stable stalemate but in a fluid and competitive
strategic interaction that, at best, buys time during which underlying disputes or
antagonisms can be resolved. The possession of decisive conventional military
superiority and the visible demonstration of resolve will not necessarily permit
the United States to deter attacks on friends and interests. There are three
reasons why immediate extended conventional deterrence is so problematic. First,
potential aggressors are sometimes so strongly motivated to challenge the status
quo that they are willing to run a high risk, or even the certainty, of paying the less-than-total costs of losing a war. Second, potential aggressors frequently conclude,
correctly or incorrectly, that they have developed a military option that has
politically or militarily "designed around" the deterrent threat. Third, there is
considerable evidence that, particularly when they are under severe domestic
stress, potential aggressors are unable to understand or respond rationally to
deterrent threats. "Wishful thinking" by leaders who find themselves caught in a
difficult situation appears to be an all-too-common pathology. Further, and more
germane to the issue of naval forward presence as a crisis deterrent tool, there is
some evidence that because of the general insensitivity of potential aggressors to
information, efforts to "signal" resolve through measures such as reinforcing or
redeploying forces have limited effectiveness. If force movements are large enough
to foreclose particular military options, they may forestall aggression. But as a
means of indicating resolve and convincing an aggressor of the credibility of
deterrent commitments, they do not generally appear to have an impact.
The first and most immediate danger is from competitors to the littoral strategy:
there are, as Army and Air Force voices have noted, a variety of ways besides
projecting power "from the sea" to support a liberal-internationalist foreign policy
and to fight a transoceanic-countermilitary war. While budgetary realities have
stimulated this strategic competition between the services and are likely to
continue to serve as the spur, it would be wrong to dismiss this challenge to the
littoral strategy as mere interservice rivalry or budgetary gamesmanship. Rather,
what has developed is a serious, if admittedly parochially grounded, intellectual
debate over alternative national military strategies, over alternative ways to use
America's military potential in support of "engagement and enlargement." While a
littoral naval strategy is consistent with a liberal-internationalist vision of national
security and a transoceanic-countermilitary image of war, it is not the only military
strategy of which that can be said, and the Army and Air Force have successfully
articulated alternative military strategies that call into question the need for
significant naval effort in the littorals. The second danger, linked to the first, is that
the Navy may be unable to develop a workable operational concept for putting the
littoral strategy into effect. Indeed, the Navy has found it remarkably difficult to
script a convincing story about precisely how a littoral strategy works; that is, the
Navy has had a hard time identifying what it is about naval operations in the
littorals that yields political-military leverage and what forces and activities are
therefore required. The failure of "Forward . . . from the Sea" to address the issue
of alternative force packages is illustrative in this regard: continued insistence that
carrier battle groups and amphibious ready groups are needed at all times in all
theaters reflects the conceptual and bureaucratic difficulty of determining the
actual requirements of a littoral strategy. Any decision to change deployment
patterns, mixes, or timetables would at least implicitly require a prioritization of
peacetime, crisis, and wartime duties; it would also represent a reallocation of
resources within the service. But without a clear understanding of the process by
which littoral operations generate the peacetime, crisis, and wartime outcomes
sought, the Navy will find it impossible to make the difficult tradeoffs demanded by
budgetary pressures. Indeed, as budgetary pressures, the need to moderate
personnel and operational tempos, and the need to modernize become greater, the
imperative for a clearer understanding of the relative value of (for example)
forward peacetime presence, forward peacetime presence by carriers and
amphibious forces, rapid crisis response, and massive wartime strike capacity will
increase. Ultimately the danger is that a littoral strategy will become unworkable
through an inability of the Navy to make the required tradeoffs, in which case it will
find itself with forces that are too small, too overstretched, too poorly maintained,
too poorly trained or manned, too obsolescent, or simply improperly configured to
meet what prove to be the essential demands of a littoral strategy. The third
danger, more basic and more beyond the control of the Navy than the first two, is
that the vision of warfare underlying the littoral strategy will be abandoned by the
nation. The DESERT STORM image of war as a transoceanic countermilitary
encounter is increasingly vulnerable, and as the elite and public begin to imagine
war in other, more traditional terms, the attractiveness and importance of projecting
power "from the sea" will become less apparent. To stay in harmony with national
leadership and national strategy, the Navy will be called upon to offer a revised
account of the utility of naval power.
Over the course of the last several decades, the United States has variously sensed
threat from small countries led by people it found to be decidedly unpleasant.
These rogue states (as they came to be called in the 1990s) were led by such
devils du jour as Nasser, Sukarno, Castro, Qaddafi, Khomeini, Kim Il-Sung, and
Saddam Hussein, all of whom have since faded into history's dustbin. Today, such
alarmed focus is directed at teetering Iran, and at North Korea, the most pathetic
state on the planet. Except in worst-case fantasies, however, neither country
presents a threat of direct military aggression; Iran, in fact, has eschewed the
practice for several centuries. Nonetheless, it might make some sense to maintain
a capacity to institute containment and deterrence efforts carried out in formal or
informal coalition with concerned neighboring countries, and there are quite a
few of these in each case. However, neither country is militarily impressive, and the
military requirements for effective containment are far from monumental and do
not necessarily need large forces-in-being. Moreover, the Iraq syndrome seems
already to be having its effect in this area. Despite nearly continuous concern
about Iranian nuclear developments, proposals to use military force to undercut
this progress have been persistently rebuffed. The Gulf War of 1991 is an
example of military force being successfully applied to deal with a rogue venture:
the conquest by Saddam Hussein's Iraq of neighboring Kuwait. This experience
does not necessarily justify the maintenance of substantial military forces,
however. First, Iraq's invasion was rare to the point of being unique: it has been
the only case since World War II in which one UN country has invaded another
with the intention of incorporating it into its own territory. As such, the
experience seems to be much more of an aberration than a harbinger. Second, in
a case such as that, countries do not need to have a large force-in-being because
there is plenty of time to build a force should other measures to persuade the
attacker to withdraw, such as economic sanctions and diplomatic forays, fail. And
third, it certainly appears that Iraq's pathetic forces, lacking strategy, tactics,
leadership, and morale, needed the large force thrown at them in 1991 to decide
to withdraw.18
Even so, at least three problems remain. First, there may be a tendency to focus
force protection too much on the sea lines of communication and not enough
on the security of sea ports, both of embarkation and arrival. There also is a
natural tendency to concentrate on the safe and timely arrival of soldiers and
their equipment in the theater, neglecting the dangers that, in these
asymmetrical days, may be posed to their collection and dispatch at home. In the
Iraq operation, Greenpeace attempted to interfere with the loading of military
supplies for the British forces at the military port of Marchwood in Southampton
Water. They were ineffective and harmless, but nonetheless represented a useful
reminder of the vulnerability at the supplier's end of the supply chain. Second,
many would doubt the permanence of the shift of naval priorities away from
oceanic sea control and towards its coastal force-protection variant. The
emergence of new maritime powers such as Japan and China, or the recovery of
Russia, might lead to a resurgence of peer competition and old-fashioned
maritime rivalry on the high seas. The U.S. Navy's current wariness about the
prospective maritime expansion of China later in the century may be used to
justify investment in more conventional forms of naval power. Third, the ability of
naval forces to maintain an operational posture against relatively unsophisticated
shore-based opposition can be exaggerated. The vulnerability of ships to coastal
mines, small, quiet diesel submarines, terrorists on jet skis, or radical weaponry
of the kind recently demonstrated by the Iranians, has to be taken seriously.24
1NC: Alt Causes
Alt causes to navy power decline that IOOS can't solve for
Hultin & Blair 6 (Jerry MacArthur HULTIN and Admiral Dennis BLAIR, Naval Power
and Globalization: The Next Twenty Years in the Pacific. Jerry M. Hultin is former
president of Polytechnic Institute of New York University; from 1997 to 2000 Mr. Hultin
served as Under Secretary of the Navy. Dennis Cutler Blair is the former United States
Director of National Intelligence and a retired United States Navy admiral who was the
commander of U.S. forces in the Pacific region, http://engineering.nyu.edu/files/hultin
%20naval%20power.pdf)
As successful as this transformation has been, there are new technological, geopolitical, and warfighting challenges on the horizon that require solutions. The
three most noteworthy are missile defense; irregular warfare (peace operations,
counter-insurgency warfare, stability operations, and military operations against
non-state organizations using terrorist tactics); and the integration of all factors of
political-military influence. First, missile defense: in the future increasing
numbers of littoral countries, organizations such as Hezbollah that are sponsored
by countries, and even unsponsored militant organizations will field ballistic and
cruise missiles up to intermediate ranges. These weapons will threaten deployed
US naval expeditionary forces, military forces and civilian populations of U.S. allies,
and ultimately the United States itself. Thus, it is imperative that the United
States and its allies develop and deploy a highly effective capacity to defend
against such attacks.8 There is not a single system solution to this threat; the
Navy must continue developing more capable sensors, communications networks,
offensive weapons to take out enemy missiles and their support systems, along
with both hard-kill and soft-kill defensive systems that can defeat enemy missiles
once they have been launched.
events in the Pacific. The U.S. attack submarine serves an important part in
establishing sea control and supremacy, and it is not interchangeable with other
assets. Its unique capabilities make it a force multiplier and allow it to "punch above
its weight." To protect U.S. interests in East Asia and the Pacific, and to support and
reassure U.S. allies, the U.S. must halt and then reverse the decline of its submarine
fleet as a critical component of a broader policy to maintain the military balance in
the Pacific.
While many in Washington are focused on the rapidly-approaching March 1st sequestration deadline, few seem to be paying much
attention to the massive cuts that will go into effect this week across our military, cuts that will
have an equally devastating impact on our economy and security. As the result of the continuing
resolution (CR) that was agreed upon in the final hours of 2012, a deal that averted the so-called fiscal cliff, the U.S. Navy was
forced to accept $6.3 billion in cuts for the remainder of the force's Fiscal Year 2013 (FY13) budget. If sequestration goes into effect,
that number could grow to $10 billion. As the representative of 80,000 workers employed at U.S. shipyards and naval bases,
workers who take great pride in the role they play in supporting our military, I would like to shed some light on the severity and
senselessness of the Navy budget cuts and implore the 113th Congress to take prudent and expeditious action to avoid further
defense spending cuts in FY13.
budget; we simply cannot sustain any further cuts this year. The $6.3 billion in Navy
budget reductions cut too deeply. For example, Navy Fleet Commanders have been
ordered to cancel 3rd and 4th Quarter surface ship maintenance. That's a very tall order
considering our shipyards are already fully loaded with high priority work on
extremely tight schedules to support fleet deployment. Once scheduled maintenance on a ship
is missed, there is no capacity to make it up later without causing delays elsewhere. The
well-oiled machine breaks down. With the cuts contained in the CR alone, it is conceivable that
aircraft carriers, destroyers, nuclear submarines, and other Navy vessels would be
tied up at piers awaiting critical repairs without the ability to move , much less support
our national defense. If we allow our fleet to collect barnacles in harbor for six months, we
would significantly disrupt our military's access to the ships they need to defend
our country.
Navy leaders on Thursday told a panel chaired by Rep. Randy Forbes, R-Chesapeake, that
more budget cuts would dull the edge of the submarine program, where the
U.S. enjoys a distinct advantage over other superpowers. Forbes chairs the House Armed Services
subcommittee on sea power. About 2,300 jobs at Newport News Shipbuilding are tied to the U.S. submarine
program. The shipyard builds them in partnership with General Dynamics Electric Boat of Groton, Conn. The
U.S. military has three types of nuclear-powered submarines. First are the smaller fast-attack
submarines that fall primarily in two classes, the older Los Angeles class and the newer Virginia class. Last week,
the Navy commissioned the newest Virginia class sub at Naval Station Norfolk. The second type are Ohio class
ballistic missile submarines that roam the seas and provide a nuclear strike capability. The third type is an offshoot
of the second: when the Cold War ended, the U.S. converted four of those ballistic missile submarines into guided-cruise missile submarines.
All three types are scheduled to drop, said Rear Adm. Richard P.
Breckenridge and Rear Adm. David C. Johnson, who testified before the Forbes panel.
Undersecretary of the Navy, The US Navy in Distress, Strategic Analysis Vol. 34,
No. 1, January 2010, pp. 35-45, http://navsci.berkeley.edu/ma20/Class%208/Class
%2012.%20Sea%20Power.%20%20Cropsey_US_Navy_In_Distress.pdf
The most tangible result is the continued withering of the US combat fleet, which today numbers about
280 ships. This is less than half the size achieved towards the end of the Reagan
administration buildup and 33 ships short of what the Navy says it needs to fulfill today's
commitments.
decline over the long term. Four years ago the Navy's projected fleet size had dwindled to 313 ships. There it has stayed . . .
until May of this year when a senior Navy budget official, commenting on the proposed 2010 budget, suggested that the new
Quadrennial Review now underway at the Defense Department will likely result in a smaller
projected fleet size. Huge increases in current and projected national debt and
the vulnerability of the military budget mean that, without compelling events to
help, the nation's sea services will experience additional and perhaps drastic
reductions. National indebtedness will grow from its current ratio of 40 per cent of GDP to 80 per
cent of GDP in a decade. Servicing this will cripple the nation's ability to modernize
and increase a powerful world-class fleet, or drive us deeper into a yawning financial hole.
The Third World War: August 1985.9 Such anxieties obviously proved to be
over-wrought, but to the degree that they were correctly focused on a potential
cause of major war, that specific impetus no longer exists. World War III, then,
continues to be the greatest nonevent in human history, and that happy condition
seems very likely to continue. There have been wars throughout history, of course,
but the remarkable absence of the species' worst expression for two-thirds of a
century (and counting) strongly suggests that realities may have changed , and
perhaps permanently. Accordingly it may be time to consider that spending a lot of
money preparing for a conceivable eventuality or fantasy that is of ever-receding likelihood is a highly questionable undertaking.
rebalancing to Asia is not directed at containing China. For example, Admiral Samuel J. Locklear III, Commander of the US Pacific
Command, recently stated, "there has also been criticism that the Rebalance is a strategy of containment. This is not the case; it is
a strategy of collaboration and cooperation." However, a review of past US-China military-to-military interaction indicates that an
agreement to jointly manage security in the South China Sea is unlikely because of continuing strategic mistrust between the two
countries. Channels of communication have been kept open with China through three established bilateral mechanisms: Defense Consultative Talks, the Military Maritime
Consultative Agreement (MMCA), and the Defense Policy Coordination Talks. On the one hand, these multilateral mechanisms reveal
very little about US-China military relations. Military-to-military contacts between the two countries have gone through repeated
cycles of cooperation and suspension, meaning that it has not been possible to isolate purely military-to-military contacts from their
broader political context. On the other hand, these mechanisms have helped to ensure continuing working-level discussions under the MMCA; agreement on the 7-point consensus; and no serious naval incidents since
the 2009 USNS Impeccable affair. They have also helped to ensure continuing exchange visits by senior military officers; the
initiation of a Strategic Security Dialogue as part of the ministerial-level Strategic & Economic Dialogue process; agreement to hold
meetings between coast guards; and agreement on a new working group to draft principles to establish a framework for military-to-military contacts.
a hypothetical scenario in which a uniformed Chinese military member is killed during a firefight with Japanese soldiers. Regardless
of the specific circumstances, the casualty would create a new martyr in China and, almost as quickly, catalyze popular protests
against Japan. Demonstrators would call for blood, and if the government (fearing economic instability) did not extract enough,
citizens would agitate against Beijing itself. Those in Zhongnanhai, the Chinese leadership compound in Beijing, would find
themselves between a rock and a hard place. It is possible that Xi lost track of these basic facts during the fanfare of his rise to
power and in the face of renewed Japanese assertiveness. It is also possible that the Chinese state is more rotten at the core than is
understood. That is, party elites believe that a diversionary war is the only way to hold on to power -- damn the economic and social
consequences. But Xi does not seem blind to the principles that have served Beijing so well
over the last few decades. Indeed, although he recently warned unnamed others about infringing upon China's
"national core interests" during a foreign policy speech to members of the Politburo, he also underscored China's
commitment to "never pursue development at the cost of sacrificing other country's
interests" and to never "benefit ourselves at others' expense or do harm to any
neighbor." Of course, wars do happen -- and still could in the East China Sea. Should either side draw first blood through
accident or an unexpected move, Sino-Japanese relations would be pushed into terrain that has not been charted since the middle
of the last century. However, understanding that war would be a no-win situation, China has avoided rushing over
the brink. This relative restraint seems to have surprised everyone. But it shouldn't.
Beijing will continue to disagree with Tokyo over the sovereign status of the islands, and will not budge in its negotiating position
over disputed territory. However, it cannot take the risk of going to war over a few rocks in the
sea. On the contrary, in the coming months it will quietly seek a way to shelve the
dispute in return for securing regional stability , facilitating economic development,
and keeping a lid on the Pandora's box of rising nationalist sentiment. The ensuing peace,
while unlikely to be deep, or especially conducive to improving Sino-Japanese relations, will be enduring.
Whoever forges sea, land, and air forces into the sharpest weapon of sea combat stands a good chance of prevailing. That could be
Japan if its political and military leaders think creatively, procure the right hardware, and arrange it on the map for maximum effect.
After all, Japan doesn't need to defeat China's military in order to win a showdown
at sea, because it already holds the contested real estate; all it needs to do is deny
China access. If Northeast Asian seas became a no-man's land but Japanese forces hung on, the political victory would be
Tokyo's. Japan also enjoys the luxury of concentrating its forces at home, whereas the PLA Navy is dispersed into three fleets spread
President Hu Jintao ordered PLA commanders to construct a powerful people's navy that could defend the nation's maritime
lifelines, in particular the sea lanes that connect Indian Ocean energy exporters with users in China, at any time. That takes
lots of ships. If it lost much of the fleet in a Sino-Japanese clash, even in a winning effort, Beijing
could see its momentum toward world-power status reversed in an afternoon. Here's hoping China's political and
military leaders understand all this. If so, the Great Sino-Japanese Naval War of 2012 won't be happening outside these pages.
it is also conceivable, and far more likely, that the whole problem will be
worked out over the course of time without armed conflict. The Chinese strongly
stress that their perspective on this issue is very long term and that they have a
historic sense of patience. Indeed, if China eventually becomes a true democracy, Taiwan might
even join up voluntarily or, failing that, some sort of legalistic face-saving
agreement might eventually be worked out. Above all, China is increasingly
becoming a trading state, in Richard Rosecrance's phrase.11 Its integration into the world
economy and its increasing dependence on it for economic development and for
the consequent acquiescent contentment of the Chinese people is likely to keep the
country reasonable. Armed conflict over the issue would be extremely, even
overwhelmingly, costly to the country, and, in particular, to the regime in charge,
and Chinese leaders seem to realize this.
Although the trend lines are undoubtedly working in Chinas favor, it is ultimately
extremely unlikely that China will try to seize Taiwan by force. Furthermore, should it
try to do this, it is unlikely to succeed. Even assuming Chinas military capabilities
are great enough to prevent the U.S. from intervening, there are two forces that
would likely be sufficient to deter China from invading Taiwan. The first and least
important is the dramatic impact this would have on how countries in the region
and around the world would view such a move. Globally, China seizing Taiwan would
result in it being permanently viewed as a malicious nation. Regionally, Chinas
invasion of Taiwan would diminish any lingering debate over how Beijing will use its
growing power. Every regional power would see its own fate in Taiwan. Although
Beijing would try to reassure countries by claiming that Taiwan was part of China
already, and thus the operation was a domestic stability one, this narrative would
be convincing to none of Chinas neighbors. Consequently, Beijing would face an
environment in which each state was dedicated to cooperating with others to
balance against Chinese power. But the more important deterrent for China would
be the uncertainty of success.
Homeland Security
Terrorism
Better detection fails: terrorists can hide their HEU from
radiation detectors.
de Rugy 05 (Veronique, senior research fellow at the Mercatus Center at George Mason
University, Is Port Security Spending Making Us Safer?,
http://cip.management.dal.ca/publications/Is%20Port%20Security%20Spending%20Making
%20Us%20Safer.pdf)
research assistant at the Institute for National Strategic Studies (INSS) at the National Defense
University (NDU) in Washington, D.C. ], Ships and Terrorists Thinking Beyond Port Security,
http://csis.org/files/media/csis/pubs/pac0445a.pdf
Terrorists do not require access to a major port or a shipping container to carry out a strike. There
remain numerous small ports and small vessels not covered under the new security
initiatives. The ISPS Code for instance only covers ships of 500 tons or more and
port facilities that serve large international-bound vessels. The Code would not have
protected the USS Cole.
University], Is Port Security Funding Making Us Safer?, MIT Center for International Studies, November
2007, http://web.mit.edu/cis/pdf/Audit_11_07_derugy.pdf.
The Domestic Nuclear Detection Office (DNDO) received $535 million in 2007.14
DNDO's mission addresses a broad spectrum of radiological and nuclear protective
measures, but is focused exclusively on domestic nuclear detection.15 The
fundamental problem is that DNDO relies on radiation portal monitors that have
been proven unable to detect shielded nuclear material, essentially rendering them
useless.16 Besides, even if the system could detect every dangerous item, it is
ineffective unless the nuclear material is brought through the fixed ports of entry
where the monitors are located. With thousands of miles of unguarded borders and
no cost-effective way to address the issue, smugglers can easily find positions to
bring illicit goods inside the country. Consider the country's long-standing War on
Drugs and the inability to stop the flow of illegal drugs into the country.
University], Is Port Security Funding Making Us Safer?, MIT Center for International Studies, November
2007, http://web.mit.edu/cis/pdf/Audit_11_07_derugy.pdf.
A close look at port security allocation decisions indicates that spending occurs
without regard for risk analysis let alone cost-benefit analysis, leading to a large
array of misallocated spending. For instance, why should the highest priorities,
preventing terrorists from acquiring nuclear devices and material, receive less
money than much less cost-effective policies such as nuclear detection in the ports
or post-disaster response activities? Because it rests mainly on domestic detection
of WMD in ports, a task that it is not clear could be achieved, the port security model
offers almost no value to the nation.6 Even if we could seal our ports, America
wouldn't be safe. The only effective way to prevent nuclear attacks is to deny
terrorists access to weapons and material. Without nuclear materials there can be
no nuclear bombs.
closed and not sealed, well inland of a port. They are then transferred by truck, or
truck and train, to a seaport, where they are loaded onto a ship. Following the sea
journey, they are transferred to another truck, and perhaps another train, for a
further journey over land to the ultimate destination. Each container trip to the
United States has, on average, 17 different stops, or points at which the container's
journey temporarily halts.14 The adage "goods at rest are goods at risk" readily
applies to the terrorist threat. The container will be at rest at any point in the
journey that involves a change in mode of transportation. While at rest, the
container is vulnerable to thieves and terrorists alike. Providing port security
therefore involves closely scrutinizing activities not only at the port but at points all
along the shipping chain. The truck driver picking up the container at the U.S. port,
often poorly paid and possibly an illegal immigrant not well integrated into U.S.
society, may himself represent a vulnerability in the system. The issue is not
merely that something could be put in a container illicitly for an attack on the port
where it is unloaded but that nuclear weapons or radiological material could be
inserted, shipped to the United States, moved inland without inspection, and then
unloaded into the hands of terrorists. These objects could then be transported for
use in major population centers, perhaps better targets than a port complex.
Likewise, explosive material could be put in several containers and then detonated
at or near port complexes around the same time, leading to a security reaction that
could shut down the entire maritime transportation system until officials, and port
workers and management, were certain the threat had passed. There is no way to
completely inspect all of the millions of containers entering the United States. They
are about as large as a full-size moving van and are often tightly packed. Inspecting
each thoroughly would bring commerce to a halt, exactly the kind of reaction that
terrorists hope to generate.
those who decidedly disagree with such scary-sounding, if somewhat elusive, prognostications
about nuclear terrorism often come out seeming like they more or less agree. In his Atomic Bazaar, William
Langewiesche spends a great deal of time and effort assessing the process by means of which a terrorist group could come up with a
bomb. Unlike Allison (and, for that matter, the considerable bulk of accepted opinion) he concludes that it
"remains very, very unlikely. It's a possibility, but unlikely." Also: The best information is that no one has gotten
anywhere near this. I mean, if you look carefully and practically at this process, you see that it is an enormous undertaking full of risks for the would-be terrorists. And
so far there is no public case, at least known, of any appreciable amount of weapons-grade HEU [highly enriched uranium] disappearing. And that's the first step. If you don't have
that, you don't have anything. The first of these bold and unconventional declarations comes from a book discussion telecast in June 2007 on C-SPAN and the second from an interview on National Public Radio. Judgments in the book itself, however, while consistent with such conclusions, are expressed more ambiguously, even coyly: "at the extreme is the
possibility, entirely real, that one or two nuclear weapons will pass into the hands of the new stateless guerrillas, the jihad-ists, who offer none of the retaliatory targets that have so far
underlain the nuclear peace" or "if a would-be nuclear terrorist calculated the odds, he would have to admit that they are stacked against them," but they are "not impossible."5 The
previous chapter arrayed a lengthy set of obstacles confronting the would-be atomic terrorist, often making use in the process of Langewiesche's excellent reporting. Those who
warn about the likelihood of a terrorist bomb contend that a terrorist group could, if often with great difficulty, surmount each obstacle, and that doing so in each case is, in
Langewiesche's phrase, "not impossible."6 But it is vital to point out that, while it may be "not impossible" to surmount each individual step, the likelihood that a group
could surmount every step and produce a workable atomic bomb seems to be vanishingly small. ARRAYING THE BARRIERS Assuming terrorists have some
desire for the bomb (an assumption questioned in the next chapter), fulfillment of that desire is obviously another
matter. Even the very alarmed Matthew Bunn and Anthony Wier contend that the atomic terrorists' task "would clearly
be among the most difficult types of attack to carry out" or "one of the most difficult missions a terrorist group could
hope to try." But, stresses the CIA's George Tenet, a terrorist atomic bomb is "possible" or "not beyond the realm of possibility." In his
excellent discussion of the issue, Michael Levi ably catalogues a wide array of difficulties confronting the would-be
atomic terrorist, adroitly points out that "terrorists must succeed at every stage, but the defense needs
to succeed only once," sensibly warns against preoccupation with worst-case scenarios, and pointedly formulates "Murphy's Law
of Nuclear Terrorism: What can go wrong might go wrong." Nevertheless, he holds nuclear terrorism to be a "genuine possibility," and
concludes that a good defensive strategy can merely "tilt the odds in our favor."7 Accordingly, it might be useful to take a stab
at estimating just how "difficult" or "not impossible" the atomic terrorists' task, in aggregate, is; that is, how
far from the fringe of the "realm of possibility" it might be, how "genuine" the possibilities are, how tilted the odds actually are. After
all, lots of things are "not impossible." It is "not impossible" that those legendary monkeys with
typewriters could eventually output Shakespeare.8 Or it is "not impossible", that is, there is a "genuine possibility",
that a colliding meteor or comet could destroy the earth, that Vladimir Putin or the British could decide one
morning to launch a few nuclear weapons at Ohio, that an underwater volcano could erupt to cause a
civilization-ending tidal wave, or that Osama bin Laden could convert to Judaism, declare himself to be the
Messiah, and fly in a gaggle of mafioso hit men from Rome to have himself publicly crucified.9 As suggested, most
discussions of atomic terrorism deal in a rather piecemeal fashion with the subject, focusing separately on individual tasks such as
procuring HEU or assembling a device or transporting it. However, as the Gilmore Commission, a special advisory panel to the president
and Congress, stresses, setting off a nuclear device capable of producing mass destruction presents
"Herculean challenges," requiring that a whole series of steps be accomplished: obtaining enough fissile
material, designing a weapon "that will bring that mass together in a tiny fraction of a second," and figuring out some
way to deliver the thing. And it emphasizes that these merely constitute "the minimum requirements."
If each is not fully met, the result is not simply a less powerful weapon, but one that can't produce any
significant nuclear yield at all or can't be delivered.10 Following this perspective, an approach that seems appropriate is to catalogue
the barriers that must be overcome by a terrorist group in order to carry out the task of producing, transporting, and then successfully
detonating an improvised nuclear device, an explosive that, as Allison acknowledges, would be "large, cumbersome, unsafe, unreliable,
unpredictable, and inefficient." Table 13.1 attempts to do this, and it arrays some 20 of these, all of which must be surmounted by the
atomic aspirant. Actually, it would be quite possible to come up with a longer list: in the interests of keeping the catalogue of hurdles
down to a reasonable number, some of the entries are actually collections of tasks and could be divided into two or three or more. For
example, number 5 on the list requires that heisted highly enriched uranium be neither a scam nor part of a sting nor of inadequate
quality due to insider incompetence, but this hurdle could as readily be rendered as three separate ones. In contemplating the task before
them, would-be atomic terrorists effectively must go through an exercise that looks much like this. If and when they do so, they are
likely to find the prospects daunting and accordingly uninspiring or even terminally dispiriting.
Economy
Terrorist attacks have a minimal effect on the economy
a. Ports aren't key and empirics disprove
Jon D. Haveman and Howard J. Shatz 2006 - Director of the Economy Program at
the Public Policy Institute of California and Senior Economist; Professor, Pardee RAND
Graduate School (Protecting the Nation's Seaports: Balancing Security and Cost, Pg
30-31, http://www.ppic.org/content/pubs/report/r_606jhr.pdf)
In Chapter 2, Edward E. Leamer and Christopher Thornberg argue that the actual
costs of an attack on the Los Angeles-Long Beach port complex may not be as high
as many fear. For example, if a port is closed, many shippers will reroute their
shipments through other ports. In addition, displaced workers will seek alternative
employment. As a result, the economy will adjust. Some output will be lost, but it
may be so small in magnitude that it will not reveal itself in data that track
national or even regional macroeconomic trends. The authors provide examples of
other disruptions that might have caused severe economic damage but did not,
such as the terrorist attacks of September 11, 2001. Consumer spending fell
immediately after the attacks but then rebounded sharply at the end of 2001,
growing at an unprecedented, seasonally adjusted annual rate of 7 percent.
Likewise, although retail sales fell immediately after the attacks, they returned to
trend in November, only two months later. Some sectors did suffer, most notably
the airline industry, which had already been in deep trouble before the end of the
technology boom in early 2001. But consumer spending actually increased,
suggesting that people reallocated the money that they would have spent on
airline travel to other forms of consumption. Similarly, the authors argue that other
disruptions such as hurricanes, earthquakes, and even labor disputes at seaports
did have immediate negative economic effects but that these effects dissipated
quickly as the economy adjusted. The message in this is that most such disruptions
lead to business being delayed rather than business being cancelled, which in turn
results in much less economic harm than would be expected.
September 11 Did Not Cause the 2001 Recession There is a strong tendency
to blame too many secondary effects on disasters. A good example of
this phenomenon is found in the September 11 attacks on New York and
Washington, D.C. In the days after the attacks, the rhetoric regarding
the potential effect on the national economy was both loud and
wrong. The theory proposed by many analysts and journalists was
that psychologically fragile consumers in the United States would
suffer a crisis and stop spending, driving the economy into a deeper
recession. Support for this theory came from the first Gulf War, which
supposedly caused a similar consumer crisis of confidence that in turn drove us into
a recession in 1990. For example, the Wall Street Journal reported on September 13,
2001: Past shocks to America's sense of security, such as the Oklahoma City
bombing or the Gulf War, have prompted consumers to pull back temporarily on
major purchases and other discretionary spending, said Richard Curtin, director of
surveys of consumers at the University of Michigan. He expects a similar reaction
now, which could mean a rough time in the next several weeks for the economy,
which was already struggling with rising jobless rates and high consumer debt
burdens. We were teetering on the edge, and this might well push us over, said
exception: in 1967, when the economy was wobbling, appearing to be on the verge
of recession, the sharp increase in spending for the Vietnam War propelled the
economy forward. This was just the reverse of what Mr. Curtin suggested.
Bright greens take the continued existence of a habitable Earth with viable,
sustainable populations of all species including humans as the ultimate truth and
the meaning of life. Whether this is possible in a time of economic collapse is
crucially dependent upon whether enough ecosystems and resources remain post
collapse to allow humanity to recover and reconstitute sustainable, relocalized
societies. It may be better for the Earth and humanity's future that economic
collapse comes sooner rather than later, while more ecosystems and opportunities
to return to nature's fold exist. Economic collapse will be deeply wrenching -- part
Great Depression, part African famine. There will be starvation and civil strife, and a
long period of suffering and turmoil. Many will be killed as balance returns to the
Earth. Most people have forgotten how to grow food and that their identity is more
than what they own. Yet there is some justice, in that those who have lived most
lightly upon the land will have an easier time of it, even as those super-consumers
living in massive cities finally learn where their food comes from and that ecology is
the meaning of life. Economic collapse now means humanity and the Earth
ultimately survive to prosper again. Human suffering -- already the norm for many,
but hitting the currently materially affluent -- is inevitable given the degree to which
the planet's carrying capacity has been exceeded. We are a couple decades at most
away from societal strife of a much greater magnitude as the Earth's biosphere fails.
Humanity can take the bitter medicine now, and recover while emerging better for
it; or our total collapse can be a final, fatal death swoon.
Empirical support for the economic growth rate is much weaker. The finding
that poor economic performance is associated with a higher likelihood of
territorial conflict initiation is significant only in Models 3-4.14 The weak results
are not altogether surprising given the findings from prior literature. In accordance with
the insignificant relationships of Models 1-2 and 5-6, Ostrom and Job (1986), for example, note that the likelihood that a U.S.
President will use force is uncertain, as the bad economy might create
incentives both to divert the public's attention with a foreign adventure and to
focus on solving the economic problem, thus reducing the inclination to act
abroad. Similarly, Fordham (1998a, 1998b), DeRouen (1995), and Gowa (1998) find no
relation between a poor economy and U.S. use of force. Furthermore, Leeds and Davis
(1997) conclude that the conflict-initiating behavior of 18 industrialized democracies is
unrelated to economic conditions as do Pickering and Kisangani (2005) and
Russett and Oneal (2001) in global studies. In contrast and more in line with my findings of a significant relationship (in Models 3-
4), Hess and Orphanides (1995), for example, argue that economic recessions are linked with forceful action by an incumbent U.S. president.
Furthermore, Fordham's (2002) revision of Gowa's (1998) analysis shows some effect of a bad economy and DeRouen and Peake (2002) report
that U.S. use of force diverts the public's attention from a poor economy. Among cross-national studies, Oneal and Russett (1997) report that slow
growth increases the incidence of militarized disputes, as does Russett (1990), but only for the United States; slow growth does not affect the
behavior of other countries. Kisangani and Pickering (2007) report some significant associations, but they are sensitive to model specification,
while Tir and Jasinski (2008) find a clearer link between economic underperformance and increased attacks on domestic ethnic minorities. While
none of these works has focused on territorial diversions, my own inconsistent findings for economic growth fit well with the mixed results
reported in the literature.15 Hypothesis 1 thus receives strong support via the unpopularity variable but only weak support via the economic
growth variable. These results suggest that embattled leaders are much more likely to
respond with territorial diversions to direct signs of their unpopularity (e.g.,
strikes, protests, riots) than to general background conditions such as economic
malaise. Presumably, protesters can be distracted via territorial diversions while fixing the economy would take a more concerted and
prolonged policy effort. Bad economic conditions seem to motivate only the most serious, fatal territorial confrontations. This implies that leaders
may be reserving the most high-profile and risky diversions for the times when they are the most desperate, that is, when their power is
threatened both by signs of discontent with their rule and by more systemic problems plaguing the country (i.e., an underperforming economy).
can be directed toward solving internal problems. In the case of the diversion
option, rivals provide logical, believable actors for leaders to target; the presence of
a clear rival may offer unstable elites a particularly inviting target for hostile
statements or actual conflict as necessary. The public and relevant elites already
consider the rival a threat or else the rivalry would not have continued for an
extended period; the presence of disputed issues also provides a casus belli with
the rival that is always present. Rivals also may provide a target where the possible
costs and risks of externalization are relatively controlled. If the goal is diversion,
leaders will want to divert attention without provoking an actual (and expensive) war.
Over the course of many confrontations, rival states may learn to anticipate
response patterns, leading to safer disputes or at least to leaders believing that
they can control the risks of conflict when they initiate a new confrontation. In sum,
rivals provide good targets for domestically challenged political leaders. This leads
to our first hypothesis, which is as follows: Hypothesis 1: Poor economic conditions
lead to diversionary actions against the rival. Conflict settlement is also a distinct
route to dealing with internal problems that leaders in rivalries may pursue when
faced with internal problems. Military competition between states requires large
amounts of resources, and rivals require even more attention. Leaders may
choose to negotiate a settlement that ends a rivalry to free up important
resources that may be reallocated to the domestic economy. In a guns versus
butter world of economic trade-offs, when a state can no longer afford to pay the
expenses associated with competition in a rivalry, it is quite rational for leaders to
reduce costs by ending a rivalry. This gain (a peace dividend) could be achieved at
any time by ending a rivalry. However, such a gain is likely to be most important
and attractive to leaders when internal conditions are bad and the leader is
seeking ways to alleviate active problems. Support for policy change away from
continued rivalry is more likely to develop when the economic situation sours and
elites and masses are looking for ways to improve a worsening situation. It is at
these times that the pressure to cut military investment will be greatest and that
state leaders will be forced to recognize the difficulty of continuing to pay for a
rivalry. Among other things, this argument also encompasses the view that the cold
war ended because the Union of Soviet Socialist Republics could no longer compete
economically with the United States. Hypothesis 2: Poor economic conditions
increase the probability of rivalry termination. Hypotheses 1 and 2 posit opposite
behaviors in response to a single cause (internal economic problems). As such, they
demand a research design that can account for substitutability between them.
Water
Water Wars-Solvency
Scarcity drives middle eastern cooperation
Sikimic 11
[Simona, author for the daily star, Lebanon news. Jan. 21, 2011, Water scarcity can be source for regional
cooperation: report http://www.dailystar.com.lb/News/Lebanon-News/2011/Jan-21/61157-water-scarcity-can-be-source-for-regionalcooperation-report.ashx#axzz36jugkuUs //jweideman]
BEIRUT: Water scarcity in the region can be channeled for a common good and used to
reduce, rather than ignite conflict, an environmental report released Thursday has
claimed. "The Blue Peace: Rethinking Middle East Water," launched at the Beirut-based Carnegie Middle East Center, proposes radical cooperation between the six
concerned states: Lebanon, Israel, the Palestinian Territories, Jordan, Iraq and
Turkey, and envisages the neighbors setting up a mutual monitoring system to
guarantee collaboration and more equal partitioning of resources. The social and economic
development of nations depends on water availability in terms of quantity and quality, said Fadi Comair, the
president of Mediterranean Network of River Basin Organizations. It holds a major
place on the diplomatic agenda of the [six] governments, Comair said. River flows in Turkey, Syria,
Iraq, Lebanon and Jordan have been depleted by 50 to 90 percent in the last 50 years alone, while the vital Jordan River, which acts
as a water source for five of the concerned countries, has decreased its discharge by over 90 percent from 1960, the report said.
This is a serious problem, said Sundeep Waslekar, the president of the India-based think tank Strategic Foresight Group, which
coordinated the reports compilation. Especially as water demand is rising and consumption has gone up from [an estimated] 10-15
percent 50 years ago to 37 percent now. With consumer requirements predicted to increase to 50-60 percent over the next decade,
further pressure will be put on ever-dwindling supplies, he said. But hydrodiplomacy as outlined in the report has the potential to
alleviate conflicts on trans-boundary watercourse between riparian states [which] will intensify more and more, especially in the
Trade
No Solvency-Alt cause
Too many alt causes; trade is declining in the SQ
IEC 09 (The International Economy is a specialized quarterly magazine covering global financial policy, economic trends, and international trade.
Spring of 2009. Collapse in World Trade from The Magazine of International Economic Policy. http://www.internationaleconomy.com/TIE_Sp09_WorldTrade.pdf July 8, 2014)
The collapse of trade since the summer of 2008 has been absolutely terrifying,
more so insofar as we lack an adequate understanding of its causes. Murky
protectionism has played a role. Disruptions to the supply of trade credit from
international banks in particular have negatively impacted some countries. The
most important factor is probably the growth of global supply chains, which has
magnified the impact of declining final demand on trade. When a U.S. household
decides not to buy a $40,000 Cayenne sport utility vehicle from Germany,
German exports to the United States go down by $40,000, but Slovakian exports to
Germany go down by perhaps half that amount, since while the final assembly is
done in Leipzig, the coachwork is done in Bratislava. All this said, it really is the
case that we don't fully understand the relative importance of the effects. If it is
any consolation, there are signs that trade will rise with recovery every bit as fast
as it fell with the onset of the crisis.
There are several reasons why international trade has collapsed faster than global
GDP. First, the decline in manufacturing output has exceeded the decline in real
GDP. For example, U.S. industrial production in the first quarter of 2009 was 14
percent lower than its peak in the fourth quarter of 2007. In contrast, the level of
real GDP in the United States in the first quarter of 2009 was only 3.2 percent below
its peak. Demand for tradable goods is largely derived from demand for
manufactured goods, so the collapse in trade reflects the collapse in manufacturing.
Second, the decline in global trade has been exacerbated by business efforts to
conserve cash rather than build up inventories to prior levels. Trade flows are being
buffeted not only by a general decline in activity, but also by this temporary move
to lower inventories. Third, the decline in trade reflects the crisis in finance. Credit
plays a critical role in international trade, and the disruption in global credit markets
has restricted flows of credit needed to support trade. Finally, the boom in
commodities-related trade has been replaced by gloom. Prices for many
commodities are falling and ports are packed with imports (think Houston and oil-country tubular goods) for which there is currently limited demand. This new reality
is forcing even hyperactive exporters to cut back shipments dramatically.
No Trade War
Trade war wont happen.
Fletcher 2010
Ian, Senior Economist of the Coalition for a Prosperous America, former Research
Fellow at the U.S. Business and Industry Council, The Mythical Concept of Trade War,
April 2nd 2010, http://www.huffingtonpost.com/ian-fletcher/the-mythical-concept-oft_b_523864.html
As Americans ponder how to get the U.S. out of its current trade mess, we are constantly warned to do nothing - like
impose a tariff to neutralize Chinese currency manipulation - that would trigger a "trade war." Supposedly, no
matter how bad our problems with our trading partners get, they are less bad than the spiraling catastrophe that
would ensue if we walked a single inch away from our current policy of unilateral free trade. But the curious
thing about the concept of trade war is that, unlike actual shooting war, it has no
historical precedent. In fact, the reality is that there has never been a significant
trade war, "significant" in the sense of having done serious economic damage. All history records are minor
skirmishes at best. The standard example free traders give is that America's Smoot-Hawley
tariff of 1930 either caused the Great Depression or made it spread around the world. But this canard does not
survive serious examination, and has actually been denied by almost every
economist who has actually researched the question in depth--a group ranging from
Paul Krugman on the left to Milton Friedman on the right. The Depression's cause
was monetary. The Fed allowed the money supply to balloon during the late 1920s, piling up in the stock
market as a bubble. It then panicked, miscalculated, and let it collapse by a third by 1933, depriving the economy
number of countries do not discriminate against the commerce of the United States in any way. "Notorious"
Supposedly, China can suddenly stop buying our Treasury Debt if we rock the boat.
But this would immediately reduce the value of the trillion or so they already hold, not to mention destroying, by making their hostility overt, the fragile (and desperately-tended) delusion
in the U.S. that America and China are still benign economic "partners" in a win-win economic relationship.
At the end of the day, China cannot force us to do anything economically that we don't choose to. America is still a
nuclear power. We can--an irresponsible but not impossible scenario--repudiate our debt to them (or stop paying the
interest) as the ultimate countermove to anything they might contemplate. More plausibly, we might simply restore
the tax on the interest on foreign-held bonds that was repealed in 1984 thanks to Treasury Secretary Donald Regan.
A certain amount of back-and-forth token retaliation (and loud squealing) is indeed likely if
America starts defending its interests in trade as diligently as our trading partners
have been defending theirs, but that's it. After all, the world trading system has
survived their trade barriers long enough without collapsing.
But in one other important respect the comparison with the 1930s is highly
misleading. Then, tit-for-tat trade protection rapidly followed the Wall Street Crash,
and the world splintered into warring trade blocs. This has not happened
today, and it is unlikely to happen anytime soon. As we will discuss in the next
section, crisis-related protectionism today is remarkably restrained .
Multilateral trade rules, international policy cooperation and market-led
globalisation provide defences against a headlong descent into 1930s-style
protectionism.
No retaliation
Erixon and Sally, directors-ECIPE, 10 (Fredrik and Razeen, European Centre
for International Political Economy, TRADE, GLOBALISATION AND EMERGING PROTECTIONISM
SINCE THE CRISIS, http://www.ecipe.org/media/publication_pdfs/trade-globalisation-andemerging-protectionism-since-the-crisis.pdf)
Perhaps the biggest surprise is that the world has not hurtled into tit-for-tat protectionism. That
spectre was real enough in late 2008, given the scale of growth contraction and deglobalisation. The good news is
increased by 15 per cent from mid 2008 to mid 2009, and there has been a marked pickup in new investigations for
of forgoing the economic gains of trade did not stop Europe's great powers from fighting a prolonged and
Beijing recognizes its strategic interest in preserving peace in East Asia. Stability in the region, and in Sino-
American relations, allows China to become richer and to catch up to the United States in relative power. For a
state in China's position vis-à-vis the United States, this is the optimal realpolitik strategy:
buying time for its economy to grow so that the nation can openly balance against
the United States militarily and establish its own regional hegemony in East Asia.
Beijing is pursuing a peaceful policy today in order to strengthen itself to
confront the United States tomorrow.