
JWST CORE NEGATIVE

Upload: dinhdan

Post on 01-Sep-2018



***STEM IMMIGRATION CP

1NC CP TEXT

The United States federal government should provide employment-based visas to graduates who earn a Master’s or PhD degree from an accredited U.S. university in a science, technology, engineering, or math field of study.

1NC ECONOMY SOLVENCY

The plan solves competitiveness and economic leadership in the short and long term
Fitz, 09 (Marshall Fitz, Director of Immigration Policy at the Center for American Progress; former director of advocacy for the American Immigration Lawyers Association; a key legislative strategist in support of CIR; media spokesperson on a broad array of immigration policy and legislative issues, quoted extensively across the international and national press; has presented at national conferences and universities on immigration matters, advised numerous members of Congress on immigration, and helped draft major legislation. “Prosperous Immigrants, Prosperous Americans: How to Welcome the World’s Best Educated, Boost Economic Growth, and Create Jobs,” December 2009)

Immigrants who come to the United States to study at our best universities and then go to work at our nation’s leading companies contribute directly and immediately to our nation’s global economic competitiveness. High-skilled immigrants who have started their own high-tech companies have created hundreds of thousands of new jobs and achieved company sales in the hundreds of billions of dollars. Yet despite the importance of such immigrants to the nation’s economic success in a global economy, our current high-skilled immigration system falls short, and inadequate safeguards fail to prevent wage depression and worker mistreatment. The reforms outlined in this paper will help establish a 21st century immigration system that serves the nation’s economic interests and upholds our responsibilities in a global economy. Of course, our current immigration policies have failed the country on many fronts beyond the high-skilled policy arena. And the urgent need for comprehensive, systemic reforms is beyond question. The national debate has understandably focused up to this point on the most visible and most highly charged issue—ending illegal immigration. Solving that riddle and ending illegal immigration is indisputably a national imperative and must be at the heart of a comprehensive overhaul of our system. But reforms to our high-skilled immigration system are an important component of that broader reform and integral to a progressive growth strategy.1 Science, technology, and innovation have been—and will continue to be—keys to U.S. economic growth. The United States must remain on the cutting edge of technological innovation if we are to continue driving the most dynamic economic engine in the world,2 and U.S. companies must be able to recruit international talent to effectively compete in the international innovation arena. To be certain, educating and training a 21st century U.S.
workforce is a paramount national priority and the cornerstone of progressive growth. Improving access to topflight education for everyone in this country will be the foundation for our continued global leadership and prosperity.3 But it is shortsighted in a globalized economy to expect that we can fill all of our labor needs with a homegrown workforce. In fact, our current educational demographics point to growing shortfalls in some of the skills needed in today’s economy.4 And as global economic integration deepens, the source points for skill sets will spread—such as green engineering in Holland or nanotechnology in Israel—the breadth of skills needed to drive innovation will expand, and global labor pools must become more mobile. Reforming our high-skilled immigration system will stimulate innovation, enhance competitiveness, and help cultivate a flexible, highly skilled U.S. workforce while protecting U.S. workers from globalization’s destabilizing effects. Our economy has benefited enormously from being able to tap the international pool of human capital.5 Arbitrary limitations on our ability to continue doing so are ultimately self-defeating: companies will lose out to their competitors, making them less profitable, less productive, and less able to grow; or they will move their operations abroad, with all the attendant negative economic consequences. And the federal treasury loses tens of billions of dollars in tax revenues by restricting the opportunities for high-skilled foreign workers to remain in the United States.6 Access to high-skilled foreign workers is critical to our economic competitiveness and growth, but facilitating such access triggers equally critical flip-side considerations, in particular the potential for employers to directly or indirectly leverage foreign workers’ interests against the native workforce.
Current enforcement mechanisms are too weak to adequately prevent fraud and gaming of the system.7 And current regulations tie foreign workers too tightly to a single employer, which empowers employers with disproportionate control over one class of workers. That control enables unscrupulous employers to deliberately pit one group of workers against another to depress wage growth.8 Even when there is no malicious employer intent or worker mistreatment, the restriction of labor mobility inherently affects the labor market by preventing workers from pursuing income-maximizing opportunities.

2NC SOLVENCY – LONG TERM

Sustainable economic recovery
Quigley and Flake, 09 (House Representatives Mike Quigley and Jeff Flake, Letter to the House of Representatives Committee on the Judiciary, October 29, 2009)

The National Economic Council’s Strategy for American Innovation, released in September, rightly focuses on strengthening competitive markets that foster productive entrepreneurship. One key factor in ensuring a vibrant and long-term economic recovery is making certain U.S. employers have access to the best and brightest talent. However, U.S. companies are hampered by stringent caps on green cards and temporary visas for highly skilled foreign workers who have earned Ph.D.s at American universities. We urge you to support H.R. 1791, the Stopping Trained in America Ph.D.s from Leaving the Economy Act, or “STAPLE” Act, of 2009. The STAPLE Act would exempt foreign-born individuals who have earned an American Ph.D. in science, technology, engineering, or mathematics from the limits on the number of employment-based green cards and H-1B visas awarded annually.

Counterplan solves the base for economic growth – retention of foreign nationals in STEM fields now is critical to economic recovery
ACIP, 09 (American Council on International Personnel, Letter from ACIP to House Representative Jeff Flake, April 3, 2009)

Allowing certain foreign nationals who have earned a Ph.D. degree from a U.S. educational institution in a science, technology, engineering and math (STEM) field to be exempt from both the employment-based green card and H-1B visa caps will help ensure employers continue to innovate and lay the groundwork for America’s future economic recovery and growth. Quotas on foreign national talent have always kept bright minds from coming to our country and staying in our country. At a time when our country needs its economy to recover, the STAPLE Act sends a strong message that it is in our national economic interest to benefit from the foreign national talent that we educate at home in our U.S. universities. Losing such talent in these fields to our overseas competitors at this critical economic juncture would be detrimental to America’s economic recovery. With President Obama’s call for the creation of 4 to 5 million jobs over the next ten years in key fields like healthcare, energy and education, the STAPLE Act allows U.S. employers to work toward that goal by retaining highly educated individuals in fields where America does not have the necessary pipeline of U.S. students to position us to better compete.

2NC SOLVENCY – GENERIC

Technology, jobs, economic expansion, productivity
Fitz, 09 (Marshall Fitz, Director of Immigration Policy at the Center for American Progress; former director of advocacy for the American Immigration Lawyers Association. “Prosperous Immigrants, Prosperous Americans: How to Welcome the World’s Best Educated, Boost Economic Growth, and Create Jobs,” December 2009)

Talented immigrants have made crucial contributions to the development of next generation technologies and have founded some of the most innovative businesses in the United States. They have created jobs, fueled productivity, and driven economic expansion. And as global economic integration deepens, sustainable growth will depend in part on our continued ability to attract the best and brightest innovators and entrepreneurs. Simply put, enhanced labor mobility is a 21st century reality and ultimately an economic imperative. But as the global talent pool expands and becomes more fluid, it also creates instability in some sectors of our homegrown labor force. Our policy makers must strive to minimize those effects and prevent employers from leveraging the interests of immigrant and native workers against each other. As the nation emerges from the shadows of this great recession, we must embrace a progressive growth strategy that enhances our global competitiveness. The reforms to our high-skilled immigration policies outlined in this paper will help promote the nation’s dual interest in growing the economy and protecting workers.

2NC SOLVENCY – SPILL OVER

STEM immigrants key to aid university growth – that also spills over to other sectors of the economy
Lu, 09 (Meng Lu, “Part of the Family: U.S. Immigration Policy and Foreign Students,” Thurgood Marshall Law Review, Spring 2009, 34 T. Marshall L. Rev. 343)

Foreign students are considered technical brainpower, who often enroll in academic programs in science, technology, engineering, and mathematics (STEM). n11 Scholars and educators found that among the major developed countries and the newly industrialized countries, the United States ranks near the bottom in mathematics and science achievement among eighth graders. n12 Therefore recruiting foreign students improves the overall quality of new science and engineering Ph.D.s by drawing on a wider range of talented students who become the key contributors in driving the knowledge-based economy. n13 According to a report by the Institute of International Education (IIE), more than one-third of Nobel laureates from the United States are immigrants. n14 A study conducted by Chellaraj, Maskus, and Mattoo highlighted that foreign graduate students are significant inputs into developing new technologies in the American economy. n15 They found that their impacts are particularly pronounced within the universities but also have spill-over effects on non-university patenting. n16 Economic benefits, especially to the universities, also arise because foreign students, "while enrolled in schools, are an important part of the workforce at those institutions, particularly at large research universities." n17 They usually work as research assistants providing skilled assistance or research labor in the scientific labs, or as teaching assistants who help faculty grade undergraduates' homework and sometimes teach undergraduate classes.
According to Borjas, these students play the same role in staffing the research labs of American universities that Mexican illegal workers play in staffing the vast agricultural fields of California. Both groups of workers enter the country, substantially increase the supply of workers, lower wages in their respective occupations, and increase the profits and economic resources of the companies that hire them. n19 Students who have to finance their studies in the United States pay full tuition, and the higher education sector has become increasingly dependent on the high tuition fees paid by foreign students. n20 According to a 2005 study conducted by the Institute of International Education, there were 565,039 international students studying in the United States, and the net contribution of foreign students to the national economy is approximately $13.3 billion just through their tuition expenditure and living expenses. n21 These numbers show the important role played by international students in supporting the higher education institutions.

2NC SOLVENCY – WAGES

Counterplan solves wages – high skilled workers don’t compete with American jobs
FNS, 08 (Federal News Service, Capitol Hill Hearing of the Subcommittee on Immigration, Citizenship, Refugees, Border Security and International Law of the House Committee on the Judiciary, June 12, 2008; Subject: The Need for Green Cards for Highly Skilled Workers)

To borrow a line from Harvard economist George Borjas, quote, "Skilled immigrants earn more, pay higher taxes and require fewer social services than less-skilled immigrants," end quote. This is verified by the National Research Council, which found that each immigrant with more than a high school education provides a net fiscal benefit to American taxpayers of $105,000 over their lifetime. On the other hand, each immigrant with less than a high school education imposes a net fiscal burden of $89,000 on taxpayers. It's clear that American taxpayers benefit from highly skilled and educated immigrants but not from low-skilled and uneducated immigrants, yet this is ignored, again, by our immigration system. Despite these facts, 95 percent of legal immigrants to the United States are not admitted based on their skills and education. Well, what's the result? Hundreds of thousands of new immigrants without a high school education arrive each year. This has a devastating impact on the wages and job opportunities of disadvantaged native-born Americans. In 2003, there were 8.8 million employed, native-born adults without a high school diploma, 1.3 million who were unemployed and 6.8 million no longer even in the labor force. Native-born Americans comprise 68 percent of all workers employed in occupations requiring no more than a high school education. These are some of the Americans competing with low-skilled and uneducated immigrants for jobs. Immigration is already having a depressing effect on the standard of living of vulnerable American workers.
Steve Camarota at the Center for Immigration Studies has estimated that immigration has reduced the wages of the average native-born worker in a low-skilled occupation by 12 percent a year, or almost $2,000. Professor Borjas estimates that immigration in recent decades has reduced the wages of native-born workers without a high school degree by 7.4 percent. Congress should have revised our immigration policy long ago. Given the current state of the economy and the ever-increasing retirement of baby boomers, we cannot wait any longer. Congress has a responsibility to promote immigration policies that protect the American worker and promote a strong American economy. To do that, we must prioritize the immigration of high-skilled and educated individuals.

Counterplan solves long term wage growth – most recent studies prove
Holen, 09 (Arlene Holen, Senior Fellow, Technology Policy Institute, “The Budgetary Effects of High-Skilled Immigration Reform,” March 2009)

A number of studies have estimated labor market outcomes for domestic workers that result from the presence of foreign-born workers. In principle, to the extent foreign-born workers have similar skills and experience as native workers, they would compete with native workers for jobs and tend to lower their wages. But immigrants in general have different characteristics than native workers. Among other differences, they more frequently hold advanced degrees. Differences between domestic and immigrant workers in education and skills can lead to complementarities that result in benefits including higher earnings for domestic workers. Studies of the effects of immigration on labor markets have taken two approaches: some have focused on areas where there were large increases in the number of immigrants while others have looked at nationwide variations in the number of immigrants over time. A study by George Borjas, examining detailed census data on native workers, concluded that a 10 percent increase in workers in a particular education-experience group would reduce weekly earnings in that group by roughly 4 percent before adjustments in new investment in capital or before investments in skills by workers are made (Borjas 2003). Most recent studies have found little effect of immigrants on domestic workers (e.g. Card 1990). A review of the empirical literature by the National Research Council concluded that there is only a weak relationship between native wages and the number of immigrants. One group that appears to be most affected is immigrants from earlier waves, for whom the more recent immigrants are close substitutes in the labor market (Smith and Edmonston 1997, p. 6).
Secondary economic adjustments to immigration occur because immigrants stimulate the demand for capital and also encourage domestic workers to invest in more education. A subsequent study by Borjas concluded that if there were complete adjustment of the capital stock, immigrants would have no adverse effect on native workers’ earnings (Borjas 2005). One recent analysis that examined adjustment costs concluded that capital generally adjusts quickly to changes in other factors of production (Hall 2004). A study of immigrants’ wage effects that took account of adjustments in the capital stock concluded that immigration tends to slightly raise the average wages of domestic workers and that the effect is greater when capital has had more time to adjust (Ottaviano and Peri 2006). A more recent study by the same authors found that in the long run, immigration has a small positive effect on average native wages and on the wages of native workers without a high school degree (Ottaviano and Peri 2008).1

Positive effects on wages outweigh any negative ones
Holen, 09 (Arlene Holen, Senior Fellow, Technology Policy Institute, “The Budgetary Effects of High-Skilled Immigration Reform,” March 2009)

On net, CBO concluded that, notwithstanding many uncertainties surrounding assessments of the budgetary impact of proposed immigration policies, S. 2611 would increase economic growth by a small degree and could improve the financial outlook for the Social Security system, although not by enough to avert the funding shortfall projected in Social Security’s long-term outlook. The agency’s review of the existing research literature on immigration found that, in aggregate and over the long term, tax revenues generated by immigrants exceed the cost of the services they receive (CBO 2007c, p. 1). An important factor that affects budgetary impact is the skill level of new workers—policies that provide more access for higher-skilled workers would yield more favorable budgetary effects than policies that provide more access for lower-skilled workers.

2NC SOLVENCY – EXPANDING MARKETS

Expanding markets
CFR, 09 (U.S. Immigration Policy: Report of a CFR-Sponsored Independent Task Force, July 18, 2009, http://www.cfr.org/publication/19759/us_immigration_policy.html)

First of all, coming from the business world, clearly, the economy both in our country and around the world is changing dramatically. We see that in a very profound way. So science and engineering is going to be critically important as we reinvent and recast our economy, and there's no question that talent and mindpower from other countries that come here to study can be so dramatically helpful in that regard. Number two, I think with those students who come here, work, and have these ideas, a very large percent of the patents and new ideas and so forth clearly are reflecting that kind of input from students who have come here to study. That creates jobs, good-paying jobs, new jobs, again, for American workers. And thirdly, I think very importantly, as I have seen time and time again, and both of our sons have as well, when students come here to study, they are allowed to stay here to work and contribute. Many times they go home and start companies and endeavors in their own countries that obviously have links to our country and expanding markets around the world, and that is absolutely crucial going forward if we're to have a vibrant, full-employment economy here.

2NC SOLVENCY – FEDERAL BUDGET

STEM immigrants solve the federal budget
Holen, 09 (Arlene Holen, Senior Fellow, Technology Policy Institute, “The Budgetary Effects of High-Skilled Immigration Reform,” March 2009)

Most economists believe that admitting more highly skilled workers from other countries is beneficial to the U.S. economy. This is particularly true of workers in the fields of science, technology, engineering, and mathematics (STEM). Immigration also has positive effects on the federal budget. Highly skilled workers pay more in taxes than less skilled workers and they are not likely to receive federal benefits, particularly in the near term. This paper examines those fiscal effects to help inform the immigration policy debate. The estimates are not precise. They rely on very simple assumptions that are consistent with the economics literature and indicate the magnitudes involved. The paper finds: In the absence of green card and H-1B constraints, roughly 182,000 foreign graduates of U.S. colleges and universities in STEM fields would likely have remained in the United States over the period 2003-2007. They would have earned roughly $13.6 billion in 2008, raised the GDP by that amount, and would have contributed $2.7 to $3.6 billion to the federal treasury. In the absence of green card constraints, approximately 300,000 H-1B visa-holders whose temporary work authorizations expired during 2003-2007 would likely have been in the United States labor force in 2008. These workers would have earned roughly $23 billion in 2008, raised the GDP by that amount, and would have contributed $4.5 to $6.2 billion to the federal treasury. Similar results are obtained when analyzing legislation considered by Congress during the last few years.
For example, under reasonable assumptions, the relaxation of green card constraints proposed in the Comprehensive Immigration Reform Act of 2006 could have increased labor earnings and GDP by approximately $34 billion in the tenth year following enactment and had a net positive effect on the budget of $34 to $47 billion over ten years. Relaxation of H-1B caps under the Comprehensive Immigration Reform Act of 2007 could have increased labor earnings and GDP by $60 billion in the tenth year following enactment and improved the federal budget’s bottom line by $64 to $86 billion over ten years. Failing to enact such legislation has been costly to the economy and the federal treasury. Although economists hold different views on the economic effects of immigration in general, they are virtually unanimous in their belief that admitting more highly skilled workers, particularly in STEM fields (science, technology, engineering, and mathematics), is beneficial to the U.S. economy (see e.g., Mankiw 2008). High-skilled immigration promotes technological entrepreneurship, economic growth, and productivity. It is less well understood that immigration—especially high-skilled immigration—has beneficial fiscal effects at the federal and also possibly the state and local levels (see e.g., Lee and Miller 2000). This paper examines the economic effects of high-skilled immigration and its effects on the federal budget. Its purpose is to provide data and analysis to help inform the immigration policy debate. Constraints on Admissions in Current Law High-skilled workers can enter the U.S. labor force by obtaining an employment-based green card, which allows an individual to stay in the United States as a permanent resident, or an H-1B visa, which allows an individual to work here for three years, renewable to six years. Current law limits the annual number of H-1B visas to 65,000 and also exempts up to 20,000 foreign nationals holding a master’s or higher degree from a U.S. 
university from the cap. H-1B petitions far exceed the number of slots and are allocated through a random selection process. Most H-1B visa holders and their employers hope to be able to convert their H-1B visa to a green card, so they can stay permanently. The current annual cap on green cards for skilled workers is 40,000 and there is a five-year backlog of applications. (There are separate caps of 40,000 for priority workers with extraordinary ability and also for professionals holding advanced degrees.) Per-country caps further limit admissions, especially of applications from China and India. The result of these constraints is that many high-skilled workers in scientific and technical fields who are currently working in the United States on temporary H-1B visas are forced to leave their jobs each year and return home. Similarly, many foreign students completing scientific and technical training at U.S. colleges and universities who would otherwise remain and work in the United States are returning to their home countries, taking their U.S.-acquired human capital with them. This loss of human resources imposes significant costs on the U.S. economy and constitutes a drain on federal revenues. Over the past three years, Congress has considered comprehensive immigration reform packages that increased employment-based admissions and other, more narrowly targeted bills. Immigration issues are likely to be revisited during the coming months as technology spending in the stimulus package boosts demand for engineers, individuals with advanced degrees, and other skilled workers, at the same time as news of layoffs raises concerns about the jobs and wages of domestic workers.

1NC INNOVATION

Counterplan solves – matching green cards with diplomas restores innovation, growth, and leadership
Frank, 09 (T.A. Frank, Irvine Fellow at the New America Foundation and an editor of the Washington Monthly, “Green Cards for Grads,” Washington Monthly, May/June 2009, http://www.washingtonmonthly.com/features/2009/0905.frank.html)

For years, the United States effortlessly excelled at this game. The governments of China and India made conditions too disagreeable for innovators, while Europe and Australia effectively coasted. But our run of lazy luck is long gone. Today, China, India, the European Union, Australia, and other competitors are all vying to attract the most promising talents to their shores. China and India have greatly improved conditions for would-be entrepreneurs, and the EU is considering a skills-based "blue card" to compete with the green card offered here. Now, we are starting to see our rivals pull ahead of us. In 2007, high-tech exports from the United States to other countries totaled $214 billion, but imports amounted to $333 billion. If we must do the math (and in this case we must, even if India offers to help), then we arrive at an ugly result: a high-tech trade deficit of nearly $119 billion. When the world’s leader in innovation can’t break even, it’s a dubious leader. One way to regain our dominance in the tech sector would be to get more of the brightest people in the world to move here. According to a recent report issued by Duke University and the University of California at Berkeley, roughly a quarter of technology and engineering start-ups in the United States have founders who were born abroad. Over half of Silicon Valley start-ups have at least one immigrant founder. And, according to Microsoft, 35 percent of its patents filed last year came from visa and green card holders. Skilled immigration has become central to innovation and entrepreneurship in the United States. As things stand, however, we’re not exactly making new talent feel welcome. The United States provides advanced degrees to thousands of the world’s most gifted students and then sends them home, where they can work for our competitors. That we pursue such a policy only increases my concern that all the smart people in the United States have already left.
Students who wish to stay here after graduating must undergo arduous procedures that encourage people to exit rather than stay. Some are lucky enough to obtain O-1 visas or green cards purely on the basis of "extraordinary ability," but requirements are so stringent that few students can meet them early in their careers. Most face years of working on H-1B visas, for which demand dwarfs supply. Because the H-1B relies on employer sponsorship, which can easily be revoked, foreign workers are vulnerable to exploitation, which makes the prospect of working here less attractive and drives down wages for everyone. And once the clock runs out on an H-1B visa, the line for an employment-based green card is daunting, with more than a million applicants waiting for just over 100,000 spots each year. What’s more, a maximum of 7 percent of employment-based green cards may go to citizens of any one country, meaning that Chinese and Indians are at an immense disadvantage to, say, Tuvaluans. For many applicants, no option remains but to leave the country. Although precise numbers aren’t available, newspaper reports and statements by college professors and students suggest that today, in contrast to only five or ten years ago, students from China and India are more likely to return home after completing their education in the United States. Capitol Hill isn’t entirely oblivious to such problems. While there is immense disagreement over how to resolve the status of those who are in the country illegally, most members of Congress recognize that legally employed, accomplished immigrants are valuable to the economy. Even a hard-liner like Republican Tom Tancredo has conceded that "the kind of immigration that would have a positive impact would most likely be the kind that is made up of high-income, high-skilled people." Nevertheless, because the entire topic of immigration has become so poisonous, even the most milquetoast stand-alone legislative proposals have stalled. 
One bill that seemed to have a decent chance at passage was sponsored by Republican Senator John Cornyn in 2007 and would have removed caps on employment-based green cards for workers with advanced degrees. (The bill wouldn’t have scrapped H-1Bs, but it would have made them more easily available, which is better than nothing.) This went nowhere. Arlen Specter proposed something similar. This went nowhere. Last year, congressional Democrats such as Zoe Lofgren of San Jose proposed lifting the 7-percent-per-country cap on green cards. Guess where that went. Lately, with the country in recession, Senators Charles Grassley and Dick Durbin have tried to restrict H-1Bs even further, stating that Americans should get an easier shot at available jobs. One especially prominent opponent of increased H-1B immigration is Professor Norm Matloff, a computer scientist at the University of California at Davis. Matloff believes that Silicon Valley is blinded by two vices: greed and ageism. Because of greed, technology companies hire H-1B immigrants, who save them money. Because of ageism, Silicon Valley pushes out older programmers who are perfectly qualified to do the work. Such practices not only force talented older Americans out of the technology industry, they also discourage bright young Americans from entering it, thus squandering some of the country’s finest potential talents. American technology workers would be doing better if there were fewer foreign workers, says Matloff, and Silicon Valley would be just as profitable and innovative, if not more so. Now, there’s no way to say for sure that Matloff is wrong. But two points work against his overall line of thinking. The first is that high-skilled U.S. workers have enjoyed tremendous increases in wages in the past few
decades, while low-skilled workers have seen losses and stagnation. To the extent that immigration has had a negative effect on the wages of American-born workers, the burden hasn’t been borne by graduates of MIT. The second is that from the perspective of the United States as a whole, it’s worth remembering that even if Matloff is right about ageism in Silicon Valley (and if he is, we have a legal system to address such claims), the point of nourishing such a place isn’t only to provide American technology workers with employment. It’s to provide the nation with wealth and a technological edge. It’s to help right our trade balance through high-tech, high-income industries. And to do that, we need to do more than simply fill available vacancies with qualified applicants. For any endeavor that aims to be the best of its sort, an oversupply of talent is necessary. For every professional violinist in the United States, dozens of nearly-as-good—or sometimes even superior—violinists are out of luck in the quest for orchestral employment. If we examine a different sector, such as professional basketball, all these principles are taken for granted. Take Yao Ming of the Houston Rockets. He’s a one-man multimillion-dollar business, and his talent creates work for hundreds of Americans who work for him directly or indirectly in support roles. Undeniably, though, he is also taking away a job that an American-born basketball player could do instead, and there are thousands of American-born basketball players unable to find employment in the field. If the point of having basketball teams is to provide as many American basketball players as possible with jobs, then the United States is following a very wrongheaded immigration policy. (Out of fewer than 500 NBA players currently employed in the United States, more than seventy are foreign-born.) 
But if the point is to create the most competitive teams in the world and to broaden the appeal (and export, and merchandising, and ticket sales, and broadcasts) of basketball to as wide a world market as possible, then our current policy is rather good. A sensible immigration policy would cease placing undue burdens on low-income Americans (in the form of millions of low-skill workers) and start adding to the ranks of high-skill workers, who generate greater innovation and wealth. An immigrant janitor competes for one of a limited number of janitorial jobs, and an oversupply of candidates merely drives down wages. But an immigrant inventor may devise a product that will be manufactured, distributed, marketed, and even exported—all processes that create jobs. While low-skill workers tend to take jobs, high-skill workers tend to make jobs. And high-skill workers who are displaced can at worst usually move down the job chain: an unemployed nuclear physicist may take a temporary job as a cashier. But low-skill workers who are displaced can rarely move up the job chain: an unemployed cashier cannot take a temporary job as a nuclear physicist. The way forward, then, should be pretty simple in principle. Any student with an advanced degree in science, technology, engineering, or math should be offered a reasonable chance at permanent residency in the United States, provided that she can obtain (and retain) employment in her field. The overall aim of any policy we devise should be to prevent an exodus of the people we’ve educated. Yes, we’re in a recession. But that doesn’t mean we can take a break in the race for economic and technological leadership—quite the opposite. The surest way to lose a battle is to forget that the primary goal isn’t to make jobs for soldiers. It’s to avoid getting one’s ass kicked.

2NC SOLVENCY – GENERIC
Knowledge and brain power are key
Wadhwa et al, 07 (Vivek Wadhwa, Wertheim Fellow, Labor and Worklife Program at Harvard Law School, prof. at the Pratt School of Engineering at Duke University; Dr. Guillermina Jasso, prof of sociology at NYU and research fellow, Institute for the Study of Labor (IZA); Ben Rissing, Labor and Worklife Program at the Harvard Law School, research scholar at the Pratt School of Engineering at Duke University; Dr. Gary Gereffi, prof of sociology and director of the Center on Globalization, Governance & Competitiveness at Duke University, bachelor's degree from the University of Notre Dame, and master's and doctorate degrees from Yale University; Richard Freeman, holds the Herbert Ascherman Chair in Economics at Harvard University, and serves as faculty director at the Harvard Law School, Intellectual Property, the Immigration Backlog, and a Reverse Brain-Drain: America's New Immigrant Entrepreneurs, Part III)
The United States benefits from having foreign-born innovators create their ideas in the country. Their departures would, thus, be detrimental to U.S. economic well-being. And, when foreigners come to the United States, collaborate with Americans in developing and patenting new ideas, and employ those ideas in business in ways they could not readily do in their home countries, the world benefits. Therefore, foreign national departures from the United States also reduce global well-being. Given that the U.S. comparative advantage in the global economy is in creating knowledge and applying it to business, it behooves the country to consider how we might adjust policies to reduce the immigration backlog, encourage innovative foreign minds to remain in the country, and entice new innovators to come.

Long term – solves innovative capacity, global image, and reputation
Lu, 09 (Meng Lu, Part Of The Family: U.S. Immigration Policy And Foreign Students, Thurgood Marshall Law Review, Spring 2009, 34 T. Marshall L. Rev. 343)
Historically, the presence of foreign students is viewed, particularly by the academic community, as a positive one. Their coming to the United States for advanced study and high-skilled work increases the Country's long-term competitiveness, innovative capacity, global image, and reputation. n8 Yet, more recently, discrepancies on the impacts of foreign students have arisen. n9 Many scholars show concerns about the large inflows of foreign students positing a threat to national security and taking away educational opportunities for native-born citizens, and they hold suspicious views on how much foreign students contribute to the American economy. n10 Following are the issues central to the public and scholarly debates.

2NC SOLVENCY – PATENTS
International patent creation
Wadhwa et al, 07 (Vivek Wadhwa, Wertheim Fellow, Labor and Worklife Program at Harvard Law School, prof. at the Pratt School of Engineering at Duke University; Dr. Guillermina Jasso, prof of sociology at NYU and research fellow, Institute for the Study of Labor (IZA); Ben Rissing, Labor and Worklife Program at the Harvard Law School, research scholar at the Pratt School of Engineering at Duke University; Dr. Gary Gereffi, prof of sociology and director of the Center on Globalization, Governance & Competitiveness at Duke University, bachelor's degree from the University of Notre Dame, and master's and doctorate degrees from Yale University; Richard Freeman, holds the Herbert Ascherman Chair in Economics at Harvard University, and serves as faculty director at the Harvard Law School, Intellectual Property, the Immigration Backlog, and a Reverse Brain-Drain: America's New Immigrant Entrepreneurs, Part III)
Evidence from the "New Immigrant Survey" indicates that approximately one in five new legal immigrants and about one in three employment principals either plan to leave the United States or are uncertain about remaining. Moreover, media reports suggest that increasing numbers of skilled workers have begun to return home to countries like India and China where the economies are booming. Given the substantial role of foreign-born residents in the United States in international patent creation, and the huge backlog in granting visas to employment-based principals, the potential exists for a reverse brain-drain of skilled workers who contribute to U.S. global competitiveness.

2NC SOLVENCY – EXPERTS
Top innovation experts vote negative
Co, 10 (Emily Co, Green cards for foreign students will boost innovation, expert says, http://absolutelyfobulous.com/2010/06/18/green-cards-for-foreign-students-will-boost-innovation-expert-says/)
What caught my eye in Friedman's "Start-Ups as Graduation Gifts" op-ed piece last week was a quote by one of the country's top experts on innovation, Robert Litan. He is the Vice President of Research and Policy at the Kauffman Foundation. Litan gave a number of suggestions that he thinks would help inject more innovation into the economy; there was one in particular that stood out to me: "Litan said he'd staple a green card to the diploma of every foreign student who graduates from a U.S. university and push for a new meaningful entrepreneurs visa (the current one, the EB-5, requires $1 million of capital that few foreign entrepreneurs have). It would grant temporary residence to any foreigner who comes here to establish a company and permanent residency if that company generates a certain level of new full-time jobs and revenues." Litan makes a really bold statement, but it's a plan that makes a lot of sense. Foreign-born entrepreneurs have been a constant contributor to the economy, co-founding many of the nation's biggest companies. Even Forever 21–every American girl's one-stop shop for fashion–was started by a Korean immigrant couple who opened their first store in the K-town of Los Angeles. There is a movement that's pushing for a less restrictive startup visa for entrepreneurs who are non-US citizens, but since the Obama administration is not making the growth of start-ups a priority, there's no telling how long it will take for the proposal to pass.

1NC SCIENCE LEADERSHIP
The counterplan is essential to US science leadership
Alberts, 03 (Bruce Alberts, President, National Academy of Sciences; Wm. A. Wulf, President, National Academy of Engineering; and Harvey Fineberg, President, Institute of Medicine, Dec. 13, 2002, Current Visa Restrictions Interfere with U.S. Science and Engineering Contributions to Important National Needs)
Building stronger allies through scientific and technological cooperation. It is clearly in our national interest to help developing countries fight diseases such as AIDS, improve their agricultural production, establish new industries, and generally raise their standard of living. There is no better way to provide that help than to train young people from such countries to become broadly competent in relevant fields of science and technology. Yet our new visa restrictions are making this more difficult. For example, several hundred outstanding young Pakistanis, carefully selected by their government as potential leaders of universities there and accepted for graduate training in U.S. universities, experienced a 90 percent denial rate in applying for U.S. visas. Maintaining U.S. global leadership in science and technology. Throughout our history, this nation has benefited enormously from an influx of foreign-born scientists and engineers whose talents and energy have driven many of our advances in scientific research and technological development. Over half a century ago, Albert Einstein, Enrico Fermi, and many others from Western Europe laid the foundations for our global leadership in modern science. More recently, immigrants from other parts of the world -- most notably China, India, and Southeast Asia -- have joined our research institutions and are now the leaders of universities and technology-based industries. Many others have returned to take leadership positions in their home countries, and now are among the best ambassadors that our country has abroad.
Approximately half of the graduate students currently enrolled in the physical sciences and engineering at U.S. universities come from other nations. These foreign students are essential for much of the federally funded research carried out at academic laboratories. Scientific and engineering research has become a truly global enterprise. International conferences, collaborative research projects, and the shared use of large experimental facilities are essential for progress at the frontiers of these areas. If we allow visa restrictions to stop international collaborations at our experimental facilities, then these facilities will cease to attract international support. Moreover, our scientists and engineers will no longer enjoy reciprocal access to important facilities abroad. And if we continue to exclude foreign researchers from conferences held in the United States, then those meetings may cease to take place in this country in the future, depriving many American scientists of the opportunity to participate in them. In short, the U.S. scientific, engineering, and health communities cannot hope to maintain their present position of international leadership if they become isolated from the rest of the world. We seek the help of the U.S. government in implementing effective and timely screening systems for issuing visas to qualified foreign scientists and students who bring great benefit to our country. We view this as an urgent matter, one that must be promptly addressed if the United States is to meet both its national security and economic development goals.

2NC SOLVENCY – GENERIC
Counterplan revitalizes US scientific progress – that's key to US science leadership
Pearson and Yang, '08 (Jon Pearson, director of the Bechtel International Center at Stanford University, testifying on behalf of his professional association, NAFSA, the Association of International Educators, the world's largest professional association dedicated to the promotion and advancement of international education and exchange; Dr. Yongjie Yang, Ph.D. in genetics and neuroscience, whose lab is one of the world's leading labs researching neurodegenerative disease; Federal News Service, Capitol Hill Hearing, June 12, 2008, Subcommittee On Immigration, Citizenship, Refugees, Border Security And International Law Of The House Committee On The Judiciary; Subject: The Need For Green Cards For Highly Skilled Workers)

My remarks today will focus on the broad challenges the United States now faces in attracting and retaining international students. Of specific interest, of course, is the current law capping the number of green cards issued annually, even to those who graduate from U.S. colleges and universities with higher degrees. The United States is in a global competition for international students and scholars. That may seem like an unremarkable statement, but often, U.S. law and policy do not reflect an understanding of this reality. Though the U.S. is still renowned for being home to the majority of the top colleges and universities in the world, the international student market is being transformed in this century. There are many new players in the game acting much more purposefully and strategically than ever before. Competitor countries have implemented strategies for capturing a greater share of this market. Their governments are acting to create more streamlined visa and entry processes and more welcoming environments and are setting goals for international student recruitment. Our neighbor Canada recently changed its employment policy to allow international graduates to work for up to three years after graduation, and in fact Canada does recruit our international students on our own campuses, including my own. They have visited Stanford three times in the last few years to talk to students about opportunities in Canada. At Stanford, we have been recently dealing with the Homeland Security extension on practical training for STEM students. A broader context is that France, Germany, the United Kingdom and Canada have all made similar changes to the possibilities for international students remaining in those countries and working after graduation. New competitors will also enter the market for international students. Primary among them is the European higher education area, which comprises the signatories to the Bologna Declaration.
Its goal is to create a seamless higher education system in Europe by 2010, with credits entirely transferable among their higher education institutions and often instruction in English. The European Union is also considering a blue card, similar to our green card, to be more competitive for non-European talent. Other countries are recognizing the value of educating the next generation of leaders in attracting the world's scientific, technological and intellectual elite. U.S. immigration law and policy has not yet effectively been adapted to this era of globalization. My own institution has been witness to this as we also offer services to higher foreign-born faculty and researchers. But even so, many of the best and the brightest around the world still wish to come here and study. We should welcome them by creating a clearer path to the green card status for them that is not tied to these low caps on the green cards available annually. In a global job market, employers look for the talent they need wherever they can find it, and students and highly talented workers look for the places to study and work that offer them the most opportunity. This means that options for employment after graduation are integral to attracting bright and talented international students. Employment prospects are often a part of that calculus in deciding where to study, work and live. Not all students who arrive in the U.S. wish to remain. Some have commitments to their home country, but others discover their potential in the environment of U.S. higher education and their career and life goals are changed. Google, Hotmail, Yahoo and Sun are examples in Stanford's own backyard of former students who have remained in the United States. I don't think it's a secret the U.S. immigration law often makes it difficult for international students to work after graduating, even from the most prestigious U.S. higher education institutions.
The annual H-1B cap lottery is reported internationally, highlighting that the entire annual allotment is depleted in a day or two. In conclusion, what better way to capture the world's best and brightest who want to become part of our nation and to make it easier for them to remain to contribute to American economic and scientific leadership after they graduate from U.S. universities? It's with these comments that I am delighted to support H.R. bill 6039. Thank you very much. REP. LOFGREN: Thank you very much, Mr. Pearson. Dr. Yang, we'd be delighted to hear from you. MR. YANG: Good afternoon, Madame Chairwoman and Congressman King and members of committee. I want to first thank our Representative Congresswoman Zoe Lofgren for giving this opportunity for me to testify here, and I'd like to share my personal experience on permanent residency application with this panel. And along with other people's testimony, I'd like to draw attention to -- for the Americans' need to change the laws regarding the highly skilled immigrants. My name's Yongjie Yang. I was born in China and came here in 2000, when I was admitted to the neuroscience center genetics program in Iowa State University, and there I basically focused on the mechanisms for environmental toxin-induced nerve cell degeneration, which is highly relevant to the Parkinson's disease research. I was awarded a Ph.D. degree in genetics and neuroscience in 2005. That same year, my wife also was awarded the master degrees from also Iowa State University. Currently I'm now a research scientist in the department of neurology at Johns Hopkins University, and my current work also focuses on the pathogenic mechanisms in neurodegenerative disease, including Alzheimer's disease, Parkinson's disease and amyotrophic lateral sclerosis, also known as Lou Gehrig's disease. Our lab is one of the
best leading labs in the research of this disease in the world. By better understanding the pathogenic mechanisms of the disease, we hope to develop effective neuroprotective strategies to cure or delay the progression of this disease. We hope to find the cure here. On a personal note, I married my wife while we were both at Iowa State University, and my wife also works at Johns Hopkins University as a specialist in Parkinson's disease research. We have a U.S. citizen daughter who is about four years old, and we recently just bought a house in Ellicott City, Maryland. So we do plan to stay here long. I currently have H-1B status which will expire next year. Also, I have filed my immigrant visa petition in May 2006 and got approved last year in February, but I haven't received my green card yet because of the severe backlog of employment-based visa numbers. And I don't know now, because of the situation, how long I have to wait before I can become the permanent resident and also become the U.S. citizen. I'd like to emphasize the three major obstacles that the immigration status poses on my situation as well as other people's. The first one is because of the unavailability of the green card, I'm not eligible to apply for many federal grants from National Institute of Sciences or from National Institute of Health or National Science Foundation and from other federal agencies, although my research is very promising to identify the drug target to cure or delay the ALS. The second obstacle is because of the situation, not me but some other people who share a similar background as me cannot work for the federal agencies such as FDA, NIH or other federal agencies , although they possess specialized skill that is very much needed for these agencies. The third obstacle obviously is the travel inconvenience. 
For example, next -- last year, I had an opportunity to go to London for international conference which is very important in my field, but I couldn't go because if I go, I have to go back to China to reapply for my H-1B stamp and then come back to Baltimore, which could take months. So opportunity like this got wasted, and for scientific research it is vital to have discussions to meet with colleagues to talk about the latest research programs, and also that's also a problem to establish the long-term collaboration with your international colleagues . So as I understand it, the whole point of the employment-based immigration system is to keep the brightest, the best of the foreign minds people in this land, in this land of opportunities. However, we cannot become the U.S. citizen before we got the green card, the permanent residence. Because of all these problems, we cannot travel freely, we cannot apply for some federal grants, we cannot apply jobs for the federal agencies, even though we are doing very cutting-edge researches and the important -- developing important technologies which might bring -- might create new job opportunities for the U.S.

Retaining foreign students is key to science leadership
CFR, 09 (Council on Foreign Relations, Broken Immigration System Risks Serious Damage to U.S. National Interests, Warns CFR Task Force, July 8, 2009, http://www.cfr.org/publication/19743/broken_immigration_system_risks_serious_damage_to_us_national_interests_warns_cfr_task_force.html)
-Attracting skilled immigrants: The United States must tackle head-on the growing competition for skilled immigrants from other countries, and make the goal of attracting such immigrants a central component of its immigration policy. The report urges an end to the hard caps on employment-based immigrant visas and skilled work visas in favor of a more flexible system, the elimination of strict nationality quotas, and new opportunities for foreign students earning advanced degrees to remain in the United States after they graduate.
-National security: The Task Force calls for minimizing visa restrictions that impede scientific collaboration, noting that America's long-term security depends on maintaining its place as a world leader in science and technology. The administration should also permit a broader effort by the U.S. military to recruit recent immigrants who are not yet citizens or green card holders, so as to bolster U.S. military capabilities.

Reforming EB visas is key to our global leadership in science
AILA, 07 (American Immigration Lawyers Association, Position paper, www.aila.org/content/default.aspx?docid=25509)
Eliminating the Employment-Based (EB) Green Card Backlog: Vital to America's Economic Competitiveness
THE ISSUE: Reform of the permanent employment-based green card program is urgently needed in order for U.S. employers to hire the foreign talent necessary for the American economy to remain vibrant and competitive. Over half of all science, technology, engineering, and mathematics graduates of American universities are foreign born. We are also facing a severe shortage of registered nurses as the tidal wave of retiring baby boomers is upon us. At a time when our economy needs high-skilled workers more than ever, our current system forces most of these graduates to leave the U.S. and apply their valuable skills in other countries, a scenario that is beneficial to all but the U.S. Needless to say, foreign countries are not complaining, but are instead poised to take advantage in their increasingly successful attempts to surpass us. Simply put, if the problem isn't solved soon, the U.S. stands to rapidly lose not only the competitive economic edge generations
of Americans have worked so hard to achieve, but also its global preeminence in science and technology—areas vital to our prosperity and national security.

2NC SOLVENCY – SOFT POWER INTERNAL
Foreign scientists are key to soft power through science – solves their internal link
Pickering and Agre, 10 (Thomas R. Pickering and Peter Agre, Science diplomacy aids conflict reduction, February 20, 2010, http://www.signonsandiego.com/news/2010/feb/20/science-diplomacy-aids-conflict-reduction/)
In addition to providing resources, the government should quickly and significantly increase the number of H1-B visas being approved for foreign doctors, scientists and engineers. Foreign scientists working or studying in U.S. universities make critical contributions to human welfare and to our economy, and they often become informal goodwill ambassadors for America overseas. Science is a wide-ranging effort that naturally crosses borders, and so scientist-to-scientist collaboration can promote goodwill at the grass roots. San Diego boasts a remarkable initiative at High Tech High charter school. Twice in recent years, biology teacher Jay Vavra has led student teams to Africa to study the illegal trade in meat from wild and endangered animals. Working with game wardens and tribal leaders, they use sophisticated DNA bar coding techniques to analyze the meat and track down poachers. Such efforts advance science while supporting peace and the health of the planet. In an era of complex global challenges, science diplomacy can be crucial to finding solutions both to global problems and to global conflict.

2NC POLITICS NB
Public and congressional support
Alden, 10 (Edward Alden, A Worthy Effort on Immigration Reform, April 30, http://www.cfr.org/publication/22029/worthy_effort_on_immigration_reform.html)
The Democrats also want passage of the DREAM Act to help children of illegal migrants brought here by their parents, and the AGJobs Act to reform the rules for foreign agricultural workers, two bills that would probably have passed already had they not been linked to comprehensive immigration reform. Attracting high-skilled workers by offering a green card for foreign students who earn graduate degrees in science and engineering from U.S. universities should enjoy broad public and congressional support. With innovation driving the U.S. economy, the importance of maintaining America's attractiveness as a magnet for the world's best and brightest is hard to overstate.

Tech lobbies – they are overwhelmingly key to the agenda and dissipate opposition to the plan
Matloff 2 (Norman Matloff, Professor of Computer Science at UC-Davis, "Debunking the Myth of a Desperate Software Labor Shortage," Testimony to the U.S. House Judiciary Committee Subcommittee on Immigration, responding to the question: How was the industry able to get Congress to pass the H-1B increase in 1998, given that a Harris Poll had shown that 82% of Americans opposed the increase?)
The high-tech industry wields enormous, unstoppable clout on Capitol Hill and in the White House, and even in academia. In Spring 2000, a major supporter of pending legislation which would increase the H-1B quota, Rep. Tom Davis (R-Va.), had the gall to say, "This is not a popular bill with the public. It's popular with the CEOs...This is a very important issue for the high-tech executives who give the money." Rep. Davis is chair of the Republican Congressional Campaign Committee. Then when the Senate passed the H-1B bill on October 3, 2000, even more outrageous talk came from Capitol Hill, as reported by the San Francisco Chronicle: "Once it's clear (the visa bill) is going to get through, everybody signs up so nobody can be in the position of being accused of being against high tech," said Sen. Robert Bennett, R-Utah, after the vote. "There were, in fact, a whole lot of folks against it, but because they are tapping the high-tech community for campaign contributions, they don't want to admit that in public."

That also gets Republicans on board
Sewell, 10 (Abby Sewell, Medill News Service, Tech firms play quiet role in immigration-overhaul push, http://www.mcclatchydc.com/2010/05/06/93699/tech-firms-play-quiet-role-in.html#ixzz0uoyrVX21)
Advocates in the broader immigration-overhaul coalition said support from the technology industry would be key to winning the wide political backing that was necessary to give a comprehensive bill a shot at passing. "I think it is important, and in part that is because tech is one of the key business sectors that will be necessary to bring the Republican votes we will need, in the Senate, especially," said Jeanne Butterfield, a senior adviser for the National Immigration Forum, a group that advocates policies that are more welcoming toward immigrants. Technology companies make up a substantial portion of the voices that are lobbying for federal immigration revisions. Of the 288 federal lobbyist filings that had reported lobbying on immigration issues in the first quarter of the year as of Monday, an analysis shows that about 17 percent came from companies and organizations that represent the technology and engineering sectors. Others represented fields such as medicine and education, which also are interested in skilled immigrants. The people who are lobbying on behalf of the tech sector said that although their issues with the immigration system were specific, they had no plans to peel off from the broader overhaul coalition to pursue a more tailored bill. Muller said the word from Capitol Hill had been that immigration was too contentious an issue to tackle piecemeal. PROVISIONS THAT WOULD AFFECT TECH SECTOR: Green cards (legal permanent resident visas): Foreign students who graduate from U.S. schools with advanced degrees in science, technology, engineering or mathematics automatically would be eligible for green cards if U.S. employers offer them jobs.

Bipartisanship
Business Line, 08 (INFO-TECH US BILL SEEKS TO EASE GREEN CARD RESTRICTIONS, June 12, 2008 Thursday, lexis)
Two US Senators have introduced a legislation that seeks to allow foreign-born students graduating from US universities with advanced degrees in science, technology, engineering and math (STEM) to obtain green cards if they have jobs waiting for them in the US. The legislation was introduced last week by Senators Ms Barbara Boxer and Mr Judd Gregg. In other words, the legislation proposes to exempt students graduating from the US universities from the annual limit on employment-based (EB)

permanent resident visas or 'green cards'. The US Representative, Ms Zoe Lofgren, has already introduced companion legislation in the House with bipartisan support .

***WFIRST ADVANTAGE

1NC EXOPLANETS FRONTLINE
Exoplanet research is an economic inevitability – Kepler telescope solves
Billings, 11 – freelance science writer [February 2, 2011, Lee Billings, Nature 470, 27-29, “Astronomy: Exoplanets on the cheap,” http://www.nature.com/news/2011/110202/full/470027a.html]

Astronomers searching for planets around stars other than the Sun have had much to celebrate over the past decade. The number of confirmed 'exoplanets' has soared from about 50 to more than 500 in that time. And although none of these planets closely resembles Earth, NASA's Kepler space telescope, launched in 2009, is now delivering candidates from distant stars by the hundreds — some of which may prove to be very Earth-like indeed (see page 24). The exoplanet search itself has been wildly successful, but not so the searchers' quest for multibillion-dollar follow-up missions. Hopes for ambitious spacecraft such as a Space Interferometry Mission or Terrestrial Planet Finder have been dashed as missions have been cancelled or postponed owing to a combination of sluggish economic growth, deep cuts to space-science funding and programme difficulties with NASA's James Webb Space Telescope (JWST). In response, the planet-hunting community has got creative, devising ways to maximize the science and minimize the costs. An Exoplanet Task Force jointly commissioned by NASA and the US National Science Foundation accordingly issued a report1 in 2008, supporting a new strategy for exoplanet research. Rather than waiting for the launch of costly, dedicated planet-hunting spacecraft, it calls for astronomers to press ahead with cheaper, ground-based surveys to discover worlds orbiting nearby stars, which appear brighter to us than do those farther away, and so are easier to study. The hope is that such low-cost surveys will yield at least a few worlds that can be studied using space-based resources such as the JWST. Such facilities would allow astronomers to spectroscopically search the exoplanets' atmospheres for ingredients such as carbon dioxide, water vapour and perhaps methane, oxygen and other trace gases, which could indicate that life is present. A 2010 report from the European Space Agency reached nearly identical conclusions2. "The planets are out there, and it's relatively inexpensive to go after them," says Greg Laughlin, an astrophysicist at the University of California, Santa Cruz, who served on the NASA/National Science Foundation task force. "There's an economic inevitability to this." Of the many ideas that astronomers have come up with for conducting exoplanet searches on the cheap, five stand out.

Ground-based telescopes solve exoplanet study

Dacey, 10 – a reporter for physicsworld.com [Feb 5, 2010, James Dacey, “Exoplanet hunting brought down to Earth,” http://physicsworld.com/cws/article/news/41642]

Researchers in the US, UK and Germany have used a ground-based telescope to detect organic compounds in the atmosphere of an exoplanet – that is, a planet orbiting a star other than our Sun. The result, the researchers claim, will open up the hunt for Earth-like planets to anyone with access to a decent telescope. "We expect a massive explosion of exoplanet research because it is not limited anymore to the lucky few who have access to space telescopes," says Pieter Deroo at the California Institute of Technology. Since astronomers made the first discovery of a planet orbiting another star in 1992 they have gone on to catalogue more than 400 of these exoplanets. The favoured hunting technique is known as the transit method whereby astronomers monitor the light from a star and look for dips in its intensity caused by a planet sweeping in front of its parent star cutting across the line of vision from Earth.
Honing in on exoplanets
The next stage in exoplanet research is to start looking a bit more closely at the nature of these planets with the ultimate goal of discovering a planet with habitable conditions like Earth's. The first step is to decipher the chemical composition of exoplanetary atmospheres as this could provide information about a planet's formation and evolution; it might also reveal the signatures of life. To date, the most popular approach has been to adapt the transit detection method to study how starlight, observed from Earth, is affected during eclipses. The idea is a relatively simple one: compare the spectrographic data of a star's light when an exoplanet is first in front then behind its parent star in relation to our line of vision.
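The transit method the card describes can be sketched in a few lines. This is a toy box-shaped light-curve model of my own; the period, duration and depth values are illustrative, not taken from any real detection:

```python
# Toy sketch of the transit method described above: a planet crossing its
# star dims the flux by roughly (Rp/Rs)^2, so the measured dip depth gives
# the planet-to-star radius ratio. All numbers here are illustrative.

def transit_light_curve(times, period, duration, depth):
    """Box-shaped model: flux drops by `depth` while the planet transits."""
    return [1.0 - depth if (t % period) < duration else 1.0 for t in times]

def estimate_radius_ratio(fluxes):
    """Infer Rp/Rs from the deepest dip relative to the baseline flux."""
    baseline = max(fluxes)
    depth = baseline - min(fluxes)
    return (depth / baseline) ** 0.5

# A hot-Jupiter-like 1% dip every 3 days, sampled every ~14 minutes.
times = [0.01 * i for i in range(1000)]  # days
fluxes = transit_light_curve(times, period=3.0, duration=0.1, depth=0.01)
ratio = estimate_radius_ratio(fluxes)
print(round(ratio, 3))  # 0.1, i.e. a planet about a tenth the star's radius
```

Real pipelines fold many noisy transits together and fit limb-darkened models rather than a clean box, but the depth-to-radius logic is the same.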

EXT. FINDING EXOPLANETS NOW
Kepler will inevitably discover exoplanets – we don’t need WFIRST
Madrigal, 10 – a staff writer for Wired Science. Madrigal is hard at work on a book about the history of green technology and is a visiting scholar at UC-Berkeley’s Office for the History of Science and Technology. [January 4, 2010, Alexis Madrigal, Wired Science, “New Exoplanet Hunter Makes First 5 Discoveries,” http://www.wired.com/wiredscience/2010/01/kepler-finds-first-exoplanets/]

The Kepler Space Telescope, a designated planet-hunting satellite, has found its first five planets, among them an odd, massive world only as dense as Styrofoam. The number of planets now known outside the solar system has risen to more than 400, but none is yet Earth-like enough to harbor life. Right now, Kepler can only detect large planets orbiting close to their stars, which means that these first planets are too hot to hold liquid water, a requirement for life as we know it. But over the next year, the mission’s scientists will be homing in on ever more life-friendly places. “We expected Jupiter-size planets in short orbits to be the first planets Kepler could detect,” said Jon Morse, director of the Astrophysics Division at NASA in a release. “It’s only a matter of time before more Kepler observations lead to smaller planets with longer period orbits, coming closer and closer to the discovery of the first Earth analog.” Kepler is pointed at a single field of stars in the constellation Cygnus. By watching the same stars over time, the mission can detect the periodic dimming of those stars, a possible indication that a planet has passed in front of the star. Finding an Earth-like planet will probably take quite awhile, though, because if it has an Earth-like orbit, it will take around a year to cross in front of its star just one time. The current set of Kepler planets are not much like ours at all. The smallest is 0.4 times the size of Jupiter, while the largest is 1.5 times the largest planet in our solar system. They are all very hot, too, running between 2,200 and 2,900 degrees Fahrenheit. They have been given the catchy names Kepler 4b, Kepler 5b, Kepler 6b, and Kepler 7b. (Kepler 1b-3b were assigned to previously known exoplanets in the telescope’s field of view.) Still, the planet detections show that Kepler is in great working order as it monitors its sample of the sky. The precision of the instrument has astounded scientists since its first light. “This exquisite data is just the tip of the iceberg,” MIT astronomer Sara Seager said back then. “We’re going to see a new world of exoplanet exploration where discoveries will come much more rapidly than they’ve come in the last 10 years.” The mission, championed for more than a decade by Bill Borucki, a NASA extrasolar planet specialist, looks like it will be capable of completing all of its scientific goals. That means it’s just a matter of time before we find an Earth twin or two out there in the light-years beyond.

We are approaching a new era of exo-planet research – recent discoveries
AFP, 10 [Aug 24, 2010, Space Daily, “Astronomers bag biggest harvest of exoplanets,” http://www.spacedaily.com/reports/Astronomers_bag_biggest_harvest_of_exoplanets_999.html]
European astronomers on Tuesday said they had found a distant star orbited by at least five planets in the biggest discovery of so-called exoplanets since the first was logged 15 years ago. The star is similar to our Sun and its planetary lineup has an intriguing parallel with our own Solar System, although no clue has so far been found to suggest it could be a home from home, they said. The star they studied, HD 10180, is located 127 light years away in the southern constellation of Hydrus, the male water snake, the European Southern Observatory (ESO) said in a press release. The planets were detected over six years using the world's most powerful spectrograph, an instrument to capture and analyse light signatures, at ESO's telescope at La Silla, Chile. The method consists of observing a star and seeing how the light that reaches Earth "wobbles" as a result of the gravitational pull of a passing planet. The tiny fluctuation in light can then be used as a telltale to calculate the mass of the transiting planet. The five detected planets are big, being the size of Neptune, although they orbit at a far closer range than our own gas giant, with a "year" ranging from between six and 600 days. The astronomers also found tantalising evidence that two other candidate planets are out there. One would be a very large planet, the size of our Saturn, orbiting in 2,200 days. The other would be 1.4 times the mass of Earth, making it the smallest exoplanet yet to be discovered. It orbits HD 10180 at a scorchingly close range, taking a mere 1.18 Earth days to zip around the star. If confirmed, that would bring the distant star system to seven planets, compared with eight in our own Solar System. A total of 402 stars with planets have been logged since the first was detected in 1995, according to NASA's Jet Propulsion Laboratory (JPL). The tally of exoplanets stands at 472. None, though, is even remotely similar to Earth, which is rocky and inhabits the famous "Goldilocks zone" where the temperature is just right to enable water, the stuff of life, to exist in liquid form. ESO astronomer Christophe Lovis said knowledge was progressing fast. "We are now entering a new era in exoplanet research -- the study of complex planetary systems and not just of individual planets. "Studies of planetary motions in the new system reveal complex gravitational interactions between the planets and give us insights into the long-term evolution of the system."
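The "wobble" (radial-velocity) technique the card describes has a standard quantitative form. The formula below is textbook radial-velocity astronomy, not taken from the AFP article itself:

```latex
% Radial-velocity semi-amplitude K of the stellar "wobble" induced by a
% planet of mass m_p (orbital period P, inclination i, eccentricity e)
% around a star of mass M_*:
K = \left(\frac{2\pi G}{P}\right)^{1/3}
    \frac{m_p \sin i}{\left(M_* + m_p\right)^{2/3}}
    \frac{1}{\sqrt{1 - e^2}}
% Heavier planets in shorter orbits give a larger K, which is why the
% Neptune-mass, 6- to 600-day planets of HD 10180 were within reach of
% a spectrograph with roughly 1 m/s velocity precision.
```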

We are finding exoplanets now

Gill, 10 – science reporter [Aug 24, 2010, Victoria Gill, BBC News, “Rich exoplanet system discovered,” http://www.bbc.co.uk/news/science-environment-11070991]

The star is 127 light years away, in the southern constellation of Hydrus. The researchers used the European Southern Observatory (Eso) to monitor light emitted from the system and identify and characterise the planets. They say this is the "richest" system of exoplanets - planets outside our own Solar System - ever found.

Christophe Lovis from Geneva University's observatory in Switzerland was lead researcher on the study. He said that his team had probably found "the system with the most planets yet discovered".

"This also highlights the fact that we are now entering a new era in exoplanet research - the study of complex planetary systems and not just of individual planets," he said.

We have found enough exoplanets
Matson, 10 – a news reporter for Scientific American magazine, specializing in stories on space and physics [Aug 20, 2010, John Matson, “Slow and Steady: Astronomy Advisory Report Charts a Long Road for Exoplanet Science,” http://www.scientificamerican.com/article.cfm?id=slow-and-steady-astronomy]

The study of planets outside the solar system was one of the hottest corners of the science world in the 2000s, a decade that saw the known tally of exoplanets increase by more than a factor of 10. By the end of 2009, more than 400 worlds had been discovered in the young field of study, and the scientists working on NASA's Kepler planet-hunting mission were preparing to announce the first discoveries of their recently launched spacecraft. Just how the next decade will shake out for exoplanet scientists became a little clearer August 13 with the release of a report designed to guide the fields of astronomy and astrophysics through the 2010s. The so-called decadal survey, produced every 10 years by an expert committee convened by the National Research Council, ranks funding priorities that NASA, the National Science Foundation and the Department of Energy should follow.

Recent discoveries provide a Rosetta Stone for exoplanet research
AFP, 10 [March 18, 2010, “‘Rosetta Stone’ of exoplanet research exciting scientists,” http://www.taipeitimes.com/News/world/archives/2010/03/18/2003468293]
Marking a further step in the hunt for worlds orbiting other stars, astronomers yesterday said they had found a cool planet the size of Jupiter that encircles a sun at searing proximity. The work is a technical exploit in the field of extrasolar planets — or exoplanets — as planets outside our solar system are called, they said. “This is the first [exoplanet] whose properties we can study in depth,” said Claire Moutou, one of 60 astronomers who took part in the discovery. “It is bound to become a Rosetta Stone in exoplanet research,” she said.

1NC DARK MATTER FRONTLINE
A boom in dark matter research is coming now
Ferguson, 11 – the Founder, Chair, and Managing Partner of Next Street, a merchant bank for the urban enterprise. Formerly the Managing Director of Putman Investments, and CEO of HSBC Asset Management, Mr. Ferguson is on the board for the Initiative for a Competitive Inner City, a trustee of the Institute of Contemporary Art, a director of the Boston Center for Community and Justice [Feb 7, 2011, Tim Ferguson, “UK supercomputer probes dark matter and dark energy,” http://www.silicon.com/management/public-sector/2011/02/07/uk-supercomputer-probes-dark-matter-and-dark-energy-39746923/]

What happened just after the Big Bang? How do stars evolve? What's powering the expansion of the universe? These are some of the biggest questions posed by the universe, and scientists are being helped in their quest to answer them by the University of Portsmouth's Sciama supercomputer. The University's Institute of Cosmology and Gravitation (ICG) recently went live with its first supercomputer, which has a 1,008-core Intel cluster capable of one billion calculations per second. It uses 2.66GHz Intel Xeon processors and has 85 terabytes of fast parallel storage and 10 terabytes of NFS storage. The facility is named after Dennis Sciama, a leading figure in the international development of astrophysics and cosmology, and is also an acronym for the South East Physics Network Computing Infrastructure for Astrophysical Modelling and Analysis. The University of Portsmouth's ICG was founded in 2002 and now has about 50 members, including academic staff, post-doctoral researchers, PhD students and international visitors. About half of the ICG's work concerns the theoretical physics of the universe, while the other half is focused on observational work, although some research projects combine the two.

One of the projects that will use Sciama is the Dark Energy Survey - beginning in late 2011 - which will use optical data from the Cerro Tololo Inter-American Observatory in Chile to create an image of the night sky in the southern hemisphere. As the name suggests, the research will study dark energy, which describes the force driving the accelerating expansion of the universe - as shown by the movement of galaxies away from each other. "All the galaxies in the universe seem to be going faster and faster away from each other, which is a very strange thing indeed. We don't understand why they're going away faster and faster. Whatever's causing that we call dark energy, and from those distortions you see, you can learn about dark energy," senior research fellow at the University of Portsmouth Dr David Bacon explained. The project will also look at dark matter, which is the theoretical matter whose gravitational pull is attributed to distorting light as it travels through the universe. The raw images taken by the Chilean telescope will be processed and analysed to profile the shape of galaxies. When the shape of these galaxies is similar, or aligned, it means the light that has travelled from them has been distorted in the same way, indicating the presence and location of dark matter. "If that's true, you're learning directly about gravity and the universe - because it's gravity that would be distorting those objects," Bacon said. He added that if a human spent one second looking at each galaxy in this image it would take five years to look at them all. On the other hand, the Sciama supercomputer can process this data rapidly through the use of parallel computing. "We realised there's an awful lot of our calculations that can be done embarrassingly easily in a parallel way, and that's what our machine is good at," Bacon told silicon.com. Parallel computing means each processor core can process small parcels of data at the same time, rather than having to store a huge image in its memory. "You can literally chop up this vast image of the night sky into little postage stamps of the galaxies you care about. And for each of those galaxies you just want to do some fairly simple tasks - you want to fit a profile to it, you want to measure its width, you want to count up all the pixel values. "You want to do all those little things and you can do that independently of each little galaxy in your image - you don't need to have that vast image all in memory at one time. You can just send a little postage stamp to each core independently," Bacon said. "By examining these observations in great detail, we can measure all the properties of galaxies - the facts and the data we need - and on the other hand, we can use the supercomputer to predict what different theories of dark energy would predict for those things we're seeing. So you can compare the predictions with the observations," he added. Other supercomputers used for cosmology and astronomy by UK researchers include the Cosmos facility in Cambridge, which the University of Portsmouth already uses extensively. Unlike Sciama, Cosmos has a shared-memory architecture that makes it suitable for different kinds of computational calculations.
Another example is the Blue Gene supercomputer in the Dutch city of Groningen, which processes data from the Low Frequency Array, or Lofar, project using images from a series of radio telescopes across Europe, including one in Chilbolton near Winchester. Bacon said this computer creates an image of the night sky across Europe "as it glows with radio waves". Each radio telescope can create up to 24 terabytes of data per day, so the computer requires a huge amount of memory as well as fast processing. Sciama will be used to analyse the images created by the Groningen supercomputer to further understanding of dark energy as part of the Lofar project. The initial plan is for the University of Portsmouth to use Sciama 70 per cent of the time, with other universities in the South East Physics Network using the remainder. The South East Physics Network is a consortium comprising the universities of Kent, Oxford, Queen Mary, Southampton, Surrey and Sussex. There are 28 registered users of the supercomputer. They will act as testers, identifying tweaks or the need for additional technology before heavy use of the facility begins. Bacon’s PhD student is using Sciama in a project investigating unified dark matter theory in which material has properties of both dark matter and dark energy. The research is about predicting how light will be distorted if unified dark matter theory is applied to objects in the universe. Another researcher at Portsmouth is using the supercomputer to show how galaxies are spread out and to predict which types of galaxies occur in which parts of the universe. International projects will use Sciama in the future, including the Sloan Digital Sky Survey III, which has been gathering data from the Sloan radio telescope in New Mexico for the past year. The five-year project aims to create a 3D image of the universe showing the distribution of galaxies in the largest volume of data from the universe ever surveyed.
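Bacon's "postage stamp" strategy is a textbook case of embarrassing parallelism: each cutout can be measured with no reference to the rest of the image. A minimal sketch, where the function names and the per-stamp measurements are hypothetical toys, not the Dark Energy Survey pipeline:

```python
# Sketch of the "postage stamp" strategy described above: cut the survey
# image into small per-galaxy cutouts and measure each one independently,
# so the work parallelizes across cores without holding the full image in
# memory. The per-stamp measurement here is an illustrative toy.
from multiprocessing import Pool

def measure_galaxy(stamp):
    """Toy per-stamp measurement: total flux and a crude size proxy."""
    flux = sum(sum(row) for row in stamp)
    bright = sum(1 for row in stamp for px in row if px > 0.5)
    return {"flux": flux, "n_bright_pixels": bright}

def analyze_survey(stamps, workers=2):
    """Farm independent postage stamps out to a pool of worker processes."""
    with Pool(workers) as pool:
        return pool.map(measure_galaxy, stamps)

if __name__ == "__main__":
    # Two fake 3x3 cutouts standing in for galaxy images.
    stamps = [[[0.0, 0.9, 0.0], [0.9, 1.0, 0.9], [0.0, 0.9, 0.0]],
              [[0.0, 0.0, 0.0], [0.0, 0.6, 0.0], [0.0, 0.0, 0.0]]]
    for result in analyze_survey(stamps):
        print(result)
```

Because no stamp depends on any other, throughput scales with core count, which is exactly the property Bacon says the Sciama cluster exploits.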

Dark matter isn’t even a conclusive theory
D’Amico, 11 – writer for universetoday.com [April 18, 2011, Vanessa D'Amico, “Antigravity Could Replace Dark Energy as Cause of Universe’s Expansion,” http://www.universetoday.com/84934/antigravity-could-replace-dark-energy-as-cause-of-universes-expansion/]

Since the late 20th century, astronomers have been aware of data that suggest the universe is not only expanding, but expanding at an accelerating rate. According to the currently accepted model, this accelerated expansion is due to dark energy, a mysterious repulsive force that makes up about 73% of the energy density of the universe. Now, a new study reveals an alternative theory: that the expansion of the universe is actually due to the relationship between matter and antimatter. According to this study, matter and antimatter gravitationally repel each other and create a kind of “antigravity” that could do away with the need for dark energy in the universe. Massimo Villata, a scientist from the Observatory of Turin in Italy, began the study with two major assumptions. First, he posited that both matter and antimatter have positive mass and energy density. Traditionally, the gravitational influence of a particle is determined solely by its mass. A positive mass value indicates that the particle will attract other particles gravitationally. Under Villata’s assumption, this applies to antiparticles as well. So under the influence of gravity, particles attract other particles and antiparticles attract other antiparticles. But what kind of force occurs between particles and antiparticles? To resolve this question, Villata needed to institute the second assumption – that general relativity is CPT invariant. This means that the laws governing an ordinary matter particle in an ordinary field in spacetime can be applied equally well to scenarios in which charge (electric charge and internal quantum numbers), parity (spatial coordinates) and time are reversed, as they are for antimatter. When you reverse the equations of general relativity in charge, parity and time for either the particle or the field the particle is traveling in, the result is a change of sign in the gravity term, making it negative instead of positive and implying so-called antigravity between the two. Villata cited the quaint example of an apple falling on Isaac Newton’s head. If an anti-apple falls on an anti-Earth, the two will attract and the anti-apple will hit anti-Newton on the head; however, an anti-apple cannot “fall” on regular old Earth, which is made of regular old matter. Instead, the anti-apple will fly away from Earth because of gravity’s change in sign.

In other words, if general relativity is, in fact, CPT invariant, antigravity would cause particles and antiparticles to mutually repel. On a much larger scale, Villata claims that the universe is expanding because of this powerful repulsion between matter and antimatter. What about the fact that matter and antimatter are known to annihilate each other? Villata resolved this paradox by placing antimatter far away from matter, in the enormous voids between galaxy clusters. These voids are believed to have stemmed from tiny negative fluctuations in the primordial density field and do seem to possess a kind of antigravity, repelling all matter away from them. Of course, the reason astronomers don’t actually observe any antimatter in the voids is still up in the air. In Villata’s words, “There is more than one possible answer, which will be investigated elsewhere.” The research appears in this month’s edition of Europhysics Letters.
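The sign flip the card summarizes can be caricatured in the Newtonian limit. This is an illustration of the idea only, not Villata's general-relativistic derivation:

```latex
% Like pairs (matter-matter or antimatter-antimatter) attract as usual:
\vec F = -\,\frac{G m_1 m_2}{r^2}\,\hat r \quad \text{(attractive)}
% Applying a CPT transformation to one of the two bodies flips the sign
% of the gravitational coupling, so mixed pairs repel:
\vec F = +\,\frac{G m_1 \bar m_2}{r^2}\,\hat r \quad \text{(repulsive)}
% Hence the anti-apple in the article's example accelerates away from
% the ordinary-matter Earth rather than falling toward it.
```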

There is neutrino detection technology now
Courtland, 11 – writer for the New Scientist [April 28, 2011, Rachel Courtland, “"Ghost Particle" Detectors Closer to Preventing Nuclear Proliferation,” http://spectrum.ieee.org/energy/nuclear/ghost-particle-detectors-closer-to-preventing-nuclear-proliferation/2]

The Point Lepreau detector, which will measure about 3 meters on each side, is the fourth in a series developed by the Livermore-Sandia team. The previous three detectors were installed over the course of the past decade at a 1.1-GW pressurized water reactor at California’s San Onofre Nuclear Generating Station. The team’s first demonstration detector at San Onofre picked up about 400 antineutrinos per day during a 600-day test period. That was good enough to provide near real-time data on the state of the reactor. The detector could "tell when the reactor has been turned off within a few hours, which is important, because if you wanted to remove any fissile material, you would have to turn the reactor off," says team member Timothy Classen, of Lawrence Livermore. But the team hopes it can mine antineutrino data to give even more information about what’s going on inside the reactor. An operator that runs a reactor at a higher power or uses fuel with more uranium-238 can boost the production rate of plutonium that could be used for nuclear weapons. Uranium releases more detectable antineutrinos than plutonium does, so monitoring the rate of antineutrino emission could potentially indicate whether a reactor is being run as intended or if material for weapons has been removed. At the moment, the sensitivity of the detectors to changes in plutonium content is fairly low. But in a paper that will appear in the Journal of Applied Physics, Adam Bernstein, who leads the Lawrence Livermore group, and colleague Vera Bulaevskaya determined that, using 90 days of data, a detector that can register 2000 antineutrinos per day from a pressurized water reactor could be used to detect whether 73 kilograms of plutonium had been removed. The International Atomic Energy Agency (IAEA) deems just 8 kg of plutonium a "significant quantity," because it is enough to make a nuclear explosive device. On their own, antineutrino detectors may never be able to sense whether such a small amount goes missing from the entire reactor. But the data they collect, combined with reactor simulations, might be able to show whether each grouping of fuel rods contains as much plutonium as expected to within a "significant quantity" at the end of a fuel cycle, Bernstein says.
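The counting rates quoted in the card lend themselves to a quick Poisson estimate. The 2000-per-day and 90-day figures come from the card; the significance arithmetic below is my own illustration, not the Bernstein-Bulaevskaya analysis itself:

```python
# Back-of-envelope Poisson statistics for the reactor monitoring described
# above: with roughly 2000 detected antineutrinos per day over 90 days,
# the relative statistical uncertainty on the total count is small enough
# to notice percent-level changes in the emission rate.
import math

def relative_precision(rate_per_day, days):
    """Fractional 1-sigma uncertainty of a Poisson count N: sqrt(N)/N."""
    n = rate_per_day * days
    return 1.0 / math.sqrt(n)

def sigma_of_rate_change(rate_per_day, days, fractional_change):
    """How many standard deviations a given fractional rate shift amounts to."""
    n = rate_per_day * days
    return fractional_change * math.sqrt(n)

print(relative_precision(2000, 90))          # ~0.0024: a 0.24% statistical floor
print(sigma_of_rate_change(2000, 90, 0.01))  # a 1% rate shift is ~4.2 sigma
```

This is why a higher count rate matters: the detectable fractional change shrinks as one over the square root of the total number of events.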

EXT. DARK MATTER RESEARCH NOW
There is revolutionary research in dark matter now
McLean, 10 – undergrad in Jonathan Edwards college majoring in Molecular, Cellular and Developmental Biology. She is a writer for the Yale Scientific Magazine and works in Professor Crews’ lab studying the molecular mechanisms of limb regeneration in axolotls. [Dec 1, 2010, Kaitlin McLean, “Dark Energy: Studying the Expansion and Fate of the Universe,” http://www.yalescientific.org/2010/12/dark-energy-studying-the-expansion-and-fate-of-the-universe/]

Since ancient times, people have stared into the heavens and pondered what was out there—what the fate of our universe might be. Research into dark energy by Dr. Priyamvada Natarajan, Yale Professor of Astronomy and Physics, has brought us closer than ever to answering these questions. What is Dark Energy? As Professor Natarajan admits, dark energy is currently one of the great embarrassments of astrophysics: it makes up roughly 72% of the universe, yet there is no consensus on what exactly dark energy is. As she described in a recent publication in Science, dark energy is an unknown material with negative pressure originally hypothesized to explain one of the most troubling conundrums of astrophysics. Gravity, one of the four fundamental forces that act in our environment, causes objects with mass to attract each other. Beyond keeping the planets and their moons in orbit, gravity is also what makes people fall when they trip. Gravity should be pulling all the bodies in the universe towards each other and ultimately cause our universe to shrink and collapse inward upon itself. Yet despite this, astrophysicists have determined that the universe is expanding at an accelerating rate. For this to be possible, there must be a greater force that opposes gravity on the scale of the larger universe. Dark energy was postulated to be a solution to this puzzle since dark energy, having negative pressure, is an expanding repulsive force that opposes the coalescing force of gravity. Though scientists aren’t precisely sure what dark energy is, they have been able to model its properties and behavior. To provide a more descriptive image of dark energy, it can be imagined as a fluid modeled by the equation P = wρ, where P is pressure and ρ is energy density. This is the hypothesized equation of state for dark energy, yet definitively determining this equation of state has remained one of the main goals for cosmologists today. Dark Energy vs. Dark Matter
Dark energy and dark matter are unrelated quantities and should not be confused, but both are not well understood by scientists.
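The dark-energy equation of state that the card alludes to, written out in standard notation (the numerical thresholds below are standard cosmology, not from the article itself):

```latex
% Equation of state relating dark-energy pressure P to energy density
% \rho (units with c = 1); w is the parameter cosmologists try to pin down:
P = w\,\rho
% Accelerated expansion requires w < -1/3; a cosmological constant
% (constant vacuum energy) corresponds to exactly w = -1.
```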
Dark matter is composed of exotic particles produced immediately following the Big Bang. The composition of these particles is unknown. Approximately 90% of the universe is composed of dark matter while the remainder consists of baryons—matter such as that from which objects on earth are composed. Dark matter aggregates in the universe due to gravity, providing a scaffolding for galaxies. Our own Milky Way galaxy is constructed upon a base of dark matter, so from outer space, it appears as though

our universe has a halo of dark matter surrounding it. To detect dark matter, scientists measure the speed of stars.

In 1933, the Swiss astrophysicist Fritz Zwicky of CalTech estimated the total mass of a galaxy cluster. He recorded the acceleration of the cluster’s orbits and found the force of gravity for the estimated mass to be much too small to account for such fast orbits. He determined that there must be about 400 times more estimated mass than was visibly observable. This invisible mass was deemed “dark matter.” Professor Natarajan’s Groundbreaking Research Priyamvada Natarajan was a member of an international team of cosmologists, astronomers, and physicists who used gravitational lensing to study dark energy. One of the current goals in observational cosmology is to characterize the mass-energy content of the universe. Natarajan’s team recently published their results from a geometric test based on strong lensing in galaxy clusters. They used data from images taken by the Hubble telescope and spectroscopic

analysis of the galaxy cluster Abell 1689, a massive galaxy cluster 2.2 billion light years from Earth, to perform their geometric test. These results, combined with results from other experimental involving supernovae, x-ray galaxy clusters, and the Wilkinson Microwave Anisotropy Probe (WMAP) spacecraft, have allowed the team to lower the uncertainty of their findings by about 30%. What is Gravitational Lensing and How Does It Work? Originally predicted by Einstein’s general theory of relativity, gravitational lensing occurs when light from a bright and distant source is bent around a massive object—say, a cluster of galaxies—between the source of light and the viewer. The gravity from an object as massive as a cluster of galaxies actually warps space-time, causing everything around it to bend. This has the effect of bending the path that light follows just as an optical lens refracts light. The light gets focused when passing massive objects and defocused when traveling through empty space. Gravitational lensing functions similar to optical lenses in which a lens transmits and refracts light, converging or diverging the beam. Professor Natarajan’s research uses a special type of gravitational lensing called strong lensing. Strong lensing is so strong that multiple images of the same galaxy are produced. These copies provide many different light paths, which can then be reconstructed to show the geometry of the galaxy cluster of interest. To make use of strong gravitational lensing, Professor Natarajan’s team took images of the galaxy cluster Abell 1689 with the Hubble Space Telescope, a telescope that produces the best

resolution of any existing telescope. From these images, the team had to determine which galaxy images were in fact repeats of the same galaxy of interest. To do this, the scientists analyzed the redshift of each galaxy. The ones with identical spectra were determined to be part of the same cluster.

Why This Research is Groundbreaking

Sometimes in science, the most exciting discoveries involve the discovery of a new application for

an existing technique. Such is the case with Professor Natarajan’s recent research. The method of strong lensing has been used in several other cosmological applications, but never before has it been used to measure dark energy. Professor Natarajan’s development of this new application of strong lensing was the accumulated work of years of effort. The method had to be pieced together until all of its components were synthesized into a reliable and accurate technique. This new method offers several strengths when compared to other techniques used to study and measure dark energy—most remarkably its accuracy and wide applicability. Having been used since 1979, gravitational lensing is well understood. Additionally, there are numerous massive galaxy clusters that can produce the effect of strong lensing. Before, scientists would study dark energy through supernovae, a technique limited by the fact that cosmologists have to search for supernovae at increasingly greater distances; the farther cosmologists have to look, the fainter the supernovae appear. Another limitation of this approach is the assumption that all supernovae are a “standard candle”—that is, that all supernovae are of the same magnitude. However, this assumption may not be accurate in all cases. As Professor Natarajan states, “The content, geometry and fate of the universe are all linked, so by constraining two of those things, you can learn something about the third.” Like the dark energy that is its subject, Professor Natarajan believes this work will continue expanding—not collapsing.
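The light-bending the card describes follows Einstein's point-mass deflection formula, theta = 4GM / (c^2 b). The sketch below is illustrative only; the cluster mass and impact parameter are assumed round numbers for a cluster like Abell 1689, not values drawn from Natarajan's analysis.

```python
# Illustrative only: point-mass deflection angle, theta = 4*G*M / (c^2 * b).
# The mass and impact parameter are assumed round numbers, not values from
# Natarajan's paper.
G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8            # speed of light, m/s
M_SUN = 1.989e30       # one solar mass, kg
MPC = 3.086e22         # one megaparsec, m

def deflection_angle_rad(mass_kg, impact_param_m):
    """Einstein's deflection angle for light passing a point mass."""
    return 4 * G * mass_kg / (c**2 * impact_param_m)

# A ~10^15 solar-mass cluster, with light passing ~0.5 Mpc from its center
theta = deflection_angle_rad(1e15 * M_SUN, 0.5 * MPC)
arcsec = theta * (180 / 3.141592653589793) * 3600
print(f"deflection ~ {arcsec:.0f} arcseconds")
```

With these round numbers the deflection comes out to tens of arcseconds, the same order as the image separations strong-lensing clusters actually produce.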

We already know everything we need to know about dark energy –Cowen, 11 – the astronomy and physics writer at Science News [April 9, 2011, Ron Cowen, “New study gives dark energy a boost,” http://www.sciencenews.org/view/generic/id/71286/title/New_study_gives_dark_energy_a_boost]

New evidence bolsters the case that a bizarre form of energy is uniformly accelerating the expansion of the universe, refuting one of the alternative models developed by researchers who refuse to accept the idea. The new study, which measures the present-day expansion of the universe to unprecedented accuracy, also suggests that the cosmos may be slightly older than previously calculated. A team of astronomers used the Hubble Space Telescope’s new infrared camera to refine the Hubble constant — a

number that indicates the current rate at which galaxies are receding from one another due to cosmic expansion. By precisely measuring the distance to various celestial objects and then gauging the speed at which they are receding from each

other, the team measured a Hubble constant of 73.8 kilometers per second per megaparsec. That means that for every million parsecs (3.26 million light-years) separating two distant galaxies, they move apart 73.8 kilometers per second faster. The value has an uncertainty of only 3.3 percent, Adam Riess of Johns Hopkins University and the Space Telescope Science Institute in Baltimore, Md., and his colleagues report in the April 1 Astrophysical Journal. That margin of error is about 30 percent better than the previous value, reported in 2008 (SN Online:

5/5/08). “The Hubble constant is the first and most important number in cosmology,” says Michael

Turner of the University of Chicago. “It is hard to measure accurately. Any improvement — and this is a real improvement — has broad implications.” Although the Hubble constant measures only the current rate of cosmic expansion, given certain assumptions the new value implies that the universe is about 75 million years older than the previous estimate of 13.75 billion years, Riess says. Obtaining a precise value of the Hubble constant also places new restrictions on one alternative to “dark energy” as the driving force behind accelerated cosmic expansion, says Riess. In the alternative scenario, Earth and its environs would sit at the center of a vast void a few billion light-years across (SN: 6/7/08, p. 12). That configuration would produce an optical illusion making it appear as if the universe’s expansion is accelerating. But such a setup would require a significantly lower value of the Hubble constant than the one Riess and his colleagues have now measured to high precision. Previous measurements of the constant were already at odds with the void model, but the added precision of the new study refutes the model conclusively, Riess says. Having Earth live at the center of a void “is a really weird model, but so is dark energy,” he notes. “It might be a matter of taste which model someone thinks is weirder, but now we have data to show that dark energy is favored.” The new measurement “makes life very difficult” for the void scenario, acknowledges one of its developers, Robert Caldwell of Dartmouth College.

Turner notes that “while this does not completely rule out the ‘we are at the center of the universe’ alternative to dark energy, it does put a stake in its heart.” In combination with data from NASA’s Wilkinson Microwave Anisotropy Probe, which surveys radiation left over from the Big Bang, the new results also suggest that dark energy is what physicists call the “cosmological constant.” It refers to a constant density of energy residing in the vacuum of space, suggested by Einstein as an addition to his equations of general relativity but later abandoned.
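The arithmetic behind the card's age claim can be sanity-checked: inverting a Hubble constant of 73.8 km/s/Mpc gives a characteristic expansion timescale of roughly 13 billion years. This naive 1/H0 conversion is a sketch only; the quoted 13.75-billion-year age additionally folds in WMAP data and the universe's matter/dark-energy mix, which this ignores.

```python
# Naive sanity check: convert H0 = 73.8 km/s/Mpc into a "Hubble time" 1/H0.
# This deliberately ignores the matter/dark-energy mix that real age
# estimates use, so it only approximates the quoted 13.75-billion-year age.
KM_PER_MPC = 3.086e19           # kilometers in one megaparsec
SECONDS_PER_YEAR = 3.156e7

H0 = 73.8                       # km/s/Mpc, from the Riess et al. result
H0_per_sec = H0 / KM_PER_MPC    # expansion rate in 1/s
hubble_time_yr = 1 / H0_per_sec / SECONDS_PER_YEAR

print(f"1/H0 ~ {hubble_time_yr / 1e9:.1f} billion years")
```

A larger measured H0 shrinks 1/H0, which is why pinning the constant down so tightly also constrains the universe's age.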

EXT. DARK MATTER =/= REAL

Dark energy skeptics still exist
Greenhalgh, 11 [Laura Greenhalgh, “Debate over dark energy reality reignited,” May 24, 2011, Cosmos Online, http://www.cosmosmagazine.com/news/4341/debate-over-dark-energy-reality-reignited?page=0%2C0]

"This is a remarkable survey, and a convincing fit to the standard hypothesis of a homogeneous cosmology that includes Einstein's cosmological constant," said Thomas Buchert, a professor of cosmology at Claude

Bernard University, France. "However, it does not rule out alternatives to the standard model, and it does not prove that dark energy 'exists'." This view is also taken by David Wiltshire, a theoretical cosmologist from the University of Canterbury in New Zealand. "If one makes the assumption that the universe expands as a uniform fluid, then these measurements are independent evidence for dark energy," he said.

EXT. NEUTRINO DETECTORS NOW

AND --- even if we have them, they are decades away from being used for non-proliferation efforts
Cartwright, 10 – a freelance journalist based in the UK [Nov 24, 2010, Jon Cartwright, physicsworld.com, “Neutrinos could detect secret fission reactors,” http://physicsworld.com/cws/article/news/44411]

Others are not so sure. Andrew Monteith of the IAEA's Novel Technologies Unit says that the IAEA is at present only interested in neutrino detectors for near-field detection, because only that is within its current remit. "The far-field approach that's discussed in the paper has never really been an official part of our thinking," Monteith explains. "We're taking it on a stage-by-stage basis, and the near-field one is certainly more realistic for us, in terms of cost and deployment." Expensive solution? Julian Whichello, head of the Novel Technologies Unit, believes Lasserre's SNIF detector could cost in the region of $100 million – almost the same as the IAEA's entire budget for global verification of fission reactors. "This is something that's well and truly outside of the current budget of the agency," he says. Still, Lasserre explains that his group's goal was to explore the scientific possibilities rather than have political influence. "This is very futuristic," he says. "It's huge, it will cost a lot of money and it's a difficult effort. Technically it would be possible in the next 30 years, but I'm not aware of any programme in the world to build such devices."

1NC DARK MATTER BAD

Discovery of dark energy sets the false vacuum’s decay clock to zero—makes extinction inevitable
Ford, 07 [Matt Ford, bachelor’s degree in chemical engineering from the University of Delaware and doctorate in chemical engineering from the University of Massachusetts Amherst, “Human observation of dark energy may shorten the life span of the universe,” http://arstechnica.com/old/content/2007/11/human-observation-of-dark-energy-may-shorten-the-life-span-of-the-universe.ars]

Their official paper, titled "The Late Time Behavior of False Vacuum Decay: Possible Implications for Cosmology and Metastable Inflating States," is far from grandiose. It extends a body of work initiated in the 1950s by Soviet physicist L. A. Khalfin. Khalfin determined that the long time behavior of a metastable quantum state is not described by the traditional exponential decay, but rather by a power-law type decay. That determination is accurate, but has remained obscure because nearly all experimental systems will have decayed long before this transition becomes significant. Krauss and Dent attempt to extend this idea from traditional quantum mechanics to quantum field theory, and look at its implications for cosmology. To understand the potential implications of the calculations in the paper, one must start at the beginning—the Big Bang, and even before. It is currently believed that the universe blinked into existence somewhere around 13.7 billion years ago. This event—the Big Bang—is theorized to have been precipitated when a "bubble of weird high-energy 'false vacuum' with repulsive gravity decayed into a zero-energy 'ordinary' vacuum." The energy released during this transition would have created intense heat, and all the matter we see in the universe. This idea was challenged in the late 1990s by the discovery of dark energy.
Dark energy, coupled with the fact that the expansion of the universe is accelerating, suggests that the Big Bang did not produce a zero-energy vacuum, but another metastable false vacuum. Using the analogy of a decaying radioactive atom where shifts in energy states occur at random, Professor Krauss says that "it is entirely possible [the energy state of the universe] could decay again, wiping the slate of our universe clean." In a nutshell, this would mean that we, and everything we know, would cease to exist. How does this relate to the work in the research article? If the current false vacuum state that our universe exists in survives past a certain point—the point where the decay switches from exponential to power-law—then it should become eternal. This is explained by the assumption that the false vacuum state will grow at an exponential rate for all time. If its decay suddenly becomes slower—as in the power law decay regime—then the false vacuum will grow faster than it could possibly decay and would never be destroyed. According to calculations contained in the paper, the closer the false vacuum energy is to zero, the less time that will need to elapse before the decay rate switches from fast (exponential) to slow (power law). Given the fact that, in our universe, the vacuum energy is just above zero (0.01 eV), we could be well past the point where the universe has switched from the fast decay to slow decay—although without an estimate of the current decay rate, this cannot be known for sure. However, since this is a quantum issue at its core, Krauss points out that measurements can affect the outcome of the system. He suggests that our measurements of supernovae in 1998, which detected the existence of dark energy, may have reset the false vacuum's decay clock to zero, switching it back to the fast decay regime, and greatly decreasing the universe's chance of surviving.
"In short, we may have snatched away the possibility of long-term survival for our universe and made it more likely it will decay," says Krauss. How could something like this possibly happen? In quantum mechanics, there is an effect known as the quantum Zeno effect—an oddity of the quantum world that suggests a system can be kept in an excited state simply by repeated measurements. This can be described using a quantum system initially in state 'A'. After time begins, the system wants to decay to state 'B' but, before it reaches state 'B', it will exist as a superposition of states 'A' and 'B'. If one measures the system shortly after it begins, it would have a high probability of collapsing entirely to state 'A' again, essentially resetting the system's internal quantum clock. Krauss is suggesting that, by observing the dark energy, we reset the internal quantum clock of the false vacuum universe, and that may have caused it to return to a point before it has switched from the fast decay to the slow decay—in the process greatly reducing the universe's ultimate chance of survival.
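The exponential-to-power-law switch that drives Krauss and Dent's argument can be illustrated numerically: a power-law survival probability eventually decays far more slowly than an exponential one, which is why a false vacuum that reaches the power-law regime would never fully decay away. The rate constant and exponent below are arbitrary assumptions chosen purely for illustration.

```python
import math

# Illustrative only: compare exponential decay exp(-k*t) with a power-law
# tail t**(-p). The constants k and p are arbitrary; the point is just that
# for large t the power law decays far more slowly and so dominates.
k, p = 1.0, 2.0

def exp_survival(t):
    return math.exp(-k * t)

def power_survival(t):
    return t ** (-p)

# By t = 10 the power-law survival probability already exceeds the
# exponential one by orders of magnitude, and the gap keeps widening.
for t in (1.0, 10.0, 50.0):
    print(t, exp_survival(t), power_survival(t))
```

In the card's terms, "resetting the clock" means pushing the system back into the fast exponential regime before the slowly-decaying power-law tail has taken over.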

EXT. DARK MATTER --> COLLAPSE OF UNIVERSE

There is a linear risk of collapse of the universe from finding dark energy
Davis, 07 [quoting Professor Lawrence Krauss, “The five scientific experiments most likely to end the world”, http://www.cracked.com/article_16583_the-5-scientific-experiments-most-likely-to-end-world.html]

No doubt the strangest part is the Quantum Zeno effect, which points out that simply observing and measuring particles changes them (specifically, changing the rate at which they decay). How? No one knows. It appears to be the closest science has ever come to proving black magic exists. One prominent scientist theorized that the changes caused by simply observing dark energy could cause it to collapse, taking the universe with it. Scientists, eager to see if this is true, are furiously observing dark energy whenever they get the chance. So, Basically It's Like... It's like crossing the streams in

Ghostbusters, apparently. How Long Have We Got? That scientist, Professor Lawrence Krauss, thinks it may already be underway

1NC EUCLID CP – ESA MECHANISM

The European Space Agency should fully fund the construction and development of the Euclid Telescope.

Euclid solves both of their scenarios – microlensing discovers dark energy and exoplanets
Hsu, ’09 [Jeremy Hsu, 11/20/09, “New Space Telescope Could Search for Both Exoplanets and Dark Energy”, http://www.popsci.com/technology/article/2009-11/new-space-telescope-could-search-both-exoplanets-and-dark-energy, SM]

Dark energy may not have much in common with aliens, unless there's a flotilla of freaky monoliths out there with really weird physical properties. But astrophysicists hope to build a two-in-one space telescope that can search for signs of dark energy along with exoplanets. Europe's proposed Euclid space telescope would use a microlensing technique to detect the bending of light as it travels through clumps of dense matter. That would allow scientists to map out the locations of galaxies scattered across the universe, which in turn may reveal more about the role of dark energy as a theorized unseen force behind the universal expansion. Some cosmologists consider dark energy such a bugaboo that they have tried coming up with alternative explanations. The very same technique could also pinpoint the light-bending effect of distant stars and planets, and help astronomers find new exoplanets that could potentially harbor extraterrestrial life. Folding in the search for ET may also prove more compelling for getting the space telescope funded, given that even the Vatican has recently begun pondering the cosmic complications of aliens. Both scientific investigations require a space telescope that can stare patiently at a large swath of sky, and so Euclid fits the bill almost perfectly. But Astrobiology Magazine notes that both planet-hunters and cosmologists would have to share their viewing time with the telescope, and probably also financial support. Still, even shared viewing time beats no viewing time. Astrophysicists estimate that 3 months of observations with Euclid could survey 200 million stars and perhaps find 10 planets resembling our own planet Earth. That hinges upon the European Space Agency's review of Euclid and five other similar missions, with only two slated to make the cut for launch.

1NC EUCLID CP – US MECHANISM

The United States National Aeronautics and Space Administration should provide a twenty percent investment in the Euclid telescope.

US refused to place a 20% investment in Euclid – that could have solved the reasons WFIRST is key, but failure to invest is putting the US behind in dark energy and exoplanet research
Bhattacharjee, ’11 [3/4/11, Yudhijit Bhattacharjee, “Scientists Fear WFIRST Will Be Trailing the Pack”, ScienceMag, Vol. 331, SM]

The 1990s were a decade of glory for U.S. astronomers, including the discovery of dark energy and planets outside the solar system. But the instrument that they say would keep them at the forefront (even its name reflects that aspiration) may be too costly for the U.S. government to build. Last August, a National Academies panel charged with ranking the priorities of U.S. astronomers for the next decade identified the Wide-Field Infrared Survey Telescope (WFIRST) as its top choice in the category of space observatories. The panel recommended that NASA launch the $1.6 billion mission by 2020 as the best way to advance the study of dark energy and the search for new extrasolar planets. But hopes of realizing that goal appear to be fading. The ballooning cost of the James Webb Space Telescope (JWST), and the resulting delays in its launch, have all but guaranteed that the space agency will not be able to deliver WFIRST by 2025, much less the end of the decade. And NASA has rejected a backup plan that would have given it a 20% stake in a similar European dark energy mission. That decision, announced last week, leaves U.S. astronomers with the prospect of being marginalized in the next decade.

CP is budget neutral – avoids our spending DAs NRC, ’10 [The National Research Council was organized by the National Academy of Sciences in 1916 to associate the broad community of science and technology with the Academy’s purposes of furthering knowledge and advising the federal government. “Report of the Panel on Implementing Recommendations from the New Worlds, New Horizons Decadal Survey”, http://www.nap.edu/catalog.php?record_id=13045, SM]

NASA has stated that a 20 percent investment in Euclid as described would be cost-neutral over the decade—owing to a complementary ESA contribution in WFIRST. However, the panel concludes that the 20 percent plan would deplete resources for the timely execution of the broader range of NWNH space-based recommendations and would significantly delay implementing the Explorer augmentation ($463 million), as well as augmentations to the core activities ($110 million) that were elements in the survey’s first tier of activities for a less optimistic budget scenario. Moreover, the present panel emphasizes that a 20 percent share dedicated to Euclid would be a non-negligible fraction of the resources needed for these and other NWNH priorities (such as New Worlds [$100-200 million] and Cosmic Microwave Background/inflation [$60-200 million] technology development and theory and computation networks [$50 million from NASA]), and would be spent in part during the period of greatest stress on the NASA budget due to JWST cost growth and delay.

2NC EUCLID SOLVENCY

US investment in Euclid solves all the reasons WFIRST is key – our evidence assumes their tradeoff internal link
Klamper, ’10 [Amy Klamper, 17 September, 2010, “NASA Sees Expanded Role on Euclid as Down Payment on Dark Energy Flagship”, http://www.spacenews.com/civil/nasa-dark-energy.html, SM]

WASHINGTON — NASA plans to begin preliminary work this fall on a dark energy-mapping observatory recommended in the National Research Council’s latest 10-year plan for space- and ground-based astronomy, though full-scale development of the new flagship-class mission will have to wait until the agency launches its $5 billion James Webb Space Telescope (JWST), according to officials. Jon Morse, astrophysics division director at NASA headquarters here, said the Wide-Field Infrared Survey Telescope (WFIRST) proposed in the decadal survey is not likely to launch before 2022, some seven years after NASA hopes to loft JWST to the second Lagrange point — a gravitationally stable spot 1.5 million kilometers from Earth — in 2014. “The pacing of WFIRST’s development time is JWST, plain and simple. We can’t afford to do them simultaneously,” Morse said during a two-day public meeting of the NASA Advisory Council’s astrophysics subcommittee here Sept. 16-17, adding that the division has seen reduced budgets in recent years and expects only flat funding through 2015. “When you’re trying to execute a new flagship, you are in line behind the previous flagship, that’s the key message here.” Morse said the Astro2010 decadal survey envisions a WFIRST launch as early as 2020 under the panel’s most optimistic funding scenario. But a launch in 2022 or beyond is more likely, he said, given that JWST is undergoing several programmatic reviews that could reveal additional development costs and potential schedule delays. Ed Weiler, NASA associate administrator for science, said the JWST reviews are expected to wrap up next month and that a final decision on the program’s future should come by November’s end. Weiler declined to speculate on cost-growth specifics, but said, “It’s going to cost more. That’s about all I can say.” In the meantime, Morse said NASA is proposing to increase a planned investment in Euclid, a dark energy mapper designed by the European Space Agency (ESA). 
Morse said Euclid is “similar to WFIRST and could potentially be well aligned” with the highest science priorities laid out in the Astro2010 decadal survey. “We’re trying to prudently plan for the future such that the U.S. would have a leading role in a dark energy program that would get us to data this decade, ” Morse said, adding that Euclid’s projected launch near the end of this decade could provide the U.S. astronomy community with access to dark energy data sooner than WFIRST. Euclid is one of three so-called M-class missions vying for funding under the ESA’s Cosmic Vision program for separate launch opportunities in 2017 and 2018. A decision is expected in mid-2011 on which two of three finalists proceed toward launch. Morse said Euclid — which is expected to cost roughly 500 million euros ($650 million) — shares a number of scientific goals with WFIRST. He views a proposed 33 percent investment in Euclid as a near-term down payment on the decadal survey’s top science objective.

1NC PROLIF GOOD – STABILITY

Nuclear prolif creates incentives for de-escalation and guarantees international stability – plus, new proliferators won’t be destabilizing.
Asal & Beardsley, 07 [Victor Asal, Assistant Prof. Pol. Sci. – SUNY Albany, and Kyle Beardsley, Assistant Prof. Pol. Sci. – Emory U., Journal of Peace Research, “Proliferation and International Crisis Behavior”, 44:2, Sage]

Other, more optimistic, scholars see benefits to nuclear proliferation or, perhaps not actively advocating the development of more nuclear weapons and nuclear-weapon states, see that the presence of nuclear weapons has at least been stabilizing in the past. For example, some scholars are confident of the promise of the ‘nuclear peace’.4 While those who oppose proliferation present a number of arguments, those who contend that

nuclear weapons would reduce interstate wars are fairly consistent in focusing on one key argument: nuclear weapons make the risk of war unacceptable for states. As Waltz argues, the higher the stakes and the closer a country moves toward winning them, the more surely that country invites retaliation and risks its own destruction. States are not likely to run major risks for minor gains. War between nuclear states may escalate as the loser uses larger and larger warheads. Fearing that, states will want to draw back. Not escalation but de-escalation becomes likely. War remains possible, but victory in war is too dangerous to fight for. (Sagan & Waltz, 2003: 6–7) ‘Nuclear war simply makes the risks of war much higher and shrinks the chance that a country will go to war’ (Snyder & Diesing, 1977: 450). Using similar logic,

Bueno de Mesquita & Riker (1982) demonstrate formally that a world with almost universal membership in the nuclear club will be much less likely to experience nuclear war than a world with only a few members. Supporters of proliferation do not see leaders of new nuclear states as being fundamentally different from those of the old nuclear states in terms of their levels of responsibility (Arquilla, 1997), nor do they see them facing unique challenges in managing and securing these weapons (Feaver, 1992/93: 162–163). The response to the argument that small powers, non-Western powers, and military powers will behave less responsibly than the USA and other ‘responsible’ powers is that the evidence does not support the view that new nuclear powers are ‘different’ in the worst sense of the word (Lavoy, 1995; Hagerty, 1998; Arquilla, 1997; Feldman, 1995; Karl, 1996/ 97).

Van Creveld (1993: 124) sums up this viewpoint when he points out that ‘where these weapons have been introduced, large-scale interstate warfare has disappeared’. Dismissing the fear that deterrence will not work if the arsenal is not big

enough or under enough control, Chellaney (1995) contends that the Cold War is evidence that even minimum deterrence is sufficient. In support, Feaver (1992/93: 186) argues that ‘even a modest nuclear arsenal should have some existential deterrent effect on regional enemies, precisely because decapitation is so difficult’. There are those who argue that security is increased at a systemic level when the number of nuclear states increases because of the

level of uncertainty created when more than one or two players are playing with a nuclear deck. When this happens, ‘the probability of deliberate nuclear attack falls to near zero with three, four, or more nuclear nations’ (Brito & Intriligator, 1983: 137). Cimbala (1993: 194) agrees, arguing that ‘it is only necessary to threaten the plausible loss of social value commensurate with the potential gains of an attacker’. 

EXT. STABILITYProlif makes conflicts less probable and reduces their magnitudeSechser, 2009 [Todd, Assistant Professor of Politics at the University of Virginia, “The Stabilizing Effects of Nuclear Proliferation”, http://faculty.virginia.edu/ tsechser/Sechser-Haas-2009.pdf, SM] 

Will additional nuclear proliferation stabilize world politics, or will it worsen the problem of interstate conflict? We cannot answer this question with certainty, of course, since we cannot collect data about the future. We can, however, learn from events that have already happened. Imagine that, at the advent of the nuclear age in 1945, today’s proliferation optimists and pessimists had put forth their competing predictions about the likely consequences of the spread of

nuclear weapons. Whose predictions would be borne out? In this section I argue that historical data confirm the predictions of proliferation optimism, while offering little corroboration for rival perspectives. Scholars who take the view

that proliferation bolsters global stability argue that the spread of nuclear weapons produces three observable effects.2 First, by deterring aggression, nuclear weapons reduce the frequency with which wars occur. Second,

nuclear weapons induce caution among leaders in crises and during wartime, thereby mitigating the intensity of wars. Third, nuclear weapons defuse arms races and obviate the need for high levels of conventional arms spending. Let us now consider each claim with respect to five proliferators: China, Israel, India, South Africa, and Pakistan. These five states provide a useful laboratory for examining the behavior of proliferators because they more closely resemble the types of states most likely to proliferate today. The United States, the Soviet Union, Great Britain, and France were all major industrialized powers when they acquired nuclear weapons, but these five proliferators were weaker, poorer, and less internally stable—much as today’s proliferators are likely to be.

Nuclear proliferation increases stability.
Roth, 2007 [Ariel Ilan Roth, Associate Dir. National Security Studies – Johns Hopkins U. and Visiting Assistant Prof. IR – Goucher College, International Studies Review, “REFLECTION, EVALUATION, INTEGRATION Nuclear Weapons in Neo-Realist Theory”, 9, p. 369-384]

The current author disagrees with Snyder’s conclusions regarding the core difference between Waltz and Mearsheimer, contending that the core difference lies not in states’ tolerance for risk, but rather in the different estimations each holds regarding the amount of security that results from the possession of nuclear weapons. Waltz, it will be shown, is a disciple of the movement that sees nuclear weapons as having effected a ‘‘revolution,’’ in Robert Jervis’

(1989) phrase.2 According to this approach, once states attain a secure second-strike capability, that is, an ability to retaliate massively no matter how devastating the first blow, they are secure because the outcomes of a potential war are clear and absolutely devastating. As such, war becomes an irrational act under all circumstances. Indeed, so pacifying and sobering is the impact of these super-deadly weapons that Waltz (1981) argues that their proliferation to areas currently suffering from high political tension could be beneficial in stabilizing those zones of conflict. In short, a state with a secure second-strike capability is secure and, therefore, concerned only with maintaining its relative position, not with an expansion that would add costs with no concurrent increase in security.

Empirical proof – countries always choose restraint over nuclear annihilation.
Tepperman, 2009 [Jonathan Tepperman, Newsweek International’s first Assistant Managing Editor (now Deputy Editor), “Why Obama Should Learn to Love the Bomb”, 8-29, http://www.newsweek.com/2009/08/28/why-obama-should-learn-to-love-the-bomb.html, SM]

Why indeed? The iron logic of deterrence and mutually assured destruction is so compelling, it’s led to what’s known as the nuclear peace: the virtually unprecedented stretch since the end of World War II in which all the world’s major powers have avoided coming to blows. They did fight proxy wars, ranging from

Korea to Vietnam to Angola to Latin America. But these never matched the furious destruction of full-on, great-power war (World War II alone was responsible for some 50 million to 70 million deaths). And since the end of the Cold War, such bloodshed has declined precipitously. Meanwhile, the nuclear powers have scrupulously avoided direct combat, and there’s very good reason to think they always will. There have been some near misses, but a close look at these cases is fundamentally reassuring—because in each instance, very different leaders all came to the same safe conclusion. Take the mother of all nuclear standoffs: the Cuban missile crisis. For

13 days in October 1962, the United States and the Soviet Union each threatened the other with destruction. But both countries soon stepped back from the brink when they recognized that a war would have meant curtains for everyone. As

important as the fact that they did is the reason why: Soviet leader Nikita Khrushchev’s aide Fyodor Burlatsky said later on, “It is

impossible to win a nuclear war, and both sides realized that, maybe for the first time.” The record since then shows the same pattern repeating: nuclear-armed enemies slide toward war, then pull back, always for the same reasons. The best recent example is India and Pakistan, which fought three bloody wars after independence before acquiring their own nukes in 1998. Getting their hands on weapons of mass destruction didn’t do anything to lessen their animosity. But it did dramatically mellow their behavior. Since acquiring atomic weapons, the two sides have never fought another war, despite severe provocations (like Pakistani-based terrorist attacks on India in 2001 and 2008). They have

skirmished once. But during that flare-up, in Kashmir in 1999, both countries were careful to keep the fighting limited and to avoid threatening the other's vital interests . Sumit Ganguly, an Indiana University professor and

coauthor of the forthcoming India, Pakistan, and the Bomb, has found that on both sides, officials' thinking was strikingly similar to that of the Russians and Americans in 1962. The prospect of war brought Delhi and Islamabad face to face with a nuclear holocaust, and leaders in each country did what they had to do to avoid it .

Proliferation deters interstate conflict – four reasons.
Feldman, 1997 [Shai, Senior Research Fellow – CSIS, "Nuclear Weapons and Arms Control in the Middle East," p. 16-17]
THE STABILIZING EFFECTS OF NUCLEAR WEAPONS In an earlier work, I portrayed the logic of nuclear deterrence from the perspective of a single small state seeking security and survival.39 Effective deterrence requires robust capabilities and considerable determination. The outcome of deterrent confrontations is determined by the relative capacity to inflict punishment and the relative willingness to absorb it. Generally, and despite the clear constraints addressed below, nuclear weapons provide states with far more robust deterrence than conventional arms.40 There are several reasons for this. First, a state equipped with nuclear weapons enjoys a nearly unlimited capability to inflict punishment, since the amount of damage that can be caused by a small number of nuclear weapons is virtually unlimited. Nuclear weapons can cause significantly greater damage than conventional explosives because the blast-to-weight ratio of nuclear devices is about a million times greater. Second, nuclear weapons leave far less room for misperceptions about the damage that can be caused. A simple extrapolation from the widely recognized horrors of Hiroshima and Nagasaki to today's far larger-yield nuclear weapons leaves little doubt that the use of any nuclear arms would be catastrophic. The 1986 nuclear accident at Chernobyl, whose disastrous effects were very modest compared with the expected outcomes of a possible nuclear weapons exchange, reinforced perceptions of the power of nuclear weapons. Third, nuclear deterrence, unlike conventional deterrence, is not vulnerable to variations in states' sensitivity to costs. Conventional deterrence may fail because countries and regimes may be insensitive to the costs involved in the execution of a threat to use conventional forces. In contrast, it is difficult to envisage a regime so insensitive to costs that it might ignore credible threats to inflict nuclear punishment. Fourth, nuclear weapons provide stable deterrence, because states possessing these weapons can feel confident that their nuclear arsenals will not be preempted. It is suicidal to make a preemptive nuclear attack unless such an attack can entirely destroy an adversary's nuclear arsenal; if even a few weapons survive the preemptive attack, the likely nuclear retaliation would be devastating. Unless a state stores its entire nuclear arsenal in a single exposed location, and unless its adversaries become convinced that it does not hide additional weapons, they cannot be confident that a preemptive strike would destroy all of the state's nuclear weapons. The knowledge that their nuclear arsenals are largely invulnerable to a first strike should make states possessing such weapons self-confident.41 In turn, this confidence should diminish their own propensity to act preemptively. Thus, overall regional stability should increase in areas where nuclear weapons have proliferated. In contrast, in the conventional realm, states are sometimes tempted to preempt, fearing that an adversary's first strike would decide the outcome of the confrontation, and assessing that a preemptive attack could avert these dire consequences. In addition, a failed preemptive attack with conventional weapons would not necessarily result in catastrophe. Indeed, due to the more limited risks involved in the conventional battlefield, states are far more willing to exercise violence, for offensive as well as defensive purposes.

Nukes deter conflict – risk of total destruction.
Waltz, 95 [Kenneth, member of the faculty at Columbia University, one of the most prominent scholars of IR alive today, one of the founders of neorealism, or structural realism, in international relations theory, "The Spread of Nuclear Weapons: A Debate," 1995, p. 29-30]
Uncertainty about outcomes does not work decisively against the fighting of wars in conventional worlds. Countries armed with conventional weapons go to war knowing that even in defeat their suffering will be limited. Calculations about nuclear war are differently made. Nuclear worlds call for and encourage a different kind of reasoning. If countries armed with nuclear weapons go to war, they do so knowing that their suffering may be unlimited. Of course, it also may not be. But that is not the kind of uncertainty that encourages anyone to use force. In a conventional world, one is uncertain about winning or losing. In a nuclear world, one is uncertain about surviving or being annihilated. If force is used and not kept within limits, catastrophe will result. That prediction is easy to make because it does not require close estimates of opposing forces. The number of one's cities that can be severely damaged is at least equal to the number of strategic warheads an adversary can deliver. Variations of number mean little within wide ranges. The expected effect of the deterrent achieves an easy clarity because wide margins of error in estimates of probable damage do not matter. Do we expect to lose one city or two, two cities or ten? When these are the pertinent questions, we stop thinking about running risks and start worrying about how to avoid them. In a conventional world, deterrent threats are ineffective because the damage threatened is distant, limited, and problematic. Nuclear weapons make military miscalculations difficult and politically pertinent prediction easy.

Nuclear weapons cause international stability – expert consensus.
Roberts, 96 [Brad, Ed. Washington Quarterly and Research Fellow – CSIS, "Weapons Proliferation and World Order: After the Cold War," p. 191-192, SM]
The belief that weapons and weapon development programs bring security is deeply ingrained in the strategic communities of all countries, whether military officers working within the traditions of their institution or diplomats and statesmen who have inherited a nineteenth-century-based understanding of an anarchic international system in which self-defense is the primary right and task of states. Such advocates also tend to equate national security with international stability and thus peace, out of a belief that any individual nation's efforts to make itself more secure will also translate into a broader systemic effect that reduces the likelihood of war. Anecdotal evidence suggests that strategists in the developing world rigidly equate security and stability and believe that stable deterrence is achieved through military strength. Indian strategists cite the absence of major war with China as evidence of the success of such policies. Both Indians and Pakistanis cite the fact that the Kashmir dispute of summer 1990 did not erupt into war as proof that deterrence under the nuclear shadow works. Israelis cite the benefits of the undeclared "bomb in the basement" for securing Arab recognition of Israel's right to exist (or at least the futility of trying to drive the Israelis back into the sea). Iranians praise unconventional weapons as necessary to prevent the United States from capriciously using its power to again foment revolution in Iran. These strategists from the developing world tend to dismiss doubts in the developed world about the stability of deterrence in their regions as born of hubris among the technologically advanced or the racism of white societies.

Nukes prevent conflict escalation – four reasons.
Waltz, 81 [Kenneth, a member of the faculty at Columbia University, one of the most prominent scholars of international relations (IR) alive today, co-founder of neorealism, or structural realism, in international relations theory, "The Spread of Nuclear Weapons: More May Be Better," Adelphi Papers, Number 171 (London: International Institute for Strategic Studies)]
Do nuclear weapons increase or decrease the chances of war? The answer depends on whether nuclear weapons permit and encourage states to deploy forces in ways that make the active use of force more or less likely and in ways that promise to be more or less destructive. If nuclear weapons make the offence more effective and the blackmailer's threat more compelling, then nuclear weapons increase the chances of war—the more so the more widely they spread. If defence and deterrence are made easier and more reliable by the spread of nuclear weapons, we may expect the opposite result. To maintain their security, states must rely on the means they can generate and the arrangements they can make for themselves. The quality of international life therefore varies with the ease or the difficulty states experience in making themselves secure. Weapons and strategies change the situation of states in ways that make them more or less secure, as Robert Jervis has brilliantly shown. If weapons are not well suited for conquest, neighbours have more peace of mind. According to the defensive-deterrent ideal, we should expect war to become less likely when weaponry is such as to make conquest more difficult, to discourage pre-emptive and preventive war, and to make coercive threats less credible. Do nuclear weapons have those effects? Some answers can be found by considering how nuclear deterrence and how nuclear defence may improve the prospects for peace. First, wars can be fought in the face of deterrent threats, but the higher the stakes and the closer a country moves toward winning them, the more surely that country invites retaliation and risks its own destruction. States are not likely to run major risks for minor gains. Wars between nuclear states may escalate as the loser uses larger and larger warheads. Fearing that, states will want to draw back. Not escalation but de-escalation becomes likely. War remains possible, but victory in war is too dangerous to fight for. If states can score only small gains because large ones risk retaliation, they have little incentive to fight. Second, states act with less care if the expected costs of war are low and with more care if they are high. In 1853 and 1854, Britain and France expected to win an easy victory if they went to war against Russia. Prestige abroad and political popularity at home would be gained, if not much else. The vagueness of their plans was matched by the carelessness of their acts. In blundering into the Crimean War they acted hastily on scant information, pandered to their people's frenzy for war, showed more concern for an ally's whim than for the adversary's situation, failed to specify the changes in behaviour that threats were supposed to bring, and inclined towards testing strength first and bargaining second. In sharp contrast, the presence of nuclear weapons makes states exceedingly cautious. Think of Kennedy and Khrushchev in the Cuban missile crisis. Why fight if you can't win much and might lose everything? Third, the question demands a negative answer all the more insistently when the deterrent deployment of nuclear weapons contributes more to a country's security than does conquest of territory. A country with a deterrent strategy does not need the extent of territory required by a country relying on a conventional defence in depth. A deterrent strategy makes it unnecessary for a country to fight for the sake of increasing its security, and this removes a major cause of war. Fourth, deterrent effect depends both on one's capabilities and on the will one has to use them. The will of the attacked, striving to preserve its own territory, can ordinarily be presumed stronger than the will of the attacker, striving to annex someone else's territory. Knowing this, the would-be attacker is further inhibited. Certainty about the relative strength of adversaries also improves the prospects for peace. From the late nineteenth century onwards the speed of technological innovation increased the difficulty of estimating relative strengths and predicting the course of campaigns. Since World War II, technology has advanced even faster, but short of an anti-ballistic missile (ABM) breakthrough, this does not matter very much. It does not disturb the American-Russian equilibrium because one side's missiles are not made obsolete by improvements in the other side's missiles. In 1906 the British Dreadnought, with the greater range and fire power of its guns, made older battleships obsolete. This does not happen to missiles. As Bernard Brodie put it: 'Weapons that do not have to fight their like do not become useless because of the advent of newer and superior types'. They do have to survive their like, but that is a much simpler problem to solve (see discussion below).

More ev – they decrease the likelihood of wars and make de-escalation possible.
Waltz, 81 [Kenneth, a member of the faculty at Columbia University, one of the most prominent scholars of international relations (IR) alive today, co-founder of neorealism, or structural realism, in international relations theory, "The Spread of Nuclear Weapons: More May Be Better," Adelphi Papers, Number 171 (London: International Institute for Strategic Studies)]
Countries more readily run the risks of war when defeat, if it comes, is distant and is expected to bring only limited damage. Given such expectations, leaders do not have to be insane to sound the trumpet and urge their people to be bold and courageous in the pursuit of victory. The outcome of battles and the course of campaigns are hard to foresee because so many things affect them, including the shifting allegiance and determination of alliance members. Predicting the result of conventional wars has proved difficult. Uncertainty about outcomes does not work decisively against the fighting of wars in conventional worlds. Countries armed with conventional weapons go to war knowing that even in defeat their suffering will be limited. Calculations about nuclear war are differently made. Nuclear worlds call for and encourage a different kind of reasoning. If countries armed with nuclear weapons go to war, they do so knowing that their suffering may be unlimited. Of course, it also may not be. But that is not the kind of uncertainty that encourages anyone to use force. In a conventional world, one is uncertain about winning or losing. In a nuclear world, one is uncertain about surviving or being annihilated. If force is used and not kept within limits, catastrophe will result. That prediction is easy to make because it does not require close estimates of opposing forces. The number of one's cities that can be severely damaged is at least equal to the number of strategic warheads an adversary can deliver. Variations of number mean little within wide ranges. The expected effect of the deterrent achieves an easy clarity because wide margins of error in estimates of probable damage do not matter. Do we expect to lose one city or two, two cities or ten? When these are the pertinent questions, we stop thinking about running risks and start worrying about how to avoid them. In a conventional world, deterrent threats are ineffective because the damage threatened is distant, limited, and problematic. Nuclear weapons make military miscalculations difficult and politically pertinent prediction easy. Dissuading a would-be attacker by throwing up a good-looking defence may be as effective as dissuading him through deterrence. Beginning with President Kennedy and Secretary of Defense McNamara in the early 1960s, we have asked how we can avoid, or at least postpone, using nuclear weapons rather than how we can mount the most effective defence. NATO's attempt to keep a defensive war conventional in its initial stage may guarantee that nuclear weapons, if used, will be used in a losing cause and in ways that multiply destruction without promising victory. Early use of very small warheads may stop escalation. Defensive deployment, if it should fail to dissuade, would bring small nuclear weapons into use before the physical, political and psychological environment had deteriorated. The chances of de-escalation are high if the use of nuclear weapons is carefully planned and their use is limited to the battlefield. We have rightly put strong emphasis on strategic deterrence, which makes large wars less likely, and wrongly slighted the question of whether nuclear weapons of low yield can effectively be used for defence, which would make any war at all less likely still. Lesser nuclear states, with choices tightly constrained by scarcity of resources, may be forced to make choices that NATO has avoided, to choose nuclear defence or nuclear deterrence rather than planning to fight a conventional war on a large scale and to use nuclear weapons only when conventional defences are breaking. Increased reliance on nuclear defence would decrease the credibility of nuclear deterrence. That would be acceptable if a nuclear defence were seen to be unassailable. An unassailable defence is fully dissuasive. Dissuasion is what is wanted whether by defence or by deterrence. The likelihood of war decreases as deterrent and defensive capabilities increase. Whatever the number of nuclear states, a nuclear world is tolerable if those states are able to send convincing deterrent messages: It is useless to attempt to conquer because you will be severely punished. A nuclear world becomes even more tolerable if states are able to send convincing defensive messages: It is useless to attempt to conquer because you cannot. Nuclear weapons and an appropriate doctrine for their use may make it possible to approach the defensive-deterrent ideal, a condition that would cause the chances of war to dwindle. Concentrating attention on the destructive power of nuclear weapons has obscured the important benefits they promise to states trying to coexist in a self-help world.

Nonprolif guarantees World War 3 – nuclear deterrence key.
Barnett, 09 [5/14/09, Thomas Barnett, B.A. in International Relations and Russian Literature, University of Wisconsin, 1984, A.M. in Soviet Union Program, Harvard University, 1986, Ph.D. in Political Science, Harvard University, Esquire Magazine, "Seven Reasons Why Obama's Nuke-Free Utopia Won't Work," http://www.esquire.com/the-side/war-room/obama-nuclear-proliferation-051409]
The president wants to rid the world of nuclear weapons. Sounds like he's fighting the good fight, but Esquire.com's global-strategy expert argues that it's absolutely the wrong one — a fight that might open globalization's door to World War III. Last month in Prague, President Obama declared his country's "moral responsibility to act" in transforming our planet into one free of nuclear weapons. He called for a global summit and a treaty to end nuke development, then signaled his seriousness back home by axing the Pentagon's much-needed Reliable Replacement Warhead program. Speaking before tens of thousands of Czechs on the day North Korea tested a long-range missile, Obama may have sounded like Martin Luther King ("This goal will not be reached quickly — perhaps not in my lifetime"), but his concept of a nuclear-proof world is patently unattainable, potentially dangerous, and inherently wrong. "I'm not naïve," the president said. "But we go forward with no illusions." But he is, and he has. George W. Bush had his "axis of evil," while Obama seems to find nuclear weapons to represent a kind of natural evil unto themselves — no matter who possesses them. Now the twentysomethings in Prague may have cheered his invocations of "hope" and "change," and others may be jumping on board, but I've discovered something in my years of global-strategy analysis, and it's not the deadly fatalism Obama describes — it's the modern realism he ignores: Nuclear weapons are the single best thing that has ever happened in mankind's long history of war. Globalization existed prior to World War One, but then nukes arrived with their own "crystal-ball effect," previewing the suicidal destruction of modern war between great powers. And if globalization's economic interdependence was a "great illusion" back then, it's become a rock-solid strategic reality in recent decades — and our recent global financial contagion has only made that more indisputably clear. Meanwhile, the world's great powers have come to understand that nuclear weapons are for having, not using. And that is why no nuclear power has ever directly gone to war against another. If Obama simply wants to reengage Russia on further warhead reductions, fine. But it seems to me that his nuclear utopianism is not so much an extension of his youthful optimism as a vestige of the generational guilt promoted by Cold Warriors like Henry Kissinger — "wise men" who seek to end America's hypocrisy in preaching non-proliferation while relying on nuclear weapons as strategic back-stop. This vision isn't just a backwards one; it's a dangerously destabilizing policy agenda that makes conventional great-power war conceivable once again. Here's why Obama's nuclear ideals put World War III back on the table:

Prolif decreases likelihood of conflicts.
Waltz, 81 [Kenneth, a member of the faculty at Columbia University, one of the most prominent scholars of international relations (IR) alive today, co-founder of neorealism, or structural realism, in international relations theory, "The Spread of Nuclear Weapons: More May Be Better," Adelphi Papers, Number 171 (London: International Institute for Strategic Studies)]
What will a world populated by a larger number of nuclear states look like? I have drawn a picture of such a world that accords with experience throughout the nuclear age. Those who dread a world with more nuclear states do little more than assert that more is worse and claim without substantiation that new nuclear states will be less responsible and less capable of self-control than the old ones have been. They express fears that many felt when they imagined how a nuclear China would behave. Such fears have proved unfounded as nuclear weapons have slowly spread. I have found many reasons for believing that with more nuclear states the world will have a promising future. I have reached this unusual conclusion for six main reasons. First, international politics is a self-help system, and in such systems the principal parties do most to determine their own fate, the fate of other parties, and the fate of the system. This will continue to be so, with the United States and the Soviet Union filling their customary roles. For the United States and the Soviet Union to achieve nuclear maturity and to show this by behaving sensibly is more important than preventing the spread of nuclear weapons. Second, given the massive numbers of American and Russian warheads, and given the impossibility of one side destroying enough of the other side's missiles to make a retaliatory strike bearable, the balance of terror is indestructible. What can lesser states do to disrupt the nuclear equilibrium if even the mighty efforts of the United States and the Soviet Union cannot shake it? The international equilibrium will endure. Third, at the strategic level each of the great powers has to gauge the strength only of itself in relation to the other. They do not have to make guesses about the strengths of opposing coalitions, guesses that involve such imponderables as the coherence of diverse parties and their ability to concert their efforts. Estimating effective forces is thus made easier. Wars come most often by miscalculation. Miscalculation will not come from carelessness and inattention in a bipolar world as it may in a multipolar one. Fourth, nuclear weaponry makes miscalculation difficult because it is hard not to be aware of how much damage a small number of warheads can do. Early in this century Norman Angell argued that wars could not occur because they would not pay. But conventional wars have brought political gains to some countries at the expense of others. Germans founded a state by fighting three short wars, in the last of which France lost Alsace-Lorraine. Among nuclear countries, possible losses in war overwhelm possible gains. In the nuclear age Angell's dictum, broadly interpreted, becomes persuasive. When the active use of force threatens to bring great losses, war becomes less likely. This proposition is widely accepted but insufficiently emphasized. Nuclear weapons have reduced the chances of war between the United States and the Soviet Union and between the Soviet Union and China. One may expect them to have similar effects elsewhere. Where nuclear weapons threaten to make the cost of wars immense, who will dare to start them? Nuclear weapons make it possible to approach the deterrent ideal. Fifth, nuclear weapons can be used for defence as well as for deterrence. Some have argued that an apparently impregnable nuclear defence can be mounted. The Maginot Line has given defence a bad name. It nevertheless remains true that the incidence of wars decreases as the perceived difficulty of winning them increases. No one attacks a defence believed to be impregnable. Nuclear weapons may make it possible to approach the defensive ideal. If so, the spread of nuclear weapons will further help to maintain peace. Sixth, new nuclear states will confront the possibilities and feel the constraints that present nuclear states have experienced. New nuclear states will be more concerned for their safety and more mindful of dangers than some of the old ones have been. Until recently, only the great and some of the major powers have had nuclear weapons. While nuclear weapons have spread, conventional weapons have proliferated. Under these circumstances, wars have been fought not at the centre but at the periphery of international politics. The likelihood of war decreases as deterrent and defensive capabilities increase. Nuclear weapons, responsibly used, make wars hard to start. Nations that have nuclear weapons have strong incentives to use them responsibly. These statements hold for small as for big nuclear powers. Because they do, the measured spread of nuclear weapons is more to be welcomed than feared.

Nuclear weapons solve insecurity – all nations large and small become more cautious and less antagonistic of the world order when they possess nuclear weapons.
Madson, 06 [Peter N. Madson, Lieutenant, United States Navy, Master's degree in National Security Affairs, 3-6, "The Sky is Not Falling: Regional Reaction to a Nuclear Armed Iran," http://oai.dtic.mil/oai/oai?verb=getRecord&metadataPrefix=html&identifier=ADA445779]
Nuclear Optimists advocate a gradual increase in the number of nuclear states. They argue that a cautious increase does not correspond to an increased likelihood that nuclear weapons will be used. They further contend that this gradual spread is far better than if it were rapid or nonexistent.12 Supporters point to over sixty years in which deterrence helped prevent nuclear conflict. According to Professor Kenneth Waltz of the University of California at Berkeley, "The world has enjoyed more years of peace since 1945 than had been known in modern history."13 Indeed, there has been no general war in this period, in spite of a variety of confrontations that could lead to rapid escalation and conflict.14 Instead nuclear weapons made nuclear war an unlikely possibility.15 Professor Waltz argues that if deterrence produces the ideal, then the opposite must be correct: not having a clear balance of terror preventing a misstep leads to uncertainty of action by a state. He states that defeated countries like Germany following World War II, which fought conventionally, will at the very worst survive with limited suffering. Nuclear deterrence assures a totality of defeat; therefore, no rational actor will risk destruction.16 Instead of instability and uncertainty, nuclear weapons increase stability and certainty, making "miscalculation difficult and politically pertinent predictions easy."17

Nuclear weapons deter conflict.
Waltz, 81 [Kenneth, a member of the faculty at Columbia University, one of the most prominent scholars of international relations (IR) alive today, co-founder of neorealism, or structural realism, in international relations theory, "The Spread of Nuclear Weapons: More May Be Better," Adelphi Papers, Number 171 (London: International Institute for Strategic Studies)]
The world has enjoyed more years of peace since 1945 than had been known in this century—if peace is defined as the absence of general war among the major states of the world. The Second World War followed the first one within twenty-one years. As of 1980, 35 years had elapsed since the Allies' victory over the Axis powers. Conflict marks all human affairs. In the past third of a century, conflict has generated hostility among states and has at times issued in violence among the weaker and smaller ones. Even though the more powerful states of the world have occasionally been direct participants, war has been confined geographically and limited militarily. Remarkably, general war has been avoided in a period of rapid and far-reaching changes—decolonization; the rapid economic growth of some states; the formation, tightening, and eventual loosening of blocs; the development of new technologies; and the emergence of new strategies for fighting guerrilla wars and deterring nuclear ones. The prevalence of peace, together with the fighting of circumscribed wars, indicates a high ability of the post-war international system to absorb changes and to contain conflicts and hostility. Presumably features found in the post-war system that were not present earlier account for the world's recent good fortune. The biggest changes in the post-war world are the shift from multipolarity to bipolarity and the introduction of nuclear weapons. Nuclear weapons have been the second force working for peace in the post-war world. They make the cost of war seem frighteningly high and thus discourage states from starting any wars that might lead to the use of such weapons. Nuclear weapons have helped maintain peace between the great powers and have not led their few other possessors into military adventures.5 Their further spread, however, causes widespread fear. Much of the writing about the spread of nuclear weapons has this unusual trait: It tells us that what did not happen in the past is likely to happen in the future, that tomorrow's nuclear states are likely to do to one another what today's nuclear states have not done. A happy nuclear past leads many to expect an unhappy nuclear future. This is odd, and the oddity leads me to believe that we should reconsider how weapons affect the situation of their possessors. States coexist in a condition of anarchy. Self-help is the principle of action in an anarchic order, and the most important way in which states must help themselves is by providing for their own security. Therefore, in weighing the chances for peace, the first questions to ask are questions about the ends for which states use force and about the strategies and weapons they employ. The chances of peace rise if states can achieve their most important ends without actively using force. War becomes less likely as the costs of war rise in relation to possible gains. Strategies bring ends and means together. How nuclear weapons affect the chances for peace is seen by considering the possible strategies of states. Force may be used for offense, for defense, for deterrence, and for coercion. Consider offence first. Germany and France before World War I provide a classic case of two adversaries each neglecting its defence and both planning to launch major attacks at the outset of war. France favoured offence over defence, because only by fighting an offensive war could Alsace-Lorraine be reclaimed. This illustrates one purpose of the offence: namely, conquest. Germany favoured offence over defence, believing offence to be the best defence, or even the only defence possible. Hemmed in by two adversaries, she could avoid fighting a two-front war only by concentrating her forces in the West and defeating France before Russia could mobilize and move effectively into battle. This is what the Schlieffen Plan called for. The Plan illustrates another purpose of the offence: namely, security. Even if security had been Germany's only goal, an offensive strategy seemed to be the way to obtain it. How can one state dissuade another state from attacking? In either or in some combination of two ways. One way to counter an intended attack is to build fortifications and to muster forces that look forbiddingly strong. To build defences so patently strong that no one will try to destroy or overcome them would make international life perfectly tranquil. I call this the defensive ideal. The other way to inhibit a country's intended aggressive moves is to scare that country out of making them by threatening to visit unacceptable punishment upon it. 'To deter' literally means to stop someone from doing something by frightening him. In contrast to dissuasion by defence, dissuasion by deterrence operates by frightening a state out of attacking, not because of the difficulty of launching an attack and carrying it home, but because the expected reaction of the attacked will result in one's own severe punishment. Defence and deterrence are often confused. One frequently hears statements like this: 'A strong defence in Europe will deter a Russian attack'. What is meant is that a strong defence will dissuade Russia from attacking. Deterrence is achieved not through the ability to defend but through the ability to punish. Purely deterrent forces provide no defence. The message of a deterrent strategy is this: 'Although we are defenceless, if you attack we will punish you to an extent that more than cancels your gains'. Second-strike nuclear forces serve that kind of strategy. Purely defensive forces provide no deterrence. They offer no means of punishment. The message of a defensive strategy is this: 'Although we cannot strike back, you will find our defences so difficult to overcome that you will dash yourself to pieces against them'. The Maginot Line was to serve that kind of strategy. **We don’t endorse gendered language.

Comparative ev – nukes prevent worse and more violent alternatives. Quinlan, 93 [Sir Michael, July, “The Future of Nuclear Weapons: Policy for Western Possessors,” International Affairs, JSTOR]Alternatively, international behaviour might become so securely accustomed to the 'no war' assumption that it could always and everywhere be relied upon, even amid stresses. Such possibilities,

however, are at best far too remote to be a dependable basis for security policy now. If we want nuclear weapons to help deter war – or limit its incidence and severity – in a world like ours today, they have to have some physical actuality. If we further

judge (as we reasonably now can) that fears of nuclear war arising purely by accident, just from the existence of any nuclear weapons at all, can be discounted as far-fetched, there then remains a strong fundamental case of a general kind for maintaining in being some properly controlled military

nuclear capability. The more particular fact that the world faces a phase of highly uncertain political development reinforces the general

argument. Put another way, that argument is that the world community need not and should not allow circumstances to arise in which there could be any temptation to revert to the illusion that major war could be dependably held to tolerable levels of destructiveness and so could again be an option for settling serious differences among advanced military powers. Nor do we want circumstances to arise in which a state with a risk-taking leadership (or one feeling under especial threat) might be tempted to gamble on a clandestine dash to seize advantage through a period of sole nuclear possession. We cannot tell precisely how high such risks are; our concern must be to

avoid ever having to find out the hard way. The point is further reinforced by the tacit value of nuclear weapons as an underlying element of deterrence to intolerable use of force in other ways, for example with chemical or biological weapons. (The 'negative security assurances' whereby the Western nuclear powers in the late 1970s formally renounced the option of nuclear use against non-nuclear powers theoretically excludes that last argument. But Saddam Hussein is unlikely to have felt confident in 1991 of relying upon the exclusion; and that lack of confidence seems both realistic and healthy.) In brief, a purportedly non-nuclear world would be likely to be a more dangerous world, even if the formidable problems of transition from our present state could be effectively managed.

Nukes substantially reduce the probability of conflict escalation. Gartzke, Kroenig & Rauchhaus, 08 [Robert Rauchhaus: Assistant Professor of political science at the University of California, Santa Barbara; Erik Gartzke: PhD, Associate Professor of Political Science, University of California at San Diego; Matthew Kroenig: assistant professor of Government at Georgetown University and research fellow at Harvard University, 11/9/08, Belfer Center for Science and International Affairs at Harvard University; “A Strategic Approach to Nuclear Proliferation”]The second set of papers considers the consequences of nuclear proliferation. Taken together, they find that nuclear weapons do not affect the frequency of conflict, but they do affect the timing, duration, severity, and outcome of conflict. These papers provide considerable support for the argument that nuclear weapons enhance the security and diplomatic power of their possessors. Nuclear weapon states are neither more nor less conflict prone, but their conflicts are shorter and less intense, and they tend to emerge victorious from them. Furthermore, the authors find that nuclear powers enjoy enhanced international bargaining power. Gartzke and Jo’s paper examines the effect of nuclear weapon possession on the probability of conflict. They find that nuclear weapons have no overall effect. Nuclear weapon states are neither more nor less likely to be involved in international disputes. Instead, they argue that

even if nuclear weapons do not directly affect the probability of conflict, nuclear weapons status can still influence the allocation of resources and bargains in favor of nuclear powers. States may be able to use nuclear weapons strategically in order to garner international influence. To test the hypothesis that nuclear weapon states enjoy greater influence, Gartzke and Jo examine whether nuclear possession affects patterns of diplomatic missions. Important states send and attract diplomatic missions to and from other nations. The authors build on previous research on diplomatic missions and carefully control for other relevant factors including population and economic size. They find that nuclear weapon states tend to host greater numbers of diplomatic missions. The primary effect of nuclear proliferation on international politics is not a reduction or increase in the probability of conflict, but greater international influence for their possessors. Michael Horowitz examines how the length of time of nuclear possession affects crisis behavior. If a state’s capabilities and resolve, and the way in which a state’s capabilities and resolve are perceived by adversaries, influences the probability of conflict, then the probability of conflict may change over time as nuclear learning occurs. Using multiple statistical models, Horowitz finds that when states acquire nuclear weapons they are more likely to reciprocate international disputes and are also more likely to have their disputes reciprocated. Over time, however, this effect reverses. Inexperienced nuclear states are more dispute-prone, while experienced nuclear states are less so. Consistent with the theme of this issue, nuclear weapons improve the strategic position of their possessor. The longer a state possesses nuclear weapons, the less likely it is to become involved in disputes. This finding also has important implications.
Any static understandings of nuclear proliferation are likely incomplete because they ignore how nuclear possession interacts with time to influence international conflict behavior. 

 

EXT. COLD WAR PROVES
Cold war proves prolif guarantees peace. Collins, 5-20-2010 [Bennett, University of St Andrews, “Assessing the Arguments For and Against Nuclear Proliferation”, http://www.e-ir.info/?p=4147, SM] For proliferation advocates, such as Kenneth Waltz, nuclear deterrence is one of the greatest, if not one of the best, ways to bring about peace. Nuclear deterrence is seen by Waltz as a state’s ability to withhold the means to cancel the gains of an attacker.[1] Waltz basically argues that a deterrent strategy makes it unnecessary for a country to fight for the sake of increasing its security, and that this removes a major cause of war. Henceforth there is the argument that the Cold War is a prime example of how nuclear deterrence brought about one of the longest eras of peace the world had ever seen. The

world observed two nuclear powers at their best; not attacking one another, learning to adapt to other nuclear powers, and helping other states evolve into nuclear powers with deterrent strategies, essentially promoting peace through fear. It is the general assumption of Waltz that no matter the number of nuclear states, a nuclear world becomes tolerable when they are able to send deterrent messages out.[2] To Waltz and other proliferation advocates, a select few should not be held above the rest when given nuclear capabilities. In fact, Waltz criticizes Westerners for fearing nuclear weapons will reach the hands of leaders in the Third World. He states that all rulers want a state to lead and as a result would not threaten that goal by bringing about nuclear war.

[3] To sum it up, Waltz argues that nuclear deterrence should be seen as a form of peace that should be available to all states to obtain through the possession of nuclear arms.

Cold War proves nuclear deterrence is effective. Freedman, 2K [Lawrence, Professor of War Studies at King's College London, was a foreign policy adviser to Tony Blair, “Does Deterrence Have a Future?” Oct., http://www.armscontrol.org/act/2000_10/detoct00, SM] For some four decades, deterrence was at the center of U.S. defense policy. Every move was made with reference to its requirements—a

centrality that was due to three important presentational features. First, it sounded robust without being reckless. Forces were not being used to compel a change in the status quo, only to contain an enemy. Second, it was hard to think of a better way to make sense of a nuclear inventory. Preparing to use nuclear weapons as if they were conventional seemed criminal, yet the prospect of nuclear war appeared to encourage a welcome caution all around. If, as it seemed, there was no way of getting out of the nuclear

age, then deterrence made the best of a bad job. Third, it seemed to work. Exactly how it worked was often difficult to explain, and historians can point to some terrifying moments when catastrophe was just around the corner. But World War III did not happen, and the fact that the superpowers were scared of this war surely had something to do with its failure to materialize. In the 1960s, the role of nuclear weapons in securing superpower restraint came to be recognized in an almost formulaic way as "mutual assured destruction." So long as each side was confident that it could inflict utter hell on the other and it was understood that both would need to buy new weapons in a precautionary way to sustain such confidence, then a wider political equilibrium was possible. There was, however, an awkward thought at the heart of this concept, which is why its critics seized on the acronym "MAD": if a nuclear war meant an inevitable slide into the ultimate catastrophe, then who would be irrational enough to set it in motion? If the argument was that circumstances could produce irrationality, carrying the risk of the ultimate madness, then how could we be confident that these dangerous circumstances would not arise over incidents that were comparatively trivial in their origins, conflicts marked by confusion rather than unremitting belligerence? In practice, the political circumstances never quite arose for nuclear employment, although there were times when it seemed close, such as during the early 1960s and early 1980s. One conclusion that might be drawn from the Cold War experience is that

deterrence worked because it was not asked to do too much. The East-West conflict became institutionalized and relatively stable over time.

AT COLD WAR THEORY BAD/DOESN’T APPLY Irrelevant – theories are still valid with non-superpowers. Karl, 98 [David, president of the Asia Strategy Initiative, a consultancy based in Los Angeles, and a lecturer in international relations at the University of Southern California, “Prolif Pessimism and Emerging Nuclear Powers” JSTOR, SM] Although this school bases its claims upon the U.S.-Soviet Cold War nuclear relationship, it admits of no basic exception to the imperatives of nuclear deterrence. Nothing within the school's

thesis is intrinsic solely to the super-power experience. The nuclear "balance of terror" is seen as

far from fragile. Nuclear-armed adversaries, regardless of context, should behave toward each other like the superpowers during the Cold War's "nuclear peace." The reason for this near-absolute claim is the supposedly immutable quality of nuclear weapons: their presence is the key variable in any deterrent situation, because fear of their devastating consequences simply overwhelms the operation of all other factors.10 Martin van Creveld alleges that "the leaders of medium and small powers alike tend to be extremely cautious with regard to the nuclear weapons they possess or with which they are faced-the proof being that, to date, in every region where these weapons have been introduced, large-scale interstate warfare has disappeared."11 Shai Feldman submits

that "it is no longer disputed that the undeclared nuclear capabilities of India and Pakistan have helped stabilize their relations in recent years. It is difficult to see how escalation of the conflict over Kashmir could have been avoided were it not for the two countries' fear of nuclear escalation." The spread of nuclear weapons technology is thus viewed by optimists as a positive development, so much so that some even advocate its selective abettance by current nuclear powers.12

Nuclear deterrence still works. Freedman, 2K [Lawrence, Professor of War Studies at King's College London, was a foreign policy adviser to Tony Blair, “Does Deterrence Have a Future?” Oct., http://www.armscontrol.org/act/2000_10/detoct00, SM] The possibilities of deterrence should not be ignored. As an approach to security policy, deterrence still has a role to play, although

not the role it was granted during the Cold War. Deterrence still helps explain why states, and even non-state actors, fail to act against the interests of others. Actors may be deterred because they have constructed a possible future in which they are worse off. These caution-inducing constructions are often developed without any external help, but occasionally they can be encouraged by various combinations of statements and military deployments, put together as deliberate acts of deterrence. In this way, we can understand deterrence as a feature and consequence of good strategy. So, at one level deterrence never goes away. Certain options, whole categories of actions, are precluded because of the possible responses of others. Land may be coveted, but it is not grabbed; the unacceptable practices of foreign governments are denounced, but they are left untouched; ideological ambitions are shelved; inconveniences, disruptions, and outrages are tolerated; punches are pulled. Over time, after operations have been delayed and plans shelved, it is forgotten that these operations were ever proposed or that the plans were once taken seriously.

Deterrence works – prolif makes nuclear wars impossible and rationality is irrelevant. Waltz, 95 [Kenneth, member of the faculty at Columbia University and one of the most prominent scholars of international relations (IR) alive today, co-founder of neorealism, or structural realism, in international relations theory, “The Spread of Nuclear Weapons: A Debate,” W. W. Norton & Company, October 1995, p. 112-113] CMR Third, Sagan and others use the term "deterrence theory" or even "rational deterrence theory." Deterrence is not a theory. Instead, deterrent policies derive from structural theory, which emphasizes that the units of an international-political system must tend to their own security as best they can. The

means available for doing so shape the policies of states and, when nuclear weapons become available, lead them in fact to take deterrent stances even though they may still talk about the need to be able to defend and to fight for their nations' security. In applying theories, one considers salient conditions in the world, and

nothing is more salient than nuclear weapons. Moreover, deterrence does not rest on rationality,

whatever that term might mean. By a simple definition, one is rational if one is able to reason. A little reasoning leads to the conclusions that to fight nuclear wars is all but impossible and that to launch an offensive that

might prompt nuclear retaliation is obvious folly. To reach those conclusions, complicated calculations are not required, only a little common sense.24

1NC PROLIF GOOD – MISCALC Proliferation stops miscalculation – risks of nuclear war are too clear. Roth, 2007 [Ariel Ilan, Associate Dir. National Security Studies – Johns Hopkins U. and Visiting Assistant Prof. IR – Goucher College, International Studies Review, “REFLECTION, EVALUATION, INTEGRATION Nuclear Weapons in Neo-Realist Theory”, 9, p. 369-384]No such potential for miscalculation exists in a nuclear conflict. In several papers and articles, as well as a co-authored book, Waltz makes explicit his belief that nuclear weapons eliminate (or at least severely reduce) the likelihood of miscalculation of the degree to which a war will be costly. Because, according to Waltz, one of the main engines for war is uncertainty regarding outcomes and because the immense destruction that can come as the result of a nuclear exchange can be fully anticipated, it is never rational to engage in a war where the possibility of a nuclear exchange exists. Consequently, as

Waltz (1990:740) forcefully argues, “the probability of major war among states having nuclear weapons approaches zero.”

EXT. MISCALC More ev – prolif stops miscalc. Roth, 07 [Ariel Ilan, Associate Dir. National Security Studies – Johns Hopkins U. and Visiting Assistant Prof. IR – Goucher College, International Studies Review, “REFLECTION, EVALUATION, INTEGRATION Nuclear Weapons in Neo-Realist Theory”, 9, p. 369-384]Waltz’s stand on the unique restraining influence of nuclear weapons is unambiguously clear. Waltz (1990:732) states that ‘‘the absolute quality of nuclear weapons sharply sets a nuclear world off from a conventional one.’’ It is true that in

his form of structural neo-realist theory war remains possible because of the self-help nature of the system, but war, for Waltz, is a rational act and as such subject to considerations of costs and benefits. States and their leaders engage in a series of calculations wherein they consider their aims for a war, the prospects for victory, and finally, and most importantly, how the war will affect their attainment of security. The fact that certain states initiate a war they expect will better their long-term security and end up both losing the war and having a worsened security situation does not mean, necessarily, that the war was irrational. What Waltz argues is that, since in a conventional war it is possible to miscalculate how things will turn out, the decision to go to war is rational given the interpretation of the data available at the time that hostilities were initiated. In fact, (Waltz and Sagan

2003:8) goes further, suggesting that even if foreknowledge of eventual defeat is perceived, the decision to fight is still rational because much can happen on the battlefield and losing would not mean losing too much anyway. Predicting the results of a conventional conflict, says Waltz, ‘‘has proved difficult’’ to say the least. 

No miscalc – nuclear forces are too important. Waltz, 95 [Kenneth, member of the faculty at Columbia University and one of the most prominent scholars of international relations (IR) alive today, co-founder of neorealism, or structural realism, in international relations theory, “The Spread of Nuclear Weapons: A Debate,” W. W. Norton & Company, October 1995, p. 98-99]America has long associated democracy with peace and authoritarianism with war, overlooking that weak authoritarian states often avoid war for fear of upsetting the balance of internal and external forces on which their power depends. Neither Italy nor Germany was able to persuade Franco's Spain to enter World War II. External pressures affect state behavior with a force that varies with conditions. Of all of the possible external forces, what could affect state behavior more strongly than nuclear weapons? Who cares about the "cognitive" abilities of leaders when nobody but an idiot can fail to comprehend their destructive force? What more is there to learn? How can leaders miscalculate?

For a country to strike first without certainty of success, all of those who control a nation's nuclear weapons would have to go mad at the same time. Nuclear reality transcends political rhetoric. Did our own big words, or the Soviet Union's prattling about nuclear war-fighting, ever mean anything? Political, military, and academic hard-liners imagined conditions under which we would or should be willing to use nuclear weapons. None was of relevance. Nuclear weapons dominate strategy. Nothing can be done with them other than to use them for deterrence. The

United States and the Soviet Union were both reluctant to accept the fact of deterrence. Weaker states find it easier to substitute deterrence for war-fighting, precisely because they are weak.

1NC NPT – DEFENSE Nonproliferation regime fails – only thing preventing prolif is political leaders. Potter & Mukhatzhanova, 2008 [William C. and Gaukhar, * Sam Nunn and Richard Lugar Professor of Nonproliferation Studies and Director of the James Martin Center for Nonproliferation Studies at the Monterey Institute of International Studies and ** Research Associate at the James Martin Center, “Divining Nuclear Intentions: a review essay.” International Security, Vol. 33, No. 1 (Summer 2008), pp. 139–169, Google scholar] The more extreme position of the two authors is staked out by Hymans, for whom the real proliferation puzzle is not why there are so few nuclear weapons possessors, but why there are any at all (p. 8). Hymans finds the major international relations paradigms—realism, institutionalism, and constructivism—of limited utility in explaining the slow pace of proliferation and those rare instances of its occurrence. The answer to the puzzle, he believes, has to do primarily with the lack of motivation on the part of nearly all state leaders. Put simply, he argues, nonproliferation restraint stems less from external efforts to stop states from going nuclear, and more from “the hearts of state leaders themselves” (p. 7). Contrary to conventional wisdom, he maintains, few national political figures have either the desire or certitude to go nuclear (p. 8).

According to Hymans, although the nonproliferation regime may have many virtues, the appearance of its success in containing proliferation results mainly from the fact “that few state leaders have desired the things it prohibits” (ibid.). A major determinant of this reticence to pursue nuclear weapons, Hymans explains, is the revolutionary nature of a decision to acquire them, which is recognized as such by all top decisionmakers. Only leaders who possess a deep-seated “national identity conception” (NIC) of a particular type will acutely perceive the need for the bomb and have the exceptional willpower to take that extraordinary step (p. 12).

NPT fails – it doesn’t prevent prolif and makes conflict more likely. Collins, 5-20-2010 [Bennett, University of St Andrews, “Assessing the Arguments For and Against Nuclear Proliferation”, http://www.e-ir.info/?p=4147, SM] However, these treaties can easily be discredited as being effective in the long run through simply showing observances on the structure of the treaties themselves. For example, the combined facts that not only have not all proliferated states signed the NPT, but the treaty even accepts nuclear proliferation for some states.[14] This ultimately furthers the status quo by retaining a global hierarchy of proliferated states and as a result, makes states question as to why a select few are entrusted with such disastrous means. ‘As long as nations maintain nuclear deterrence as essential for protection, it will hard to deny the right to other countries.’[15] In modern day, the world sees that such institutions that promote a status quo to be maintained will always have its dissenters who wish to change the global order. North Korea’s indigenously created nuclear capabilities are a testament to the ineffectiveness international institutions will hold over its signatories. Henceforth, the nonproliferation attempts to see global cooperation can only be seen effective when complete disarmament on the account of all states is seen. In addition, through a more proliferation-bias, these international institutions have done little to ease the spread of nuclear weapons in the world. Besides pressuring superpowers to encourage nonproliferation and to disarm their already massive

arsenals,[16] the treaties are simply institutionalizing the concept of selective proliferation into an international norm.[17] Selective proliferation advocates find that nuclear powers should act as models for the ‘inevitable spread’ of nuclear weaponry as opposed to disarming themselves.[18] Nonproliferation supporter Robert Fischer claims that ‘even in the unlikely event of total disarmament, nuclear weapons would hold the existential threat over states deterring nations from resorting to conventional war.’[19] Thus to proliferation advocates, such as Mearsheimer and Waltz, such treaties are a waste of time and are simply delaying the inevitable as opposed to helping states learn to live side-by-side with international nuclear proliferation.[20]

EXT. DEFENSE
NPT is inadequate to deal with prolif – strategic uncertainties and U.S. firepower make nuclear weapons a viable option for potential proliferants. Wesley, 2005 [Michael S., executive director of the Lowy Institute for International Policy and former Professor of International Relations at Griffith University, “It’s time to scrap the NPT,” Australian Journal of International Affairs, Vol. 59, No. 3, ed. by W. Tow, pp. 283-299]The drivers of proliferation among several of Asia’s emerging great powers combine both mounting demand-side incentives and crumbling supply-side controls. Neither of these can adequately be addressed by the NPT in its current state. The major demand-side incentives are greater strategic uncertainty among regional powers and a rising thirst for international prestige. At the global level, the actions and statements of the United States, which currently combines a belief in its unassailable power with a post-11 September 2001 conviction of its unrivalled vulnerability, have increased the strategic uncertainties of many states. The current US preoccupation with terrorism and non-proliferation and recent high-visibility demonstrations of US air power have enhanced the credibility of Washington’s threats of coercion against ‘rogue states’. As the United States’ inhibitions against the use of force have fallen, the attractiveness of nuclear weapons – the ultimate insurance policy – has risen. In Asia, a newly intense pattern of competition and collusion among the current and emerging great powers has further increased the attractiveness of nuclear weapons. China, Japan, India, Russia and Iran have reacted to a range of recent changes – rising prosperity, regime change in Afghanistan and Iraq, patterns of alignment and basing during the ‘war on terror’, uncertainties over energy security – to create a shifting pattern of alignments and tensions that are yet to settle into a stable and predictable template. In the

meantime, this new great power manoeuvring has begun to link up previously separate security dyads and complexes, as combinations of powers jostle for position in Northeast, Southeast, Central, South and Western Asia. This is a fluid and potentially dangerous power dynamic, as Asia’s powers are yet to settle among themselves issues of status, spheres of influence, regional norms of behaviour, patterns of alignment and enmity and tacit conditions governing the use of force. Meanwhile, the threat perceptions of many middle and smaller powers have been raised. As regional rivalries drive various containment and counter- containment strategies (see Paul 2003), and

increased strategic uncertainty raises states’ security concerns, the demand-side pressures for nuclear weapons will continue to mount. The other major demand-side driver of proliferation is the growing thirst for status among Asia’s emerging great powers. Rising prosperity and growing nationalism has fed a renewed interest in gaining symbols of international prestige and influence. The campaign of states such as Japan, India, Indonesia and Brazil for permanent seats on the UN Security Council is one manifestation of the new hunger for

prestige. Membership of the ‘nuclear club’ has long been recognised as another tacit symbol of great power status. Possession of nuclear weapons is one indicator of membership in the great power ‘club’. The ability to design and manufacture nuclear warheads and ballistic missiles is thought to signal high levels of technological competence, a particularly important status symbol for developing countries (Navais 1990: 9–13).

The NPT doesn’t prevent prolif – spread inevitable. Wesley, 2005 [Michael S., executive director of the Lowy Institute for International Policy and former Professor of International Relations at Griffith University, “It’s time to scrap the NPT,” Australian Journal of International Affairs, Vol. 59, No. 3, ed. by W. Tow, pp. 283-299]The NPT’s inability either to prevent the spread of nuclear components, materials and technology, or to secure the nuclear disarmament of the nuclear weapons states (as discussed below), only

adds to these demand-side pressures. In developing nuclear weapons, Israel, India, Pakistan, North Korea and probably Iran have demonstrated that neither the NPT nor any other international regime provides them with an adequate security guarantee against either nuclear or conventional coercion. To the contrary, by confining the possession of nuclear weapons to some states and not others, the NPT has raised the attractiveness of nuclear weapons for those states not covered by the nuclear weapons states’ guarantees of extended deterrence. These demand-side pressures

suggest that the incentives of a small number of states to acquire nuclear weapons will endure over time. Each new nuclear weapons state will give rise to proliferation incentives among a limited number of neighbours and rivals, thereby maintaining a fairly consistent level of proliferation

pressure over time. As I discuss below, because the vast majority of states choose to eschew nuclear weapons, because their sense of insecurity is insufficient to justify the costs of possessing nuclear weapons, the risks of a major nuclear ‘break out’ are low. It is the conditions of proliferation, rather than its

occurrence, that a new regime should try to regulate. While the demand-side pressures for proliferation will continue, the supply-side restrictions have crumbled and are unlikely to be rebuilt. In the words of one technical expert, ‘one by one, the barriers to proliferation are gradually falling, and for those states that anticipate continuing security challenges, there may be a strong temptation during the first decades of this century to proliferate’ (Erickson 2001: 46). On the one hand, the economic and technological barriers to acquiring nuclear components and technology are falling. Most potential nuclear weapons states are becoming wealthier at the same time as the costs of building a nuclear weapons program are falling. Globalisation has led to the broad dispersal of sophisticated project management skills, while the international education market and the fact that the basic knowledge required to make nuclear weapons is now nearly 50 years old means that the technological competence required for a viable nuclear program is no longer a rare commodity (Zimmerman 1994). On the other hand, the effectiveness of export controls has eroded. The post-Cold War priority of economic growth and integration led to the abolition of most blanket restrictions on dual-use technology exports and a reduction of the range of dual-use military technology subject to export controls (Saunders 2001: 127–8).

States such as Russia and China have engaged in a form of diplomatic rent seeking by continuing to export nuclear technology and dual-use materials to potential proliferators*/sometimes at the cost of substantial financial losses and threats of US sanctions

(Diaconu and Maloney 2003)*/in order to gain diplomatic influence and weaken US leverage over key regional states. If this combination of demand-side and supply-side conditions leads to several states’ moves towards proliferation in the years ahead, the NPT will be singularly unable to prevent it, or to stabilise the process of proliferation. 

NPT is failing – inequality makes noncompliance likely.
Wesley, 2005 [Michael S., executive director of the Lowy Institute for International Policy and former Professor of International Relations at Griffith University, “It’s time to scrap the NPT,” Australian Journal of International Affairs, Vol. 59, No. 3, ed. by W. Tow, pp. 283-299]
The unfairness of the NPT risks generating cynicism among states about their obligations under the treaty, and therefore impacts directly on its effectiveness. Friedrich Kratochwil (1989) has argued convincingly that states do not follow rules out of a sense of unreflective obligation or blind habit, but on the basis of explicitly developed justifications derived from socially shared conceptions of rationality and justice. Because the NPT effectively enshrines an unequal distribution of the security and status conferred by nuclear weapons, it contravenes the principles of natural justice. This in turn detracts from its legitimacy and ultimately from its effectiveness. As Abram Chayes and Antonia Chayes have argued, ‘a system in which only the weak can be made to comply with their undertakings will not achieve the legitimacy needed for reliable enforcement of treaty obligations’ (1998: 3). Furthermore, by effectively making the prohibition of the spread of nuclear weapons a higher priority than the eradication of the nuclear arsenals of the nuclear weapons states, the NPT regime implies that some states, and not others, can be trusted with nuclear weapons. The implicit judgement the regime makes about competence and trustworthiness only further aggravates the status inequality issues that plague the NPT. By arguing that the NPT enshrines a system of ‘nuclear apartheid’, Indian leaders and diplomats rehearsed many of these issues in their defence of India’s nuclear tests in 1998. The effectiveness of this line of argument, plus the fact that interests deemed more important than non-proliferation soon brought an end to most states’ sanctions against India and Pakistan, have done a great deal of damage to the moral authority of the NPT.

Empirically unsuccessful.
Gwertzman, 5-19-2010 [5/19, Bernard, Consulting Editor, CFR, “The Logic of a Nuclear-Free Mideast,” http://www.cfr.org/publication/22153/logic_of_a_nuclearfree_mideast.html]

It has been more than forty years since the Nonproliferation Treaty went into effect. What is your impression now of that treaty? The treaty was meant to prevent the emergence of new nuclear weapon states, disarm those that had these weapons, and ensure that peaceful nuclear technology was accessible to all. These are commendable goals even if we had to start with some states being nuclear armed and others not. Forty years later we have more nuclear weapon states and a larger number of nuclear warheads, continuous proliferation concerns and inaccessible peaceful nuclear technology. This is truly disappointing, and it is petty to say we expected worse. The treaty is still important. However, it is becoming stale and could become irrelevant if not nurtured with real disarmament measures and greater equity. Cooperation is required if it is to meet the challenges of our time, particularly the emergence of new nuclear states, non-state parties, and the dissemination of technology.

NPT lacks credibility and accountability – renders it useless.
Wesley, 2005 [September, Michael, Executive Director of the Lowy Institute for International Policy, Australian Journal of International Affairs, “It’s Time To Scrap the NPT,” EBSCO]
Many countries regard the NPT as deeply unfair, because it effectively solidifies an inequality in international relations that accords some states the (albeit questionable) status and security conferred by nuclear weapons while denying it to others.1 Although the NPT commits nuclear weapons states to eradicating their nuclear arsenals, after over a quarter of a century they have made only partial moves towards fulfilling this undertaking.2 Some commentators have argued that the indefinite extension of the NPT in 1995 further underlined this breach of trust by depriving non-nuclear weapons states of the opportunities provided by periodic renewal (rather than just review) conferences to press the nuclear weapons states on nuclear disarmament or to pressure Israel over its covert nuclear weapons program (Ogilvie-White and Simpson 2003: 42). With little prospect of securing the nuclear disarmament of the nuclear weapons states, the continuation of the NPT has become farcical.3 Commenting on the 2000 NPT Review Conference, two observers noted that ‘agreement on the Final Document had been possible only because many of the provisions were capable of varying interpretations, and thus unlikely to be implemented in full’ (Ogilvie-White and Simpson 2003: 43). They also noted that evidence of backsliding on commitments, particularly by the US, was ‘greeted by most delegations with resignation and quiet cynicism, rather than forthright and persistent criticism’ (Ogilvie-White and Simpson 2003: 45). The unwillingness of states to expend diplomatic capital in taking on the US over its commitments indicates that fewer and fewer states continue to regard the integrity of the NPT as a foreign policy priority.

The NPT is failing – its poisonous politics between countries makes success impossible.
The Economist, 5-1-2010 [5/1/10, “If not now, never; Defending the NPT,” Lexis Nexis]
It is not a lack of technical ideas that threatens progress, but the NPT's poisonous politics. Iran and a coterie that includes Cuba, Syria, Venezuela and sometimes Libya (North Korea, thankfully, will be absent) will block what they can—though consensus minus this crew would be no dishonour. If others played spoiler too it would be more damaging still. Some want deeper cuts by the weapons states before they consider tighter rules. Brazil and South Africa (understandably) resent a nuclear deal struck with India that exempted it from the restrictions NPT members are being asked to accept, even though it stayed outside the treaty to get the bomb. Egypt demands that Israel, another NPT hold-out, first be obliged to start negotiating away its nuclear arsenal. These concerns carry weight. But if they are allowed to destroy the conference in a blaze of recrimination, the only victors will be Iran and North Korea. The world's future would be neither secure nor even remotely nuclear-free. This is a time to put the collective interest first.

  

1NC NPT – OPAQUE PROLIF

The NPT is a disaster – doesn’t solve and encourages opaque prolif, which is more destabilizing and makes nuclear terrorism more likely.
Wesley, 2005 [Michael S., executive director of the Lowy Institute for International Policy and former Professor of International Relations at Griffith University, “It’s time to scrap the NPT,” Australian Journal of International Affairs, Vol. 59, No. 3, ed. by W. Tow, pp. 283-299]
My central argument is that the horizontal proliferation of nuclear weapons will probably continue at the rate of one or two additional nuclear weapons states per decade, whether or not the NPT is retained. Persisting with the NPT will make this proliferation much more dangerous than if the NPT is replaced with a more practical regime. I argue that the NPT is a major cause of opaque proliferation, which is both highly destabilising and makes use of transnational smuggling networks which are much more likely than states to pass nuclear components to terrorists. On the other hand, scrapping the NPT in favour of a more realistic regime governing the possession of nuclear weapons would help put transnational nuclear smuggling networks out of business and stabilise the inevitable spread of nuclear weapons. The NPT was always a flawed regime, based on an unequal distribution of status and security. Its apparent effectiveness in containing nuclear proliferation was largely due to other factors. The events of the past 15 years have only magnified the NPT’s flaws. The end of the Cold War decoupled the possession of nuclear weapons from the global power structure. While many commentators were applauding the expansion of the number of NPT signatories, and South Africa, South Korea, Brazil and Argentina renounced plans to acquire nuclear weapons, deeper and more insistent proliferation pressures were building among the emerging great powers of Asia. The succession of Persian Gulf wars demonstrated to many insecure states that only nuclear – not chemical or biological – weapons deter conventional military attack. The international community was repeatedly surprised by the extent and sophistication of Iraq’s, Pakistan’s, North Korea’s and Libya’s progress in acquiring nuclear materials and know-how, each time underlining the inadequacies of the non-proliferation regime. After the 1998 South Asian nuclear tests, India’s highly effective rhetorical defence of its policy and the world’s half-hearted and short-lived sanctions against India and Pakistan damaged the moral authority of the NPT regime, perhaps terminally. Even worse than being ineffective, the NPT is dangerous, because it increases the pressures for opaque proliferation and heightens nuclear instability. Equally flawed, I argue, is the current counter-proliferation doctrine of the United States. I advocate scrapping the NPT (and the doctrine of counter-proliferation) and starting again, because the NPT is a failing regime that is consuming diplomatic resources that could be more effectively used to build an alternative arms control regime that is responsive to current circumstances. We need to confront the practicalities of scrapping the NPT – the positives and the negatives – and think clearly about the requirements of a replacement regime.

The NPT is failing – maintaining the agreement risks nuclear war.
Wesley, 2005 [Michael S., executive director of the Lowy Institute for International Policy and former Professor of International Relations at Griffith University, “It’s time to scrap the NPT,” Australian Journal of International Affairs, Vol. 59, No. 3, ed. by W. Tow, pp. 283-299]
Clearly the NPT’s days are numbered; the major policy challenge is how to design a more stable and effective nuclear weapons regime in a post-NPT world. Unlike the NPT, an effective new regime would concentrate mainly on the nuclear weapons states, and return the attention of the international community to the important issues surrounding the possession, storage and possible use of nuclear weapons. Once the inevitability of limited proliferation is accepted, it will free up the resources and attention needed to develop more traditional and stabilising arms control agreements. As several participants in the debate over proliferation have noted, the stakes are too high with nuclear weapons to rely on conventional risk analyses (Sagan and Waltz 1995). For this reason, it is dangerous to rely on a regime whose shortcomings are unlikely ever to be remedied. It is time to let the NPT go, and begin work on a regime that has a better chance of keeping ours and succeeding generations free of the horrors of nuclear war.

EXT. NPT → PROLIF (GENERAL)

NPT allows uranium enrichment – states will take advantage and proliferate.
Chicago Journal of International Law, 2005 [“NPT, Where Art Thou? The Nonproliferation Treaty and Bargaining: Iran as a Case Study,” July 1, http://www.allbusiness.com/legal/international-law/884004-1.html]
By virtue of the IAEA safeguards system, the nonproliferation regime established by the NPT is intricate and makes it costly for a nonnuclear weapon state to successfully cheat by establishing a clandestine nuclear weapons program. The NPT's broad language, however, gives countries the legal right to enrich and reprocess uranium for peaceful purposes. This default rule is unfortunate for countries that wish to prevent nuclear weapon proliferation because enrichment technology can be easily used to manufacture nuclear weapons, and some countries are willing to incur the costs of cheating. Furthermore, some countries may use the threat of cheating to bargain for economic and security concessions from wealthier nations that wish to maintain the nuclear status quo. Solely criticizing the broad language of the NPT is tempting, but doing so places too much emphasis on the normative value of nonproliferation and too little emphasis on states' self-interest. Such criticism also suffers from hindsight bias; as a matter of political reality, the NPT would have enjoyed little credibility had it sought to restrict the right to develop peaceful nuclear technology. This would have been seen as an attempt by the nuclear weapon states to retain a monopoly on all nuclear technology and would have discouraged countries from signing the NPT in the first place. Conceptualizing the issue as one of bargaining, however, better captures the true state of affairs – states can contract around legal rights that they have acquired through multilateral agreements. Under this view, the NPT does not serve to establish legal norms, but rather gives parties a baseline from which to bargain. Further, regional security uncertainties resulting from the Cold War's end will spur more countries to capitalize on their NPT rights because uranium enrichment will either provide security or enable them to negotiate for better economic and security terms. Therefore, bargaining with threshold states is likely to increase over time. This is not necessarily bad – providing economic and security incentives in exchange for nonproliferation may be more effective than solely relying on multilateral treaty regimes. Of course, the success of bargaining between states is a matter of politics, not law; thus the sustainability of a nonproliferation bargaining regime will ultimately depend on how much it is valued by political leaders.

NPT structure allows access to nuclear materials – spurs prolif.
Wesley, 2005 [Michael S., executive director of the Lowy Institute for International Policy and former Professor of International Relations at Griffith University, “It’s time to scrap the NPT,” Australian Journal of International Affairs, Vol. 59, No. 3, ed. by W. Tow, pp. 283-299]
Some of the causes of the NPT’s declining effectiveness in containing nuclear proliferation have been rehearsed above. However the main cause of its ineffectiveness is structural: as Frank Barnaby observes, ‘The problem is that military and peaceful nuclear programs are, for the most part, virtually identical’ (1993: 126). This directly erodes the viability of the deal that lies at the heart of the NPT: that non-nuclear weapons states agree not to try to acquire nuclear weapons in exchange for assistance with peaceful nuclear programs, should they want them. The NPT and the International Atomic Energy Agency (IAEA) are thus simultaneously engaged in promoting and controlling two types of nuclear technology that are virtually indistinguishable until a point very close to the threshold of assembling the components of a nuclear weapon. For many states that have contemplated the nuclear option, adherence to the NPT thus actually makes it easier to obtain cutting edge nuclear technology and dual-use components that could be applied to a nuclear weapons program (Dunn 1991: 23). As Barnaby argues, ‘Under [Article X of] the NPT, a country can legally manufacture the components of a nuclear weapon, notify the IAEA and the UN Security Council that it is withdrawing from the Treaty, and then assemble its nuclear weapons’ (1993: 124). Although the IAEA’s inspections role has been strengthened during the course of the 1990s, there is little prospect that its powers will be increased to such a level that it will be able to counter the highly sophisticated deception programs mounted by most covert proliferators. The only remedy to this dilemma has been to question the need of states such as Iran for peaceful nuclear power and to doubt the veracity of their statements that they do not intend to acquire nuclear weapons. This only further opens the regime up to charges of selectivity, unfairness and politicisation (Jones 1998).

NPT encourages nuclear energy tech – that incentivizes prolif.
Basini, 2005 [5/14, Mark, The Western Mail, Staff Writer, “why we should disarm before expecting the same of others,” http://findarticles.com/p/news-articles/western-mail-cardiff-wales/mi_8001/is_2005_May_14/disarm-expecting/ai_n37517633/]
One reason why the NPT has so far proved ineffective in preventing the growth of nuclear weapons lies in the treaty itself. As a carrot for not developing nuclear bombs, the major powers promised to provide small states with 'atoms for peace'. They would supply the technology needed to develop peaceful atomic energy, provided the recipients steered clear of bombs. The problem is that the technology needed to produce electricity in nuclear power stations is very similar to that needed for nuclear weapons. Once, for example, a country has the means of enriching uranium for peaceful purposes, it is well on the way to making the fuel it needs to produce bombs.

NPT encourages prolif.
Glick, 6-1-2010 [Reporter, The Jerusalem Post, “Ending Israel’s Losing Streak,” lexis]
If Israel's leaders were better informed, they would have recognized a number of things in the lead-up to the conference. They would have realized that Obama's anti-nuclear conference in April, his commitment to a nuclear-free world, as well as his general ambivalence - at best - to US global leadership rendered it all but inevitable that he would turn on Israel. The truth is that Egypt's call for the denuclearization of Israel jibes with Obama's own repeatedly stated views both regarding Israel and the US's own nuclear arsenal. Armed with this basic understanding of Obama's inclinations, Israel should have taken for granted that the NPT conference would target it. Consequently, in months preceding the conference, it should have stated loudly and consistently that as currently constituted, the NPT serves as the chief enabler of nuclear proliferation rather than the central instrument for preventing nuclear proliferation. North Korea exploited its status as an NPT signatory to develop its nuclear arsenal. Today Iran exploits its status as an NPT signatory to develop nuclear weapons. Unless the NPT is fundamentally revised, it will continue to serve as the primary instrument for nuclear proliferation.

EXT. NPT → OPAQUE PROLIF

NPT causes opaque proliferation – that risks miscalculation and spread of nuclear materials to terrorists.
Wesley, 2005 [Michael S., executive director of the Lowy Institute for International Policy and former Professor of International Relations at Griffith University, “It’s time to scrap the NPT,” Australian Journal of International Affairs, Vol. 59, No. 3, ed. by W. Tow, pp. 283-299]
By prohibiting proliferation, without the capacity or moral authority to enforce such a prohibition, the NPT makes opaque proliferation the only option for aspiring nuclear weapons states.4 Opaque proliferation is destabilising to regional security. It breeds miscalculation – both overestimation of a state’s nuclear weapons development (as shown by the case of Iraq), and underestimation (in the case of Libya) – that can force neighbouring states into potentially catastrophic moves. Even more dangerous, argues Lewis Dunn, is the likelihood that states with covert nuclear weapons programs will develop weak failsafe mechanisms and nuclear doctrine that is destabilising: In camera decision making may result in uncontrolled programs, less attention to safety and control problems and only limited assessment of the risks of nuclear weapon deployments or use. The necessary exercises cannot be conducted, nor can procedures for handling nuclear warheads be practised, nor alert procedures tested. As a result, the risk of accidents or incidents may rise greatly in the event of deployment in a crisis or a conventional conflict. Miscalculations by neighbours or outsiders also appear more likely, given their uncertainties about the adversary’s capabilities, as well as their lack of information to judge whether crisis deployments mean that war is imminent (1991: 20, italics in original). And because both the NPT and the current US counter-proliferation doctrine place such emphasis on preventing and reversing the spread of nuclear weapons, states such as Pakistan, which desperately need assistance with both failsafe technology and stabilising nuclear doctrine, have been suspicious of US offers of assistance (Pregenzer 2003). As the dramatic revelations of the nature and extent of the A. Q. Khan network showed, some states undertaking opaque proliferation have been prepared to rely on transnational smuggling networks to gain vital components, materials and knowledge. Quite apart from the incapacity of the NPT regime to deal with this new form of proliferation (Clary 2004), such non-state networks raise very real risks that for the right price, criminals or other facilitators could pass nuclear materials to terrorist groups or extortionists (Albright and Hinderstein 2005). Both through its inadequacies and its obsessive focus on stopping the spread of nuclear weapons, the NPT could be contributing to the ultimate nightmare: terrorists armed with nuclear or radiological weapons.

NPT pushes countries towards opaque prolif – destabilizes.
Hagerty, 1998 [Devin T, PhD, Associate Professor in UMBC's political science department. He teaches courses in international relations, national security, Asia-Pacific affairs, nuclear proliferation, and qualitative methods in political science, “The Consequences of Nuclear Proliferation: Lessons from South Asia,” 1998 p. 2-3]
At another level, my empirical research is theoretically situated in the analysis of opaque proliferation. Opaque proliferants publicly deny developing nuclear weapons while secretly doing just that. This behavior has characterized every emerging nuclear power since China’s nuclear explosive test in 1964, and the subsequent evolution of a nonproliferation norm of international politics. The Nuclear Non-Proliferation Treaty (NPT) and related components of the non-proliferation regime have not prevented the spread of nuclear weapons; instead they have pushed it underground. This transformation in the prevailing mode of acquiring nuclear weapons has profound implications for nuclear dynamics in the relevant regions – the Middle East, South Asia, and the Korean peninsula. In particular, the nature of nuclear deterrence is fundamentally different in the opaque and transparent proliferation universes. Benjamin Frankel writes: “The theory of nuclear deterrence in its various manifestations is ultimately the waving of a big and visible nuclear stick at a potential aggressor.” But, as Frankel and Avner Cohen note, “there is little in the literature to tell us how a country should plan to use its nuclear weapons to deter its adversaries while denying the possession of these weapons.”

NPT spurs opaque prolif.
Hagerty, 1998 [Devin T, PhD, Associate Professor in UMBC's political science department. He teaches courses in international relations, national security, Asia-Pacific affairs, nuclear proliferation, and qualitative methods in political science, “The Consequences of Nuclear Proliferation: Lessons from South Asia,” 1998 p. 45]
Other sources of opacity. Several other factors may influence the evolution of opaque rather than transparent nuclear postures. Opacity is a way to signal a country's nuclear capabilities and flex some deterrent muscle without antagonizing adversaries into like responses and spurring a destabilizing and expensive nuclear arms race. In this context, India's "peaceful nuclear explosion," followed by its refusal to deploy nuclear weapons, may have been meant to send a message of rough nuclear equivalence to China without driving Pakistan into its own pursuit of nuclear weapons. If this was in fact New Delhi's strategy, it obviously failed. In addition, opacity is much less expensive than a transparent nuclear posture, which for the first-generation nuclear weapon states involved developing redundant and diverse nuclear forces to ensure the survivability of second-strike weapons. Finally, opacity preserves the flexibility that future policymakers may need to denuclearize if security conditions change, without losing face or suffering domestic discontent owing to the popularity of an open nuclear stance. Whatever the relative influence of these factors—and it differs across cases—the most compelling reason for opacity seems to be the belief that it provides deterrent security while avoiding the steep international costs of open deployments.

More ev.
Wesley, 2005 [Michael S., executive director of the Lowy Institute for International Policy and former Professor of International Relations at Griffith University, “It’s time to scrap the NPT,” Australian Journal of International Affairs, Vol. 59, No. 3, ed. by W. Tow, pp. 283-299]
Due to persisting demand-side factors and crumbling supply-side controls, the nuclear Non-Proliferation Treaty (NPT) will probably be unable to prevent a likely proliferation rate of one or two additional nuclear weapons states per decade into the foreseeable future. Beyond being ineffective, I argue that the NPT will make this proliferation much more dangerous. The NPT is a major cause of opaque proliferation, which is both highly destabilising and makes use of transnational smuggling networks which are much more likely than states to pass nuclear components to terrorists. However, abandoning the NPT in favour of a more realistic regime governing the possession of nuclear weapons would help put transnational nuclear smuggling networks out of business and stabilise the inevitable spread of nuclear weapons.

Israel proves.
Frankel, 1991 [Benjamin, Editor of Security Studies. He is completing a manuscript on Henry Kissinger and US national security in the early 1970s and is the editor of In the National Interest (1990); and the International Politics: 1945-1990 volumes of the Twentieth Century Encyclopedia (1991), “Opaque Nuclear Proliferation: Methodological and Policy Implications,” 1991, p. 16]
The current sturdiness of Israel’s nuclear opacity is not the result of the execution of a well-planned design. Rather, it appears to have grown piecemeal. In retrospect it appears to have evolved as a result of a complex series of historical interactions. Although the Israeli declaratory posture gives the impression of continuity and uniformity, the particular contours of the Israeli program are a result of dynamic and dialectical relationships among many players on different levels: domestic, regional, superpower, and the NPT regime. Opacity emerged as the preferred posture for players on all these levels.

***SPACE LEADERSHIP ADVANTAGE

1NC HEGEMONY FRONTLINE

Decline doesn’t cause war.
Geller and Singer, ’99 [Chair of the Department of Political Science @ Wayne State University (Daniel S and Joel David, Nations at war: a scientific study of international conflict, p. 116-117)]
Hopf (1991) and Levy (1984) examine the frequency, magnitude and severity of wars using polarity (Hopf) and “system size” (Levy) as predictors. Hopf’s database includes warfare in the European subsystems for the restricted temporal period of 1495–1559. The system is classified as multipolar for the years 1495–1520 and as bipolar for the years 1521–1559. Hopf reports that the amount of warfare during those two periods was essentially equivalent. He concludes that polarity has little relationship to patterns of war for the historical period under examination. Levy (1984) explores a possible linear association between the number of great powers (system size) and war for the extended temporal span of 1495–1974. His findings coincide with those of Hopf; he reports that the frequency, magnitude and severity of war in the international system is unrelated to the number of major powers in the system.

There are multiple alt causes to decline in US space leadership.
Kaufman, 08 – Washington Post Staff Writer [July 9, 2008, Marc Kaufman, The Washington Post, “U.S. Finds It's Getting Crowded Out There,” Lexis]
NASA and the U.S. space effort, meanwhile, have been in something of a slump. The space shuttle is still the most sophisticated space vehicle ever built, and orbiting observatories such as the Hubble space telescope and its in-development successor, the James Webb space telescope, remain unmatched. But the combination of the 2003 Columbia disaster, the upcoming five-year "gap" when NASA will have no American spacecraft that can reach the space station, and the widely held belief that NASA lacks the funding to accomplish its goals, have together made the U.S. effort appear less than robust. The tone of a recent workshop of space experts brought together by the respected National Research Council was described in a subsequent report as "surprisingly sober, with frequent expressions of discouragement, disappointment, and apprehension about the future of the U.S. civil space program." Uncertainty over the fate of President Bush's ambitious "vision" of a manned moon-Mars mission, announced with great fanfare in 2004, is emblematic. The program was approved by Congress, but the administration's refusal to significantly increase spending to build a new generation of spacecraft has slowed development while leading to angry complaints that NASA is cannibalizing promising unmanned science missions to pay for the moon-Mars effort. NASA's Griffin has told worried members of Congress that additional funds could move up the delivery date of the new-generation spacecraft from 2015 to 2013. The White House has rejected Senate efforts to provide the money. Although NASA's annual funding of $17 billion is large by civilian space agency standards, it constitutes less than 0.6 percent of the federal budget and is believed to be less than half of the amount spent on national security space programs. According to the Futron report, a considerably higher percentage of U.S. space funding goes into military hardware and systems than in any other nation. At the same time, the enthusiasm for space ventures voiced by Europeans and Asians contrasts with America's lukewarm public response to the moon-Mars mission. In its assessment, Futron listed the most significant U.S. space weakness as "limited public interest in space activity." The cost of manned space exploration, which requires expensive measures to sustain and protect astronauts in the cold emptiness of space, is a particular target. "The manned space program served a purpose during the Apollo times, but it just doesn't anymore," says Robert Parks, a University of Maryland physics professor who writes about NASA and space. The reason: "Human beings haven't changed much in 160,000 years," he said, "but robots get better by the day."
Satellite Launches Fall
The study by Futron, which consults for public clients such as NASA and the Defense Department, as well as the private space industry, also reported that the United States is losing its dominance in orbital launches and satellites built. In 2007, 53 American-built satellites were launched – about 50 percent of the total. In 1998, 121 new U.S. satellites went into orbit.

EXT. NO HEG IMPACT
Warmongers are wrong – U.S. leadership doesn’t solve conflict
Sapolsky, Gholz & Press, ’97 (PhDs @ MIT, “Come Home, America,” International Security)

The selective engagers' strategy is wrong for two reasons. First, selective engagers overstate the effect of U.S. military presence as a positive force for great power peace. In today's world, disengagement will not cause great power war, and continued engagement will not reliably prevent it. In some circumstances, engagement may actually increase the likelihood of conflict. Second, selective engagers overstate the costs of distant wars and seriously understate the costs and risks of their strategies. Overseas deployments require a large force structure. Even worse, selective engagement will ensure that when a future great power war erupts, the United States will be in the thick of things. Although distant great power wars are bad for America, the only sure path to ruin is to step in the middle of a faraway fight. Selective engagers overstate America's effect on the likelihood of future great power wars. There is little reason to believe that withdrawal from Europe or Asia would lead to deterrence failures. With or without a forward U.S. presence, America's major allies have sufficient military strength to deter any potential aggressors.

Heg doesn’t solve war
Mastanduno, ’09 [Professor of Government at Dartmouth (Michael, World Politics 61, No. 1, Ebsco)]
During the cold war the United States dictated the terms of adjustment. It derived the necessary leverage because it provided for the security of its economic partners and because there were no viable alternatives to an economic order centered on the United States. After the cold war the outcome of adjustment struggles is less certain because the United States is no longer in a position to dictate the terms. The United States, notwithstanding its preponderant power, no longer enjoys the same type of security leverage it once possessed, and the very success of the U.S.-centered world economy has afforded America’s supporters a greater range of international and domestic economic options. The claim that the United States is unipolar is a statement about its cumulative economic, military, and other capabilities.1 But preponderant capabilities across the board do not guarantee effective influence in any given arena. U.S. dominance in the international security arena no longer translates into effective leverage in the international economic arena. And although the United States remains a dominant international economic player in absolute terms, after the cold war it has found itself more vulnerable and constrained than it was during the golden economic era after World War II. It faces rising economic challengers with their own agendas and with greater discretion in international economic policy than America’s cold war allies had enjoyed. The United States may continue to act its own way, but it can no longer count on getting its own way.

EXT. A/C
Alt cause – ITAR is crushing US space leadership – the aff can’t solve
Kaufman, 08 – Washington Post Staff Writer [July 9, 2008, Marc Kaufman, The Washington Post, “U.S. Finds It's Getting Crowded Out There,” Lexis]

Ironically, efforts to deny space technology to potential enemies have hampered American cooperation with other nations and have limited sales of U.S.-made hardware. Concerned about Chinese use of space technology for military purposes, Congress ramped up restrictions on rocket and satellite sales, and placed them under the cumbersome International Traffic in Arms Regulations (ITAR). In addition, sales of potentially "dual use" technology have to be approved by the State Department rather than the Commerce Department. The result has been a surge of rocket and satellite production abroad and the creation of foreign-made satellites that use only homegrown components to avoid complex U.S. restrictions under ITAR and the Iran Nonproliferation Act. That law, passed in 2000, tightened a ban on direct or indirect sales of advanced technology to Iran (especially by Russia). As a result, a number of foreign governments are buying European satellites and paying the Chinese, Indian and other space programs to launch them. "Some of these companies moved ahead in some areas where, I'm sorry to say, we are no longer the world leaders," Griffin said. Joan Johnson-Freese, a space and national security expert at the Naval War College in Rhode Island, said the United States has been so determined to maintain military space dominance that it is losing ground in commercial space uses and space exploration. "We're giving up our civilian space leadership, which many of us think will have huge strategic implications," she said. "Other nations are falling over each other to work together in space; they want to share the costs and the risks," she added. "Because of the dual-use issue, we really don't want to globalize."

1NC SCIENCE LEADERSHIP FRONTLINE
Science leadership not key to soft power – their evidence mistakes correlation of cooperation for causality
Dickson, ’09 [David, Director, Science & Development Network. June 2, 2009, “Science diplomacy: the case for caution”, http://scidevnet.wordpress.com/category/new-frontiers-in-science-diplomacy-2009, SM]

One of the frustrations of meetings at which scientists gather to discuss policy-related issues is the speed with which the requirements for evidence-based discussion they would expect in a professional context can go out of the window. Such has been the issue over the past two days in the meeting jointly organised in London by the American Association for the Advancement of Science (AAAS) and the Royal Society on the topic “New Frontiers in Science Diplomacy“. There has been much lively discussion on the value of international collaboration in achieving scientific goals, on the need for researchers to work together on the scientific aspects of global challenges such as climate change and food security, and on the importance of science capacity building in developing countries in order to make this possible. But there remained little evidence at the end of the meeting on how useful it was to lump all these activities together under the umbrella term of “science diplomacy”. More significantly, although numerous claims were made during the conference about the broader social and political value of scientific collaboration – for example, in establishing a framework for collaboration in other areas, and in particular reducing tensions between rival countries – little was produced to demonstrate whether this hypothesis is true. If it is not, then some of the arguments made on behalf of “science diplomacy”, and in particular its value as a mechanism for exercising “soft power” in foreign policy, do not stand up to close scrutiny. Indeed, a case can be made that where scientific projects have successfully involved substantial international collaboration, such success is often heavily dependent on a prior political commitment to cooperation, rather than a mechanism for securing cooperation where the political will is lacking. Three messages appeared to emerge from the two days of discussion. Firstly, where the political will to collaborate does exist, a joint scientific project can be a useful expression of that will. Furthermore, it can be an enlightening experience for all those directly involved. But it is seldom a magic wand that can secure broader cooperation where none existed before. Secondly, “science diplomacy” will only become recognised as a useful activity if it is closely defined to cover specific situations (such as the negotiation of major international scientific projects or collaborative research enterprises). As an umbrella term embracing the many ways in which science interacts with foreign policy, it loses much of its impact, and thus its value. Finally, when it comes to promoting the use of science in developing countries, a terminology based historically on maximising self-interest – the ultimate goal of the diplomat – and on practices through which the rich have almost invariably ended up exploiting the poor, is likely to be counterproductive. In other words, the discussion seemed to confirm that “science diplomacy” has a legitimate place in the formulation and implementation of policies for science (just as there is a time and place for exercising “soft power” in international relations). But the dangers of going beyond this – including the danger of distorting the integrity of science itself, and even alienating potential partners in collaborative projects, particularly in the developing world – were also clearly exposed.

Recent budgets terminally destroyed science leadership – full-spectrum cuts destroy US ability to compete scientifically
Orbach, 11 – Energy Institute at The University of Texas at Austin and served as Under Secretary for Science in the United States DOE [Raymond L. Orbach, Director, Energy Institute at The University of Texas at Austin, “Research Vital to Economic Growth”, http://www.energy.utexas.edu/index.php?option=com_content&view=article&id=100:research-vital-to-economic-growth&catid=32:editorials&Itemid=47, SM]

It was with a mixture of astonishment and dismay that I watched as the U.S. House of Representatives approved H.R. 1, a bill to fund the federal government for the rest of the 2011 fiscal year. Left intact, the massive cuts in research contained in the bill passed on 19 February would effectively end America's legendary status as the leader of the worldwide scientific community, putting the United States at a distinct disadvantage when competing with other nations in the global marketplace. Other countries, such as China and India, are increasing their funding of scientific research because they understand its critical role in spurring technological advances and other innovations. If the United States is to compete in the global economy, it too must continue to invest in research programs. As the Under Secretary for Science at the Department of Energy (DOE) in the administration of George W. Bush, I can personally attest that funding for scientific research is not a partisan issue—or at least it shouldn't be. The cuts proposed in H.R. 1 would reverse a bipartisan commitment to double the science research budgets of the National Science Foundation, the DOE Office of Science, and the National Institute for Science and Technology over 10 years. These are national goals supported by both Presidents Bush and Obama, and they were affirmed as recently as last December in the America COMPETES Act. The spending cuts included in the bill would have a devastating effect on an array of critical scientific research. For example, H.R. 1 removes $900 million from the budget for the Office of Science, the basic research arm of the DOE—a reduction of some 20%. The bill specifically targets the Office of Biology and Environmental Research, slicing its budget by 50%; reductions that would all but eliminate funding for the office's three Biological Research Centers, the hope for developing transportation fuels derived from plant cellulose. The hugely successful Energy Frontier Research Centers, which support activities based at 28 universities and 16 national laboratories, would be cut in midstream. The university centers support 1300 students working on the conversion of sunlight and heat into electricity, improved efficiency of photosynthesis in plants for the production of fuels, and enhanced combustion efficiency to increase mileage for automobiles. The work now at risk at the national laboratories includes projects to improve solid-state lighting and the conversion of coal into chemicals and fuels. This research is vitally important if the United States is to be a leader in transforming how humans get and use energy globally, in a way that maintains societal and economic viability. To make matters worse, the bill would also destabilize the large-scale scientific facilities operated by the DOE's Office of Science. These research projects include the country's work with powerful light sources (which other countries are copying en masse), so vitally important to the U.S. biological, medical, and materials communities. Also included are the nation's remaining accelerators, responsible for advances in the high-energy and nuclear science communities; its spallation neutron source and nanotechnology centers, critically important to both university and industrial communities; and the quest for environmentally benign unlimited energy through investment in the International Thermonuclear Experimental Reactor. The budget deficit is serious. But escaping from its clutches requires economic growth as well as budget reductions. Well over half of U.S. economic growth in the past century can be traced to investments in science and technology. To compete in the global economy, the United States must remain a leader in science and technology. For that to happen, the Senate must restore funding for science in the fiscal year 2011 budget. Failure to do so would relegate the United States to second-class status in the scientific community and threaten economic growth and prosperity for future generations of Americans.

No impact to soft power – believers exaggerate benefits – hard power is comparatively more important
Gray 2011 – Professor of International Politics and Strategic Studies at the University of Reading, England. (Colin S., April, “HARD POWER AND SOFT POWER: THE UTILITY OF MILITARY FORCE AS AN INSTRUMENT OF POLICY IN THE 21ST CENTURY.” Published by Strategic Studies Institute)

Soft power is potentially a dangerous idea not because it is unsound, which it is not, but rather for the faulty inference that careless or unwary observers draw from it. Such inferences are a challenge to theorists because they are unable to control the ways in which their ideas will be interpreted and applied in practice by those unwary observers. Concepts can be tricky. They seem to make sense of what otherwise is intellectually undergoverned space, and thus potentially come to control pliable minds. Given that men behave as their minds suggest and command, it is easy to understand why Clausewitz identified the enemy’s will as the target for influence.37 Beliefs about soft power in turn have potentially negative implications for attitudes toward the hard power of military force and economic muscle. Thus, soft power does not lend itself to careful regulation, adjustment, and calibration. What does this mean? To begin with a vital contrast: whereas military force and economic pressure (negative or positive) can be applied by choice as to quantity and quality, soft power cannot. (Of course, the enemy/rival too has a vote on the outcome, regardless of the texture of the power applied.) But hard power allows us to decide how we will play in shaping and modulating the relevant narrative, even though the course of history must be an interactive one once the engagement is joined. In principle, we can turn the tap on or off at our discretion. The reality is apt to be somewhat different because, as noted above, the enemy, contingency, and friction will intervene. But still a noteworthy measure of initiative derives from the threat and use of military force and economic power. But soft power is very different indeed as an instrument of policy. In fact, I am tempted to challenge the proposition that soft power can even be regarded as one (or more) among the grand strategic instruments of policy. The seeming validity and attractiveness of soft power lead to easy exaggeration of its potency. Soft power is admitted by all to defy metric analysis, but this is not a fatal weakness. Indeed, the instruments of hard power that do lend themselves readily to metric assessment can also be unjustifiably seductive. But the metrics of tactical calculation need not be strategically revealing. It is important to win battles, but victory in war is a considerably different matter than the simple accumulation of tactical successes. Thus, the burden of proof remains on soft power: (1) What is this concept of soft power? (2) Where does it come from and who or what controls it? and (3) Prudently assessed and anticipated, what is the quantity and quality of its potential influence? Let us now consider answers to these questions. 7. Soft power lends itself too easily to mischaracterization as the (generally unavailable) alternative to military and economic power. The first of the three questions posed above all but invites a misleading answer. Nye plausibly offers the co-option of people rather than their coercion as the defining principle of soft power.38 The source of possible misunderstanding is the fact that merely by conjuring an alternative species of power, an obvious but unjustified sense of equivalence between the binary elements is produced. Moreover, such an elementary shortlist implies a fitness for comparison, an impression that the two options are like-for-like in their consequences, though not in their methods. By conceptually corralling a country’s potentially attractive co-optive assets under the umbrella of soft power, one is near certain to devalue the significance of an enabling context. Power of all kinds depends upon context for its value, but especially so for the soft variety. For power to be influential, those who are to be influenced have a decisive vote. But the effects of contemporary warfare do not allow recipients the luxury of a vote. They are coerced. On the other hand, the willingness to be coopted by American soft power varies hugely among recipients. In fact, there are many contexts wherein the total of American soft power would add up in the negative, not the positive. When soft power capabilities are strong in their values and cultural trappings, there is always the danger that they will incite resentment, hostility, and a potent “blowback.” In those cases, American soft power would indeed be strong, but in a counterproductive direction. These conclusions imply no criticism of American soft power per se. The problem would lie in the belief that soft power is a reliable instrument of policy that could complement or in some instances replace military force. 8. Soft power is perilously reliant on the calculations and feelings of frequently undermotivated foreigners. The second question above asked about the provenance and ownership of soft power. Nye correctly notes that “soft power does not belong to the government in the same degree that hard power does.” He proceeds sensibly to contrast the armed forces along with plainly national economic assets with the “soft power resources [that] are separate from American government and only partly responsive to its purposes.” 39 Nye cites as a prominent example of this disjunction in responsiveness the fact that “[i]n the Vietnam era . . . American government policy and popular culture worked at cross-purposes.”40 Although soft power can be employed purposefully as an instrument of national policy, such power is notably unpredictable in its potential influence, producing net benefit or harm. Bluntly stated, America is what it is, and there are many in the world who do not like what it is. The U.S. Government will have the ability to project American values in the hope, if not quite confident expectation, that “the American way” will be found attractive in alien parts of the world. Our hopes would seem to be achievement of the following: (1) love and respect of American ideals and artifacts (civilization); (2) love and respect of America; and (3) willingness to cooperate with American policy today and tomorrow. Admittedly, this agenda is reductionist, but the cause and desired effects are accurate enough. Culture is as culture does and speaks and produces. The soft power of values culturally expressed that others might find attractive is always at risk to negation by the evidence of national deeds that appear to contradict our cultural persona.

Turn – Science leadership is becoming globalized – that’s better to solve the world’s problems
Jha, 11 – science correspondent for the Guardian [March 28, 2011, Alok Jha, “China poised to overhaul US as biggest publisher of scientific papers,” http://www.guardian.co.uk/science/2011/mar/28/china-us-publisher-scientific-papers]

China could overtake the United States as the world's dominant publisher of scientific research by 2013, according to an analysis of global trends in science by the Royal Society. The report highlighted the increasing challenge to the traditional superpowers of science from the world's emerging economies and also identified emerging talent in countries not traditionally associated with a strong science base, including Iran, Tunisia and Turkey. The Royal Society said that China was now second only to the US in terms of its share of the world's scientific research papers written in English. The UK has been pushed into third place, with Germany, Japan, France and Canada following behind. "The scientific world is changing and new players are fast appearing. Beyond the emergence of China, we see the rise of South-East Asian, Middle Eastern, North African and other nations," said Chris Llewellyn Smith, director of energy research at Oxford University and chair of the Royal Society's study. "The increase in scientific research and collaboration, which can help us to find solutions to the global challenges we now face, is very welcome. However, no historically dominant nation can afford to rest on its laurels if it wants to retain the competitive economic advantage that being a scientific leader brings." In the report, published on Monday, the Royal Society said that science around the world was in good health, with increases in funding and personnel in recent years. Between 2002 and 2007, global spending on R&D rose from $790bn to $1,145bn and the number of researchers increased from 5.7 million to 7.1 million. "Global spend has gone up just under 45%, more or less in line with GDP," said Llewellyn Smith. "In the developing world, it's gone up over 100%." Over the same period, he added, the number of scientific publications went up by around 25%. To compare the output of different countries, the Royal Society's report collated information on research papers published in two time periods, 1993-2003 and 2004-2008. It counted research papers that had an abstract in English and where the work had been peer-reviewed. In both periods, the US dominated the world's science, but its share of publications dropped from 26% to 21%. China's share rose from 4.4% to 10.2%. The UK's share declined from 7.1% to 6.5% of the world's papers. Projecting beyond 2011, the Royal Society said that the landscape would change "dramatically". "China has already overtaken the UK as the second leading producer of research publications, but some time before 2020 it is expected to surpass the US." It said this could happen as soon as 2013. China's rise is the most impressive, but Brazil, India and South Korea are following fast behind and are set to surpass the output of France and Japan by the start of the next decade. The quality of research is harder to measure, so the Royal Society used the number of times a research paper had been cited by other scientists in the years after publication as a proxy. By this yardstick, the US again stayed in the lead between the two periods 1999-2003 and 2004-2008, with 36% and 30% of citations respectively. The UK stayed in second place with 9% and 8% in the same periods. China's citation count went from virtually nil to a 4% share. The overall spread of scientific subjects under investigation has remained the same. "We had expected to see a shift to bio from engineering and physics [but] overall, the balance has remained remarkably stable," said Llewellyn Smith. "In China, [the rise] seems to be in engineering subjects whereas, in Brazil, they're getting into bio and agriculture." As it grows its research base, Llewellyn Smith said that China could end up leading the world in subjects such as nanotechnology. "The fact is they've poured money into nanotechnology and that's an area where they are recruiting people back from around the world with very attractive laboratories – that's my feeling." In addition, there are new entrants to the scientific community. "Tunisia in 1999 had zero science budget – now it puts 0.7% of GDP into science," said Llewellyn Smith. "This isn't huge but it's symbolic of the fact that all countries are getting into science. Turkey is another example. Iran has the fastest-growing number of publications in the world, they're really serious about building up science."

Turkey's R&D spend increased almost six-fold between 1995 and 2007, said the Royal Society, and the number of scientists in the country has jumped by 43%. Four times as many papers with Turkish authors were published in 2008 as in 1996. In Iran, the number of research papers rose from 736 in 1996 to 13,238 in 2008. Its government is committed to increasing R&D to 4% of GDP by 2030. In 2006, the country spent just 0.59% of its GDP on science. Llewellyn Smith welcomed the internationalisation of science. "Global issues, such as climate change, potential pandemics, bio-diversity, and food, water and energy security, need global approaches. These challenges are interdependent and interrelated, with complicated dynamics that are often overlooked by policies and programmes put in place to address them," he said. "Science has a very important role in addressing global challenges and collaboration is necessary so that everybody can agree on global solutions. The more countries are involved in science, the more innovations we will have and the better off we will be."

EXT. SCIENCE LEADERSHIP NOT K/T SOFT POWER
Science not enough – US image is tarnished too much
Dickson, ’09 [David, Director, Science & Development Network. June 1, 2009, “Science diplomacy: a timely idea or a fashionable myth?”, http://scidevnet.wordpress.com/2009/06/01/science-diplomacy-a-timely-idea-or-a-fashionable-myth/, SM]
At the height of the Cold War, the scientific community became an important channel of communication between East and West on issues such as nuclear weapons control. The idea was simple. The internationalism — and apparent political neutrality — of science provided a useful cover for messages to be passed between leaders of both sides that would have been impossible to convey by more conventional means. Does science have a similar role in helping to meet the political challenges of today? The new US administration of President Barack Obama thinks it does. Enhanced scientific relations lie at the heart of its strategy of using “soft power” to rebuild political bridges with countries across the world, particularly in the Middle East. How far this commitment is shared by other countries will be debated over the next two days at a meeting in London jointly organised by the Royal Society and American Association for the Advancement of Science. Under the title “New Frontiers in Science Diplomacy“, the meeting is bringing together eminent speakers from across the developed and developing world to look in detail at the role of science in foreign policy. Of course, there is much more to the issue than merely repolishing a tarnished international image (understandably the top US priority, following two successful terms of an isolationist administration which seemed to care little about this image). Other countries care more, for example, about ways in which science can help build a global consensus about the need to tackle problems such as climate change. And lurking in the background is the fact even soft power is still power.
If the key purpose of a country’s foreign policy is to extend its influence over the policy of others, there is certainly a debate to be had over the extent to which science should tie itself to this strategy (even accepting the clear economic self-interest in doing so). The issue is particular acute when it comes to offering science as a form of aid to the developing world. Countries in former European colonies in particular remain highly suspicious of political leverage arriving in their aid packages – even those designed to boost their scientific capacities.

Science doesn’t translate into diplomacy – empirically disproven
Dickson, ’10 [David, Director, Science & Development Network, 6/28/10, “Science in diplomacy: ‘On tap but not on top’”, http://scidevnet.wordpress.com/2010/06/28/the-place-of-science-in-diplomacy-%E2%80%9Con-tap-but-not-on-top%E2%80%9D/, SM]

The broadest gaps in understanding the potential of scientific diplomacy lay in the third category, namely the use of science as a channel of international diplomacy, either as a way of helping to forge consensus on contentious issues, or as a catalyst for peace in situations of conflict. On the first of these, some pointed to recent climate change negotiations, and in particular the work of the Intergovernmental Panel on Climate Change, as a good example, of the way that the scientific community can provide a strong rationale for joint international action. But others referred to the failure of the Copenhagen climate summit last December to come up with a meaningful agreement on action as a demonstration of the limitations of this way of thinking. It was argued that this failure had been partly due to a misplaced belief that scientific consensus would be sufficient to generate a commitment to collective action, without taking into account the political impact that scientific ideas would have. Another example that received considerable attention was the current construction of a synchrotron facility SESAME in Jordan, a project that is already is bringing together researchers in a range of scientific disciplines from various countries in the Middle East (including Israel, Egypt and Palestine, as well as both Greece and Turkey). The promoters of SESAME hope that – as with the building of CERN 60 years ago, and its operation as a research centre involving, for example, physicists from both Russia and the United States – SESAME will become a symbol of what regional collaboration can achieve. In that sense, it would become what one participant described as a “beacon of hope” for the region. But others cautioned that, however successful SESAME may turn out to be in purely scientific terms, its potential impact on the Middle East peace process should not be exaggerated. 
Political conflicts have deep roots that cannot easily be papered over, however open-minded scientists may be to professional colleagues coming from other political contexts. Indeed, there was even a warning that in the developing world, high profile scientific projects, particularly those with explicit political backing, could end up doing damage by inadvertently favouring one social group over another. Scientists should be wary of having their prestige used in this way; those who did so could come over as patronising, appearing unaware of political realities. Similarly, those who hold science in esteem as a practice committed to promoting the causes of peace and development were reminded of the need to take into account how advances in science – whether nuclear physics or genetic technology – have also led to new types of weaponry. Nor did science automatically lead to the reduction of global inequalities. “Science for diplomacy” therefore ended up with a highly mixed review. The consensus seemed to be that science can prepare the ground for diplomatic initiatives – and benefit from diplomatic agreements – but cannot provide the solutions to either.

EXT. BUDGET CUTS A/C
They overwhelm the plan – our leadership status in science is over
Morello, ’11 [Lauren Morello, February 25, 2011, “House Budget Cuts Could End U.S. Science Leadership”, http://www.scientificamerican.com/article.cfm?id=house-budget-cuts-could-end-us-science-leadership, SM]

Spending cuts approved by the House would end America's reign as a scientific leader if they are enacted into law, a former Bush administration Energy Department official said yesterday. "Left intact, the massive cuts in research contained in the bill passed on 19 February would effectively end America's legendary status as the leader of the worldwide scientific community," Raymond Orbach wrote in an editorial published online by the journal Science. The continuing budget legislation passed by the House last week would slash the budgets of federal environment, energy and science agencies compared to 2010 spending levels -- cutting $3 billion from U.S. EPA, more than $1 billion from the Energy Department, and roughly $450 million from the National Oceanic and Atmospheric Administration. Senate Democrats have said the House bill cuts too deep, raising fears of a federal shutdown if Congress and the White House don't agree on a spending fix before the current stopgap budget bill expires March 4. House Republican leaders have said they will not agree to a new, temporary funding bill unless it includes significant budget cuts in line with those included in the legislation the House approved last week. In his new editorial, Orbach -- who served as DOE's undersecretary for science under President George W. Bush -- called the House cuts "devastating." "I can personally attest that funding for scientific research is not a partisan issue -- or at least shouldn't be," wrote Orbach, now director of the Energy Institute at the University of Texas, Austin. "The cuts proposed in H.R. 1 would reverse a bipartisan commitment to double the science research budgets of the National Science Foundation, the DOE Office of Science, and the National Institute for Science and Technology over 10 years. These are national goals supported by both Presidents Bush and Obama, and they were affirmed as recently as last December in the America COMPETES Act," he said.
Orbach called on the Senate to reverse the House cuts. "Failure to do so would relegate the United States to second-class status in the scientific community and threaten economic growth and prosperity for future generations of Americans," he wrote.

EXT. SOFT POWER FAILS
The government cannot control soft power—it is the perception of the entire society that matters.
Gray 2011 – Professor of International Politics and Strategic Studies at the University of Reading, England. (Colin S., April, “HARD POWER AND SOFT POWER: THE UTILITY OF MILITARY FORCE AS AN INSTRUMENT OF POLICY IN THE 21ST CENTURY.” Published by Strategic Studies Institute)

Moreover, no contemporary U.S. government owns all of America’s soft power—a considerable understatement. Nor do contemporary Americans and their institutions own all of their country’s soft power. America today is the product of America’s many yesterdays, and the worldwide target audiences for American soft power respond to the whole of the America that they have perceived, including facts, legends, and myths.41 Obviously, what they understand about America may well be substantially untrue, certainly it will be incomplete. At a minimum, foreigners must react to an American soft power that is filtered by their local cultural interpretation. America is a future-oriented country, ever remaking itself and believing that, with the grace of God, history moves forward progressively toward an ever-better tomorrow. This optimistic American futurism both contrasts with foreigners’ cultural pessimism—their golden ages may lie in the past, not the future—which prevails in much of the world and is liable to mislead Americans as to the reception our soft power story will have.42 Many people indeed, probably most people, in the world beyond the United States have a fairly settled view of America, American purposes, and Americans. This locally held view derives from their whole experience of exposure to things American as well as from the features of their own “cultural thoughtways” and history that shape their interpretation of American-authored words and deeds, past and present.43

Hard power key to soft power – no impact to soft power alone
Gray 2011 – Professor of International Politics and Strategic Studies at the University of Reading, England. (Colin S., April, “HARD POWER AND SOFT POWER: THE UTILITY OF MILITARY FORCE AS AN INSTRUMENT OF POLICY IN THE 21ST CENTURY.” Published by Strategic Studies Institute)

Full-blown, the argument holds, first, that America (for example) gains useful political clout if and when foreigners who matter highly to U.S. national security share important American understandings, values, and preferences. The thesis proceeds in its second step to package this thus far commonsense proposition under the banner of “soft power”; it is now dangerously objectified, as if giving something a name causes it to exist. Next, the third and most problematic step in the argument is the logical leap that holds that American soft power, as existing reality—what it is, and its effects— can be approached and treated usefully as an instrument of national policy. This is an attractive proposition: it is unfortunate that its promise is thoroughly unreliable. The problem lies in the extensive middle region that lies between a near harmony of values and perceived interests and, at the opposite end of the spectrum, a close to complete antagonism between those values and interests. Historical evidence as well as reason suggest that the effective domain of soft power is modest. The scope and opportunity for co-option by soft power are even less. People and polities have not usually been moved far by argument, enticement, and attractiveness. There will be some attraction to, and imitation of, a great power’s ideas and practical example, but this fact has little consequence for the utility of military force. Indeed, one suspects that on many occasions what might be claimed as a triumph for soft power is in reality no such thing. Societies and their political leaders may be genuinely attracted to some features of American ideology and practice, but the clinching reason for their agreement to sign on to an American position or initiative will be that the United States looks convincing as a guardian state and coalition leader.

Soft power cannot replace hard power
Gray 2011 – Professor of International Politics and Strategic Studies at the University of Reading, England. (Colin S., April, “HARD POWER AND SOFT POWER: THE UTILITY OF MILITARY FORCE AS AN INSTRUMENT OF POLICY IN THE 21ST CENTURY.” Published by Strategic Studies Institute)

It is not difficult to identify reasons why military force seems to be less useful as a source of security than it once was. But it is less evident that soft power can fill the space thus vacated by the military and economic tools of grand strategy. Soft power should become more potent, courtesy of the electronic revolution that enables a networked global community. The ideological, political, and strategic consequences of such globalization, however, are not quite as benign as one might have predicted. It transpires that Francis Fukuyama was wrong; the age of ideologically fueled hostility has not passed after all.47 Also, it is not obvious that the future belongs to a distinctively Western civilization.48 It is well not to forget that the Internet is content-blind, and it advertises, promotes, and helps enable bloody antagonism in addition to the harmony of worldview that many optimists have anticipated. It does not follow from all this that the hard power of military force retains, let alone increases, its utility as an instrument of policy. But assuredly it does follow that the historical motives behind defense preparation are not greatly diminished. Thus, there is some noteworthy disharmony between the need for hard power and its availability, beset as it increasingly is by liberal global attitudes that heavily favor restraint.

Soft power doesn’t solve wars – increases resentment for the “uncivilized”
Gray 2011 – Professor of International Politics and Strategic Studies at the University of Reading, England. (Colin S., April, “HARD POWER AND SOFT POWER: THE UTILITY OF MILITARY FORCE AS AN INSTRUMENT OF POLICY IN THE 21ST CENTURY.” Published by Strategic Studies Institute)

An inherent and unavoidable problem with a country’s soft power is that it is near certain to be misassessed by the politicians who attempt to govern soft power’s societal owners and carriers. Few thoroughly encultured Americans are likely to undervalue “the American way” in many of its aspects as a potent source of friendly self-co-option abroad. Often, this self-flattering appreciation will be well justified in reality. But as an already existing instrument of American policy, the soft power of ideas and practical example is fraught with the perils of self-delusion. If one adheres to an ideology that is a heady mixture of Christian ethics (“one nation, under God . . .”), democratic principles, and free market orthodoxy, and if one is an American, which is to say if one is a citizen of a somewhat hegemonic world power that undeniably has enjoyed a notably successful historical passage to date, then it is natural to confuse the national ideology with a universal creed. Such confusion is only partial, but nonetheless it is sufficiently damaging as to be a danger to national strategy. Since it is fallacious to assume that American values truly are universal, the domain of high relevance and scope for American soft power to be influential is distinctly limited. If one places major policy weight on the putative value for policy of American soft power, one needs to be acutely alert to the dangers of an under-recognized ethnocentrism born of cultural ignorance. This ignorance breeds an arrogant disdain for evidence of foreigners’ lack of interest in being co-opted to join American civilization. The result of such arrogance predictably is political and even military strategic counterreaction. It is a case of good intentions gone bad when they are pursued with indifference toward the local cultural context.
Some people have difficulty grasping the unpalatable fact that much of the world is not receptive to any American soft power that attempts to woo it to the side of American interests. Not all rivalries are resolvable by ideas, formulas, or “deals” that seem fair and equitable to us. There are conflicts wherein the struggle is the message, to misquote Marshall McLuhan, with value in the eyes of local belligerents. Not all local conflicts around the world are amenable to the calming effect of American soft power. True militarists of left and right, secular and religious, find intrinsic value in struggle and warfare, as A. J. Coates has explained all too clearly. The self-fulfilment and self-satisfaction that war generates derive in part from the religious or ideological significance attributed to it and from the resultant sense of participating in some grand design. It may be, however, that the experience of war comes to be prized for its own sake and not just for the great ends that it serves or promotes. For many, the excitement unique to war makes pacific pursuits seem insipid by comparison. This understanding and experience of moral, psychological, and emotional self-fulfillment increase our tolerance for war and threaten its moral regulation. It transforms war from an instrumental into an expressive activity.49 It is foolish to believe that every conflict contains the seeds of its own resolution, merely awaiting suitable watering through co-option by soft power. To be fair, similarly unreasonable faith in the disciplinary value of (American) military force is also to be deplored.

1NC INNOVATION FRONTLINE
Their idea of economic competition is wrong – international innovation helps the US – we can integrate other ideas into our economy through mass consumption
Bhide, ’08 [Amar Bhidé, a business professor at Columbia University, Nov 1, 2008, “Is the U.S. Losing Its Economic Edge?”, http://www.inc.com/magazine/20081101/q-is-the-us-losing-its-economic-edge_pagen_2.html, SM]

Are you, then, less optimistic about America's ability to push ahead and create a vibrant, growing economy than you were when you sat down to write your book? No, not at all. We have one thing that works really well, and that's innovation. In the past, many technological developments have taken place during periods of severe economic stress. During the period of high inflation and doom-and-gloom recession of the early 1980s, for instance, people were buying and learning how to use PCs. That PC revolution set the stage for the huge productivity gains of the 1990s. Even in the Great Depression, the increases in productivity were enormous, based on the diffusion of a lot of technologies that had been developed in the 1920s. I'm not wishing for a depression or a replay of the 1980s. All I am saying is that we have a buffer against the financial meltdown, and that buffer is our ability to innovate, especially in the technology sector. You write that the dire predictions of so-called techno-nationalists are misplaced. Who are these techno-nationalists, and what are they missing? These are people who, in the context of trade and globalization, think that protectionism is bad, but that in order for us to survive the "onslaught of competition" from China and India, we have to crank up our technological investments so that we continuously stay ahead. These people say, let's invest more in R&D, let's invest more in basic research, let's train more engineers -- on the premise that the greater the technological lead that you have vis-à-vis other nations, the more prosperous you're going to be. And that's wrong? Absolutely. The U.S. isn't locked into a winner-take-all race for scientific and technological leadership with other nations. What's more, the growth of research capabilities in China and India, and thus their share of cutting-edge research, does not reduce U.S. prosperity. My analysis suggests exactly the opposite. 
Advances abroad will help improve living standards in the U.S. Moreover, the benefits I identify aren't the usual ones of how prosperity abroad increases opportunities for U.S. exporters. I show how cutting-edge research developed abroad benefits domestic production and consumption. That's counterintuitive for most people. It's helpful to think of a specific example. The World Wide Web was invented by a British scientist living in Switzerland. Think of how much this invention in Switzerland has revolutionized lives in the U.S. and has improved U.S. prosperity, probably to a greater degree than it has in Switzerland and certainly to a greater degree than it has in most other parts of the world. Why? Because the U.S. is really good at taking things like the Web and weaving them into our commercial fabric. Or, to give you another popular example: Many of the high-level technologies associated with the iPod were developed outside the U.S. Compression software comes from Germany; the design of the chip comes from the U.K. The whole idea of an MP3 player comes out of Singapore. But most of the value has been captured in the U.S., because the U.S. happens to represent the majority of the use of MP3 players in the world. So the point is that U.S. businesses are particularly adept at taking inventions and applying them to the marketplace? No, it's more than just applying them to the marketplace. It's also about our ability to consume these innovations. That's the really critical piece. At the Summer Olympics in Beijing, Coca-Cola (NASDAQ:COKE) had a pavilion set up where they were teaching the Chinese how to drink Coke, explaining that it should be drunk cold. That really caught my attention. Think about how much further the U.S. is on the consumption side. What does that really tell us? That we live in a more commercial culture than any in history. There is no other country where commerce and business have so completely pervaded everyday life.
And so people are always looking for ways to serve consumers. Look at the historical differences between Europe and the U.S. In Europe, consumption started off for aristocrats. A classic example in Europe involves guns. When people first made sporting guns, they were primarily built for the aristocracy to hunt. But when people made guns in the U.S., they were used by farmers and ranchers. So these more standardized guns were made in the U.S. at a lower cost and for a more mass market. Even if Americans are the best consumers on the planet, why shouldn't we still be fearful of the rise of China and India and their incredibly fast-growing economies? Because economic systems don't compete with each other. Every gadget, car, or other product imported into the country brings in its wake what I call nontraded services. Consider a car. I bet there's three times as much value in all the nontraded activities that go along with the car as in the import value of the car itself: the employment at the dealer's showroom, the six-month servicing, the inspections, and so on. And every new physical gizmo, regardless of where it is manufactured, will end up generating many times the employment in the nontraded services sector it does in the traded sector.

This also means they can’t solve their Segal evidence – increasing US innovation fails because of globalization – only decreasing export barriers can maintain military access to technology
Scowcroft, ’09 [Testimony of Brent Scowcroft, President and Founder of the Scowcroft Group, Before the House of Representatives Committee On Science and Technology, February 25, 2009, “Science and Technology Activities and Competitiveness”, SM]

These three conditions no longer obtain. First, today the United States has competition in most areas of advanced research and development, including military-related S&T. Advances in science and technology now occur throughout Europe, in Russia and Japan, and also in the developing economies of China, India and Brazil. Thus the number of access points to advanced science and technology have grown considerably and perhaps more to the point, outside the control of the United States. Second, most military production in the U.S. is now commercially based, thus “dual-use” goods and technologies . And third, the Western alliance has lost its Cold War consensus; NATO member countries and Japan no longer agree on what countries need to be controlled, what items should be controlled, and what kinds of controls are needed. Together, these changes make it much more difficult for the United States to successfully control the transfer of goods and technologies that have both commercial and military applications. As a result, many national security controls on science and technology no longer work in the ways they were intended. We endanger our own national security in thinking that unilaterally controlling dual-use items here prevents others from obtaining them elsewhere. Because science and technology research, development and production have become a global enterprise, the “Fortress America” approach of current controls cuts us off from information and technologies that we need for our national security. If we sustain these export control and visa barriers, we will increasingly lose touch with the cutting edge of science and technology, and we risk missing emerging national security threats. 
Following are just a few of the unintended consequences that the inappropriate application of export controls have on our national security and our economic competitiveness: At a time when battlefield interoperability is increasingly the norm, the licensing process can prevent repair at facilities closest to the theater of operation. Export controls constrain both U.S. commercial and military capabilities from expanding into new fields and from applying new scientific developments. The government’s rules are accelerating the development of technologies in capable research centers outside the United States. As foreign companies and governments fill these competitive gaps, valuable technical developments occur outside the U.S. to which the U.S. military and intelligence agencies then have no access. U.S. scientists are hobbled by rules that prevent them from working with world-class foreign scientists and laboratories located overseas, making it less likely that valuable discoveries and inventions will occur in the U.S. The government’s rules are driving jobs abroad—knowledge-intensive jobs that are critical to the U.S. economy.

EXT. NO ECONOMIC COMPETITION
No such thing as economic competition – not zero-sum
Galama AND Hosek, ’08 – Ph.D. and M.Sc. in physics AND Ph.D. and M.A. in economics [Titus Galama – Ph.D. and M.Sc. in physics, University of Amsterdam; M.B.A. in business, INSEAD, Fontainebleau, France; James Hosek – Ph.D. and M.A. in economics, University of Chicago; B.A. in English, Cornell University, “U.S. Competitiveness in Science and Technology”, 2008, http://www.rand.org/pubs/monographs/2008/RAND_MG674.pdf, SM]

In work published over a decade ago, economist Paul Krugman questions whether the notion of competition in S&T is even relevant. He argues that the idea that nations “compete” is incorrect; countries are not like corporations and “are [not] to any important degree in economic competition with each other” (Krugman, 1994). Major industrial nations sell products that compete with each other, yet these nations are also each other’s main export markets and each other’s main suppliers of useful imports. More broadly, international trade is not a zero-sum game. For example, if the European economy does well, this helps the United States by providing it with larger markets and goods of superior quality at lower prices. Further, he argues that the growth rate of U.S. living standards essentially equals the growth rate of domestic productivity, not U.S. productivity relative to competitors; and enhancing domestic productivity is in the hands of Americans, not foreigners. Part of the reason for this, Krugman argues, is that the world is not as interdependent as one would think: 90 percent of the U.S. economy consists of goods and services produced for domestic use, i.e., produced by Americans, for Americans. But this is not to deny the importance of technological progress, and beneath it, science and technology, as a determinant of economic progress and improvement in the standard of living.

Downstream utilization of upstream technologies solves – we don’t need to be the first to invent technologies – US consumption guarantees US competitiveness
Bhide, ’08 [Amar Bhidé, a business professor at Columbia University, Nov 1, 2008, “Is the U.S. Losing Its Economic Edge?”, http://www.inc.com/magazine/20081101/q-is-the-us-losing-its-economic-edge_pagen_2.html, SM]

Still, wouldn't U.S. companies be in a stronger position if they invested more in research and development? No. Look at a company like IBM (NYSE:IBM). You might think the success of IBM was all about its technology. But IBM's great revolution was as much in sales and marketing as it was in the invention of the IBM 360. And the system that IBM developed for sales and marketing was not only of tremendous value to IBM in the sense that it allowed IBM to establish a dominant position in the computer industry. It was also of enormous value to the economy as a whole, because it was through the sales and marketing process that companies learned to use computers effectively. And that transformed work in America. It wasn't simply because somebody invented this box and called it a computer and plugged it in. There was a great deal of sales and marketing that was necessary, not just to sell computers but also to put them to use. Do you find that entrepreneurs behave differently than large corporations in the way they view markets? Most entrepreneurs, even those who build high-growth companies, tend to be more naturally focused on sales and marketing, because they cannot afford the luxury of R&D. And they tend to be skilled at taking inventions or innovations developed by someone else and putting them into use, because they can't waste money doing R&D. If you think of Microsoft (NASDAQ:MSFT) in its early years, they didn't do any formal R&D. They couldn't. There was no cash flow to do that. Now, they spend a few billion dollars on R&D. It's not clear to me that this is producing a huge return, either for the economy or for Microsoft. So entrepreneurs are good innovators? Entrepreneurs are great innovators because of the contributions they make on the ground level. It's critical to take a broad view of the innovation process. When people think of innovation, they often think of a technological breakthrough and scientists in white coats.
They don't often consider all the effort and initiative of putting that innovation into use. But unless it is put into use, a technological breakthrough is useless. It requires marketing, sales, and organizational efforts. And the consumer is also a key part of the process. Someone who buys a spreadsheet program off the shelf and, with diligence and resourcefulness, creates a homemade CRM system is an important player in the innovation game -- a real innovator. You write a lot about the power of consumers to spur innovation. You call it venturesome consumption. What exactly do you mean by that? Venturesome consumption has three important dimensions. One involves being the leading-edge user, where you help whoever is producing the gizmo develop the gizmo and thereby become a partner in its development. The second level of venturesomeness is a willingness to take a chance on new goods and services where you haven't a clue whether they will give you good value for money. Any kind of consumption in particular? It could be flat-screen TVs, some new electronic gadget, or anything, really. This goes beyond early adopters. As a country and a culture, we just like to take chances when we consume. Let's just buy it, with no regrets. We equip our houses with all kinds of weird objects. And that brings me to the third level of venturesome consumption: the amount of time we are willing to invest in learning how to use technology and make it work for us. An illiterate peasant can buy and start using a mobile phone. But most modern gizmos are not like that. They require a considerable amount of resourcefulness and problem solving. So it's not just the person who is creating the object who needs to figure out how to use it; it's also the person who is using the object. And then the willingness to stay with it until it works is unusual in the U.S., at every level -- at the consumer level and at the company level. In Germany, according to interviews I've done, people don't want the latest generation of what you have to sell; they want something they know will work. In the U.S., they say, Hey, give us your latest and greatest, and we'll figure out how to make it work.

1NC NASA LEADERSHIP FRONTLINE
NASA credibility resilient – they’ve recovered from worse
Dinerman, ’11 [Taylor Dinerman, 6/13/11, “The irreplaceable Space Shuttle”, http://www.thespacereview.com/article/1863/1, SM]

The shuttles are headed for museums and for the history books. The debates over what was, or should be, the right space policy will go on. Yet in all the noise and smoke of the arguments the accomplishments of the shuttle program are forgotten. In spite of inadequate budgets, political pressure, and unrealistic public expectations, NASA built an amazing vehicle. The men and women of America’s space agency and its contractors should be proud of what they managed to do. It’s hard to be optimistic about the future of NASA, but the agency has recovered from past disasters and neglect, and it may do so again.

Collapse of NASA inevitable – overwhelming political factors and decline of manned missions
Green, 1AC Author ’10 [Joshua Green – a senior editor of The Atlantic and a weekly political columnist for the Boston Globe, July 7, 2010, “NASA’s Cloudy Future”, http://www.theatlantic.com/politics/archive/2010/07/nasas-cloudy-future/59331/#bio]

That day has been coming since 2004, when George W. Bush took the recommendation of the commission investigating the Columbia explosion and issued a directive to retire the space shuttle. The real shock came in January, when President Obama killed its successor, Project Constellation, which aimed to return Americans to the moon by 2020. Obama's FY2011 budget, while narrowly increasing NASA's $18.7 billion outlay, proposes to redirect that money toward research and development and stronger support for commercial space flight, which would bring NASA's illustrious 50-year history of manned missions to a close. This is economically and psychologically devastating to communities in Texas, Alabama, and Florida that depend on NASA programs. Here, the mood is defiant. Many of the parade floats bore signs that read "STOP OBAMA. SAVE NASA.'' The future of NASA is a charged issue that doesn't divide along partisan lines. The Obama administration's plan to privatize manned space flight has won plaudits from conservatives like Newt Gingrich, who called it "a brave reboot,'' while angering others, like former Republican Majority Leader Tom DeLay, a longtime NASA champion whose district included the Johnson Space Center. Democrats affected by the cuts have raised an outcry -- Florida Senator Bill Nelson called it "dead wrong'' -- while others have cheered the proposal to refocus the agency on climate change issues. Obama's new approach to space is being touted as a tough, forward-looking set of policies designed to serve the nation's long-term interests. The Constellation program, the administration points out, was over budget, behind schedule, and relied on existing technology; even the destination -- the moon -- was nothing new. Better to direct those funds to developing heavy-lift rocket systems and robotic exploration missions that might one day help people visit new worlds.
The prohibitive cost of a manned mission to Mars or an asteroid, both touted by the president in an April speech, means that any such mission will be a cooperative effort with other nations. Critics reply that killing Constellation and reorienting NASA is foolish and costly. "The innovations that have come out of the space program are phenomenal,'' DeLay said. "With our failing manufacturing base, it is extremely important for our economy to maintain them.'' Private space flight has shown promise, but it will be years before a commercial company can safely launch astronauts into space. Lacking the capacity to send US astronauts to the International Space Station, we'll soon pay Russia to ferry them there, which won't be cheap. But the loudest complaint regards "American greatness'' -- the idea that the willing forfeiture of our leadership in space amounts to a kind of moral trespass that will cede to nations like China and India the next great strides in science and technology. Stopping Obama and saving NASA's manned missions is unlikely. History and politics have conspired against it. Without the Cold War imperative to beat the Soviet Union, the space program's profile has waned. NASA has depended for years -- sometimes against the wishes of the president -- on a succession of powerful congressional figures, most recently Tom DeLay, whose clout helped ensure that Constellation would succeed the shuttle program. But after introducing Constellation, Bush never mentioned it again. DeLay was forced to leave Congress soon afterward, and NASA has never found his equivalent champion. Congress still must pass a budget, but Obama's vision is likely to prevail. Where his critics have a point is in arguing that NASA lacks a clear mission. Without a directive and funding, talk of visiting Mars or an asteroid is grandiose but empty. Meanwhile, gauzy nostrums about inspiring children and international cooperation are creating political headaches.
Last week, NASA administrator Charles Bolden touched off a storm when he told al Jazeera that the agency's new mission was to "find a way to reach out to the Muslim world'' -- surely not what anybody had in mind.

EXT. NASA COLLAPSE INEV.
Numerous alt causes – their author
FloridaToday.com, 1AC Author ‘11 [6/06/2011, “Our views: NASA's Webb debacle” http://www.floridatoday.com/apps/pbcs.dll/article?AID=2011110606013]

Those are the findings of a FLORIDA TODAY investigation that reviewed five years of project records, revealing an agency that continues to grossly mismanage major new programs. For instance, the shuttle fleet and International Space Station both came in far behind schedule, weighed down by 45 percent increases from their original price estimates, according to the Government Accountability Office, the financial watchdog arm of Congress. Ditto NASA’s Constellation moon program, which a presidential blue ribbon panel said was on a fiscally “unsustainable trajectory” that doomed it. And, just recently, NASA said the new heavy-lift rocket it wants to send astronauts on future deep-space missions from Kennedy Space Center would cost $9 billion and come in two years behind its mandated 2016 completion date.

1NC NSF CP
The United States federal government should implement the recommendations of the National Science Board in its 2010 Globalization of Science and Engineering Research report.

OR

The National Science Foundation should ensure it supports "truly transformative" research that keeps U.S. researchers ahead of the world. The United States federal government should force federal agencies to benchmark their research against world leaders and ensure they fund "world-leading" efforts. The United States federal government should examine policies that protect the U.S. economy, intellectual property, and technical leadership, and should pick "critical research areas for which the U.S. should be the global R&D leader."

Counterplan secures US scientific leadership and competitiveness – resolves status quo gaps and spurs advanced investment
Vergano, ’10 [Dan Vergano, Feb 25, 2010, “U.S. science and engineering leadership facing foreign foes”, http://content.usatoday.com/communities/sciencefair/post/2010/02/us-science-and-engineering-leadership-facing-foreign-foes/1, SM]

Growing international science and engineering expertise, "presents definite challenges to U.S. competitiveness in high technology areas, and to its position as a world leader," warns a blue-ribbon science panel. In the "Globalization of Science and Engineering Research" report released this week, the National Science Board calls for federal agencies and U.S. industries to benchmark their research against international competitors, to ensure domestic technical expertise stays ahead of other nations. Worldwide expenditures on research and development have doubled since 1996 to $1.1 trillion annually, the report notes, with much of the growth coming from China, India and other Asian nations. Such growth benefits U.S. competitiveness by increasing overall knowledge and opening more avenues for international collaboration, says the report. But it also gives firms opportunities to look elsewhere for such expertise. Last year, President Obama called for U.S. public and private spending on research and development to increase to 3% of the nation's $14.2 trillion GDP, up from about 2.8% ($398 billion) now. Industry provides about 2/3 of the current U.S. R&D funding. However, "U.S. firms in their majority-owned overseas affiliates consistently employed more foreign-resident R&D employees than U.S.-based affiliates of foreign firms employed U.S.-resident R&D employees," finds the report. While U.S. firms continue to lead in high tech sales, the goods are increasingly manufactured elsewhere. To combat the trend, the report panel recommends: The National Science Foundation should ensure it supports "truly transformative" research that keeps U.S. researchers ahead of the world. The Obama administration should force federal agencies to benchmark its research against world leaders and ensure they fund "world-leading" efforts. The administration should examine policies that protect the U.S.
economy, intellectual property, technical leadership and pick "critical research areas for which the U.S. should be the global R&D leader." "Everyone benefits -- workers, companies, all of society -- from more competitive science and engineering expertise," says board member Louis Lanzerotti of the New Jersey Institute of Technology in Newark. "What are the critical research areas?" Lanzerotti adds. "The U.S. isn't asking that question right now. And we should."

2NC NSF SOLVENCY
CP creates a sustainable and stimulating boost to US technology leadership
NSB, ’10 [National Science Board, FYI: The National Science Board is a qualified group that comes together biannually, oversees the collection of a very broad set of quantitative information about U.S. science, engineering and technology, and publishes the data and trends in our Science and Engineering Indicators (Indicators) report to congress, “Globalization of Science and Engineering Research”, http://www.nsf.gov/statistics/nsb1003/pdf/nsb1003.pdf, SM]

Investments in both public and private S&E research by nations are critical for their domestic economic growth and for overall social welfare. For the U.S., the industrial sector is the primary supporter of R&D, but relies on the Federal government to support the bulk of research on the basic end of the spectrum and in the academic sector. The complementary support by these two main players has sustained U.S. leadership across the fields of science and engineering research, and ensured the long-term growth of employment opportunities in U.S. research in academia, but even more so in the private sector. Research investments are increasing worldwide, and are growing more rapidly outside the centers that previously dominated the world R&D enterprise – North America, Europe, and Japan. Occurring in parallel with this development are increasing investments by U.S. private firms in R&D abroad, which are motivated by several dominant factors, including: proximity to customers, access to local expertise and educational institutions, ease of travel and relocation of people across borders, location of financial assets, and, often, lower cost structures for labor and facilities. This rapid evolution in worldwide R&D capabilities carries important policy issues for the Federal government as well for private firms in the U.S. The growth of global S&E research capacity raises several policy questions for the agencies in the Federal government that fund this research. These include: • What is the role of the NSF, the only non-mission research funding agency, in guiding Federal policy in this era of increasing globalization? • How can Federal funding agencies, individually and jointly, best respond to other countries’ explicit focus on building S&E capacity? • What are the potential lessons from other countries’ S&E strategies that could benefit the Federal R&D system and the existing excellence of U.S. S&E? 
The National Science Foundation, being the only non-mission-oriented agency that funds S&E research in the U.S. has often been favorably cited and even emulated by other nations. It is incumbent on the NSF to maintain its emphasis on the funding of basic, peer-reviewed research across the fields of science and engineering, with special attention to transformative S&E research in order to ensure that the U.S. remains a world research leader. Recommendation: The National Science Foundation should assess its two criteria for funding of S&E research to ensure that the criteria encourage the proposing and support of truly transformative research, and should modify the criteria and/ or merit review process if the assessment finds modifications necessary to accomplish this goal. The increasing globalization of S&E has caused many developed and developing nations to establish plans and goals for specific S&E areas in which to concentrate their public research investments. The expectation is that these public investments will stimulate economic development, and both public and private employment. Recommendation: The Office of Science and Technology Policy in the Executive Office of the President, through the National Science and Technology Council mechanism, should engage all Federal agencies involved with S&E research to: (a) develop means to assess or continue to assess the quality of their agency’s supported research against international activities, and (b) identify and as appropriate make adjustments necessary to ensure that their agency’s research is world-leading. The continued expansion abroad of R&D activities by U.S. private firms, driven by global competitive pressures and financial incentives, poses long-term challenges for U.S. continuing domestic economic strength and the domestic employment of highly-skilled and highly-educated technical personnel. This expansion raises several policy questions for U.S. 
private firms and for the Nation’s overall economic strength. These include: a. What does growth in U.S. privately funded R&D abroad imply for the viability and growth of domestically based private R&D activities? What is the role of conditions that host governments and home governments may impose on private industry for technology transfers and spillovers, and what is their net effect on the long-term competitiveness of the U.S.? b. How well do the legal systems of other countries protect intellectual property when U.S.-funded R&D activities are performed abroad, and if patents are filed, in which country are they filed? c. How does privately funded U.S. R&D performed abroad support innovation and the economy within the U.S.? d. Are there certain S&E research capabilities that are critical to be conducted within the Nation’s borders? If yes, what are they and what are the implications for licensing and global trade? Recommendation: The Office of Science and Technology Policy should call for a President’s Council on Innovation and Competitiveness as described in the COMPETES3 Act. Issues for discussion would include: (a) relationships between U.S. and foreign-supported R&D to ensure continued vitality and growth of U.S. technical strength, (b) safeguarding national interests in intellectual property, (c) ensuring that the U.S. economy benefits from R&D supported abroad, and (d) assessing critical research areas for which the U.S. should be the global R&D leader.

2NC NSF SOLVENCY – COMPARATIVE
CP is the only way to have significant science progress – transformative research measures are key
NSB, ’10 [National Science Board, FYI: The National Science Board is a qualified group that comes together biannually, oversees the collection of a very broad set of quantitative information about U.S. science, engineering and technology, and publishes the data and trends in our Science and Engineering Indicators (Indicators) report to congress, “Globalization of Science and Engineering Research”, http://www.nsf.gov/statistics/nsb1003/pdf/nsb1003.pdf, SM]

Increasing S&E capacity around the world challenges the U.S. to actively focus on maintaining its competitive strength in S&E research. As the Board has previously noted, science progresses in two fundamental and equally valuable ways: additive advancement in understanding, with new projects designed to build upon the results of previous studies or testing long-standing hypotheses and theories; and revolutionary advancements, through the application of radically different approaches or interpretations that result in the creation of new paradigms or new scientific fields. This second type of discovery, enabled by what the Board has defined as “transformative” research, is critical to maintaining a world-leading edge in S&E research . In response to global competitive pressures, U.S. research agencies must ensure that they provide adequate support for transformative research. Further, U.S. research agencies should constantly assess their programs and impacts in both types of research against international S&E research activities. International benchmarking efforts will require the development of robust assessment methods to gauge impacts of S&E research funding (a challenge for science of science policy experts), but will position U.S. research agencies to support world-leading activities

*** STEM ADVANTAGE

1NC STEM FRONTLINE
Status Quo solves STEM – Obama is making moves towards public-private partnerships
Gentile, ’10 [James M. Gentile, January 13th, 2010, “President Obama's Leadership in Improving STEM Education”, http://www.rescorp.org/news-and-comment/presidential-perspective/president-obamas-leadership-in-improving-stem-education/133, SM]

President Barack Obama took another crucial step last week in galvanizing support - from government, business, higher education, and philanthropy - to dramatically improve the teaching of science, technology, engineering, and mathematics (STEM) in U.S. public schools. Numerous studies have shown that American students are not keeping up with competitors in these fields, which have been so crucial to U.S. economic leadership in the past century. As The Washington Post recently reported, "International math testing in 2007 found that U.S. fourth-graders trailed counterparts in some areas of Europe and Asia and that U.S. eighth-graders lagged behind those from a handful of Asian powers. Similar results were found in science." The President announced "five new public-private partnerships that will use proven models to prepare more than 10,000 new math and science teachers over the next five years and will support the professional development of more than 100,000 current teachers in STEM fields." The partnerships "represent a combined commitment of more than $250 million in financial and in-kind support, adding to the more than $260 million in support announced in November at the launch of the 'Educate to Innovate' campaign."
The partnerships announced by The White House include the following: o Intel's Science and Math Teachers Initiative - a ten-year, $200 million cash and in-kind campaign to support teaching in math and science; o Expansion of the National Math and Science Initiative's UTeach Program - to prepare more than 4,500 undergraduates in STEM subjects to be new math and science teachers by 2015, and 7,000 by 2018; o A Commitment by Public University Presidents - in which the presidents of more than 75 major public universities committed to collectively prepare 10,000 science and math teachers annually by 2015; o The PBS Innovative Educators Challenge - through which PBS and its 356 partner stations, in collaboration with the National Science Teachers Association, will launch a multi-year STEM initiative; o The Woodrow Wilson Teaching Fellowships in Math and Science - through which the Woodrow Wilson National Fellowship Foundation will announce a major expansion of this program, which provides future math and science teachers with a Master's degree in education and places them in difficult-to-staff schools. President Obama should be applauded for making the sciences and science teaching a high priority in his Administration - through key Cabinet and senior staff appointments, funding from the Recovery Act, and the "Educate to Innovate" campaign, among other initiatives. He also deserves praise for challenging the business, higher education, and philanthropic communities to join him in this effort. All of us should re-evaluate what we can do in that regard.

Alt Cause – Automation – has a comparatively bigger impact than offshoring
Attis, ’07 [David Attis, Jul 23, 2007, “Higher Education and the Future of U.S. Competitiveness”, http://www.educause.edu/thetowerandthecloud/PUB7202h, SM]

The same forces that demand a rethinking of science and engineering education are also reshaping the demand for skills from the broader population. The global proliferation of information technology has enabled a “trade in tasks” that opens more and more American workers to potential foreign competition. But those pundits who focus exclusively on offshoring often fail to recognize that its effects are often dwarfed by the impact of automation. The American call center worker is more likely to lose his or her job to IVR (interactive voice response) technology than to an offshore call center. Both offshoring and automation enable routine tasks to be performed at a lower cost, reducing the value of jobs structured around routine tasks but increasing the value of jobs that require more complex tasks that cannot easily be automated or offshored. The salient distinction is not necessarily between those with more or less education but between those whose work can be replaced by a computer or someone far away using a computer versus those whose productivity is enhanced by a computer.5

No STEM Crisis – their impact is empirically denied
Galama AND Hosek, ’08 – Ph.D. and M.Sc. in physics AND Ph.D. and M.A. in economics [Titus Galama – Ph.D. and M.Sc. in physics, University of Amsterdam; M.B.A. in business, INSEAD, Fontainebleau, France, James Hosek – Ph.D. and M.A. in economics, University of Chicago; B.A. in English, Cornell University, “U.S. Competitiveness in Science and Technology”, 2008, http://www.rand.org/pubs/monographs/2008/RAND_MG674.pdf, SM]

Despite the rhetoric and the intensive action on the Hill, some voices called for restraint. The reports and testimony making a case for or arguing against an S&T crisis are part of an ongoing policy debate. One line of counterargument is that such warnings are far from unprecedented and have never resulted in the crisis anticipated. The author of a Washington Watch article noted that “similar fears of a STEM6 workforce crisis in the 1980s were ultimately unfounded” (Andres, 2006). Neal McCluskey, a policy analyst from the Cato Institute, noted that similar alarm bells were sounded decades earlier (and in his view, have had underlying political agendas): Using the threat of international economic competition to bolster federal control of education is nothing new. It happened in 1983, after the federally commissioned report A Nation at Risk admonished that ‘our once unchallenged preeminence in commerce, industry, science, and technological innovation is being overtaken by competitors throughout the world,’ as well as the early 1990s, when George Bush the elder called for national academic standards and tests in order to better compete with Japan. (McCluskey, 2006) Roger Pielke of the University of Colorado observed that such issues as poor student performance have an even longer history, with no negative outcomes. Arguments that “certain other countries produce a greater proportion of scientist and engineering students or that those students fare better on tests of achievement . . . have been made for almost 50 years,” he stated, “yet over that time frame the U.S. economy has done quite well” (Pielke, 2006).

EXT. A/C
They don’t even access a relevant internal link – demographics mean China will always produce a larger educated population, so attempting to maintain innovation through students fails
Attis, ’07 [David Attis, Jul 23, 2007, “Higher Education and the Future of U.S. Competitiveness”, http://www.educause.edu/thetowerandthecloud/PUB7202h, SM]

Yet the debate in the United States continues to focus on graduating ever greater numbers of scientists and engineers as the key to increasing U.S. competitiveness. While we must continue to improve standards and encourage more students to study science and engineering, we need to acknowledge that we will never win the race to produce the highest test scores or the most engineers. Simple demographics dictates that we will never outproduce China in engineers. But that does not mean that America’s innovation capacity is doomed. The best test-takers do not always make the best innovators, and a range of countries with high test scores—such as Japan, Singapore, Korea, and China—are increasingly worried that their educational systems stress conformity at the expense of creativity. The challenge is not to train the most scientists and engineers but to train the scientists and engineers (and artists and anthropologists and managers) who are best able to work within the global innovation system to create valuable new products and services.

The end of the spaceflight program will crush science education
Post-tribune, 11 [July 8, 2011, Colleen Sikorski, Chicago Sun Times, “With shuttles shelved, science education faces new challenge,” http://posttrib.suntimes.com/news/6407731-418/with-shuttles-shelved-science-education-faces-new-challenge.html]

After NASA’s space shuttle program ends, educators will have to work even harder to show students how molecules and equations in the classroom relate to space exploration and other real-world science fields. Purdue University Calumet and the Challenger Learning Center are both working on projects to help teachers meet the challenge. “The space program has excited generations of children to pursue careers in science and engineering,” PUC physics and science professor Shawn Slavin said. “Without that there, it is really like getting rid of the role model or the celebrity that is the thing that drives people to that field. There isn’t much replacement for that.”

1NC ASTRONOMY WORKFORCE CP
CP: The United States federal government should initiate partnerships with the American Astronomical Society, the American Physical Society, and United States astronomy and astrophysics departments to gather and disseminate demographic data on astronomers in the workforce to inform students’ career decisions.

The counterplan solves STEM faster than the aff – brings already qualified people into STEM jobs by applying astronomy education to the workforce
Blandford, et al., 10 – a Pehong and Adele Chen Professor of Physics and director of the Kavli Institute for Astrophysics and Cosmology at Stanford University [Roger D. Blandford, Chair, Committee for a Decadal Survey of Astronomy and Astrophysics National Research Council, New Worlds, New Horizons in Astronomy and Astrophysics, 2010, ISBN: 9780309158008, pg. 83-4, SM]

The urgency for federal investment in science, technology, engineering, and mathematics (STEM) education and research was highlighted in the influential 2007 National Academies report Rising Above the Gathering Storm: Energizing and Employing America for a Brighter Economic Future. CONCLUSION: Astronomical research continues to offer significant benefits to the nation beyond astronomical discoveries. These benefits include its role in capturing the public’s attention and thereby promoting general science literacy and proficiency, its service as a gateway to science, technology, engineering, and mathematics careers, and a number of important and often unexpected technological spin-offs. The field of astronomy and astrophysics deserves inclusion in initiatives to enhance basic research, such as the America COMPETES Act. As further service to the nation, important roles in government can be played by suitably skilled scientists. Not only are they able to inform the decision-making process, but they also can develop a rare appreciation of the challenges of the political process, which they are well-placed to communicate to other scientists. RECOMMENDATION: The astronomical community should encourage and support astronomers’ commitment to serve in science service/policy positions, on a rotator, fellowship, or permanent basis, at the relevant funding agencies— NSF, NASA, DOE— in Congress, at the Office of Management and Budget, or at the Office of Science and Technology Policy. A consequence of the current excitement in the field of astronomy is that it attracts many highly capable students who contribute substantially to the research enterprise. Not all of these will take up long-term positions in astronomy, and so it is fortunate that training in astronomical research appears to be well matched in practice to much broader career opportunities. 
However, the situation also appears to be changing rapidly, and there is a need for students and postdoctoral scholars to be responsibly informed about their employment options on the basis of reliable and current information. There is a particular need to educate and expose young researchers to issues of public policy. RECOMMENDATION: The American Astronomical Society and the American Physical Society, alongside the nation’s astronomy and astrophysics departments, should make both undergraduate and graduate students aware of the wide variety of rewarding career opportunities enabled by their education , and be supportive of students’ career decisions that go beyond academia. These groups should work with the federal agencies to gather and disseminate demographic data on astronomers in the workforce to inform students’ career decisions.

2NC ASTRONOMY SOLVENCY
CP brings the best and brightest into the STEM industries
Blandford, et al., 10 – a Pehong and Adele Chen Professor of Physics and director of the Kavli Institute for Astrophysics and Cosmology at Stanford University [Roger D. Blandford, Chair, Committee for a Decadal Survey of Astronomy and Astrophysics National Research Council, New Worlds, New Horizons in Astronomy and Astrophysics, 2010, ISBN: 9780309158008, pg. 83-4, SM]

Astronomy is an incredibly exciting field that is attracting some of the best and brightest technically able young people. They are a precious resource for the nation, and it is important to optimize and broaden the benefits to the nation that their talents bring. Young people trained in astronomical research have a high degree of competence in disciplines with applicability beyond just astronomy and astrophysics. As a group, they are also energetic, hard-working, and highly motivated, and the fraction of their time that can be devoted to research is higher than at earlier and later career stages. Although training in astronomy for astronomers is valuable, in practice at least 20 percent of astronomers leave the profession for other careers following the Ph.D., the postdoctoral, and even the faculty/research position level. Careers outside astronomy and astrophysics are available that make use of the technical expertise gained through an astronomy education, and astronomers are demonstrably employable in a large variety of professions , such as computer science, data systems, image processing, detector technology, and medical technology, as well as other physical sciences. Training in astronomy research is good preparation for a wide range of careers. Experience in finding innovative solutions to new problems and familiarity with cutting-edge techniques and tools have very broad appeal to employers, and an astronomer’s education is rarely wasted. Nonetheless, the recent rapid growth in the postdoctoral pool of temporary positions suggests an increased need for advising and mentoring regarding broad career choices, not just in academia but also across the education and research enterprise, including careers beyond astronomy. Indeed, there is a strong and urgent need for career mentoring at all stages, from undergraduate to junior faculty member. In addition, it is important to introduce courses into astronomy curricula that can open doors to new careers. 
These courses could involve computer science, engineering, project management, public policy, or pedagogy, for example, possibly taken in other departments. Often, academic mentors emphasize academic careers for their students at the expense of discussing and supporting a broader range of career opportunities. The committee believes that doctoral training in astronomy prepares an individual for a variety of rewarding and important STEM careers and that the astronomy community needs to recognize alternate career paths more clearly. Professional training should accommodate the range of career paths taken by graduate and postdoctoral alumni, giving attention to (1) the full range of activities in academic faculty work, including teaching, advising, and performing institutional and national service; (2) the non-research skills needed by all researchers, including communicating to the non-specialist and the public at large, writing and administering grants, and project management; (3) necessary high-level training in communication and in the increasingly important areas of computation and instrumentation; and (4) career options both within and outside academia. Some of these goals could be achieved through professional master’s programs in astronomy with a particular focus. Partnership opportunities with government, industry, media resources, and museums could help broaden astronomy-related experiences through internships in areas such as public policy, computation and instrumentation, pedagogy, science outreach, and communications.

***SOLVENCY

1NC SOLVENCY FRONTLINE
Hot pixels kill solvency
New Scientist, 11 [March 5, 2011, Rachel Courtland, New Scientist, “Hot pixel mystery plagues delayed space telescope,” Lexis]

What is degrading detectors on a NASA telescope already projected to launch late and run over-budget? Hot pixel mystery SOMETHING strange is killing off pixels in the detectors of a multi-billion-dollar space telescope. NASA's ultra-sensitive James Webb Space Telescope (JWST) is supposed to launch in 2014 and glimpse the universe's first stars and galaxies. But in December Marcia Rieke of the University of Arizona in Tucson and colleagues found that roughly 2 per cent of pixels in a detector destined for JWST's Near Infrared Camera were transmitting signals although no light was hitting them, four times as many "hot pixels" as expected. They later found that the problem affects four of the camera's five long-wavelength detector arrays. "We don't know what is happening, and we don't know if there's a way to reverse it or slow it down," Rieke says. NASA allows no more than 5 per cent of a detector's pixels to be hot by the end of the telescope's five-year space mission. At this rate, the detectors may exceed this limit before the telescope even leaves the ground, says Rieke. The pixel problem is just the latest setback for JWST. In November 2010, an independent review panel predicted that JWST is unlikely to launch before September 2015 and will cost $1.5 billion more than expected.

Telescope development fails because of management – incompetence, not cash flow, is the problem
Postcrescent.com, 11 [June 5, 2011, “Space telescope debacle devours NASA funds; Once budgeted for $1.6B, cost now nears $7B,” http://www.postcrescent.com/article/20110606/APC0101/106060481/Space-telescope-debacle-devours-NASA-funds]

Considered by scientists the most important space mission of the decade, the James Webb Space Telescope project is being overhauled for the second time in five years because of skyrocketing costs and cascading schedule delays. Decision-makers initially were told the observatory would cost $1.6 billion and launch this year on a mission to look deeper into space and further back in time than the Hubble Space Telescope, in a quest for new clues about the formation of our universe and origins of life. NASA now says the telescope can't launch until at least 2018, though outside analysts suggest the flight could slip past 2020. The latest estimated price tag: up to $6.8 billion. NASA admits the launch delay will push the bill even higher. And, scientists are worried the cost growth and schedule delays are gobbling up more and more of the nation's astronomy budget and NASA's attention, threatening funding for other space science programs. Some fear the dilemma will get worse if the replanning work this summer forces NASA to shift billions more science dollars to Webb to get it back on track. So, what went wrong? A Florida Today review of five years' worth of budget records, status reports and independent audits show the Webb observatory is plagued by the same, oft-repeated problems that caused most major NASA projects to bust their budgets and schedules. In short, mistakes included: » NASA and its contractors underestimated the telescope's cost and failed to include enough reserve cash to handle the kinds of technical glitches that always crop up in development of a complex spacecraft, including many expensive risks managers knew about. » Leaders at agency headquarters in Washington and Goddard Space Flight Center in Baltimore, which led the project before the problems came to light, failed to act on repeated warnings that cash flow was too tight and technical glitches too many to meet the budget or schedule.
"We were optimistic," said NASA headquarters program manager Keith Howard, who has worked on the telescope project since its inception in the 1990s. "If you want to say overly optimistic, that's one way to look at it. If we weren't optimistic, we would not be in this business." Even an infusion of federal stimulus money — meant to prop up the nation's ailing economy — was only enough to keep the telescope project afloat and workers on the job. The Independent Comprehensive Review Panel concluded NASA needed to make immediate management changes mostly because leaders had not questioned and verified enough of what they were being told. The review panel recommended NASA headquarters take control, starting with a new cost estimate and schedule more in line with reality.

EXT. HOT PIXELS

Hot pixels will compromise the JWST
Spacenews.com, 11 [March 11, 2011, “NASA Puts $30M Cost on JWST Hot Pixel Fix,” http://www.spacenews.com/civil/110311-nasa-cost-jwst-hot-pixel-fix.html]

During testing in December, NASA discovered the performance of multiple detector arrays had degraded since they were tested two years ago and put in storage. The arrays, built by Teledyne Imaging Sensors of
Camarillo, Calif., are planned to fly on JWST’s Near Infrared Camera, Near Infrared Spectrograph and Fine Guidance Sensor-Tunable Filter Imager. Teledyne spokeswoman Robyn McGowan declined to comment on the issue.

NASA in February established a Failure Review Board to investigate the problem and issue a report in April. If it is determined that a full set of new flight and backup detectors is needed, it would cost about $30 million and take about a year for delivery for integration with the three instruments, JWST Program Director Rick Howard told the NASA Advisory Council during a meeting here. “Right now we still don’t have a root cause,” Howard said. During testing of five detector arrays two years ago, about 0.5 percent of the pixels were found to
be out of specification. When one of these arrays was tested in December, the number — to the dismay of scientists — had grown to about 2 percent. Subsequent testing revealed three of the other four arrays had experienced a similar degree of unexpected pixel degradation. Astronomers fear the problem — if not corrected — could grow worse with time and compromise the quality of JWST’s imagery. NASA is in the process of establishing new cost and schedule estimates for the JWST program. The spacecraft most recently had a $5 billion price tag and was scheduled for launch in June 2014. An independent review last year estimated the program is 15 months behind schedule and would cost another $1.5 billion. Howard said NASA will submit a revised budget and schedule for JWST to the White House this fall as it prepares a 2013 spending proposal that will be sent to Congress in February.

Hot pixels could cripple the JWST before it leaves the ground
UPI, 11 [March 2, 2011, “Pixel problems plague new telescope,” http://www.upi.com/Science_News/2011/03/02/Pixel-problems-plague-new-telescope/UPI-58861299117186/]

TUCSON, March 2 (UPI) -- Problems are cropping up in the detectors in a multibillion-dollar NASA space telescope already over budget and expected to launch late, officials said. The problem is just the latest in a string of setbacks for NASA's ultra-sensitive James Webb Space Telescope, NewScientist.com reported Wednesday. With a 21-foot mirror, nearly three times as wide as the one on the Hubble Space Telescope, the telescope could detect the universe's farthest and oldest stars and galaxies. But in December, researchers at the University of Arizona discovered roughly 2 percent of pixels in a detector destined for the Near Infrared Camera were transmitting signals although no light was hitting them. That's four times as many "hot pixels" as there were when the detector was analyzed in 2008, and NASA requirements call for no more than 5 percent of a detector's pixels to be hot by the end of the telescope's intended five-year space mission. At the present rate, the detectors may exceed this limit before the telescope even leaves the ground, UA researcher Marcia Rieke says. "We don't know what is happening, and we don't know if there's a way to reverse it or slow it down," says Rieke, principal investigator for the NIRCam. "Until we understand the root cause, I think we're all going to be quite nervous." NASA has set up a review board to
analyze the detector problem. "It's too early to speculate on what the root cause is or what we're going to do to fix it," JWST program director Rick Howard says.

1NC REFORM TURN

We should allow the James Webb to be cancelled – it won’t kill science leadership and will spur reform in our approach to science leadership
Campbell, 7/8/11 – creator of Science 2.0, the world’s largest online science community [July 8, 2011, Hank Campbell, “Webb Space Telescope - Why Congress May Be Right To Kill It,” http://www.science20.com/science_20/webb_space_telescope_why_congress_may_be_right_kill_it-80701]

I've long said that what NASA needs is not a James Webb Space Telescope but an actual James Webb for the 21st century. Webb, if you are not familiar with NASA lore, was a bold leader rather than a bureaucrat tasked with perpetuating funding, and it was due to his leadership that NASA launched 75 missions into space, including putting a man on the Moon. The telescope named after him is instead very much a product of modern NASA - its benefit and time to completion were overestimated and its funding requirement underestimated. The belief in much of modern Big Science is once you get initial funding it becomes too expensive to not complete so issuing a reasonable number for appropriations comes before honesty. And then ethical researchers and engineers are stuck holding the bag. It isn't Republicans who launched the latest volley of concern about JWST but new Chairman Hal Rogers is a Republican, so the partisan shills in science writing will make it a "Republicans hate science" issue but the report was ordered by Senator Barbara Mikulski, an outraged Maryland Democrat, last year. People who circle the wagons around every bit of funding (see Shrimp On A Treadmill) will say you can't ever cut funding. They worry that if America cancels this successor to the Hubble we will lose 'leadership' in astronomy, the same way they claim we lost 'leadership' in physics by canceling the Superconducting SuperCollider, despite the fact that there was no indication it would even be completed today - or how it would have worked.
It was a goal, not a specification for engineers. The Webb telescope has likewise been a black hole for funding. In James Webb Space Telescope delivers more bad news last year I noted that the budget was up to $6.5 billion and now an earliest completion date of 2015, though its original claim was it would be done by now. Budgets are finite. Everyone knows this except partisans in science. The $1.5 billion that JWST now claims it needs in order to not waste the billions already spent could fund 5,000 basic science research projects in space science (see
While Webb Bleeds, Space Science Hemorrhages) and $1.5 billion is just the latest cost overrun, not the total budget that may come up as more engineering concerns arise - so rather than circle the wagons around this project because it is science and people want to avoid a slippery slope, scientists can do a world of good holding each other accountable and making it less necessary for politicians to do so. The idea behind the Webb Telescope is a great one - continuing the work started by Hubble and Webb will be able to see light from about 250-400 million years after the Big Bang whereas the Hubble Space Telescope sees back to only 800 million years. It sounds esoteric to the public but there are fascinating things we can learn. However, science has to have a cost attached to a value, basic research or not. This is what killed the SSC. Those who compare the Webb Telescope to losing the SSC should take note - canceling the SSC made the much more reasonable, both in cost and engineering, Large Hadron Collider (LHC) a reality. Did it give Europe some ethereal, unquantifiable 'leadership' in physics? No, lots of projects are still done in the US and Japan but the task of finding the Higgs boson, which may not even exist, and its press has fallen to Europe. America still contributes and its knowledge will benefit all scientists, just like the Tevatron in the US has helped all scientists worldwide. It may be that canceling the JWST will be the wake-up call NASA has needed for a long time. The Obama administration already pulled the plug on the Constellation project and it may be time to do two things that are painful in the short term but essential for space science in the long term: First, fund smaller projects that don't have big engineering issues and are achievable. 
Second, make missions time-based, get back to 'acceptable risk' and allow NASA to shuck off the modern 'zero defects' mentality and the tentacles of bureaucracy and regulatory constraints that infect much of government-funded science. Creating bold missions where project managers use a 'joint confidence level' of 50% are not going to work in a time of budget
concerns. Let's hope the science community takes this warning shot as a chance to get fundamental reform in how science is done.

***GATES FOUNDATION

1NC SHELL

Text: The Bill and Melinda Gates Foundation should [insert the plan].

The Bill and Melinda Gates Foundation has billions and already funds multiple education initiatives
CBSNews, 10 [October 4, 2010, “The Gates Foundation: Giving Away A Fortune,” http://www.cbsnews.com/stories/2010/09/30/60minutes/main6915431_page2.shtml?tag=contentMain;contentBody]

Some 7,000 miles away, back home in Seattle, the Bill and Melinda Gates Foundation is building its new headquarters. There are 850 employees figuring out which science or development projects are worthy. And listen to what they have spent already: $4.5 billion for vaccines; almost $2 billion for scholarships in America; and $1.5 billion to improve farming in Africa and Asia, just to name a few. The foundation's wealth ranks up there with America's biggest companies, just behind McDonalds and ahead of Boeing. "Boy, his and hers offices. I'm not sure a lot of marriages would survive this," Pelley said, touring the Gates' workspace. "Oh, it works out great," Bill Gates said. "Well, we actually like it a lot," Melinda Gates added. The Gates live in a secluded hi-tech mansion with three children. The kids are now ages 8, 11 and 14. Bill and Melinda met at a Microsoft meeting 23 years ago. "What did you think? I mean, it is not everyday a girl gets asked out by the richest man in the world?" Pelley asked. "Oh, no, It wasn't that, it was that I didn't think it was a very good idea to date the CEO of the company," she replied. It was back in 1993 on a vacation in Africa that they began to think about giving away their money. "Well, if you have money, what are you gonna do with it? You can spend it on yourself, you can have, you know, thousands of people holding fans and cooling you off. You can build pyramids and things. You know, I sometimes order two cheeseburgers instead of one. But we didn't have any consumption ideas. And if you don't think it's a favor to your kids to have them start with gigantic wealth, then you've gotta pick a cause," Bill Gates explained. "You don't consider it to be a favor to your kids?" Pelley asked. "To give them enormous wealth?" "No," Melinda Gates said. "They should go on to pursue whatever it is they want to do in life and not feel cheated by that by being given something, given a whole lot of wealth. 
They would never go out and figure out who they are and what their potential is." Melinda Gates told Pelley they told their kids that they are giving most of the money away and that their children are okay with it. "Yes, they reach different ages, they may ask us again, 'Tell me again, What? Why?'" Bill Gates added. The Gates' kids will still be massively wealthy. But their parents have already given roughly $30 billion to the foundation and they told us they'll give ninety percent of their money away. Add to that the contribution of the Gates' close friend, Warren Buffett, who has committed another $30 billion to the foundation. This past summer, the Gates and Buffett challenged billionaires to give half of their wealth to the charity of their choice. So far 40 have signed the pledge. "The foundation, you, have made certain choices about what you're going to fund. And some people might ask, 'Why not drop 30 billion dollars on a cure for cancer,' for example," Pelley remarked. "Well, there's a huge market for cancer drugs. And there's dozens of pharmaceutical companies that spend tens of billions on those drugs. In malaria, when we announced a grant for $50 million, we became the biggest private funders. And so, the fact that it kills over a million children a year and yet has almost no money given to it, you know, that struck us as very strange. But it became the thing we saw, 'Okay, this will be unique. We'll take the diseases of the poor, where there's no market and we'll get the best scientists working on those diseases,'" Bill Gates explained. "You're trying to find the places where the money will have the most leverage, how you can save the most lives for the dollar, so to speak," Pelley remarked. "Right. And transform the societies," Gates
replied. Another society they want to transform is America's, particularly through the schools. They've pledged nearly one quarter of all the foundation money to American students. And we followed Melinda Gates to the Friendship Collegiate Academy High School in Washington D.C. "I wonder what you think is the most alarming thing about American education?" Pelley asked. "I think it's most alarming that we're only preparing a third of the kids to go on to college. That's a frightening thing for our democracy to say a third of kids are prepared to go," she replied. If only a third of high school seniors are academically prepared to go to college, the Gates believe that a revolution in teaching can go a long way toward pushing that up to 80 percent. They're funding research to figure out what makes great teachers great. The foundation is at work in schools in nearly all 50 states. At Friendship Academy, they've given nearly a million dollars to the "Early College Program." Juniors take college courses. The money hires teachers, buys books and takes the students on college tours. Last year, 100 percent of Friendship Academy's seniors got into college. Sort of like "national parents," Bill and Melinda Gates have helped pay college tuition for 20,000 American kids. "The country is built on ingenuity. It's built on having lots of very well-educated people. And if you were from a poor family, how are you going to be break out of that? Well, education is the only way. Education is the thing that 20 years from now, will determine if this country is as strong and as just as it wants to be," Bill Gates explained. One of the boldest efforts of the foundation is unfolding in the slums that we visited in Delhi, an attempt to eradicate polio. No one in America has seen this since the 1960s. We found, in a Delhi hospital, a polio ward full of paralyzed children. "This young boy, Sahil. He is ten years old. Sahil has got paralysis of one side of his body. One leg. 
See what he's doing, he's trying his best, he's bringing his hand, but he cannot move his leg," a doctor explained. In a country where water often runs next to sewage, the virus, which is spread through human waste, finds new victims. Polio has been cornered to just four countries on Earth, so the Gates have teamed with Rotary International to bang on every door to find the last child who hasn't tasted the vaccine. "Do you believe you can do that, actually eradicate the virus from the face of the Earth?" Pelley asked. "It's been done with smallpox. And that's what gives us the hope and the belief," Melinda Gates said.

While in India, we were invited to a ceremony that every new mother prays for. Because so many newborns die, they're not given names right away. One family had waited a week to bring their daughter into the light and name her "Durga," which means "Invincible." It was during the ceremony that we saw what it is that has moved a no-nonsense executive to give away her fortune. Durga's first blessing was from the sun. Then she received a second, a future free of polio, in the form of a vaccination.

2NC SOLVENCY

The foundation supports education – the James Webb would be in line with that
Bill Gates, 11 – CEO of the Bill and Melinda Gates Foundation [2011, “2011 Annual Letter from Bill Gates,” http://www.gatesfoundation.org/annual-letter/2011/Pages/excellence-in-teaching.aspx]

In the United States, the foundation’s biggest investments are in education. Only a third of students are graduating from high school prepared to succeed at college-level work, and even fewer are going on to get a degree that will help them compete for a good job. No one should feel comfortable with those results. Davis Guggenheim's amazing and popular movie Waiting for “Superman” made a powerful argument against the status quo. It showed a broad audience that schools with the right approach can succeed, even with inner city students that typical schools do not educate well. As more people understand the gap between what is possible and what is actually happening in most schools, I believe the momentum for reform will grow. Since 1980 U.S. government spending per K-12 student increased by 73 percent, which is 20 percent faster than the rest of the economy. Over that time our achievement levels were basically flat, while other countries caught up. A recent analysis by the Programme for International Student Assessment (PISA) showed the United States is about average (compared to 35 developed countries) in science and reading and below average in math. Many Americans have a hard time believing this data, since we are so used to being the global leader in educational achievement and since we spend a lot more money on education than many other countries. PISA measured educational achievement in the Shanghai area of China, and even allowing for the fact that Shanghai is one of the most advanced parts of China, the scores relative to the United States and other countries were quite stunning. China did better in math, science, and reading than any of the 65 countries it was compared to, and it achieved these results with an average class size of more than 35 students. One of the impressive things about the Chinese system is how teachers are measured according to their ability. 
There are four levels of proficiency in the Chinese system, and to move up a level, teachers have to demonstrate their excellence in front of a panel of reviewers. According to the PISA analysis (available at www.pisa.oecd.org), two key things differentiate the U.S. education system from most other countries’ systems. The first is that non-U.S. students are in school for more hours, and the second is that U.S. school systems do very little to measure, invest in, and reward teacher excellence. Most people who become teachers do so because they’re passionate about kids. It’s astonishing what great teachers can do for their students. But the remarkable thing about great teachers today is that in most cases nobody taught them how to be great. They figured it out on their own. That’s why our foundation is investing to help devise measurement and support systems to help good teachers become great teachers. Our project to learn what the best teachers do—and how to share this information with other teachers—is making significant progress. With the help of local union affiliates, we have learned a lot already. We’re learning that listening to students can be an important element in the feedback system. In classes where students agree that “Our class stays busy and doesn't waste time” or that “In this class, we learn a lot almost every day,” there tend to be bigger achievement gains. Another great tool is taking a video showing both the teacher and the students and asking evaluators to provide feedback. Melinda and I spent several days visiting schools in Tennessee this fall and sat with teachers who were watching videos of themselves teaching. We heard from a number of them how they had already improved by seeing when students were losing interest and analyzing the reasons.

CP solves the aff – Bill and Melinda Gates have an interest in funding innovation measures
AP, ’10 [The Hindu, 1/25/10, “Innovation can leverage change, says Gates”, http://www.thehindu.com/sci-tech/science/article94654.ece, SM]

In his second annual letter, issued on Monday, Mr. Gates says investment in science and technology can leverage those dollars and make more of a difference than charity and government aid alone. In his 19-page letter, Mr. Gates says the foundation currently is backing 30 areas of innovation including online learning, teacher improvement, malaria vaccine development, HIV prevention, and genetically modified seeds. The Seattle-based foundation focuses most of its donations on global health, agriculture development and education. Since 1994, the foundation has committed to $21.3 billion in grants. As of Sept. 30, 2009, its endowment totalled $34.17 billion. Mr. Gates said his and his wife’s experience at Microsoft Corp. is not the only reason they are so taken with technology. “Melinda and I see our foundation’s key role as investing in innovations that would not otherwise be funded,” he wrote. “This draws not only on our backgrounds in technology but also on the foundation’s size and ability to take a long-term view and take large risks on new approaches”. Gates begins his letter by talking about how much fun he’s having at his new job: 2009 was the first year he worked full-time as co-chair of the foundation, after a decade of part-time work as he led Microsoft full-time. He talks about enjoyable visits around the world to talk to scientists, politicians, teachers, farmers and people doing the work of the foundation. “Seeing the work firsthand reminds me of how urgent the needs are as well as how challenging it is to get all the right pieces to come together,” Mr. Gates wrote. “I love my new job and feel lucky to get to focus my time on these problems.” He talked about the way he and Melinda work as partners at the foundation, each focusing on problems that interest them and then sharing what they’ve learned and making decisions together on what the foundation should do.

The foundation is already funding measures to increase STEM in public policy
BHEF, ’09 [12/07/09, Business-Higher Education Forum Publication, “Business-Higher Education Forum to Expand Use of STEM Education Model”, http://www.bhef.com/news/newsreleases/Gates_Grant.asp, SM]

The Business-Higher Education Forum (BHEF) has received a $417,517 grant from the Bill & Melinda Gates Foundation to help BHEF expand a unique simulation modeling tool for science, technology, engineering and math (STEM) education policy making, as well as for enhancing college access, readiness, and success efforts. BHEF is a national organization of Fortune 500 CEOs, prominent college and university presidents, and foundation leaders who work to advance innovative solutions to U.S. education challenges that affect competitiveness and the economy. One of the organization’s main goals is to double the number of STEM graduates by 2015. The grant will allow BHEF to adapt its U.S. STEM Education Model to examine STEM education policies at the state level and also to explore ways to expand the model’s use by federal and state education policymakers in STEM and other areas. The current model enables policy makers to simulate the effect of factors, such as changes in teacher quality and participation in undergraduate cohort programs, on increasing the number of STEM graduates. As such, it offers a powerful new tool for education policy makers that can help them understand the effect of various policies on the education system over time and simulate outcomes. The new grant will enable BHEF to explore the model’s applicability to other issues, as well, including the role of two-year colleges on college degree attainment. “This grant provides crucial support for increasing the model’s applicability and will allow BHEF to advance its use as a tangible policymaking tool,” says BHEF Chair David J. Skorton, president of Cornell University.
“At the state level, we expect to test the adaptability of the model initially in Ohio and to use it to simulate the impact of various policies on key education issues of importance to that state. We hope that effort will provide a roadmap for policymakers in other states.” The model, developed by the Raytheon Company and donated to BHEF, is a key component of BHEF’s STEM Initiative, which was launched in June 2005 to help ensure that America remains a global leader in science, technology, engineering and mathematics education. BHEF’s work under this Initiative and under its initiative on college readiness, access, and success was supported by an initial $910,000 grant from the foundation in 2008. In addition to releasing the model into open source in July, BHEF—with partners Raytheon and The Ohio State University—has advanced the STEM Research and Modeling Network (SRMN) to support the modeling effort. The SRMN brings together researchers, policymakers, practitioners, corporations and funders, all of whom share the goal of using simulation modeling and similar tools to identify ways that student interest, participation and achievement in the STEM fields can be strengthened. Current forecasts of student degree attainment in the United States suggest that the U.S. will not produce enough STEM graduates at the two- or four-year college level to meet employer demand. The development of this model and the accompanying SRMN represent the mobilization of a community committed to aggressively addressing this challenge through innovative tools. “We are pleased with this generous support from the foundation, which will enable BHEF to advance the model and use it as a tool to address the nation’s challenges in STEM degree production,” says BHEF Vice Chair William H. Swanson, Raytheon Chairman and CEO. “It is crucial, if we are to remain competitive as a nation, that we have enough STEM graduates to fuel the workforce pipeline.”

The Foundation is funding innovation measures – there’s room for more grants now
Laster, ’10 [Jill Laster, April 22, 2010, “Bill Gates Says Open Courseware Is Good but Needs Improvement”, http://chronicle.com/blogs/wiredcampus/bill-gates-says-open-courseware-is-good-but-needs-improvement/23385, SM]

The Bill & Melinda Gates Foundation is looking at how to help support innovation in open courseware, he said. “What’s been done so far has had very modest funding. This is an area we need more resources, more bright minds, and certainly one that I want to see how the foundation could make a contribution to this.” The foundation announced $12.9-million in technology-related educational grants in December, including $2.5-million for Carnegie Mellon University’s Community College Open Learning Initiative. Marie Groark, a foundation spokeswoman, said that it is “scoping out” options for a second round of grants, but that “nothing’s been determined or set.”

***POLITICS LINKS

1NC POLITICS LINK [PLAN UNPOPULAR]

The plan will spark backlash – seen as wasteful spending
Moran, 11 – PJM Chicago editor, Blog Editor at The American Thinker, and a frequent contributor to FrontPage.com [July 10, 2011, Rick Moran, “Webb telescope may be axed,” http://www.americanthinker.com/blog/2011/07/webb_telescope_may_be_axed.html]

It is one of the most sophisticated and technologically advanced machines ever conceived. The Next Generation Space Telescope, named for former Apollo-era NASA administrator James Webb, is actually designed to look all the way back to the beginning of time - 14 billion light years. It will also aid in detecting life on other planets, as well as catalogue rare
cosmological phenomena like gamma ray bursts and supernovas. But in an era of budget cutting, it is seen by many on the Hill as a luxury we can't afford: The House Appropriations Committee released its 2012 Commerce, Justice and Science funding bill today, ahead of a scheduled committee markup Thursday. The bill provides $50.2 billion overall for the nation's projects in those three areas, which is $7.4 billion less than President Obama's budget request. NASA's budget is slashed by $1.6 billion, which is $1.9 billion less than Obama wanted. About $1 billion of that comes from the end of the shuttle program, and NASA Science funding is cut by $431 million from last year. "The bill also terminates funding for the James Webb Space Telescope, which is billions of dollars over budget and plagued by poor management," an Appropriations Committee press release says flatly. While management problems are a little more subjective, the telescope is indeed massively over budget, as we've told you before. In
November, a congressional panel described the telescope as "NASA's Hurricane Katrina," because of its destructive toll on other agency projects. That review found the telescope's price tag had mushroomed to $6.5 billion and that it would not be ready until at least 2015. Then, just last week, the watchdog site NASA Watch obtained a memo from Goddard Space Flight Center describing that it may not launch until after 2018 -- even that is "unfeasible," the report said.

2NC PLAN UNPOPULAR

The James Webb is a partisan issue
IBT, 11 [July 9, 2011, Gabriel Perna, International Business Times, “Republicans and Democrats Set To Battle Over The Webb Telescope,” http://www.ibtimes.com/articles/177090/20110709/nasa-webb-telescope-republicans-house-of-representatives-senate.htm]

The battle over the Hubble Telescope may very well be a political one. This week, the Republican favored House of Representatives proposed cutting the James Webb Space Telescope (JWST) as part of the 2012 NASA budget. The budget was determined by the House Appropriations commerce, justice, science subcommittee and led by Republican Representative Frank Wolf from Virginia. Wolf and other appropriators said the JWST program "is billions of dollars over budget and plagued by poor management." Not long after this budget cut was proposed, Wolf's opponent on the Senate subcommittee of the same
nature, Sen. Barbara Mikulski, a Democrat from Maryland, released a statement which said cutting the Webb telescope is a terrible idea. "Today the House Appropriations Subcommittee on Commerce, Justice, Science
and Related Agencies passed a bill that would terminate the James Webb Space Telescope, kill 2,000 jobs nationwide and stall scientific progress and discovery. It was a shortsighted and misguided move," Mikulski said. "The Webb Telescope will lead to the kind of innovation and discovery that have made America great. It will inspire
America's next generation of scientists and innovators that will have the new ideas that lead to the new jobs in our new economy." It seems as if the Senate (Democrats) and the House (Republicans) will spar over this issue and determine Webb's fate. The funding bill will be considered by the Full Appropriations Committee on July 13.

The plan would be contentious
Morring, 11 – staff writer [February 15, 2011, Frank Morring, Jr., Aerospace Daily & Defense Report, Funding and Policy; Pg. 1 Vol. 237 No. 30, “$18.7 Billion NASA Request Sets Up Capitol Hill Showdown,” Lexis]

Another likely bone of contention with Congress is the James Webb Space Telescope, which an outside panel has found faces a cost overrun of at least $1.5 billion. In the new request the Webb telescope would get only $375 million to continue fabrication and testing while NASA conducts its own calculations. The budget request carries no launch date for the telescope, and the agency says there will not be one until the fiscal 2013 request a year from now. The independent review ordered by Sen. Barbara Mikulski (D-Md.), who chairs the appropriations panel that funds NASA, estimated that launch of the deep-space infrared telescope will have to slip more than a year from its old September 2014 target.

Congress wants to cut funding now – the plan will be perceived as waste
Clark, 11 – staff writer for Spaceflight Now [July 6, 2011, Stephen Clark, Spaceflight Now, "House panel proposes killing Hubble telescope successor," http://www.spaceflightnow.com/news/n1107/06jwst/]

Legislators seeking to rein in government spending have put the troubled James Webb Space Telescope up for cancellation, saying the successor to NASA's Hubble observatory is haunted by poor management and out-of-control costs. The next-generation space telescope is mired in a budgetary black hole. With an estimated cost of $6.5 billion and a cascade of delays, the flagship space mission could still be on the ground in 2018, NASA officials told Congress in April. Managers privately said launch of JWST could slip even later due to federal spending cutbacks. President Obama's 2012 budget proposal called for flat spending on JWST at $375 million annually over the next five years. Developed as the replacement for the Hubble Space Telescope, JWST is a joint project between NASA and the European Space Agency. With a 21.3-foot-diameter primary mirror, the telescope is designed to peer back in time almost to the Big Bang, giving astronomers a glimpse of infant galaxies as the universe cooled after its formation. The proposal to terminate JWST came from the House Appropriations Committee's panel overseeing NASA. The committee released its 2012 spending bill Wednesday, calling for more than $1.6 billion in cuts to NASA's budget from this year's levels. The Republican-led House subcommittee suggested a $16.8 billion NASA budget for fiscal year 2012, which begins in October. That's $1.9 billion less than the White House proposed in February. "The bill also terminates funding for the James Webb Space Telescope, which is billions of dollars over budget and plagued by poor management," lawmakers said in a press release. The Senate and the White House, which include JWST supporters, will weigh in on the federal budget before it becomes law. The budget must also pass the full House of Representatives. NASA officials have repeatedly told Congress, researchers and journalists that JWST's exorbitant cost is prohibiting the agency from conducting other astrophysics missions. JWST's budget problems will likely keep NASA from launching a gravitational wave detector named LISA or the International X-ray Observatory until the 2020s.

There is no political will to fund the James Webb
DiMascio, 7/8/11 – a writer who specializes in defense; covered Congress for Defense Daily and military policy and purchasing for Inside the Army. DiMascio has worked as a reporter for The Other Paper, a Columbus, Ohio, alternative newsweekly, and has written for The New York Times, The Village Voice and other publications. [July 8, 2011, Jen DiMascio, "Lawmakers Seek To Kill Webb Telescope," http://www.aviationweek.com/aw/generic/story_generic.jsp?channel=aerospacedaily&id=news/asd/2011/07/07/02.xml&headline=Lawmakers%20Seek%20To%20Kill%20Webb%20Telescope]

A House panel recommends killing the Northrop Grumman-built James Webb Space Telescope, calling the Hubble successor "billions of dollars over budget and plagued by poor management." Overall, the House Appropriations Commerce, Justice, Science subcommittee backs funding NASA at $16.8 billion in fiscal 2012, a cut of $1.9 billion to President Barack Obama's budget request, according to a committee statement. The subcommittee is scheduled to approve its draft of the spending bill that also covers the Commerce and Justice departments on July 7. The bill still must pass in the full House and be reconciled with a Senate version before becoming law. House Appropriations Committee Chairman Rep. Hal Rogers (R-Ky.) defends the committee's decisions. "Given this time of fiscal crisis, it is also important that Congress make tough decisions to cut programs where necessary to give priority to programs with broad national reach that have the most benefit to the American people," Rogers says. NASA's future space telescope has run into its share of trouble, going $1.5 billion over budget and seeing its launch date slip at least three years.

Plan unpopular – costs mean politicians overwhelmingly want to end it
McKie, 7-9 [Robin McKie, Saturday 9 July 2011, "Nasa fights to save the James Webb space telescope from the axe", http://www.guardian.co.uk/science/2011/jul/09/nasa-james-webb-space-telescope, SM]

Nasa is fighting to save its next-generation space observatory, the James Webb Space Telescope. Politicians want to end the project – one of the most complex ever conceived by space engineers – even though billions of dollars have already been spent on its construction. Scheduled for launch in 2016, the James Webb, intended to replace the ageing Hubble Space Telescope, would orbit in deep space, a million miles from Earth, and peer into the dawn of the universe. Its observations would answer major questions about the structure of the cosmos, say astronomers. The cost of the observatory has soared from an initial estimate of $1.6bn (£996m) to more than $6.5bn (£4bn). As a result, budgets for other astronomical research projects have been slashed, leading the journal Nature to describe the James Webb as "the telescope that ate astronomy". Last week the US House of Representatives' appropriations committee on commerce, justice, and science decided that it had had enough of these escalating costs and moved to cancel the project by stripping $1.9bn from Nasa's budget for next year. A terse statement, released by the Republican-dominated committee, said that the project "is billions of dollars over budget and plagued by poor management". The decision still has to be approved by the full appropriations committee, the House and the Senate. Nevertheless, analysts say the telescope now faces a struggle to survive.

Congress looking to make budget cuts now – makes the plan unpopular
Harrold, 7-7 [Max Harrold, 7/7/11, "Bad news for Canada: U.S. could scrap new space telescope", http://www.montrealgazette.com/technology/space-shuttle/news+Canada+could+scrap+space+telescope/5067942/story.html#ixzz1Rj7eeTlF, SM]

One day before Atlantis was set to launch — marking the end of the storied space shuttle program — the Canadian government seemed caught off guard by a proposal in the U.S. Congress to kill a major new space telescope in which Canada is heavily invested. But a senior official here on Thursday underscored how important the James Webb Space Telescope is to Canadian ambitions in space. Canada is also spending $150 million on it. On Thursday, the U.S. House of Representatives' Appropriations Subcommittee on Commerce, Justice and Science approved a yearly budget for the National Aeronautics and Space Administration that does not include funding for the telescope, the successor to the Hubble Space Telescope, which has yielded a heap of amazing images and data. Federal Industry Minister Christian Paradis, in Florida with his 10-year-old son to watch the shuttle's historic launch, seemed to have had no warning of the decision. It comes at a time when U.S. politicians are debating deep government budget cuts in a range of areas. "I was just told about this," Paradis told reporters near the launch pad. "I will take the time to fully analyze what is going on and check with the (Canadian Space Agency)." Standing nearby, Steve MacLean, president of the CSA and an astronaut who went on two space shuttle missions, said the Webb telescope is absolutely worth trying to save. "I think the one thing that people in North America will remember from the astronomy program in a hundred years is what the Hubble Telescope has done so far and what the James Webb Telescope could do." Hopefully, the cash-strapped U.S. government will figure out a way to fly the Webb, he added. "It's that telescope and a few other things that we're doing on the ground that will teach us much about our future. Canada has a major role in this telescope. We have a technology that is at the heart of the telescope that's involved in precisely pointing it." The telescope is slated to launch in 2014 and use infrared technology to detect the first stars, quasars and supernovae of the early universe with unprecedented sensitivity. It is to be positioned 1.5 million kilometres from Earth. The Webb telescope project has been criticized for being poorly managed and for a budget that reportedly ballooned to $6.5 billion U.S. from $1.6 billion U.S.