
1NC—Data Localization

Uniqueness

Tech Industries Solve

Companies resolve perception link in the status quo.

Kendrick 15 (Katharine Kendrick is a policy associate for Internet communications technologies at the NYU Stern Center for Business and Human Rights, 2.19.15, “Risky Business: Data Localization” http://www.forbes.com/sites/realspin/2015/02/19/risky-business-data-localization/, ekr)

U.S. companies’ eagerness to please the EU affects their leverage in a place like Russia or China, and undermines their principled calls for a global Internet. Just as we’ve seen the emergence of company best practices to minimize how information is censored, we need best practices to minimize risks in where it is stored. Companies should take the following steps: Avoid localizing in a repressive country whenever possible. When Yahoo! entered Vietnam, to meet performance needs

without enabling the government’s Internet repression, it based its servers in Singapore. Explore global solutions. Companies like Apple and Google have started encrypting more data by default to minimize inappropriate access by any government. This doesn’t solve everything, but it’s a step forward for user privacy. Minimize exposure. If you must

have an in-country presence, take steps to minimize risk by being strategic in what staff and services you locate there. Embrace transparency. A growing number of companies have increased transparency by issuing reports on the number of government requests they receive. They should also publish legal requirements like localization, so that people understand the underlying risks

to their data. Work together. Companies should coordinate advocacy in difficult markets through organizations like the Global Network Initiative. Tech companies can take a proactive, collective approach, rather than responding reactively when their case hits the headlines. We can only expect localization demands to increase—and business pressures to pull in the opposite direction. While the political dynamics have shifted, companies should still have respect for human rights—and the strength of the global Internet—at the forefront of decisions over where to store their data.

Current Freedom Act Solves

Status Quo Freedom Act sufficient.

CEA 15 (June 2, 2015, “Washington: CEA Praises Senate Passage of USA FREEDOM Act” http://www.ce.org/News/News-Releases/Press-Releases/2015-Press-Releases/CEA-Praises-Senate-Passage-of-USA-FREEDOM-Act.aspx, ekr)

The Consumer Electronics Association has issued the following news release: The following statement is attributed to Michael Petricone, senior vice president of government and regulatory affairs, Consumer Electronics Association (CEA)®, regarding the U.S. Senate’s passage of H.R.

2048, the Uniting and Strengthening America by Fulfilling Rights and Ensuring Effective Discipline Over Monitoring (USA FREEDOM) Act of 2015: “We welcome this important reform to U.S. intelligence gathering which takes critical steps to increase transparency and restore trust in American businesses, all while maintaining our commitment to preserving our national security. The bipartisan USA FREEDOM Act is common-sense reform to our nation’s intelligence gathering

programs, which will preserve American businesses’ competitiveness worldwide, while continuing to protect our national security. “Following the Senate passage, the legislation now heads to the White House, where we anticipate swift action by President Obama to sign this legislation into law.”

Internal Link

Big Data Bad

Big data is ineffective and dangerous.

Hardy 13 - Quentin Hardy is a reporter, providing analysis on technology, economics and global business. Citing Kate Crawford, who is a Principal Researcher at Microsoft Research New York City, a Visiting Professor at MIT's Center for Civic Media, and a Senior Fellow at NYU's Information Law Institute. (“Why Big Data Is Not Truth,” http://bits.blogs.nytimes.com/2013/06/01/why-big-data-is-not-truth/ 6/1/2013) STRYKER

The word “data” connotes fixed numbers inside hard grids of information, and as a result, it is easily mistaken for fact. But including bad product introductions and wars, we have many examples of bad data causing big mistakes. Big Data raises bigger issues. The term suggests assembling many facts to create greater, previously unseen truths. It suggests the certainty of math. That promise of certainty has been a hallmark of the technology industry for decades. With Big Data, however, there are even more hazards, some human and some inherent in the technology. Kate Crawford, a researcher at Microsoft Research, calls the problem “Big Data fundamentalism — the idea with larger data sets, we get closer to objective truth.”

Speaking at a conference in Berkeley, Calif., on Thursday, she identified what she calls “six myths of Big Data.” Myth 1: Big Data is New In 1997, there was a paper that discussed the difficulty of visualizing Big Data, and in 1999, a paper that discussed the problems of gaining insight from the numbers in Big Data. That indicates that two prominent issues today in

Big Data, display and insight, had been around for awhile. “But now it’s reaching us in new ways,” because of the scale and prevalence of Big Data, Ms. Crawford said. That also means it is a widespread social phenomenon, like mobile phones were in the 1990s, that “generates a lot of comment, and then

disappears into the background, as something that’s just part of life.” Myth 2: Big Data Is Objective Over 20 million Twitter messages about Hurricane Sandy were posted last year. That may seem sufficient for a picture of whom the storm affected. However, the 16 percent of Americans on Twitter tend to be younger, more urban and more affluent than the norm. “Very few tweets came out of Breezy Point, or the Rockaways,” Ms. Crawford said. “These were very privileged urban stories.” And some people, privileged or otherwise, put information like their home addresses on Twitter in an effort to seek aid. That

sensitive information is still out there, even though the threat is gone. That means that most data sets, particularly where people are concerned,

need references to the context in which they were created. Myth 3: Big Data Doesn’t Discriminate “Big Data is neither color blind nor gender blind,” Ms. Crawford said. “We can see how it is used in marketing to segment people.” Facebook timelines, stripped of data like names, can still be used to determine a person’s ethnicity with 95 percent accuracy, she said. Information like sexual orientation among males is also relatively easy to identify. (Women are tougher to pinpoint.) That information can be used to determine what kind of advertisements, for example, that people receive. It’s important to remember that whenever people start creating data sets, these become

fallible human tools. “Data is something we create, but it’s also something we imagine,” Ms. Crawford said. Myth 4: Big Data Makes Cities Smart “It’s only as good as the people using it,” Ms. Crawford said. Many of the sensors that track people as they manage their urban lives come from high-end smartphones, or cars with the latest GPS systems.

“Devices are becoming the proxies for public needs,” she said, “but there won’t be a moment where everyone has access to the same technology.” In addition, moving cities toward digital initiatives like predictive policing, or creating systems where people are seen, whether they like it or not, can promote lots of tension between individuals and their governments. Sorry, IBM. Take that, Cisco. That goes for you, too, Microsoft, Ms. Crawford’s employer. All these big technology companies have Smart Cities initiatives. Myth 5: Big Data Is Anonymous A study published in Nature last March looked at 1.5 million phone records that had personally identifying information removed. It found that just four data points of when and where a call was made could identify 95 percent of individuals. “With just two, you can identify 50 percent of them,” Ms. Crawford said. “With a fingerprint, you need 12 data points to identify somebody.” Likewise, smart grids can spot when your friends come over. Search engine queries can yield health data that would be protected if it came up in a doctor’s office. Myth 6: You Can Opt Out Last December, Instagram, the photo-sharing site, changed its terms of service to allow it to share customer’s photos more broadly, even use images in ads. What it didn’t have was a paid option, in which a person could, for a fee, not be part of that. Even if that option existed, Ms. Crawford said, this would imply a two-tier system — people who could afford to control their data and those who could not. “Besides,” she said, given the ways that information can be obtained in these big systems, “what are the chances that your personal information will never be used?” Before Big Data disappears into the background as another fact of life, Ms. Crawford said, “We need to think about how we will navigate these systems. Not just individually, but as a society.”

Big Data Ineffective

Big data doesn’t solve disease—predictions are too difficult and the bar for entry is too low.

White 15 - Michael White is a systems biologist at the Department of Genetics and the Center for Genome Sciences and Systems Biology at the Washington University School of Medicine in St. Louis, where he studies how DNA encodes information for gene regulation. (“The Ethical Risks of Detecting Disease Outbreaks With Big Data,” http://www.psmag.com/health-and-behavior/ethical-risks-of-detecting-disease-outbreaks-with-big-data 2/24/2015) STRYKER

One of the most urgent ethical issues that the researchers identify lies at what they call "the nexus of ethics and methodology." The ethical issue can be reduced to one question: Do these methods actually work? Ensuring that the methods work "is an ethical, not just a scientific, requirement," the researchers note. Unlike some other social media experiments, a flawed public health monitoring program can cause serious physical and economic harm to large numbers of people.

Digital disease detection programs are relatively easy to set up compared to traditional disease monitoring systems, which means there is a risk that the bar for entering this field might be dangerously low. An under-prediction of a disease outbreak can result in complacency and lack of preparedness by health officials or the public. An over-prediction could cause panic, misallocation of limited supplies of vaccines or medical resources, and, as some reactions to the recent Ebola outbreak demonstrated, damaging stigmatization of people or communities who don't pose a risk. As the physicist Niels Bohr once noted, prediction is hard—especially about the future. Big data programs and algorithms often perform well when they’re used to “predict” the existing data that was used to help build them, but then do poorly when confronted with new data. That's where digital disease detection tools that use social media data often run into trouble. Google Flu Trends looked impressive in its initial report in 2009, where it was used to retroactively predict flu activity of previous years. But it largely missed the two waves of H1N1 swine flu that hit later in 2009. As the Google Flu researchers wrote, "Internet search behavior changed during

pH1N1, particularly in the categories 'influenza complications' and 'term for influenza'"—two search terms that are particularly important in the algorithm. The program also over-predicted the severity of the 2011-12 flu season by 50 percent.

Big data doesn’t solve healthcare – multiple barriers and empirics prove.

**Note: predictive analytics and comparative data refer to the same idea of large databases with patient information, i.e., big data in general.

Crockett 14 (David, Ph.D. from University of Colorado in medicine, Senior Director of Research and Predictive Analytics at Health Catalyst, “3 Reasons Why Comparative Analytics, Predictive Analytics, and NLP Won’t Solve Healthcare’s Problems”, https://www.healthcatalyst.com/3-reasons-why-comparative-analytics-predictive-analytics-and-nlp-wont-solve-healthcares-problems/)

Comparative Data Doesn’t Drive Improvement We’ve had comparative data for years in the U.S. healthcare system and it hasn’t moved the needle towards better, at all. In fact, the latest OECD data ranks the U.S. even worse than we’ve ever been on healthcare quality and cost. Comparative data, like the OECD, is interesting and

certainly worth looking at, but it’s far from enough to drive improvements in an organization down to the individual patient. To drive that sort of change, you have to get your head and hands dirty in your own data ecosystem, not somebody

else’s that is at best a rough facsimile of your organization. There are too many variables and variations in healthcare delivery right now that add too much noise to the data to make comparative analytics as valuable as some pundits advocate. We don’t even have an industry standard and clinically precise definition of patients that should be included (and excluded from routine management) in a diabetes registry, much less the other 15 chronic

diseases and syndromes we should be managing. Predictive Analytics Fails to Include Outcomes We’ve also had predictive analytics supporting risk stratification for years in healthcare, particularly in case management, but without

outcomes data, what are we left to predict? Readmissions. That’s a sad state of affairs. Before we start believing that predictive analytics is going to change the healthcare world, we need to understand how it works, technically and programmatically. Without

protocol and patient-specific outcomes data, predictive analytics is largely vendor smoke and mirrors in all but a very small number of use cases.

Impact

Disease Defense

1. No zoonotic disease impact – multiple warrants

A. Empirics Prove

Ridley 12 (Matt Ridley, columnist for The Wall Street Journal and author of The Rational Optimist: How Prosperity Evolves, “Apocalypse Not: Here’s Why You Shouldn’t Worry About End Times,” http://www.wired.com/wiredscience/2012/08/ff_apocalypsenot/all/)

The emergence of AIDS led to a theory that other viruses would spring from tropical rain forests to wreak revenge on humankind for its ecological sins. That, at least, was the implication of Laurie Garrett’s 1994 book, The Coming Plague:

Newly Emerging Diseases in a World Out of Balance. The most prominent candidate was Ebola, the hemorrhagic fever that starred in Richard Preston’s The Hot Zone, published the same year. Writer Stephen King called the book “one of the most horrifying things I’ve

ever read.” Right on cue, Ebola appeared again in the Congo in 1995, but it soon disappeared. Far from being a harbinger, HIV was the only new tropical virus to go pandemic in 50 years. ¶ In the 1980s British cattle began dying from mad cow disease, caused by an infectious agent in feed that was derived from the remains of other cows. When people, too, began to catch this

disease, predictions of the scale of the epidemic quickly turned terrifying: Up to 136,000 would die, according to one study. A pathologist warned that the British “have to prepare for perhaps thousands, tens of thousands, hundreds of thousands, of cases

of vCJD [new variant Creutzfeldt-Jakob disease, the human manifestation of mad cow] coming down the line.” Yet the total number of deaths so far in the UK has been 176, with just five occurring in 2011 and none so far in 2012.¶ In 2003 it was SARS, a virus from civet cats, that ineffectively but inconveniently led to quarantines in Beijing and Toronto amid predictions of global Armageddon. SARS subsided within a year, after killing just 774 people. In 2005 it was bird flu, described at the time by a United Nations official as being “like a combination of global warming and HIV/AIDS 10 times faster than it’s running at the moment.” The World Health Organization’s official forecast was 2 million

to 7.4 million dead. In fact, by late 2007, when the disease petered out, the death toll was roughly 200. In 2009 it was Mexican swine flu. WHO director general Margaret Chan said: “It really is all of humanity that is under threat during a pandemic.” The outbreak proved

to be a normal flu episode. ¶ The truth is, a new global pandemic is growing less likely, not more. Mass migration to cities means the opportunity for viruses to jump from wildlife to the human species has not risen and has possibly even declined, despite media hype to the contrary. Water- and insect-borne infections—generally the most lethal—are declining as living standards slowly improve. It’s true that casual-contact infections such as colds are thriving—but only by being mild enough that their victims can soldier on with work and social

engagements, thereby allowing the virus to spread. Even if a lethal virus does go global, the ability of medical science to sequence its genome and devise a vaccine or cure is getting better all the time.

B. Burnout and variation check

York 14 (Ian, head of the Influenza Molecular Virology and Vaccines team in the Immunology and Pathogenesis Branch of the Influenza Division at the CDC, PhD in Molecular Virology and Immunology from McMaster University, M.Sc. in Veterinary Microbiology and Immunology from the University of Guelph, former Assistant Prof of Microbiology & Molecular Genetics at Michigan State, “Why Don't Diseases Completely Wipe Out Species?” 6/4/2014, http://www.quora.com/Why-dont-diseases-completely-wipe-out-species)

But mostly diseases don't drive species extinct. There are several reasons for that. For one, the most dangerous diseases are those that spread from one individual to another. If the disease is highly lethal, then the population drops, and it becomes less likely that individuals will contact each other during the infectious phase.

Highly contagious diseases tend to burn themselves out that way.¶ Probably the main reason is variation. Within the host

and the pathogen population there will be a wide range of variants. Some hosts may be naturally resistant. Some pathogens will

be less virulent. And either alone or in combination, you end up with infected individuals who survive.¶ We see this in HIV, for example. There is a small fraction of humans who are naturally resistant or altogether immune to HIV, either because of their CCR5 allele or their MHC Class I type. And there are a handful of people who were infected with defective versions of HIV that didn't progress to disease. ¶ We can see indications of this sort of thing

happening in the past, because our genomes contain many instances of pathogen resistance genes that have spread through the whole population. Those all started off as rare mutations that conferred a strong selection advantage to the carriers, meaning that the specific infectious diseases were serious threats to the species.

Indicts

Skepticism

You should be generally skeptical of their evidence – it overstates the value of big data – studies prove data is mostly irrelevant now.

Aslett 13 (Matt, research director for 451 Research, formerly the Deputy Editor of monthly magazine Computer Business Review and ComputerWire's daily news service, “Big data reconsidered: it's the economics, stupid”, https://451research.com/report-short?entityId=79479&referrer=marketing)

For the past few years the data management industry has been in the grip of a fever related to 'big data' – a loosely defined term that has been used to describe analysis of large volumes of data, or analysis of unstructured data, or high-velocity data, or

social data, or predictive analytics, or exploratory analytics or all of the above – and more besides. The expectations for the potential of big data to revolutionize the data management and analytics industry are great and inflated, to the extent that it is easy to become disillusioned. A quick check of recent news headlines suggests that big data has the potential to solve world hunger, defeat terrorism, close the gender gap, bring about world peace, cure cancer and identify life on Mars. We don't doubt that data management and analytics will have a critical role to play in efforts related to all those issues, but there is clearly a gap between the potential of big data and the extent to which

related technologies have been adopted to date. For example, interviews from TheInfoPro, a service of 451 Research, with storage

professionals indicate that big data accounted for just 3% of the total data storage footprint in 2012 – and the exact same percentage in 2013. While we believe that the big data trend has the potential to revolutionize the IT industry by

enabling new business insight based on previously ignored and underutilized data, it is clear that big data is also massively over-hyped.

2NC—Data Localization

Uniqueness

Ext—Current Freedom Act Solves

New Freedom Act is sufficient to solve US’s global credibility gap.

HRW ‘15

Human Rights Watch is an independent, international organization that works as part of a vibrant movement to uphold human dignity and advance the cause of human rights for all. “Strengthen the USA Freedom Act” - May 19, 2015 - http://www.hrw.org/news/2015/05/19/strengthen-usa-freedom-act

As the Senate considers the USA Freedom Act this week, policymakers should strengthen it by limiting large-scale collection of records and reinforcing transparency

and carrying court reforms further. The Senate should also take care not to weaken the bill, and should reject any amendments that would require companies to retain personal data for longer than is necessary for business purposes. It has been two years since the National Security Agency (NSA) whistleblower Edward Snowden unleashed a steady stream of documents that exposed the intention by the United States and the United Kingdom to “collect it all” in the digital age. These revelations demonstrate how unchecked surveillance can metastasize and undermine democratic

institutions if intelligence agencies are allowed to operate in the shadows, without robust legal limits and oversight. On May 13, the US House of

Representatives approved the USA Freedom Act of 2015 by a substantial margin. The bill represents the latest attempt by Congress to rein in one of the surveillance programs Snowden disclosed—the NSA’s domestic bulk phone metadata collection under Section 215 of the USA Patriot Act. The House vote followed a major rebuke to the US government by the US Court of Appeals for the Second Circuit, which ruled on May 7 that the NSA’s potentially nationwide dragnet collection of phone records under Section 215 was unlawful. Section 215 is set to expire on June 1 unless Congress acts to extend it or to preserve specific powers authorized under the provision, which go beyond collection of phone records. Surveillance reforms are long overdue and can be accomplished while protecting US citizens from serious security threats. Congress and the Obama administration should end all mass surveillance programs, which unnecessarily and disproportionately intrude on the privacy of hundreds of millions of people who are not linked to wrongdoing. But reforming US laws and reversing an increasingly global tide of mass surveillance will not be easy. Many of the programs Snowden revealed are already deeply entrenched, with billions of dollars of infrastructure, contracts, and personnel invested. Technological capacity to vacuum up the world’s communications has outpaced existing legal frameworks meant to protect privacy. The Second Circuit opinion represents an improvement over current law because it establishes that domestic bulk collection of phone metadata under Section 215 of the Patriot Act cannot continue. Section 215 allows the government to collect business records, including phone records, that are “relevant” to an authorized investigation. The court ruled that the notion of “relevance” could not be stretched to allow intelligence agencies to gather all phone records in the US. However, the opinion could be overturned and two other appeals courts are also considering the legality of the NSA’s bulk phone records program. The opinion also does not address US surveillance of people not in the US. Nor does it question the underlying assumption that the US owes no privacy obligations to people outside its territory, which makes no sense in the digital age and is inconsistent with human rights law requirements. Even if the Second Circuit opinion remains good law, congressional action will be necessary to address surveillance programs other than Section 215—both domestic and those affecting people outside the US—and to create more robust institutional safeguards to prevent future abuses. The courts cannot bring about reforms to increase oversight and improve institutional

oversight on their own. Human Rights Watch has supported the USA Freedom Act because it is a modest, if incomplete, first step down the long road to reining in the NSA excesses. Beyond ending bulk records collection, the bill would begin to reform the secret Foreign Intelligence Surveillance Act (FISA) Court, which oversees NSA surveillance, and would introduce new transparency measures to improve oversight. In passing the bill, the House of Representatives also clarified that it intends the bill to be consistent with the Second Circuit’s ruling, so as to not weaken its findings.

The bill is no panacea and, as detailed below, would not ensure comprehensive reform. It still leaves open the possibility of large-scale data collection practices in the US under the Patriot Act. It does not constrain surveillance under Section 702 of the FISA Amendments Act nor Executive Order 12333, the primary legal authorities the government has used to justify mass surveillance of people outside US borders. And the bill does not address many modern surveillance capabilities, from mass cable tapping to use of malware, intercepting all mobile calls in a country, and compromising the security of mobile SIM

cards and other equipment and services. Nonetheless, passing a strong USA Freedom Act would be a long-overdue step in the right direction. It would show that Congress is willing and able to act to protect privacy and impose oversight over intelligence agencies in an age when technology makes ubiquitous surveillance possible. Passing this bill would also help shift the debate in the US and globally and would distance the United States from other countries that seek to make mass surveillance the norm. On a global level, other governments may already be emulating the NSA’s approach, fueling an environment of impunity for mass violations of privacy. In the last year, France, Turkey, Russia, and other countries have passed legislation to facilitate or expand large-scale surveillance. If the

USA Freedom Act passes, it would be the first time Congress has affirmatively restrained NSA activities since

the attacks of September 11. Key supporters of the bill have vowed to take up reforms to other laws next, including Section 702 of the FISA Amendments Act.

Internal Link

Ext—Data Doesn’t Solve Healthcare

Big data doesn’t work for disease—it only works as well as actual data collection.

Swift 14 - Janet Swift is a spreadsheets and statistics specialist for I-Programmer. (“Google Flu Trends Adopts New Model,” http://www.i-programmer.info/news/197-data-mining/7939-google-flu-trends-new-model.html 11/3/2014) STRYKER

Google Flu Trends is launching a new model in the United States for the coming 2014/2015 flu season. The

important difference is that it is going to incorporate CDC flu data - which rather ruins its original idea. Google Flu Trends (GFT) was launched in

2008 to predict how many cases of flu are likely to occur based on "aggregate search data". The premise used by the model was that there is a correlation between the number of cases of flu and the number of searches on the topic of flu. So rather than collect data from doctors and hospitals about people showing symptoms you can instead look for

searches using terms associated with flu such as "cough" or "fever". Initially the model worked well. Not only did it provide accurate estimates of

the number of cases of flu, it did so ahead of those from the CDC (Centers for Disease Control and Prevention). But over time Google's model started to overpredict the incidence of flu, due to what could be interpreted as a positive feedback effect. Heightened media attention to flu when the incidence of flu rises leads to more people googling flu related terms. For the 2012/2013 flu season the GFT prediction exceeded the number of "real" flu cases by 95%. Responding to the research that revealed this anomaly Google adjusted the model for the 2013/2014 flu season (see the details in Google Updates Flu Model) but it continued to overpredict. So a more drastic remedy was sought. According to Christian Stefansen, Senior Software Engineer, in a post on the Google Research blog announcing "brand new engine" for GFT, for the coming flu season in the US, Google is substituting a: "more robust model that learns continuously from official flu data". While this may well improve the model's accuracy, the fact that it uses actual data defeats the idea that flu could be predicted solely on the basis of Internet users search behavior. If the new model works well, it won't be nearly as interesting a finding as the success of the old model.

Big data analysis is ineffective—

A. Selection bias

Hoffman and Podgurski 13 - Sharona Hoffman is the Edgar A. Hahn Professor of Law and Professor of Bioethics and Co-Director of the Law-Medicine Center at Case Western Reserve University School of Law. Andy Podgurski is a Professor of Electrical Engineering and Computer Science at Case Western Reserve University. (“The Use and Misuse of Biomedical Data: Is Bigger Really Better?” American Journal of Law & Medicine, 39 Am. J. L. and Med. 497, 2013) STRYKER

If data subjects have the opportunity to opt out of inclusion in a database or if certain individuals' records are otherwise

excluded, a class of problems often called [*522] "selection bias" may arise. n227 Selection bias may occur when the subset of individuals studied is not representative of the patient population of interest. n228 This kind of selection bias could manifest, for example, if a disproportionate number of people of one ancestry or economic class opt out of participating in a database. n229 It can likewise exist if individuals with certain behavior traits that might be important in some studies--such as diet, exercise, smoking status, and alcohol or drug consumption--

choose not to participate or cannot access medical facilities in which studies take place. n230 Selection bias can distort assessments of measures such as disease prevalence or exposure risk because study estimates will differ systematically from the true values of these measures for the target population. n231

That is, the estimates will not be generalizable from the research subjects to the larger population about which analysts wish to draw

conclusions. n232 Another, more subtle kind of selection bias, which is also called "collider-stratification bias," n233 "collider-bias," n234 or "M-

bias," n235 is specific to causal-effect studies. n236 These studies typically seek to measure the average beneficial effect on patients of a particular treatment or the average harmful effect on individuals of a particular exposure. n237 Collider-stratification bias occurs in analyzing study data when the analysis is conditioned on (e.g., stratified by) one or more levels of a variable that is a common effect (a "collider") of both the

treatment/exposure variable and the outcome variable or that is a common effect of a cause of the treatment/exposure and a cause of the outcome. n238 Consider the following classic example. Commonly, some patients are lost to follow-up, and thus outcome measurements that would be essential for research purposes are unavailable. The data from these patients cannot be included in studies. Both the treatment and outcome at issue may influence which patients stop seeking medical care. Patients may fail to return for follow-up both because the treatment is unpleasant (treatment factor) and because they actually feel better and don't see a need to

return to their doctors (an outcome factor). The loss of these study subjects can create a spurious statistical association between the treatment/exposure variable and the outcome variable that becomes mixed with and distorts the true causal effect of the former on the latter. n239 Because collider-stratification bias is associated with [*523] the exclusion of some patients from a study, it is categorized as a type of selection bias. n240
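To make the selection-bias mechanism above concrete, here is a minimal Python sketch (our illustration, not the authors'); the 10 percent prevalence and the enrollment probabilities are invented for the example. It shows how a prevalence estimate drifts away from the true value when the decision to appear in a database depends on the very condition being measured.

# Illustrative sketch (hypothetical numbers): opt-out that depends on the
# condition being studied biases the observed prevalence downward.
import random

random.seed(1)
TRUE_PREVALENCE = 0.10  # assumed true prevalence in the population
population = [random.random() < TRUE_PREVALENCE for _ in range(1_000_000)]

def enrolls(has_condition):
    # Participation depends on the very thing being measured (assumed rates).
    return random.random() < (0.3 if has_condition else 0.6)

database = [x for x in population if enrolls(x)]
observed = sum(database) / len(database)
print(f"True prevalence:      {sum(population)/len(population):.3f}")
print(f"Observed in database: {observed:.3f}  (biased low)")

Running this yields an observed prevalence of roughly 0.05 against a true 0.10, which is the "differ systematically from the true values" problem the card describes.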

B. Confounding bias

Hoffman and Podgurski 13 - Sharona Hoffman is the Edgar A. Hahn Professor of Law and Professor of Bioethics and Co-Director of the Law-Medicine Center at Case Western Reserve University School of Law. Andy Podgurski is a Professor of Electrical Engineering and Computer Science at Case Western Reserve University. (“The Use and Misuse of Biomedical Data: Is Bigger Really Better?” American Journal of Law & Medicine, 39 Am. J. L. and Med. 497, 2013) STRYKER

In observational causal-effect studies, confounding bias (confounding) may be an even greater concern than selection bias. n241 "Classical" confounding occurs because of the presence of a common cause of the treatment/exposure variable and the outcome variable. n242 Confounding is different from collider-stratification bias because it involves a common cause of the treatment/exposure and outcome variables rather than a common effect of the variables. n243 The following hypothetical illustrates classical confounding. Suppose a physician's treatment choices are influenced by the severity or duration of a patient's disease, which also influence the outcome of treatment. n244 Thus, patients at a later stage of a disease may receive one treatment (treatment A) and those who are at an earlier stage may receive a different therapy (treatment B). At the same time, sicker patients may have worse

treatment outcomes than healthier individuals. Unless such a common cause, which is called a "confounding variable" or "confounder," is adjusted for appropriately during statistical data analysis, it may induce a spurious association between the treatment variable and the outcome variable, which distorts estimation of the true causal effects of treatments. n245 In other words, researchers may reach incorrect conclusions regarding the efficacy of the two treatments because of the confounding variable: the degree of sickness suffered by patients receiving the different therapies. Treatment A may appear to be less effective than treatment B not because it is in fact an inferior therapy but because so many of the patients receiving treatment A are in a late stage of the disease and would not do well no matter what treatment they received. This particular form of confounding, called "confounding by indication," is especially challenging to adjust for, because it may involve multiple factors

that influence physicians' treatment decisions. n246 Socioeconomic factors and patient lifestyle choices may also be confounders. Those who lack financial resources or adequate health coverage may select less expensive treatments not because those are the best choices for them but because those are the only affordable options. n247 Low income may also separately lead to poor health for reasons such as poor nutrition or financial stress. In the case of preventive care, a

treatment’s perceived benefits may be amplified because health-oriented individuals interested in the intervention also pursue exercise, low-fat diets, and other health-promoting behaviors. These patients' impressive outcomes thus would not be associated solely with the preventive measure. n248
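The treatment A/treatment B example can be reproduced in a short simulation. This is a hypothetical sketch under assumed numbers, not the authors' analysis: neither treatment has any real effect, but disease severity drives both the treatment choice and the outcome, so the naive comparison makes treatment A look worse until the comparison is stratified by severity.

# Illustrative sketch (hypothetical numbers): "confounding by indication."
# Sicker patients are more likely to receive treatment A, and sickness
# independently worsens outcomes, so a naive comparison is misleading.
import random

random.seed(0)
patients = []
for _ in range(100_000):
    severe = random.random() < 0.5                 # disease severity (the confounder)
    p_treat_a = 0.8 if severe else 0.2             # severity drives treatment choice
    treatment = "A" if random.random() < p_treat_a else "B"
    p_recover = 0.4 if severe else 0.9             # severity drives outcome; treatment does not
    patients.append((severe, treatment, random.random() < p_recover))

def recovery_rate(rows):
    return sum(recovered for _, _, recovered in rows) / len(rows)

naive_a = recovery_rate([p for p in patients if p[1] == "A"])
naive_b = recovery_rate([p for p in patients if p[1] == "B"])
print(f"Naive comparison:  A={naive_a:.2f}  B={naive_b:.2f}  (A looks worse)")

# Adjusting for the confounder (stratifying by severity) removes the spurious gap.
for severe in (True, False):
    stratum = [p for p in patients if p[0] == severe]
    a = recovery_rate([p for p in stratum if p[1] == "A"])
    b = recovery_rate([p for p in stratum if p[1] == "B"])
    print(f"Severity={severe}: A={a:.2f}  B={b:.2f}  (no real difference)")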

C. Measurement bias

Hoffman and Podgurski 13 - Sharona Hoffman is the Edgar A. Hahn Professor of Law and Professor of Bioethics and Co-Director of the Law-Medicine Center at Case Western Reserve University School of Law. Andy Podgurski is a Professor of Electrical Engineering and Computer Science at Case Western Reserve University. (“The Use and Misuse of Biomedical Data: Is Bigger Really Better?” American Journal of Law & Medicine, 39 Am. J. L. and Med. 497, 2013) STRYKER

Measurement biases arise from errors in measurement and data collection. n262 Observational study results may be compromised if the biomedical records that are analyzed contain such errors. Measurement

errors occur for a variety of reasons. Measurement instruments might not be calibrated properly or might lack sufficient sensitivity to detect differences in relevant variables. n263 Storage time or conditions for biological samples might be different and might affect study results. n264 To the extent that researchers solicit and record patients' own accounts and

memories, the subjects' ability to recall details may be influenced by the questioner's competence, patience, and apparent sympathy

or by the degree to which the patient perceives the topic to be important and relevant to her life. n265 In addition, patients may have impaired memories or may lie in response to questions if they are embarrassed about the truth. n266 Accurate measurement may be further hindered by incomplete, erroneous, or miscoded EHR data that obfuscates true values. n267 In causal-effect studies, errors in measurement of the treatment/exposure and the outcome are most problematic when they are associated (dependent) and when they are differential, that is, when the treatment affects the measurement error for the outcome or the outcome affects the measurement error for the treatment. n268 For example, differential measurement error could occur in a study of the effect of treatment A on dementia, if the use of A was determined only by interviewing study participants, because dementia affects subjects' ability to recall whether and how they were treated. n269

Mismeasurement of confounding variables also impedes adjustments intended to eliminate confounding bias. n270

Healthcare prediction can’t be scaled up – no motivation and structural problems.

Crockett 13 (David, Ph.D. from University of Colorado in medicine, Senior Director of Research and Predictive Analytics at Health Catalyst, “Using Predictive Analytics in Healthcare: Technology Hype vs. Reality”, https://www.healthcatalyst.com/predictive-analytics-healthcare-technology)

The buzzword fever around predictive analytics will likely continue to rise and fall. Unfortunately, lacking the proper infrastructure, staffing and resource to act when something is predicted with high certainty to happen, we fall short of the full potential of harnessing historic trends and patterns in patient data. In other words, without the willpower for clinical intervention, any predictor – no matter how good – is not fully utilized.

Ext—Big Data Bad

Big data is unnecessary—it’s not needed for important discoveries and is often a distraction.

Arbesman 13 - Samuel Arbesman, an applied mathematician and network scientist, is a senior scholar at the Ewing Marion Kauffman Foundation and the author of “The Half-Life of Facts.” (“Five myths about big data,” http://www.washingtonpost.com/opinions/five-myths-about-big-data/2013/08/15/64a0dd0a-e044-11e2-963a-72d740e88c12_story.html 8/16/2013) STRYKER

Big data holds the promise of harnessing huge amounts of information to help us better understand the world. But when talking about big data, there’s a tendency to fall into hyperbole. It is what compels contrarians to write such tweets as “Big Data, n.: the belief that any sufficiently

large pile of s--- contains a pony.” Let’s deflate the hype. 1. “Big data” has a clear definition. The term “big data” has been in circulation since at least the 1990s, when it is believed to have originated in Silicon Valley. IBM offers a seemingly simple definition: Big data

is characterized by the four V’s of volume, variety, velocity and veracity. But the term is thrown around so often, in so many contexts — science, marketing, politics, sports — that its meaning has become vague and ambiguous. There’s general agreement that ranking every page on the Internet according to relevance and searching the phone records of every Verizon customer in the United States qualify as applications of big data. Beyond that, there’s much debate. Does big data need to involve more information than can be processed by a single home computer? If so, marketing analytics wouldn’t qualify, and neither would most of the work done by Facebook. Is it still big data if it doesn’t use certain tools from the fields of artificial intelligence and machine learning? Probably. Should narrowly focused industry efforts to glean consumer insight from large datasets be grouped under the same term used to describe the sophisticated and varied things scientists are trying to do? There’s a lot of confusion, and industry experts and scientists often end

up talking past one another. 2. Big data is new. By many accounts, big data exploded onto the scene quite recently. “If wonks were fashionistas, big data would be this season’s hot new color,” a Reuters report quipped last year. In a May 2011 report, the McKinsey Global Institute declared big data “the next frontier for innovation, competition, and productivity.” It’s true that today we can mine massive amounts of data — textual, social,

scientific and otherwise — using complex algorithms and computer power. But big data has been around for a long time. It’s just that exhaustive datasets were more exhausting to compile and study in the days when “computer” meant a person who performed calculations. Vast linguistic datasets, for example, go back nearly 800 years. Early biblical concordances — alphabetical indexes of words in the Bible, along with their context — allowed for some of the same types of analyses found in modern-day textual data-crunching. The sciences also have been using big data for some time. In the early 1600s, Johannes Kepler used Tycho Brahe’s detailed astronomical dataset to elucidate certain laws of planetary motion. Astronomy in the age of the Sloan Digital Sky Survey is certainly different and more awesome, but it’s still astronomy. Ask statisticians, and they will tell you that they have been analyzing big data — or “data,” as they less redundantly call it — for centuries. As they like to argue, big data isn’t much

more than a sexier version of statistics, with a few new tools that allow us to think more broadly about what data can be and how we generate it. 3. Big data is revolutionary. In their new book, “Big Data: A Revolution That Will Transform How We Live, Work, and Think,” Viktor Mayer-Schonberger and Kenneth Cukier compare “the current data deluge” to the transformation brought about by the Gutenberg printing press. If you want more precise advertising directed toward you, then yes, big data is revolutionary. Generally, though, it’s likely to have a modest and gradual impact on our lives. When a phenomenon or an effect is large, we usually don’t need huge amounts of data to recognize it (and science has traditionally focused on these large effects). As things become more subtle, bigger data helps. It can lead us to smaller pieces of knowledge: how to tailor a product or how to treat a disease a little bit better. If those bits can help lots of people, the

effect may be large. But revolutionary for an individual? Probably not. 4. Bigger data is better. In science, some admittedly mind-blowing big-data

analyses are being done. In business, companies are being told to “embrace big data before your competitors do.” But big data is not automatically better. Really big datasets can be a mess. Unless researchers and analysts can reduce the number of variables and make the data more manageable, they get quantity without a whole lot of quality. Give me

some quality medium data over bad big data any day. And let’s not forget about bias. There’s a common misconception that throwing more data

at a problem makes it easier to solve. But if there’s an inherent bias in how the data are collected or examined, a bigger dataset doesn’t help. For example, if you’re trying to understand how people interact based on mobile phone data, a year of data rather than a month’s

worth doesn’t address the limitation that certain populations don’t use mobile phones. Many interesting questions can be explored with little datasets. Big data has refined our idea of six degrees of separation: Facebook has shown that it’s actually closer to four degrees. But the first six-

degrees study was done by psychologist Stanley Milgram using a lot of cleverness and a small number of postcards. Furthermore, although it’s exciting to have massive datasets with incredible breadth, too often they lack much in the way of a temporal dimension. To really understand a phenomenon, such as a social one, we need datasets with large historical sweep. We need long

data, not just big data. 5. Big data means the end of scientific theories. Chris Anderson argued in a 2008 Wired

essay that big data renders the scientific method obsolete: Throw enough data at an advanced machine-learning technique, and all the correlations and relationships will simply jump out. We’ll understand everything. But you can’t just go fishing for correlations and hope they will explain the world. If you’re not careful, you’ll end up with spurious correlations. Even more important, to contend with the “why” of things, we still need ideas, hypotheses and theories. If you don’t have good questions, your results can be silly and meaningless. Having more data won’t substitute for thinking hard, recognizing anomalies and exploring deep truths.

Impact

Ext—Disease Defense

Empirics and isolated populations prove.

Beckstead 14 (Nick, Research Fellow at the Future of Humanity Institute, citing Peter Doherty, recipient of the 1996 Nobel Prize for Medicine, PhD in Immunology from the University of Edinburgh, Michael F. Tamer Chair of Biomedical Research at St. Jude Children’s Research Hospital, “How much could refuges help us recover from a global catastrophe?” in Futures, published online 18 Nov 2014, Science Direct)

That leaves pandemics and cobalt bombs, which will get a longer discussion. While there is little published work on human extinction risk from pandemics, it seems that it would be extremely challenging for any pandemic—whether natural or manmade—to leave the people in a specially constructed refuge as the sole survivors. In his introductory book on pandemics

(Doherty, 2013, p. 197) argues:¶ “No pandemic is likely to wipe out the human species. Even without the protection provided by modern science, we survived smallpox, TB, and the plagues of recorded history.

Way back when human numbers were very small, infections may have been responsible for some of the

genetic bottlenecks inferred from evolutionary analysis, but there is no formal proof of this.”¶ Though some authors have vividly described worst-case scenarios for engineered pandemics (e.g. Rees, 2003 and Posner, 2004; and Myhrvold, 2013), it would take a special effort to infect people in highly isolated locations, especially the 100+ “largely uncontacted” peoples who prefer to be left alone. This is not to say it would be impossible. A madman intent on annihilating all human life could use cropduster-style delivery systems, flying over isolated peoples and infecting them. Or perhaps a pandemic could be engineered to be delivered through animal or environmental vectors that would reach all of these people.

Zoonotic diseases will not cause extinction – two reasons – missing a molecular signature and cross-reactive immunities check.

Palese 9 (Peter Palese, chairman of the department of microbiology at the Mount Sinai School of Medicine in New York, “Why Swine Flu Isn’t So Scary”, The Wall Street Journal, http://online.wsj.com/article/SB124122223484879119.html)

Still, there is more evidence that a serious pandemic is not imminent. In 1976 there was an outbreak of an H1N1 swine virus in Fort Dix, N.J., which showed human-to-human transmission but did not go on to become a highly virulent strain. This virus was very similar to

regular swine influenza viruses and did not show a high affinity for the human host. Although the swine virus currently circulating in humans is different from the 1976 virus, it is most likely not more virulent than the other seasonal strains we have experienced over the last several years. It lacks an important molecular signature (the protein PB1-F2) which was

present in the 1918 virus and in the highly lethal H5N1 chicken viruses. If this virulence marker is necessary for an influenza virus to become highly pathogenic in humans or in chickens -- and some research suggests this is the case -- then the current swine virus, like the 1976 virus, doesn't have what it takes to become a major killer. Since people have been exposed to H1N1 viruses over many

decades, we likely have some cross-reactive immunity against the swine virus. While it may not be sufficient to

prevent illness, it may very well dampen the impact of the virus on mortality. I would postulate that by virtue of this "herd immunity," even a 1918-like H1N1 virus could never have the horrific effect it had in the past. The most likely outcome is that the current swine virus will become another (fourth) strain of regular seasonal influenza.

1NC—Tech Industry

Uniqueness

Tech Industries Solve

Companies resolve perception link in the status quo.

Kendrick 15 (Katharine Kendrick is a policy associate for Internet communications technologies at the NYU Stern Center for Business and Human Rights, 2.19.15, “Risky Business: Data Localization” http://www.forbes.com/sites/realspin/2015/02/19/risky-business-data-localization/, ekr)

U.S. companies’ eagerness to please the EU affects their leverage in a place like Russia or China, and undermines their principled calls for a global Internet. Just as we’ve seen the emergence of company best practices to minimize how information is censored, we need best practices to minimize risks in where it is stored. Companies should take the following steps: Avoid localizing in a repressive country whenever possible. When Yahoo! entered Vietnam, to meet performance needs

without enabling the government’s Internet repression, it based its servers in Singapore. Explore global solutions. Companies like Apple and Google have started encrypting more data by default to minimize inappropriate access by any government. This doesn’t solve everything, but it’s a step forward for user privacy. Minimize exposure. If you must

have an in-country presence, take steps to minimize risk by being strategic in what staff and services you locate there. Embrace transparency. A growing number of companies have increased transparency by issuing reports on the number of government requests they receive. They should also publish legal requirements like localization, so that people understand the underlying risks

to their data. Work together. Companies should coordinate advocacy in difficult markets through organizations like the Global Network Initiative. Tech companies can take a proactive, collective approach, rather than responding reactively when their case hits the headlines. We can only expect localization demands to increase—and business pressures to pull in the opposite direction. While the political dynamics have shifted, companies should still have respect for human rights—and the strength of the global Internet—at the forefront of decisions over where to store their data.

Current Freedom Act Solves
Current Freedom Act restores confidence
CEA 15 (June 2, 2015, “Washington: CEA Praises Senate Passage of USA FREEDOM Act” http://www.ce.org/News/News-Releases/Press-Releases/2015-Press-Releases/CEA-Praises-Senate-Passage-of-USA-FREEDOM-Act.aspx, ekr)

The Consumer Electronics Association has issued the following news release: The following statement is attributed to Michael Petricone, senior vice president of government and regulatory affairs, Consumer Electronics Association (CEA)®, regarding the U.S. Senate’s passage of H.R. 2048, the Uniting and Strengthening America by Fulfilling Rights and Ensuring Effective Discipline Over Monitoring (USA FREEDOM) Act of 2015: “We welcome this important reform to U.S. intelligence gathering which takes critical steps to increase transparency and restore trust in American businesses, all while maintaining our commitment to preserving our national security. The bipartisan USA FREEDOM Act is common-sense reform to our nation’s intelligence gathering programs, which will preserve American businesses’ competitiveness worldwide, while continuing to protect our national security. “Following the Senate passage, the legislation now heads to the White House, where we anticipate swift action by President Obama to sign this legislation into law.”

Impact

Economy Defense
Scholars have been consistently wrong about economic decline; severe economic shocks have no real impact
Drezner 14 (Daniel W., professor of international politics at the Fletcher School of Law and Diplomacy at Tufts University. His latest book, The System Worked: How the World Stopped Another Great Depression, is just out from Oxford University Press; “The Uses of Being Wrong” http://www.lexisnexis.com.proxy2.cl.msu.edu/lnacui2api/results/docview/docview.do?docLinkInd=true&risb=21_T20276111299&format=GNBFI&sort=DATE,D,H&startDocNo=1&resultsUrlKey=29_T20276111283&cisb=22_T20276111282&treeMax=true&treeWidth=0&csi=171267&docNo=1)

My new book has an odd intellectual provenance-it starts with me being wrong. Back in the fall of 2008, I was convinced that the open global economic order, centered on the unfettered cross-border exchange of goods, services, and ideas, was about to collapse as quickly as Lehman Brothers. A half-decade later, the closer I looked at the performance of the system of global economic governance, the clearer it became that the meltdown I had expected had not come to pass . Though the advanced industrialized economies suffered prolonged economic slowdowns, at the global level there was no great surge in trade protectionism, no immediate clampdown on capital flows, and, most surprisingly, no real rejection of neoliberal economic principles . Given what has normally transpired after severe economic shocks, this outcome was damn near miraculous. Nevertheless, most observers have remained deeply pessimistic about the functioning of the global political economy. Indeed, scholarly books with titles like No One's World: The West, The Rising Rest, and the Coming Global Turn and The End of American World Order have come to a conclusion the opposite of mine. Now I'm trying to understand how I got the crisis so wrong back in 2008, and why so many scholars continue to be wrong now.

Global economic governance institutions guarantee resiliency
Daniel W. Drezner 12, Professor, The Fletcher School of Law and Diplomacy, Tufts University, October 2012, “The Irony of Global Economic Governance: The System Worked,” http://www.globaleconomicgovernance.org/wp-content/uploads/IR-Colloquium-MT12-Week-5_The-Irony-of-Global-Economic-Governance.pdf

Prior to 2008, numerous foreign policy analysts had predicted a looming crisis in global economic governance. Analysts only reinforced this perception since the financial crisis, declaring that we live in a “G-Zero” world. This paper takes a closer look at the global response to the financial crisis. It reveals a more optimistic picture. Despite initial shocks that were actually more severe than the 1929 financial crisis, global economic governance structures responded quickly and robustly. Whether one measures results by economic outcomes, policy outputs, or institutional flexibility, global economic governance has displayed surprising resiliency since 2008. Multilateral economic institutions performed well in crisis situations to reinforce open economic policies, especially in contrast to the 1930s. While there are areas where governance has either faltered or failed, on the whole, the system has worked. Misperceptions about global economic governance persist because the Great Recession has disproportionately affected the core economies – and because the efficiency of past periods of global economic governance has been badly overestimated. Why the system has worked better than expected remains an open question. The rest of this paper explores the possible role that the distribution of power, the robustness of international regimes, and the resilience of economic ideas might have played.

Competitiveness Defense
Emerging economies make competitiveness decline inevitable and alt causes overwhelm their internal link
CNN ‘10 [http://finance.fortune.cnn.com/2010/10/01/david-rubenstein-u-s-is-losing-its-competitive-edge/?section=magazines_fortune]

Ever since China's economy surpassed Japan's this past summer, speculation has escalated over when the country might take over the United States as the world's largest. The estimate has ranged from 2030 to 2035, the latter date being the one Carlyle Group co-founder David Rubenstein highlighted at a forum Wednesday in Washington DC of some of the day's biggest newsmakers.¶ Rubenstein says the U.S. faces the harsh possibility of losing some of its competitive edge amid the rapid rise of emerging economies – in particular, China. The U.S. overwhelmingly dominates the private equity and venture capital industries worldwide, the prominent investor notes. China and other emerging economies have become eager players and companies such as private equity firm Carlyle have increasingly been spending more time in these regions. To date, Carlyle has invested $3 billion in China, he says.¶ But several pressing factors are threatening America's competitive edge. Rubenstein lists huge deficits and government debt, high unemployment, and widening income disparities. ¶ His remarks echo what other business executives have said recently. In a report released by the World Economic Forum in August, the U.S. slipped a notch down the ranks of competitive economies – falling behind Sweden and Singapore, which rose to the No. 1 and No. 2 spots, respectively.¶ The report, which combines economic data and a survey of more than 13,500 business executives, commended the U.S. for its innovation, excellent universities and flexible labor market. But what has hurt America's competitiveness, in particular, is the country's huge deficits and rising government debt. While China ranked far below the U.S. at No. 27, the Asian powerhouse outperformed all major developing economies.¶ "We have to recognize as Americans that we're not going to be as dominant a force in the global economy as we have been," Rubenstein says, adding that unless the U.S. lowers its debts and deficits, improves joblessness and narrow widening income gaps, future generations will have a lower quality and less affluent lifestyle.¶ Facing the inevitable decline¶ Rubenstein couldn't have been more straight-to-the point about the depths of America's economic turmoil. But perhaps more important, he points out that it's virtually inevitable that China and even India might eventually surpass the U.S. economy – simply because they're just bigger, not necessarily richer.

No risk of economy or heg decline - competitiveness theory is false
Fallows ‘10 [James, correspondent for The Atlantic Monthly, studied economics at Oxford University as a Rhodes Scholar. He has been an editor of The Washington Monthly and of Texas Monthly, and from 1977 to 1979 he served as President Jimmy Carter's chief speechwriter. His first book, National Defense, won the American Book Award in 1981; he has written seven others. “How America Can Rise Again”, Jan/Feb edition, http://www.theatlantic.com/doc/201001/american-decline]

This is new. Only with America’s emergence as a global power after World War II did the idea of American “decline” routinely involve falling behind someone else. Before that, it meant falling short of expectations—God’s, the Founders’, posterity’s—or of the previous virtues of America in its lost, great days. “The new element in the ’50s was the constant

comparison with the Soviets,” Michael Kazin told me. Since then, external falling-behind comparisons have become not just a

staple of American self-assessment but often a crutch. If we are concerned about our schools, it is because children are learning more in Singapore or India; about the development of clean-tech jobs, because it’s happening faster in China. ¶ Having often lived outside the United States since the 1970s, I have offered my share of falling-behind analyses, including a book-length comparison of Japanese and American strengths (More Like Us) 20 years ago. But at this point in America’s national life cycle, I think the exercise is largely a distraction, and that

Americans should concentrate on what are, finally, our own internal issues to resolve or ignore. ¶ Naturally there are lessons to draw from other countries’ practices and innovations; the more we know about the outside world the better, as long as we’re collecting information calmly rather than glancing nervously at our reflected foreign image. For instance, if you have spent any time in places where tipping is frowned on or rare, like Japan or Australia, you view the American model of day-long small bribes, rather than one built-in full price, as something similar to baksheesh, undignified for all concerned. ¶ Naturally, too, it’s easier to draw attention to a domestic problem and build support for a solution if you cast the issue in us-versus-them terms, as a response to an outside threat. In If We Can Put a Man on the Moon …, their new book about making government programs more effective, William Eggers and John O’Leary emphasize the military and Cold War imperatives behind America’s space program. “The race to the moon was a contest between two systems of government,” they wrote, “and the question would be settled not by debate but by who could best execute

on this endeavor.” Falling-behind arguments have proved convenient and powerful in other countries, too. ¶ But whatever their popularity or utility in other places at other times, falling-behind concerns seem too common in America

now. As I have thought about why overreliance on this device increasingly bothers me, I have realized that it’s because my latest stretch out of the country has left me less and less interested in whether China or some other country is “overtaking” America. The question that matters is not whether America is “falling behind” but instead

something like John Winthrop’s original question of whether it is falling short—or even falling apart. This is not the mainstream

American position now, so let me explain. ¶ First is the simple reality that one kind of “decline” is inevitable and therefore not worth worrying about. China has about four times as many people as America does. Someday its economy will be larger than ours. Fine! A generation ago, its people produced, on average, about one-sixteenth as much as Americans did; now they produce about one-sixth. That change is a huge achievement for China—and a plus rather than a minus for everyone else, because a business-minded China is more benign than a miserable or rebellious one. When the Chinese produce one-quarter as much as

Americans per capita, as will happen barring catastrophe, their economy will become the world’s largest. This will be good for them but will not mean “falling behind” for us. We know that for more than a century, the consciousness of decline has been a blight on British politics, though it has inspired some memorable, melancholy literature. There is no reason for America to feel depressed about the natural emergence of China, India, and others as world powers. But second, and more important, America may have reasons to feel actively optimistic about its prospects in purely relative terms. ¶ The Crucial American Advantage ¶ Let’s start with the more modest claim, that China has ample reason to worry about its own future. Will the long-dreaded day of reckoning for Chinese development finally arrive because of environmental disaster? Or via the demographic legacy of the one-child policy, which will leave so many parents and grandparents dependent on so relatively few young workers? Minxin Pei, who grew up in Shanghai and now works at Claremont McKenna College, in California, has predicted in China’s Trapped Transition that within the next few years, tension between an open economy and a closed political system will become unendurable, and an unreformed Communist bureaucracy will finally drag down economic performance. ¶

America will be better off if China does well than if it flounders. A prospering China will mean a bigger world economy with more opportunities and probably less turmoil — and a China likely to be more cooperative on

environmental matters. But whatever happens to China, prospects could soon brighten for America. The American culture’s particular strengths could conceivably be about to assume new importance and give our economy new pep. International networks will matter more with each passing year. As the one truly universal nation, the United States continually refreshes its connections with the rest of the world—through languages, family, education, business—in a way no other nation does , or will. The countries that are comparably open—Canada, Australia—aren’t nearly as large; those whose economies are comparably large—Japan, unified Europe, eventually China or India—aren’t nearly as open. The simplest measure of whether a culture is dominant is whether outsiders want to be part of it. At the height of the British Empire, colonial subjects from the Raj to Malaya to the Caribbean modeled themselves in part on Englishmen: Nehru and Lee Kuan Yew went to Cambridge,

Gandhi, to University College, London. Ho Chi Minh wrote in French for magazines in Paris. These days the world is full of businesspeople, bureaucrats, and scientists who have trained in the United States. ¶ Today’s China attracts outsiders too, but in a particular way. Many go for business opportunities; or because of cultural fascination; or, as my wife and I did, to be on the scene where something truly exciting was under way. The Haidian area of Beijing, seat of

its universities, is dotted with the faces of foreigners who have come to master the language and learn the system. But true immigrants? People who want their children and grandchildren to grow up within this system? Although I met many foreigners who hope to stay in China indefinitely, in three years I encountered only two people who aspired to citizenship in the People’s Republic. From the physical rigors of a badly polluted and still-developing country, to the constraints on free expression and dissent, to the likely ongoing mediocrity of a university system that emphasizes volume of output over independence or excellence of research, the realities of China heavily limit the appeal of becoming Chinese. Because of its scale and internal diversity, China (like India) is a more racially open society than, say, Japan or Korea. But China has come nowhere near the feats of absorption and opportunity that make up much of America’s story, and it is very difficult to imagine that it could do so—well, ever. ¶ Everything we know about future industries and technologies suggests that they will offer ever-greater rewards to flexibility, openness, reinvention, “crowdsourcing,” and all other manifestations of individuals and groups keenly attuned to their surroundings. Everything about American society should be hospitable toward those traits—and should foster them better and more richly than other societies can. The American advantage here is broad and atmospheric, but it also depends on two specific policies that, in my view, are the absolute pillars of American strength: continued openness to immigration, and a continued concentration of

universities that people around the world want to attend. ¶ Maybe I was biased in how I listened, but in my interviews, I thought I could tell which Americans had spent significant time outside the country or working on international “competitiveness” issues. If they had, they predictably emphasized those same two elements of long-term American advantage. “My favorite statistic is that one-quarter of the members of the National Academy of Sciences were born abroad,” I was told by Harold Varmus, the president of the Memorial Sloan-Kettering Cancer Center and himself

an academy member (and Nobel Prize winner). “We may not be so good on the pipeline of producing new scientists, but the country is still a very effective magnet.” ¶ “We scream about our problems, but as long as we have the immigrants, and the universities, we’ll be fine,” James McGregor, an American businessman and author who has lived in China for years, told me. “I just wish we could put LoJacks on the foreign students to be sure they stay.” While, indeed, the United States benefits most when the best foreign students pursue their careers here, we come out ahead even if they depart, since they take American contacts and styles of thought with them. Shirley Tilghman, a research biologist who is now the president of Princeton, made a similar point more circumspectly. “U.S. higher education

has essentially been our innovation engine,” she told me. “I still do not see the overall model for higher education anywhere else that is better than the model we have in the United States, even with all its challenges at

the moment.” Laura Tyson, an economist who has been dean of the business schools at UC Berkeley and the University of London, said, “It can’t be a coincidence that so many innovative companies are located where they are”—in California, Boston, and other university centers. “There is not another country’s system that does as well—although

others are trying aggressively to catch up.” ¶ Americans often fret about the troops of engineers and computer scientists marching out of Chinese universities. They should calm down. Each fall, Shanghai’s Jiao Tong University produces a ranking of the world’s universities based mainly on scientific-research papers. All such rankings are

imprecise, but the pattern is clear. Of the top 20 on the latest list, 17 are American, the exceptions being Cambridge (No. 4), Oxford (No. 10), and the University of Tokyo (No. 20). Of the top 100 in the world, zero are Chinese. ¶ “On paper, China has the world’s largest higher education system, with a total enrollment of 20 million full-time tertiary students,” Peter Yuan Cai, of the Australian National University in

Canberra, wrote last fall. “Yet China still lags behind the West in scientific discovery and technological innovation.” The obstacles for Chinese scholars and universities range from grand national strategy—open economy, closed

political and media environment—to the operational traditions of Chinese academia. Students spend years cramming details for memorized tests; the ones who succeed then spend years in thrall to entrenched professors. Shirley Tilghman said the modern American model of advanced research still shows the influence of Vannevar Bush, who directed governmental science projects during and after World War II. “It was his very conscious decision to get money into young scientists’ hands as quickly as possible,” she said. This was in contrast to the European “Herr Professor” model, also prevalent in Asia, in which, she said, for young scientists, the “main opportunity for promotion was waiting for their mentor to die.” Young Chinese, Indians, Brazilians, Dutch know they will have opportunities in American labs and start-ups they could not have at home. This will remain America’s advantage, unless we throw it away.

Internal Link

Alt causes
Alt cause to tech innovation decline
Jayakumar 14 (Amrita Jayakumar, staff writer, quoting Frank Kendall, the Pentagon’s top official, on military innovation, “Kendall, Hagel stress innovation to maintain military superiority”, http://www.washingtonpost.com/business/capitalbusiness/kendall-hagel-stress-innovation-to-maintain-military-superiority/2014/09/04/8a10e984-3464-11e4-a723-fa3895a25d02_story.html ekr)

Kendall offered few details about the program, which he plans to elaborate on soon. In his speech, Kendall said the acquisition process, which has been blamed for slowing down the pace of government programs, was not as big a concern as investment in new technology, especially in light of foreign competition. Russia and China are “building things that are designed to be effective against the power projection capabilities of the United States and of our allies,” he said. “And they’re doing a reasonably good job of it, particularly China.” The shrinking defense budget and cuts to research and development in particular are a source of deep concern to him, Kendall said. Such cuts were tantamount to “delaying modernization,” he said. “As we delay modernization, we basically lose the time that it takes us to get things into the force,” he said. Kendall also added that the Pentagon’s budget would try and invest more in technology that moves capabilities forward. His remarks echoed those of Defense Secretary Chuck Hagel, who spoke on the same topic at a conference in Rhode Island Wednesday. Hagel also stressed the need for American companies to innovate in order to keep pace with the rest of the world. “We cannot assume, as we did in the 1950s and ’70s, that the Department of Defense will be the sole source of key breakthrough technologies,” he said.

reverse framing
Overall economic stagnation frames tech competitiveness, not the other way around
Porter and Rivkin 12 (Michael Porter is an economist, researcher, author, advisor, speaker and teacher. Throughout his career at Harvard Business School, he has brought economic theory and strategy concepts to bear on many of the most challenging problems facing corporations, economies and societies, including market competition and company strategy, economic development, the environment, and health care. His extensive research is widely recognized in governments, corporations, NGOs, and academic circles around the globe. His research has received numerous awards, and he is the most cited scholar today in economics and business. Jan W. Rivkin is the Senior Associate Dean for Research and a Professor in the Strategy Unit at Harvard Business School. His research, course development, and teaching efforts examine the interactions across functional and product boundaries within a firm – that is, the connections that link marketing, production, logistics, finance, human resource management, and other parts of a firm. //From the March 2012 edition of the Harvard Business Review, “The Looming Challenge to U.S. Competitiveness” https://hbr.org/2012/03/the-looming-challenge-to-us-competitiveness)

This erosion reflects troubling trends in many of the factors that underpin U.S. competitiveness. This set of factors, as identified in the work of Michael Porter, Mercedes Delgado, Christian Ketels, and Scott Stern, includes macro and micro components. From a macro perspective, a competitive nation requires sound monetary and fiscal policies (such as manageable government debt levels), strong human development (good health care and K–12 education systems), and effective political institutions (rule of law and effective law-making bodies). Macro foundations create the potential for long-term productivity, but actual productivity depends on the microeconomic conditions that affect business itself. A competitive nation exhibits a sound business environment (including modern transport and communications infrastructure, high-quality research institutions, streamlined regulation, sophisticated local consumers, and effective capital markets) as well as strong clusters of firms and supporting institutions in particular fields, such as information technology in Silicon Valley and energy in Houston. Competitive nations develop companies that adopt advanced operating and management practices. In a large country like the U.S., many of the most important drivers of competitiveness rest at the regional and local levels, not the national level. Though federal policies surely matter, microeconomic drivers tied to regions—such as roads, universities, pools of talent, and cluster specialization—are crucial. Assessing the U.S. through this lens, we see significant cracks in its economic foundations, with particularly troubling deterioration in macro competitiveness. Problems include levels of government debt not seen since World War II; health care and primary education systems whose results are neither world-class nor reflective of the large sums spent on them; and a polarized and often paralyzed political system (especially at the federal level) that makes decisions only when facing a crisis. In micro competitiveness, eroding skills in the workplace, inadequate physical infrastructure, and rising regulatory complexity increasingly offset traditional strengths such as innovation and entrepreneurship. Our HBS alumni survey provided an original and timely assessment of overall competitiveness and the strengths and weaknesses of the U.S. The findings were sobering. (See the chart “Evaluating the U.S. Business Environment,” in the article “Choosing the United States,” HBR March 2012.) Respondents perceived the United States as already weak and in decline with respect to a range of important factors: the complexity of the national tax code, the effectiveness of its political system, basic education, macroeconomic policies, and regulation. Some current American strengths, such as logistics and communications infrastructure and workforce skill levels, were seen as declining. America’s unique strengths in entrepreneurship, higher education, and management quality were intact, but these strengths must overcome growing weaknesses in many other areas.

tech not key to competitiveness
Technological innovation isn’t key to competition—the Halo effect is false
Easterly 9 - William Easterly is Professor of Economics at New York University and Co-director of the NYU Development Research Institute, which won the 2009 BBVA Frontiers of Knowledge in Development Cooperation Award. (“Tiger Woods thoughtfully explodes “Halo Effect” myth in development,” http://aidwatchers.com/2009/12/tiger-woods-thoughtfully-offers-to-explode-%E2%80%9Chalo-effect%E2%80%9D-myth-in-development/ 12/14/2009) STRYKER

Our expectation that celebrities will be model citizens, contrary to vast evidence, is based on the Halo Effect. The Halo Effect is the idea that someone that is really, really good at one thing will also be really good at other things. We thought because Tiger was so good at being a golfer, he also must be very good at to have and to hold, forsaking all others, keeping thee only unto her as long as you both shall live… What Tiger considerately did for our education was to show how the Halo Effect is a myth. This blog has a undying affection for those psychological foibles that cause us to strongly believe in mythical things, and the Halo Effect is a prime example (and the subject of a whole book on its destructive effects in business.) Why would marital fidelity and skillful putting have any correlation? OK fine and good, but many of you are asking: What the Vegas Cocktail Waitress does this have to do with development? The Halo Effect was discussed in a previous blog, but when assaulting psychological biases, you can never repeat the attack enough. Not to mention that we all remember the psychology literature more easily when illustrated by a guy with 10 mistresses. So if we observe a country is good at say, technological innovation, we assume that this country is also good at other good things like, say, visionary leadership, freedom from corruption, and a culture of trust. Since the latter three are imprecise to measure (and the measures themselves may be contaminated by the Halo Effect), we lazily assume they are all good. But actually, there are plenty of examples of successful innovators with mediocre leaders, corruption, and distrustful populations. The US assumed world technological leadership in the late 19th century with presidents named Chester Arthur and Rutherford B. Hayes, amidst legendary post-Civil War graft. Innovators include both trusting Danes and suspicious Frenchmen. The false Halo Effect makes us think we understand development more than we really do, when we think all good things go together in the “good” outcomes. The “Halo Effect” puts heavy weight on some explanations like “visionary leadership” that may be spurious. More subtly, it leaves out the more complicated cases of UNEVEN determinants of success: why is New York City the world’s premier city, when we can’t even manage decent airports (with 3 separate failed tries)? The idea that EVERYTHING is a necessary condition for development is too facile. The principles of specialization and comparative advantage suggest you DON’T have to be good at everything all the time.

2NC—Tech Industry

Uniqueness

Ext—Current Freedom Act Solves
New Freedom Act is sufficient to solve US’s global credibility gap.
HRW ‘15

Human Rights Watch is an independent, international organization that works as part of a vibrant movement to uphold human dignity and advance the cause of human rights for all. “Strengthen the USA Freedom Act” - May 19, 2015 - http://www.hrw.org/news/2015/05/19/strengthen-usa-freedom-act

As the Senate considers the USA Freedom Act this week, policymakers should strengthen it by limiting large-scale collection of records and reinforcing transparency

and carrying court reforms further. The Senate should also take care not to weaken the bill, and should reject any amendments that would require companies to retain personal data for longer than is necessary for business purposes. It has been two years since the National Security Agency (NSA) whistleblower Edward Snowden unleashed a steady stream of documents that exposed the intention by the United States and the United Kingdom to “collect it all” in the digital age. These revelations demonstrate how unchecked surveillance can metastasize and undermine democratic

institutions if intelligence agencies are allowed to operate in the shadows, without robust legal limits and oversight. On May 13 , the US House of

Representatives approved the USA Freedom Act of 2015 by a substantial margin. The bill represents the latest attempt by Congress to rein in one of the surveillance programs Snowden disclosed—the NSA’s domestic bulk phone metadata collection under Section 215 of the USA Patriot Act. The House vote followed a major rebuke to the US government by the US Court of Appeals for the Second Circuit, which ruled on May 7 that the NSA’s potentially nationwide dragnet collection of phone records under Section 215 was unlawful. Section 215 is set to expire on June 1 unless Congress acts to extend it or to preserve specific powers authorized under the provision, which go beyond collection of phone records. Surveillance reforms are long overdue and can be accomplished while protecting US citizens from serious security threats. Congress and the Obama administration should end all mass surveillance programs, which unnecessarily and disproportionately intrude on the privacy of hundreds of millions of people who are not linked to wrongdoing. But reforming US laws and reversing an increasingly global tide of mass surveillance will not be easy. Many of the programs Snowden revealed are already deeply entrenched, with billions of dollars of infrastructure, contracts, and personnel invested. Technological capacity to vacuum up the world’s communications has outpaced existing legal frameworks meant to protect privacy. The Second Circuit opinion represents an improvement over current law because it establishes that domestic bulk collection of phone metadata under Section 215 of the Patriot Act cannot continue. Section 215 allows the government to collect business records, including phone records, that are “relevant” to an authorized investigation. The court ruled that the notion of “relevance” could not be stretched to allow intelligence agencies to gather all phone records in the US. However, the opinion could be overturned and two other appeals courts are also considering the legality of the NSA’s bulk phone records program. The opinion also does not address US surveillance of people not in the US. Nor does it question the underlying assumption that the US owes no privacy obligations to people outside its territory, which makes no sense in the digital age and is inconsistent with human rights law requirements. Even if the Second Circuit opinion remains good law, congressional action will be necessary to address surveillance programs other than Section 215—both domestic and those affecting people outside the US—and to create more robust institutional safeguards to prevent future abuses. The courts cannot bring about reforms to increase oversight and improve institutional

oversight on their own. Human Rights Watch has supported the USA Freedom Act because it is a modest, if incomplete, first step down the long road to reining in the NSA excesses. Beyond ending bulk records collection, the bill would begin to reform the secret Foreign Intelligence Surveillance Act (FISA) Court, which oversees NSA surveillance, and would introduce new transparency measures to improve oversight. In passing the bill, the House of Representatives also clarified that it intends the bill to be consistent with the Second Circuit’s ruling, so as to not weaken its findings.

The bill is no panacea and, as detailed below, would not ensure comprehensive reform. It still leaves open the possibility of large-scale data collection practices in the US under the Patriot Act. It does not constrain surveillance under Section 702 of the FISA Amendments Act nor Executive Order 12333, the primary legal authorities the government has used to justify mass surveillance of people outside US borders. And the bill does not address many modern surveillance capabilities, from mass cable tapping to use of malware, intercepting all mobile calls in a country, and compromising the security of mobile SIM

cards and other equipment and services. Nonetheless, passing a strong USA Freedom Act would be a long-overdue step in the right direction. It would show that Congress is willing and able to act to protect privacy and impose oversight over intelligence agencies in an age when technology makes ubiquitous surveillance possible. Passing this bill would also help shift the debate in the US and globally and would distance the United States from other countries that seek to make mass surveillance the norm. On a global level, other governments may already be emulating the NSA’s approach, fueling an environment of impunity for mass violations of privacy. In the last year, France, Turkey, Russia, and other countries have passed legislation to facilitate or expand large-scale surveillance. If the

USA Freedom Act passes, it would be the first time Congress has affirmatively restrained NSA activities since

the attacks of September 11. Key supporters of the bill have vowed to take up reforms to other laws next, including Section 702 of the FISA Amendments Act.

Internal Link

Ext—Alt Cause
Tons of insurmountable alt causes to competitiveness
Porter and Rivkin 12 (Michael Porter is an economist, researcher, author, advisor, speaker and teacher. Throughout his career at Harvard Business School, he has brought economic theory and strategy concepts to bear on many of the most challenging problems facing corporations, economies and societies, including market competition and company strategy, economic development, the environment, and health care. His extensive research is widely recognized in governments, corporations, NGOs, and academic circles around the globe. His research has received numerous awards, and he is the most cited scholar today in economics and business. Jan W. Rivkin is the Senior Associate Dean for Research and a Professor in the Strategy Unit at Harvard Business School. His research, course development, and teaching efforts examine the interactions across functional and product boundaries within a firm – that is, the connections that link marketing, production, logistics, finance, human resource management, and other parts of a firm. //From the March 2012 edition of the Harvard Business Review, “The Looming Challenge to U.S. Competitiveness” https://hbr.org/2012/03/the-looming-challenge-to-us-competitiveness ekr)

To support the interpretation that America’s problems are cyclical, not structural, one could point to the facts that labor productivity has held up in America and corporate profits hit record highs in 2010. Unfortunately, that snapshot masks deeper signs of an incipient competitiveness problem—one that began before the Great Recession and in some ways contributed to it. The problem shows up in a range of economic performance measures as well as in the trajectories of the underlying factors that drive competitiveness. Productivity. America’s long-run rate of growth in labor productivity was strong relative to that of other advanced economies in the late 1990s and early 2000s, but it began to trail off before the financial crisis. Productivity has been sustained since the crisis largely by rising unemployment and falling workforce participation, ominous signs for U.S. competitiveness. Job creation. Even more unsettling is the country’s job-creation picture. Long-term growth in private-sector employment has dipped to historically low levels, a trend that started well before the Great Recession. (See the exhibit “Disappearing Job Growth.”) In industries exposed to international competition, job growth has virtually stopped. Wages. American wages have been under pressure for more than a decade. In 2007, before the downturn, U.S. median household income stood below 1999 levels in real terms—and has fallen even more since. In the two decades prior to 2007, median income grew, but at an anemic annual rate of just 0.5%. Most affected have been middle- and lower-income workers, many of whom are more exposed to international competition today than ever before.

Impact

Ext—Economy Defense
Econ is resilient
Oliver ‘9 (Business columnist for the Star, a Canadian newspaper, “David Olive: Will the economy get worse?,” http://www.thestar.com/printArticle/598050, AM)

Should we brace for another Great Depression ? No . The notion is ludicrous. Conditions will forever be such that the economic disaster that helped define the previous century will never happen again . So

why raise the question? Because it has suited the purposes of prominent folks to raise the spectre of a second Great Depression. Stephen Harper has speculated it could happen. Barack Obama resorted to apocalyptic talk in selling his economic stimulus package to the U.S. Congress. And British author Niall Ferguson, promoting his book on the history of money, asserts "there will be blood in the streets" from the ravages dealt by this downturn. Cue the famished masses' assault on a latter-day

Bastille or Winter Palace. As it happens, the current economic emergency Obama has described as having no equal since the

Great Depression has not yet reached the severity of the recession of 1980-82, when U.S. unemployment

reached 11 per cent. The negativism has become so thick that Robert Shiller was prompted to warn against it in a recent New York Times essay. Shiller, recall, is the Yale economist and author of Irrational Exuberance who predicted both the dot-com collapse of the late 1990s

and the likely grim outcome of a collapse in the U.S. housing bubble. Shiller worries that the Dirty Thirties spectre "is a cause of the current situation – because the Great Depression serves as a model for our expectations, damping

what John Maynard Keynes called our `animal spirits,' reducing consumers' willingness to spend and businesses' willingness to hire and expand. The Depression narrative could easily end up as a self-fulfilling prophecy."

Some relevant points, I think: LOOK AT STOCKS Even the prospects of a small -d depression – defined by most economists

as a 10 per drop in GDP for several years – are slim. In a recent Wall Street Journal essay, Robert J. Barro, a Harvard economist,

described his study of 251 stock-market crashes and 97 depressions in 34 nations dating back to the mid-19th century. He notes that only mild recessions followed the U.S. stock-market collapses of 2000-02 (a

42 per cent plunge) and 1973-74 (49 per cent). The current market's peak-to-trough collapse has been 51 per cent. Barro concludes the probability today of a minor depression is just 20 per cent, and of a major depression , only 2 per cent . LOOK AT JOBS NUMBERS In the Great Depression, GDP collapsed by 33 per cent, the jobless rate was

25 per cent, 8,000 U.S. banks failed, and today's elaborate social safety net of state welfare provisions did not exist. In the current downturn, GDP in Canada shrank by 3.4 per cent in the last quarter of 2008, and in the U.S. by 6.2 per cent. A terrible

performance, to be sure. But it would take another 10 consecutive quarters of that rate of decline to lose even the 10 per cent of GDP that qualifies for a small-d depression. Allowing that 1,000 economists laid end to end still wouldn't reach a conclusion, their consensus view is economic recovery will kick in next year, if not the second half of this

year. The jobless rate in Canada and the U.S. is 7.2 per cent and 8.1 per cent , respectively. Again , the consensus among experts is that a worst-case scenario for U.S. joblessness is a peak of 11 per cent . There have been no bank failures in Canada. To the contrary, the stability of Canadian banks has lately been acclaimed worldwide. Two of America's

largest banks, Citigroup Inc. and Bank of America Corp., are on government life support. But otherwise the rate of collapse of U.S. lenders outside of the big "money centre " banks at the heart of the housing-related financial crisis has been only modestly higher than is usual in recessionary times. LOOK AT INTERVENTIONS In the Great Depression, Herbert Hoover and R.B. Bennett, just prior to the appearance of the Keynesian pump-priming theories that would soon dominate modern economic management, obsessed with balanced budgets, seizing upon precisely the wrong cure. They also moved very slowly to confront a crisis with no precedent. (So did Japan's economic

administrators during its so-called "lost decade" of the 1990s.) Most earlier U.S. "panics" were directly tied to abrupt collapses in stock or commodity values not accompanied by the consumer-spending excesses of the Roaring Twenties and greatly exacerbated by a 1930s global trade war. Today, only right-wing dead-enders advance balanced budgets as a balm in this hour of economic emergency. In this downturn, governments from Washington to Ottawa to Beijing have been swift in crafting Keynesian stimulus packages. Given their recent legislative passage – indeed, Harper's

stimulus package awaits passage – the beneficial impact of these significant jolts is only beginning to be felt. And,

if one believes, as I long have, that this is a financial crisis – the withholding of life-sustaining credit from the economy by a crippled

global banking system – and not a crisis with origins on Main Street, then the resolution to that banking failure may trigger a much faster and stronger economic recovery than anyone now imagines. tune out the static It's instructive that there was much talk of another Great Depression during the most painful recession since World War II, that of 1980-82. Indeed, alarmist talk about global systemic collapses has accompanied just about every abrupt unpleasantness , including the Latin American debt crisis of the 1980s, the Mexican default in 1995, the Asian currency crisis of the late 1990s , financial havoc in Argentina early this decade,

and even the failure of U.S. hedge fund Long-Term Capital Management in the late 1990s. Modern economic

recoveries tend to be swift and unexpected . The nadir of the 1980-82 downturn, in August 1982, kicked off the greatest stock-market and economic boom in history. And no sooner had the dot-com and telecom wreckage been cleared away, with the Dow Jones Industrial Average bottoming out at 7,286 in October 2002, than the

next stock boom was in high gear. It reached its peak of 14,164 – 2,442 points higher than the previous high, it's worth noting – just five years later. look at the big picture Finally, the case for a sustained economic miasma is difficult to make. You'd have to believe that the emerging economic superpowers of China and India will remain for years in the doldrums to which they've recently succumbed ; that oil, steel, nickel, wheat and other commodities that only last year skyrocketed in price will similarly fail to recover, despite continued global population growth, including developing world economies seeking to emulate the Industrial Revolutions in China and South Asia.

US not key to global economy – decoupling is for real
Wassener 9 (Bettina Wassener, MSc in International Relations, London School of Economics and Political Science, “In Asia, a Derided Theory Returns,” 1 July 2009, http://query.nytimes.com/gst/fullpage.html?res=9C0CEFDE163EF932A35754C0A96F9C8B63)

HONG KONG -- For a while, when the global economic crisis was at its worst, it was a dirty word that only the most provocative of analysts dared to use. Now, the D-word -- decoupling -- is making a comeback, and nowhere more so than in Asia. Put simply, the term refers to the theory that emerging countries -- whether China or Chile -- will become more independent of the ups and downs in the United States as their

economies become stronger and more sophisticated. For much of last year, the theory held up. Many emerging economies had

steered clear of investments that dragged down a string of banking behemoths in the West , and saw nothing like the turmoil that began to engulf the United States and Europe in 2007. But then, last autumn, when the collapse of Lehman Brothers caused the global financial system to convulse and consumer demand to shrivel, emerging economies around the world got caught in the downdraft, and the D-word became mud. Now, the tables are turning again, especially in Asia, where many emerging economies are showing signs of a stronger recovery than in the West. And economists here have begun to use the D-word in public once again.

''Decoupling is happening for real ,'' the chief Asia-Pacific economist at Goldman Sachs in Hong Kong , Michael Buchanan, said in a recent interview. Or as the senior Asia economist at HSBC, Frederic Neumann, said, ''Decoupling is not a dirty word.'' To be sure, the once sizzling pace of Asian economic growth has slowed sharply as exports to and investments from outside the region slumped. Across Asia, millions of people have lost their jobs as business drops off and companies cut costs and output. Asia is heavily dependent upon selling its products to consumers in the United States and Europe, and many executives still say a strong U.S. economy is a

prerequisite for a return to the boom of years past. Nevertheless, the theory of decoupling is back on the table. For the past couple of months, data from around the world have revealed a growing divergence between Western economies and those in much of Asia , notably China and India. The World Bank last week forecast that the economies of the euro zone and the United States would contract 4.5 percent and 3 percent, respectively, this year -- in sharp contrast to the 7.2 percent and 5.1

percent economic growth it forecasts for China and India. Forecasts from the Organization for Economic Cooperation and Development that were also published last week backed up this general trend. Major statistics for June, due Wednesday, are expected to show manufacturing activity in China and India are on the mend. By contrast, purchasing managers' indexes for

Europe and the United States are forecast to be merely less grim than before but still show contractions. Why this diverging picture? The crisis hit Asia much later. While the U.S. economy began languishing in 2007, Asian economies were still doing well right up until the collapse of Lehman Brothers last September. What followed was a rush of stimulus measures -- rate cuts and government spending programs. In Asia's case, these came soon after things soured for the region; in the United

States, they came much later in the country's crisis. Moreover, developing Asian economies were in pretty good shape when the crisis struck. The last major crisis to hit the region -- the financial turmoil of 1997-98 -- forced governments in Asia to introduce overhauls that ultimately left them with lower debt levels, more resilient banking and regulatory systems and often large foreign

exchange reserves. Another crucial difference is that Asia, unlike the United States and Europe, has not had a banking crisis. Bank

profits in Asia have plunged and some have had to raise extra capital but there have been no major collapses and no bailouts. ''The single most important thing to have happened in Asia is that there has not been a banking crisis,'' said Andrew Freris, a regional strategist at BNP Paribas in Hong Kong. ''Asia is coming though this crisis with its banking system intact. Yes, some banks may not be

making profits -- but it is cyclical and not systemic.'' The lack of banking disasters also has meant that, unlike in Europe and the

United States, Asian governments have not had to spend cash to clean the balance sheets of faltering banks. In other words, all of the stimulus spending is going into the economy. The effect is greater and more immediate. Add to that the fact that companies and households in Asia are typically not burdened with the kind of debt that is forcing Americans and Europeans to cut back consumption and investment plans. Asians are generally big savers; those in developing nations have limited health care and pension systems to fall back on. So they put aside cash for retirement, sickness and their children's education, rather than maxing out multiple credit cards. Paul Schulte of Nomura said this difference was leading to a long-term shift.

Ext—Competitiveness Defense