
Source: people.uwplatt.edu/~shiy/courses/se333/as2.pdf · 2018-04-23

SE 3330 Homework 2: SE Code of Ethics Case Study

[15 points]

Here are two case studies: Google’s ethical dilemmas in Europe and Facebook’s recent Cambridge Analytica case. Please see the attached documents.

Requirement:

This is a group assignment. Choose the Google case if your team number is odd and the Facebook case if your team number is even. Please review lecture note 13 and the ACM Software Engineering Code of Ethics. Create a PowerPoint presentation with the following contents:

1. A brief overview of the case.

2. Related SE Code of Ethics.

3. How to apply the code to analyze the case.

4. Conclusions.

During class, instead of going over the code one more time, we will have a discussion session.

I will randomly choose a team to present each case.

The assignment will be graded based on both content and design. Make your presentation neat and concise.

If you need additional explanations, put them as notes. Submit your presentation to K:\Courses\CSSE\shiy\se333\1Dropbox with the name GroupX_hw2.pptx. List all team members on the title page of your presentation.


The Google in Europe Case

Copyright 2010 by

Robert N. Barger, Ph.D.

University of Notre Dame, Retired

One of the problems of being a multinational corporation with the world’s largest search engine is the danger of running afoul of privacy laws, which are handled in different ways by different European nations. An example of this occurred when officials from 30 European nations adopted procedures in May of 2010 which could force Google Europe to modify its “Street View” software (software which presents 360-degree street-level maps). The officials were worried that private images such as photos of individuals and vehicle license plates could be exposed to the public without their owners’ permission.1

In this example, and the two examples that follow, it should be remembered that law and ethics are not the same thing. Something can be legal but not ethical or, by the same token, ethical but not legal. The two are sometimes assumed to be identical because a society usually tends to make laws which express the ethical ideals of the majority of its members.

Another example involving Google in Europe occurred when three Google executives in Italy were sentenced in February of 2010 to six-month suspended sentences by an Italian judge for violation of Italian privacy laws. The judge stated in his opinion that the Internet was not an “unlimited prairie where everything is permitted and nothing can be prohibited.” He indicated that the Google executives made it possible for Google to profit from a video of an autistic boy being bullied by classmates. Not surprisingly, Google did not agree with the judge’s reasoning. It issued a statement saying: “This conviction attacks the very principles on which the Internet is built. If these principles are swept aside then the Web as we know it will cease to exist, and many of the economic, social, political and technological benefits it brings could disappear.” The video in question was viewed over 5,500 times in a two-month period and made it to the


top of Google Italy’s “most entertaining” video list. It was removed only after an Italian association representing people with Down syndrome, whose organizational name was mentioned in the video, complained to the police.2

A third example, this time involving monopoly charges against Google France, ended with the French Competition Authority officially declaring that Google was a monopoly acting in restraint of trade. It stated: “Google holds a dominant position on the advertising market related to online searches. Its search engine enjoys a wide popularity and currently totals around 90 percent of the Web searches made in France. Moreover, there are strong barriers to entry for this activity.”3

Google had done business with a French company named Navx, which sells a database that can be used to inform drivers of where the French police are likely to have radar traps in operation. Because Google’s business philosophy states, “You can make money without doing evil,”4 Google stopped doing business with Navx, apparently feeling that Navx was offering a product that indirectly supported law-breaking. As a result, people entering search terms like “radar trap” in French could no longer find Navx’s product through a Google search. The French Competition Authority ordered Google to resume offering its services to Navx and to formulate clear rules which could be understood in advance and were fair to all. It cited discrimination by Google in not taking ads from Navx while accepting ads from makers of global positioning devices whose websites promote similar products.

This paper has presented several situations in which Google Europe faces ethical dilemmas. Answer the following questions, explaining how you would resolve these dilemmas.

Questions:

1. How would you resolve the conflict between having identifiable photos of individuals and license plates available on the Internet (in the context of “Street View” images) versus not having Street View projections available on the Internet?


2. How would you resolve the conflict between showing an autistic boy being bullied by his classmates on the Internet versus completely prohibiting the showing of such images?

3. How would you resolve the conflict between refusing to accept an ad on an Internet search engine that you believed supported law-breaking versus denying equal access to a company that wanted to place such an ad on the search engine?

1 http://www.bloomberg.com/news/2010-05-19/google-street-view-privacy-breach-probe-is-joined-by-spain-italy-france.html
2 http://www.nytimes.com/2010/04/13/business/global/13google.html?_r=1&ref=world
3 http://www.nytimes.com/2010/07/02/business/02norris.html
4 http://www.google.com/corporate/tenthings.html


4/23/2018 Cambridge Analytica’s use of Facebook data was a ‘grossly unethical experiment’ - The Verge

https://www.theverge.com/2018/3/18/17134270/cambridge-analyticas-facebook-data-underscores-critical-flaw-american-electorate

The personal information of more than 50 million users was misused. By Andrew Liptak (@AndrewLiptak), Mar 18, 2018, 8:44am EDT


Cambridge Analytica’s use of Facebook data was a ‘grossly unethical experiment’


Photo by Amelia Holowaty Krales / The Verge

On Friday, Facebook announced that it had suspended Strategic Communication Laboratories (SCL) and its political data analytics company, Cambridge Analytica, for violating its Terms of Service by collecting and sharing the personal information of up to 50 million users without their consent. The incident is demonstrative of ways that Facebook’s core business model, delivering individualized ads to users, can be exploited, while raising uncomfortable questions about how such data might have been used to influence the 2016 presidential campaign.

Cambridge Analytica is owned in part by hedge fund billionaire Robert Mercer, and first aided Senator Ted Cruz’s presidential campaign in 2015, before helping the Trump campaign in 2016. It promised to target voters’ “unconscious psychological biases” by using massive amounts of data to develop personality profiles, which could then be used to


create extremely specific ads. According to Vox, the Trump campaign brought Cambridge Analytica on in June 2016 to help with its digital operations, headed up by Donald Trump’s son-in-law, Jared Kushner. The campaign later named Steve Bannon, a former vice president of Cambridge Analytica, as campaign manager. Cambridge Analytica employee Christopher Wylie recently passed information to The Observer and described the company’s work as a “grossly unethical experiment,” and said that they exploited Facebook to harvest the personal information of millions of people and “built models to exploit what we knew about them and target their inner demons.”

The suspension of the two companies came a day before a pair of reports in The New York Times and The Observer about how Cambridge Analytica obtained and used the personal information of 50 million users to design voter profiles to target political advertising during the 2016 election. Facebook confirmed that the data came from University of Cambridge psychology professor Dr. Aleksandr Kogan, who created an app called “thisisyourdigitallife” in 2015, which 270,000 people downloaded. The app gave Kogan permission to access information from the users’ accounts, as well as information about their friends.

Facebook says that “Kogan gained access to this information in a legitimate way and through the proper channels that governed all developers on Facebook at that time,” but then passed the information to SCL / Cambridge Analytica. Former Cambridge employees revealed to the Times that the firms collected information on more than 50 million users without their consent, which the company used as the foundation of “its work on President Trump’s campaign in 2016.”

Facebook removed the app and asked for the information to be destroyed in 2016, when it discovered that Kogan had handed the information over to SCL / Cambridge Analytica, orders that both companies say they complied with. But The Observer says that “Facebook did not pursue a response when the letter initially went unanswered for weeks because Wylie was traveling, nor did it follow up with forensic checks on his computers or storage.” Facebook also did not notify users whose data was used by the company, and disputed the description of the incident as a “breach.”



Furthermore, while Cambridge Analytica says that it destroyed the information in question, The New York Times reports that it “still possesses most or all of the trove.” Cambridge Analytica released a statement yesterday saying that it had deleted all of the data, and that it’s working with Facebook to resolve the issue.

The company has been under scrutiny from government officials and regulators over its role in the US presidential election, as well as the UK’s Brexit campaign to leave the European Union in 2015. In December 2017, The Wall Street Journal reported that special counsel Robert Mueller asked the company to turn over documents pertaining to the Trump campaign as part of his investigation into the role that Russia played during the 2016 presidential election, while the House Intelligence Committee interviewed Cambridge Analytica Chief Executive Alexander Nix. Following Facebook’s suspension of the two companies, Massachusetts Attorney General Maura Healey announced on Twitter (via The Hill) that her office was launching an investigation, while other lawmakers said that they would like to see Facebook CEO Mark Zuckerberg testify before congressional committees.

The revelations come after a rocky year for the social media company, which confirmed that political ads from companies backed by the Russian government ran on the site and were seen by more than 10 million people. Facebook says that the information obtained by Kogan was accessed “in a legitimate way and through the proper channels that governed all developers on Facebook at that time,” and that it has “made significant improvements in our ability to detect and prevent violations by app developers” in the last five years, requiring developers to justify the use of the data that they collect. But this incident highlights a key feature of Facebook, the use of personal information to deliver specific advertising to individuals, and only goes to underscore a critical weakness in the American electorate: this information can not only be used to manipulate an election, but it can be obtained relatively easily, with few checks.
