Danish Agency for Digitisation, 02-05-13


New digital welfare (2011-2015)

Welfare technology is vital to future welfare services. In recent years, the public sector has invested in welfare technology and is now in a strong position to exploit IT and new technology more intensively to modernize and optimize public services such as schools, the health service and eldercare. Providing good service does not necessarily require a face-to-face meeting and, in many cases, digital solutions can provide citizens with a more modern and effective service.

Source: Digital Velfærd, Digitaliseringsstyrelsen

Privacy & Digital Identity

Anja Bechmann, Assoc. Professor & Head of Digital Footprints Research Group, AU

Venue: Invited speaker for internal workshop at Danish Agency for Digitisation, Copenhagen, May 2, 2013

• 1. Who am I, and how do we investigate digital footprints?

• 2. Lessons learned: what is personal and what should be protected? How?

• 3. Danish Agency for Digitisation projects - discussion

Who am I, what do we do?

• Digital Footprints is a research group (at the Centre for Internet Research, Aarhus University, Denmark) interested in the data that users share, expose or trade when communicating through the internet. The research group is dedicated to collecting, analyzing and understanding digital footprints: the character of the footprints, the context(s) they form and in which they are given, and the purpose of the individual/group in sharing, exposing and trading data.

• The Social Library (Danish Agency for Culture in collaboration with REDIA)

• Trust and Privacy on Facebook in DK & IT (in collaboration with Data-Mining group in Italy led by Matteo Magnani)

• Personal data sharing on Facebook among high school/college students (18-20) in DK & US (DigiHumLab Denmark)

• In preparation (un-financed): Optimizing digital identity online

My API Research Projects

ethno-mining is unique in its integration of ethnographic and data mining techniques. This integration is carried out in iterative loops between the formation of interpretations of the data and the development of processes for validating those interpretations. (…) There are two key characteristics of the iterative loops in ethno-mining. First, they can be separated into three categories based on the amount of a priori knowledge used to find and validate interpretations of the data. Second, the results of the iterative loops are frequently, although not exclusively, represented in visualizations. Visualizations have two basic affordances: they can represent both quantitative and qualitative analyses, and they exploit the visual system to support more comprehensive data analysis, particularly pattern finding and outlier detection.

(…)

our method seeks to expose and explicitly address the selection biases in both qualitative and quantitative research methods by checking them against one another. Ethno-mining extends its scrutiny of these biases beyond simply comparing the biases embedded in standard qualitative and quantitative techniques. It does so by tightly integrating the techniques in loops, generating mutually informed analysis techniques with complementary sets of biases.

“Ethno-Mining: Integrating Numbers and Words from the Ground Up” by R. Aipperspach, T. Rattenbury, A. Woodruff, K. Anderson, J. Canny, P. Aoki.

Source: Bechmann, A. Internet Profiling: The economy of data interoperability on Facebook and Google, Mediekultur (submitted).
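The iterative loop described in the quote can be sketched in a few lines. This is a minimal, hypothetical illustration (the data, field notes and helper names are all invented, not from the research projects): a quantitative mining step proposes an interpretation per user, and a qualitative, "ethnographic" step validates it against interview notes, so that each method checks the other's biases.

```python
from collections import Counter

# Toy footprint log: (user, action) pairs standing in for real trace data.
footprints = [
    ("anna", "share"), ("anna", "share"), ("anna", "like"),
    ("ben", "like"), ("ben", "like"), ("ben", "like"),
    ("cara", "share"), ("cara", "trade"),
]

def mine_patterns(log):
    """Quantitative step: propose each user's dominant action."""
    per_user = {}
    for user, action in log:
        per_user.setdefault(user, Counter())[action] += 1
    return {u: c.most_common(1)[0][0] for u, c in per_user.items()}

def ethnographic_check(pattern, field_notes):
    """Qualitative step: keep only interpretations the interviews support."""
    return {u: a for u, a in pattern.items() if field_notes.get(u) == a}

# Invented interview notes; cara's notes contradict the mined pattern,
# so her interpretation is rejected and would feed the next loop.
field_notes = {"anna": "share", "ben": "like", "cara": "trade"}

candidates = mine_patterns(footprints)
validated = ethnographic_check(candidates, field_notes)
print(validated)  # → {'anna': 'share', 'ben': 'like'}
```

In a real ethno-mining study the rejected interpretations would drive another mining pass; here the loop is cut to a single iteration for brevity.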

Lessons learned: what is personal and what should be protected? How?

Source: Digital Velfærd, Digitaliseringsstyrelsen

yes...

Source: Gerlitz & Helmond: The Like Economy: Social Buttons and the data-intensive web, New Media & Society, Online first, Feb 4, 2013.

Securing privacy on Facebook

Privacy filter

• The students studied use groups as their primary privacy filter

Source: Bechmann, A. 2013. Managing the interoperable self, Nordmedia2013, Oslo August 7-11 (accepted).
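The group-as-privacy-filter practice can be modelled as a simple visibility rule. This is a toy sketch with invented names, not Facebook's actual access model: posts shared in a group are visible only to that group's members, while ordinary timeline posts are visible to everyone.

```python
# Hypothetical posts and group memberships.
posts = [
    {"author": "sofie", "group": "close_friends", "text": "met a guy ;)"},
    {"author": "sofie", "group": None, "text": "great concert last night"},
]
groups = {"close_friends": {"sofie", "ida", "mia"}}

def visible_to(viewer, post):
    # Posts without a group are public on the timeline;
    # group posts are restricted to group members.
    if post["group"] is None:
        return True
    return viewer in groups[post["group"]]

print([p["text"] for p in posts if visible_to("ida", p)])
# → ['met a guy ;)', 'great concert last night']
print([p["text"] for p in posts if visible_to("jonas", p)])
# → ['great concert last night']
```

The point the students make is exactly this split: the group carries the "things not everyone should know", while the timeline carries only what fits the public self-image.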

I hate ... X

Shh....it’s a secret

“it’s because it’s our private space [Danish: forum]. We cannot see each other every day so this is a way of keeping up to date with each other...with things not everyone should know...if you have met a guy or if we are going to meet”

• Not personal data (on timeline): “I only post things that I am not ashamed of [fit self-image, ed.] and then I don’t care who sees it”

• Personal data: “sad things” (death) and “things that you do not want to be confronted with”, private address, face (pictures are more personal than information) and account info (in the US also drinking and religion); they also prefer that no one comments on historical data, so that it does not resurface

Control and secure the digital identity

The students were very interested in seeing their own Facebook data as shown to third parties. They are annoyed by the lack of transparency in data flows to third-party apps that make them unable to manage their identity properly.

How does that correspond with legislative definition?

'personal data' shall mean any information relating to an identified or identifiable natural person ('data subject'); an identifiable person is one who can be identified, directly or indirectly, in particular by reference to an identification number or to one or more factors specific to his physical, physiological, mental, economic, cultural or social identity;

Informed consent: 'the data subject's consent' shall mean any freely given specific and informed indication of his wishes by which the data subject signifies his agreement to personal data relating to him being processed.

Directive 95/46/EC of the European Parliament and of the Council of 24 October 1995 on the protection of individuals with regard to the processing of personal data and on the free movement of such data
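The directive's test is operational: data is personal if the subject is identified directly, or identifiable indirectly via factors such as an identification number or address. A minimal sketch of that test, with invented field names standing in for a real data model:

```python
# Toy identifier lists; a real assessment is contextual and far broader.
DIRECT_IDENTIFIERS = {"name", "id_number"}
INDIRECT_IDENTIFIERS = {"address", "face_photo", "account_info"}

def is_personal_data(record: dict) -> bool:
    """True if, on this toy model, the record relates to an identifiable person."""
    fields = set(record)
    return bool(fields & DIRECT_IDENTIFIERS) or bool(fields & INDIRECT_IDENTIFIERS)

print(is_personal_data({"id_number": "1204561234"}))       # → True
print(is_personal_data({"address": "...", "likes": 42}))   # → True (indirectly identifiable)
print(is_personal_data({"likes": 42}))                     # → False
```

Note how the students' own list of what should be protected (pictures of faces, private address, account info) maps onto the directive's "indirectly identifiable" category rather than onto direct identifiers.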

• In particular, they want to protect:

• identifiable pictures

• respect for ‘do not profile/confront me with this info’ tags

• private address

The use of data

• What do they think of companies accessing data through apps (on average 60 apps per user in the sample)?

• (They are not oriented towards apps as companies, but rather as services, or towards employers trying to profile them; they are not concerned)

• Showing them what we draw from the API: “it is taken out of context...it is a little bit silly now [facerape prank, ed.]”

• Things that they think are private should not be made public

• Understanding (transparency) leads to acceptance, according to the users; when in doubt about what an app does, they avoid it (unless friends have referred to it many times), according to themselves

• Children of the data economy

http://youtu.be/N5WurXNec7E

Danish Agency for Digitisation projects - discussion

Case 1: Assignment of kindergarten

• Home address

• Work address x 2

• Workplace x 2

• Job description

• working hours x 2

• Education, interests, preferences

• -> more qualified offers for parents

• -> may prevent bad offers for more parents by correlating data
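One way to read "correlating data" in Case 1: rank kindergartens by how much detour they add to the parents' commutes, given the home and work addresses collected above. The sketch below is purely hypothetical (toy 2-D coordinates and invented kindergarten names stand in for real geodata and an actual assignment policy):

```python
from math import dist  # Euclidean distance, Python 3.8+

home = (0.0, 0.0)
work_addresses = [(5.0, 0.0), (0.0, 4.0)]  # two working parents
kindergartens = {"Solstraalen": (1.0, 0.5), "Myretuen": (4.0, 3.5)}

def total_detour(kg):
    # Extra distance if each parent drops by the kindergarten on the way to work.
    return sum(dist(home, kg) + dist(kg, work) - dist(home, work)
               for work in work_addresses)

best = min(kindergartens, key=lambda name: total_detour(kindergartens[name]))
print(best)  # → Solstraalen
```

Working hours, job description and preferences would add further ranking criteria; the privacy question the case raises is whether collecting all of them is proportionate to this gain.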

Case 2: NemID

• accessibility - security problems

• but what about:

• user micro-management of the information connected to the authentication that travels with the user (NemID)

• co-existing authentication systems or integration with private companies

• How should digital identity solution incorporate digital life in general?

Q’s for you

• Should a digital identity solution (e.g. NemID) incorporate digital life in general?

• what kind of demands for privacy do you consider in your solutions and strategies?

• how do you deal with accessibility? (platform, server, device, solution)

• how do you/we ensure that consent is in fact informed when we know from statistics that people do not read the text/permissions? (privacy paradox)

• how do you make solutions sensitive to the needs and everyday life of the user?

THANK YOU.

Anja Bechmann, Assoc. Professor
Head of Digital Footprints Research Group, AU
+45 5133 5138

anjabechmann@gmail.com // @anjabechmann
Digitalfootprints.dk // @digifeet

(Visiting researcher from 1.8.12-1.8.13, DIKU Copenhagen)
