Jonathan Cave, University of Warwick (plenary): Agreeing to Disagree about Privacy: Markets as...
DESCRIPTION
Network of Excellence Internet Science Summer School. The theme of the summer school is "Internet Privacy and Identity, Trust and Reputation Mechanisms". More information: http://www.internet-science.eu/

TRANSCRIPT
Agreeing to Disagree About Privacy: Markets as Privacy, Identity and Trust Mechanisms
Jonathan Cave 13 August 2012
EINS – Internet Science Network of Excellence Summer School
Short version
• Privacy is a relatively recent invention
• New technologies challenge the underlying assumptions
  – What is private?
  – Are data and identities the same thing?
  – How does privacy relate to risk and (a sense of) security?
• Easy answers are best avoided:
  – More privacy, trust, security are better
  – Privacy is dead
  – Identities should be strong and unique
  – Collect all the data you can, then work out how to protect or re-use them
Internet Privacy and Identity, Trust and Reputation Mechanisms
2
Markets play a crucial role
• Markets can protect or erode privacy – or change the way it operates. This can lead to markets in privacy (protection or invasion) itself.
• Private data are increasingly valuable – this can lead to markets in personal data and information
  – Some of this value is created by use of PII, and should be shared
  – Some is merely captured by technology or given away by inattention
  – My data may say something about me, people like me or you
  – Not everything of value needs to be protected by property rights
• Privacy of action is also important and may need protection from behavioural markets
Outline
• Working definitions
• A networked (abstract) view of rights regimes
• Privacy as a human and/or economic right
• Social mechanisms – rights in market settings
• Privacy and markets
• Topics for discussion
WORKING DEFINITIONS OF PRIVACY
Some essential building blocks
• Privacy
• Security
• Identity
• Trust
• Technical tools: games, mechanism design, networks, lattices, partitions
Privacy of what?
[Diagram: types of privacy around “the right to be let alone” – informational privacy, corporeal privacy, spatial privacy, relational privacy, privacy of action]
Privacy…
• (Inter)subjectivity – my view, your view, others’ views
  – A regress: I think that you think that… this is private
• Hidden in plain sight – private, invisible or beneath notice?
• Functions of privacy
Privacy in the Information* Society
(* or “Knowledge” or “Belief”)
• ‘Protected space’ has evolved to include bodies, actions, history and judgements
• Privacy as a right or an interest
  – Privacy interests can be traded off, sold or given away
  – Privacy rights are:
    • deeper; linked to self-control, self-respect and self-responsibility
    • limited for children, criminals, public figures
    • economic (FIPP) or ‘human’ (OECD)
• Privacy is also subjective
  – What infringes my privacy may be of no consequence to you
  – Actions relating to privacy may trigger conflicts or open dialogue
• Either view is contingent or uncertain. Things change, but:
  – It is hard to claw back information
  – It may be equally hard to reveal it at a later date
  – Private information may involve opinion as well as fact
Privacy and publicity
• We are all more or less public figures
  – We cannot control what is known about ourselves
  – We do not carefully choose what to reveal
  – The collective judgement may be a stampede
• This may be self-fulfilling
  – ‘Give a dog a bad name…’
  – Particularly true where collective judgement brings us into or out of the public eye
• Privacy may be protected by limiting access or flooding observers
• Privacy is perhaps most important as a societal mechanism to:
  – Let us act for ourselves
  – Provide respite and recovery
  – Provide a currency of goodwill or trust
  – Give us a reason to be trustworthy
Identity
• Used and abused in ever more profound and ever more trivial ways
• Multiplies
  – by design or otherwise
  – for good (compartmentalisation) or ill (accountability)
• Converges and coalesces through data-mining, persistence, sharing
• How many should we have; what pulls them together or apart?
• More identity is not always better:
  – Anonymous (cash) transactions are cheap
  – ID costs may deter good trades
  – Privacy and anonymity interests may limit ID
  – Reliance on (technical) ID may crowd out finer (character) judgement
  – Powerful ID is attractive and potentially corrupting
  – Opting out may become widespread – or impossible
• Growing tensions between (relatively) unique physical identity and increasingly fragmented useful or effective legal and virtual identities
Trust
• If technologies and ‘new market’ institutions provide the warp of the social fabric, trust provides the weft
• Trust means different things to people, systems and organisations
• Trust is central to the relation of privacy and security:
  – Customers must trust business security arrangements to safeguard their privacy
  – Personal privacy and system security form coffer dams against attack
• Trust always involves an ‘incomplete contract’
  – Monitoring dissipates the savings of trust
  – Assurance (penalties) ≠ insurance (indemnities)
  – Reputation and identity are informal versions
• Trust and trustworthiness need to be appropriately matched:
  – Trusting + trustworthy: appropriate delegation, specialisation
  – Trusting + untrustworthy: enforcement costs, costs of adverse incidents
  – Untrusting + trustworthy: excess contracting, monitoring costs; race to the bottom
  – Untrusting + untrustworthy: lost gains from trade, inappropriate risk allocation
Technical elements I
• Games:
  – Players, strategies, information, preferences, solution concepts
  – Non-cooperative, bargaining, cooperative
• Mechanism design:
  – Solution concepts help us characterise outcomes of strategic situations
  – Mechanism design lets us design rules to favour desirable outcomes
• Networks
  – Often binary graphs (nodes connected by links, subsets of N²) – may be necessary to consider n-ary networks (subsets of 2^N)
  – Links have strength, direction, duration, state dependence, salience, subjectivity
  – Links and nodes are dual
  – A topology (notion of closeness) with parameters (path length, clustering, etc.)
  – Much network theory comes from electronics, with emphasis on ‘shortest paths’ and ‘nearest neighbours’ – clearly needs relaxation for privacy considerations
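The ‘topology with parameters’ idea above can be made concrete. A minimal sketch (the graph and all names are invented for illustration, not taken from the slides) computing two of the parameters mentioned – shortest path length and clustering – on a small binary graph:

```python
from collections import deque

# A small undirected binary graph as an adjacency dict (nodes joined by links).
graph = {
    "A": {"B", "C"},
    "B": {"A", "C", "D"},
    "C": {"A", "B"},
    "D": {"B"},
}

def shortest_path_length(g, src, dst):
    """Breadth-first search: number of links on a shortest path, or None."""
    seen, frontier = {src}, deque([(src, 0)])
    while frontier:
        node, dist = frontier.popleft()
        if node == dst:
            return dist
        for nbr in g[node]:
            if nbr not in seen:
                seen.add(nbr)
                frontier.append((nbr, dist + 1))
    return None  # dst unreachable from src

def clustering(g, node):
    """Fraction of a node's neighbour pairs that are themselves linked."""
    nbrs = list(g[node])
    k = len(nbrs)
    if k < 2:
        return 0.0
    linked = sum(1 for i in range(k) for j in range(i + 1, k)
                 if nbrs[j] in g[nbrs[i]])
    return 2 * linked / (k * (k - 1))

print(shortest_path_length(graph, "A", "D"))  # 2 (A-B-D)
print(clustering(graph, "B"))  # 1/3: of B's neighbour pairs, only (A, C) is linked
```

Relaxing the emphasis on shortest paths for privacy purposes might mean weighting longer paths as well, since information can leak along any path, not just the most direct one.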
Technical elements II
• More networks
  – Networks are layered (people, data, ideas, things, …)
  – Self-organised networks
  – Epistemic networks: ‘knows’ as links
• Lattices:
  – Partially ordered sets – complete if the GLB and LUB of any two elements are in the set
  – Tarski theorem: isotone functions on complete lattices have fixed points
• Partitions:
  – Dividing a set into an exhaustive collection of disjoint subsets
  – Used to describe information (subsets are ‘events’), rights (below)
  – Partitions make a lattice – agreeing to disagree as an example of Tarski
• Models of communication, association, behaviour and the propagation of risk
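The lattice of partitions mentioned above is easy to exhibit in code. A sketch (the four-state example is invented for illustration): partitions of a state space, partially ordered by refinement, form a lattice, and the coarsest common refinement of two observers' partitions represents their pooled information:

```python
from itertools import product

# Partitions as sets of frozensets: disjoint, exhaustive cells ('events').
def common_refinement(p1, p2):
    """Coarsest partition finer than both p1 and p2 (their join in the
    refinement order): cells are the non-empty pairwise intersections."""
    return {a & b for a, b in product(p1, p2) if a & b}

# Two observers of a four-state space, each distinguishing different events.
p1 = {frozenset({0, 1}), frozenset({2, 3})}
p2 = {frozenset({0, 2}), frozenset({1, 3})}

print(common_refinement(p1, p2))
# four singleton cells: pooling the observers' information identifies each state
```

Note the convention: here ‘finer’ sits higher in the order, so the common refinement is a least upper bound; the dual operation (common coarsening) gives the greatest lower bound used in common-knowledge arguments.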
PRIVACY AS A HUMAN AND/OR ECONOMIC RIGHT
Transatlantic and intergenerational tussle
• EU version – privacy as fundamental human right
  – Primarily data protection
  – Inalienable, with emphasis on consent (‘cookie law’)
  – Right to be forgotten
• US version – privacy as economic right
  – Opting in/out
  – Personalised or class profiling
  – Three-party involvement
• Tussle – mines in the “Safe Harbo(u)r”
• Consequence – neither human right nor economic value is protected
• Other issues
  – Government involvement
  – Impact of national security, crime prevention, anti-terrorism
• ACTA and DPI as a special case
A NETWORKED (ABSTRACT) VIEW OF RIGHTS REGIMES
A suggested framework for rights regimes
• Rights may be seen as a lattice
  – Based on a partition into ‘equivalent’ situations or outcomes
  – Partially ordered by inclusion (finer distinctions)
• This creates a mechanism for communication and negotiation
  – A language L maps a (set of) situations E into a public action or utterance L(E)
  – First round – all parties form their judgement and ‘do their thing’
  – Second round – each party refines his judgement based on what others have done, leading to a (finer) posterior
  – The process converges, by Tarski, to a common-knowledge consensus
  – Union consistency: if E ∩ E′ = ∅ and L(E) = L(E′), then L(E) = L(E ∪ E′)
  – If the language is union consistent, agreeing to disagree is impossible
• Can be applied to options and outcomes
• Public language – right to act
• Further partial order: preference over actions
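The convergence claim above can be sketched directly: the regress of judgements settles on the ‘meet’ of the parties' information partitions – the finest partition coarser than both, whose cells are exactly the events that can become common knowledge (Aumann's setting). The two example partitions below are invented for illustration:

```python
# Meet of two information partitions: merge cells that overlap until a fixed
# point is reached (a small instance of Tarski-style fixed-point convergence).
def meet(p1, p2):
    cells = [set(c) for c in list(p1) + list(p2)]
    merged = True
    while merged:
        merged = False
        for i in range(len(cells)):
            for j in range(i + 1, len(cells)):
                if cells[i] & cells[j]:       # overlapping cells must be pooled
                    cells[i] |= cells.pop(j)
                    merged = True
                    break
            if merged:
                break
    return {frozenset(c) for c in cells}

p1 = {frozenset({0, 1}), frozenset({2}), frozenset({3})}  # party 1's information
p2 = {frozenset({0}), frozenset({1, 2}), frozenset({3})}  # party 2's information

print(meet(p1, p2))
# two cells, {0, 1, 2} and {3}: states 0-2 are chained together by overlapping
# cells, so only events built from {0,1,2} and {3} can be common knowledge
```

If a union-consistent language L is constant on each cell of this meet, the parties' public utterances must coincide – which is the sense in which agreeing to disagree becomes impossible.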
SOCIAL MECHANISMS - RIGHTS IN MARKET SETTINGS
Rights as property rights
• Personal data and actions have externalities – this can lead to market failures
• Individual property rights can:
  – Prevent encroachment (even if non-transferable)
  – Facilitate trades and bargains
  – Encourage optimal use of information
  – Produce strategic manipulation and distortion (‘acting out’)
• Collective property rights may be needed
  – Informational commons
  – Jointly private data (esp. of transactions)
  – Conventions and rules
• Bundling and unbundling are vital
Privacy preferences and markets
• Use of personal profiling for targeted third-party monetisation
• What information is ‘sensitive’?
• Is data privacy linked to PII?
• Business models
  – Harvest and resell behavioural data that reveal preferences
  – Mine and recombine stored profile information
  – ‘Nudge’ users into preferred actions
  – Sell ID-theft and other forms of privacy protection
  – Privacy intermediaries
  – Privacy as a ‘local’ public good or a social construct
  – Privacy as an asset (with derivatives)
  – PITs and PETs
Efficiency
• Selection vs. incentives – who should bear the risks and costs of privacy protection?
  – Balance, power to act, preferences, risk aversion, resilience
  – Repudiation and re-issue
  – Protecting people from themselves
• Unintended (behavioural) consequences
  – Cynicism, paranoia, opportunism leading to poor data, absent data, crime
  – Privacy as a social construct
  – Crowding in and crowding out
  – Changes to accountability, responsibility and transparency
PRIVACY AND MARKETS
Privacy affects the functioning of many markets
• Example – open, closed and discretionary order books in financial markets (how to interpret trade data)
• Trust in automated transaction systems
• Exploiting asymmetries of trust – revelation of private data as a trust-enhancing mechanism
• Strong differences in national and cultural attitudes
• Mutually assured identity
Some peculiarities of the market environment
• Network effects and interoperability:
  – Tipping equilibria (“winner takes all”)
  – Excess volatility or excess inertia
  – Norms and conventions (cohesion vs. contagion)
• Security economics (hard shells and soft centres)
• Reluctance to exchange good and bad information
• IPR and standards
• Legal liability and public confidence
• The importance of the public sector
  – Large-scale public procurement and launching customers
  – Support for innovation and standardisation
  – Direct, self- and co-regulation
• ‘Splitting’ between the Wild West and the Walled Garden
• Two bad puns:
  – Trust and anti-trust
  – Security and securities
SOME EXAMPLES
An example: biometrics and privacy
• The strength of biometrics can threaten privacy
  – Unauthorised or invasive use of data, indirect abuse of personal data
  – Inappropriate or non-consensual identification, denial of identity services
  – Even if all the data are accurate, portions may give misleading or invasive impressions
  – May give away too much
  – People may not be careful enough – they certainly don’t seem to be
• Biometrics can also enhance privacy
  – Mutual identification to limit inappropriate access
  – May remove the need for more invasive data gathering
  – Protection through weakness:
    • Limited scalability
    • Lack of interoperability standards
    • Proprietary interest in collected data
    • Need for cooperation and consent
  – Commercial interest in offering security of data and identity-based decisions
  – Technical tricks: cancellation, liveness tests, degradation
  – Strongly anonymised records
Some examples
• CCTV cameras in public and shared private spaces
  – London has more than the whole of the US
  – Every aspect of life is watched by someone
  – Direct and measurable impacts – some perverse
  – Backed by technology (ANPRS, face recognition, voice analysis)
  – Linked to direct intervention
  – Blurred public-private boundaries
• Hoodies and Niqābs
• Biometrics
• DNA
• Data mashing and other ‘recombinant’ data uses
• Loyalty cards and commercial profiling
• Virtual worlds
Some examples
• CCTV cameras in public and shared private spaces
• Hoodies and Niqābs
  – Religious and ‘tribal’ group identities, or personal freedom?
  – Do we trust those who withhold their identities?
    • In commercial spaces
    • In employment
  – To what extent are they chosen?
  – To what extent does our reaction force their choice?
• Biometrics
• DNA
• Data mashing and other ‘recombinant’ data uses
• Loyalty cards and commercial profiling
• Virtual worlds
Some examples
• CCTV cameras in public and shared private spaces
• Hoodies and Niqābs
• Biometrics
  – A pervasive ‘strong’ form of identity – perhaps too strong?
  – Merely physical identity
  – Can be used for identification, authentication and indexing
  – Confusion about technology and human factors
    • Real vs. behavioural impacts
    • Type I, II and III errors
  – Where is the private sector?
  – Privacy and utility currently protected by weaknesses in technology
• DNA
• Data mashing and other ‘recombinant’ data uses
• Loyalty cards and commercial profiling
• Virtual worlds
Some examples
• CCTV cameras in public and shared private spaces
• Hoodies and Niqābs
• Biometrics
• DNA
  – Like biometrics, a link to the physical body
  – Unlike biometrics: persistent traces and a durable ‘template’
  – “Forensic” use (for legal and commercial decisions)
  – May indicate more than identity (health, capabilities, kinship)
  – Silent as to time, intent
• Data mashing and other ‘recombinant’ data uses
• Loyalty cards and commercial profiling
• Virtual worlds
Some examples
• CCTV cameras in public and shared private spaces
• Hoodies and Niqābs
• Biometrics
• DNA
• Data mashing and other ‘recombinant’ data uses
  – Refers to the combination of data from different sources
  – Hard to reconcile with existing privacy protections – informed consent in a networked world
  – Identification not necessary for privacy infringement
  – The liability and intellectual property issues are profound and unsolved
  – Meanwhile, commercial and civil society development is racing ahead
• Loyalty cards and commercial profiling
• Virtual worlds
Some examples
• CCTV cameras in public and shared private spaces
• Hoodies and Niqābs
• Biometrics
• DNA
• Data mashing and other ‘recombinant’ data uses
• Loyalty cards and commercial profiling
  – Who owns personal data (recent security breaches)?
  – The practice is old; the power and scope are new
  – A change in the implied client-customer relation
  – Obstacle to search and competition, or gateway to mass personalisation?
• Virtual worlds
Some examples
• CCTV cameras in public and shared private spaces
• Hoodies and Niqābs
• Biometrics
• DNA
• Data mashing and other ‘recombinant’ data uses
• Loyalty cards and commercial profiling
• Virtual worlds
  – From transaction spaces to social networks, Second Life and MMORPGs
  – Mutual and proportionate identification
  – Who is the relevant person?
  – Delegated identity for transactional avatars
  – The closeness of chosen identities
Governance of privacy and identity
[Diagram: governance involves Business, Citizens (consumers, communities, civil society), Administrations and Technology]
TOPICS FOR DISCUSSION
A warning from history
• Business, government and civil society all have strong stakes, but start from different places
• Isolated events exert disproportionate influence
• The different agendas involved in privacy and security discussions are not necessarily consistent
• The challenge to business is to embrace these issues and take joint ownership
• Various scenarios are within our reach (privacy × security):
  – Low privacy, low security: the surveillance society (in practice)
  – Low privacy, high security: the surveillance society (in theory)
  – High privacy, low security: peer-to-peer
  – High privacy, high security: virtual agora or closed community
• Failure through success – big data analytics
• Success through failure – learning to be careful and policy improvement
ADDITIONAL EXAMPLES OF TENSION BETWEEN INTERNET INNOVATIONS AND PRIVACY
Privacy and innovation – ST systems model

Market failures                           System failures
Positive externalities (spillovers)       Failures in infrastructural provision and investment
Public goods and appropriability          Lock-in / path-dependency failures
Imperfect and asymmetric information      Institutional failures
Market dominance                          Interaction failures
                                          Capabilities failures

Sources: Smith (1999), Martin and Scott (2000) and EC (2006)
Privacy and innovation – ST systems model

Market failures                                          System failures
Function creep                                           Failures in infrastructural provision and investment
‘Tragedy of the data commons’                            Lock-in / path-dependency failures (‘opt-in’/‘opt-out’)
Transparency of data subjects vs opacity of systems      Mismatch between regulatory practices and data practices
No incentives for newcomers with privacy as USP          Interaction failures: privacy authorities and law enforcement practices
Case-study examples – Cloud computing
• Growing EU market: 68B (2011) → 150B (2015)
• Motives for adoption: cost reduction; cost accounting; time to market; greening of ICT
• Layers: Software as a Service, Platform as a Service, Infrastructure as a Service
• “A model for enabling convenient, on-demand network access to a shared pool of configurable computing resources” (US NIST)
Cloud computing – 2
• Tension 1 – Data controller / data processor (Art 29 WP; Art 17, 95/46: security measures)
• Tension 2 – Informed choice and consent (auditing SOC2/SOC3, including privacy and security)
• Tension 3 – Ownership, confidentiality and law enforcement
• Tension 4 – Appropriate SLAs: data integrity, data disclosure, data preservation, data location/transfer, rights over services/content, property rights and duties (Queen Mary University study)
• Tension 5 – User expectations vs privacy – WTP vs WTA, right of ownership (data portability), right to access (Facebook, GMail)
• Approach/solution 1 – Technology (encryption): SIENA (from IaaS to PaaS and SaaS)
• Approach/solution 2 – Security as a Service; Forrester: $1.5B market (2015). BUT Apple’s SDK for iOS pays only moderate attention to ‘concern for user’s data’ (only in the closed Walled Garden of the App Store)
• Approach/solution 3 – Cloud-neutral approach?
Case-study examples – Behavioural targeting
• Market forecast: $4.4B in 2012
• Policy pressure: implementation of a stricter ePrivacy article on the use of cookies, June 2011
Behavioural targeting – 2
• Tension 1 – Explicit and informed prior consent (Art 29 WP; ePrivacy Directive Art 5(3)); leads to unwanted effects (pop-ups)
• Tension 2 – New intrusive technologies and tools
  – Respawning (‘evercookie’)
  – HTML5 persistent cookies
  – Device fingerprinting (unique consumer identification)
• Tension 3 – Trust and confidence
  – Consumers show reluctance when confronted with BT practices
  – Cookie practices are hard to understand and to act upon (Flash cookies)
  – Generic privacy policies are uninformative and too long
• Tension 4 – Regulation is perceived to distort business practices
• Approach/solution 1 – Policy approach on informed consent (‘browser settings are sufficient’)
• Approach/solution 2 – Control instruments for users: ‘Track me not’ browser button; ‘Advertising Icon Option’; transparent privacy policies
• Approach/solution 3 – Different approaches: just-in-time contextual advertising
Case-study examples – Location-based services
• Google Street View
• Location tracking (GPS)
• Friend finder
• Mash-ups
• From all data in the device to all data in the network
• Linkage between media
Location-based services – 2
• Tension 1 – Regulatory practices
  – 2002/58/EC: harmonisation of opt-in consent and withdrawal of consent
  – Data Retention Directive 2006/24/EC: financial burden on telcos and ISPs
  – Conflicting regulatory frameworks: definition of personal data, traffic data, location data
• Tension 2 – Strict regulatory practices
  – Switzerland: blur all faces, all number plates, sensitive facilities, clothing
  – Art 29 WP: storage of photos cut from 12 to 6 months
• Tension 3 – Lack of user control (location tracking)
  – Gathering of profiles (Malte Spitz)
  – Selling of aggregate data (TomTom)
• Tension 4 – Collection of sensitive data (faces, number plates, clothing, buildings)
• Approach/solution 1 – Soft regulatory practices: offering opt-out (Germany: 244,000 citizens)
• Approach/solution 2 – Control instruments for users: switching off GPS
• Approach/solution 3 – New technologies: automatic face-blurring and number-plate-blurring technologies
Case-study examples – RFID
• Building block for the Internet of Things
• Growth: $5.03B (2009) → $5.63B (2010)
• Use in a multitude of domains: health care; cattle; pets; logistics; …
• Unique ID
• Limited control and choice/consent for the user
RFID – 2
• Tension 1 – Awareness raising by the EDPS, Art 29 WP, consumer groups
  – Limited awareness among consumers
  – Undesired disclosure of data
  – EC Recommendation of 12 May 2009
  – Industrial PIA for RFID
• Tension 2 – Critical approach from industry
  – RFID singled out as a privacy-intrusive technology
  – Privacy problems are in the back end
  – Data encryption in the chip is costly
  – Disabling RFID means limited access to after-sales services
• Tension 3 – Convergence of technologies with privacy implications
  – Biometrics (fingerprint recognition)
  – Corporeal intrusion (swarm technology)
• Approach/solution 1 – Regulatory practices: privacy impact assessment for RFID
• Approach/solution 2 – Control instruments for users: ‘killer application’; deep-sleep mode; transparency tools (?)
Case-study examples – Biometrics
• Growing market: $4.2B (2010) → $11.2B (2015)
• Largest share: fingerprinting
• Facial, iris and voice recognition show higher CAGR
• Market driver: homeland security
• Decentralised systems (authentication) vs centralised systems (fraud detection, illegal and criminal activities)
Biometrics – 2
• Tension 1 – Storage of sensitive data
  – False positives
  – Third-party use
  – Consent and choice
• Tension 2 – Limited accuracy
  – Enrolment/identification (UK, NL)
• Tension 3 – Back-firing of public failures on private business
  – Distrust of decentralised biometric systems (single sign-on, access systems)
• Tension 4 – Public distrust of premium services
  – Advantageous for specific groups (air transport, banking)
• Approach/solution 1 – Regulatory practices
  – Globalisation of regulation (US-VISIT/SIS/VIS …)
  – ICAO standardisation
  – Fine-tuning EU regulations (under 12 years of age)
• Approach/solution 2 – Control instruments for users: transparency tools (?)
• Approach/solution 3 – Offering surplus value
  – User friendliness
  – Speed (single sign-on)
  – Trust (one-time passwords)
  – Added-value services (premium)
• Approach 4 – Demonstrating public value
  – Identity fraud (double dippers)
  – Crowd control
Concluding the cases
• Legal/regulatory issues abound
  – Stronger enforcement; stricter adherence to privacy constraints; homeland security
  – Harmonisation fails; different approaches in different countries
  – Influx of different regimes (sector-specific regulations, police and judicial coordination)
• Privacy intrusions are part of public services as well! Specific uses and misuses are hard for the public at large to differentiate (fraud detection vs commercial use of data)
• Privacy innovation is restricted (face-blurring technology, decentralised biometrics)
• Usually a soft approach: awareness raising, opt-in/opt-out offers, transparency measures
• Business practices are mostly oriented towards data collection and use; privacy is only secondary to business models
• Companies with inherently privacy-friendly approaches have modest market shares and are not very visible to the public at large