User agency on social networks that are mediated by algorithms Ansgar Koene HORIZON Digital Economy Research, University of Nottingham


Post on 12-Apr-2017


TRANSCRIPT

Page 1: Dasts16 a koene_un_bias

User agency on social networks that are mediated by algorithms

Ansgar Koene HORIZON Digital Economy Research,

University of Nottingham

Page 2

User experience satisfaction on social network sites

Page 3

Human attention is a limited resource

Filter

Page 4

Information services, e.g. internet search, news feeds etc.

• free-to-use => no competition on price
• lots of results => no competition on quantity
• Competition on quality of service
• Quality = relevance = appropriate filtering

Good information service = good filtering
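The "quality = relevance = appropriate filtering" equation above can be sketched in a few lines: score each document by overlap with the query and keep only the top few. The scoring function, corpus, and names here are purely illustrative, not any real service's method:

```python
def relevance_score(query, doc):
    """Fraction of query terms that appear in the document -- a crude relevance proxy."""
    q = set(query.lower().split())
    d = set(doc.lower().split())
    return len(q & d) / len(q) if q else 0.0

def filter_results(query, docs, top_k=3):
    """Rank documents by relevance to the query and keep only the best few."""
    return sorted(docs, key=lambda d: relevance_score(query, d), reverse=True)[:top_k]

# Toy corpus: the filter, not price or quantity, decides what the user sees.
docs = [
    "recipe for pancakes",
    "algorithms that curate the social network news feed",
    "social media and network effects",
]
top = filter_results("social network algorithms", docs, top_k=2)
```

With only two slots, the irrelevant document is filtered out entirely; whatever notion of "relevance" the scoring function encodes determines what the user ever sees.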

Page 5

Sacrificing control for Convenience

Page 6

Sacrificing control for Convenience

Page 7

Personalized recommendations

• Content based – similarity to past results the user liked

• Collaborative – results that similar users liked (people with statistically similar tastes/interests)

• Community based – results that people in the same social network liked (people who are linked on a social network, e.g. ‘friends’)
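The ‘collaborative’ flavour can be sketched as: weight other users’ ratings by how similar those users are to the target user, then recommend the highest-scoring unseen items. The toy ratings and the choice of cosine similarity are illustrative only; production recommenders use far richer models:

```python
from math import sqrt

# Toy ratings: user -> {item: rating}. All names and numbers are made up.
ratings = {
    "alice": {"a": 5, "b": 3, "c": 4},
    "bob":   {"a": 4, "b": 2, "c": 5, "d": 5},
    "carol": {"a": 1, "b": 5, "c": 2, "e": 4},
}

def cosine_similarity(u, v):
    """Cosine similarity over the items two users have both rated."""
    common = set(u) & set(v)
    if not common:
        return 0.0
    dot = sum(u[i] * v[i] for i in common)
    norm_u = sqrt(sum(u[i] ** 2 for i in common))
    norm_v = sqrt(sum(v[i] ** 2 for i in common))
    return dot / (norm_u * norm_v)

def recommend(user, ratings, n=1):
    """Score items the user hasn't seen by similarity-weighted ratings of other users."""
    scores = {}
    for other, their in ratings.items():
        if other == user:
            continue
        sim = cosine_similarity(ratings[user], their)
        for item, rating in their.items():
            if item not in ratings[user]:
                scores[item] = scores.get(item, 0.0) + sim * rating
    return sorted(scores, key=scores.get, reverse=True)[:n]
```

Here `recommend("alice", ratings)` favours the item rated by bob, whose tastes are statistically closer to alice's than carol's are. A ‘community based’ variant would simply restrict the loop to users linked to alice on the social graph.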

Page 8

How do the algorithms work?

Page 9
Page 10

User understanding of social media algorithms

More than 60% of Facebook users are entirely unaware of any algorithmic curation on Facebook at all: “They believed every single story from their friends and followed pages appeared in their news feed”.

Published at: CHI 2015

Page 11

Revealing News Feed behaviour

Page 12

Participants indicate desired changes

Page 13

Manipulation: conflict of interest

Information filtering, or ranking, implicitly manipulates choice behaviour.

Many online information services are ‘free-to-use’: the service is paid for by advertising revenue, not by users directly
⇒ Potential conflict of interest: promote advertisements vs. match user interests

Advertising inherently tries to manipulate consumer behaviour. Personalized filtering can also be used for political spin, propaganda, etc.
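The conflict of interest can be made concrete with a hypothetical ranking rule that blends user relevance with advertiser payment. The function, weights, and example results are invented for illustration; no real platform's formula is implied:

```python
def blended_score(relevance, ad_payment, ad_weight):
    """Mix user relevance with advertiser payment.
    ad_weight = 0 ranks purely for the user; raising it shifts ranking toward revenue."""
    return (1 - ad_weight) * relevance + ad_weight * ad_payment

# Two hypothetical results: one the user wants, one an advertiser paid for.
# Values are (relevance, ad_payment), both on a 0..1 scale.
results = {"organic": (0.9, 0.0), "sponsored": (0.4, 1.0)}

def rank(results, ad_weight):
    """Order result names by their blended score, best first."""
    return sorted(results, key=lambda r: blended_score(*results[r], ad_weight), reverse=True)
```

With `ad_weight = 0` the organic result wins; with a heavy ad weight the sponsored result is promoted above it. The user sees only the final ordering, never the weight that produced it, which is exactly the agency problem the slide describes.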

Page 14

Trending Topics controversy

Page 15

Q&A with N. Lundblad (Google)
Nicklas Lundblad, Head of EMEA Public Policy and Government Relations at Google

Human attention is the limited resource that services need to compete for.

As long as competing platforms exist, loss of agency due to algorithms deciding what to show to users is not an issue. Users can switch to another platform.

Page 16

UnBias: Emancipating Users Against Algorithmic Biases for a Trusted Digital Economy

WP1: ‘Youth Juries’ workshops with 13-17 year olds to co-produce citizen education materials on properties of information filtering/recommendation algorithms;

WP2: co-design workshops, hackathons and double-blind testing to produce user-friendly open source tools for benchmarking and visualizing biases in the algorithms;

WP3: design requirements for algorithms that satisfy subjective criteria of bias avoidance, based on interviews and observation of users’ sense-making behaviour;

WP4: policy briefs for an information and education governance framework for social media usage. Developed through broad stakeholder focus groups with representatives of government, industry, third-sector organizations, educators, lay-people and young people (a.k.a. “digital natives”).

Page 17

Thank you for your attention

[email protected]

http://casma.wp.horizon.ac.uk/

Page 18

It’s based on data so it must be true

“More data, not better models”

Belief that the ‘law of large numbers’ means Big Data methods do not need to worry about model quality or sampling bias as long as enough data is used.

“More Data” is the key to Deep-learning success compared to previous AI
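A tiny simulation shows why "more data" does not rescue a biased sampling process: the law of large numbers makes an estimate converge to the mean of whatever was sampled, so a biased sampler converges ever more confidently to the wrong answer. The population and bias here are invented for illustration:

```python
import random

random.seed(0)  # fixed seed so the sketch is reproducible

def biased_sample_mean(n):
    """Estimate the mean of a 0/1 population whose true mean is 0.5,
    using a sampler that over-represents the 1s (picks 1 with prob. 2/3)."""
    draws = [1 if random.random() < 2 / 3 else 0 for _ in range(n)]
    return sum(draws) / n

# More data only tightens the estimate around the *biased* value (~0.667),
# never around the true population mean (0.5).
small = biased_sample_mean(100)
large = biased_sample_mean(100_000)
```

The 100,000-sample estimate is far more precise than the 100-sample one, yet no closer to the truth: precision and accuracy are different things, which is exactly what the "more data, not better models" belief overlooks.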

Page 19

Garbage in -> garbage out
perpetuating the status quo

ProPublica “Machine Bias”

Page 20

‘Equal opportunity by design’

“Big Data: A Report on Algorithmic Systems, Opportunities, and Civil Rights”, White House report focused on the problem of avoiding discriminatory outcomes

“To avoid exacerbating biases by encoding them into technological systems a principle of ‘equal opportunity by design’—designing data systems that promote fairness and safeguard against discrimination from the first step of the engineering process and continuing throughout their lifespan.”
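One simple (and deliberately simplistic) check that ‘equal opportunity by design’ implies running throughout a system's lifespan is comparing selection rates across groups; the ‘four-fifths rule’ used in US employment-discrimination practice flags ratios below 0.8. The decision data below is hypothetical:

```python
def selection_rate(outcomes):
    """Share of favourable decisions in a list of 0/1 outcomes."""
    return sum(outcomes) / len(outcomes)

def disparate_impact_ratio(group_a, group_b):
    """Ratio of selection rates between two groups; the conventional
    'four-fifths rule' treats values below 0.8 as potentially discriminatory."""
    return selection_rate(group_a) / selection_rate(group_b)

# Hypothetical decisions for two demographic groups (1 = favourable outcome).
group_a = [1, 0, 1, 0, 0, 1, 0, 0, 0, 0]  # 30% selected
group_b = [1, 1, 0, 1, 1, 0, 1, 0, 1, 0]  # 60% selected
ratio = disparate_impact_ratio(group_a, group_b)
```

A ratio of 0.5 here would fail the 0.8 threshold and prompt investigation. Real fairness auditing is far subtler, since group disparities can also reflect biased training labels, as the ProPublica "Machine Bias" case on the previous slide illustrates.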