Student vulnerability, agency and learning analytics: an exploration
TRANSCRIPT
By Paul Prinsloo (University of South Africa) & Sharon Slade (Open University, UK)
ACKNOWLEDGEMENTS
The presenters do not own the copyright of any of the images in this presentation. We hereby acknowledge the original copyright and licensing regime of every image and reference used. All images used in this presentation were sourced via Google Images and labelled for non-commercial reuse.
This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License
Image credits: http://commons.wikimedia.org/wiki/File:DARPA_Big_Data.jpg
“Privacy and big data are simply incompatible and the time has come to reconfigure choices that we made decades ago to enforce constraints”
(Lane, Stodden, Bender & Nissenbaum, 2015, p. xii)
A number of concerns
• Raw data is an oxymoron – data is not neutral
• The negative impacts of discrimination based on non-neutral data
• Increased sharing of data and information by individuals
• The quantified self movement - I am more than my data (or not?)
• Governance of data amidst increased sharing of data between different stakeholders irrespective of original context/purpose
• Re-identification of de-identified data
• Options for and practices of disclosure
Privacy as a lens no longer provides enough scope for engagement with the issues…
• Post-Snowden – despite a decrease in public trust, no real change in the sharing of personal data and information
• Changing definitions and social/cultural mores
• Regulatory frameworks can’t keep up and become outdated
• Differences between geopolitical and institutional contexts
• We need a “palette of ‘privacy solutions’” (Gurses, 2015)
Vulnerability as interpretive lens
• Applicable to both higher education institutions and individual students/groups of students
• Due to asymmetrical power relationship, students are, per se, more vulnerable
• Higher education institutions do not really have a choice – they cannot NOT use data – the fiduciary duty of higher education
The rationale for student vulnerability as lens
• Very little published research on student perceptions of the use of their data by higher education institutions (Slade & Prinsloo, 2014)
• The asymmetrical power relationship between students and institutions – students are more vulnerable with less access to resources to contest or refuse
• The fiduciary duty of higher education – the social contract
• The purpose of learning analytics is to support and increase the effectiveness of learning
In the light of this context:
Is providing students with an option to opt-in or out (really) an option?
Image credit: http://www.mailbow.net/eng/blog/opt-in-and-op-out/
A framework for mapping the collection, use and sharing of personal user information (Miyazaki & Fernandez, 2000)
• Never collect or identify users
• Users explicitly opting in to have data collected, used and shared
• Users explicitly opting out
• The constant collection, analysis and sharing of user data with users’ knowledge
• The constant collection, analysis and sharing of user data without users’ knowledge
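To make the continuum concrete, the five levels above could be sketched as an ordered policy type. This is a hypothetical illustration only, not part of the Miyazaki & Fernandez framework itself; all names and the `may_collect` helper are assumptions introduced here.

```python
from enum import IntEnum

class ConsentLevel(IntEnum):
    """Hypothetical encoding of the Miyazaki & Fernandez (2000) continuum,
    ordered from most to least protective of the user."""
    NEVER_COLLECT = 0
    EXPLICIT_OPT_IN = 1
    EXPLICIT_OPT_OUT = 2
    CONSTANT_WITH_KNOWLEDGE = 3
    CONSTANT_WITHOUT_KNOWLEDGE = 4

def may_collect(level: ConsentLevel, opted_in: bool, opted_out: bool) -> bool:
    """Sketch of whether collection is permitted under each policy level."""
    if level is ConsentLevel.NEVER_COLLECT:
        return False
    if level is ConsentLevel.EXPLICIT_OPT_IN:
        return opted_in          # collection only after an affirmative choice
    if level is ConsentLevel.EXPLICIT_OPT_OUT:
        return not opted_out     # collection unless the user objects
    return True                  # constant collection, with or without knowledge
```

The ordering makes the asymmetry of the last two levels visible: once collection is "constant", the user's stated preferences no longer enter the decision at all.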
Learning analytics and student vulnerability: between the devil and the deep blue sea?
Students (some more vulnerable than others)
Generation, harvesting and analysis of data
Our assumptions, selection of data and algorithms may be ill-defined
Turning ‘pathogenic’ – “a response intended to ameliorate vulnerability has the paradoxical effect of exacerbating existing vulnerabilities or generating new ones” (Mackenzie et al., 2014, p. 9)
Problematising opting in/out (1)
• Prior affirmative consent in all cases = impractical (OECD, 2012; in Brian, 2015)
• Assumption that users agree when they use an online service
• Once data have been legitimately acquired, current legal frameworks do not dictate or constrain the scope of the use and re-use of such data (Ohm, 2015)
• Consent is under-theorized (Prinsloo & Slade, 2015)
• Context and users’ understanding of context determine scope and detail of the sharing of information
• Tensions between a paternalistic/rights-based approach to the collection, analysis and use of data and a discursive-disclosive approach
Problematising opting in/out (2)
• Individuals often don’t read terms and conditions – lengthy, technical/legal language
• Individuals decide on a case-by-case basis
• When the default option is to opt in, individuals are more careful (Bellman, Johnson & Lohse, 2001)
• Font size & layout impact individuals’ decisions to opt in or out
• Organizations find ways to generate high opt-ins (the exchange value of information/tracking)
• “… there are simply too many entities that collect, use, and disclose people’s data for the rational person to handle” (Prinsloo & Slade, 2015)
Image credit: http://www.mailbow.net/eng/blog/opt-in-and-op-out/
How do we address student vulnerability?
How much agency should/can they have?
A framework for learner agency (1)
1. The duty of reciprocal care
2. The contextual integrity of privacy and data – data is not neutral and we must take cognisance of the dangers in context-collapse – especially with historical and aggregated data
3. Student agency and privacy self-management – how do we think “critically about the range of student control over what data will be analysed, for what purposes, and how students will have access to verify, correct or supply additional information” (Prinsloo & Slade, 2015)
A framework for learner agency (2)
4. Rethinking consent and employing nudges – the value of transparent information exchange & student-centered learning analytics – students not as data but as collaborators
5. Developing partial privacy self-management – differentiated, context and purpose-appropriate opportunities to opt in/out
6. Adjusting privacy’s timing and focus – downstream use/forever/for what purposes/revoking consent…
7. Moving toward substance over neutrality – hard rules for troublesome practices, soft rules to create spaces for context/choices
8. Moving from quantified selves to qualified selves – we are more than our data and our data does not provide complete pictures of us
(In)conclusions
“Providing people with notice, access, and the ability to control their data is key to facilitating some autonomy in a world where decisions are increasingly made about them with the use of personal data, automated processes, and clandestine rationales, and where people have minimal abilities to do anything about such decisions” (Solove, 2013, p. 1899)
THANK YOU
Paul Prinsloo (Prof)
Research Professor in Open Distance Learning (ODL)
College of Economic and Management Sciences, Office number 3-15, Club 1, Hazelwood, P O Box 392, Unisa, 0003, Republic of South Africa
T: +27 (0) 12 433 4719 (office)
T: +27 (0) 82 3954 113 (mobile)
[email protected]
Skype: paul.prinsloo59
Personal blog:http://opendistanceteachingandlearning.wordpress.com
Twitter profile: @14prinsp
Sharon Slade (Dr)
Senior Lecturer and Regional Manager, Faculty of Business and Law
The Open University, Walton Hall, Milton Keynes, MK7 6AA, United Kingdom
T: 01865 486250
Personal blog:http://odlsharonslade.wordpress.com/
Twitter profile: @SharonSlade
References
Bellman, S., Johnson, E.J., & Lohse, G.L. 2001. On site: to opt-in or opt-out? It depends on the question. Communications of the ACM, 44(2), 25-27. Retrieved from http://dl.acm.org/citation.cfm?id=359241
Boam, E., & Webb, J. 2014, May 2. The qualified self: going beyond quantification. [Web log post]. Retrieved from http://designmind.frogdesign.com/2014/05/qualified-self-going-beyond-quantification/
Brian, S. 2015. The unexamined life in the era of big data: toward a UDAAP for data. Retrieved from http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2533068
Eynon, R. 2013. The rise of Big Data: what does it mean for education, technology, and media research? Learning, Media and Technology, DOI: 10.1080/17439884.2013.771783
References (cont)
Gurses, S. 2015. Privacy and security. Can you engineer privacy? Viewpoints. Retrieved from http://cacm.acm.org/magazines/2014/8/177015-can-youengineer-privacy/fulltext
Lane, J., Stodden, V., Bender, S., & Nissenbaum, H. (Eds). 2015. Privacy, big data, and the public good. New York, NY: Cambridge University Press.
Miyazaki, D., & Fernandez, A. 2000. Internet privacy and security: an examination of online retailer disclosures. Journal of Public Policy & Marketing, 19(1), 54-61.
Mackenzie, C., Rogers, W., & Dodds, S. (Eds.). 2014. Vulnerability: new essays in ethics and feminist philosophy. Oxford: Oxford University Press.
References (cont)
Prinsloo, P., & Slade, S. 2015. Student privacy self-management: implications for learning analytics. In Proceedings of the Fifth International Conference on Learning Analytics and Knowledge (LAK15).
Slade, S., & Prinsloo, P. 2014. Student perspectives on the use of their data: between intrusion, surveillance and care. EDENRW, Oxford, UK, 27-28 October. Retrieved from http://oro.open.ac.uk/41229/
Solove, D.J. 2013. Introduction: privacy self-management and the consent dilemma. Harvard Law Review, 126, 1880. Available at SSRN: http://ssrn.com/abstract=2171018