3. Learning Analytics
HE Managers see:
• Promise vs. Concerns
• Potential vs. Risks
• Benefits vs. Cost
• Purpose vs. Competitive Pressures
• Intentions vs. Hesitations
• Leading to Confusion
HEIs: How to implement LA?
4. Institutionalising LA
Recent setbacks in education:
• (1) inBloom
• (2) Snappet
Acceptance factors:
Data subjects being sufficiently aware of the
consequences of using the system, the validity and
relevance of the results obtained, and the level of
transparency of the data model.
5. Privacy as Show Stopper for LA
• $100 million investment
• Aim: Personalized learning in public schools, through data & technology standards
• 9 US states participated; by 2013, data about millions of children had been stored
6. Privacy as Show Stopper for LA
Ignoring the fears and public
perception of the application of
analytics can lead to a lack of
acceptance, protests, and even failure
of entire LA implementations.
7. Related Research Work
Prinsloo & Slade (2013)
Slade & Prinsloo (2013)
Pardo & Siemens (2014)
Prinsloo & Slade (2015)
Hoel & Chen (2015)
Sclater & Bailey (2015)
Steiner, Kickmeier-Rust & Albert (2015)
9. Related Policies
• Dutch HEI law on data collection, like that of all EU countries, follows principles dating back to the Nuremberg trials: data collection is allowed to improve education, given a clear purpose, consent, and limited access
• Engelfriet, A., Jeunink, E., & Manderveld, J. (2015). Learning analytics onder de Wet bescherming persoonsgegevens. https://www.surf.nl/kennis-en-innovatie/kennisbank/2015/learning-analytics-onder-de-wet-bescherming-persoonsgegevens.html
• A SURF follow-up report with use cases is in preparation
10. Research Ethics
Origins lie in the post-WWII era. Milestones:
• Nuremberg Code (1949)
• Helsinki Declaration (1964)
• Belmont Report (1978)
• 2000s: Biomedical Science
• RRI (Responsible Research and Innovation)
12. Privacy
• The right to be let alone (Westin 1968)
• Informational self-determination (Flaherty
1989)
• Informational, decisional, local privacy
(Roessler 2005)
• Privacy is not anonymity or data security!
14. Who are the Bad Guys?
Me & my data, surrounded by:
• Government?
• Commerce?
• Education?
• Hackers & bad guys?
15. Legal Frameworks
• EU Data Protection Directive 95/46/EC (automated processing of ‘personal data’); from 2016: General Data Protection Regulation (GDPR)
• Restricting the (re-)use of data stands against, and contradicts, Big Data business models
• European Data Retention Directive 2006/24/EC (data storage for security purposes)
18. Fears
• Power-relationship, user exploitation
• Data ownership
• Anonymity and data security
• Privacy and data identity
• Transparency and trust
19. Power-relationship
• Tracking and identifying users or citizens by the state or by corporations (e.g. insurance companies, banks, car manufacturers) does not benefit the user!
• The power-relationship is asymmetrical
20. Exploitation
• Free labour as the business model of for-profit companies
• Crowdsourcing outside the “commons”
24. Anonymity and Data Security
• There is no absolute anonymity or de-identification
• Integrating multiple data sources increases the risk of compromising personal identity
• Data stores are not 100% secure
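The second bullet can be illustrated with a minimal, self-contained Python sketch of a linkage attack (all datasets, names, and field values below are invented for illustration): two datasets with direct identifiers removed can still share quasi-identifiers such as postcode, birth year, and gender, and joining on those fields re-identifies individuals.

```python
# Hedged sketch: why combining "anonymised" data sources can compromise
# personal identity. Both datasets have names stripped, but they share
# quasi-identifiers; joining on them links grades back to named people.
# All records here are invented for illustration.

anonymised_grades = [  # direct identifiers removed, quasi-identifiers kept
    {"zip": "6411", "birth_year": 1998, "gender": "F", "grade": 8.5},
    {"zip": "6411", "birth_year": 1995, "gender": "M", "grade": 6.0},
]

public_register = [  # e.g. a published enrolment list with names
    {"name": "A. Jansen", "zip": "6411", "birth_year": 1998, "gender": "F"},
    {"name": "B. de Vries", "zip": "6411", "birth_year": 1995, "gender": "M"},
]

def link(records, register):
    """Re-identify records by matching on shared quasi-identifiers."""
    keys = ("zip", "birth_year", "gender")
    index = {tuple(p[k] for k in keys): p["name"] for p in register}
    return [
        {**r, "name": index[tuple(r[k] for k in keys)]}
        for r in records
        if tuple(r[k] for k in keys) in index
    ]

reidentified = link(anonymised_grades, public_register)
# Each grade record now carries a name again, despite "anonymisation".
```

The point of the sketch is that anonymity degrades with every additional dataset an attacker can join against, which is why de-identification alone is not a privacy guarantee.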
25. Privacy and Data Identity
• System identity vs. social identity
• People are approximated onto data models by probability
• Power: data subjects have no say in the design of the data model
Arora, P. (2016). Bottom of the data pyramid: Big data and the global South. International Journal of Communication
26. Transparency and Trust
• Assumption: more transparency = more trust
• But: the relationship is mostly asymmetrical (individual vs. big corporation)
• Transparency as an instrument of control
31. Call for Papers
Special Issue: Journal of HE
Development (ZfHE)
Learning Analytics:
Implications for Higher
Education
http://bit.ly/1qXTaNz
Deadline: 10 June 2016
Publication date: Spring
2017
Editor's notes
DELICATE is not done by rule of thumb
To put LA into institutional practice, HE managers require guidance.
Institutionalisation of Learning Analytics, especially in smaller institutions, has slowed down and in some places even been reversed, following negative publicity from some high-level cases combined with a general scare about data abuse on the Internet.
This shows that ignoring the fears and public perception of the application of analytics, no matter how benevolent the intent, can lead to a lack of acceptance, protests, and even failure of entire Learning Analytics implementations.
The rise of Learning Analytics has recently led to an academic discourse around the topic of ethics and privacy.
This has already led to the development of some important Learning Analytics policies by several organisations.
Research ethics originated in the post-WWII era, especially in the field of human research and experimentation on and with humans. More recently this has been extended to include RRI and Big Data processing of human data subjects.
Not all research has been compliant with the established ethical consensus. Facebook conducted a research experiment that damaged the social network's relationship with its users.
Privacy is a psychological state that is in constant negotiation with the (social) environment.
Privacy has to be evaluated in the respective environment.
The picture in the slide could show a very indecent act of interaction if taken out of its context: it is a historic picture of a male gynaecologist respecting a patient's privacy.
However, this contextual integrity and the protection of context bound information stands against the repurposing of data and big data research.
There is substantial confusion about security on the internet mixing up some key players.
We think education is different!
GDPR: directly implemented in nation states, no ratification needed. Is a step to a unified approach. Comes into force by 2018.
GDPR means in practice that a company outside the EU which is targeting consumers in the EU will be subject to the GDPR. This is not the case currently.
The new Regulation also gives users the “right to be forgotten”, to prevent permanent stigmatisation from early childhood on. This allows people to better change and control their destiny without being haunted by past events and behaviours.
We identified the five most common fears in connection with Big Data in education. We'll elaborate on them in the next couple of slides.
Power & exploitation
Data ownership
Anonymity & data security
Privacy & data identity
Transparency & trust
Power: More often than not there is an asymmetrical power relationship between data controller and data subject.
There is also unease about data identities that are designed by “others” and into which users get put or approximated, especially in the context of the developing world.
Users feel exploited, as on Facebook, where contributions by users are seen as free labour for the world's largest advertising company.
HOWEVER: Education has a benevolent mission and has largely been based on a trusted relationship: that it works to the students' benefit. In DELICATE, we aim at helping providers to develop into “Trusted Knowledge Organisations”.
Data ownership: there is currently no clear regulation of who owns which data.
BUT: the new EU GDPR will, for example, enforce the “right to be forgotten”.
Offers to hand over data are often not useful to the users – how many people can do something with them?
There are moves to enable data subjects to carry and curate their own data in so-called “personal data stores” (e.g. MIT PDS). Not sure whether and for how long this might work.
It's agreed that there is no absolute anonymity. In our DELICATE aid for implementers we have also given room to reflections on physical data security and on sensitive personal data, to help manage the risk.
Privacy & data identity: Arora (2016) distinguishes between “system identity” (the cumulative sum of the data traces we leave behind, mapped onto a data model) and “social identity” (how we are perceived by our social environment). Biometric information makes this even more sensitive. BUT: in education we assume a “learning contract” with the institution or provider. For the duration of this relationship, we should assume “trust” as the default value. Personalised learning indicates a move towards individual treatment as opposed to stereotypical “boxing”. The necessary transparency about what happens to student/teacher data is therefore included in DELICATE.
Transparency & trust: there is the assumption that more transparency leads to more user empowerment and trust. However, the relationship is mostly asymmetrical, as the transparency of an individual person isn't the same as the transparency of a complex system like a multi-national corporation. Even with all the information available, complex organisations or universities aren't easily understood. Transparency is also an instrument of control, therefore leading to more scepticism rather than trust. In DELICATE we postulate that being open about the intentions and open to feedback by stakeholders does help ease the problem. Playing results back to the users themselves may involve them in the process and lead them to take more responsibility for their own learning by reflecting on the data.
DELICATE checklist is intended to be an aid not a solution to the mentioned issues. Education institutions should be confident in being trusted organisations that are open to scrutiny from a wide variety of stakeholders and others (parents, inspectors, authorities, peers, auditors, funding councils, etc.) – thus, the levels of dangers to privacy and re-purposing of data should be lower than elsewhere.