[Extended] Bottom-up growth of learning analytics at two Australian universities: Empowering staff with actionable intelligence to improve student outcomes
Presented at the University of New South Wales Learning Analytics and Educational Data Science research group meeting, April 2016.
This presentation outlines two approaches to learning analytics at the University of Sydney and Macquarie University, where staff are closely involved in the coevolution and development of two bespoke learning analytics tools that personalise student-staff interactions at scale. The University of Sydney system, the Student Relationship Engagement System (SRES), is a highly customisable web-based tool that supports the efficient capture and collation of student datasets. A companion mobile app helps staff quickly collect and access student data. Through an embedded messaging system, teaching staff can set up fully customisable rules to contact students via personalised emails and text messages. A nascent feature allows staff to leverage machine learning to uncover hidden patterns and relationships within and between datasets. The Macquarie University system is an enhancement of an existing Moodle plug-in, the Moodle Engagement Analytics Plugin (MEAP). MEAP can readily access data on student assessments, completions, login activity, forum activity, and the gradebook, amongst other sources, and represents these as customisable ‘risk indicators’. MEAP allows flexible interrogation of these data, and gives staff the ability to send personalised emails to students based on the risk indicators. At both institutions, these learning analytics approaches have grown from the grassroots to address pressing staff needs, highlighting the importance of this bespoke coevolution process of design, development, and implementation. The systems have enjoyed substantial organic adoption and are associated with positive student outcomes. Both are open source, and we are keen to work with others to open up accessible learning analytics to teachers and students.
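The abstract mentions a nascent SRES feature that uses machine learning to surface relationships within and between datasets. The actual method is not specified here; as a minimal, hypothetical illustration, the simplest such relationship is a Pearson correlation between two collated columns (synthetic data):

```python
# Hypothetical illustration of surfacing relationships between collected
# datasets: a Pearson correlation between two columns. The SRES feature's
# actual method is not specified; data below are synthetic.
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

attendance = [10, 8, 3, 1, 9]      # tutorials attended (synthetic)
final_mark = [82, 74, 51, 40, 78]  # final mark, % (synthetic)
r = pearson(attendance, final_mark)  # strong positive relationship
```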
1. The University of Sydney Page 1
Bottom-up growth of learning analytics at two
Australian universities:
Empowering staff with actionable intelligence to improve student outcomes
@dannydotliu
danny.liu@mq.edu.au
danny.liu@sydney.edu.au
2. The University of Sydney Page 2
Learning analytics is the measurement, collection, analysis
and reporting of data about learners and their contexts, for
purposes of understanding and optimising learning and the
environments in which it occurs.
1st International Conference on Learning Analytics and Knowledge 2011
4. The University of Sydney Page 4
1: Macro and micro
– Data perspective: aggregate1 vs contextualised2
[1] Jayaprakash, S. M., Moody, E. W., Lauría, E. J., Regan, J. R. and Baron, J. D. (2014) Early alert of academically at-risk students: An open source analytics initiative. Journal of Learning Analytics, 1(1), 6-47.
[2] Gašević, D., Dawson, S., Rogers, T. and Gasevic, D. (2016) Learning analytics should not promote one size fits all: The effects of instructional conditions in predicting academic success. The Internet and Higher Education, 28, 68-84.
[Diagram: a predictive algorithm and the question of what meaning its outputs carry]
5. The University of Sydney Page 5
1: Macro and micro
– Scale: large vs small cohorts
– Difficulties and benefits of learning and teaching at scale
• More students
• More data
• Less time
• Less personal
– What is the purpose of learning analytics?
6. The University of Sydney Page 6
2: Human and computer
– The fields of learning analytics and
educational data mining1
– Prevailing wind: collect lots of data and
throw algorithms at it
– The allure of predictive algorithms
[1] Siemens, G. and Baker, R. S. (2012) Learning analytics and educational data mining: towards communication and collaboration. In Proceedings LAK12, Vancouver.
[2] http://www.jonbecker.net/ranty-blog-post-about-big-data-learning-analytics-higher-ed/
“People seem to genuinely believe that if [we] just
have enough data and squint REALLY hard at those
data, we can begin to see the solutions to our student
success problem.” 2
7. The University of Sydney Page 7
2: Human and computer
– Who really is the master?
– Data and algorithms?
– Students and teachers?
– “Intelligence amplification”1
– Levels of abstraction
– Meaningful information and
representations2
[1] Baker, R. S. (2016) Stupid Tutoring Systems, Intelligent Humans. International Journal of Artificial Intelligence in Education, 1-15.
[2] Jones, D., Beer, C. and Clark, D. (2013) The IRAC framework: Locating the performance zone for learning analytics. In Proceedings ascilite, Sydney.
“The learning system is not itself intelligent;
the human intelligence that surrounds the
system is supported and leveraged.
Designers are informed to support redesign
and enhancement of a learning system;
instructors are informed so that they can
support the student right away.” 1
8. The University of Sydney Page 8
3: Research and teaching
– Institutional culture
– Ethics
– Learning analytics research
– Scalability and flexibility
– Frameworks, models, theorisation, discussion, conjecture
• Where are the practical applications and impact?
[1] Macfadyen, L. P. and Dawson, S. (2012) Numbers are not enough. Why e-learning analytics failed to inform an institutional strategic plan. Journal of Educational Technology & Society, 15(3), 149-163.
“… no vision or plan will emerge or be embraced without the support of faculty…
[who] may view the introduction of technologies into teaching as a time-consuming
imposition, as something that diverts them from current research and teaching
activities…” 1
9. The University of Sydney Page 9
3: Research and teaching
[1] Clow, D. (2012) The learning analytics cycle: closing the loop effectively. In Proceedings LAK12, Vancouver.
[2] Jones, D., Beer, C. and Clark, D. (2013) The IRAC framework: Locating the performance zone for learning analytics. In Proceedings ascilite, Sydney.
Learning analytics cycle1: learners → data → metrics & analytics → intervention & effect
IRAC framework2:
• Information: relevance, sufficiency, processing
• Representation: understandable, useful
• Affordances for action: engaging users, appropriate actions
• Change: continuous development, adaptation
10. The University of Sydney Page 10
4: Top down and bottom up
“Faced with dashboards that promise the
moon, but that are meaningless in light of
the concrete questions, it is unsurprising
when one hears administrators, faculty,
and students describe learning analytics
as creepy and useless.” 1
[1] http://timothyharfield.com/blog/2014/07/23/using-learning-analytics-to-promote-success-among-the-successful/
[2] http://chronicle.com/article/What-s-Really-to-Blame-for/235620/
Vendor institutional packages vs bespoke agile development
“Higher education needs to get better at
academic needs assessment… It requires
an in-depth exploration of how teaching
and learning happens in various corners
of the campus community and what
capabilities would be most helpful to
support those efforts.” 2
11. The University of Sydney Page 11
4: Top down and bottom up
[1] Colvin, C., Rogers, T., Wade, A., Dawson, S., Gašević, D., Buckingham Shum, S., Nelson, K. J., Alexander, S., Lockyer, L., Kennedy, G., Corrin, L. and Fisher, J. (2016) Student retention and learning analytics: a snapshot of Australian practices and
a framework for advancement. Australian Government Office for Learning and Teaching, Canberra, ACT.
[2] Lonn, S., Aguilar, S. and Teasley, S. D. (2013) Issues, challenges, and lessons learned when scaling up a learning analytics intervention. In Proceedings LAK13, Leuven.
“We found that mature foundations for LA implementations were identified in institutions
that adopted a rapid innovation cycle whereby small scale projects are initiated and
outcomes quickly assessed within short time frames. The successful projects in this cycle are
then further promoted for scale-ability and mainstream adoption. In the context of LA, this
small-scale seeded approach appeared more effective in terms of organisational
acceptance and adoption than a whole of institution model attempting to roll out a single
encompassing program.” 1
“… while [institutional IT-supported tool] is a reasonable cost-savings solution, it may not
be nimble enough… This may end up being an irreconcilable challenge to the ability of the
system to scale…” 2
13. The University of Sydney Page 13
1. Macquarie: Empowering staff with actionable LMS data
2. Sydney: Learning analytics by stealth
3. Macquarie: Building institutional readiness through openness
14. Empowering staff with actionable LMS data
THE MOODLE ENGAGEMENT ANALYTICS PLUGIN (MEAP)
April 2016
15. Learning analytics in Moodle
MOTIVATIONS AND BENEFITS
• Large(ish) classes, failure and retention issues
• Staff familiarity with Moodle, single point of access
• Learning experience data already centralised
16. Learning analytics in Moodle
MUCH PROMISE BUT LITTLE DELIVERY?
https://moodle.org/plugins/block_gismo
Mazza et al. (2012)
https://moodle.org/plugins/browse.php?list=set&id=20
Log viewer
Statistics report
GISMO
MOCLog
Engagement Analytics
17. Enhancing an existing plugin
MOODLE ENGAGEMENT ANALYTICS PLUGIN (MEAP)
• Originally developed by Phillip Dawson, Adam Olley, and Ashley Holman
[Diagram: MEAP reads Moodle data on logins, forums, and assessments; configurable indicators and parameters (plus bring-your-own indicators) produce a traffic-light report and prompt action]
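MEAP's indicator-and-traffic-light idea can be sketched in a few lines. The indicator formulas, weights, and thresholds below are illustrative assumptions, not the plugin's real configuration (MEAP itself is PHP inside Moodle):

```python
# Sketch of MEAP-style risk indicators (illustrative weights and thresholds).
# Each indicator maps raw Moodle data to a risk score in [0, 1]; a weighted
# sum gives an overall rating that is bucketed into a traffic light.

def login_risk(days_since_login, max_days=14):
    """More days without a login means higher risk, capped at 1.0."""
    return min(days_since_login / max_days, 1.0)

def assessment_risk(submitted, expected):
    """Fraction of expected assessment submissions still missing."""
    return 1.0 - submitted / expected if expected else 1.0

def overall_rating(student, weights=(0.4, 0.6)):
    w_login, w_assess = weights
    score = (w_login * login_risk(student["days_since_login"])
             + w_assess * assessment_risk(student["submitted"], student["expected"]))
    return round(score, 2)

def traffic_light(score, amber=0.3, red=0.6):
    return "red" if score >= red else "amber" if score >= amber else "green"

student = {"days_since_login": 7, "submitted": 1, "expected": 4}
rating = overall_rating(student)  # 0.4*0.5 + 0.6*0.75 = 0.65
light = traffic_light(rating)     # "red"
```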
18. Involving staff
ACADEMICS AND STUDENT SUPPORT
Staff expectations around an
early alert system
Prototyping and development
User testing
Piloting
Feedback and further development
21. Stakeholder outcomes
PERSONALISED, DATA-DRIVEN INTERVENTIONS
• Who: unit convenors and student support staff
• What: census, updates, reminders
• Why: predominantly for at-risk students
• How: logins, assessment submissions, grades, attendance
I was surprised someone cared/was actually monitoring, kind
of a weird, I don't know totalitarian/'people are watching you'
feeling? But in this situation I was happy.
He gave me specific advice and encouraged me and it made
me feel much better.
The email basically kicked me into gear and I completed all my
assessments post-email to a high level.
Very useful. I wouldn't have been able to do such a large scale
analysis and identify so many students without MEAP. I
wouldn't have been able to send them such tailored, structured
and consistent messages.
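The interventions above share a common shape: a rule selects recipients from the collected data, and a template is mail-merged per student. A minimal sketch, with a hypothetical rule and template (MEAP and SRES each provide this idea through their own interfaces):

```python
# Sketch of rule-based personalised messaging (hypothetical rule and template;
# both MEAP and SRES implement this idea in their own interfaces).

TEMPLATE = ("Hi {name}, our records show {missing} of your assessments are "
            "still outstanding. Please get in touch if anything is getting "
            "in the way.")

def select_recipients(students, min_missing=2):
    """Rule: contact students missing at least min_missing assessments."""
    return [s for s in students if s["expected"] - s["submitted"] >= min_missing]

def personalise(student):
    """Mail-merge the template for one student."""
    return TEMPLATE.format(name=student["name"],
                           missing=student["expected"] - student["submitted"])

students = [
    {"name": "Alex", "submitted": 1, "expected": 4},
    {"name": "Sam", "submitted": 4, "expected": 4},
]
messages = [personalise(s) for s in select_recipients(students)]  # only Alex
```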
22. Stakeholder outcomes
THE DARK SIDE
• Issues around:
  • Message composition
  • Suggestions
  • Student contexts
  • Being labelled
I was contacted but in a manner that suggest I should drop out of
the course and not waste the convenor's time. She didn't ask
whether I was experiencing any issues outside of university,
but simply that I should transfer course if I can't handle the
workload.
Being labelled as lazy when you're doing your best and
don't have any other choice is quite sad.
Being aware and then being told of my own inadequacies is
confronting and, yes, does make me feel worse about life in
general. It's something I need to be told and is still that extra bit of
motivation.
The email was worded in a way that it the unit [convenor] was
trying to tell me I was doing horrible and should drop out and
didn't refer any help.
23. Stakeholder outcomes
PERSONALISED, DATA-DRIVEN INTERVENTIONS
• Was there an effect?
[Charts: online activity and risk ratings over time, with change in risk rating compared across three groups: no email sent, emailed but not opened, opened email]
24. Stakeholder outcomes
PERSONALISED, DATA-DRIVEN INTERVENTIONS
[Chart: change in risk rating (-0.1 to 0.25) on forum, login, and missed online quizzes & tutorial submissions, compared across three groups: no email sent, emailed but not opened, opened email]
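The effect analysis compares the change in risk rating across the three groups. A sketch of that comparison with invented numbers (the real analysis used MEAP's risk ratings before and after the email intervention):

```python
# Synthetic sketch of the group comparison: mean change in risk rating for
# students who received no email, received but did not open one, or opened
# one. All numbers below are invented for illustration only.
from statistics import mean

changes = {
    "no email sent":          [0.02, -0.01, 0.00, 0.05],
    "emailed but not opened": [0.03, -0.02, 0.01, 0.02],
    "opened email":           [-0.08, -0.12, -0.05, -0.07],
}

group_means = {group: round(mean(vals), 3) for group, vals in changes.items()}
```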
25. Back to the users
FURTHER ITERATIVE DEVELOPMENT
27. Lessons learned and next steps
LOOKING BACK AND LOOKING FORWARD
• Looking back
• Talk to the users
• Find champions
• Staff have varying levels of error tolerance
• LA is a political football
• Looking forward
• Further evaluating impact
• Production and wider trial
• Community collaboration
• New Moodle LA spec
https://docs.moodle.org/dev/Learning_Analytics_Specification
28. The University of Sydney Page 28
1. Macquarie: Empowering staff with actionable LMS data
2. Sydney: Learning analytics by stealth
3. Macquarie: Building institutional readiness through openness
29. The University of Sydney Page 29
Learning analytics by
stealth
The Student Relationship Engagement System
(SRES)
[Cover image: “Standing out from a Crowd” by SumAll, CC BY-NC-ND 2.0, https://flic.kr/p/kYbv4C]
30. The University of Sydney Page 30
The contexts of learning analytics
Common barriers to adoption
– Policy and ethical
challenges
– Culture of resistance to
change
– Vendor solutions
– Data accuracy
– One-size-fits-all
Pressing institutional needs
– ~$7 million/year lost to
attrition
– Larger class sizes
– More disconnected students
– Feedback very generalised
31. The University of Sydney Page 31
[Diagram: the Student Relationship Engagement System collates attendance, interim grades, LMS metrics, and data from third-party tools and other sources]
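The SRES's core job of collating disparate sources into one record per student can be sketched as a simple join. Student IDs, column names, and values here are illustrative, not the system's actual schema:

```python
# Sketch of SRES-style collation: one record per student built from several
# sources (student IDs, column names, and values are illustrative).

attendance = {"s1": 9, "s2": 4}        # tutorials attended
interim_grades = {"s1": 72, "s2": 48}  # interim grade, %
lms_logins = {"s1": 30, "s2": 6}       # LMS logins this semester

def collate(student_ids):
    """Join the sources on student ID; missing values become None."""
    return {
        sid: {
            "attendance": attendance.get(sid),
            "interim_grade": interim_grades.get(sid),
            "lms_logins": lms_logins.get(sid),
        }
        for sid in student_ids
    }

roster = collate(["s1", "s2"])
```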
32. The University of Sydney Page 32
Personalising connections with students
– Empowering staff
  • Flexible & intuitive
  • Targeted and personalised
  • Multi-channel
– Benefits
  • Highly customisable
  • Efficient – key data in one place, operating at scale
  • Connect staff and all students (not just at-risk)
33. The University of Sydney Page 33
Stakeholder outcomes
[Charts: grade distributions (discontinued, failed, passed; F, P, C, D, HD) for a 1st year arts unit (~500 enrolment) and a 1st year science unit (~1000 enrolment)]
“Many thanks Adam. Yes, things are going much better this semester. I really appreciate
how you keep in contact and keep an eye on us. It's such a big class, I don't know how you
do it.”
“Just to let you know that your emails really helped me survive last semester. I never
realised how big a change it would be from school.”
34. The University of Sydney Page 34
Co-evolution of the SRES
– Organic adoption by academics
– Co-evolution of capabilities
[Chart, 2012-2015: growth in the number of units (0 to ~70), schools, and students (0 to ~25,000) using the SRES, annotated with milestones: pilot, EWS introduced, EWS integrated, new analyses, more data types, data import]
35. The University of Sydney Page 35
Learning analytics by stealth?
– Co-evolving capabilities and competencies around data-driven
pedagogy and curriculum
38. The University of Sydney Page 38
Lessons learned and next steps
– Looking back
– It was ugly but it worked
– Ease of use is important – does it save time?
– Attendance was an (unsurprisingly?) popular metric
– Everyone uses a customisable system differently
– Personalisation at scale
– Looking forward
– Trans-Tasman redevelopment effort
– Facilitate wider roll-out
– Research & evaluation of impact on students and staff
39. The University of Sydney Page 39
1. Macquarie: Empowering staff with actionable LMS data
2. Sydney: Learning analytics by stealth
3. Macquarie: Building institutional readiness through openness
41. Our approach
CONNECTING USERS WITH DATA THROUGH ANALYTICS
[Diagram: the Macquarie Open Analytics Toolkit connects data (LMS, video, classrooms, mobile, business systems, external courses) with users through analytics, to understand students and to identify and predict, supported by a code of practice and “LAMP Lighters” champions]
42. Bringing data together
NUANCES OF LEARNING DATA
[Diagram: data from the LMS, video, classrooms, mobile, business systems, and external courses flow into a Learning Record Store (LRS), which feeds a custom analytics engine]
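Data flowing into an LRS are typically expressed as xAPI statements built around an actor, a verb, and an object. A minimal example of the kind of statement an LRS stores; the email address and activity identifier are placeholders:

```python
# A minimal xAPI statement (actor-verb-object) of the kind an LRS stores.
# The email and activity ID are placeholders; real statements are POSTed
# to the LRS's statements resource with authentication headers.
import json

statement = {
    "actor": {"mbox": "mailto:student@example.edu", "name": "Example Student"},
    "verb": {"id": "http://adlnet.gov/expapi/verbs/experienced",
             "display": {"en-US": "experienced"}},
    "object": {"id": "http://example.edu/activities/lecture-video-1",
               "definition": {"name": {"en-US": "Lecture video 1"}}},
}

payload = json.dumps(statement)  # body of the POST to the LRS
```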
45. Lessons learned and next steps
LOOKING BACK AND LOOKING FORWARD
• Looking back
• Multi-disciplinary, multi-level teams
• System architecture choice is important
• Students are very open about data (unless it’s identifiable)
• Staff and students have a (surprisingly) good idea of what they want
• Looking forward
• LRS to production
• Piloting ‘dashboards’* with staff and students
• Working with xAPI community
• Open sourcing analytics tools
47. The University of Sydney Page 47
Lessons learned and issues raised
– Give them what they want vs. build it and they will come
– Customisability is key
– Utility (eventually) trumps aesthetics (to an extent)
– But people still like shiny things
– Data are not enough – connect with pedagogical, pastoral
– Surprisingly little kickback about privacy & ethics
– It’s tricky to measure impact
– Iterate – capabilities, implementation
– Focus on the human
48. The University of Sydney Page 48
Four tensions
Research vs Teaching
Top-down vs Bottom-up
Human vs Computer
Macro vs Micro
49. The University of Sydney Page 49
Learning analytics is not an elixir for ineffective teaching,
nor does it reveal an ideal pedagogy; instead, it provides
data-driven tools or suggestions to help instructors make
changes that can be measured in terms of student outcomes.
Pistilli, M. D., Willis III, J. E., & Campbell, J. P. (2014). Analytics Through an Institutional Lens:
Definition, Theory, Design, and Impact. In Learning Analytics (pp. 79-102). Springer New York.
50. The University of Sydney Page 50
1. MEAP Empowering staff with actionable LMS data
Chris Froissard, Deborah Richards, Amara Atif
2. SRES Learning analytics by stealth
Charlotte Taylor, Adam Bridgeman, Abelardo Pardo, Kathryn Bartimote-Aufflick et al.
3. MOAT Building institutional readiness through openness
James Hamilton, Ed Moore, Yvonne Breyer, Yvonne Nemes et al.
@dannydotliu
danny.liu@mq.edu.au
danny.liu@sydney.edu.au