The power of learning analytics for UCL: lessons learned from the Open University UK
Arena Exchange Seminar, UCL, London, 22 July 2016
@DrBartRienties, Reader in Learning Analytics
What is learning analytics?
http://bcomposes.wordpress.com/
(Social) Learning Analytics
“LA is the measurement, collection, analysis and reporting of data about learners
and their contexts, for purposes of understanding and optimising learning and the
environments in which it occurs” (LAK 2011)
Social LA “focuses on how learners build knowledge together in their cultural
and social settings” (Ferguson & Buckingham Shum, 2012)
Agenda? You choose!
1. The power of 151 learning designs on 113K+ students at the OU?
2. Analytics4Action: evidence-based interventions?
3. OU Analyse: predictive analytics with an automated student recommender?
4. Key drivers of satisfaction for 100K+ students?
5. Opportunities of learning analytics/e-learning for teaching practice, grant acquisition, commercialisation, and wider policy implications.
Learning design taxonomy: seven types of activity

• Assimilative — attending to information. Example verbs: read, watch, listen, think about, access, observe, review, study.
• Finding and handling information — searching for and processing information. Example verbs: list, analyse, collate, plot, find, discover, access, use, gather, order, classify, select, assess, manipulate.
• Communication — discussing module-related content with at least one other person (student or tutor). Example verbs: communicate, debate, discuss, argue, share, report, collaborate, present, describe, question.
• Productive — actively constructing an artefact. Example verbs: create, build, make, design, construct, contribute, complete, produce, write, draw, refine, compose, synthesise, remix.
• Experiential — applying learning in a real-world setting. Example verbs: practice, apply, mimic, experience, explore, investigate, perform, engage.
• Interactive/Adaptive — applying learning in a simulated setting. Example verbs: explore, experiment, trial, improve, model, simulate.
• Assessment — all forms of assessment, whether continuous, end of module, or formative (assessment for learning). Example verbs: write, present, report, demonstrate, critique.
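The studies that follow quantify each module against this taxonomy as shares of planned study time. A minimal sketch in Python, with invented percentages (not OU data), of how such a learning design profile can be represented and queried:

```python
# One module's learning design as percentage of planned study time per
# activity type. The figures below are illustrative placeholders.
module_design = {
    "assimilative": 45.0,
    "finding_information": 5.0,
    "communication": 10.0,
    "productive": 15.0,
    "experiential": 5.0,
    "interactive_adaptive": 5.0,
    "assessment": 15.0,
}

# Shares should cover the whole planned workload.
assert abs(sum(module_design.values()) - 100.0) < 1e-9

def dominant_activity(design: dict) -> str:
    """Return the activity type with the largest share of study time."""
    return max(design, key=design.get)

print(dominant_activity(module_design))  # -> "assimilative"
```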
Method – data sets
• Combination of four different data sets:
  • learning design data (189 modules mapped, 276 module implementations included)
  • student feedback data (140 modules)
  • VLE data (141 modules)
  • academic performance data (151 modules)
• Data sets merged and cleaned
• 111,256 students undertook these modules
Toetenel, L. & Rienties, B. (2016). Analysing 157 Learning Designs using Learning Analytic approaches as a means to evaluate the impact of pedagogical
decision-making. British Journal of Educational Technology.
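A minimal sketch, with hypothetical file and column names, of how four data sets like these could be merged and cleaned with pandas; the OU's actual pipeline is not given in this deck, and the join key (module code plus presentation) is an assumption:

```python
import pandas as pd

# Hypothetical files, one row per module implementation in each.
design = pd.read_csv("learning_design.csv")        # activity-type percentages
feedback = pd.read_csv("student_feedback.csv")     # aggregated SEaM scores
vle = pd.read_csv("vle_engagement.csv")            # average weekly VLE time
performance = pd.read_csv("performance.csv")       # completion/pass rates

key = ["module_code", "presentation"]
merged = (
    design
    .merge(feedback, on=key, how="inner")
    .merge(vle, on=key, how="inner")
    .merge(performance, on=key, how="inner")
    .dropna(subset=["overall_satisfaction", "completion_rate"])
)
print(merged.shape)  # modules retained after merging and cleaning
```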
[Figure: conceptual model — across 151 modules, four learning design clusters (constructivist, assessment, productive, socio-constructivist) are mapped week by week (Week 1, Week 2, … Week 30+) onto VLE engagement, student satisfaction and student retention, alongside disciplines, levels and module size.]
Rienties, B., Toetenel, L., & Bryan, A. (2015). "Scaling up" learning design: impact of learning design activities on LMS behavior and performance. Learning Analytics Knowledge conference.
[Charts: average time spent per week in the VLE for each learning design cluster — Cluster 1 Constructive (n=73), Cluster 2 Assessment (n=10), Cluster 3 Productive (n=38), Cluster 4 Social Constructivist (n=20).]
Correlations between learning design activity and weekly VLE engagement (Assim = assimilative, Find = finding information, Com. = communication, Prod = productive, Exp = experiential, Inter = interactive, Asses = assessment):

Week   Assim    Find     Com.     Prod    Exp     Inter    Asses    Total
-2     -.03     .02      -.02     -.09    .20*    -.03     .01      .35**
-1     -.17*    .14      .14      -.01    .30**   -.02     -.05     .38**
0      -.21*    .14      .37**    -.07    .13     .08      .02      .48**
1      -.26**   .25**    .47**    -.02    .28**   .01      -.10     .48**
2      -.33**   .41**    .59**    -.02    .25**   .05      -.13     .42**
3      -.30**   .33**    .53**    -.02    .34**   .02      -.15     .51**
4      -.27**   .41**    .49**    -.01    .23**   -.02     -.15     .35**
5      -.31**   .46**    .52**    .05     .16     -.05     -.13     .28**
6      -.27**   .44**    .47**    -.04    .18*    -.09     -.08     .28**
7      -.30**   .41**    .49**    -.02    .22**   -.05     -.08     .33**
8      -.25**   .33**    .42**    -.06    .29**   -.02     -.10     .32**
9      -.28**   .34**    .44**    -.01    .32**   .01      -.14     .36**
10     -.34**   .35**    .53**    .06     .27**   .00      -.13     .35**
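A minimal sketch of how a table like this can be computed from the merged data: per-module learning design shares correlated with per-module VLE engagement, separately for each week relative to module start. File and column names are assumptions, not the OU's:

```python
import pandas as pd

weekly = pd.read_csv("modules_weekly.csv")  # one row per module per week

design_cols = ["assimilative", "finding", "communication", "productive",
               "experiential", "interactive", "assessment"]

# For each week, correlate each design share with average VLE engagement.
corr_table = weekly.groupby("week").apply(
    lambda g: g[design_cols].corrwith(g["vle_minutes"])
)
print(corr_table.round(2))
```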
Table 3: Regression model of LMS engagement predicted by institutional, satisfaction and learning design analytics

                              Model 1   Model 2   Model 3
Level0                        -.279**   -.291**   -.116
Level1                        -.341*    -.352*    -.067
Level2                        .221*     .229*     .275**
Level3                        .128      .130      .139
Year of implementation        .048      .049      .090
Faculty 1                     -.205*    -.211*    -.196*
Faculty 2                     -.022     -.020     -.228**
Faculty 3                     -.206*    -.210*    -.308**
Faculty other                 .216      .214      .024
Size of module                .210*     .209*     .242**
Learner satisfaction (SEAM)             -.040     .103
Finding information                               .147
Communication                                     .393**
Productive                                        .135
Experiential                                      .353**
Interactive                                       -.081
Assessment                                        .076
R-sq adj                      18%       18%       40%
n = 140, * p < .05, ** p < .01

Key findings:
• Level of study predicts VLE engagement.
• Faculties differ in VLE engagement.
• Learning design (communication & experiential) predicts VLE engagement, explaining 22% unique variance.
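A minimal sketch, with hypothetical column names, of the three-step hierarchical regression behind Tables 3-5: institutional factors first, then satisfaction, then the learning design shares. (The published coefficients look like standardised betas, so predictors would be z-scored first if exact replication were wanted.)

```python
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("merged_modules.csv")  # hypothetical merged data set

# Step 1: institutional predictors only.
m1 = smf.ols("vle_engagement ~ C(level) + year + C(faculty) + module_size",
             data=df).fit()
# Step 2: add learner satisfaction.
m2 = smf.ols("vle_engagement ~ C(level) + year + C(faculty) + module_size"
             " + satisfaction", data=df).fit()
# Step 3: add the seven learning design activity shares.
m3 = smf.ols("vle_engagement ~ C(level) + year + C(faculty) + module_size"
             " + satisfaction + finding + communication + productive"
             " + experiential + interactive + assessment", data=df).fit()

for name, m in [("Model 1", m1), ("Model 2", m2), ("Model 3", m3)]:
    print(name, "adj R2 =", round(m.rsquared_adj, 2))
```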
Table 4: Regression model of learner satisfaction predicted by institutional and learning design analytics

                              Model 1   Model 2   Model 3
Level0                        .284**    .304**    .351**
Level1                        .259      .243      .265
Level2                        -.211     -.197     -.212
Level3                        -.035     -.029     -.018
Year of implementation        .028      -.071     -.059
Faculty 1                     .149      .188      .213*
Faculty 2                     -.039     .029      .045
Faculty 3                     .090      .188      .236*
Faculty other                 .046      .077      .051
Size of module                .016      -.049     -.071
Finding information                     -.270**   -.294**
Communication                           .005      .050
Productive                              -.243**   -.274**
Experiential                            -.111     -.105
Interactive                             .173*     .221*
Assessment                              -.208*    -.221*
LMS engagement                                    .117
R-sq adj                      20%       30%       31%
n = 150 (Models 1-2), 140 (Model 3), * p < .05, ** p < .01

Key findings:
• Level of study predicts satisfaction.
• Learning design (finding information, productive, assessment) negatively predicts satisfaction.
• Interactive learning design positively predicts satisfaction.
• VLE engagement and satisfaction are unrelated.
Table 5: Regression model of learning performance predicted by institutional, satisfaction and learning design analytics

                              Model 1   Model 2   Model 3
Level0                        -.142     -.147     .005
Level1                        -.227     -.236     .017
Level2                        -.134     -.170     -.004
Level3                        .059      -.059     .215
Year of implementation        -.191**   -.152*    -.151*
Faculty 1                     .355**    .374**    .360**
Faculty 2                     -.033     -.032     -.189*
Faculty 3                     .095      .113      .069
Faculty other                 .129      .156      .034
Size of module                -.298**   -.285**   -.239**
Learner satisfaction (SEAM)             -.082     -.058
LMS engagement                          -.070     -.190*
Finding information                               -.154
Communication                                     .500**
Productive                                        .133
Experiential                                      .008
Interactive                                       -.049
Assessment                                        .063
R-sq adj                      30%       30%       36%
n = 150 (Models 1-2), 140 (Model 3), * p < .05, ** p < .01

Key findings:
• Size of module and discipline predict completion.
• Satisfaction is unrelated to completion.
• Learning design (communication) predicts completion.
[Figure: the same conceptual model across 150+ modules — learning design clusters mapped weekly (Week 1 to Week 30+) onto VLE engagement, student satisfaction and student retention, with communication activities highlighted as the strongest learning design predictor.]
Rienties, B., & Toetenel, L. (2016). The impact of learning design on student behaviour, satisfaction and performance: a cross-institutional comparison across 151 modules. Computers in Human Behavior, 60, 333-341.
Toetenel, L., Rienties, B. (2016) Learning Design – creative design to visualise learning activities. Open Learning.
So what does the OU do in terms of interventions on learning analytics?

Strategic approach: the OU is developing its capabilities in 10 key areas that build the underpinning strengths required for the effective deployment of analytics.
Analytics4Action framework:
• Key metrics and drill-downs
• Deep-dive analysis and strategic insight
• Menu of response actions, underpinned by the Community of Inquiry framework typology
• Methods of gathering data
• Evaluation plans
• Evidence hub

Implementation/testing methodologies:
• Randomised control trials
• A/B testing
• Quasi-experimental
• Apply to all
Menu of actions

Cognitive Presence
  Learning design (before start): redesign learning materials; redesign assignments.
  In-action interventions (during module): audio feedback on assignments; bootcamp before exam.

Social Presence
  Learning design (before start): introduce graded discussion forum activities; group-based wiki assignment; assign groups based upon learning analytics metrics.
  In-action interventions (during module): emotional questionnaire to gauge students' emotions; introduce buddy system; organise additional videoconference sessions; one-to-one conversations; cafe forum contributions; support emails when making progress.

Teaching Presence
  Learning design (before start): introduce bi-weekly online videoconference sessions; podcasts of key learning elements in the module; screencasts of "how to survive the first two weeks".
  In-action interventions (during module): organise additional videoconference sessions; call/text/Skype students at risk; organise catch-up sessions on specific topics that students struggle with.
Toetenel, L., Rienties, B. (2016) Learning Design – creative design to visualise learning activities. Open Learning.
Problem specification – the OU model
• Given:
  – demographic data at the start (may include information about the student's previous modules studied at the OU and their objectives)
  – assessments (TMAs) as they become available during the module
  – VLE activities between TMAs
  – conditions the student must satisfy to pass the module
• Goal:
  – identify students at risk of failing the module as early as possible, so that the OU's intervention is efficient and meaningful.
Available data
• Demographic data: age, gender, new/cont. student, education,
IMD, post code, occup. category, motivation, …
• Presentation-related (fluid) data: VLE logs, TMA (score,
submission date), CMA, payment dates, TMA/CMA weights,
End of module assessment.
• Aggregated VLE data available daily.
Naïve Bayes network
[Diagram: nodes Gender, Education, N/C and VLE feeding into TMA1.]
• Gender: female; male
• Education: no formal qualification; lower than A level; A level; HE qualification; postgraduate qualification
• N/C: new student; continuing student
• VLE: no engagement; 1-20 clicks; 21-100 clicks; 101-800 clicks
Goal: calculate the probability of failing at TMA1,
• either by not submitting TMA1,
• or by submitting with a score < 40.
Bayes network: example
• Demographic data: new student; male; no formal qualification.

Without VLE data: probability of failing at TMA1 = 18.5%.

With VLE data:
Clicks     Probability of failing at TMA1
0          64%
1-20       44%
21-100     26%
101-800    6.3%
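A minimal sketch of the naive Bayes computation behind numbers like these. All probabilities below are illustrative placeholders (chosen so the no-VLE case lands near the slide's 18.5%), not the OU's trained parameters:

```python
P_FAIL = 0.15  # assumed prior P(fail at TMA1); illustrative only

# (P(value | fail), P(value | pass)) for each observed variable — invented.
LIKELIHOODS = {
    "gender=male":    (0.50, 0.50),
    "education=none": (0.15, 0.10),
    "nc=new":         (0.60, 0.70),
    "vle=0_clicks":   (0.35, 0.05),
}

def p_fail(evidence):
    """Posterior P(fail | evidence) under the naive independence assumption."""
    num, den = P_FAIL, 1.0 - P_FAIL
    for e in evidence:
        like_fail, like_pass = LIKELIHOODS[e]
        num *= like_fail
        den *= like_pass
    return num / (num + den)

demo = ["gender=male", "education=none", "nc=new"]
print(round(p_fail(demo), 3))                     # ~0.185 without VLE evidence
print(round(p_fail(demo + ["vle=0_clicks"]), 3))  # much higher once zero clicks are observed
```

Observing the VLE node sharply moves the posterior in either direction, which is exactly why adding click behaviour to the demographic baseline matters.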
Why TMA1?
• Two reasons:
  – TMA1 is a good predictor of success or failure.
  – There is still enough time to intervene … but is that true?
[Timeline: "we are here" — the history we know behind us, the future we can affect ahead.]
Predicting final result from TMA1
[Diagram: the network (Gender, Education, N/C, VLE → TMA1) extended along the module timeline: TMA1 → TMA2 → … → TMA6 → final result.]
Bayes minimum error classifier: predict pass/distinction if TMA1 >= 40, fail if TMA1 < 40.
If a student fails at TMA1, he/she is likely to fail the whole module.
Module XXX1 2013B
• Total number of students: 2966
• Pass or distinction: 1644
• Fail or withdrawn: 1322
• Prior probabilities:
– P(fail) = 0.45, P(success) = 0.55
• After TMA1 (Bayes rule):
– P(fail|TMA1fail) = 0.96, P(success|TMA1fail) = 0.04
– P(fail|TMA1pass) = 0.29, P(success|TMA1pass) = 0.71
• 96% of students who fail at TMA1 fail the module
Module XXX2 2013B
• Total number of students: 1555
• Pass or distinction: 609
• Fail or withdrawn: 946
• Prior probabilities:
– P(fail) = 0.61, P(success) = 0.39
• After TMA1 (Bayes rule):
– P(fail|TMA1fail) = 0.986, P(success|TMA1fail) = 0.014
– P(fail|TMA1pass) = 0.46, P(success|TMA1pass) = 0.54
• 98.6% of students who fail at TMA1 fail the module
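The posteriors above follow from Bayes' rule applied to simple counts. A worked sketch for module XXX1: the TMA1 contingency counts below are invented (the slide does not give them), but they are consistent with its 2,966 students, ~0.45 prior fail rate and 0.96 posterior:

```python
# Hypothetical TMA1 contingency counts for module XXX1.
n_total = 2966
n_tma1_fail = 689          # failed or did not submit TMA1
n_modfail_tma1fail = 661   # failed TMA1 AND failed/withdrew from the module
n_modfail_tma1pass = 661   # passed TMA1 but still failed/withdrew

p_fail = (n_modfail_tma1fail + n_modfail_tma1pass) / n_total  # prior, ~0.45
p_tma1_fail = n_tma1_fail / n_total

# Bayes' rule: P(fail | TMA1 fail) = P(TMA1 fail | fail) * P(fail) / P(TMA1 fail)
p_tma1fail_given_fail = n_modfail_tma1fail / (n_modfail_tma1fail + n_modfail_tma1pass)
p_fail_given_tma1fail = p_tma1fail_given_fail * p_fail / p_tma1_fail

print(round(p_fail, 2), round(p_fail_given_tma1fail, 2))  # 0.45 0.96
```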
Selected important VLE activities
• Forum (F), Subpage (S), Resource (R), OU_content (O), No activity (N)
• Possible activity combinations in a week: F, FS, N, O, OF, OFS, OR, ORF, ORFS, ORS, OS, R, RF, RFS, RS, S
[Figure: activity space — in each week from VLE opening to TMA1, a student occupies one of the 16 weekly activity-combination states.]
[Figure: VLE trail of a successful student — a weekly path through the activity space ending in a TMA1 pass.]
[Figure: VLE trail of a student who did not submit — a weekly path drifting towards "no activity" before TMA1.]
Probabilistic model: the trails of all students from VLE opening to TMA1 are aggregated into a module "VLE fingerprint".
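A minimal sketch of the activity-space idea (my illustration, not OU Analyse's implementation): the 16 states are the subsets of {O, R, F, S}, with the empty subset written as "N", and a crude "fingerprint" is the week-to-week transition frequencies over students' trails. The two trails below are invented:

```python
from collections import Counter, defaultdict
from itertools import chain, combinations

TOOLS = "ORFS"  # OU_content, Resource, Forum, Subpage — letter order as on the slide
STATES = ["".join(c) or "N"
          for c in chain.from_iterable(combinations(TOOLS, k) for k in range(5))]
assert len(STATES) == 16  # N, O, R, F, S, OR, OF, OS, RF, RS, FS, ORF, ...

# Invented weekly trails from VLE opening towards TMA1.
trails = [["O", "OF", "OFS", "OFS", "OF"],  # e.g. a student who submits TMA1
          ["O", "F", "N", "N", "N"]]        # e.g. a student who does not submit

# Count week-to-week transitions, then normalise per starting state.
counts = defaultdict(Counter)
for trail in trails:
    for a, b in zip(trail, trail[1:]):
        counts[a][b] += 1

fingerprint = {state: {nxt: c / sum(succ.values()) for nxt, c in succ.items()}
               for state, succ in counts.items()}
print(fingerprint["O"])  # {'OF': 0.5, 'F': 0.5}
```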
Four predictive models are built from legacy data by machine learning.
[Screenshots of OU Analyse:]
• Prediction sheet: example
• Dashboard, module view: Time Machine and VLE overview with notifications
• Dashboard, module view: prediction table
• Dashboard, module view: filter
• Dashboard, student view: VLE activities, TMA results, time machine
• Dashboard, student view: nearest neighbours, predictions with real scores, personalised recommender
• Dashboard, tutor view: feedback from tutors
Background of QAA Study
• HE is an increasingly competitive market: student satisfaction has become an important component of Quality Assurance (QA) and Quality Enhancement (QE) (Kember & Ginns, 2012; Rienties, 2014).
• Measurement of student satisfaction is important to pinpoint strengths and identify areas for improvement (Coffey & Gibbs, 2001; Zerihun, Beishuizen, & Os, 2012).
• The potential benefits and drawbacks of student evaluations have been well documented in the literature (see for example Bennett & De Bellis, 2010; Crews & Curtis, 2011):
  o Recent research continues to suggest strong resistance amongst academic staff (Crews & Curtis, 2011; Rienties, 2014).
  o Most student survey instruments lack focus on key elements of rich learning, such as interaction, assessment and feedback.
• With the increased influence of the NSS and institutional surveys on academic and educational practice, there is a need for a critical review of how these data are used for QA and QE.
Key Questions of the Project
1. To what extent are institutions using insights from the NSS and institutional surveys to transform their students' experience?
2. What are the key enablers of, and barriers to, integrating student satisfaction data with QA and QE?
3. How are student experiences influencing quality enhancements?
   a) What influences students' perceptions of overall satisfaction the most? Are student characteristics or module/presentation-related factors more predictive than satisfaction with other aspects of their learning experience?
   b) Is the student cohort homogeneous when considering the key drivers of satisfaction? For example, are there systematic differences depending on the level or programme of study?
Methodology (logistic regression) & validation

[Diagram: SEaM variable groups — module, presentation, student, concurrency, study history — feeding into overall satisfaction.]

Step 1: a descriptive analysis was conducted to discount variables that were unsuitable for satisfaction modelling. Step 1 also identified highly correlated predictors and methodically selected the most appropriate.

Step 2: each subset of variables was modelled in groups; the statistically significant variables from each subset were then combined and modelled to identify the final list of key drivers. UG new, UG continuing, PG new and PG continuing students were modelled separately.

Step 3 (validation): all models were verified using subsets of the whole data to ensure the solutions are robust, and a variety of model fit statistics were used to identify the optimum solutions. We found that the combined scale provided the simplest and most interpretable solution for PG students, and the whole scale for UG students; the solution without the KPIs included was much easier to use in terms of identifying clear priorities for action.
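A minimal sketch, with hypothetical file and column names, of the kind of logistic regression fitted at Step 2: overall satisfaction dichotomised and regressed on candidate drivers, separately per student group. The predictors and the 5-point cut-off are assumptions for illustration:

```python
import pandas as pd
import statsmodels.formula.api as smf

seam = pd.read_csv("seam_responses.csv")  # hypothetical survey extract
# Dichotomise overall satisfaction (assumed 5-point scale: 4-5 = satisfied).
seam["satisfied"] = (seam["overall_satisfaction"] >= 4).astype(int)

# Fit one model per group: UG new, UG continuing, PG new, PG continuing.
for group, g in seam.groupby("student_group"):
    model = smf.logit(
        "satisfied ~ teaching_materials + assessment + advice_guidance"
        " + qualification_aim + career_relevance + C(module_level)",
        data=g,
    ).fit(disp=False)
    print(group)
    print(model.params.sort_values(ascending=False).head())
```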
Satisfaction modelling: undergraduate continuing students

Key drivers, ranked by importance to overall satisfaction:
01. KPI-05 Teaching materials
02. Q36 Assessment
03. Q13 Qualification aim
04. Q5 Integration of materials
05. Q3 Advice & guidance
06. Q14 Career relevance
07. Q23 Tutor knowledge
08. Q9 Assignment instructions
09. Q11 Assignment completion
10. KPI-06 Workload
11. Q6 Method of delivery
12. Module: credits
13. Module: level of study
14. Module: examinable component
15. % planned life cycle

Li, N., Marsh, V., & Rienties, B. (2016). Modeling and managing learner satisfaction: use of learner feedback to enhance blended and online learning experience. Decision Sciences Journal of Innovative Education, 14(2), 216-242.
Satisfaction modelling: undergraduate new students

Key drivers, ranked by importance to overall satisfaction:
01. KPI-05 Teaching materials
02. Q36 Assessment
03. Q3 Advice & guidance
04. Q5 Integration of materials
05. Q14 Career relevance
06. Q13 Qualification aim
07. Age

Li, N., Marsh, V., Rienties, B., & Whitelock, D. (2016). Online learning experiences of new versus continuing learners: a large scale replication study. Assessment & Evaluation in Higher Education. DOI: 10.1080/02602938.2016.1176989.
Conclusions (Part I)
1. Learning design strongly influences student engagement, satisfaction and performance.
2. Visualising learning design decisions by teachers leads to more interactive/communicative designs.

Conclusions (Part II)
1. 10 out of 11 modules improved retention.
2. Visualising learning analytics data can encourage teachers to intervene in-presentation and to redesign afterwards.
More Related Content

What's hot

Suh an explanatory study of high school teachers’ integration of mobile learning
Suh an explanatory study of high school teachers’ integration of mobile learningSuh an explanatory study of high school teachers’ integration of mobile learning
Suh an explanatory study of high school teachers’ integration of mobile learning
Sylvia Suh
 
The Power of Learning Analytics: Is There Still a Need for Educational Research?
The Power of Learning Analytics: Is There Still a Need for Educational Research?The Power of Learning Analytics: Is There Still a Need for Educational Research?
The Power of Learning Analytics: Is There Still a Need for Educational Research?
Bart Rienties
 
Enhancing (in)formal learning ties in interdisciplinary management courses: a...
Enhancing (in)formal learning ties in interdisciplinary management courses: a...Enhancing (in)formal learning ties in interdisciplinary management courses: a...
Enhancing (in)formal learning ties in interdisciplinary management courses: a...
Bart Rienties
 
A Development of Students’ Worksheet Based on Contextual Teaching and Learning
A Development of Students’ Worksheet Based on Contextual Teaching and LearningA Development of Students’ Worksheet Based on Contextual Teaching and Learning
A Development of Students’ Worksheet Based on Contextual Teaching and Learning
IOSRJM
 
The power of learning analytics to unpack learning and teaching: a critical p...
The power of learning analytics to unpack learning and teaching: a critical p...The power of learning analytics to unpack learning and teaching: a critical p...
The power of learning analytics to unpack learning and teaching: a critical p...
Bart Rienties
 
22 January 2018 HEFCE open event “Using data to increase learning gains and t...
22 January 2018 HEFCE open event “Using data to increase learning gains and t...22 January 2018 HEFCE open event “Using data to increase learning gains and t...
22 January 2018 HEFCE open event “Using data to increase learning gains and t...
Bart Rienties
 

What's hot (20)

ESRC International Distance Education and African Students Advisory Panel Mee...
ESRC International Distance Education and African Students Advisory Panel Mee...ESRC International Distance Education and African Students Advisory Panel Mee...
ESRC International Distance Education and African Students Advisory Panel Mee...
 
Learning Design and Analytics in Learning and Teaching: The Role of Big Data
Learning Design and Analytics in Learning and Teaching: The Role of Big DataLearning Design and Analytics in Learning and Teaching: The Role of Big Data
Learning Design and Analytics in Learning and Teaching: The Role of Big Data
 
ASCILITE Webinar: A review of five years of implementation and research in al...
ASCILITE Webinar: A review of five years of implementation and research in al...ASCILITE Webinar: A review of five years of implementation and research in al...
ASCILITE Webinar: A review of five years of implementation and research in al...
 
Suh an explanatory study of high school teachers’ integration of mobile learning
Suh an explanatory study of high school teachers’ integration of mobile learningSuh an explanatory study of high school teachers’ integration of mobile learning
Suh an explanatory study of high school teachers’ integration of mobile learning
 
Affective behaviour cognition learning gains project presentation
Affective behaviour cognition learning gains project presentationAffective behaviour cognition learning gains project presentation
Affective behaviour cognition learning gains project presentation
 
Pedagogical Practices and Technology Integration Thesis Defense March 11, 2015
Pedagogical Practices and Technology Integration Thesis Defense March 11, 2015Pedagogical Practices and Technology Integration Thesis Defense March 11, 2015
Pedagogical Practices and Technology Integration Thesis Defense March 11, 2015
 
The Power of Learning Analytics: Is There Still a Need for Educational Research?
The Power of Learning Analytics: Is There Still a Need for Educational Research?The Power of Learning Analytics: Is There Still a Need for Educational Research?
The Power of Learning Analytics: Is There Still a Need for Educational Research?
 
Enhancing (in)formal learning ties in interdisciplinary management courses: a...
Enhancing (in)formal learning ties in interdisciplinary management courses: a...Enhancing (in)formal learning ties in interdisciplinary management courses: a...
Enhancing (in)formal learning ties in interdisciplinary management courses: a...
 
Impactof Mlti April2007
Impactof Mlti April2007Impactof Mlti April2007
Impactof Mlti April2007
 
A Development of Students’ Worksheet Based on Contextual Teaching and Learning
A Development of Students’ Worksheet Based on Contextual Teaching and LearningA Development of Students’ Worksheet Based on Contextual Teaching and Learning
A Development of Students’ Worksheet Based on Contextual Teaching and Learning
 
Blended e-Learning Activities for the Information and Innovation Management C...
Blended e-Learning Activities for the Information and Innovation Management C...Blended e-Learning Activities for the Information and Innovation Management C...
Blended e-Learning Activities for the Information and Innovation Management C...
 
The power of learning analytics to unpack learning and teaching: a critical p...
The power of learning analytics to unpack learning and teaching: a critical p...The power of learning analytics to unpack learning and teaching: a critical p...
The power of learning analytics to unpack learning and teaching: a critical p...
 
22 January 2018 HEFCE open event “Using data to increase learning gains and t...
22 January 2018 HEFCE open event “Using data to increase learning gains and t...22 January 2018 HEFCE open event “Using data to increase learning gains and t...
22 January 2018 HEFCE open event “Using data to increase learning gains and t...
 
How to build a better education review
How to build a better education reviewHow to build a better education review
How to build a better education review
 
Rdp ppt
Rdp pptRdp ppt
Rdp ppt
 
Action Research Proposal Presentation - DRAFT
Action Research Proposal Presentation - DRAFTAction Research Proposal Presentation - DRAFT
Action Research Proposal Presentation - DRAFT
 
Utilization of Digital Camera Simulation Media
Utilization of Digital Camera Simulation MediaUtilization of Digital Camera Simulation Media
Utilization of Digital Camera Simulation Media
 
E0333025028
E0333025028E0333025028
E0333025028
 
System analysis and training psychology
System analysis and training psychologySystem analysis and training psychology
System analysis and training psychology
 
Keynote: 7th eSTEeM Annual Conference Critical discussion of Student Evaluati...
Keynote: 7th eSTEeM Annual Conference Critical discussion of Student Evaluati...Keynote: 7th eSTEeM Annual Conference Critical discussion of Student Evaluati...
Keynote: 7th eSTEeM Annual Conference Critical discussion of Student Evaluati...
 

Similar to The power of learning analytics for UCL: lessons learned from the Open University UK

Final project program evaluation module 8
Final project program evaluation module 8Final project program evaluation module 8
Final project program evaluation module 8
Daniel Downs
 
Conducting Research on Blended and Online Education, Workshop
Conducting Research on Blended and Online Education, WorkshopConducting Research on Blended and Online Education, Workshop
Conducting Research on Blended and Online Education, Workshop
Tanya Joosten
 

Similar to The power of learning analytics for UCL: lessons learned from the Open University UK (20)

Learner engagement
Learner engagementLearner engagement
Learner engagement
 
Investigating learning strategies in a dispositional learning analytics conte...
Investigating learning strategies in a dispositional learning analytics conte...Investigating learning strategies in a dispositional learning analytics conte...
Investigating learning strategies in a dispositional learning analytics conte...
 
Jisc learning analytics service updates
Jisc learning analytics service updatesJisc learning analytics service updates
Jisc learning analytics service updates
 
The Multiple Learning Experiences (M-LEx™) Model – A Holistic Approach to Edu...
The Multiple Learning Experiences (M-LEx™) Model – A Holistic Approach to Edu...The Multiple Learning Experiences (M-LEx™) Model – A Holistic Approach to Edu...
The Multiple Learning Experiences (M-LEx™) Model – A Holistic Approach to Edu...
 
How educators value data analytics about their moocs (1)
How educators value data analytics about their moocs (1)How educators value data analytics about their moocs (1)
How educators value data analytics about their moocs (1)
 
ICT Integration across the Curricula
ICT Integration across the CurriculaICT Integration across the Curricula
ICT Integration across the Curricula
 
Pivoting to remote learning and online instruction
Pivoting to remote learning and online instructionPivoting to remote learning and online instruction
Pivoting to remote learning and online instruction
 
Learning design twofold strategies for teacher-led inquiry and student active...
Learning design twofold strategies for teacher-led inquiry and student active...Learning design twofold strategies for teacher-led inquiry and student active...
Learning design twofold strategies for teacher-led inquiry and student active...
 
Designing Learning Analytics for Humans with Humans
Designing Learning Analytics for Humans with HumansDesigning Learning Analytics for Humans with Humans
Designing Learning Analytics for Humans with Humans
 
Critical thinking and technology - an EDEN NAP Webinar
Critical thinking and technology - an EDEN NAP WebinarCritical thinking and technology - an EDEN NAP Webinar
Critical thinking and technology - an EDEN NAP Webinar
 
Final project program evaluation module 8
Final project program evaluation module 8Final project program evaluation module 8
Final project program evaluation module 8
 
2021_01_15 «Applying Learning Analytics in Living Labs for Educational Innova...
2021_01_15 «Applying Learning Analytics in Living Labs for Educational Innova...2021_01_15 «Applying Learning Analytics in Living Labs for Educational Innova...
2021_01_15 «Applying Learning Analytics in Living Labs for Educational Innova...
 
Enhancing students' learning through blended learning for engineering mathema...
Enhancing students' learning through blended learning for engineering mathema...Enhancing students' learning through blended learning for engineering mathema...
Enhancing students' learning through blended learning for engineering mathema...
 
Conducting Research on Blended and Online Education, Workshop
Conducting Research on Blended and Online Education, WorkshopConducting Research on Blended and Online Education, Workshop
Conducting Research on Blended and Online Education, Workshop
 
Investigating the effectiveness of an ecological approach to learning design ...
Investigating the effectiveness of an ecological approach to learning design ...Investigating the effectiveness of an ecological approach to learning design ...
Investigating the effectiveness of an ecological approach to learning design ...
 
WhitePaper_M-LEx
WhitePaper_M-LExWhitePaper_M-LEx
WhitePaper_M-LEx
 
Learning Analytics and the Scholarship of Teaching and Learning - an obvious ...
Learning Analytics and the Scholarship of Teaching and Learning - an obvious ...Learning Analytics and the Scholarship of Teaching and Learning - an obvious ...
Learning Analytics and the Scholarship of Teaching and Learning - an obvious ...
 
Highlights From Future of Education - mSchool + DreamBox Learning
Highlights From Future of Education - mSchool + DreamBox LearningHighlights From Future of Education - mSchool + DreamBox Learning
Highlights From Future of Education - mSchool + DreamBox Learning
 
Rae t4 d-knowledge-economy-sa-urs-dec2017
Rae t4 d-knowledge-economy-sa-urs-dec2017Rae t4 d-knowledge-economy-sa-urs-dec2017
Rae t4 d-knowledge-economy-sa-urs-dec2017
 
Learning Analytics for MOOCs: EMMA case
Learning Analytics for MOOCs: EMMA caseLearning Analytics for MOOCs: EMMA case
Learning Analytics for MOOCs: EMMA case
 

More from Bart Rienties

Applying and translating learning design and analytics approaches in your ins...
Applying and translating learning design and analytics approaches in your ins...Applying and translating learning design and analytics approaches in your ins...
Applying and translating learning design and analytics approaches in your ins...
Bart Rienties
 
How can you use learning analytics in your own research and practice: an intr...
How can you use learning analytics in your own research and practice: an intr...How can you use learning analytics in your own research and practice: an intr...
How can you use learning analytics in your own research and practice: an intr...
Bart Rienties
 
SAAIR: Implementing learning analytics at scale in an online world: lessons l...
SAAIR: Implementing learning analytics at scale in an online world: lessons l...SAAIR: Implementing learning analytics at scale in an online world: lessons l...
SAAIR: Implementing learning analytics at scale in an online world: lessons l...
Bart Rienties
 
OU/Leverhulme Open World Learning: Knowledge Exchange and Book Launch Event p...
OU/Leverhulme Open World Learning: Knowledge Exchange and Book Launch Event p...OU/Leverhulme Open World Learning: Knowledge Exchange and Book Launch Event p...
OU/Leverhulme Open World Learning: Knowledge Exchange and Book Launch Event p...
Bart Rienties
 
What have we learned from 6 years of implementing learning analytics amongst ...
What have we learned from 6 years of implementing learning analytics amongst ...What have we learned from 6 years of implementing learning analytics amongst ...
What have we learned from 6 years of implementing learning analytics amongst ...
Bart Rienties
 

More from Bart Rienties (20)

Applying and translating learning design and analytics approaches in your ins...
Applying and translating learning design and analytics approaches in your ins...Applying and translating learning design and analytics approaches in your ins...
Applying and translating learning design and analytics approaches in your ins...
 
Keynote Presentation: Implementing learning analytics and learning design at ...
Keynote Presentation: Implementing learning analytics and learning design at ...Keynote Presentation: Implementing learning analytics and learning design at ...
Keynote Presentation: Implementing learning analytics and learning design at ...
 
Edutech_Europe Keynote Presentation: Implementing learning analytics and lear...
Edutech_Europe Keynote Presentation: Implementing learning analytics and lear...Edutech_Europe Keynote Presentation: Implementing learning analytics and lear...
Edutech_Europe Keynote Presentation: Implementing learning analytics and lear...
 
How can you use learning analytics in your own research and practice: an intr...
How can you use learning analytics in your own research and practice: an intr...How can you use learning analytics in your own research and practice: an intr...
How can you use learning analytics in your own research and practice: an intr...
 
SAAIR: Implementing learning analytics at scale in an online world: lessons l...
SAAIR: Implementing learning analytics at scale in an online world: lessons l...SAAIR: Implementing learning analytics at scale in an online world: lessons l...
SAAIR: Implementing learning analytics at scale in an online world: lessons l...
 
OU/Leverhulme Open World Learning: Knowledge Exchange and Book Launch Event p...
OU/Leverhulme Open World Learning: Knowledge Exchange and Book Launch Event p...OU/Leverhulme Open World Learning: Knowledge Exchange and Book Launch Event p...
OU/Leverhulme Open World Learning: Knowledge Exchange and Book Launch Event p...
 
Keynote Learning & Innovation Maastricht University
Keynote Learning & Innovation Maastricht UniversityKeynote Learning & Innovation Maastricht University
Keynote Learning & Innovation Maastricht University
 
Education 4.0 and Computer Science: A European perspective
Education 4.0 and Computer Science: A European perspectiveEducation 4.0 and Computer Science: A European perspective
Education 4.0 and Computer Science: A European perspective
 
AI in Education Amsterdam Data Science (ADS) What have we learned after a dec...
AI in Education Amsterdam Data Science (ADS) What have we learned after a dec...AI in Education Amsterdam Data Science (ADS) What have we learned after a dec...
AI in Education Amsterdam Data Science (ADS) What have we learned after a dec...
 
Keynote Data Matters JISC What is the impact? Six years of learning analytics...
Keynote Data Matters JISC What is the impact? Six years of learning analytics...Keynote Data Matters JISC What is the impact? Six years of learning analytics...
Keynote Data Matters JISC What is the impact? Six years of learning analytics...
 
What have we learned from 6 years of implementing learning analytics amongst ...
What have we learned from 6 years of implementing learning analytics amongst ...What have we learned from 6 years of implementing learning analytics amongst ...
What have we learned from 6 years of implementing learning analytics amongst ...
 
Using Learning analytics to support learners and teachers at the Open University
Using Learning analytics to support learners and teachers at the Open UniversityUsing Learning analytics to support learners and teachers at the Open University
Using Learning analytics to support learners and teachers at the Open University
 
Using student data to transform teaching and learning
Using student data to transform teaching and learningUsing student data to transform teaching and learning
Using student data to transform teaching and learning
 
How learning gains and Quality Assurance are (mis)Aligned: An Interactive Wor...
How learning gains and Quality Assurance are (mis)Aligned: An Interactive Wor...How learning gains and Quality Assurance are (mis)Aligned: An Interactive Wor...
How learning gains and Quality Assurance are (mis)Aligned: An Interactive Wor...
 
Lecture series: Using trace data or subjective data, that is the question dur...
Lecture series: Using trace data or subjective data, that is the question dur...Lecture series: Using trace data or subjective data, that is the question dur...
Lecture series: Using trace data or subjective data, that is the question dur...
 
How to analyse questionnaire data: an advanced session
How to analyse questionnaire data: an advanced sessionHow to analyse questionnaire data: an advanced session
How to analyse questionnaire data: an advanced session
 
Questionnaire design for beginners (Bart Rienties)
Questionnaire design for beginners (Bart Rienties)Questionnaire design for beginners (Bart Rienties)
Questionnaire design for beginners (Bart Rienties)
 
Learning analytics adoption in Higher Education: Reviewing six years of exper...
Learning analytics adoption in Higher Education: Reviewing six years of exper...Learning analytics adoption in Higher Education: Reviewing six years of exper...
Learning analytics adoption in Higher Education: Reviewing six years of exper...
 
«Learning Analytics at the Open University and the UK»
 «Learning Analytics at the Open University and the UK» «Learning Analytics at the Open University and the UK»
«Learning Analytics at the Open University and the UK»
 
Presentation LMU Munich: The power of learning analytics to unpack learning a...
Presentation LMU Munich: The power of learning analytics to unpack learning a...Presentation LMU Munich: The power of learning analytics to unpack learning a...
Presentation LMU Munich: The power of learning analytics to unpack learning a...
 

Recently uploaded

Beyond the EU: DORA and NIS 2 Directive's Global Impact
Beyond the EU: DORA and NIS 2 Directive's Global ImpactBeyond the EU: DORA and NIS 2 Directive's Global Impact
Beyond the EU: DORA and NIS 2 Directive's Global Impact
PECB
 
1029 - Danh muc Sach Giao Khoa 10 . pdf
1029 -  Danh muc Sach Giao Khoa 10 . pdf1029 -  Danh muc Sach Giao Khoa 10 . pdf
1029 - Danh muc Sach Giao Khoa 10 . pdf
QucHHunhnh
 
Making and Justifying Mathematical Decisions.pdf
Making and Justifying Mathematical Decisions.pdfMaking and Justifying Mathematical Decisions.pdf
Making and Justifying Mathematical Decisions.pdf
Chris Hunter
 
Seal of Good Local Governance (SGLG) 2024Final.pptx
Seal of Good Local Governance (SGLG) 2024Final.pptxSeal of Good Local Governance (SGLG) 2024Final.pptx
Seal of Good Local Governance (SGLG) 2024Final.pptx
negromaestrong
 
The basics of sentences session 3pptx.pptx
The basics of sentences session 3pptx.pptxThe basics of sentences session 3pptx.pptx
The basics of sentences session 3pptx.pptx
heathfieldcps1
 
Russian Escort Service in Delhi 11k Hotel Foreigner Russian Call Girls in Delhi
Russian Escort Service in Delhi 11k Hotel Foreigner Russian Call Girls in DelhiRussian Escort Service in Delhi 11k Hotel Foreigner Russian Call Girls in Delhi
Russian Escort Service in Delhi 11k Hotel Foreigner Russian Call Girls in Delhi
kauryashika82
 

Recently uploaded (20)

Unit-V; Pricing (Pharma Marketing Management).pptx
Unit-V; Pricing (Pharma Marketing Management).pptxUnit-V; Pricing (Pharma Marketing Management).pptx
Unit-V; Pricing (Pharma Marketing Management).pptx
 
Application orientated numerical on hev.ppt
Application orientated numerical on hev.pptApplication orientated numerical on hev.ppt
Application orientated numerical on hev.ppt
 
Asian American Pacific Islander Month DDSD 2024.pptx
Asian American Pacific Islander Month DDSD 2024.pptxAsian American Pacific Islander Month DDSD 2024.pptx
Asian American Pacific Islander Month DDSD 2024.pptx
 
Nutritional Needs Presentation - HLTH 104
Nutritional Needs Presentation - HLTH 104Nutritional Needs Presentation - HLTH 104
Nutritional Needs Presentation - HLTH 104
 
Beyond the EU: DORA and NIS 2 Directive's Global Impact
Beyond the EU: DORA and NIS 2 Directive's Global ImpactBeyond the EU: DORA and NIS 2 Directive's Global Impact
Beyond the EU: DORA and NIS 2 Directive's Global Impact
 
1029 - Danh muc Sach Giao Khoa 10 . pdf
1029 -  Danh muc Sach Giao Khoa 10 . pdf1029 -  Danh muc Sach Giao Khoa 10 . pdf
1029 - Danh muc Sach Giao Khoa 10 . pdf
 
Making and Justifying Mathematical Decisions.pdf
Making and Justifying Mathematical Decisions.pdfMaking and Justifying Mathematical Decisions.pdf
Making and Justifying Mathematical Decisions.pdf
 
Measures of Dispersion and Variability: Range, QD, AD and SD
Measures of Dispersion and Variability: Range, QD, AD and SDMeasures of Dispersion and Variability: Range, QD, AD and SD
Measures of Dispersion and Variability: Range, QD, AD and SD
 
Web & Social Media Analytics Previous Year Question Paper.pdf
Web & Social Media Analytics Previous Year Question Paper.pdfWeb & Social Media Analytics Previous Year Question Paper.pdf
Web & Social Media Analytics Previous Year Question Paper.pdf
 
ICT role in 21st century education and it's challenges.
ICT role in 21st century education and it's challenges.ICT role in 21st century education and it's challenges.
ICT role in 21st century education and it's challenges.
 
2024-NATIONAL-LEARNING-CAMP-AND-OTHER.pptx
2024-NATIONAL-LEARNING-CAMP-AND-OTHER.pptx2024-NATIONAL-LEARNING-CAMP-AND-OTHER.pptx
2024-NATIONAL-LEARNING-CAMP-AND-OTHER.pptx
 
Advanced Views - Calendar View in Odoo 17
Advanced Views - Calendar View in Odoo 17Advanced Views - Calendar View in Odoo 17
Advanced Views - Calendar View in Odoo 17
 
Measures of Central Tendency: Mean, Median and Mode
Measures of Central Tendency: Mean, Median and ModeMeasures of Central Tendency: Mean, Median and Mode
Measures of Central Tendency: Mean, Median and Mode
 
Seal of Good Local Governance (SGLG) 2024Final.pptx
Seal of Good Local Governance (SGLG) 2024Final.pptxSeal of Good Local Governance (SGLG) 2024Final.pptx
Seal of Good Local Governance (SGLG) 2024Final.pptx
 
Holdier Curriculum Vitae (April 2024).pdf
Holdier Curriculum Vitae (April 2024).pdfHoldier Curriculum Vitae (April 2024).pdf
Holdier Curriculum Vitae (April 2024).pdf
 
ComPTIA Overview | Comptia Security+ Book SY0-701
ComPTIA Overview | Comptia Security+ Book SY0-701ComPTIA Overview | Comptia Security+ Book SY0-701
ComPTIA Overview | Comptia Security+ Book SY0-701
 
Micro-Scholarship, What it is, How can it help me.pdf
Micro-Scholarship, What it is, How can it help me.pdfMicro-Scholarship, What it is, How can it help me.pdf
Micro-Scholarship, What it is, How can it help me.pdf
 
The basics of sentences session 3pptx.pptx
The basics of sentences session 3pptx.pptxThe basics of sentences session 3pptx.pptx
The basics of sentences session 3pptx.pptx
 
Role Of Transgenic Animal In Target Validation-1.pptx
Role Of Transgenic Animal In Target Validation-1.pptxRole Of Transgenic Animal In Target Validation-1.pptx
Role Of Transgenic Animal In Target Validation-1.pptx
 
Russian Escort Service in Delhi 11k Hotel Foreigner Russian Call Girls in Delhi
Russian Escort Service in Delhi 11k Hotel Foreigner Russian Call Girls in DelhiRussian Escort Service in Delhi 11k Hotel Foreigner Russian Call Girls in Delhi
Russian Escort Service in Delhi 11k Hotel Foreigner Russian Call Girls in Delhi
 

The power of learning analytics for UCL: lessons learned from the Open University UK

  • 1. The power of learning analytics for UCL: lessons learned from the Open University UK Arena Exchange Seminar, UCL, London 22 July 2016@DrBartRienties Reader in Learning Analytics
  • 2. What is learning analytics? http://bcomposes.wordpress.com/
  • 3. (Social) Learning Analytics “LA is the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimising learning and the environments in which it occurs” (LAK 2011) Social LA “focuses on how learners build knowledge together in their cultural and social settings” (Ferguson & Buckingham Shum, 2012)
  • 4.
  • 5. Agenda? You choose  1. The power of 151 Learning Designs on 113K+ students at the OU? 2. Analytics4Action: evidence-based interventions? 3. OU Analyse: predictive analytics with automated student recommender? 4. Key drivers for 100K+ student satisfaction? 5. Opportunities of learning analytics/elearning for teaching practice, grant acquisition, commercialisation, and wider policy implications.
  • 6.
  • 7.
  • 8.
  • 9.
  • 10.
  • 11.
  • 12.
  • 13.
  • 14.
  • 15.
  • 16.
  • 17.
  • 18.
  • 19. Assimilative Finding and handling information Communicati on Productive Experiential Interactive/ Adaptive Assessment Type of activity Attending to information Searching for and processing information Discussing module related content with at least one other person (student or tutor) Actively constructing an artefact Applying learning in a real-world setting Applying learning in a simulated setting All forms of assessment, whether continuous, end of module, or formative (assessment for learning) Examples of activity Read, Watch, Listen, Think about, Access, Observe, Review, Study List, Analyse, Collate, Plot, Find, Discover, Access, Use, Gather, Order, Classify, Select, Assess, Manipulate Communicate, Debate, Discuss, Argue, Share, Report, Collaborate, Present, Describe, Question Create, Build, Make, Design, Construct, Contribute, Complete, Produce, Write, Draw, Refine, Compose, Synthesise, Remix Practice, Apply, Mimic, Experience, Explore, Investigate, Perform, Engage Explore, Experiment, Trial, Improve, Model, Simulate Write, Present, Report, Demonstrate, Critique
  • 20.
  • 21. Method – data sets • Combination of four different data sets: • learning design data (189 modules mapped, 276 module implementations included) • student feedback data (140) • VLE data (141 modules) • Academic Performance (151) • Data sets merged and cleaned • 111,256 students undertook these modules
  • 22. Toetenel, L. & Rienties, B. (2016). Analysing 157 Learning Designs using Learning Analytic approaches as a means to evaluate the impact of pedagogical decision-making. British Journal of Educational Technology.
  • 23. Constructivist Learning Design Assessment Learning Design Productive Learning Design Socio-construct. Learning Design VLE Engagement Student Satisfaction Student retention Learning Design 151 modules Week 1 Week 2 Week30 + Rienties, B., Toetenel, L., Bryan, A. (2015). “Scaling up” learning design: impact of learning design activities on LMS behavior and performance. Learning Analytics Knowledge conference. Disciplines Levels Size module
  • 24. Average time spent per week in VLE
  • 28. Cluster 4 Social Constructivist (n=20)
  • 29. Week Assim Find Com. Prod Exp Inter Asses Total -2 -.03 .02 -.02 -.09 .20* -.03 .01 .35** -1 -.17* .14 .14 -.01 .30** -.02 -.05 .38** 0 -.21* .14 .37** -.07 .13 .08 .02 .48** 1 -.26** .25** .47** -.02 .28** .01 -.1 .48** 2 -.33** .41** .59** -.02 .25** .05 -.13 .42** 3 -.30** .33** .53** -.02 .34** .02 -.15 .51** 4 -.27** .41** .49** -.01 .23** -.02 -.15 .35** 5 -.31** .46** .52** .05 .16 -.05 -.13 .28** 6 -.27** .44** .47** -.04 .18* -.09 -.08 .28** 7 -.30** .41** .49** -.02 .22** -.05 -.08 .33** 8 -.25** .33** .42** -.06 .29** -.02 -.1 .32** 9 -.28** .34** .44** -.01 .32** .01 -.14 .36** 10 -.34** .35** .53** .06 .27** .00 -.13 .35**
  • 30. Model 1 Model 2 Model 3 Level0 -.279** -.291** -.116 Level1 -.341* -.352* -.067 Level2 .221* .229* .275** Level3 .128 .130 .139 Year of implementation .048 .049 .090 Faculty 1 -.205* -.211* -.196* Faculty 2 -.022 -.020 -.228** Faculty 3 -.206* -.210* -.308** Faculty other .216 .214 .024 Size of module .210* .209* .242** Learner satisfaction (SEAM) -.040 .103 Finding information .147 Communication .393** Productive .135 Experiential .353** Interactive -.081 Assessment .076 R-sq adj 18% 18% 40% n = 140, * p < .05, ** p < .01 Table 3 Regression model of LMS engagement predicted by institutional, satisfaction and learning design analytics • Level of study predict VLE engagement • Faculties have different VLE engagement • Learning design (communication & experiential) predict VLE engagement (with 22% unique variance explained)
  • 31. Model 1 Model 2 Model 3 Level0 .284** .304** .351** Level1 .259 .243 .265 Level2 -.211 -.197 -.212 Level3 -.035 -.029 -.018 Year of implementation .028 -.071 -.059 Faculty 1 .149 .188 .213* Faculty 2 -.039 .029 .045 Faculty 3 .090 .188 .236* Faculty other .046 .077 .051 Size of module .016 -.049 -.071 Finding information -.270** -.294** Communication .005 .050 Productive -.243** -.274** Experiential -.111 -.105 Interactive .173* .221* Assessment -.208* -.221* LMS engagement .117 R-sq adj 20% 30% 31% n = 150 (Model 1-2), 140 (Model 3), * p < .05, ** p < .01 Table 4 Regression model of learner satisfaction predicted by institutional and learning design analytics • Level of study predict satisfaction • Learning design (finding info, productive, assessment) negatively predict satisfaction • Interactive learning design positively predicts satisfaction • VLE engagement and satisfaction unrelated
  • 32. Model 1 Model 2 Model 3 Level0 -.142 -.147 .005 Level1 -.227 -.236 .017 Level2 -.134 -.170 -.004 Level3 .059 -.059 .215 Year of implementation -.191** -.152* -.151* Faculty 1 .355** .374** .360** Faculty 2 -.033 -.032 -.189* Faculty 3 .095 .113 .069 Faculty other .129 .156 .034 Size of module -.298** -.285** -.239** Learner satisfaction (SEAM) -.082 -.058 LMS Engagement -.070 -.190* Finding information -.154 Communication .500** Productive .133 Experiential .008 Interactive -.049 Assessment .063 R-sq adj 30% 30% 36% n = 150 (Model 1-2), 140 (Model 3), * p < .05, ** p < .01 Table 5 Regression model of learning performance predicted by institutional, satisfaction and learning design analytics • Size of module and discipline predict completion • Satisfaction unrelated to completion • Learning design (communication) predicts completion
  • 33. Constructivist Learning Design Assessment Learning Design Productive Learning Design Socio-construct. Learning Design VLE Engagement Student Satisfaction Student retention 150+ modules Week 1 Week 2 Week30 + Rienties, B., Toetenel, L., (2016). The impact of learning design on student behaviour, satisfaction and performance: a cross-institutional comparison across 151 modules. Computers in Human Behavior, 60 (2016), 333-341 Communication
  • 34.
  • 35.
  • 36. Toetenel, L., Rienties, B. (2016) Learning Design – creative design to visualise learning activities. Open Learning.
  • 37.
  • 38. So what does the OU do in terms of interventions on learning analytics?
  • 39. The OU is developing its capabilities in 10 key areas that build the underpinning strengths required for the effective deployment of analytics Strategic approach
  • 40. 40
  • 41.
  • 42. 42 Analytics4Action framework Implementation/testing methodologies • Randomised control trials • A/B testing • Quasi-experimental • Apply to all Community of inquiry framework: underpinning typology Menu of response actions Methods of gathering data Evaluation Plans Evidence hub Key metrics and drill downs Deep dive analysis and strategic insight
  • 43.
  • 44.
  • 45. 45 Menu of actions Learning design (before start) In-action interventions (during module) Cognitive Presence  Redesign learning materials  Redesign assignments  Audio feedback on assignments  Bootcamp before exam Social Presence  Introduce graded discussion forum activities  Group-based wiki assignment  Assign groups based upon learning analytics metrics  Emotional questionnaire to gauge students’ emotions  Introduce buddy system  Organise additional videoconference sessions  One-to-one conversations  Cafe forum contributions  Support emails when making progress Teaching Presence  Introduce bi-weekly online videoconference sessions  Podcasts of key learning elements in the module  Screencasts of “how to survive the first two weeks”  Organise additional videoconference sessions  Call/text/skype student-at-risk  Organise catch-up sessions on specific topics that students struggle with
  • 46.
  • 47.
  • 48.
  • 49. Toetenel, L., Rienties, B. (2016) Learning Design – creative design to visualise learning activities. Open Learning.
  • 50.
  • 51. Problem specification – the OU model • Given: – Demographic data at the Start (may include information about student’s previous modules studied at the OU and his/her objectives) – Assessments (TMAs) as they are available during the module – VLE activities between TMAs – Conditions student must satisfy to pass the module • Goal: – Identify students at risk of failing the module as early as possible so that OU intervention is efficient and meaningful.
  • 52. Available data • Demographic data: age, gender, new/cont. student, education, IMD, post code, occup. category, motivation, … • Presentation-related (fluid) data: VLE logs, TMA (score, submission date), CMA, payment dates, TMA/CMA weights, End of module assessment. • Aggregated VLE data available daily.
  • 53. Naïve Bayes network Gender Education N/C VLE TMA1 • Education: – No formal qualif. – Lower than A level – A level – HE qualif. – Postgraduate qualif. • VLE: – No engagement – 1-20 clicks – 21-100 clicks – 101 – 800 clicks • N/C: – New student – Continuing student • Gender: – Female – Male Goal: Calculate probability of failing at TMA1 • either by not submitting TMA1, • or by submitting with score < 40.
  • 54. Bayes network: example • Demographic data – New student – Male – No formal qualification Gender Educatio n N/C TMA1 Without VLE: Probability of failing at TMA1 = 18.5% Gender Educatio n N/C VLE TMA1 With VLE: Clicks Probability 0 64% 1-20 44% 21-100 26% 101-800 6.30%
  • 55. Why TMA1? • Two reasons: – TMA1 is a good predictor of success or failure – It is enough time to intervene … is it true? We are here History we know Future we can affect
  • 56. Predicting final result from TMA1 Gender Educatio n N/C VLE TMA1 Final resultTMA6TMA2 Pass/Distinction Fail TMA1 >=40 TMA1 <40 Bayes minimum error classifier If student fails in TMA1, he/she is likely to fail the whole module
  • 57. Module XXX1 2013B • Total number of students: 2966 • Pass or distinction: 1644 • Fail or withdrawn: 1322 • Prior probabilities: – P(fail) = 0.45, P(success) = 0.55 • After TMA1 (Bayes rule): – P(fail|TMA1fail) = 0.96, P(success|TMA1fail) = 0.04 – P(fail|TMA1pass) = 0.29, P(success|TMA1pass) = 0.71 • 96% students who fail at TMA1 fail the module
  • 58. Module XXX2 2013B • Total number of students: 1555 • Pass or distinction: 609 • Fail or withdrawn: 946 • Prior probabilities: – P(fail) = 0.61, P(success) = 0.39 • After TMA1 (Bayes rule): – P(fail|TMA1fail) = 0.986, P(success|TMA1fail) = 0.014 – P(fail|TMA1pass) = 0.46, P(success|TMA1pass) = 0.54 • 98.6% students who fail at TMA1 fail the module
  • 59. Selected important VLE activities • Forum (F), Subpage (S), Resource (R), OU_content (O), No activity (N) • Possible activity combinations in a week: F, FS, N, O, OF, OFS, OR, ORF, ORFS, ORS, OS, R, RF, RFS, RS, S FSF RFSOFS ORFN O SRFROF OR ORSORFS OS RS
  • 60. Start FSF RFSOFS ORFN O SRFROF OR ORSORFS OS RS FSF RFSOFS ORFN O SRFROF OR ORSORFS OS RS FSF RFSOFS ORFN O SRFROF OR ORSORFS OS RS FSF RFSOFS ORFN O SRFROF OR ORSORFS OS RS FSF RFSOFS ORFN O SRFROF OR ORSORFS OS RS Pass Fail No submit TMA-1time VLE opens Start Activity space FSF RFSOFS ORFN O SRFROF OR ORSORFS OS RS
  • 61. FSF RFSOFS ORFN O SRFROF OR ORSORFS OS RS Start FSF RFSOFS ORFN O SRFROF OR ORSORFS OS RS FSF RFSOFS ORFN O SRFROF OR ORSORFS OS RS FSF RFSOFS ORFN O SRFROF OR ORSORFS OS RS FSF RFSOFS ORFN O SRFROF OR ORSORFS OS RS Pass Fail No submit TMA-1time VLE opens Start VLE trail: successful student FSF RFSOFS ORFN O SRFROF OR ORSORFS OS RS
  • 62. FSF RFSOFS ORFN O SRFROF OR ORSORFS OS RS Start FSF RFSOFS ORFN O SRFROF OR ORSORFS OS RS FSF RFSOFS ORFN O SRFROF OR ORSORFS OS RS FSF RFSOFS ORFN O SRFROF OR ORSORFS OS RS FSF RFSOFS ORFN O SRFROF OR ORSORFS OS RS FSF RFSOFS ORFN O SRFROF OR ORSORFS OS RS Pass Fail No submit TMA-1time VLE opens Start VLE trail: student who did not submit
  • 63. Probabilistic model: all students [diagram: the activity trails of the whole cohort, from VLE start through to TMA1, over time] • Transition probabilities between weekly activity states can be estimated from these trails (a minimal sketch follows)
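Slides 60–63 treat each student's weekly activity labels as a trail through the state space. One natural reading, sketched below, is a first-order Markov chain whose transition probabilities are estimated from the trails; the trail data here is invented, and the deck does not specify the exact model form.

```python
# Sketch of the trail model in slides 60-63: estimate first-order transition
# probabilities between weekly VLE activity states. Trail data is INVENTED.
from collections import Counter, defaultdict

trails = [  # one list of weekly states per student, VLE opening -> TMA1
    ["O", "OFS", "ORFS", "OS"],
    ["N", "N", "R", "S"],
    ["OF", "OFS", "OFS", "ORFS"],
]

transitions = defaultdict(Counter)
for trail in trails:
    for src, dst in zip(trail, trail[1:]):
        transitions[src][dst] += 1

def p_next(src, dst):
    """Estimated P(next week's state = dst | this week's state = src)."""
    total = sum(transitions[src].values())
    return transitions[src][dst] / total if total else 0.0

print(p_next("OFS", "ORFS"))  # 2/3 with the toy trails above
```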
  • 65. Four predictive models built from legacy data by Machine Learning
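Slide 65 names the approach but not the model types. The following is a hedged sketch of how four classifiers might be trained on a legacy presentation of a module and applied to the current one; the model choices, column names, and file names are all assumptions, not the OU Analyse implementation.

```python
# Hedged sketch of slide 65: train several classifiers on a legacy
# presentation of a module, then score students in the current one.
# Model choices, column names and file names are ASSUMPTIONS.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

FEATURES = ["age", "prev_ou_modules", "vle_clicks_wk1", "tma1_score"]  # assumed

legacy = pd.read_csv("legacy_presentation.csv")    # hypothetical extract
current = pd.read_csv("current_presentation.csv")  # hypothetical extract

models = {
    "bayes": GaussianNB(),
    "knn": KNeighborsClassifier(n_neighbors=10),
    "tree": DecisionTreeClassifier(max_depth=5),
    "logit": LogisticRegression(max_iter=1000),
}
for name, model in models.items():
    model.fit(legacy[FEATURES], legacy["failed_next_tma"])  # assumed label
    current[f"risk_{name}"] = model.predict_proba(current[FEATURES])[:, 1]

# Simple ensemble view: how many of the four models flag each student.
risk_cols = [f"risk_{name}" for name in models]
current["models_flagging"] = (current[risk_cols] > 0.5).sum(axis=1)
```

Combining several model families this way gives a simple agreement signal: students flagged by all four models are stronger candidates for early intervention than those flagged by one.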
  • 67. Dashboard: Module view — time machine and VLE overview with notifications
  • 70. Dashboard: Student view — VLE activities, TMA results, time machine
  • 71. Dashboard: Student view — nearest neighbours, predictions alongside real scores, personalised recommender
  • 75. Background of QAA Study • HE is an increasingly competitive market: student satisfaction has become an important component of Quality Assurance (QA) and Quality Enhancement (QE) (Kember & Ginns, 2012; Rienties, 2014). • Measuring student satisfaction matters for pinpointing strengths and identifying areas for improvement (Coffey & Gibbs, 2001; Zerihun, Beishuizen, & Os, 2012). • The potential benefits and drawbacks of student evaluations are well documented in the literature (see for example Bennett & De Bellis, 2010; Crews & Curtis, 2011): o Recent research continues to suggest strong resistance amongst academic staff (Crews & Curtis, 2011; Rienties, 2014). o Most student survey instruments lack focus on key elements of rich learning, such as interaction, assessment and feedback. • Given the increased influence of the NSS and institutional surveys on academic and educational practice, a critical review of how these data are used for QA and QE is needed.
  • 76. Key Questions of the Project 1. To what extent are institutions using insights from the NSS and institutional surveys to transform their students’ experience? 2. What are the key enablers of, and barriers to, integrating student satisfaction data with QA and QE? 3. How are student experiences influencing quality enhancement? a) What influences students’ perceptions of overall satisfaction the most? Are student characteristics or module/presentation-related factors more predictive than satisfaction with other aspects of the learning experience? b) Is the student cohort homogeneous when considering the key drivers of satisfaction? For example, are there systematic differences by level or programme of study?
  • 78. Methodology (Logistic Regression) & Validation • Predictor subsets — module, presentation, student, concurrency, study history, and SEaM survey responses — were each modelled against overall satisfaction; UG new, UG continuing, PG new and PG continuing students were modelled separately • Step 1: a descriptive analysis discounted variables unsuitable for satisfaction modelling, identified highly correlated predictors, and methodically selected the most appropriate ones • Step 2: each subset of variables was modelled in groups; the statistically significant variables from each subset were then combined and modelled to identify the final list of key drivers • Step 3: the combined scale provided the simplest and most interpretable solution for PG students, and the whole scale did so for UG students; the solution without the KPIs was much easier to use for identifying clear priorities for action • Validation: all models were verified on subsets of the full data to ensure robust solutions, and a variety of model-fit statistics were used to identify the optimum solutions (a minimal Step 2 sketch follows)
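A minimal sketch of the Step 2 idea: logistic regression of overall satisfaction on one predictor subset at a time, fitted separately per student group. The survey file, column names, and subset contents below are assumptions for illustration, not the actual SEaM variables.

```python
# Minimal sketch of the Step 2 modelling: logistic regression of overall
# satisfaction on one predictor subset at a time, fitted separately per
# student group. File and column names are ASSUMPTIONS.
import pandas as pd
import statsmodels.formula.api as smf

seam = pd.read_csv("seam_responses.csv")  # hypothetical survey extract

subsets = {  # illustrative predictor subsets mirroring slide 78
    "module": ["credits", "level", "examinable_component"],
    "student": ["age_band", "gender", "new_or_continuing"],
    "survey": ["q3_advice", "q5_integration", "q36_assessment"],
}

for group, frame in seam.groupby("student_group"):  # e.g. "UG new"
    for name, cols in subsets.items():
        formula = "overall_satisfied ~ " + " + ".join(cols)
        fit = smf.logit(formula, data=frame).fit(disp=False)
        significant = fit.pvalues[fit.pvalues < 0.05].index.tolist()
        print(group, name, significant)  # candidates for the combined model
```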
  • 80. Satisfaction Modelling: Undergraduate Continuing Students • Key drivers ranked by importance to overall satisfaction: 01 KPI-05 Teaching materials; 02 Q36 Assessment; 03 Q13 Qualification aim; 04 Q5 Integration of materials; 05 Q3 Advice & guidance; 06 Q14 Career relevance; 07 Q23 Tutor knowledge; 08 Q9 Assignment instructions; 09 Q11 Assignment completion; 10 KPI-06 Workload; 11 Q6 Method of delivery; 12 Module: Credits; 13 Module: Level of study; 14 Module: Examinable component; 15 % planned life cycle • Li, N., Marsh, V., & Rienties, B. (2016). Modeling and managing learner satisfaction: use of learner feedback to enhance blended and online learning experience. Decision Sciences Journal of Innovative Education, 14(2), 216-242.
  • 81. Satisfaction Modelling: Undergraduate New Students • Key drivers ranked by importance to overall satisfaction: 01 KPI-05 Teaching materials; 02 Q36 Assessment; 03 Q3 Advice & guidance; 04 Q5 Integration of materials; 05 Q14 Career relevance; 06 Q13 Qualification aim; 07 Age
  • 82. Li, N., Marsh, V., Rienties, B., Whitelock, D. (2016). Online learning experiences of new versus continuing learners: a large scale replication study. Assessment & Evaluation in Higher Education. DOI: 10.1080/02602938.2016.1176989.
  • 84. So what does the OU do in terms of interventions based on learning analytics?
  • 85. Strategic approach • The OU is developing its capabilities in 10 key areas that build the underpinning strengths required for the effective deployment of analytics
  • 88. Analytics4Action framework • Implementation/testing methodologies: randomised controlled trials, A/B testing, quasi-experimental designs, apply-to-all • Community of Inquiry framework as the underpinning typology • Menu of response actions • Methods of gathering data • Evaluation plans • Evidence hub • Key metrics and drill-downs • Deep-dive analysis and strategic insight (a minimal A/B evaluation sketch follows)
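The framework lists A/B testing among its evaluation methodologies. A minimal sketch of what such an evaluation could look like for a retention intervention, with invented counts; the deck does not describe the statistical test actually used.

```python
# Sketch of the A/B-test evaluation listed in the Analytics4Action menu:
# compare retention between an intervention arm and a control arm.
# The counts are INVENTED for illustration.
from statsmodels.stats.proportion import proportions_ztest

retained = [412, 371]  # hypothetical retained students: intervention, control
enrolled = [500, 500]  # students randomised into each arm

z_stat, p_value = proportions_ztest(retained, enrolled)
print(f"z = {z_stat:.2f}, p = {p_value:.4f}")  # small p => retention differs
```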
  • 91. Menu of actions • Cognitive Presence — Learning design (before start): redesign learning materials; redesign assignments. In-action interventions (during module): audio feedback on assignments; bootcamp before exam. • Social Presence — Learning design (before start): introduce graded discussion forum activities; group-based wiki assignment; assign groups based upon learning analytics metrics. In-action interventions (during module): emotional questionnaire to gauge students’ emotions; introduce buddy system; organise additional videoconference sessions; one-to-one conversations; cafe forum contributions; support emails when making progress. • Teaching Presence — Learning design (before start): introduce bi-weekly online videoconference sessions; podcasts of key learning elements in the module; screencasts of “how to survive the first two weeks”. In-action interventions (during module): organise additional videoconference sessions; call/text/Skype students at risk; organise catch-up sessions on specific topics that students struggle with.
  • 93. Conclusions (Part I) 1. Learning design strongly influences student engagement, satisfaction and performance 2. Visualising teachers’ learning design decisions leads to more interactive/communicative designs
  • 94. Conclusions (Part II) 1. 10 out of 11 modules improved retention 2. Visualising learning analytics data can encourage teachers to intervene in-presentation and redesign afterwards

Editor's Notes

  1. Learning Design Team has mapped 100+ modules
  2. For each module, the learning design team together with module chairs create activity charts of what kind of activities students are expected to do in a week.
  3. For each module, detailed information is available about the design philosophy, support materials, etc.
  4. Explain seven categories
  5. 5,131 students responded – a 28% overall response rate, with module-level rates ranging between 18% and 76%
  6. Cluster analysis of 40 modules (>19k students) indicates that module teams design four different types of modules: constructivist, assessment-driven, balanced, or socio-constructivist. The LAK paper by Rienties and colleagues indicates that VLE engagement is higher in modules with socio-constructivist or balanced-variety learning designs, and lower for constructivist designs. In terms of learning outcomes, students rate constructivist modules higher, and socio-constructivist modules lower. However, in terms of student retention (% of students passing), constructivist modules have lower retention, while socio-constructivist modules have higher. Thus, learning design strongly influences behaviour, experience and performance (and we believe we are the first to have mapped this with such a large cohort).
  8. Belinda: This encapsulates our strategy, which is moving forward on all fronts. Kevin will now demonstrate an operational tool available at scale and one of our latest experimental prototypes.
  9. Model that will be rolled out across the University