3. Session outline
» Overview of the Learning Analytics Service
» Overview of the Business Intelligence and Analytics Labs project
» Presentation 1: Moriamo Aduyemi, head of corporate information
systems, Abertay University
» Presentation 2: Niall Sclater, learning analytics consultant, Jisc
4. Effective learning analytics challenge
Rationale
» Universities and colleges wanted help to get started and have access to a
standard set of tools and technologies to monitor and intervene
Priorities identified
» Code of Practice on legal and ethical issues
» Develop a basic learning analytics service including an app for students
» Provide a network to share knowledge and experience
Timescale
» 2015-2016 - Test and develop the tools and metrics
» 2016-2017 - Transition to service (Freemium)
» Sept 2017 - Launch. Measure impact on retention and achievement
5. What do we mean by Learning Analytics?
» The application of big data techniques such as machine learning
and data mining to help learners and institutions meet their goals:
» For our project:
› Improve retention (current project)
› Improve attainment (current project)
› Improve employability (future project)
› Personalised learning (future project)
6. Toolkit and community
» Blog: http://analytics.jiscinvolve.org
» Reports
› Code of practice for learning analytics
› The current state of play in UK higher and further education
› Learning analytics in Higher Education: a review of UK and
international practice
» Mailing: analytics@jiscmail.ac.uk
» Network meetings
8. Current engagement
» Expressions of interest: 85
» Engaged in activity: 35
» Discovery to Sept 16: agreed (28), completed (18), reported (17)
» Learning analytics pre-implementation: (12)
» Learning analytics implementation: (7)
9. Current engagement
» From Sept 2016
» “Readiness Toolkit” with a diagnostic set of questions and support
materials leading to implementation
» Start-up guidelines to get ready for learning analytics implementation
Further details will be announced via analytics@jiscmail.ac.uk
13. Heidi Plus
The new business intelligence service for UK Higher Education
Replaces Heidi (which will be decommissioned in November 2016)
Launched in November 2015 offering:
Improved data content and functionality
Delivery of data sets through commercial data explorer tool
New visualisations and dashboards
New training programme and support materials
Available to HE institutions with a full HESA subscription
Over 80% of current Heidi subscribers have started the Heidi Plus
application process (40% completed)
15. Take a thin vertical slice
Don’t try to waterfall each step
Want quick release of product and steer
[Process overview diagram: multiple dev teams work in parallel through analysis, data orders, user stories and dashboards]
17. Library Analytics labs
» Teams working on Library BI Stories at 0.2 FTE, total estimated effort
15 days from July - Oct 2016
» Both product owners and sector data experts invited:
› Product owner from the sector to steer which stories are of interest
› Sector experts to understand what data sources are available and what
is in the data
› Jisc-contracted data transformation specialist (CETIS)
› Jisc Agile scrum master and Tableau user
» Teams receive experience and guidance of Agile working
» Option for Tableau desktop training to help with creating visualisations
» Apply at http://bit.ly/jisc_library_data_labs_applications
» Queries to siobhan.burke@jisc.ac.uk or myles.danson@jisc.ac.uk
19. Retention
» 178,100 students aged 16-18 failed to finish post-secondary school
qualifications they started in the 2012/13 academic year
› costing £814 million a year - 12 per cent of all government spending on
post-16 education and skills (Centre for Economic and Social Inclusion)
» 8% of undergraduates drop out in their first year of study
› This costs universities around £33,000 per student
» Students with 340 UCAS points or above were considerably less likely
(4%) than those with fewer UCAS points (9%) to leave their courses
without their award
20. Attainment
» 70% of students reporting a parent with HE qualifications achieved an
upper degree, as against 64% of students reporting no parent with HE
qualifications
» In all disciplines except Computer Science, Medicine and Dentistry, and
Physical Science, students with a parent with an HE qualification were
more likely to have achieved an upper degree
» Overall, 70% of White students and 52% of BME students achieved an
upper degree
21. Marist College – Academic early alert system
Approach
» Supported by the Bill and Melinda Gates Foundation. Investigated how
use of Academic Early Alert systems impacts on final course grades and
content mastery
Outcome
» Analysis showed a statistically significant positive impact on final
course grades
» The most important predictor of future academic success was found
to be partial contributions to the final grade
» The predictive models developed at one institution can be
transferred to very different institutions while retaining most of
their predictive abilities
» Simply making students aware that they are at risk may suffice
22. Signals at Purdue University
Approach
» Signals’ predictive algorithm is based on performance, effort, prior
academic history and student characteristics
Outcome
» Problems are identified as early as the second week in the semester
» Students are given feedback through traffic lights – and from
messages tailored by their instructors
» Students using Signals seek help earlier and more frequently
» One study showed 10% more As and Bs were awarded for courses
using Signals than for previous courses which did not use Signals
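A Signals-style traffic light can be sketched in miniature: combine the four kinds of input the slide lists (performance, effort, prior academic history, student characteristics) into a weighted risk score, then map it to a light. The weights, factor encodings and cut-offs below are hypothetical illustrations, not Purdue's actual algorithm:

```python
# Illustrative sketch of a Signals-style traffic light system.
# Weights and cut-off values are hypothetical, not Purdue's.

def risk_score(performance: float, effort: float,
               prior_history: float, characteristics: float) -> float:
    """Combine four 0-1 risk factors (higher = more at risk)
    into a single weighted score."""
    return (0.4 * performance         # current course performance
            + 0.3 * effort            # e.g. VLE interaction vs peers
            + 0.2 * prior_history     # prior academic history
            + 0.1 * characteristics)  # student characteristics

def traffic_light(score: float) -> str:
    """Map a risk score to the feedback shown to the student."""
    if score >= 0.6:
        return "red"    # high risk: instructor sends a tailored message
    if score >= 0.3:
        return "amber"  # moderate risk: nudge towards support services
    return "green"      # on track

print(traffic_light(risk_score(0.9, 0.8, 0.5, 0.2)))  # red
```

Because the score is recomputed as new performance and effort data arrive, a student can be flagged early in the semester, well before a failing grade is recorded.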
23. UK Open University – developing an analytics mind-set
Approach
» Investing in strategic learning analytics programme to enhance
success of over 200,000 students
Outcome
» Availability of data – macro-level aggregation of all data relating to
student learning and experience
» Analysis of that data – multiple dashboards and tools under
development, including student views
» Interventions and processes that enhance success – including input
into learning and assessment design at module level
24. Predictive analytics at Nottingham Trent University
Approach
» The most prominent UK example: a university-wide dashboard for 30,000
students, enhancing retention, community belonging, and attainment
Outcome
» ‘Doubters’ (students less likely to complete) targeted via tutor and peer support
» 27% of students changed learning behaviour because of
analytics feedback
» Engagement in studies shown to be a better predictor of final grade than
entry qualifications or demographics
25. Jisc’s effective learning analytics programme
ECAR Analytics Maturity Index for Higher Education
UK Learning Analytics Network
analytics@jiscmail.ac.uk
27. Example issues (Group | Name | Question | Main type | Importance | Responsibility)
» 2 | Consent | Adverse impact of opting out on individual: If a student is allowed to opt out of data collection and analysis, could this have a negative impact on their academic progress? | Ethical | 1 | Analytics Committee
» 7 | Action | Conflict with study goals: What should a student do if the suggestions are in conflict with their study goals? | Ethical | 3 | Student
» 8 | Adverse impact | Oversimplification: How can institutions avoid overly simplistic metrics and decision making which ignore personal circumstances? | Ethical | 1 | Educational researcher
86 issues in 9 groups
Available from the Effective learning analytics blog: analytics.jiscinvolve.org
28. Example issues (as on the previous slide)
86 issues
jisc.ac.uk/guides/code-of-practice-for-learning-analytics
38. Service: dashboards
Visual tools to allow lecturers, module
leaders, senior staff and support staff
to view:
» Student engagement
» Cohort comparisons
» etc…
Based on either commercial tools
from Tribal (Student Insight) or open
source tools from Unicon/Marist
(OpenDashBoard)
2/03/2016 The case for Learning Analytics
39. Service: Alert and intervention system
Tools to allow management of
interactions with students once risk
has been identified:
» Case management
» Intervention management
» Data fed back into model
» etc…
Based on open source tools from
Unicon/Marist (Student Success Plan)
47. Profile | Aim | Activity | Data sources
» Russell Group | Retention of widening participation + support for students to achieve 2.1 or better | Discovery + Tribal Insight + Learning Locker | Moodle + Student Records
» Research-led university | Retention, improve teaching, empowering students | Discovery + Open Source Suite + Student App | Moodle + Attendance + Student Records
» Teaching-led university with WP mission | Retention: make identifying students more efficient so staff can focus on interventions | Tribal Insight + Learning Locker | Blackboard + Attendance + Student Records
» Research-led university | Student engagement | Discovery + Student App + Learning Locker | Moodle + Student Records
» Teaching-led | Understanding of how Learning Analytics can be used | Discovery + Technical Integration | Moodle
The effective learning analytics challenge was initiated from consultation with stakeholders, senior managers and practitioners, who felt the sector needed support to get up to speed with learning analytics. They prioritised three main areas: a Code of Practice to address the legal and ethical issues of using learning analytics; a set of basic learning analytics tools to allow institutions to get started and make informed decisions; and a network to allow institutions to share practice and learn from each other.
The current project has procured suppliers to provide a learning analytics service, which is currently being tested by several institutions. This will be developed into a full service next year and offered as a new Jisc service from Sept 2017.
What do we mean by learning analytics? The service we are developing will collect data and undertake statistical analysis of historical and current data derived from the learning process to create models that allow for predictions that can be used to improve learning outcomes.
Models are developed by “mining” large amounts of data to find hidden patterns that correlate to specific outcomes.
E.g. Mine VLE event data to find usage patterns that correlate to course grades
The service will provide predictive models initially for retention (identify students at risk of failing) and attainment (identifying students at risk of not achieving a specified level of attainment).
In the future we will look to offer predictive models to support employability and personal/adaptive learning.
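The "mine VLE event data" example above can be sketched in miniature: learn a pattern from historical students whose outcomes are known, then apply it to current students. This is an illustrative toy, not the project's actual models; the data, the single login-count feature and the midpoint threshold are all hypothetical:

```python
# Illustrative sketch: mine historical VLE activity for a pattern that
# correlates with outcomes, then flag at-risk current students.
# All data, features and thresholds are hypothetical.

from statistics import mean

# (student_id, total VLE logins, passed course?) -- toy historical data
history = [
    ("s1", 25, True), ("s2", 30, True), ("s3", 4, False),
    ("s4", 18, True), ("s5", 6, False), ("s6", 2, False),
]

# "Mine" the history: average activity of passing vs failing students
pass_avg = mean(logins for _, logins, passed in history if passed)
fail_avg = mean(logins for _, logins, passed in history if not passed)
threshold = (pass_avg + fail_avg) / 2  # midpoint as a crude decision rule

def at_risk(logins: int) -> bool:
    """Predict risk for a current student from VLE activity alone."""
    return logins < threshold

current = {"s7": 3, "s8": 22}
flags = {sid: at_risk(logins) for sid, logins in current.items()}
print(flags)  # s7 flagged as at risk, s8 not
```

A production model would of course use many more features (attendance, assessment submissions, library use) and a proper statistical method rather than a single midpoint cut-off, but the shape of the pipeline is the same: historical data in, predictive rule out, applied to current students.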
The project consists of the learning analytics architecture (next slide), a toolkit and community.
These consist of a blog with reports and information to assist institutions with readiness to implement learning analytics and technical implementation of the Jisc service.
There are three reports, all linked from the blog: a Code of Practice for Learning Analytics; a report from 18 months ago that reviewed the current state of learning analytics in the UK; and a more recent report on the evidence base for the effectiveness of learning analytics, with 12 international case studies.
If you want to be involved and keep informed about the development of the service, join the analytics jiscmail list.
We also hold quarterly network meetings, which are promoted via the blog and jiscmail list.
Overview of learning analytics architecture.
Red items are components that will include the tools in the project (Tribal student insight, Unicon/Apereo LAP and Student Success Plan, Student App) but also alternative third party or institutional tools.
We have ~400 people on the Jiscmail list and a pipeline of interested institutions (50+ HE, 20+ FE). We are actively engaging with 35 institutions: 28 in discovery (institutional readiness) and 12 in beta implementations.
From Sept 16 we’ll be introducing a new institutional readiness process to help institutions get ready for implementing learning analytics. This will consist of an overview workshop to introduce the service and a diagnostic assessment tool; institutions will complete the assessment tool and then undertake appropriate actions to address its recommendations.
For institutions who are ready to start implementation there will be a set of guidelines for getting set up with data collection and visualisations, ready to implement a predictive analytics solution and the student app.
Details will be announced via the jiscmail list – so join it to participate.
The flagship data delivery service was Heidi (Higher Education Information Database for Institutions), developed in house in 2007. Jisc and HESA collaborated to replace this with a more up-to-date service: we procured Tableau and now offer Heidi Plus, the new and improved service.
Feedback has been very good across the sector.
Heidi Lab, as a Jisc Alpha project (proof of concept), engaged with 390 individuals from 136 universities to develop a successful model of agile analysis. 60 analysts (planners and directors of planning) from 44 universities volunteered to join cross-institutional agile analysis teams for two Heidi Lab cycles of 3 months each at just 0.2 FTE. Teams were supported as they identified and refined widely felt problem areas (see example on the slide – covering student, staff, research, estates etc.) linked to national policy. They explored the data landscape for supportive insights, recording the issues encountered in our data catalogue. Finally they produced interactive dashboards using Jisc-sourced leading edge software as proofs of concept for services to customers.
Teams are encouraged to take a thin vertical slice through the process to rapidly create proof-of-concept dashboards. They don't want to spend too much time on each step, so in 4 weeks they produce a ‘minimum viable product’. That way they know whether something is viable or not and can adapt in light of experience. The process produces a range of dashboards ready for polishing to service production standard in just 3 months.