This presentation was given at the 7th International Learning Analytics and Knowledge Conference (LAK '17) in Vancouver, BC. It presents the trends and issues in student-facing learning analytics reporting systems research identified by a literature review of 94 articles.
1. Trends and Issues in Student-Facing Learning Analytics Reporting Systems Research
Robert Bodily - Brigham Young University, USA
Katrien Verbert - University of Leuven, Belgium
2. Research Questions
1. What types of features do student-facing learning analytics reporting systems have?
2. What are the different kinds of data collected in these systems?
3. How are the system designs analyzed and reported on?
4. What are the perceptions of students about these systems?
5. What is the actual effect of these systems on student behavior, student skills, and student achievement?
3. Inclusion Criteria
1. Track learning analytics data (e.g. time spent or resource use) beyond assessment data.
2. Report the learning analytics data directly to students.
3. List of articles can be found at http://bobbodily.com/article_list
4. Methodology
Sources searched spanned Education, Education & Computer Science, and Computer Science:
• Conference proceedings: LAK and EDM proceedings; IEEE Xplore
• Peer-reviewed journal articles: ERIC; Google Scholar; ACM Database; Computers and Applied Sciences
• Literature reviews: Verbert et al. 2013; Verbert et al. 2014; Schwendimann et al. 2016; Drachsler et al. 2015; Romero and Ventura 2010
945 articles were retrieved from the initial search; 94 fit the inclusion criteria.
5. Coding Categories
Functionality of the system
Data sources tracked and reported
Design analyses conducted
Student perceptions
Actual measured effects
Student use
6. Functionality
Purpose of the system
Data mining
Visualizations
Class comparison
Recommendation
Feedback
Interactivity
7. Purpose of the System
Category Name # of articles % of articles
Awareness or reflection 35 37
Recommend resources 27 29
Improve retention or engagement 18 19
Increase online social behavior 7 7
Recommend courses 3 3
Other 4 4
8. Data Mining
My article definition: any type of statistical analysis
beyond descriptive statistics
• 46 (49%) included a data mining component
• More common in recommender and data mining
systems, less common in dashboards
• Only 16 (17%) included both a visualization component and a recommendation component
9. Visualizations
Visualization Type # of Articles
Bar chart 25
Line chart 19
Table 15
Network graph 10
Scatterplot 10
Donut graph 5
Radar chart 4
Pie chart 3
Timeline 3
Word cloud 3
Other 23
Visualizations in the Other category:
• Learning path visualization
• Box and whisker plot
• Tree map
• Explanatory decision tree
• Parallel coordinates graph
• Planning and reflection tool
• Plant metaphor visual
• Tree metaphor
10. Class Comparison
My article definition: the system had to allow students to see other students' data in comparison with their own
• 35 (37%) of articles included a class comparison
feature
• Which students are motivated by comparison?
• Which students are unmotivated by comparison?
• What effect does personalizing reporting system
features have on student motivation?
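To ground these questions, here is a minimal sketch of what a class-comparison feature computes for one metric (all names and data are hypothetical, not taken from any reviewed system):

```python
# Hypothetical sketch of a class-comparison feature: show a student their
# own value, the class average, and their percentile rank for one metric
# (here, minutes spent on course resources).
from statistics import mean

def class_comparison(student_minutes, class_minutes):
    """Return (student value, class average, percentile rank), where the
    percentile is the share of class values at or below the student's."""
    avg = mean(class_minutes)
    at_or_below = sum(1 for m in class_minutes if m <= student_minutes)
    percentile = 100 * at_or_below / len(class_minutes)
    return student_minutes, avg, percentile

# Example: a student who spent 120 minutes, in a class of five
value, avg, pct = class_comparison(120, [60, 90, 120, 150, 300])
print(value, avg, pct)
```

Even this small example raises the motivation questions above: whether seeing "60th percentile" motivates a given student depends on the student, which is exactly what the personalization question targets.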
11. Recommendations
My article definition: Recommending or suggesting to
the student what to do.
• 43 articles (46%) included recommendations
• 78% of data mining articles provided
recommendations
• Future research should examine differences
between transparent recommendations and
traditional black-box recommendations
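To make the transparent/black-box distinction concrete, here is a hypothetical sketch of a fully transparent rule-based recommendation, where the rule and its supporting data are shown to the student alongside the suggestion (all names and thresholds are invented):

```python
# Hypothetical sketch of a transparent recommendation: unlike a black-box
# recommender, the rule ("below the class median") and the data it used
# are surfaced to the student with each suggestion.
def recommend(student_views, class_median_views):
    """Suggest revisiting resources the student has opened less often
    than the class median, explaining the evidence for each suggestion."""
    tips = []
    for resource, views in student_views.items():
        median = class_median_views[resource]
        if views < median:
            tips.append(f"Consider revisiting '{resource}': you opened it "
                        f"{views} time(s); the class median is {median}.")
    return tips

tips = recommend({"Unit 3 quiz review": 1, "Lecture 4 video": 5},
                 {"Unit 3 quiz review": 4, "Lecture 4 video": 3})
```

A black-box recommender would emit only the suggestion; the research question above is whether exposing the rule and data, as here, changes how students respond.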
12. Feedback
My article definition: Telling the user what has
happened using text.
• 17 systems (18%) provided text feedback
• Used frequently for just-in-time feedback and rarely
for unit-level or concept-level data reports
13. Interactivity
My article definition: Allowing the user to interact with
the reporting system in some way
• 29 systems (31%) included an interactivity component
• Includes linking to content, filtering data results, or
providing simple/advanced views
• Future research should examine how students are
using these interactive features
14. Data Sources
Subcategory Name # of Articles % of Articles
Resource use 71 76%
Assessment data 34 36%
Social interaction 33 35%
Time spent 29 31%
Other sensor data 7 7%
Manually reported data 5 5%
16. Needs Assessment
• 6 articles (6%) included a needs assessment
• Santos, Verbert, Govaerts, & Duval (2013)
• Surveyed students to identify needs
• Had students rank needs on importance
• Targeted the most important student issues
• Future research should conduct and explicitly report needs assessments.
Design Analysis
17. Information Selection
• 14 articles (15%) included information selection
justification
• Ott, Robins, Haden, & Shephard (2015)
• Examined the literature
• Feild (2015)
• Exploratory data analysis
• Iandoli, Quinto, De Liddo, and Buckingham Shum
(2014)
• Used a theoretical framework
18. Visual Design
• 12 articles (13%) discussed the visual design or
recommendation design process
• Olmos & Corrin (2012)
• Iterative visual design process
• 85% of articles only presented the final visualization or
recommendation
• Future research should report on the visual design
process used
19. Usability Testing
• 10 articles (11%) reported a usability test
• Santos, Verbert, & Duval (2012) and Santos,
Govaerts, Verbert, & Duval (2012)
• System Usability Scale (SUS)
• Santos, Boticario, and Perez-Marin
• Usability and accessibility expert
• Future research should use a system usability scale
(SUS), evaluation expert, or other appropriate
methods to assess usability
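For reference, the SUS mentioned above is scored with a fixed formula over ten 1-5 Likert responses; a minimal sketch of the standard Brooke scoring:

```python
# Standard System Usability Scale (SUS) scoring: ten 1-5 Likert items.
# Odd-numbered items are positively worded and contribute (response - 1);
# even-numbered items are negatively worded and contribute (5 - response).
# The sum (0-40) is multiplied by 2.5 to give a 0-100 score.
def sus_score(responses):
    if len(responses) != 10:
        raise ValueError("SUS requires exactly ten responses")
    total = sum((r - 1) if i % 2 == 0 else (5 - r)
                for i, r in enumerate(responses))
    return total * 2.5

# Best possible answers: agree with positive items, disagree with negative
print(sus_score([5, 1, 5, 1, 5, 1, 5, 1, 5, 1]))  # 100.0
```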
22. Student Behavior Changes
1. 21% of students accepted the system recommendation to view additional content (Hsu, 2008)
2. Students participating in courses using the system were more likely to continue taking classes than those who did not enroll in these courses (Arnold & Pistilli, 2012)
3. Students who enabled notifications (on 2 out of 3 systems) showed increased contributions in the social network space (Xu & Makos, 2015)
4. Students visited the discussion space more frequently but did not post more frequently (Nakahara, Yaegashi, Hisamatsu, & Yamauchi, 2005)
5. The percentage of posts viewed increased for all students, but there were few sustained changes (Wise, Zhao, & Hausknecht, 2014)
6. The number of students completing assignments increased and LMS use increased (Chen, Chang, & Wang, 2008)
7. About 50% of students accepted recommendations from the system (Huang, Huang, Wang, & Hwang, 2009)
8. There was an 83.3% student interaction increase after recommendations were given (Holanda et al., 2012)
9. Students completed assignments more quickly and were able to complete the entire course more quickly (Vesin, Klasnja-Milicevic, Ivanovic, & Budimac, 2013)
10. *For two of the three visualizations, student post quantity increased; for one of three, student post quantity decreased (Beheshitha, Hatala, Gašević, & Joksimović, 2016)
11. *Students logged in more frequently, completed their coursework more quickly, completed more questions, and answered more questions correctly on assignments (Santos et al., 2014)
12. *There were no significant differences between the treatment and control groups in terms of learning efficiency (Janssen et al., 2007)
Actual Measured Effects
*Sample size greater than 150 and conducted an actual experiment (randomized control trial or other equivalent group mean difference testing method)
23. Student Behavior Changes (summary)
1. N > 150 and used a randomized control trial or other equivalent group mean difference testing method
2. For two of the three visualizations, student post quantity increased; for one of three, student post quantity decreased (Beheshitha, Hatala, Gašević, & Joksimović, 2016)
3. Students logged in more frequently, completed their coursework more quickly, completed more questions, and answered more questions correctly on assignments (Santos et al., 2014)
4. There were no significant differences between the treatment and control groups in terms of learning efficiency (Janssen et al., 2007)
24. Student Achievement Changes
1. No significant achievement differences (Grann & Bushway, 2014)
2. More A’s and B’s and fewer C’s and D’s (Arnold & Pistilli, 2012)
3. No significant achievement differences (Park & Jo, 2015)
4. Students received more passing grades (Denley, 2014)
5. Frequency and quality of posts was affected positively and negatively (Beheshitha, Hatala, Gašević, & Joksimović, 2016)
6. Students performed significantly better on the evaluation task (Huang, Huang, Wang, & Hwang, 2009)
7. Treatment group performed significantly better on final exam (Wang, 2008)
8. *No significant differences between treatment and control (Santos, Boticario, & Perez-Marin, 2014)
9. *No significant achievement differences (Ott, Robins, Haden, & Shephard, 2015)
10. *No significant achievement differences, but one course had an effect with Pell-eligible students (Dodge, Whitmer, & Frazee, 2015)
11. *Treatment group performed significantly better on final exam (Kim, Jo, & Park, 2015)
*Sample size greater than 150 and conducted an actual experiment (randomized control trial or other equivalent group mean difference testing method)
25. Student Achievement Changes (summary)
1. N > 150 and used a randomized control trial or other equivalent group mean difference testing method
2. No significant differences between treatment and control (Santos, Boticario, & Perez-Marin, 2014)
3. No significant achievement differences (Ott, Robins, Haden, & Shephard, 2015)
4. No significant achievement differences, but one course had an effect with Pell-eligible students (Dodge, Whitmer, & Frazee, 2015)
5. Treatment group performed significantly better on final exam (Kim, Jo, & Park, 2015)
26. Student Skills Changes
1. Significant increase in student self-awareness accuracy (Kerly, Ellis, & Bull, 2008)
2. Female students had increased interest when they had a choice to use the system; male students reported higher interest with mandatory notifications (Muldner, Wixon, Rai, Burleson, Woolf, & Arroyo, 2015)
27. Student Use
• 12 articles (13%) tracked some form of student use
• Most articles reported on aggregate class level statistics
• Percent of class that accessed the system
• Number of interactions over the course of the semester
• Future research should investigate how students are using
visualization or recommendation reports
• Student use data could help us understand why or how a system is
helping students
28. Practitioner Recommendations
Question Category % of articles
What is the intended goal of the system? Intended Goal 100
What visual techniques will best represent your data? Visualizations 13
What types of data support your goal? Information Selection 15
What do students need? Does it align with your goal? Needs Assessment 6
Is the system easy and intuitive to use? Usability Test 11
Why use the visual techniques you have chosen? Visual Design 13
How do students perceive the reporting system? Student Perceptions 17
What is the effect on student behavior/achievement? Actual Effects 18
How are students using the system? How often? Why? Student Use 13
29. Future Research
1. Student use: How are students using reporting systems? Are students even using them?
2. Design process: Are some data types and visualization types better than others, and in what contexts?
3. Design process: Only a few authors reported on conducting needs assessments and usability tests. What effect do these methods have on experimental rigor and accuracy of findings?
4. Experimental research: Quasi-experimental methods, such as propensity score matching, have yet to be used in a student-facing reporting tool context.
5. Experimental research: Current research shows mixed results regarding the efficacy of these systems. More experimental research is needed on the effects of these systems on student behavior, achievement, and skills.
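The propensity score matching idea in item 4 can be sketched as follows, assuming each student's propensity score (estimated probability of using the reporting system, e.g. from a logistic regression on prior activity) has already been computed; all IDs and scores here are invented:

```python
# Greedy nearest-neighbor matching on precomputed propensity scores:
# pair each treated student (system user) with the unused control student
# whose score is closest, so the matched groups are comparable.
def match_nearest(treated, control):
    available = dict(control)  # control_id -> propensity score
    pairs = []
    for t_id, t_score in treated:
        c_id = min(available, key=lambda c: abs(available[c] - t_score))
        pairs.append((t_id, c_id))
        del available[c_id]  # match without replacement
    return pairs

treated = [("t1", 0.80), ("t2", 0.35)]                # used the system
control = [("c1", 0.30), ("c2", 0.78), ("c3", 0.55)]  # did not
print(match_nearest(treated, control))  # [('t1', 'c2'), ('t2', 'c1')]
```

After matching, a group mean difference test on the paired outcomes approximates the experimental comparison the slide calls for, without requiring random assignment.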