Research and Deployment of Analytics in Learning Settings
1. Research and Deployment of Analytics in Learning Settings
PAWS Meeting, 9 April 2012
School of Information Sciences, University of Pittsburgh
Katrien Verbert
2. Human-Computer Interaction
Awareness & Sense-making
prof. Erik Duval
Computer Graphics
prof. Phil Dutré
Language Intelligence & Information Retrieval
prof. Sien Moens
Flexible Interaction between people and information
http://hci.cs.kuleuven.be/
8. Duval, Erik. Attention please! Learning analytics for visualization and recommendation, Proceedings of LAK11: 1st
International Conference on Learning Analytics and Knowledge, pages 9-17, ACM (2011)
9. objectives
• self-monitoring for learners
• awareness for teachers
• learning resource use and recommendations
• part of Learning Analytics research [ACM LAK conf., Siemens 2011,
Duval 2011]
10. overview
• Student Activity Meter
• Step Up!
• Recommender systems for learning
• Future research plans
11.
12. Student Activity Meter (SAM): demo
http://ariadne.cs.kuleuven.be/monitorwidget-rwtheval/ or http://bit.ly/I8AYV1
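SAM builds its overviews from tracked usage events. As a rough illustration of the kind of aggregation behind such a time overview (a sketch with made-up data and field names, not SAM's actual implementation):

```python
# Sketch: aggregate a hypothetical activity log into cumulative time per student
# and plot it as a line chart, roughly the kind of overview SAM provides.
# The event format and all values below are assumptions, not SAM's data model.
from collections import defaultdict
from datetime import date
import matplotlib.pyplot as plt

# Hypothetical tracked events: (student, day, minutes spent)
events = [
    ("alice", date(2012, 3, 1), 30), ("alice", date(2012, 3, 3), 45),
    ("bob",   date(2012, 3, 1), 10), ("bob",   date(2012, 3, 4), 20),
]

# Accumulate minutes per student per day
per_student = defaultdict(lambda: defaultdict(int))
for student, day, minutes in events:
    per_student[student][day] += minutes

# Plot cumulative time spent over the course period, one line per student
for student, days in per_student.items():
    xs = sorted(days)
    total, ys = 0, []
    for d in xs:
        total += days[d]
        ys.append(total)
    plt.plot(xs, ys, marker="o", label=student)

plt.xlabel("date")
plt.ylabel("cumulative minutes spent")
plt.legend()
plt.show()
```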
13. Design Based Research methodology
• Rapid prototyping
• Evaluate ideas in short iteration cycles of design, implementation and evaluation
• Focus on usefulness and usability
• Think-aloud evaluations, SUS (System Usability Scale) surveys (scoring sketched below), usability lab, ...
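The SUS surveys mentioned above produce a 0-100 score via a fixed scoring rule: odd items contribute (rating - 1), even items contribute (5 - rating), and the sum is scaled by 2.5. A small helper with made-up responses:

```python
def sus_score(responses):
    """Standard SUS scoring: 10 items rated 1-5.
    Odd items contribute (rating - 1), even items (5 - rating);
    the sum is scaled by 2.5 to a 0-100 score."""
    assert len(responses) == 10
    contrib = [
        (r - 1) if i % 2 == 0 else (5 - r)  # items 1, 3, 5, ... sit at even indices
        for i, r in enumerate(responses)
    ]
    return sum(contrib) * 2.5

# Example (made-up) responses from one participant:
print(sus_score([4, 2, 5, 1, 4, 2, 5, 2, 4, 1]))  # -> 85.0
```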
14. Iteration one
• usability and user satisfaction evaluation
• 12 CS students, using a …-based time tracker
• 2 evaluation sessions:
  • task-based interview with think-aloud (after 1 week of tracking)
  • user satisfaction (SUS & MSDT) (after 1 month)
16. iteration two
• 20 persons: 3 CGIAR, 2 Law, 8 CS teachers and 7 CS TAs
• an online survey about usefulness, teacher issues and how the tool can resolve these
• on average, 40 minutes spent using SAM
17. CGIAR CASE STUDY
(columns: issue for teacher / addressed by SAM)
• Provide feedback to the students: ✔ / ?!
• Being aware of what students are doing: ✔ / ✔
• Knowing about collaboration and communication: ✔ / ✗
• Knowing which documents are used and how much: ✔ / ✔
• Knowing how and when online tools have been used: ✔ / ?!
• Finding the students who are not doing well: ✔ / ?!
• Finding the best students: ?! / ?!
• Knowing how much time students spent: ?! / ✔
• Knowing if external learning resources are used: ✔ / ?!
18. iteration | demographics | evaluation goal | design changes | negative | positive
I. 1st iteration | 12 CS students | usability, satisfaction, preliminary usefulness | – | small usability issues | ↑ learnability, ↓ errors, good satisfaction, usefulness positive
II. | 19 teachers & TAs | assessing teacher needs, use, usefulness | resource recomm., help function | recomm. not useful | provides awareness, all vis. useful, many uses, 90% want it
19. iteration three
• open course on learning and knowledge analytics, http://bit.ly/dWYVbX
• 12 visual analytics enthusiasts + experts (who also teach)
• almost identical survey to the CGIAR case
20. LAK CASE STUDY
(columns: issue for teacher / addressed by SAM)
• Provide feedback to the students: ✔ / ✔
• Being aware of what students are doing: ✔ / ?!
• Knowing about collaboration and communication: ✔ / ✗
• Knowing which documents are used and how much: ✔ / ?!
• Knowing how and when online tools have been used: ✗ / ?!
• Finding the students who are not doing well: ✔ / ?!
• Finding the best students: ?! / ✗
• Knowing how much time students spent: ?! / ✔
• Knowing if external learning resources are used: ?! / ?!
21. ideas from experts
2: the used resource types
5: detailed information per student
4: detailed information of 2 students
3: detailed usage stats of resources
1: stats or vis. on content creation
22. iteration | demographics | evaluation goal | design changes | negative | positive
I. 1st iteration | 12 CS students | usability, satisfaction, preliminary usefulness | – | small usability issues | ↑ learnability, ↓ errors, good satisfaction, usefulness positive
II. | 19 teachers & TAs | assessing teacher needs, use, usefulness | resource recomm., help function | recomm. not useful | provides awareness, all vis. useful, many uses, 90% want it
III. | 12 participants | assessing teacher needs, expert feedback, use, usefulness | re-orderable parallel coordinates with histograms | most addressed needs are indecisive | provides awareness and feedback, many uses, 66% want it, recomm. can be useful
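Row III above lists re-orderable parallel coordinates with histograms as a design change. As a rough illustration only (not SAM's implementation, and without the interactive re-ordering or the axis histograms), the sketch below draws a static parallel-coordinates view of made-up per-student metrics with pandas and matplotlib:

```python
# Minimal parallel-coordinates sketch of per-student metrics (made-up data).
# SAM's version is interactive (re-orderable axes, histograms); this is static.
import pandas as pd
import matplotlib.pyplot as plt
from pandas.plotting import parallel_coordinates

students = pd.DataFrame({
    "student": ["s1", "s2", "s3", "s4"],
    "time_spent_h": [12, 4, 20, 8],
    "documents_used": [15, 6, 22, 10],
    "forum_posts": [3, 1, 7, 2],
})

# Each line is one student; each vertical axis is one metric.
parallel_coordinates(students, class_column="student")
plt.ylabel("metric value")
plt.show()
```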
23. Iteration four
• a CS course on C++ programming
• 11 people: 7 teachers, 2 TAs and 1 course planner
• richer data set: tracking from the programming environment (see the sketch below)
• qualitative study using a structured face-to-face interview
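As a purely hypothetical illustration of what a single tracked event from a programming environment could look like (the field names and values below are assumptions, not the format used in this study):

```python
# Hypothetical example of one tracked event from a programming environment;
# field names are assumptions, not the data format used in iteration four.
from dataclasses import dataclass, asdict
from datetime import datetime
import json

@dataclass
class TrackedEvent:
    student_id: str      # pseudonymous identifier
    timestamp: datetime  # when the action happened
    action: str          # e.g. "compile", "run", "edit"
    artifact: str        # e.g. the file or exercise the action applies to
    duration_s: int      # time attributed to the action, in seconds

event = TrackedEvent("s42", datetime(2012, 3, 14, 10, 5), "compile",
                     "exercise3.cpp", 90)
print(json.dumps(asdict(event), default=str))
```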
25. iteration | demographics | evaluation goal | design changes | negative | positive
I. 1st iteration | 12 CS students | usability, satisfaction, preliminary usefulness | – | small usability issues | ↑ learnability, ↓ errors, good satisfaction, usefulness positive
II. | 19 teachers & TAs | assessing teacher needs, use, usefulness | resource recomm., help function | recomm. not useful | provides awareness, all vis. useful, many uses, 90% want it
III. | 12 participants | assessing teacher needs, expert feedback, use, usefulness | re-orderable parallel coordinates (PC) with histograms | most addressed needs are indecisive | provides awareness and feedback, many uses, 66% want it, recomm. can be useful
IV. | 11 teachers & TAs | use, usefulness, satisfaction | filter & search, icons, zooming in line chart, editing PC axes | conflicting visions of students doing well or at risk | provides time overview, provides course overview, PC assist with detecting problems, many uses & insights, 100% want it
26. conclusion
• SAM enables finding a wide variety of new insights
• a better course overview
• understanding student time spending
• almost all participants want to continue using SAM
27. Santos Odriozola, Jose Luis; Govaerts, Sten; Verbert, Katrien; Duval, Erik. Goal-oriented visualizations of activity tracking: a case study with engineering students. Proceedings of LAK12: 2nd International Conference on Learning Analytics and Knowledge, 10 pages, ACM (to appear).
33. Nikos Manouselis, Hendrik Drachsler, Katrien Verbert and Erik Duval. Recommender Systems for Learning.
SpringerBriefs in Computer Science, 90 pages, Springer US (to appear).
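For context on the recommender-systems thread of the talk, the following is a minimal, generic user-based collaborative filtering sketch over made-up ratings of learning resources; it illustrates the general technique only and is not the approach of the book or of the systems discussed in this deck.

```python
# Tiny user-based collaborative filtering sketch over made-up ratings of
# learning resources; illustrative only.
import math

ratings = {  # user -> {resource: rating}
    "u1": {"r1": 5, "r2": 3, "r3": 4},
    "u2": {"r1": 4, "r2": 2, "r4": 5},
    "u3": {"r2": 5, "r3": 2, "r4": 1},
}

def cosine(a, b):
    """Cosine similarity between two sparse rating vectors."""
    common = set(a) & set(b)
    if not common:
        return 0.0
    num = sum(a[r] * b[r] for r in common)
    return num / (math.sqrt(sum(v * v for v in a.values())) *
                  math.sqrt(sum(v * v for v in b.values())))

def recommend(user, k=2):
    """Score unseen resources by similarity-weighted ratings of the k most similar users."""
    sims = {u: cosine(ratings[user], r) for u, r in ratings.items() if u != user}
    scores = {}
    for u, sim in sorted(sims.items(), key=lambda x: -x[1])[:k]:
        for res, rating in ratings[u].items():
            if res not in ratings[user]:
                scores[res] = scores.get(res, 0.0) + sim * rating
    return sorted(scores.items(), key=lambda x: -x[1])

print(recommend("u1"))  # resources u1 has not seen, ranked by weighted score
```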
41. how to achieve objectives
• setting up a website / maintaining the TELeurope group community
• setting up an open data repository for sharing educational datasets and related research outcomes
• organizing an annual workshop and special issue (SI)
• organizing a data competition, like in TREC
42.
43. dataTEL challenge / dataTEL cafe event
• a call for TEL datasets
• eight data sets submitted
http://bit.ly/ieqmWW
62. References
• Duval, E. (2011). Attention please!: Learning analytics for visualization and recommendation. In Proceedings of the 1st International Conference on Learning Analytics and Knowledge, pp. 9-17, ACM.
• Keim, D., Andrienko, G., Fekete, J.-D., Görg, C., Kohlhammer, J., and Melançon, G. (2008). Visual Analytics: Definition, Process, and Challenges. In A. Kerren, J. Stasko, J.-D. Fekete, and C. North (eds), Information Visualization, volume 4950 of Lecture Notes in Computer Science, pp. 154-175. Springer Berlin / Heidelberg.
• Klerkx, J. and Duval, E. (2009). Visualising social bookmarks. Journal of Digital Information, 10(2):1-40.
• O'Donovan, J., Gretarsson, B., Bostandjiev, S., Hall, C., and Hollerer, T. (2010). SmallWorlds: Visualizing Social Recommendations. In G. Melançon, T. Munzner, and D. Weiskopf (eds), Eurographics / IEEE-VGTC Symposium on Visualization 2010, Volume 29, Number 3, 10 pages.
• Siemens, G. and Gasevic, D. (eds) (2011). Proceedings of the 1st Conference on Learning Analytics and Knowledge 2011. ACM.
Editor's notes
Microsoft Desirability Toolkit
'Knowing about collaboration and communication' (the 3rd row, with *) is not addressed by SAM, but is added to check a possible bias. The highest rated were 'knowing how much time students spent' and 'awareness of what students are doing'. Finding students in trouble and finding the best students were also rated rather low. Awareness of resource use has mostly been met, but can be improved by differentiating external resources (the external resource use issue is indecisive).
Actual use was high
For this evaluation we wanted to get expert feedback and see how SAM would operate in a large course. SAM was deployed in an open online course on Learning and Knowledge Analytics (LAK), an emerging research domain in TEL that focuses on better measurement, analysis, visualization and reporting of data about learners [2]. More details on iteration 2 and 3 are available in [10]. Design changes: allow re-ordering of the axes through drag-and-drop for better metrics comparison; to cope with the line density better, configurable histograms are added to the axes. 270 participants.
Providing feedback is most important. Both LAK and CGIAR teachers want to understand the document use. The main differences between LAK and CGIAR teachers are: LAK rates finding students at risk higher and finding good students lower; online tool use is not so interesting for LAK teachers, and collaboration is more important. Awareness is also rated high. Comparing with the objectives, awareness and resource use are again the most important.
How can data sets be shared according to privacy and legal protection rights? How to develop a respective policy to use and share data sets? How to pre-process data sets to make them suitable for other researchers? How to define common evaluation criteria for TEL recommender systems? How to develop overview methods to monitor the performance of TEL recommender systems on data sets?
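One of the questions above is how to pre-process data sets so they can be shared. A common first step is pseudonymizing user identifiers; the sketch below uses a keyed hash over made-up records and is illustrative only, not a complete privacy solution.

```python
# Sketch: pseudonymize user identifiers with a keyed hash before sharing a
# usage data set. Illustrative only; real releases need a broader privacy review.
import csv, hmac, hashlib, io

SECRET_KEY = b"replace-with-a-secret-kept-by-the-data-owner"

def pseudonym(user_id: str) -> str:
    """Stable, non-reversible alias for a user id (without the key)."""
    return hmac.new(SECRET_KEY, user_id.encode(), hashlib.sha256).hexdigest()[:12]

raw = io.StringIO("user,resource,minutes\nalice,r1,30\nbob,r2,10\n")
out = io.StringIO()
reader = csv.DictReader(raw)
writer = csv.DictWriter(out, fieldnames=reader.fieldnames)
writer.writeheader()
for row in reader:
    row["user"] = pseudonym(row["user"])  # replace the identifier, keep the rest
    writer.writerow(row)
print(out.getvalue())
```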