eMadrid Gaming4Coding - Possibilities of game learning analytics for coding learning games.pptx
1. Possibilities of game learning analytics in Coding4Learning games
Baltasar Fernandez-Manjon
balta@fdi.ucm.es @baltafm
2. Is our serious game effective?
Is the game contributing to changing the students' perception of programming? → Girls?
Are the students learning computational concepts by playing the game?
3.
4. Formal validation of serious games
The most common methodology is pre-post questionnaires in experiments:
Is there a significant difference between pre-questionnaire and post-questionnaire results?
The pre and post questionnaires should have been previously validated.
5. Game Learning Analytics
breaking the game black box
model to obtain information
while students play.
Manuel Freire, Ángel Serrano-Laguna, Borja Manero, Iván Martínez-Ortiz, Pablo Moreno-Ger, Baltasar Fernández-Manjón (2016): Game
Learning Analytics: Learning Analytics for Serious Games. In Learning, Design, and Technology (pp. 1–29). Cham: Springer International
Publishing. http://doi.org/10.1007/978-3-319-17727-4_21-1.
●GLA is learning analytics applied to serious games
○collect, analyze and visualize data from learners’
interactions with SGs
Game Learning Analytics (GLA)
6. Using game learning analytics to improve the SG lifecycle
Game learning analytics: Learning Analytics + Game Analytics
● LA improves the learning process
● GA improves the game lifecycle
○ Long used in the game industry
● GLA improves the serious game lifecycle
○ Focuses on the learning process
○ Measures the impact of the game
○ Provides feedback for the learning process
○ Detects game issues (e.g. students getting stuck)
● GLA is costly and complex to implement
7. Game Learning Analytics process
When?
● Real-time
● Off-line
Stakeholders
● Developers - Designers
● Researchers
● Teachers
● Students
Process: play → collection & analysis → reports → information & feedback
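As a minimal sketch of this play → collection & analysis → reports loop (illustrative trace format and player IDs, not the e-UCM implementation):

```python
from collections import Counter

# Illustrative trace format: (player, verb, target) tuples emitted while playing.
traces = [
    ("p1", "initialized", "level1"),
    ("p1", "interacted", "block_move"),
    ("p1", "completed", "level1"),
    ("p2", "initialized", "level1"),
]

def analyze(traces):
    """Collection & analysis step: aggregate raw interactions per player."""
    report = {}
    for player, verb, _target in traces:
        report.setdefault(player, Counter())[verb] += 1
    return report

def feedback(report):
    """Report step: turn aggregates into information for stakeholders."""
    return {p: f"{c['completed']} level(s) completed" for p, c in report.items()}

print(feedback(analyze(traces)))
```

In a real deployment the analysis runs server-side (real-time or off-line) and each stakeholder gets its own view of the same aggregates.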
8. Game Learning Analytics is Not Informagic!!
● Educational design should guide both game design
and analytics design → Game Learning Analytics Model
○ the interaction data to be collected (the more the merrier)
○ the analyses to perform on the interaction data
■ from click-stream to educational insight
○ visualizations/reports/dashboards
No meaningful information can be extracted without defining the required data early in the design process!
Game Learning Analytics Model
Ivan Perez-Colado, Cristina Alonso-Fernández, Manuel Freire, Iván Martínez-Ortiz, Baltasar Fernández-Manjón (2018): Game Learning Analytics is
not informagic!. IEEE Global Engineering Education Conference (EDUCON), April 18-20, 2018, Santa Cruz de Tenerife, Canary Islands, Spain.
9. Learning Analytics Model (LAM)
1. Design-implementation-data cycle: allows linking specific educational
objectives to game design and to game interaction data.
2. Verifies, through data, that the educational objectives correspond to
the objectives in the game.
3. Specifies the traces to be sent: the information the game must record
and send so that it can be verified that the game objectives are met.
4. The analysis model: how the traces should be analysed and
interpreted.
5. May include specific analyses and visualisations for the different
stakeholders involved.
10. Game Learning Analytics Model
● What data is to be collected from the game and how it relates to learning goals
● Which specific statements (e.g. in xAPI format) are to be tracked from the game containing that information
● How the collected statements are to be analyzed, and what information is to be reported and/or visualized
11. Minimum Game Requirements for GLA
●Most games are black boxes.
○No access to what is going on during gameplay
●We need access to game “guts”
○User interactions
○Changes of the game state or game variables
●Or the game must communicate with the outside world
○Using some logging framework
●What is the meaning of that data?
●Ethics: adequate experimental design and setting
○Are users informed?
○Anonymization of data could be required
○Fair data exploitation for all stakeholders?
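One way a game can "communicate with the outside world" is a small logging hook that serializes every state change and hands it to a pluggable sink (an HTTP sender, a local file, etc.). A hypothetical sketch (class and field names are assumptions, not a real framework):

```python
import json
import time

class GameLogger:
    """Hypothetical logging hook: the game reports interactions and
    state changes to the outside world through any callable sink."""
    def __init__(self, sink):
        self.sink = sink  # e.g. an HTTP sender or a local file writer

    def log(self, actor, event, detail):
        self.sink(json.dumps({
            "timestamp": time.time(),
            "actor": actor,    # anonymized ID, not a real name (ethics)
            "event": event,
            "detail": detail,
        }))

# Usage: collect traces in memory for inspection.
collected = []
logger = GameLogger(collected.append)
logger.log("student-42", "variable_changed", {"score": 3})
```

Note the actor field carries an anonymized identifier, matching the ethics requirements above.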
12. Experience API for Serious Games: xAPI-SG Profile
Experience API (xAPI) is an IEEE standard that enables the capture
of data about human performance and its context.
https://adlnet.gov/news/2021/10/20/IEEE-to-Standardize-xAPI-v2.0-as-an-International-Standard/
The e-UCM Research Group, in collaboration with ADL, created
the Experience API for Serious Games Profile (xAPI-SG), an
xAPI profile for the specific domain of serious games.
The xAPI-SG Profile defines a set of verbs, activity types and
extensions that allows tracking all in-game interactions as
xAPI traces (e.g. level started or completed)
https://xapi.e-ucm.es/vocab/seriousgames
Ángel Serrano-Laguna, Iván Martínez-Ortiz, Jason Haag, Damon Regan, Andy Johnson, Baltasar Fernández-Manjón (2017):
Applying standards to systematize learning analytics in serious games. Computer Standards & Interfaces 50 (2017) 116–
13. xAPI-SG Profile
The xAPI-SG Profile is the result of implementing an interaction
model for serious games in xAPI.
The types of interactions that can be performed in a serious game, and
are included in the profile, are grouped based on the type of
interaction and the game objects over which the interaction is performed.
These common interactions, and the game objects related to them, are
classified in general categories of xAPI-SG statements:
● completables
● accessibles
● alternatives
● GameObjects
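An xAPI-SG-style statement for "player completed a level" might look as follows. The IRIs and activity ID here are illustrative; the normative vocabulary is the xAPI-SG profile itself (https://xapi.e-ucm.es/vocab/seriousgames):

```python
# Sketch of an xAPI-SG-style statement (a "completable" interaction).
# Verb/activity-type IRIs are illustrative; check them against the profile.
statement = {
    "actor": {"objectType": "Agent", "name": "student-42"},
    "verb": {"id": "https://w3id.org/xapi/seriousgames/verbs/completed"},
    "object": {
        "id": "https://example.com/articoding/levels/level-1",  # game-defined ID (assumed)
        "definition": {
            "type": "https://w3id.org/xapi/seriousgames/activity-types/level"
        },
    },
    "result": {"success": True, "score": {"raw": 3}},  # e.g. stars obtained
}
```

The level is a "completable": the statement records whether it was finished successfully and with what score.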
15. Analysis and visualizations with xAPI-SG
The generic analyses for games and their visualisation are simplified:
● A: degree of a participant's success in applying knowledge
● G: scores achieved in each task
● H: average times per task
● D, E and I: degree of progress
Cristina Alonso-Fernández, Antonio Calvo-Morata, Manuel Freire, Iván Martínez-Ortiz, Baltasar Fernández-Manjón (2022): Game
Learning Analytics: Blending visual and data mining techniques to improve serious games and to better understand players
learning. Journal of Learning Analytics. Volume 9(3), 32–49. https://doi.org/10.18608/jla.2022.7633, https://learning-
analytics.info/index.php/JLA/article/view/7633/7669.
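For instance, an "average times per task" analysis can be computed by pairing level-start and level-completion traces. A sketch, with an assumed (timestamp, player, verb, level) trace format and made-up data:

```python
from collections import defaultdict

# Assumed trace format: (timestamp_in_seconds, player, verb, level).
traces = [
    (0, "p1", "initialized", "level1"), (40, "p1", "completed", "level1"),
    (50, "p1", "initialized", "level2"), (170, "p1", "completed", "level2"),
    (0, "p2", "initialized", "level1"), (80, "p2", "completed", "level1"),
]

def average_time_per_level(traces):
    """Pair each 'initialized' with the matching 'completed' trace
    and average the elapsed time per level across players."""
    starts, durations = {}, defaultdict(list)
    for ts, player, verb, level in traces:
        if verb == "initialized":
            starts[(player, level)] = ts
        elif verb == "completed" and (player, level) in starts:
            durations[level].append(ts - starts.pop((player, level)))
    return {lvl: sum(d) / len(d) for lvl, d in durations.items()}

print(average_time_per_level(traces))
```

Levels with unusually high averages are candidates for redesign (players getting stuck).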
16. Articoding
●Educational game to promote and improve programming learning in schools
●Block programming similar to Scratch
●The player has to pass levels by solving problems posed in a board-like scenario.
●The game is aimed at students between 12 and 16 years old.
●Evaluation supported by an experimental design
○ Quizzes + analytics
17.
18.
19.
20.
21.
22. Articoding game validation: Escolapias Puerta de Hierro School
Formative evaluation
Participants – June 2021
●134 students
●100 min of game (2 class slots)
Questionnaires
●Pre and post
●Videogame experience and use
●Previous programming knowledge
●Students' opinion
23. Interaction data collected in xAPI
Interactions:
●Level in/out
●Level completed
●Stars obtained
●Completed attempt
●Failed attempt
●Add/remove instruction
●…
24. Analytics in Articoding validation and evaluation
Main goals:
1. to verify that players learn, and to identify any difficulties they may have
2. to validate that the game design is adequate to meet its educational
objectives.
Research questions:
● RQ1. How many levels can players complete in two 50-minute sessions?
● RQ2. Do players learn and apply the expected programming concepts?
● RQ3. Do players have problems completing game levels? Are the levels well designed?
● RQ4. Can we measure engagement using the collected interaction data?
● RQ5. Do players like the game?
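RQ1 and RQ4 can be answered directly from the collected traces. A sketch with an assumed (player, verb) trace format and made-up data:

```python
from collections import Counter

# Assumed per-trace records: (player, verb) pairs from the play sessions.
traces = [
    ("p1", "completed"), ("p1", "interacted"), ("p1", "completed"),
    ("p2", "interacted"), ("p2", "interacted"),
]

def levels_completed(traces):
    """RQ1: number of completed levels per player."""
    return Counter(p for p, v in traces if v == "completed")

def interaction_counts(traces):
    """RQ4 (one possible proxy): total interactions per player
    as an engagement signal."""
    return Counter(p for p, _ in traces)

print(levels_completed(traces), interaction_counts(traces))
```

RQ2 and RQ3 need richer traces (e.g. which instructions were added per attempt) but follow the same aggregation pattern.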
25. Articoding: gender differences
Different number of interactions:
●boys interact more
●girls deliberate more before interacting
No significant difference in the number of levels completed
26. Articoding levels: well designed? progression?
Use of interaction data to study the appropriateness of levels and
progression in game difficulty
27. Is the game working?
● Positive results
● 65.8% of students would continue to play.
● Boys seem to have liked it more (81.36% vs. 46.16%).
● Greater indifference among girls (5.08% vs. 32.69%).
It is still an open question to evaluate learning with the game
compared with a standardized method.
28. Gaming4Coding: GLA challenges and open issues
● Mobile game
○ Android + Apple iOS
○ App stores
○ pre-post questionnaires face more limitations (limited screen and writing)
● BYOD - the game will be played on the students' mobiles and/or school
tablets
● How to convey to the user the quality of their solution
○ programming allows multiple possible solutions
● Multiplayer game - maintaining security
○ competition with other users to promote user interest
○ analytics are more complex
● Evaluation of the actual programming learning?
29. Other aspects to consider
Prepare for the unexpected in GLA experiments
● not-so-hi-tech environments (e.g. schools with poor or unreliable wifi)
● dual tracker (local storage of GLA data)
● BYOD - always complex to deploy on students' devices
Teachers should be provided with some extra support
● guide on how to use the game in schools - educational scenarios and uses
Why deploy your own GLA when there are commercial services (some "free")?
● Data ownership?
● Your GLA can be implemented with standard modules (e.g. an LRS)
Ethics should drive the whole GLA process
● Informing the user about the data collection
● All participants should benefit from the GLA
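The "dual tracker" idea, keep a local copy of every trace and only forward to the analytics server when the network cooperates, can be sketched as follows (class and method names are assumptions for illustration):

```python
class DualTracker:
    """Sketch of a dual tracker: every trace is stored locally,
    so unreliable school wifi never loses GLA data."""
    def __init__(self, send_remote):
        self.send_remote = send_remote  # callable that may raise on bad wifi
        self.pending = []   # traces not yet delivered to the server
        self.archive = []   # local copy of everything (never lost)

    def track(self, trace):
        self.archive.append(trace)
        self.pending.append(trace)
        self.flush()

    def flush(self):
        """Retry delivery of anything the server has not yet received."""
        still_pending = []
        for trace in self.pending:
            try:
                self.send_remote(trace)
            except ConnectionError:
                still_pending.append(trace)  # keep for the next flush
        self.pending = still_pending
```

When connectivity returns, calling flush() (or simply tracking the next trace) drains the backlog in order.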
30. Conclusions
● Game Learning Analytics has great potential for improving SGs
○ Evidence-based serious games
○ Games as assessments (better "stealth" student evaluation)
○ Games as powerful research environments
● GLA should be systematized by a learning analytics model that relates
educational design with game implementation and analytics (data
collected)
○ Analytics should be applied to the whole game lifecycle
● All the Gaming4Coding project results will be open source
● Experimental design: ethics should drive the GLA process and comply
with legal requirements (e.g. EU GDPR)