2021-06-30 "Collaborative Design with Classroom Teachers for Impactful Game-Based Learning Analytics and Dashboards"
1. Collaborative Design with
Teachers for Impactful Game-Based
Learning Analytics and Dashboards
YJ Kim
Assistant Professor
Design, Informal, and Creative Education
University of Wisconsin-Madison
Email: yj.kim@wisc.edu
Twitter: @yjkimchee
2. Games are good for you!
★ Learning (Clark et al. 2016)
★ Prosocial behaviors (Greitemeyer & Osswald, 2010)
★ Alternative assessment (Shute & Ke, 2012)
3. Game-based assessment
★ Using games as a context, with data generated from gameplay, to measure and support
students’ learning (Kim, 2018)
★ Should meet psychometric qualities such as validity, reliability, and fairness
(Dicerbo, Shute, & Kim, 2017)
★ Can measure beyond knowledge: engagement, persistence, affect, strategies
★ Interdisciplinary field that combines learning sciences, game design,
psychometrics, and learning analytics/data science
Most importantly, assessment needs to be playful!
4. Playful (Ludic) Assessment
(Kim & Miklasz, 2021)
● Freedom to Experiment
○ Can you decide how you will be assessed?
○ Is there more than one way to answer the assessment correctly?
○ Is there more than one correct answer to the assessment?
○ Can you easily go down worthless or suboptimal answer paths?
● Freedom to Fail
○ Most basically, can you fail?
○ Is it psychologically safe to fail?
○ Can you set the difficulty of your assessment?
● Freedom to Try on Identities
○ Can you take the assessment from different perspectives?
○ Can you decide what you are assessed on, and therefore what you can be declared competent at?
○ Can you choose which assessment results do and do not get publicly acknowledged and shared?
● Freedom of Effort
○ Can you decide not to be assessed?
○ Can you decide when to be assessed?
○ Can you decide how much you will engage or hold back from the assessment?
5. Obstacles
(Kim & Ifenthaler, 2019)
★ Over the past 10 years, the field has shown how games can be
used as assessment.
★ Yet the impact in classrooms (at least in the US) has been limited.
★ To make use of data from different types of GBA:
○ The large amount of data is too complex to
comprehend
○ Teachers must develop a new assessment literacy
○ Teachers need hands-on tools and platforms to
be able to use this type of assessment data
7. An educational game developed with the
goal of assessing Common Core geometry
standards, spatial reasoning, and
persistence
https://shadowspect.org/
8. Research Questions
RQ1: What new assessment literacies do teachers
need around learning analytics in educational
technology?
RQ2: How can interactive teacher dashboards for
digital games be designed to increase teachers’
understanding of student learning and connect those
insights to their personal knowledge of their students?
9. Method
- Design-based research
- A year-long process of 10 co-design workshops (All
sessions were virtual) with 8 math teachers
- Each session lasted approximately 2 hours
- Sources of data: session and breakout-room videos,
think-alouds, automated transcripts/captions,
researchers’ session reflections, artifacts generated
by teachers (e.g., Padlets, Wakelets), and surveys
12. Why Persistence?
(Kim, Lin, & Ruipérez-Valiente, in press)
★ It is an “invisible” (Chris) skill that students can use “throughout their life” (Bonnie) and
“beyond the math classroom” (Stacy).
★ As one teacher puts it, “persistence in the face of challenge is what leads us to success”
(Clarissa).
★ The co-design teachers believe all educators must foster persistence in all students because
it is an important lifelong skill transferable beyond the game.
★ At the same time, some teachers are apprehensive about how they can help students with
“low persistence,” acknowledging that other life circumstances may prevent those
students from playing the game consistently.
13. @playfulMIT
Example #1
Flavors of Persistence
● Productive: methodical (uses certain strategies) vs. not methodical. Is
randomly trying unproductive?
● Perhaps need to look at playstyle as well to judge whether the
persistence is productive or not
● Frustrated vs. excited/satisfied persistence.
○ Excited: stick with it longer
○ Can also consider whether persistence is intrinsically motivated
or happens because it’s an assignment (students are told they
have to persist) or an expectation
○ Possibly tell by length of persistence without success
● Remain engaged over time even if they’re not successful (might be
difficult to distinguish for the sit-and-think type)
● Time persistence vs. Click persistence
How will they be presented on the dashboard?
(e.g., a score? Multiple scores? A percentage?
No scores?)
● Separate scores are good
● Would like the option of a composite score with the
ability to break it down into specifics
● Would want students to also be able to see info on
dashboard
● For example: if they’re not clicking on things, you’ll
notice they’re not using the appropriate tools, so
you can talk to them about it
● Discussion: what’s more important, overall puzzle,
or individual?
○ Both?
○ Lean more toward cumulative behavior
across all puzzles
○ If can look at single puzzle across students,
can see if any puzzle can draw out
persistence specifically
● A global score breaking out into local (per-puzzle) ones
● “Hey, you gained 2 persistence points!” when they
solve a puzzle where they struggled more
● Adjustable algorithm: teachers get to select criteria
depending on their own goals or SLOs for the day
Jose’s question about whether a student is faster/slower than average
● Base it on the student’s own prior performance (comparing
student 1 to student 1; individualized)
● Adjustable metric
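The individualized comparison Jose raised (each student measured against their own prior performance rather than the class) could be sketched as below; the function name, the use of solve times, and the simple mean baseline are all assumptions for illustration, not the project’s actual metric:

```python
def relative_speed(prior_times: list[float], new_time: float) -> str:
    """Compare a new solve time to the same student's own prior average.

    prior_times: this student's earlier solve times (seconds).
    new_time: the solve time for the current puzzle attempt.
    Returns a coarse individualized label rather than a class-wide rank.
    """
    if not prior_times:
        return "no baseline"  # first attempt: nothing to compare against
    baseline = sum(prior_times) / len(prior_times)
    return "faster" if new_time < baseline else "slower"
```

An adjustable metric could then let teachers swap the baseline (mean, median, recent attempts only) depending on their goals for the day.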
How will you act on the information?
● Example: if not clicking on stuff - can direct students to the tools
● Identifying these (like giving up too soon) can help figure out the hints to
give students
● Can encourage students to try more difficult puzzles if they don’t have many
“persistence points,” because it shows they’re only completing the easier ones
○ But should also take care that students don’t get discouraged from the
puzzles they succeed at
15. [Diagram] For each puzzle, the pool of every single correct attempt
serves as the reference distribution for three metrics: Number of
Submissions, Number of Events, and Active Time (in seconds). Each metric
is converted to a percentile (%Tile), and the three percentiles are
combined into a Persistence Composite Score.
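The slide-15 pipeline could be sketched as follows; the `percentile_of` helper and the equal-weight averaging of the three percentiles are assumptions, since the slides do not state how the percentiles are combined:

```python
import numpy as np

def percentile_of(value: float, pool: list[float]) -> float:
    """Percentile rank (0-100) of `value` within the pool of correct attempts."""
    arr = np.asarray(pool)
    return 100.0 * float(np.mean(arr <= value))

def persistence_composite(submissions: int, events: int,
                          active_time: float, pools: dict) -> float:
    """Average the three metric percentiles into one composite score.

    `pools` maps metric name -> that metric over every correct attempt
    on the puzzle (the "pool" from the slide). Equal weighting is an
    assumption, not the authors' documented formula.
    """
    pcts = [
        percentile_of(submissions, pools["submissions"]),
        percentile_of(events, pools["events"]),
        percentile_of(active_time, pools["active_time"]),
    ]
    return sum(pcts) / len(pcts)
```

With all three pools equal to [1, 2, 3, 4, 5] and a student at 3 on every metric, each percentile is 60.0 and the composite is 60.0.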
16. Persistence Composite Score
We categorize the activity of a student
with the following persistence labels
- NON_PERSISTANT: IF perc_composite < 5
AND completed == 0
- RAPID_SOLVER: IF perc_composite < 25
AND completed == 1
- PRODUCTIVE_PERSISTANCE: IF
perc_composite > 75 AND completed == 1
- UNPRODUCTIVE_PERSISTANCE: IF
perc_composite > 90 AND completed == 0
- NO_BEHAVIOR: Rest of cases
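The slide-16 rules can be written as a single labeling function; the function name and evaluation order are assumptions, but the thresholds, completion flags, and label strings (including their original spellings) are taken directly from the slide:

```python
def persistence_label(perc_composite: float, completed: int) -> str:
    """Map a percentile composite score and a completion flag (0/1)
    to one of the slide-16 persistence labels."""
    if perc_composite < 5 and completed == 0:
        return "NON_PERSISTANT"
    if perc_composite < 25 and completed == 1:
        return "RAPID_SOLVER"
    if perc_composite > 75 and completed == 1:
        return "PRODUCTIVE_PERSISTANCE"
    if perc_composite > 90 and completed == 0:
        return "UNPRODUCTIVE_PERSISTANCE"
    return "NO_BEHAVIOR"  # rest of cases
```

Note that a completed attempt between the 25th and 75th percentiles, or an incomplete one between the 5th and 90th, falls into NO_BEHAVIOR under these rules.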
20. Design Principles
1. The visualization should be easy to navigate and inviting for teachers to dig deeper and
play with.
2. The visualization should foster teacher curiosity to explore the data.
3. The data that teachers see on visualizations should match with their desires and
intentions for using games in classrooms.
4. The visualization should allow the teachers to see multiple aspects of a learner, some
of which might be surprising and unexpected.
5. The visualization should allow the teachers to see learners’ growth over time.
6. The visualization should allow the teachers to identify and celebrate “productive
struggle.”
7. The data visualization should allow the teacher to question how the model was
created.
21. What makes co-design successful?
★ Connect with co-designers: trusting relationships
★ Feeling respected, heard, accountable matters
★ Tight co-design-development loop can keep the co-designers motivated
& engaged
★ Activities should help them to develop a professional learning
community
22. Future Work
★ Further disentangle our framework
○ For example, persistence vs. math score
★ Imagined use vs. real implementations
★ Developmental trajectories of teacher
assessment literacy & how practices change
over time
24. References
Clark, D. B., Tanner-Smith, E. E., & Killingsworth, S. S. (2016). Digital games, design, and learning: A systematic review and meta-analysis. Review of Educational Research, 86(1), 79-122.
DiCerbo, K., Shute, V., & Kim, Y. J. (2017). The future of assessment in technology rich environments: Psychometric considerations. Learning, Design, and Technology: An International Compendium of Theory, Research, Practice, and Policy, 1-21.
Greitemeyer, T., & Osswald, S. (2010). Effects of prosocial video games on prosocial behavior. Journal of Personality and Social Psychology, 98(2), 211.
Kim, Y. J., & Ifenthaler, D. (2019). Game-based assessment: The past ten years and moving forward. In Game-Based Assessment Revisited (pp. 3-11). Springer, Cham.
Kim, Y. J., & Miklasz, K. (2021). From games to make assessment playful. In Teaching in the Game-Based Classroom: Practical Strategies for Grades 6-12.
Sanders, E. B. N., & Stappers, P. J. (2014). Probes, toolkits and prototypes: Three approaches to making in codesigning. CoDesign, 10(1), 5-14.
Shute, V. J. (2011). Stealth assessment in computer-based games to support learning. Computer Games and Instruction, 55(2), 503-524.
Shute, V. J., & Ke, F. (2012). Games, learning, and assessment. In Assessment in Game-Based Learning (pp. 43-58). Springer, New York, NY.