Hello everyone! In this talk I will present our joint work with collaborators at Aalto University in Finland.
To learn to program, we first need to understand how a program works. This comic is a good illustration of the importance of understanding program dynamics.
Understanding program dynamics, i.e., how program execution is carried out in computer memory, has been one of the central challenges of learning to program.
It is important to make the execution process visible to students, especially novices who are new to programming, to help them easily see and follow the important steps and to avoid misconceptions caused by the code "not behaving" as expected.
So, many methods and tools have been developed to support students in understanding program dynamics. One advanced method is "program visualization", which offers animated examples to promote a deeper understanding of code semantics.
One well-known tool for this is Jeliot 3 [Moreno et al. 2004]. As you can see from this example, it provides a visual representation of the program's execution.
Another well-known tool is Python Tutor [Guo 2013]. Using this tool, teachers and students can write Python programs in the web browser and step forward and backward through the execution to view the run-time state of data structures.
So, as we saw, these are amazing tools! But the question is: were they successful in practice?
Yes, in the sense that they worked excellently in labs.
And no, because they did not work in classrooms. According to a SIGCSE working group report [Naps et al. 2003], such tools are usually underused in classrooms by both teachers and students.
So, people have been looking for ways to solve this low-usage problem. One suggested solution is packaging interactive learning content into online practice systems that students can access for their own benefit, even though they receive no additional credit for doing so.
A good example of such a practice system is CodingBat.com, an online site of interactive coding problems. Students can write the code for each problem and receive immediate feedback.
This is really good, but it does not solve the problem completely. It does engage teachers: they no longer need to master these tools to integrate interactive content into their teaching. Instead, they can simply provide students with a link to the practice system.
But the problem of students not using the systems remains. Can we solve it simply by turning these examples from practice content into mandatory content? Apparently not, because that can lead to mindless clicking through the content to gain points without any understanding.
So, our goal was to solve this problem by motivating students to use these interactive examples and then examining their value for students' learning.
We introduced a novel context in which interactive learning resources such as animated examples can be used. This novel context is a special practice system that relies on three main features to engage students: 1) having all content in one place, organized by course topics; 2) open student modeling, which lets students monitor their own progress; and 3) social comparison, which lets students compare their progress with the progress of other students in the class. We have evidence from past work that these features can increase student engagement, which can in turn positively affect learning.
The practice system we used in our study is called Mastery Grids [Loboda et al. 2014]; it was developed in our group. This screenshot shows the interface of Mastery Grids.
Mastery Grids organizes the content into topics that are shown as the columns of the grid. The color of a cell in a topic shows the student's progress in that topic, and it gets darker as the student completes more content within that topic.
The first row, labeled "Me", shows the student's progress in each topic; the other two rows provide social comparison visualizations. The last row, labeled "Group", lets the student see the class's progress in each topic, and the middle row, labeled "Me vs Group", lets the student compare his or her progress with that of the group.
If the student clicks on a cell in the grid, he or she can see the resources related to that topic, and a click on a resource cell shows that resource in a new window that overlays Mastery Grids.
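As a rough illustration of the progress-to-color mapping just described, here is a minimal sketch; the cellColor helper and the concrete shades are my own illustration, not the actual Mastery Grids rendering code:

```java
import java.awt.Color;

// A minimal sketch (not Mastery Grids itself): map a topic's completion
// ratio in [0, 1] to a cell shade that darkens as progress grows.
public class CellShade {
    static Color cellColor(double progress) {
        // no progress -> near-white cell; full progress -> dark green cell
        int channel = (int) Math.round(230 - 180 * progress);
        return new Color(channel, 230, channel);
    }

    public static void main(String[] args) {
        System.out.println(cellColor(0.0)); // light cell: little work done yet
        System.out.println(cellColor(1.0)); // dark cell: topic completed
    }
}
```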
Mastery Grids integrates three kinds of content: problems, annotated examples, and animated examples, all hosted by external content servers.
Problems are parameterized code evaluation exercises [Hsiao et al. 2008], which the student can repeat several times with different parameters. Problems include automatic evaluation and feedback, telling the student whether the response was correct or wrong.
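To make this concrete, here is a minimal sketch of a parameterized question; the question template, the parameter range, and the console interaction are my own illustration, not the actual system's implementation:

```java
import java.util.Random;
import java.util.Scanner;

// A minimal sketch of a parameterized code-evaluation exercise: the template
// stays fixed, the parameter is re-randomized on every attempt, and the
// student's answer is checked automatically with immediate feedback.
public class ParameterizedProblem {
    public static void main(String[] args) {
        int n = new Random().nextInt(5) + 3;   // fresh parameter per attempt

        System.out.println("What does this code print?");
        System.out.println("  int s = 0;");
        System.out.println("  for (int i = 1; i <= " + n + "; i++) s += i;");
        System.out.println("  System.out.println(s);");

        int expected = n * (n + 1) / 2;        // evaluate the template ourselves
        int answer = new Scanner(System.in).nextInt();

        System.out.println(answer == expected
                ? "Correct!"
                : "Wrong, it prints " + expected);
    }
}
```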
Annotated examples use a simpler technology to provide interactive exploration of programming examples. They turn example code into a worked-out example by adding explanations to example lines to show how a specific goal can be achieved.
Students' work with the examples was logged, including which lines they clicked, how much time they spent on each line, and so on.
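For illustration, here is a minimal sketch of what such a worked-out example might look like; the code and the per-line explanations are my own, not taken from the course materials. In the real interface, each explanation is hidden until the student clicks on the corresponding line:

```java
// A sketch of an annotated example: a small worked-out program in which
// every line carries an instructor-written explanation.
public class FindMax {
    public static void main(String[] args) {
        int[] values = {3, 9, 4, 7};
        int max = values[0];                      // (1) assume the first element is the largest so far
        for (int i = 1; i < values.length; i++) { // (2) visit every remaining element exactly once
            if (values[i] > max) {                // (3) is the current element larger than the best so far?
                max = values[i];                  // (4) if so, remember it as the new maximum
            }
        }
        System.out.println(max);                  // (5) print the largest value: 9
    }
}
```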
To see whether animated examples bring more advantages than examples built with simpler technology, we compared them with annotated examples. In this paper, we examine the value of animated program examples as practice content, both for educational impact and for prospects of engagement; annotated examples, a more traditional way to support students in learning to understand programs, serve as the baseline for this comparison.
So, we wanted to examine the value of animated examples in Mastery Grids. We analyzed the data collected from student use of the system and looked at four measures: ….
Animated examples were more engaging! To measure engagement, we looked at the amount of work the students did with the practice content. We calculated average completion by dividing the number of clicks actually made by the total number of clicks needed to view the whole example; only started examples were taken into account. For annotated examples, a click means an action to view an explanation, and for animated examples, a click means a step to move forward.
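In formula form (the notation is mine, reconstructed from the description above):

```latex
\text{avg. completion} \;=\; \frac{1}{|S|} \sum_{e \in S} \frac{c_{\mathrm{made}}(e)}{c_{\mathrm{needed}}(e)},
\qquad S = \{\text{examples the student started}\}
```

where $c_{\mathrm{made}}(e)$ is the number of clicks the student actually made on example $e$ and $c_{\mathrm{needed}}(e)$ is the number of clicks needed to view all of it.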
As you can see from this chart, the percentage of completion was significantly higher for animated examples. This indicates that animated examples motivated students to follow more lines, 23.6% more than annotated examples (the difference was significant using a non-parametric test).
Students not only completed more of each animation, they also stayed longer in each animated example. As you can see from this chart, the time spent on each animated example is about twice the time spent on an annotated example. This difference was also significant (using a non-parametric test).
One thing I want to mention is the difference between the nature of animated and annotated examples. In animated examples, students need to click through every line to get to a line of interest, whereas in annotated examples they can freely select any line they want. The interesting observation was that, despite this, students were still willing to see all the lines and completed, on average, 95% of the animated examples they started.
We ran stepwise regression to see the impact of the examples on course grade, problem-solving performance, and learning gain; our factors were problem attempts, Mastery Grids actions, pretest score, gender, etc.
One thing I want to mention here is that there is a two-way relationship between work with examples and knowledge level. On one hand, examples do help students increase their knowledge (a positive connection between student work with examples and performance); on the other hand, due to free content choice, lower knowledge and failures can lead to increased use of examples (a negative connection between examples and performance). Despite this two-way effect, regression can show which of the two directions has the more dominant effect.
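As a rough sketch of the kind of model such a procedure evaluates, here is a toy ordinary-least-squares fit with made-up data; the predictor set, the values, and the use of Apache Commons Math (commons-math3 on the classpath) are mine, not the study's actual analysis code:

```java
import org.apache.commons.math3.stat.regression.OLSMultipleLinearRegression;

// A toy regression (not the study's analysis): regress post-test score on
// pretest score and the two kinds of example views, the same shape of model
// that a stepwise procedure would compare against alternatives.
public class RegressionSketch {
    public static void main(String[] args) {
        // one row per student: {pretest, annotated views, animated views}
        double[][] x = {
            {10, 5, 2}, {12, 3, 8}, {8, 7, 1},
            {15, 2, 9}, {9, 6, 4}, {11, 4, 6}
        };
        double[] y = {14, 18, 11, 21, 13, 17};  // post-test scores (made up)

        OLSMultipleLinearRegression ols = new OLSMultipleLinearRegression();
        ols.newSampleData(y, x);                 // an intercept is added automatically

        double[] b = ols.estimateRegressionParameters();
        System.out.printf("intercept=%.2f pretest=%.2f annotated=%.2f animated=%.2f%n",
                b[0], b[1], b[2], b[3]);
    }
}
```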
This table shows the reliable predictors (significant at the 0.05 level) related to work on examples in the best model found by stepwise regression. We found that attempting an animated example positively affects the number of correct problem attempts. Students had likely learned more from animated examples, as they answered more questions correctly; the positive impact of animated examples overcame the process associating example use with poor knowledge. In contrast, annotated examples appeared to be less useful for learning, allowing the reverse process to overcome the positive impact on knowledge.
The switch from annotated to animated examples turned the connection between the amount of work and the post-test score from negative to positive. Regression analysis showed that each view of an annotated example decreased the post-test score, while each explored animated example increased it. This shows that animated examples had a positive effect on students' learning, increasing their post-test scores.
Another case where the switch to animated examples turned the balance positive: the number of views of annotated examples had a negative effect on the course grade, while views of animated examples positively influenced it. This shows that animated examples also had a positive effect on students' course grades.
Compared to traditional annotated examples, animated examples:
• better engaged students, increasing their interest in completing examples
• provided a better impact on several performance measures, such as problem-solving success, post-test scores, and course grade
• turned the relationship between the amount of work with examples and performance from negative to positive
• letting users skip parts of an animation that they already know
• recommending example lines that help the user with what they do not know
Animated Examples as Practice Content
in a Java Programming Course
Roya Hosseini, Teemu Sirkiä, Julio Guerra
Peter Brusilovsky, Lauri Malmi
Technology Pays Back!
✓ More work → higher performance
Guiding the student within the animation:
• saves time
• assures viewing of must-see parts
Naps, T. L., et al. (2003). Exploring the role of visualization and engagement in computer science education. ACM SIGCSE Bulletin, 35(2), 131-152.
Loboda et al. (2014). Mastery Grids: An open source social educational progress visualization. In Open Learning and Teaching in Educational Communities (pp. 235- ).
Koli Calling International Conf. on Computing Education Research (pp. 189-190).
Brusilovsky et al. (2009). Problem solving examples as first class objects in educational digital libraries: Three obstacles to overcome. Journal of Educational Multimedia and Hypermedia, 18(3), 267-288.
Moreno et al. (2004). Visualizing programs with Jeliot 3. Proc. of the Working Conference on Advanced Visual Interfaces (pp. 373-376).
Guo, P. J. (2013). Online Python Tutor: Embeddable web-based program visualization for CS education. Proc. of the 44th ACM Technical Symposium on Computer Science Education (pp. 579-584).
Hsiao et al. (2008). Web-based parameterized questions for object-oriented programming. Proc. of the World Conference on E-Learning (pp. 17-21).