A talk about two things in tandem: good practice for using the Moodle quiz, and how the quiz is actually used at the Open University. Hopefully the two have plenty in common.
2017 UK/IE MoodleMoot: What makes a good moodle quiz? Lessons from the Open University
1. What makes a good Moodle quiz?
Lessons from the Open University
Tim Hunt & Chris Nelson
2. Overview
● Who we are
● The Open University
● What goes in a quiz?
● How quizzes fit into a module
● Producing and running quizzes
3. Who we are
Danger: highly oversimplified module production, maintenance and support diagram!
● OU faculties: Access, FASS, FBL, IET, WELS, STEM
● OU production, support and admin: Academic Services, IT, Library, LTI
● Tim Hunt – Learning Systems, IT
● Chris Nelson – Learning Systems, LTI
5. The Open University
● Main campus (including the Moodle servers) in Milton Keynes
● Students & tutors all around the UK and beyond
● 131 degrees built from 450 modules, plus many other qualifications
● OpenLearn
6. A typical module
● Runs for 6 or 9 months
● 300 or 600 hours of study (so 8 or 16 hours per week)
● Students study from home (or work, bus, train, submarine, prison…)
● Students in groups of ~16 with an assigned Associate lecturer (tutor)
● Teaching material structured around the module website, using a weekly study planner, with face-to-face or online tutorials
● Modules are produced, then presented with minor amends for 5–15 years
● Produced by academics in the faculty, working with learning media design support staff in LTI
(A 'module' is what Moodle would call a course.)
7. Moodle at the OU
● Have used Moodle since 2006 (starting with version 1.6)
● Lots of contributions to Moodle core
● Lots of plugins shared with the community
● Including question types
● ~50 000 logins to our main site on a busy day
● ~450 module websites per year
● ~700 000 quiz attempts per year
● ~6 million questions attempted per year
8. Quiz attempts per year
Online tests: summative and formative iCMAs (assessed), and diagnostic and practice quizzes (unassessed).
[Chart: Online tests served by the OU across all platforms, by academic year (September–August), 06/07 to 15/16, rising to around 700,000 per year.]
9. Student opinion of quizzes
Statement                                                    Definitely + mostly agree
Answering iCMA questions helps me to learn                   129 (87%)
If I get the answer to an iCMA question wrong, the
  computer-generated feedback is helpful                     128 (85%)
The mark I am awarded for each iCMA helps me to
  gauge how well I am doing                                  107 (72%)
Answering iCMA questions is fun                              95 (64%)
I think the computer sometimes marks my iCMA
  answers wrong                                              34 (22%)
  (82 (55%) mostly + definitely disagreed)

Questionnaire results from ~150 first- and second-presentation students of S104 Exploring Science. (Based on Jordan, 2011, p.154)
11. Key considerations
● Online tests: assessed iCMAs versus unassessed quizzes
● How the online tests fit into the assessment strategy
● What is the test trying to achieve?
● Question, and question type

Test purposes: Pre-assessment (diagnostic & practice) → Monitoring (formative) → Evaluation (summative)
If compulsory:
● Assessed deferred, with results on submission
● Assessed interactive, repeatable
● Assessed deferred
● Assessed interactive
12. What question types are used?
Answered question types on modules for the 2016 presentation year ('Description' type removed).

Key consideration: how will students respond? Constructed versus selected response types.

Constructed response: STACK 14.9%, Numerical 10.2%, ShortAnswer 3.4%, VarNumericSet 0.7%, Essay 0.3%, VarNumeric 0.2%, PMatch 0.2%, VarNumUnit 0.0%, PMatchJME 0.0%
Selected response: MultiChoice 38.3%, OUMultiresponse 8.6%, DDWtoS 7.0%, Matching 5.9%, TrueFalse 2.8%, GapSelect 2.8%, DDImageOrText 0.6%, DDMarker 0.1%
Other: Combined 3.3%, Opaque 0.8%
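To make the constructed/selected split concrete, here is a small sketch (plain Python; the figures are transcribed from the chart above, and the grouping follows the slide's constructed-versus-selected distinction) that totals each group's share:

```python
# Question-type usage shares from the 2016 presentation-year chart.
constructed = {
    "STACK": 14.9, "Numerical": 10.2, "ShortAnswer": 3.4,
    "VarNumericSet": 0.7, "Essay": 0.3, "VarNumeric": 0.2,
    "PMatch": 0.2, "VarNumUnit": 0.0, "PMatchJME": 0.0,
}
selected = {
    "MultiChoice": 38.3, "OUMultiresponse": 8.6, "DDWtoS": 7.0,
    "Matching": 5.9, "TrueFalse": 2.8, "GapSelect": 2.8,
    "DDImageOrText": 0.6, "DDMarker": 0.1,
}
other = {"Combined": 3.3, "Opaque": 0.8}

# Roughly two thirds of answered questions are selected-response,
# and nearly a third are constructed-response.
print(f"Constructed response: {sum(constructed.values()):.1f}%")  # 29.9%
print(f"Selected response:    {sum(selected.values()):.1f}%")     # 66.1%
print(f"Other:                {sum(other.values()):.1f}%")        # 4.1%
```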
13. Key considerations
Question bank, variants, or 'set' questions?
● Some quizzes randomly draw from a large pool of questions, presenting different questions to different students at different times
● Some use random variants of (roughly) the same question
● Others simply set the questions
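The 'random variants' idea can be sketched as follows. This is a toy illustration in Python of the concept, not Moodle's actual variant mechanism; the function name and the multiplication question are invented for the example:

```python
import random

def make_variant(seed):
    """Generate one variant of (roughly) the same numerical question.

    Each student or attempt gets different numbers, but the skill
    being tested (here, multiplying two integers) stays the same.
    """
    rng = random.Random(seed)  # seeding makes the variant reproducible
    a, b = rng.randint(2, 12), rng.randint(2, 12)
    question = f"What is {a} x {b}?"
    answer = a * b
    return question, answer

# The same seed always yields the same variant, so a student who
# reviews their attempt sees the question they originally answered.
q1, _ = make_variant(seed=42)
q2, _ = make_variant(seed=42)
assert q1 == q2
```

The key design point is determinism: a variant must be reproducible from its seed so it can be re-rendered and re-marked consistently.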
16. Best practice
● Clearly communicate to students in advance:
● Availability (open/close times and dates)
● What the online test is testing (its learning objectives) as well as what area it might cover (the first two topics vs the entire module)
● How the online test will behave (e.g. everyone gets the same questions vs different variants possible; instant or deferred responses)
● What will happen after completion
● What to do if there's a problem (e.g. additional requirements)
● Use, review, improve for next time! (Especially if assessed)
18. How many quizzes in a module?
Number of quizzes   Number of modules
1                   31
2                   20
3                   16
4                   13
5                   8
6                   12
7                   5
8                   11
9                   8
10                  7
11                  5
12                  2
13                  3
14                  2
15                  3
17                  5
18                  1
19                  1
20                  4
more                15
Total               172

All modules presented in 2016.
Let's look at why module teams have used different numbers of quizzes.
For you to guess: how far does 'more' go? How many quizzes in the OU module with the most?
[Bar chart: number of modules against number of quizzes, cut off at 20.]
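As a quick sanity check, the table's counts do sum to the stated total, and most modules use more than a handful of quizzes:

```python
# Modules presented in 2016, keyed by number of quizzes ("more" = over 20).
quiz_counts = {
    1: 31, 2: 20, 3: 16, 4: 13, 5: 8, 6: 12, 7: 5, 8: 11, 9: 8,
    10: 7, 11: 5, 12: 2, 13: 3, 14: 2, 15: 3, 17: 5, 18: 1, 19: 1,
    20: 4, "more": 15,
}
total = sum(quiz_counts.values())
print(total)  # 172, matching the slide's total

# Well over half the modules use 4 or more quizzes.
at_least_4 = total - quiz_counts[1] - quiz_counts[2] - quiz_counts[3]
print(at_least_4)  # 105
```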
19. Why have 1 quiz?
● A particular learning activity
● Revise a prerequisite skill
● Practise a newly acquired skill
● Trigger reflection
● Assess one specific learning outcome
● e.g. interpreting tables and graphs in a social
science module
20. Why have 17 quizzes?
● Module has 12 study units – one practice quiz each
● 4 assessed quizzes:
● Unit 1
● Units 2–4
● Units 5–8
● Units 9–12
● 1 revision quiz
MST124 Essential mathematics 1
22. When do students do the quizzes?
Data from MST124 2014J (Lowe, 2015)
(Note: no option to submit late)
23. Diagnostic quizzes
Using informal diagnostic quizzes to support students
● The Open University is "open to people, places, methods and ideas"
● That openness is difficult for modules like MST124 Essential Mathematics 1: some students bite off more than they can chew
● Particularly diverse student population: the module is used in 40 different degrees
● Calvert et al. (2016) found: "an early 'Are you ready for MST124' diagnostic quiz is key to ensuring students are prepared for the module. Based on results, guidance officers may advise students to start with MU123 Discovering Mathematics instead, or the data used to tailor tutorials."
● Participation in the diagnostic indicates a level of pre-knowledge, confidence, and willingness to engage, increasing the likelihood of successfully passing the module
● Pass rates: students who did the quiz: 60%; students who didn't: 40%
24. Why have 114 quizzes?
● 1 quiz in the first week – listening practice
● 3 quizzes per week for the remaining 35 weeks:
● Listening
● Language structure
● Reading practice
● 4 extra practice quizzes
● 4 assessed quizzes
(1 + 3×35 + 4 + 4 = 114)
L197 第一步 Beginners' Chinese
25. Are deadlines a good idea?
From SM358 The quantum world
[Chart: quiz submissions over time, with the 2010 advisory deadline marked.]
26. Are deadlines a good idea?
From SM358 The quantum world
[Chart: quiz submissions over time, comparing the 2010 advisory deadline with the 2012 hard deadline.]
27. Which subjects use… quizzes?
Warning: the total number of modules differs between subjects and years.
[Bar chart: number of modules using online tests by subject (A, D, B, W, E, K, L, M, S, T, U, H, Y) and faculty (FASS, FBL, WELS, STEM, IET, Access), for each year 2011–2016.]
30. Creation of an iCMA
● A workflow system is used:
● Academic module author(s) devise and create the questions for each online test
● Other academic team members test the questions
● Editors may be requested to proofread questions and responses
● Module authors approve questions (or sets) for use
31. Administration of an iCMA
While the module is live and the iCMA is open:
● Three days prior to the cut-off date, the system automatically sends a score-approval e-mail reminder to the Curriculum Manager (who manages the module in the school)
● Once >100 submissions have been made, the author or Curriculum Manager approves the iCMA scoring through the Results Statistics page. Adjustments can be made if required (e.g. a badly-worded question can be ignored and scores recalculated)
----------------- iCMA closes -----------------
● A week after the iCMA closes, scores are transferred to student records, and scores and feedback are made available to students
● Continuous improvement of questions, hints, and feedback!
33. References
Butcher, P., Sangwin, C. and Hunt, T. (2013). “Embedding and enhancing
eAssessment in the leading open source VLE”. In: Proceedings of the HEA STEM
Conference, Higher Education Academy, 2013.
Calvert, C., Hilliam, R., and Coleman, J. (2016). “Improving retention for all students,
studying mathematics as part of their chosen qualification, by using a voluntary
diagnostic quiz”, MSOR Connections. Vol.43, No.3, pp.32–38.
Jordan, S. (2011). “Using interactive computer‐based assessment to support
beginning distance learners of science”, Open Learning: The Journal of Open,
Distance and E-Learning. Vol.26, pp.147–164.
Lowe, T. W. (2015). “Online Quizzes for Distance Learning of Mathematics”, Teaching
Mathematics and Its Applications: An International Journal of the IMA. Vol.34,
pp.138–148.
… and other works by these authors
35. Summary: effective quiz use
It depends on what you want to achieve educationally.
When appropriate, quizzes probably help students – so don't be afraid to just try it.
Then evaluate and adjust.
'Little and often' seems like good advice.
The learning is more important than the marks.
Tim Hunt
t.j.hunt@open.ac.uk
Senior Developer, Learning Systems,
Information Technology
Chris Nelson
chris.nelson@open.ac.uk
eAssessment Product Development Manager,
Learning and Teaching Innovation