Katarina Thomson and Karl Molden - Turning Course Evaluation into Information
1. Turning Course Evaluation Into Information
Karl Molden, Senior Planning Analyst
Katarina Thomson, Senior Planning Analyst
2. | University of Greenwich Context
37,499 Students in 2014/15 of which:
• 19,671 university based
• 1,603 elsewhere in UK
• 16,225 wholly overseas
Of UK students
• 72% Full time
• 76% Undergraduate
• 13% Overseas plus 7% EU students
• 50% white students
• Split across 3 campuses (plus partner colleges)
• Main subject areas: business and administrative studies, subjects allied to
medicine (nursing, paramedics etc), education (teacher training etc), computer
science & maths, biological sciences & pharmacy, engineering, architecture and
building, social studies, etc.
3. | University of Greenwich Surveys
Numerous student surveys:
• National Student Survey (NSS)
• Postgraduate Taught Experience Survey (PTES) / Postgraduate Research
Experience Survey (PRES)
• International Student Barometer
• University Student Survey (USS) – including UK Engagement Survey (UKES)
• In 2013 and 2014: in-house engagement survey based on National Survey
of Student Engagement (NSSE) (with licence from Indiana University)
• Course evaluations
• Destinations of Leavers from Higher Education (DLHE)
• etc
• Issues of survey fatigue -> to maintain students' commitment to responding to
surveys, there is a need to make sure that survey results are fully utilised
• The university is developing its systems and policy with regards to analytics and
their use
4. | University Student Survey (USS)
• In 2014 and 2015 the University Student Survey was issued to:
• On-campus, first year undergraduates
• Non-final-year undergraduates at UK partner colleges
• Wholly overseas based students
• Includes:
• Key NSS questions
• HEA’s UK Engagement Survey (UKES) (on campus only)
• University of Greenwich-specific questions
5. | Module evaluations
• From 2014/15: coordinated system of module evaluations
• Every student on every module is asked to answer a standard set of questions
• Uses Evasys software
• Transparency: module-level reports are sent automatically on completion of
survey to:
• course leaders – who are required to comment on Moodle
• students who responded to the survey
• Results are also available for central analysis
• Generates a vast amount of data
7. | Motivation
• Surveys are used across the sector and typically well understood by
staff and students – potential to develop something clear and
understandable for both groups which could be adapted easily by
other institutions
• We know that other work carried out has found a small but significant
relationship between measures of engagement and academic
achievement* - this project would build on that to create “real-time”
analytics from module level surveys
• Systems for collection of data already exist and are well supported
• We are "drowning in a sea of data" – why add more if we can leverage
what's already there?
* Sheffield Hallam University, Using UKES results and institutional award marks to explore the relationship
between student engagement and academic achievement
9. | Results
• Sheffield Hallam identified 6 UKES questions which had a statistically
significant correlation with academic outcomes.
• Our analysis has identified 4 of the same 6 questions as having a
similar correlation.
• One further USS question was also found to have a statistically
significant correlation.
• Three of these five questions are thematically linked – around
working with/explaining material to other students.
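The correlation analysis described above can be sketched as follows. This is a minimal, illustrative example with hypothetical data – the actual question wording, response scales, and GPA values are not given in the slides, and the analysis here uses a plain Pearson correlation with a t-statistic rather than whatever method the project actually applied.

```python
# Minimal sketch (hypothetical data): testing whether responses to a
# Likert-scale survey item (1-5) correlate with academic outcome (GPA).
import math
import statistics

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical records: (Likert response 1-5, GPA on a 4-point scale)
records = [(1, 2.1), (2, 2.4), (2, 2.9), (3, 3.0), (4, 3.2),
           (4, 3.5), (5, 3.4), (5, 3.8), (3, 2.8), (5, 3.9)]
responses = [r for r, _ in records]
gpas = [g for _, g in records]

r = pearson_r(responses, gpas)
n = len(records)
# t-statistic for H0 "no correlation"; would be compared against the
# t-distribution with n-2 degrees of freedom to assess significance.
t = r * math.sqrt((n - 2) / (1 - r ** 2))
print(f"r = {r:.3f}, t = {t:.2f}, n = {n}")
```

In practice a rank-based measure (e.g. Spearman) is often preferred for ordinal Likert responses, and the real analysis would run over the full student-level dataset per question.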
10. | Results
[Charts: distribution of responses to Q5a and Q6b by GPA]
12. | Results
• Further to the results for the UKES questions, two questions from
the course evaluation surveys were found to correlate with
academic outcomes.
• As these surveys are conducted every term this could be useful in
developing “real-time” interventions.
• Further work is needed to look at the potential of adding further data
to the mix to develop more sophisticated statistical models.
13. | Conclusion
The evidence seems to support a small, positive correlation between
certain survey question responses and academic outcomes.
By identifying activities which may lead to improved responses to
certain questions, it may therefore be possible to improve these
outcomes.
Similar relationships have been identified in "in-session" course-level
surveys, which means we may be able to develop other processes to
improve students' academic achievement.
| Questions?