This document discusses university league tables and how student outcomes factor into a university's reputation and rankings. It notes several problems with league tables, such as time lags in data and inconsistencies in methodology. The document then examines how the university currently performs in various league tables and the factors that rankings are based on. It proposes actions the university can take to improve its performance and position over time, such as increasing the number of good degrees awarded, improving student satisfaction scores, and encouraging more students to complete post-graduation outcome surveys. The goal is to genuinely enhance student outcomes rather than just manipulate league table results.
2. Contents
• What are league tables for?
• Student input to league tables
• How league tables are constructed
• What do they say about us?
• How should we respond?
3. Where do you think we should be in a university league table?
4. Why are they important?
• Used in recruitment decisions – maybe.
• Reputation – definitely
• How our students see us
• Our own self-worth
5. What are the problems with league tables?
• League tables use the most recent data sets available at the time of publication.
• There is a time lag when using published data sets and time-series data.
• Different statistical analyses, methodologies and weightings are used in the different tables.
• There can be changes to criteria and methodology.
• Data sets can change.
• It is difficult to make comparisons between subjects and between competitors at this lower level.
6. More problems…
• Categorisation of courses and subjects differs between tables, and they concentrate only on FT UG provision – this University is about so much more than that.
• The editorial that appears alongside each University entry is not always available to the University to update or influence prior to publication.
• The information used is often not the same as we might use internally.
• Newspapers love to trumpet obvious but wildly misleading headlines.
7. However – we are stuck with them…
…so let's make sure we understand what they say about us, and try to make them work for us.
8. Overriding importance – Improving Student Outcomes
• We need to allow students to reach their full potential
– Celebrating individual success
– Maximising individual rewards
– Maximising contribution to society
– How do we do this in areas of low aspiration?
• Students are increasingly consumers
– Access to price comparisons – is there price sensitivity?
– Access to performance comparisons
9. Impact of student outcomes on the University
• KIS
• Unistats
• National Student Survey
• League Tables
• Which? University Guide
• College and University websites
• The Complete University Guide
• ……etc etc
10. Student Outcomes in Public Information
• Inputs
– Spend per student
– Staff:student ratios
– Entry standards
– Research ratings
– Cost of living
– Spend on services
– Faculty spend
• Outputs
– Number of "good" degrees
– National Student Survey results
– Retention rates
– Employability
11. Student Input to Public Information
• National Student Survey
– Measure of final-year students
– Questions on satisfaction with course, feedback and teaching
• Used in:
– League tables/KIS (external)
– Portfolio performance review (internal)
– Award annual monitoring (internal)
12. Ones to watch
• Complete University Guide (May)
• Guardian University Guide (June) *
• Times/Sunday Times Good University Guide (August)
• * In the University strategic plan: aim to be in the top 50
13. What do they say about us now?

Year   Complete University Guide   Guardian   Times   Sunday Times
2011   80                          69         77      96
2012   99                          77         89      105
2013   108                         96         100     107
2014   113                         92         ???     ???
14. SU positions in Complete University Guide for each factor
Entry Standards           111/124
Student Satisfaction       54/124  <- good!!
Research Assessment       111/124
Graduate Prospects        116/124
Staff:student ratio        87/124
Academic Services Spend    94/124
Facilities Spend           62/124
Good Honours              103/124
Degree Completion         108/124
Overall                   113th
18. To be successful in the CUG league table:
• Increase the number of good degrees awarded – how?
• Recruit better-qualified students who are more likely to get good degrees – WP? Value added?
• With better degrees, more graduates will get graduate-entry jobs
• Increase research assessment scores – limit the number submitted to the REF
19. Guardian league table factors

Factor                            Source                                                             Weighting
% Satisfied with Teaching         NSS 2012                                                           10%
% Satisfied overall with course   NSS 2012                                                           5%
Expenditure per student (FTE)     HESA data for 2010–11 and 2011–12                                  15%
Student:staff ratio               HESA data for 2010–11 and 2011–12                                  15%
Career prospects                  HESA/DLHE data for 2010–11                                         15%
Value added score (/10)           HESA data (the cohort who graduated in 2012)                       15%
Average entry tariff              Typical UCAS scores of students aged 20 or under on entry (HESA)   15%
% Satisfied with Assessment       NSS 2012                                                           10%
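The Guardian weightings above sum to 100%, so the overall score can be read as a weighted sum of factor scores. The sketch below is a minimal illustration of that arithmetic only, assuming each factor has already been normalised to a 0–100 scale; the Guardian's published methodology involves standardisation steps not reproduced here, and the factor names and example values are invented for illustration.

```python
# Minimal sketch of a weighted league-table score, assuming each factor
# has already been normalised to a common 0-100 scale. The real Guardian
# methodology standardises raw data first; this only shows the weighting step.

WEIGHTS = {
    "teaching_satisfaction":   0.10,  # NSS: % satisfied with teaching
    "course_satisfaction":     0.05,  # NSS: % satisfied overall with course
    "spend_per_student":       0.15,  # HESA expenditure per student (FTE)
    "student_staff_ratio":     0.15,  # HESA (lower ratios score higher after normalisation)
    "career_prospects":        0.15,  # HESA/DLHE
    "value_added":             0.15,  # HESA entry-vs-exit comparison
    "entry_tariff":            0.15,  # typical UCAS scores on entry
    "assessment_satisfaction": 0.10,  # NSS: assessment and feedback
}

def overall_score(factors: dict) -> float:
    """Weighted sum of normalised factor scores (each 0-100)."""
    assert abs(sum(WEIGHTS.values()) - 1.0) < 1e-9  # weightings must total 100%
    return sum(WEIGHTS[name] * factors[name] for name in WEIGHTS)

# Illustrative (invented) scores: a uniform 60 on every factor gives 60 overall.
example = {name: 60.0 for name in WEIGHTS}
print(round(overall_score(example), 1))  # -> 60.0
```

One practical point the arithmetic makes visible: a one-point gain on a 15%-weighted factor (e.g. career prospects) moves the overall score three times as much as the same gain on the 5%-weighted overall-course-satisfaction factor.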
28. Are we trying to improve position or performance?
• Clearly, we can try to play the game of moving our league table position
• What we really want to do is improve our performance in each of the key areas, to make sure there is sustainable and genuine change
29. What steps can we take? (Guardian criteria and suggested actions)
• Entry standards
– Review all current standard offers: are we pitching ourselves properly against competitors? Average A-level scores have gone up – have our offers?
• Student:staff ratio
– Review more thoroughly the data we submit to HESA
– Develop better models of SSR to identify where investment is most needed
• Spend per student
– Review more thoroughly the data we submit to HESA
– Identify capital spend needed
– Increased recent spend on libraries and IT will have an impact
30. What steps can we take? (continued)
• Value added
– Increase the number of "good" degrees awarded
– Review all level modules with low pass rates and average marks
– Review degree classification rules as part of the change to the % calculation
– Identify, through portfolio review, awards with consistently poor progression and attainment
• NSS teaching, assessment and feedback, and overall satisfaction
– Faculty action plans, and award-level plans
– Increased student engagement with the survey
– Seven principles of feedback
– Online assessment and feedback project
• Employment
– Encourage more students to complete the DLHE
– Staffordshire Graduate: improving our students' chances of success
31. Keeping track of how we are doing
• Portfolio Performance Measurement
– Provides an internal review mechanism for award performance
– Records market attractiveness
– Retention, progression and “good” degrees
– National Student Survey, DLHE
• Future performance measures
– Value added (difficult to get the raw data)
– SSR at School and subject level
– REF
32. Conclusions
• League tables are a necessary evil, and a part of the HE landscape
• One of our KPIs is 'to be amongst the top 50 institutions in The Guardian league table'
• We all have a part to play in explaining to students and parents what they really measure
• Central work is ongoing on data returns and strategy
• Schools working in partnership with our students to support their experience and attainment
League tables are definitely used in recruitment of overseas students, and overseas partners and potential partners have an interest in how we perform or are viewed.
Final point – subject-level tables are not consistent, as the level of granularity of data used does not reflect what actually happens.
Retention is not a measure in The Guardian league table; retention measures using a HESA PI, which does not directly correlate with our own internal progression data, are used in other league tables.
Under future performance measures you could perhaps mention the REF. The way we played it last time has had a negative impact on our league table position in all the other league tables (not the Guardian) every year since 2008.