Learning from assessment:
insights about student learning
from programme level evidence
Dr Tansy Jessop, TESTA Project Leader
Launch of the Teaching Centre
School of Politics and International Relations
University of Nottingham
15 May 2014
TESTA premises
1) Assessment drives what students pay attention to, and defines the actual curriculum (Ramsden 1992).
2) Feedback is significant (Hattie, 2009; Black and Wiliam, 1998).
3) Programme is central to influencing change.
Thinking about modules
modulus (Latin): small measure
“interchangeable units”
“standardised units”
“sections for easy constructions”
“a self-contained unit”
How well does IKEA 101 packaging work for Sociology 101?

Furniture
- Bite-sized
- Self-contained
- Interchangeable
- Quick and instantaneous
- Standardised
- Comes with written instructions
- Consumption

Student Learning
- Long and complicated
- Interconnected
- Distinctive
- Slow, needs deliberation
- Varied, differentiated
- Tacit, unfathomable, abstract
- Production
What is TESTA?
Transforming the Experience of Students through Assessment
- HEA-funded research project (2009-12)
- Seven programmes in four partner universities
- Maps programme-wide assessment
- Engages with Quality Assurance processes
- Diagnosis – intervention – cure
TESTA ‘Cathedrals Group’ Universities
Edinburgh
Edinburgh Napier
Greenwich
Canterbury Christ Church
Glasgow
Lady Irwin College, University of Delhi
University of the West of Scotland
Sheffield Hallam
TESTA
“…is a way of thinking
about assessment and
feedback”
Graham Gibbs
Based on assessment principles
- Time-on-task
- Challenging and high expectations
- Students need to understand goals and standards
- Prompt feedback
- Detailed, high quality, developmental feedback
- Dialogic cycles of feedback
- Deep learning – beyond factual recall
TESTA Research Methods
(Drawing on Gibbs and Dunbar-Goddet, 2008, 2009)
Three sources of evidence feed into a programme team meeting: the Assessment Experience Questionnaire (AEQ), student focus groups, and a programme audit.
Case Study X: what’s going on?
- Mainly full-time lecturers
- Plenty of varieties of assessment, no exams
- Reasonable amount of formative assessment (14 instances)
- 33 summative assessments
- Masses of written feedback on assignments (15,000 words)
- Learning outcomes and criteria clearly specified
…looks like a ‘model’ assessment environment
But students:
- Don’t put in a lot of effort, and spread their effort across only a few topics
- Don’t think there is a lot of feedback or that it is very useful, and don’t make use of it
- Don’t think it is at all clear what the goals and standards are
- …are unhappy
Case Study Y: what’s going on?
- 35 summative assessments
- No formative assessment specified in documents
- Learning outcomes and criteria wordy and woolly
- Marking by global, tacit, professional judgements
- Teaching staff mainly part-time and hourly paid
…looks like a problematic assessment environment
But students:
- Put in a lot of effort and distribute their effort across topics
- Have a very clear idea of goals and standards
- Are self-regulating and have a good idea of how to close the gap
Two paradigms…
Transmission Model
Social Constructivist Model
Focus Group data
- In pairs/groups, read through quotes from student focus group data on a particular theme.
- What problems does the data imply?
- What solutions might a programme develop to address some of these challenges?
- A3 sheets provided, with columns for challenges and solutions, to tease these out.
Student voice data
Theme 1: Formative is a great idea but…
- If there weren’t loads of other assessments, I’d do it.
- If there are no actual consequences of not doing it, most students are going to sit in the bar.
- I would probably work for tasks, but for a lot of people, if it’s not going to count towards your degree, why bother?
- The lecturers do formative assessment but we don’t get any feedback on it.
Theme 2: Assessment isn’t driving and distributing student effort
- We could do with more assessments over the course of the year to make sure that people are actually doing stuff.
- We get too much of this end or half way through the term essay type things. Continual assessments would be so much better.
- So you could have a great time doing nothing until like a month before Christmas and you’d suddenly panic. I prefer steady deadlines, there’s a gradual move forward, rather than bam!
Theme 3: Feedback is disjointed and modular
- The feedback is generally focused on the module.
- It’s difficult because your assignments are so detached from the next one you do for that subject. They don’t relate to each other.
- Because it’s at the end of the module, it doesn’t feed into our future work.
- You’ll get really detailed, really commenting feedback from one tutor and the next tutor will just say ‘Well done’.
Theme 4: Students are not clear about goals and standards
- The criteria are in a formal document so the language is quite complex and I’ve had to read it a good few times to kind of understand what they are saying.
- Assessment criteria can make you take a really narrow approach.
- I don’t have any idea of why it got that mark.
- They read the essay and then they get a general impression, then they pluck a mark from the air.
- It’s a shot in the dark.
- We’ve got two tutors – one marks completely differently to the other and it’s pot luck which one you get.
Main findings
1. Too much summative; too little formative
2. Too wide a variety of assessment
3. Lack of time on task
4. Inconsistent marking standards
5. ‘Ticking’ modules off
6. Poor feedback: too little and too slow
7. Lack of oral feedback; lack of dialogue about standards
8. Instrumental reproduction of materials for marks
1. Summative-formative issues
1. Students and staff can’t do more of both.
2. Reductions in summative – how many is enough?
3. Increase in formative – and make sure it is valued and required.
4. Debunking the myth of two summative assessments per module.
5. Articulating the rationale with students, lecturers, senior managers and QA managers.
1. Examples of ramping up formative
The case of the under-performing engineers (Graham, Strathclyde)
The case of the cunning (but not litigious) lawyers (Graham, somewhere)
The case of the silent teachers (Winchester)
The case of the lost accountants (Winchester)
The case of the disengaged Media students (Winchester)
The case of the instrumental scientists (Saurashtra)
2. Examples of improving ‘time on task’
The case of low effort on Media Studies
The case of bunching on the BA Primary
3. Engaging students in reflection through improving feedback
The case of the closed door (Psychology)
The case of the one-off in History (Bath Spa)
The case of the Sports Psychologist (Winchester)
The conversation gambit
4. Internalising goals and standards
- The case of the maverick History lecturer (a dove)
- The case of the highly individualistic creative writing markers
Changes
Programmatic Assessment Design
Feedback Practice
Paper processes to people talking
Impacts
- Improvements in NSS scores on A&F – from bottom quartile in 2009 to top quartile in 2013
- Three programmes with 100% satisfaction ratings post-TESTA
- All TESTA programmes show some upward movement on A&F scores
- Programme teams are talking about A&F and pedagogy
- Periodic review processes are changing for the better.
www.testa.ac.uk
References
Gibbs, G. & Simpson, C. (2004) Conditions under which assessment supports students' learning. Learning and Teaching in Higher Education, 1(1): 3-31.
Gibbs, G. & Dunbar-Goddet, H. (2009) Characterising programme-level assessment environments that support learning. Assessment & Evaluation in Higher Education, 34(4): 481-489.
Hattie, J. & Timperley, H. (2007) The power of feedback. Review of Educational Research, 77(1): 81-112.
Jessop, T. & Maleckar, B. (in press) The influence of disciplinary assessment patterns on student learning: a comparative study. Studies in Higher Education.
Jessop, T., El Hakim, Y. & Gibbs, G. (2014) The whole is greater than the sum of its parts: a large-scale study of students' learning in response to different assessment patterns. Assessment & Evaluation in Higher Education, 39(1): 73-88.
Jessop, T., McNab, N. & Gubby, L. (2012) Mind the gap: An analysis of how quality assurance processes influence programme assessment patterns. Active Learning in Higher Education, 13(3): 143-154.
Nicol, D. (2010) From monologue to dialogue: improving written feedback processes in mass higher education. Assessment & Evaluation in Higher Education, 35(5): 501-517.
Sadler, D.R. (1989) Formative assessment and the design of instructional systems. Instructional Science, 18: 119-144.
Editor's Notes
1. Students spend most time and effort on assessment. Assessment is the cue for student learning and attention. It is also the area where students show least satisfaction on the NSS. Scores on other factors return about 85% good rankings, whereas only 75% of students find assessment and feedback 'good'. We often think the curriculum is the knowledge, content and skills we set out in the planned curriculum, but from a student's perspective, the assessment demands frame the curriculum. Looking at assessment from a modular perspective leads to myopia about the whole degree and the disciplinary discourse, and often prevents students from connecting and integrating knowledge and meeting progression targets. It is very difficult for individual teachers on modules to change the way a programme works through exemplary assessment practice on modules. It takes a programme team and a programme to bring about changes in the student experience. Assessment innovations at the individual module level often fail to address assessment problems at the programme level, some of which, such as too much summative assessment and not enough formative assessment, are a direct consequence of module-focused course design and innovation.
2. Raise the question: are there problems with the packaging? Works for furniture – does it work for student learning? Assumptions of modularity: self-contained; disconnected; interchangeable. The next slide indicates some of the tensions of packaging learning in modules, and the tensions inherent in the metaphor.
3. Originally used for furniture and prefab and modular homes – how well does it suit educational purposes? I'm not taking issue with modules per se, but want to highlight that there have been some unintended consequences – some good, some bad – of using modular systems. Many programmes have navigated through them, some haven't. Anyone who has built IKEA furniture knows that the instructions are far from self-evident – and we have translated a lot of our instructions, criteria, programme and module documents for students in ways that may be as baffling for them. Have we squeezed learning into a mould that works better for furniture?
4. Huge appetite for programme-level data in the sector. Worked with more than 100 programmes in 40 universities internationally. The timing of TESTA – many universities revisiting the design of degrees, thinking about coherence, progression and the impact of modules on student learning. The confluence of modules with semesterisation, lack of slow learning, silo effects and the pointlessness of feedback after the end of a module…
5. What started as a research methodology has become a way of thinking. David Nicol – changing the discourse, the way we think about assessment and feedback; not only technical work (research, mapping) but also shaping our thinking. Evidence, assessment principles.
6. Based on robust research methods about whole programmes – 40 audits; 2,000 AEQ returns; 50 focus groups. The two triangulating methodologies of the AEQ and focus groups are student experience data – student voice etc. A three-legged stool. These three elements of data are compiled into a case profile which captures the interaction of an academic's programme view, the 'official line' or discourse of assessment, and how students perceive it. This is a very dynamic rendering because student voice is explanatory, but also probes some of our assumptions as academics about how students work and how assessment works for them. Finally the case profile is subject to discussion and contextualisation by insiders – the people who teach on the programme, who prioritise interventions.
7. Large programme; modular approaches; marker variation; late feedback; dependency on tutors.
8. Student workloads often concentrated around two summative points per module. Sequencing, timing and bunching issues, and ticking off modules so students don't pay attention to feedback at the end point.
9. Limitations of explicit criteria; marker variation is huge, particularly in humanities, arts and professional courses (non-science ones). Students haven't internalised standards, which are often tacit. Marking workshops, exemplars, peer review.
10. Seminars; YouTube presentations; teaching students to map my programme; under-confident but keen journal club. Principles: make it authentic; multi-stage; public work – social pressure; spread and co-ordinate hand-in dates; formative requirements; peer marking and accountability sampling; setting first-year expectations. Brief, frequent, innovative, developmental.
11. TESTA: a Higher Education Academy NTFS project, funded for 3 years in 2009. Four partner universities, seven programmes – the 'Cathedrals Group'. Gather data on whole-programme assessment, and feed this back to teams in order to bring about changes. In the original seven programmes we collected before-and-after data.