The first keynote I delivered at #eas14 used Brookfield's notion of lenses of reflection to encourage participants to consider other roles in everyday practice. This presentation focussed on a number of examples, including visualising assessment timelines.
8. AM I CRITICALLY REFLECTIVE?
[Diagram: lenses of reflection]
REFLECTION through: SELF · STUDENTS · PEERS · LITERATURE · PROGRAMME SYSTEM(S)
9. STUDENTS
A bigger picture… STUDENTS · THE INSTITUTION · RECRUITMENT & RETENTION · NSS as a driver for change
What can we do to improve the student experience?
Is there a danger in using the NSS as a driver for change?
12. SYSTEMS & ‘THE SYSTEM’
STUDENT AS PRODUCER ≠ TRADITIONAL COURSEWORK RECEIPTING
We're operating in a world for which our existing rules and processes were not invented.
‘Practice remains stubbornly resistant to change’ (JISC, 2012)
18. TECHNOLOGY & THE LIFECYCLE
EFFICIENCY · ENHANCEMENT · TRANSFORMATION
Lifecycle: SPECIFYING → SETTING → SUPPORTING → SUBMITTING → MARKING & FEEDBACK → RECORDING GRADES → REFLECTING
Very modular?
19. CONSEQUENTIAL VALIDITY
“Students experience the interaction effects of one form of
assessment on another. In any given month they may have to
complete ten assessment tasks, in another month only one. The ways
in which they approach each of these will be influenced by the others.
A task which is intrinsically interesting and which may be approached
meaningfully at any other time may be given short shrift when it is
located among a thicket of examinations.
Very little attention has been given to the compounding effects of
assessment even when we know that it is the total array of demands
in a given period which influences how each one is tackled.”
(Boud, in Knight, 1995, p. 3)
20. ASSESSMENT TIMELINES
What does assessment look like through the student lens? As a whole? What do you see?
[Timeline diagram: four concurrent modules, weeks 3–12]
Legend: high-stakes, medium-stakes, and low-stakes assessments
24. ASSESSMENT TIMELINES
Strategic planning of summative assessments (ESCAPE Project, Hertfordshire); JISC Design Studio on Transforming A&F
[Three timeline diagrams, weeks 3–12]
Traditional summative assessment at the end of the module
One high-stakes summative split into two medium-stakes assessments plus low-stakes tasks
Continuous or low-stakes assessments throughout
Legend: high-stakes, medium-stakes, and low-stakes assessments
26. ASSESSMENT TIMELINES
[Timeline diagram of a two-year medical programme]
YEAR 1 - assessments at weeks 10, 13, 16, 22, 24, 26, 29, 30, 39, 43: written exam (2 × 1.5-hr papers), Research & Scholarship, OSCE, Research & Scholarship resit, written exam resit (2 × 1.5-hr papers)
YEAR 2 - assessments at weeks 4, 8, 13, 21, 26, 31, 34, 38, 43, 45, 49, including a resit
Legend: summative assessments; online formative assessments; re-sit opportunities
What things pop out at you?
27. ASSESSMENT TIMELINES
[Diagram: themes surfaced through each reflective lens]
REFLECTION through: SELF · STUDENTS · PEERS · LITERATURE · PROGRAMME SYSTEM(S)
Themes: good practice lost · student workload · weightings & formats · impact on processes · feed-forward · language · system performance
TESTA Project @ Dundee
29. ONE MORE THING
‘Learning analytics’
‘refers to the interpretation of a wide range of data
produced by and gathered on behalf of students in
order to assess academic progress, predict future
performance, and spot potential issues’ (L. Johnson,
Smith, Willis, Levine, & Haywood, 2011).
30. ONE MORE THING
“Assessment analytics offers the potential for
student attainment to be measured across time, in
comparison with individual students’ starting points
(ipsative development), with their peers and/or
against benchmarks or standards” (Ellis, 2013, p.
663).
32. ONE MORE THING
Ahead of Mark’s keynote, consider:
• How/what might we analyse?
• Reflect through the student lens - what are the ethical issues around your data being analysed?
• Staff lens - what about your data?
• How might we use data to inform future
practice?
Editor's notes
This perspective calls for us to look at the profile of assessment as students see it - through their lens, from their point of view: the total experience as a whole. I was talking about this very issue with a colleague last year, and we spoke about visualising what a student's assessment profile might look like. I was halfway through visualising our medical programme when I came across existing work on visualising timelines from the ESCAPE Project at Hertfordshire, which gave me some inspiration for my diagrams.
So what we might typically or traditionally see, for one module, is high-stakes assessment bunched up at the end of the module. When we consider what else is happening, we realise that students are studying four modules at a time, all of which might be assessed in similar ways at similar times. What is the consequential validity here? Which assignments do students spend more time on? What is their experience like in these few weeks?
The point of this is that we can, with a little effort, produce an assessment profile - a bird's-eye view of a programme that we can use to see the wider impact of assessment, because there are many factors at play.
I'm going to ask you another question: when you see this, what is the first thing that comes to mind? Any volunteers?
There are a number of things that pop out for me:
* Impact of assessment on student workload - we come back to that point about students having ten pieces in one month and none in others.
* Good practice lost - some really good assessment practice can be swallowed up in the midst of other high-stakes assessment. We can use this view to shift timings more effectively.
* An opportunity to look in more depth at how we're assessing students - weightings and formats - and to encourage breaking down high-stakes assessment into lower-stakes assessment.
* If we are in the paper world, the impact of assessment on the admin processes needed to manage it - collecting and distributing papers.
* Reliance on system performance - scheduling upgrades, and so on.
* The potential for feed-forward. Typically we only consider feed-forward on a modular basis. At this programme view, on a very basic level, if students are all submitting essays at the end of the module, they could be making fundamental errors that will be repeated across all assignments - referencing, for example, or the structure itself. Taking this view might prompt us to amend timings in ways that enable greater feed-forward.
* The language we use when communicating with students about assessment. Do they know what formative and summative even mean? Does the phrase 'high stakes' scare them? Does 'low stakes' mean it's not important? This is another huge area we need to think about.
Many programme leaders might already have a spreadsheet or a table with much of this data. Why shouldn't a clear picture like this be given to students on day one? I wonder what students could do with this information so easily available. How might they adapt their learning strategies? How might they change their own plans around this view? I think a lot of aspects would come to the fore if we were more open in this regard.
I blogged some of this work and Simon Walker from Greenwich pointed me to a tool they had developed to help visualise this - https://sites.google.com/site/mapmyprogramme/home
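As a thought experiment, such a profile can be sketched directly from the spreadsheet-style data a programme leader might already hold. The module names, weeks, and stakes labels below are invented for illustration, not drawn from any real programme:

```python
# A minimal sketch of building a programme-level assessment timeline
# from spreadsheet-style records. All data here is illustrative.
from collections import namedtuple

Assessment = namedtuple("Assessment", "module week stakes")  # stakes: H/M/L

def timeline(assessments, weeks=12):
    """Render one text row per module: the stakes letter in its week, '.' elsewhere."""
    rows = {}
    for a in assessments:
        row = rows.setdefault(a.module, ["."] * weeks)
        row[a.week - 1] = a.stakes
    return {m: "".join(r) for m, r in sorted(rows.items())}

profile = [
    Assessment("Module A", 12, "H"),   # traditional end-of-module high stakes
    Assessment("Module B", 6, "M"),
    Assessment("Module B", 12, "M"),   # one high-stakes task split into two medium
    Assessment("Module C", 3, "L"),
    Assessment("Module C", 6, "L"),
    Assessment("Module C", 9, "L"),    # continuous low stakes throughout
]

for module, row in timeline(profile).items():
    print(f"{module}: {row}")
```

Even this crude text view makes bunching visible at a glance: every concurrent module's deadlines line up in the same columns.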
So before I finish this presentation, I just want to summarise.
I've emphasised the role of reflection in our everyday practice, aided by Brookfield's lenses. I believe this approach can help us re-engineer assessment practices from the ground up, rather than through piecemeal attempts to raise standards.
And we need to consider the holistic bigger picture in assessment to think about the impact on students as well as the impact on other stakeholders in the assessment lifecycle.
As we rely more heavily on systems, whether the VLE or other tools supporting learning, teaching and assessment, a lot of data is being captured which, until recently, has lain rather dormant. The concept of learning analytics reflects the zeitgeist of big data: analysing that data to uncover past experiences and/or shape future practice.
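As an entirely illustrative sketch of the ipsative comparison Ellis describes, attainment can be measured against each student's own starting point as well as against a cohort benchmark. The student names and marks below are invented:

```python
# A hedged sketch of "assessment analytics": per-student (ipsative) gain
# plus distance from a cohort benchmark. All names and marks are invented.
from statistics import mean

def ipsative_report(marks):
    """marks: {student: [mark_t0, ..., mark_final]} ->
    per-student gain over their own start, and final mark vs cohort average."""
    benchmark = mean(series[-1] for series in marks.values())
    return {
        student: {
            "gain": series[-1] - series[0],
            "vs_benchmark": series[-1] - benchmark,
        }
        for student, series in marks.items()
    }

cohort = {"Alice": [52, 58, 67], "Ben": [70, 68, 72], "Cara": [40, 55, 62]}
report = ipsative_report(cohort)
# Alice sits exactly on the cohort-average final mark, yet her ipsative
# gain of 15 marks tells a very different story from a league-table view.
```

The point is not the arithmetic but the framing: the same final marks read quite differently once each student's own starting point is part of the picture.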