This presentation covers the principles of student assessment in medical education, illustrated with pictures and diagrams for better understanding.
16. Prof. Dr. V. Sathyanarayanan MBBS., MD., (F.I.M.E.)
SRM MCH & RC, SRM University, Kattankulathur,
INDIA
*PRINCIPLES OF
STUDENT ASSESSMENT
17. *
*What is assessment?
*Why is it required ?
*Types of assessment
*Characteristics of assessment
*Assessment tools
*Steps in assessment
*Limitations of assessment
*Summary
18. *
*AT THE END OF THIS SESSION, PARTICIPANTS WILL BE ABLE TO
1. Name the types of assessment
2. Compare formative and summative assessment
3. Explain the characteristics of an assessment tool
4. Identify the steps in student assessment
5. Choose an appropriate assessment tool
6. Express enthusiasm to apply the principles of student
assessment in everyday practice
34. *
*Assessment and evaluation are often used
interchangeably
*However, for our purposes…
*Assessment describes the measurement of
learner outcomes
*Evaluation describes the measurement of
course/program outcomes
38. Formative assessment:
*Done at the classroom level
*Done for planning teaching-learning
*For student development
*To provide feedback for students and teachers
*Can help to modify T-L methods
*Done before the final test of competence
Summative assessment:
*Done at a wider level (school, college, university, national)
*Done for scoring/grading or pass/fail decisions
*Used to determine competence before certification
39. Crucial Distinction
Assessment OF Learning (Summative):
How much have students learned at a particular point in time?
Assessment FOR Learning (Formative):
How can we use assessment to help students
learn more?
40. *“When the cook tastes the soup, that’s
formative assessment
41. *when the customer tastes the soup, that’s
summative assessment.”
(Brookhart, 1999)
43. *
*Student’s performance is compared with that of their peers (norm-referenced)
*Used when only a fixed number of places is available
*E.g. entry to a PG course
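A minimal sketch of the norm-referenced idea on this slide: candidates are ranked against their peers and only a fixed number of seats is filled, so a result depends on where a student stands relative to others rather than on an absolute cut-off. All names, scores, and the seat count below are hypothetical.

```python
# Norm-referenced selection: rank against peers, fill a fixed number of seats.

def percentile_rank(scores, score):
    """Percentage of peers scoring strictly below the given score."""
    below = sum(1 for s in scores if s < score)
    return 100.0 * below / len(scores)

def select_top(scores, seats):
    """Admit only the highest-scoring candidates for a fixed number of seats."""
    ranked = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
    return [name for name, _ in ranked[:seats]]

scores = {"A": 72, "B": 85, "C": 64, "D": 90, "E": 78}
print(select_top(scores, 2))                           # → ['D', 'B']
print(percentile_rank(list(scores.values()), 78))      # → 40.0
```

Note that the same score of 78 could succeed or fail in different years, depending only on how the peer group performed.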
53. Evolution of Medical Students
Website by NUS students:
http://medicus.tk
“Assessment drives learning in the direction you wish.”
58. *
*“Drives learning”
*Provides baseline data
*Provides summative and formative feedback
*Allows measurement of individual progress
*Encourages “student” reflection
*Assures public that providers are competent
*Licensure/credentialing requirements
69. *
*Measures what the student
actually DOES
*Applicable only to real-life situations
*Authentic assessment
71. *Miller's pyramid of clinical competence, from base to apex: Knows, Knows how, Shows how, Does. The levels move from cognition to performance, with increasing professional authenticity.
Miller GE. The assessment of clinical skills/competence/performance. Academic Medicine (Supplement) 1990; 65: S63-S67.
83. An examination that attempts to test students’
mastery at a given point of time is less preferable
than one that tests mastery over a span of time.
89. *
*The most important characteristic of an assessment
tool
*Ask “what is the learning outcome to be
measured?”
*A matter of degree (less valid or more valid)
90. Content validity: ability of the assessment instrument to sample
representative content of the course.
(Diagram: the assessment sampling overlaps the course content.)
93. *
*Reliability refers to the consistency of
test scores and the concept of reliability is
linked to specific types of consistency.
*Over time
*Between different examiners,
*Different testing conditions
*Instruments for student assessment need
high reliability to ensure transparency and
fairness
94. *
*Is a measure of the reproducibility of the
results of a test
*Measure of the correlation between two sets of scores
obtained when the test is repeated after an interval
(test-retest method)
*Or when the test is split into two halves and the
results are compared (split-half method)
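The two estimates named above can be sketched numerically. The score lists below are hypothetical; the Spearman-Brown formula is the standard correction used to step the split-half correlation up to full-test length.

```python
# Test-retest: Pearson correlation between the same students' scores
# on two administrations. Split-half: correlate the two half-test
# scores, then apply the Spearman-Brown correction.
import statistics

def pearson_r(x, y):
    """Pearson correlation coefficient between two score lists."""
    mx, my = statistics.mean(x), statistics.mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def spearman_brown(half_r):
    """Step a half-test correlation up to full-test reliability."""
    return 2 * half_r / (1 + half_r)

test1 = [55, 60, 72, 80, 90]       # first administration
test2 = [58, 62, 70, 83, 88]       # same students, after an interval
odd_half = [30, 28, 40, 45, 48]    # scores on odd-numbered items
even_half = [25, 32, 32, 35, 42]   # scores on even-numbered items

print(round(pearson_r(test1, test2), 3))                       # test-retest
print(round(spearman_brown(pearson_r(odd_half, even_half)), 3))  # split-half
```

A coefficient near 1 indicates highly reproducible scores; values well below that suggest the test would rank the same students differently on another occasion.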
95. - 97. (Tables) Grids of five questions (Q1-Q5) against five examiners (Ex 1-Ex 5), showing different marking allocations: each question marked by a single examiner, versus every examiner marking the same question (Q1). The allocation chosen affects inter-examiner reliability.
99. *
*Assessment must be appropriate to the
context and the needs
*Its relevance should be obvious to both the
teacher and the student
101. *
*Is the degree to which the assessment adheres to
specific criteria
*Measures to increase the objectivity of scoring:
1. Structuring the questions
2. Preparing model answers
3. Agreement on marking scheme
4. Independent assessment by more than one
examiner
102. *
*Is the degree to which the process of assessment
is practical
*And possible to implement in the
circumstances
*Time
*Expertise
*Cost
103. *
*Are we measuring what we are supposed
to be measuring?
*Use the appropriate instrument for the knowledge, skill, or
attitude you are testing
*The major types of validity should be considered (content,
predictive, and face)
104. *
*Does the test consistently measure what it
is supposed to be measuring ?
*Types of reliability:
*Inter-rater (consistency over raters)
*Test-retest (consistency over time)
*Internal consistency (over different items/forms)
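Internal consistency is commonly quantified with Cronbach's alpha, which compares the variance of individual items with the variance of the total score. A minimal sketch, assuming a small hypothetical matrix of item scores (rows = students, columns = items):

```python
# Cronbach's alpha: alpha = (k/(k-1)) * (1 - sum(item variances)/total variance)
import statistics

def cronbach_alpha(rows):
    """Internal consistency of a student-by-item score matrix."""
    k = len(rows[0])  # number of items
    item_vars = [statistics.pvariance([r[i] for r in rows]) for i in range(k)]
    total_var = statistics.pvariance([sum(r) for r in rows])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

rows = [
    [3, 4, 3, 5],  # one student's scores on four items
    [2, 2, 3, 3],
    [4, 5, 4, 5],
    [1, 2, 2, 2],
]
print(round(cronbach_alpha(rows), 3))  # → 0.959
```

Values above roughly 0.8 are usually expected for high-stakes examinations; low alpha suggests the items are not measuring the same underlying construct.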
105. *
*Is the administration of the assessment
instrument feasible in terms of time and
resources?
*Time to construct?
*Time to score?
*Ease of interpreting the score/producing results?
*Practical given staffing/organization?
*Quality of feedback?
*Learner takeaway?
*Does it motivate the learner?
106. *
*Number of students to be assessed
*Time available for the assessment
*Number of staff available
*Resources/equipment available
*Special accommodations
109. *
1. Objective measurement: marks, rank, percentile
2. Value judgement: regarding the desirability of the
measured result; relook at the SLOs and the T-L
process and decide on changes
113. *
1. Define learning objectives
2. Provide teaching-learning experiences
3. Select a measuring instrument
4. Administer the test
5. Decide on marking
6. Score the test
7. Analyse the results
8. Make a final decision
9. If the instrument was not right, choose an alternative method
116. *
*Most commonly long answer or essay for theory
(assessment of knowledge)
*Long case or short case for clinical examination
*Oral examination for the assessment of practical
skills
*OSCE, OSPE: increase reliability
*Log book, diary: attitude assessment
117. *
*Low-stakes examinations → high-stakes examinations
*Long essay question → multiple short-answer questions
*Traditional long case → multi-station OSCE
124. *
*Any assessment is anxiety-provoking for students (and staff)
*Assessment has potential positive and negative
steering effects on learning and professional
development
133. Critical questions in assessment
1. WHY are we doing the assessment?
2. WHAT are we assessing?
3. HOW are we assessing it?
4. HOW WELL is the assessment working?
134. 1. WHY are we doing the assessment?
What is its purpose?
Formative?
Summative?
135. 2. WHAT are we testing?
Elements of competence
Knowledge
factual
applied: clinical reasoning
Skills
communication
clinical
Attitudes
professional behaviour
Tomorrow’s Doctors, GMC 2003
136. 3. HOW are we doing the assessment?
Test formats, mapped to Miller's pyramid:
Knows: factual tests (SBAs/MCQs)
Knows how: (clinical) context-based tests (SBAs, EMQs, SAQs)
Shows how: performance assessment in vitro (OSCEs)
Does: performance assessment in vivo (video; WBA, e.g. mini-CEX, DOPS)
137. 4. HOW WELL is the assessment working?
Evaluation of assessment systems
•Is it valid?
•Is it reliable?
•Is it doing what it is supposed to be
doing?
• To answer these questions, we have to consider
the characteristics of assessment instruments
138. Principles of Assessment
There is no perfect assessment:
compromise is always required
The compromise depends on the context of the
assessment
The quality of assessment is a matter of
the integral assessment programme
rather than of the individual instruments
141. *“No single assessment method can provide
all the data required for judgment of
anything so complex as the delivery of
professional services by a successful
physician.” (George Miller, 1990)