4. What is Assessment?
Assessment is the ongoing process of
gathering, analyzing and reflecting on
evidence to make informed and
consistent judgments to improve future
student learning.
6/5/2013 Prepared by: S. B. Satorre
5. In layman’s language, how is
the process of assessment
described?
• Plan it!
• Do it!
• Check it!
• Revise it!
• Repeat it!
6. Uses of Assessment
Planning, Conducting, and Evaluating
Instruction
• assessment can provide information to guide
instructional decisions
• prior to instruction—planning for instruction and
subsequent assessment
• during instruction—determining effectiveness of
instruction and whether reinstruction is needed
• following instruction—determining if revisions are
necessary for next period, next class meeting, or
next year
7. Uses of Assessment
Diagnosing Student Difficulties
• assessment prior to instruction in order to determine
what students know and can do
• important in helping teachers plan for instruction
Placing Students
• assessment for purposes of grouping students
based on ability, organizing students for group
work, sequencing of coursework, etc.
8. Uses of Assessment
Providing Feedback (Formative)
• assessment can provide feedback to students
regarding their academic progress
• important to provide this type of feedback in an
ongoing manner
Grading and Evaluating Learning
(Summative)
• formal assessments of learning following the
completion of instruction
• typically used to communicate results to
students, parents, and others
9. 3 Main Purposes for Assessment
10. • Assessment for Learning (AfL)
occurs when teachers use
inferences about student
progress to inform their teaching.
(formative; embedded in the TLAs)
• Assessment as Learning (AsL)
occurs when students reflect on
and monitor their progress to
inform their future learning goals.
(formative; embedded in the TLAs)
• Assessment of Learning (AoL)
occurs when teachers use
evidence of student learning to
make judgments on student
achievement against goals and
standards. (summative; occurs at
the end of the process, task, or
period)
11. Formal vs. Informal Assessment
Formal Assessment Methods
planned in advance of their administration
lack spontaneity
typically occur at the end of instruction
students are aware of these methods
examples include chapter tests, final
exams, graded homework, etc.
Informal Assessment Methods
more spontaneous; less obvious
typically occur during instruction
examples include teacher observations and
questions
12. Qualitative vs. Quantitative
Assessment
Quantitative Assessment Methods
yield numerical scores
major types include teacher-constructed
tests, standardized tests, checklists, and rating
scales
Qualitative Assessment Methods
yield verbal descriptions of characteristics
main types include teacher
observations, anecdotal records, and informal
questions
13. Formative vs. Summative
Formative Evaluation
decision making that occurs during instruction for
purposes of making adjustments to instruction
more of an evaluation of one’s own teaching
rather than of students’ work
may be based on formal or informal methods
Summative Evaluation
occurs at the end of instruction (e.g., end of
chapter, end of unit, end of semester)
typically used for administrative decisions
(e.g., assigning grades, promoting/retaining
students)
based solely on formal assessment methods
14. Standardized vs. Nonstandardized
Assessment
Standardized Assessment Methods
administered, scored, and interpreted in identical
fashion for all examinees
purpose is to allow educators to compare
students from different schools, states, etc.
examples include SAT, GRE, ITBS, CAT, PRAXIS
Nonstandardized Assessment Methods
typically made by teachers for classroom use
purpose is to determine extent to which subject
matter is being taught and learned
15. Norm-Referenced vs. Criterion-
Referenced Assessment
Norm-Referenced Assessment Methods
show where an individual student’s performance
lies in relation to other students
standardized tests are usually norm-referenced
results are quantitative
student performance is compared to norm group
Criterion-Referenced Assessment Methods
compare student performance to pre-established
criteria or objectives
results are quantitative, qualitative, or both
also known as mastery, objectives-referenced, or
competency tests
16. Traditional vs. Alternative
Assessment
Traditional Assessment Methods
procedures such as pencil-and-paper tests and
quizzes
only one correct response to each test item
easily and efficiently assess many students
simultaneously
encourage memorization of facts, etc.
Alternative Assessment Methods
more appropriate for hands-on, experiential
learning
include authentic assessment (involve real
application of skills beyond instructional context)
17. Objective vs. Subjective Assessment
Objective Assessment Methods
"objective" refers to the method of scoring (no judgments)
contain only one correct answer
examples: multiple-choice, true-false, matching items
also known as structured-response, selected-
response, teacher-supplied items
Subjective Assessment Methods
scoring involves teachers’ subjective judgments
several possible correct responses or single correct
response with several ways to arrive at that answer
examples: short-answer and essay items
also known as open-ended, constructed-response, supply-
type items
18. Ethical Issues Related to Assessment
Teacher Responsibilities in the Classroom
• ensuring that students are properly motivated to do
their best on any type of assessment method, that
all types of assessment methods are administered
fairly, and that results are interpreted appropriately
Motivating Students
• should not try to trick students on classroom
assessments
• provide encouragement
• familiarize students with assessment procedures
(i.e., develop students' "testwiseness" skills)
19. Ethical Issues Related to Assessment
Test Administration
• establishes a positive environment within the
assessment situation
• discourages cheating
Interpretation of Test Results
• tests do not result in measures of the entire person
• interpretation should be limited to only those skills
measured by a particular test
• avoids overgeneralizations
20. Characteristics of Exemplary
Assessment Task (Huba & Freed)
• Valid – yields useful information to guide
learning
• Coherent – is structured so that activities
lead to the desired performance product
• Authentic – addresses ill-defined
problems/issues that are enduring or
emerging
• Rigorous – requires use of declarative and
functional knowledge
21. Characteristics of Exemplary
Assessment Task (Huba & Freed)
• Engaging – provokes student interest and
persistence
• Challenging – provokes, as well as
evaluates, student learning
• Respectful – allows students to reveal their
uniqueness as learners
• Responsive – provides feedback to
students to improve their learning
22. Steps in Designing OBE-
based Assessment Task
1. Choose the right assessment task/method.
2. Choose the right student activities to
complete the assessment task/method.
3. Create the scoring or grading criteria.
23. 1. Choose the right
assessment task or
method.
24. 1. Is the assessment task aligned with the
subject intended learning outcome?
2. Does the assessment task reflect its relative
importance to the subject intended
learning outcome?
3. Is the assessment task realistic to the
student?
4. Is the assessment task measurable?
5. Are the resources needed to carry out
the assessment task available?
25. Common Verbs in the ILOs → Possible Assessment Tasks
Describe → Assignment, Essay question exam
Explain → Assignment, Essay question exam, Oral exam
Integrate → Project, Assignment
Analyse → Case Study, Assignment
Apply → Project, Case Study, Experiment
Solve → Case Study, Project, Experiment
Design, Create → Project, Experiment
Reflect → Reflective journal/diary, Portfolio, Self-assessment
Communicate → A range of oral, writing or listening tasks
26. Possible Assessment Methods for
the Computing Field
• Practical Work
• Computer
Simulations
• Laboratory Work
• Problems to Solve
• Reflective Learning
Statements
• Self-test
• Final Exams
• Essays
• Assignments
• Field Reports
• Article Review
• Group Work
• Portfolios
• Performances &
Presentations
• Projects
• Independent Study
• Learning Contracts
27. Specific Guide Questions
(adapted from 500 Tips on Assessment, Sally
Brown, Phil Race and Brenda Smith, 1996)
• If you want a written assessment
instrument, which of the following would you
choose? Consider the best uses of
essays, reports, reviews, summaries, dissertations,
theses, annotated bibliographies, case
studies, journal articles, presentations and exams.
• Should the method be time-constrained? Exams
and "in-class" activities might well be the most
appropriate for the occasion. Time-constrained tests
put students under pressure, but are usually fairly
good at preventing cheating.
28. • Is it important that the method you choose
includes cooperative activity? If it is important,
you might choose to assess students in groups,
perhaps on group projects, poster displays or
presentations.
29. • Is a visual component important? When it is, you
might choose portfolios, poster displays, 'critique'
sessions or exhibitions.
• Is it important that students use information
technology? When this is the case, computer-
based assessments may be best, either getting
students to answer multiple-choice questions, or
write their own programs, or prepare databases, or
write information stacks for hypertext, or material for
use in CD-ROM systems or on the Internet.
30. • Do you wish to try to assess innovation or
creativity? Some assessment methods that allow
students to demonstrate these include:
performances, exhibitions, poster
displays, presentations, projects, student-led
assessed seminars, simulations and games.
• Do you want to encourage students to develop
oral skills? If so, you might choose to assess
presentations, recorded elements of audio and
video tapes made by students, assessed
discussions or seminars, interviews or simulations.
• Do you want to assess the ways in which
students interact together? You might then assess
negotiations, debates, role
plays, interviews, selection panels, and case
studies.
31. • Is the assessment of learning done away from
the institution important? For example, you may
wish to assess learning done in the work place, in
professional contexts or on field courses. You may
choose to assess logs, reflective journals, field
studies, case studies or portfolios.
• Is your aim to establish what students are able
to do already? Then you could try diagnostic tests
(paper-based or technology-based), profiles,
records of achievement, or portfolios.
32. 2. Choose the right
student activities to
complete the assessment
task/method.
33. Are the student activities to
complete the assessment
task aligned with the subject
intended learning outcome?
The verb in the subject
intended learning
outcome provides the
clue to the kinds of
student activities in the
assessment task.
34. 3. Create the scoring or
grading criteria.
35. Methods of Grading SILOs (Subject Intended Learning Outcomes)
1. Direct Grading
2. Indirect Grading
36. Direct Grading
Grading the overall SILOs → grading criteria (using
rubrics) for each individual SILO → derive the final grade
37. Example: DBSys31 SILOs
• SILO 1 – contrast traditional file-based systems
and database system in terms of efficiency on data
manipulation, information access and security
• SILO 2 – explain the different data models as basis
for designing an information system.
• SILO 3 – apply a relational database model to
design the database for a particular information
system
• SILO 4 – design a normalized database for the
intended information system
• SILO 5 – construct the appropriate SQL
statements to solve SQL query problems
38. UC Grading System
Grade Equivalent
1.0 100% - 95%
1.1 – 1.5 94% - 90%
1.6 – 2.5 89% - 80%
2.6 – 3.0 79% - 75%
5.0 74% - 65%
NC No Credit
NG No Grade
DR Dropped
W Withdrawn
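The numeric bands in the table above can be sketched as a simple lookup. This is an illustrative reading of the slide only: scores below 65% and the NC/NG/DR/W statuses are not handled, and treating everything under 75% as 5.0 is an assumption, not stated UC policy.

```python
def uc_grade(percent: float) -> str:
    """Map a percentage score to the grade band shown on the slide.

    Bands below 65% are not shown on the slide, so anything under 75%
    is reported here as "5.0" -- an assumption for illustration only.
    """
    if percent >= 95:
        return "1.0"
    if percent >= 90:
        return "1.1 - 1.5"
    if percent >= 80:
        return "1.6 - 2.5"
    if percent >= 75:
        return "2.6 - 3.0"
    return "5.0"

print(uc_grade(92))  # falls in the 94% - 90% band
```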
39. • SILO 1 – contrast traditional file-
based systems and database
system in terms of efficiency on
data manipulation, information
access and security
• SILO 2 – explain the different data
models as basis for designing an
information system.
• SILO 3 – apply a relational
database model to design the
database for a particular
information system
→ Grade: 2.6 – 3.0 (79% – 75%)
40. • SILO 1 – contrast traditional file-
based systems and database
system in terms of efficiency on
data manipulation, information
access and security
• SILO 2 – explain the different data
models as basis for designing an
information system.
• SILO 3 – apply a relational
database model to design the
database for a particular
information system
• SILO 4 – design a normalized
database for the intended
information system
→ Grade: 1.6 – 2.5 (89% – 80%)
41. • SILO 1 – contrast traditional file-
based systems and database system
in terms of efficiency on data
manipulation, information access and
security
• SILO 2 – explain the different data
models as basis for designing an
information system.
• SILO 3 – apply a relational database
model to design the database for a
particular information system
• SILO 4 – design a normalized
database for the intended information
system
• SILO 5 – construct the appropriate
SQL statements to solve SQL query
problems
→ Grade: 1.1 – 1.5 (94% – 90%) or 1.0 (100% – 95%)
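Read together, the three example slides suggest a mapping from the number of SILOs a student achieves to a grade band. The sketch below is an inferred reading of those examples, not a stated rule; bands for fewer than three SILOs are not shown on the slides.

```python
# Grade band by count of SILOs achieved, inferred from the example slides.
GRADE_BY_SILO_COUNT = {
    3: "2.6 - 3.0 (79% - 75%)",
    4: "1.6 - 2.5 (89% - 80%)",
    5: "1.1 - 1.5 (94% - 90%) or 1.0 (100% - 95%)",
}

def direct_grade(silos_achieved: int) -> str:
    """Look up the grade band for a count of achieved SILOs."""
    return GRADE_BY_SILO_COUNT.get(silos_achieved, "band not shown on the slides")

print(direct_grade(4))  # -> 1.6 - 2.5 (89% - 80%)
```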
42. Indirect Grading
Grading the assessment tasks which are aligned with
the SILOs → grading criteria (using rubrics) for each
individual assessment task → derive the final grade
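A minimal sketch of the indirect route: each assessment task gets a rubric score, and a weighted combination of the task scores yields the final grade. The task names and weights below are illustrative assumptions, not values from the slides.

```python
# Indirect grading sketch: one rubric score per assessment task,
# combined as a weighted average. Names and weights are illustrative only.
tasks = {
    "case study": (85.0, 0.30),  # (rubric score in %, weight)
    "project":    (90.0, 0.50),
    "final exam": (78.0, 0.20),
}

final_percent = sum(score * weight for score, weight in tasks.values())
print(round(final_percent, 1))  # weighted final percentage
```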
43. Using Rubrics
• A rubric is a scoring tool that lays out the
specific expectations for a performance task.
• Rubrics divide a performance task into its
component parts and provide a detailed
description of what constitutes acceptable and
unacceptable levels of performance for each of
those parts.
• Rubrics can be used for grading a large variety of
tasks: discussion participation, laboratory
reports, portfolios, group work, oral
presentations, role plays and more (Stevens and
Levi, 2005).
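A rubric's criteria and scale can be modeled as a small data structure with a scoring helper. The criteria names, scale labels, and point values below are illustrative assumptions, not taken from the slides.

```python
# A minimal rubric sketch: criteria scored on a shared scale.
# Criteria, scale labels, and point values are illustrative only.
SCALE = {"Very Good": 4, "Good": 3, "Fair": 2, "Needs Improvement": 1}

rubric_criteria = ["Content", "Organization", "Delivery"]

def score_rubric(ratings: dict) -> float:
    """Average the scale points across all criteria, as a percentage."""
    points = [SCALE[ratings[c]] for c in rubric_criteria]
    return 100.0 * sum(points) / (len(points) * max(SCALE.values()))

marks = {"Content": "Very Good", "Organization": "Good", "Delivery": "Fair"}
print(score_rubric(marks))  # (4 + 3 + 2) / 12 -> 75.0
```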
44. 2 Vital Components of a
Rubric
1. Criteria
2. Scale – describes how well or poorly any
given task has been performed (e.g., Very
Good, Good, Fair, Needs Improvement)
46. Workshop # 3 – Designing Assessment
Tasks (ATs)
1. Design Assessment Tasks for your CILO # 1.
2. Present your design in the form of a table below.
CILO | Assessment Tasks | Student Activities in Completing the ATs
47. References:
• http://www.aaia.org.uk/pdf/Publications/AAIA%20Pupils%20Learning%20from
%20Teachers'%20Responses.pdf
• http://www.aaia.org.uk/pdf/Publications/AAIAformat4.pdf
• http://www.aaia.org.uk/pdf/asst_learning_practice.pdf
• http://community.tes.co.uk/forums/t/300200.aspx
• http://www.schoolhistory.co.uk/forum/lofiversion/index.php/t7669.html
• www.harford.edu/irc/assessment/FormativeAssessmentActivities.doc
• Paul Black et al., Assessment for Learning (Open University
Press, Maidenhead, 2003)
• Paul Black et al., "Working inside the black box" (nferNelson, London, 2002)
• Paul Black and Dylan Wiliam, Inside the Black
Box (nferNelson, London, 1998)
• Assessment Reform Group, Testing, Motivation and Learning, (The
Assessment Reform Group, Cambridge, 2002)
• Assessment Reform Group, Assessment for Learning, (The Assessment
Reform Group, Cambridge, 1999)
• Angelo, T.A. and Cross, K.P. (1993). Classroom Assessment Techniques: A
Handbook for College Teachers. San Francisco: Jossey-Bass Publishers.
48. • Southern Illinois University : Several CATs online:
http://www.siue.edu/~deder/assess/catmain.html
• Bresciani, M.J. (September, 2002). The relationship between outcomes,
measurement, and decisions for continuous improvement. National
Association for Student Personnel Administrators, Inc. NetResults E-Zine.
http://www.naspa.org/netresults/index.cfm
• Bresciani, M.J., Zelna, C.L., and Anderson, J.A. (2004). Techniques for
Assessing Student Learning and Development in Academic and Student
Support Services. Washington D.C.:NASPA.
• Ewell, P. T. (2003). Specific Roles of Assessment within this Larger Vision.
Presentation given at the Assessment Institute at IUPUI. Indiana University-
Purdue University- Indianapolis.
• Maki, P. (2001). Program review assessment. Presentation to the Committee
on Undergraduate Academic Review at NC State University.
• Bresciani, MJ.(2006). Outcomes-Based Undergraduate Academic Program
Review: A Compilation of Institutional Good Practices. Sterling, VA: Stylus
Publishing.
• Bresciani, M. J., Gardner, M. M., & Hickmott, J. (In Press). Demonstrating
student success in student affairs. Sterling, VA: Stylus Publishing.
• NC State University, Undergraduate Academic Program Review. (2001)
Common Language for Assessment. Taken from the World Wide Web
September 13, 2003:
http://www.ncsu.edu/provost/academic_programs/uapr/process/language.html
• Palomba, C.A. and Banta, T.W. (1999). Assessment essentials: Planning,
implementing and improving assessment in Higher Education. San Francisco:
Jossey-Bass.
• University of Victoria, Counseling Services. (2003). Learning Skills Program:
Bloom's Taxonomy. Taken from the World Wide Web September 13, 2003:
http://www.Coun.uvic.ca/learn/program/hndouts/bloom.html