The Student Learning Objective (SLO) Process comprises three (3) components: Design, Build, and Review. Student Learning Objectives provide an indicator of teacher
effectiveness through student performance outcomes based on standards.
Welcome to Training Module 2: “Building”
Concept
Within the Building Module, the trainer will guide participants to create Student Learning Objectives and performance measures that will guide instruction and provide evidence
of student mastery or growth. During this phase, the participants will complete the development of SLOs and the Performance Measures associated with each goal.
Key Points for Trainers
1. Orient the user to where in the process the "building" activities take place.
   a. After teachers have completed the Design Phase (which includes the development of a goal statement, identifying underlying content standards, and creating a blueprint), they will move from this thinking/designing stage to actually "building" the Performance Measures needed to support the SLO process.
2. Articulate that using the SLO Process Template requires an initial step of i) reviewing the data definitions, ii) referencing models, and iii) examining the structure of the SLO Template.
   a. During this module, participants should review the following:
      1) SLO Process Template – referenced in Slide 6 (template found in SLO/Build/Templates),
      2) The Art and Physical Education Sample Models – referenced in Slide 7 (models found in SLO/Build/Guides), and
      3) The following definitions:
         i. PDE's SLO Definition: A process to document a measure of educator effectiveness based on student achievement of content standards. (Orientation Module Slide 8)
         ii. Goal Statement: Narrative articulating the "big idea" upon which the SLO is based. (Orientation Module Slide 9)
         iii. Standards: Targeted content standards used in developing SLOs, which are the foundation of performance measures. (Orientation Module Slide 9)
         iv. Assessment Literacy: Technical and operational understanding of the assessment "life cycle", including the critical examination of statistical evidence. (Orientation Module Slide 9)
         v. Rationale Statement: Narrative providing reasons why the Goal Statement and the aligned standards address important learning. (Orientation Module Slide 10)
         vi. Blueprint: Visual depiction of the relationship among key SLO components. (Orientation Module Slide 10)
         vii. Performance Indicator: Statement of the expected level of achievement for each student in the SLO population. (Orientation Module Slide 10)
         viii. Performance Measures: The various tools/assessments used to measure student achievement of a specific goal.
3. Clarify completion of the SLO Process Template, including the development and/or selection of performance measures aligned to the selected content standards.

Learning Activity
________________________________________________________________________________________________________________________________________________
Technical Notes
“Structure”
1. Concept – "What is this slide telling the audience?"
2. Key Points – "What/Where are the details 'needed for teaching'?"
3. Learning Activity – "How can the participant's learning be enhanced?" (This item will not be populated for every slide.)

Module 2-SLO Building

1
Goal & Objectives
Concept
The Goal of the Build Module is for teachers to create SLOs and to identify and/or develop
Performance Measures for each Performance Indicator that can be used to guide
instruction and determine student mastery and/or growth as part of Pennsylvania's
Educator Effectiveness System. The SLO is the process through which the Elective Data
portion of the PA evaluation system is developed.
Key Points for Trainers
1. Explain the relationship between the SLO Process Template and the "Help Desk"
definitions (found at SLO/Build/Other Stuff).
a. The “Help Desk” definitions define each element within the SLO Process
Template. When using the online SLO Process Template, the “Help Desk”
document also displays the correct format for each section and element as
well as providing an example.
2. Ensure the audience recognizes the performance measures are developed or
selected during the “Build” phase.
a. This is the time when teachers will take their fully developed blueprints and
identify and/or develop the required performance measures for each SLO.
Learning Activity

Module 2-SLO Building

2
Helpful Tools
Concept
As you work through the Build Module, you will need to access both Guides and Templates. These Guides and Templates will
provide focus for the work and help ensure the development of complete SLOs and Performance Measures.
All SLO training materials can be found on both PDE's SAS portal and Research in Action's Homeroom learning platform. Go to
http://www.pdesas.org; once there, you can log in to the Homeroom learning portal.
Key Points for Trainers
1. Differentiate how tools can assist in the building task:
   a. Guides have materials such as handouts, rules of thumb, model SLOs, etc. that reinforce content presented in the videos. In this module, models are provided to guide new users in what a completed SLO, including developed and/or selected performance measures, looks like.
   b. Templates are used to complete each phase of the process. The key template is the SLO Process Template, which is provided as an online tool. Template #4 is a replication of the SLO Process Template, provided as a downloadable Word document. Template #5, the Performance Task Framework Template, is provided here as a Word document for performance tasks that may be culminating events over multiple days (e.g., student projects) or other types of teacher- and district-developed assessments. Additional development templates are found within the Assessment Literacy Series.
   c. During the Build Phase, teachers will review the following templates:
      i. Template #1 – Goal Statement
      ii. Template #2 – Targeted Content Standards
      iii. Template #3 – SLO Blueprint
   d. During the Build Phase, teachers will also use and complete the following templates:
      i. Template #4 – SLO Process Template (must be completed in this phase)
      ii. Template #5 – used as a resource when creating performance measures that are culminating events or an accumulation of students' work products. (Recommended for teacher- or district-developed assessments.)

Learning Activity
1. Have participants go to www.pdesas.org to review the resources that are available on the SAS portal and the Homeroom learning portal.

Module 2-SLO Building

3
Outline of the Build Module
Concept
This figure is designed to help visualize the various components associated with the “building” phase of the SLO
Process.
Key Points for Trainers
1. Apply this "learning map" at the beginning of the presentation to provide an outline of both the "building" component and the SLO process, including some techniques (steps) to begin the work, such as the presentation, refinement, and review of created SLOs.
2. SLO Process Components
   a. BUILD: This component is the "action" step in the process that focuses on completing the SLO Process Template and creating and/or selecting performance measures.
   b. Preview the SLO Process Template (Template #4) and "Help Desk" Definitions. These will provide guidance around the classroom context, SLO goal, Performance Indicators, Performance Measures, and Teacher Expectations.
      i. Performance Indicators – Description of the expected level of achievement for each student in the SLO population.
      ii. Performance Measures – The various tools/assessments which will be used to measure student achievement toward a specific goal.
      iii. Teacher Expectations – As part of the SLO process and the Educator Effectiveness Rating Tool, teachers will be required to set four (4) levels that describe the number of students expected to meet the targets listed for each Performance Indicator. These levels include: Distinguished, Proficient, Needs Improvement, and Failing. Educators determine the percentages within each rating prior to the beginning of instruction. These performance ratings will be examined at the end of the evaluation period and used to determine the Elective Rating that is applied to the overall final rating for each teacher.

Learning Activity
*Process Template #4

Module 2-SLO Building

4
SLO Template Preview
Concept
This section of the Module will provide a preview of the materials used in developing
SLOs.
Key Points for Trainers
1. Make sure the participants have the Templates and Models needed for working
through the Building phase of the SLO process. They will need the following:
• Template #4 – SLO Process Template (SLO/Build/Templates)
• Template #5 – Performance Task Framework (SLO/Build/Templates)
• Help Desk Definitions (SLO/Build/Other Stuff)
• Art Model (SLO/Build/Guides)
• Physical Education Model (SLO/Build/Guides)
• Completed Template #1 – Goal Statement (SLO/Design/Templates)
• Completed Template #2 – Targeted Content Standards (SLO/Design/Templates)
• Completed Template #3 – SLO Blueprint (SLO/Design/Templates)
Learning Activity
1. Allow time for participants to gather the above-mentioned Templates, Models, and
Guides from both their previous work during the Design phase and the Homeroom
learning portal link found on the SAS Portal.

Module 2-SLO Building

5
SLO Template Preview
Concept

To begin the Build phase of the SLO, it is important to review some previously completed
documents. Make sure that teachers review the Goal Statement (Template #1), the Targeted
Standards (Template #2), and their completed SLO Blueprint (Template #3). This information is
needed because the SLOs are based upon the targeted standards chosen during the Design
phase of the process.
Key Points for Trainers
1. Orient the user to the SLO Process Template: the Template's front page focuses on the
demographics, goals, and performance indicators; the back page covers the performance
measure(s) used to measure student achievement, along with the teacher expectations.
a. Identify what data (demographics, performance indicators, performance scores) are
needed to complete the template.
b. Identify the group of students the SLO will be based upon, including any “focused”
target group.
2. Ensure the definitions for each data element (i.e., each data field within the template) are
understood with focus on the technical definitions and the examples provided.
The SLO Process Template reflects the entirety of the SLO Process: Design (think about the goals
and targeted standards and indicators), Build (identify/build performance measures for each
performance indicator selected), Review (refine/edit/improve completed SLOs).
Learning Activity

Module 2-SLO Building

6
SLO Process Template
Concept
Model #1 Grade 8 Art and Model #2 Grade 3 Physical Education provide participants with examples of
what a completed SLO Process Template entails. These SLO Models and Performance Measure Templates
were developed by PA educators as demonstrations, NOT exemplars. They provide participants with
concrete examples of each phase of the SLO process: Design, Build, and Review.
Key Points for Trainers
1. Review Handout #1 to link the conceptual framework of a three-phase process for SLOs with the operational structure of the material.
2. Ensure the audience understands that this is a modular design. This means that, given work and experience with SLOs, individuals can engage in the work at any phase, not necessarily at the "implied" beginning [Design].
3. Reinforce, via model review, how these models reflect the three C's of quality:
   a. Completeness: The template is "completed" according to pre-established business rules.
   b. Comprehensiveness: The performance measures are "comprehensive" assessments of the targeted content standards.
   c. Coherency: The SLO's focus on a "Big Idea" within the PA Standards is aligned with an integrated set of standards, performance indicators, and performance measures.

Learning Activity
1. Using the Art and Physical Education Models, have participants review and identify the various sections of the completed SLO. (These sections will be unpacked in upcoming slides, most specifically 8-16 and 25-27. Information to support Section 4, quality performance measures, is unpacked in slides 17-24.)

Module 2-SLO Building

7
Section 1: Classroom Context
Concept
Section 1 of the SLO Process Template contains basic information and provides focus for
the work.
Key Points for Trainers
1. Make sure that teachers are using the “Help Desk” Definitions as a guide. This
document (found in SLO/Build/Stuff) provides the format and expectations when
filling out each section. It is important that the SLO Template is filled out correctly.
2. Emphasize, especially for high school teachers, that the SLO is subject/content
specific. This means that an educator teaching multiple subjects (e.g., Algebra I,
Geometry, Integrated Math III) will identify one subject and then include all students
enrolled in that particular subject/course as the sample.
Learning Activity
1. Using Template #4 – SLO Process Template and the “Help Desk” Definitions, guide
participants through filling out Section 1 of the SLO Process Template.

Module 2-SLO Building

8
Section 1: Classroom Context
Concept
This slide provides the definitions for each part in Section 1 of the SLO Process Template.
Completely filling out this section of the template first will provide context applicable to
working on the specific performance measures that will be used for each SLO.
Key Points for Trainers
1. Even though this chart clarifies each part of Section 1, please make sure
that teachers refer to the "Help Desk" definition document (found in SLO/Build/Stuff),
as it provides the correct formatting along with examples. It is important that each
section is filled out properly.
2. Example (as found in SLO Model 1, Art; SLO/Build/Guides):
a. 1d. Class/Course Title – Full Name(s) - Art
b. 1e. Grade Level – Numeric Values/Text – 8
c. 1f. Total # of students – Numeric values only – 100
d. 1g. Typical Class Size – Numeric values only – 25-30
e. 1h. Class Frequency – (# of sessions) per (week, 6 day cycle) for (year,
semester, 35 day rotation) equaling a total of (#) sessions – daily for one
quarter (42 sessions)
f. 1i. Typical Class Duration – Numeric values only – 45 minutes
Learning Activity

Module 2-SLO Building

9
Section 2: SLO Goal
Concept
Referring to Template #1 and Template #2 from the SLO Design module, teachers will be able to refresh themselves
about the Goal Statement, Targeted Standards, and the Rationale developed during the Design phase. Using these
templates will help teachers complete Section 2 of the SLO Process Template.
Key Points for Trainers
1. Remind participants that the Goal Statement is integral to the development of an SLO. It is the narrative that articulates the "big idea" upon which the SLO is based. This statement must be aligned with PA standards. Including national or professional standards is acceptable, but not as a replacement for PA standards.
2. Targeted Standards are those that have been selected for use with the performance measures that will be developed. These selected standards should represent the "big ideas" within the content area. Refer participants to the Curriculum Framework for the Course/Content area for which the SLO is being written.
3. Example from the SAS Portal – Curriculum Framework: Mathematics – 3rd grade
   a. Big Idea – The likelihood of an event occurring can be described numerically and used to make predictions.
   b. Essential Question – How can using graphs help us to solve problems and describe data we collect?
   c. Concepts – Graphical displays of data: frequency tables, bar graphs, picture graphs, line plots
   d. Competencies – Construct and analyze frequency tables, bar graphs, picture graphs, and line plots and use them to describe data and solve problems.
   e. Standards/Eligible Content – 2.6.3.A, 2.6.3.B, 2.6.3.C, 2.6.3.D, 2.6.3.E, 2.7.3.D, 2.8.3.F, CC2.4.3.A.4, M3.E.1.1.1, M3.E.1.1.2, M3.E.1.2.1, M3.E.1.2.2
4. Teachers can cut and paste the language that has already been created by the Pennsylvania Department of Education when developing goal statements. All the information is easily accessible on the SAS Portal.

Learning Activity

Module 2-SLO Building

10
Section 2: SLO Goal
Concept
This slide provides the definitions for each part of Section 2 of the SLO Process Template. This section can
be filled out using the information from Templates #1 (Goal Statement) and #2 (Targeted Standards).
Key Points for Trainers
1. Remind teachers that they should use the "Help Desk" Definition document (found in SLO/Build/Stuff) when filling out each section, as it provides the correct formatting as well as example statements. Teachers can also refer to the Art and Physical Education Models that were completed by PA educators.
2. Step-by-step process to select a "Big Idea" from the SAS Portal:
   a. Go to www.pdesas.org.
   b. Select Curriculum Framework.
   c. Select Subject Area/Grade Level from the drop-down screen.
   d. Click Search.
   e. "Big Ideas" are listed.
   f. Choose a "Big Idea."
   g. Click on a specific "Big Idea."
   h. The complete framework for the "Big Idea" provides teachers with the Essential Questions, Concepts, Competencies, and Standards/Eligible Content available for that particular "Big Idea/(Enduring Understanding)."
   i. Click on a specific standard to drill down to the materials, resources, and assessments available for that content standard.

Learning Activity
1. Have participants work with the Curriculum Framework element on the SAS Portal. Choose a particular subject area and grade level. Locate the various statements related to the specific subject and grade level (Big Idea, Essential Question, Concepts, Competencies, and Standards/Eligible Content).

Module 2-SLO Building

11
Section 3: Performance Indicators
Concept
Performance indicators are a description of the expected level of achievement on each measure used in the SLO. An
understanding of the scoring tool used to describe student achievement for any given performance measure is necessary to
write a performance indicator statement. Sample performance indicator statements, as found in the “Help Desk”, are listed
below:
Physics
(1) Roller Coaster Energy Project
(2) achieve 6 out of 9 using the roller coaster project rubric.
US History
(1) US History Final Exam
(2) achieve an 85% or higher on the final exam.
5th Grade ELA
(1) DRA
(2) Using the DRA text gradient chart, one year of reading growth.
Referring to SLO Design/Template #3 (SLO Blueprint), teachers can begin to fill out Section 3: Performance Indicators of the
SLO Process Template. When teachers filled out the SLO Blueprint, they identified the targets for each Performance Measure
selected and/or to be developed.
Key Points for Trainers
1. Remind teachers that Performance Indicators offer a great deal of flexibility in the system. Performance Indicators can be linked – meaning a student must meet a specific achievement level across two or more Performance Measures in order to meet the standard. Also, Performance Indicators can be weighted if there is more than one (1) Performance Indicator. (A minimal illustrative sketch of a linked indicator follows this list.)
2. Performance Indicators should be specific, measurable, and ambitious, but attainable.
3. The "Help Desk" definition document provides further details and examples of linked and weighted Performance Indicators.
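To make the idea of a linked Performance Indicator concrete, here is a minimal sketch in Python, offered purely as an illustration and not as part of the PDE templates or tooling. The scores, thresholds, and function name are hypothetical; a linked indicator simply requires the student to reach the required level on every linked measure.

```python
# Illustrative sketch only: hypothetical scores and thresholds for a "linked"
# Performance Indicator (the student must reach the required level on every
# linked Performance Measure to meet the indicator).

def meets_linked_indicator(scores, thresholds):
    """Return True only if each score meets or exceeds its paired threshold."""
    return all(score >= threshold for score, threshold in zip(scores, thresholds))

# Hypothetical student: 4/5 on a project rubric and 82% on a final exam,
# where the linked indicator requires at least 4/5 and at least 80%.
print(meets_linked_indicator([4, 82], [4, 80]))  # True
print(meets_linked_indicator([3, 90], [4, 80]))  # False - one linked measure missed
```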

Learning Activity

Module 2-SLO Building

12
Section 3: Performance Indicators
Concept
Again, this slide provides the definitions for each part of Section 3 of the SLO Process Template. However,
referring to the "Help Desk" Definition Guide will be very useful in completing this section.
[Remind teachers that each Section of the SLO Template builds upon each other. You will not be able to
fill out Section 3 if you have not completed the work in order to fill out Section 2.]
Key Points for Trainers
Participants learning the SLO process often confuse and intertwine the Performance Indicator statement
(3a) with the Teacher Expectation statement (5a), especially when the Performance Indicator statement is
described in percentages (e.g., achieves an 80% on a test).
• Right: scoring 4 out of 5 on the "my awesome project rubric"
• Right: achieving 80% on a final exam
• Wrong: 80% of the students in the sample will score a 4 on the PM Rubric

ALL STUDENT GROUP
1. Ensure teachers are clear that PIs are not "performance expectations" for a group of students but rather a single indicator of performance on the assessment.
FOCUS STUDENT GROUP
1. This function allows teachers to differentiate their instruction and assessment of various students within an SLO population.
   a. PI Targets: Focused Student Group
      i. PI Target #1 – Students who score below 2 on the Performance Measure (PM) pre-test will improve a minimum of one level on the Performance Measure post-test.
Learning Activity

Module 2-SLO Building

13
Section 4: Performance Measures
Concept

Each section of the SLO Process has a specific task and each has its own importance. However,
Section 4: Performance Measures is the most critical and probably the most challenging work to
complete. Selecting and/or developing Performance Measures that are of high quality is
essential for demonstrating student achievement of the selected content standards.
Key Points for Trainers
1. Make sure the participants understand that performance measures must allow equitable
opportunities for students to demonstrate learning.
2. Discuss the strengths and weaknesses of status (mastery) and growth metrics.
a. Status metrics have absolute standards and are easily understood; however, they do
not reflect changes (improvement) in student learning.
b. Growth metrics are sensitive to changes in learning; however, they are more
unstable and can be limited for high performing students.
3. Principles of Well-Developed Measures:
a. Be built to achieve the designed purpose
b. Produce results that are used for the intended purpose
c. Align to targeted content standards
d. Contain a balance between depth and breadth of targeted content
e. Be standardized, rigorous, and fair
f. Be sensitive to testing time and objectivity
g. Be valid and reliable
Learning Activity

Module 2-SLO Building

14
Section 4: Performance Measures
Concept
This slide provides the definitions for each part of Section 4 of the SLO Process Template. This section will take some time to complete and
teachers need to consider many aspects of the performance measure to ensure that they select/develop a performance measure that fulfills the
purpose of the assessment. Section 4 breaks out each aspect that teachers must consider when selecting/developing performance measures.
Key Points for Trainers
1. Referring teachers to the "Help Desk" Definition Guide and the Models will be useful when filling out this section of the SLO Process Template.
2. Allow teachers time to fully consider all aspects of the Performance Measures and complete Section 4.
3. 4c. Purpose Statement – states "what" the performance task is measuring, "how" the results (scores) can be used, and "why" the performance measure was developed. Here is an example statement:
   Elementary Pre-Post – Checkpoints in Mathematics are assessments intended to measure student proficiency of grade-level expectations in the sequence of the district's curriculum, including different depths of knowledge. This grade-level assessment is provided to all students in the fall and spring of each year. Item and strand-level scores are reported to educators. Scores will be used by the district, schools, and teachers to monitor growth in student achievement.
   Handout #1 – Purpose Statement Examples (found in ALS/Design/Templates) can be used to unpack the Purpose Statement process.
4. 4d. Metric – teachers must ensure that the metric used by the Performance Measure is aligned with the Performance Indicator (3a). Is the Performance Indicator determining Growth (change in student performance across two or more points in time), Mastery (attainment of a defined level of achievement), or Growth and Mastery (Mastery for All Student Groups/Growth for Focused Student Group)?
   a. Art/Grade 8 Model
      i. PI Target #1 – achieve Advanced or Proficient on all four criteria of the Mood Portrait rubric.
      ii. PI Target #2 – achieve Advanced or Proficient on all four criteria of the Demuth Oil Pastel Drawing rubric.
      iii. PI Target #3 – achieve Advanced or Proficient on all four criteria of the Clay Architectural rubric.
      All three of these PI Targets describe a "Mastery" metric – attainment of a defined level of achievement.
5. 4e. Administration Frequency – if it is a pre-post performance measure, the frequency could be noted as "at the beginning of the semester/at the end of the semester"; if it is a portfolio, the frequency could be noted as "during a six-week period"; if it is a culminating activity/event, the frequency could be noted as "once a semester."
   a. Art/Grade 8 Model
      i. Performance Measure (PM) #1 – once a semester
      ii. Performance Measure (PM) #2 – once a semester
      iii. Performance Measure (PM) #3 – once a semester
6. 4f. Adaptations/Accommodations – unique accommodations needed because of the performance measure's design.
   a. Art/Grade 8 Model
      i. IEP, Gifted IEP, ELL – Additional time out of class is offered for those who need more time to complete the projects. All other adaptations will be developed based on an IEP or specified district policy.

Learning Activity

Module 2-SLO Building

15
Section 4: Performance Measures (continued)
Concept
This slide continues with the various aspects associated with Performance Measures.
Key Points for Trainers
1. Resources and equipment are those materials needed for the assessment, not the instruction of the content. This data element is frequently misunderstood.
2. Scoring personnel (including second scorers) and the assessment developer and administrator are often the same teacher, especially for very unique performance measures. Here, the audience must be aware of and articulate ways to mitigate rater/observer bias. This can be helped by having well-defined task and scoring criteria.

Learning Activity
The next eight slides provide an overview of the assessment literacy process, a process that supports the
creation and/or review of performance measures that teachers will build and/or select to be used for the
purposes of the SLO process. The activities below are provided to create a scenario in which the
participant will want more information and subsequently recognize the purpose of the Assessment
Literacy Series Materials. (Adult Learning Theory, Robert Mager: Adults want to know why they should
learn.)
1. Ask participants to write performance measure administration instructions for a task that they frequently administer, pretending that they might ask a colleague from their content area and grade level to administer the performance measure. Follow this activity with a tour of the materials found at ALS/Build/Videos and ALS/Build/Guides, Handouts 8 & 9.
2. Ask participants to write scoring rubrics for a task that they frequently administer, pretending that they might ask a colleague from their content area and grade level to score student exemplars. Follow this activity with a tour of the materials found at ALS/Build/Videos and ALS/Build/Guides, Handout 7.

Module 2-SLO Building

16
What is “Assessment Literacy”?
Concept
Assessment Literacy means understanding the basic principles of quality assessment practices in order to assess
students effectively.
"Research suggests that teachers spend from one-quarter to one-third of their professional time on assessment-related activities. Yet almost all do so without the benefit of having learned the principles of sound assessment"
(Stiggins, 2007).
“Governmental agencies and others involved in test development activities must be held to the same high
expectations as test publishers and professional assessment companies with regard to following established
requirements, adhering to industry standards, and implementing best practices related to established requirements
and standards for products and services as test publishers and professional companies. A crucial part of this process is
to agree upon standards for Quality Assurance (QA) and Quality Management (QM). Because testing is such an
important enterprise with results that impact students, teachers, and schools, every step related to assessment
development, administration, and scoring must be clearly documented and correctly implemented.” (Research in
Action, Inc., Quality Assurance Techniques for Developing Measures of Student Achievement-Standard Operating
Procedures ‘Smart Book’, April 2011).
Key Points for Trainers
1. Focus on common understandings and misunderstandings about assessment literacy.
2. Assessment is part of the instructional process; testing is an event.
3. Assessment literacy is not an all-or-nothing concept: educators are never fully "assessment literate"; they simply develop a greater understanding of assessment concepts and procedures.

Learning Activity
Ask one of the following:
1. How is assessment different than testing?
2. Explain the relationship between curricula, instruction, and assessment.
3. What is the role of formative vs. summative assessment?
4. Which assessment techniques should be avoided in the SLO process and why?

Module 2-SLO Building

17
Assessment Life Cycle
Concept
Each step of the Assessment Life Cycle outlines the sequence required to ensure high-quality assessments that measure and validate student achievement. Details associated
with each step of the Life Cycle will be outlined further in this training module.
Key Points for Trainers
1. Stress the importance for participants to understand and follow this step-by-step
process, as it will ensure the development of high-quality assessments.
2. Ensure that teachers recognize the assessment life cycle as a continuous sequence
that allows for additional evidence and corrective action to be implemented from
administration to administration. This “process” creates a body of evidence about
the score inference by which teachers are, in part, regarded as “effective”.
3. Clarify the advantages and limitations associated with educators creating their own
customized assessments for use in the SLO process.
Learning Activity

Module 2-SLO Building

18
Principles of Well-Developed Measures
Concept
Working through the steps associated with the Assessment Life Cycle focuses test developers on the principles of well-developed measures. The Performance Measure Rubric for Teachers (found in ALS/Review/Stuff) will guide participants through the process of
selecting/developing high-quality performance measures. The rubric has 18 technical aspects related to the basic principles of quality assessment. They are organized into three (3) strands to align with the Design, Build, Review components of both the SLO Process
and the ALS Process. Using this rubric allows teachers to validate the performance measures they have determined will effectively measure student progress toward the goals identified.
Key Points for Trainers
Measures must:
1. Be built to achieve the designed purpose – "For all types of assessments, the first step is to clearly define the purpose. Teachers must specify exactly what the assessment is intended to measure, characteristics of intended test takers, types of scores to be reported, and how the information derived from the assessment will be used." (RIA "Smart Book", 2011)
   a. "If the purpose is not well defined, there is a high risk that the assessment will not satisfy fundamental measures of test validity. Critical measures of test validity examine whether the test is built to achieve its purpose(s) and whether the results are used for the intended purpose(s)." (RIA, "Smart Book", 2011)
   b. Strand 1: Design of the Performance Measure Rubric
      i. 1.1 The purpose of the performance measure is explicitly stated (who, what, why).
2. Produce the results that are used for the intended purpose – does the assessment measure what we really want to measure? Evaluate the final assessment against the intended purpose.
   a. Strand 2: Build
      i. 2.2 Items/tasks are created and reviewed in terms of: (a) alignment to the targeted standards, (b) content accuracy, (c) developmental appropriateness, (d) cognitive demand, and (e) bias, sensitivity, and fairness.
   b. Strand 3: Review
      i. 3.1 The performance measures are reviewed in terms of design fidelity:
         • Items/tasks are distributed based upon the design properties found within the specification or blueprint documents.
         • Item/task and form statistics are used to examine levels of difficulty, complexity, distracter quality, and other properties.
         • Items/tasks and forms are rigorous and free of biased, sensitive, or unfair characteristics.
3. Align to targeted content standards – "The desired attributes and characteristics of the assessment need to be identified, specified, and documented. The test design framework and blueprint provide information necessary to guide the item/task development process. This is a critical activity in establishing validity." (RIA "Smart Book", 2011)
   a. Strand 1: Design of the Performance Measure Rubric
      i. 1.2 The performance measure has targeted content standards representing a range of knowledge and skills students are expected to know and demonstrate.
      ii. 1.4 Specification tables articulate the number of items/tasks, item/task types, passage readability, and other information about the performance measure – OR – blueprints are used to align items/tasks to targeted content standards.
   b. Strand 3: Review
      i. 3.3 The performance measures are reviewed in terms of alignment characteristics:
         • Pattern consistency (within specifications and/or blueprints)
         • Matching the targeted content standards
         • Cognitive demand
         • Developmental appropriateness
4. Contain a balance between depth and breadth of targeted content – "Several considerations are taken into account during the assessment design phase. First, to satisfy accepted standards of reliability and validity, a minimum number of items and score points are required within each subtest and for the overall assessment. Second, the assessment must include items with a range of difficulty levels if the assessment's purpose is to provide information about student achievement at different levels on the performance continuum." (RIA, "Smart Book", 2011)
   a. Strand 1: Design of the Performance Measure Rubric
      i. 1.5 Items/tasks are rigorous (designed to measure a range of cognitive demands/higher-order thinking skills at developmentally appropriate levels) and of sufficient quantities to measure the depth and breadth of the targeted content standards.
5. Be standardized, rigorous, and fair – "A number of technical and editorial issues are related to item and test specifications. Obviously, crucial quality processes are needed in assessment development, such as adequate content coverage and the development of items and test forms to meet best practice requirements. The desired psychometric properties for the items, such as difficulty and discrimination, as well as desired test properties such as overall test difficulty and reliability need to be targeted and met. Detailed item specifications must include: item types, number of items, response options, difficulty levels, language load, and artwork. Test specifications guide the process of developing forms and likewise need to be detailed to address issues of content coverage, overall difficulty level, balance of items, targeted distribution of item difficulties, rules for sequencing items, total number of items, and timing." (RIA "Smart Book", 2011)
   a. Strand 1: Design of the Performance Measure Rubric
      i. 1.3 The performance measure's design is appropriate for the intended audience and reflects challenging material needed to develop higher-order thinking skills.
   b. Strand 2: Build
      i. 2.1 Items/tasks and score keys are developed using standardized procedures, including scoring rubrics for human-scored, open-ended questions (i.e., short constructed response, writing prompts, performance tasks, etc.).
      ii. 2.2 Items/tasks are created and reviewed in terms of: (a) alignment to the targeted standards, (b) content accuracy, (c) developmental appropriateness, (d) cognitive demand, and (e) bias, sensitivity, and fairness.
      iii. 2.3 Administrative guidelines are developed that contain the step-by-step procedures used to administer the performance measure in a consistent manner, including scripts to orally communicate directions to students, day and time constraints, and allowable accommodations/adaptations.
      iv. 2.4 Scoring guidelines are developed for human-scored items/tasks to promote score consistency across items/tasks and among different scorers. These guidelines articulate point values for each item/task used to combine results into an overall score.
   c. Strand 3: Review
      i. 3.2 The performance measures are reviewed in terms of editorial soundness, while ensuring consistency and accuracy of other documents (i.e., administration):
         • Identifies words, text, reading passages, and/or graphics that require copyright permission or acknowledgement
         • Applies Universal Design principles
         • Ensures linguistic demands and/or readability are developmentally appropriate
6. Be sensitive to testing time and objectivity
   a. Strand 2: Build
      i. 2.6 The total time to administer the performance measure is developmentally appropriate for the test-taker. Generally, this is 30 minutes or less for young students and up to 60 minutes per session for older students (high school).
7. Have score validity and reliability evidence – "General scoring criteria and methods for scoring both short-constructed response (SCR) and extended-constructed response (ECR) should be outlined when the guidelines for writing items are developed."
   a. Strand 2: Build
      i. 2.5 Summary scores are reported using both raw score points and performance level. Performance levels reflect the range of scores possible on the assessment and use terms or symbols to denote performance levels.
   b. Strand 3: Review
      i. 3.4 Cut scores are established for each performance level. Performance level descriptors describe the achievement continuum using content-based competencies for each assessed content area.
      ii. 3.5 As part of the assessment cycle, post-administration analyses are conducted to examine such aspects as item/task performance, scale functioning, overall score distribution, rater drift, content alignment, etc.
      iii. 3.6 The performance measure has score validity evidence that demonstrates item responses were consistent with content specifications. Data suggest the scores represent the intended construct by using an adequate sample of items/tasks within the targeted content standards. Other sources of validity evidence, such as the interrelationship of items/tasks and alignment characteristics of the performance measure, are collected.
      iv. 3.7 Reliability coefficients are reported for the performance measure, which includes estimating internal consistency. Standard errors are reported for summary scores. When applicable, other reliability statistics such as classification accuracy, rater reliabilities, and others are calculated and reviewed.
Learning Activity

Module 2-SLO Building

19
Overview of the Assessment Literacy Series (ALS) Process
Concept
Just like the Student Learning Objective Process (SLO), the Assessment Literacy Series
(ALS) Process is comprised of three (3) phases: Design, Build, and Review.
Key Points for Trainers
1. Clarify that the ALS (full model) guides the development of assessments. For existing
measures, the ALS Review section will outline the quality criteria the assessment
must adhere to, in order to be considered of sufficient quality (i.e., sufficient quality
for use in the SLO process).
2. Each phase of the process deals with specific steps within the Assessment Life Cycle.
Working through the three (3) phases (Design, Build, and Review) will provide
participants the opportunity to develop measures that meet crucial criteria for each
step and ensure the development of high-quality assessments/measures of student
achievement.
3. In many cases, educators will be using vendor-produced assessments for their SLOs.
In these instances, the Review section is most applicable.
Learning Activity

Module 2-SLO Building

20
ALS Process Components
Concept
The first phase of the ALS Process is the DESIGN Phase. During this phase, participants will decide on a Purpose Statement. The
Purpose Statement clearly defines why the assessment is being developed, what the assessment will measure, and how the results
will be used. This phase also includes the selection of specific content standards that will become the focus of the Performance
Measure.
“If the purpose is not well defined, there is a high risk that the assessment will not satisfy fundamental measures of test validity.
Critical measures of test validity examine whether the test is built to achieve its purpose(s) and whether the results are used for the
intended purpose(s). The test design, framework, and/or blueprint are specifically created to maximize the likelihood of meeting
acceptable measures of test validity. Therefore, these components cannot be fully developed until the assessment’s purpose is
clear. Assessment developers need to know the specific purpose(s) of the assessment to make sure the items and forms are
constructed to meet the intended use.” (RIA, Smart Book, April 2011)
Key Points for Trainers
1. Have participants review content standards and/or national/professional organization standards and identify/select a set of standards that will become the focus of the Performance Measure (PM). Alignment to the Goal Statement of the SLO should be addressed.
2. Allow time for discussion of why the assessment is being developed, what the assessment is measuring, and how the results will be used. Participants should reach consensus on these key aspects.

Learning Activity
1. Questions to consider for focusing this discussion include:
   a. Which content areas will be focused upon within the various content standards?
   b. What is the assessment intended to measure?
   c. What grade levels?
   d. What are the developmental needs of the students?
   e. How will the results be used?
   f. When will the measure be administered?
2. When the purpose statement is complete, review and refine as necessary.
   a. Does the purpose statement address the why, what, and how of the measure?
   b. Is the statement clear and concise?
   c. Is the purpose aligned with targeted content standards?
   d. Does it reflect how the results will be used?

Module 2-SLO Building

21
ALS Process Components (continued)
Concept
The purpose statement provides the foundation for developing the test specifications and blueprints.
These specifications and blueprints will then provide the necessary information for building the specific
test items. Specifications and blueprints are tools that guide teachers when creating/selecting the items
for a particular measure.
Creating the Purpose Statement and designing the Specification and Blueprints are completed during the
Design Phase (Step One) of the Assessment Literacy Series Process. This becomes the foundation for the
Building Phase (Step Two) which begins with the Item Development.
Key Points for Trainers
1. Keep participants focused on the Purpose Statement they developed and the targeted standards chosen. Specifications and blueprints will outline the following:
   a. Targeted Content Standards (found at pdesas.org)
   b. Depth of Knowledge (DoK) – level of difficulty (DoK charts can be found in ALS/Build/Stuff and ALS/Build/Guides; Handout #4)
   c. Item type – multiple choice, extended response, performance indicator, fill-in-the-blank
   d. Number of items

Learning Activity
1. When working through the specifications and blueprints, keep in mind the following (a minimal illustrative sketch of a specification table follows this list):
   a. Are there a sufficient number of items for each targeted content standard identified?
   b. Is there a developmentally appropriate distribution of DoK levels?
   c. Were different item types chosen and the appropriate weights assigned?
   d. Was consideration given to the time burden of the assessment for both the teacher and the student?
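To make these blueprint checks concrete, here is a minimal sketch in Python, offered purely as an illustration rather than an official PDE tool. The item counts, DoK assignments, and item types are hypothetical; the standards codes are borrowed from the Grade 3 mathematics example on Slide 10 only to keep the sketch self-consistent.

```python
# Illustrative sketch only: a hypothetical specification table and two quick
# checks mirroring the questions above (items per standard, DoK distribution).
from collections import Counter

# Each planned item: (targeted standard, DoK level, item type). Values are hypothetical.
blueprint = [
    ("CC2.4.3.A.4", 1, "multiple choice"),
    ("CC2.4.3.A.4", 2, "multiple choice"),
    ("CC2.4.3.A.4", 2, "constructed response"),
    ("M3.E.1.1.1", 2, "multiple choice"),
    ("M3.E.1.1.1", 3, "performance task"),
]

items_per_standard = Counter(standard for standard, _, _ in blueprint)
dok_distribution = Counter(dok for _, dok, _ in blueprint)

print(items_per_standard)  # Are there enough items for each targeted standard?
print(dok_distribution)    # Is the spread of DoK levels developmentally appropriate?
```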

Module 2-SLO Building

22
ALS Process Components (continued)
Concept
Once the Specifications and Blueprints are complete, teachers move into the Build Phase of the ALS Process. This Phase includes the
following steps:
a. Development of specific test items – aligned to specifications and blueprints
b. Scoring sheets – provide answers for multiple choice
   i. Pros of multiple choice – easy to administer; objective scoring
   ii. Cons of multiple choice – guessing and a single answer (no information about the process used by students)
c. Scoring rubrics – required for any constructed-response and/or performance task
   i. Pros of constructed responses and performance tasks – allow for partial credit, provide more details about cognitive processes, provide opportunities to evaluate multiple skills, reduce the likelihood of guessing, and create more information to guide instruction
   ii. Cons of constructed responses and performance tasks – greater subjectivity; longer time to administer, respond, and score
d. Operational forms – the test form itself
Key Points for Trainers
1. Teachers will select and/or develop the items, any reading passage, scenario/performance task, and answer options.
2. Make sure the item sample represents the "big idea" noted in the Goal Statement.
3. Make sure the items are measuring the skills and knowledge associated with the targeted content standards.
4. Make sure consideration is given to the pros/cons of each item type before selecting an item type.
5. Establish the DoK of each item.
6. Develop scoring rubrics where applicable.
   a. Create different levels of performance in the rubric's descriptors.
   b. Make scoring as clear as possible.
   c. Provide an example of a fully complete/correct response along with examples of partially correct responses.

Learning Activity
1. Teachers should also consider the following:
   a. Does the item type distribution match the specification table (found in ALS/Design/Templates; Template #3)?
   b. Are there sufficient items to sample the targeted standards (found in ALS/Design/Templates; Template #2)?
   c. Are the items developmentally appropriate?
   d. Is the correct answer and/or expected response clearly identified?
   e. Does the operational form (test) place items in a logical order (found in ALS/Design/Templates; Template #2)?
   f. Is the distribution of item difficulty random throughout the form?
   g. Did you ensure that items do not "key" (give clues) to one another?
   h. Are the directions clear and concise for the students?
   i. Did you make sure that items do not drift from one page to another?

Module 2-SLO Building

23
ALS Process Components (continued)
Concept
The last step during the BUILD Phase is to create the Administrative Guidelines for the selected measure. These
guidelines must provide sufficient detail so that testing conditions are comparable.
The Quality Assurance and Form Reviews are conducted during the third and final phase of the ALS Process: the
REVIEW Phase. More details about the REVIEW Phase are outlined in Training Module Three.
Key Points for Trainers
1. Teachers must create:
   a. Directions for the students (test-takers).
   b. Directions for teachers (test administrators).
   c. Steps needed to prepare and administer the measure.
   d. Materials needed to administer the measure.
   e. How to collect the results.
   f. Procedures for completed assessments.
   g. Conduct a rigor check (using the Screening Tool).
      i. Is it developmentally appropriate?
      ii. Is each item assigned to a targeted content standard?
      iii. Is each item assigned the correct cognitive level?
      iv. Have items been reviewed for sensitivity, bias, and fairness?
   h. Examine the quality of teacher-made performance measures (using the Performance Measure Rubric for Teachers).
      i. Review information, data, and documents associated with the design, development, and review of the selected performance measure.
      ii. Assign a value for each aspect within a particular strand.
      iii. Reference supporting information associated with each assigned rating.
      iv. Add any additional notations and/or comments that articulate any important nuances of the performance measure.

Learning Activity

Module 2-SLO Building

24
Section 5: Teacher Expectations
Concept
As part of the SLO Process, educators will be required to set expected levels of student learning based on
the Performance Indicators and their Performance Measures. The levels are identified as “Distinguished,
Proficient, Needs Improvement, and Failing.” This Teacher Expectation Rating will determine the overall
SLO rating which will be used as the Elective Rating on the Pennsylvania Educator Effectiveness Rating
Tool.
Key Points for Trainers
1. Participants learning the SLO process often confuse and intertwine the Performance Indicator Statement (3a) with the Teacher Expectation Statement (5a), especially when the Performance Indicator statement is described in percentages.
   Examples:
   Performance Indicator Statement: achieves at minimum an 80% on the post-test.
   Teacher Expectation Statement: 80% of students meet the performance indicator (implies that 80% of students achieve at minimum an 80% on the test).
2. The SLO Process allows for a great deal of control and flexibility over the performance measures and the expected levels of growth and/or mastery associated with each measure that will be used to fulfill the requirements of the Elective portion of teacher evaluation.
3. These levels are established by educators prior to the evaluation period, and each performance level is populated with a percentage range such that 0% to 100% meeting expectations is distributed among the levels.
4. The Elective Rating for teachers is not completed until after performance data are collected, reviewed, and evaluated against each Performance Indicator.

Learning Activity

Module 2-SLO Building

25
Section 5: Teacher Expectations (continued)
Concept
This slide provides one (1) example of how Teacher Expectations levels can be determined.
Key Points for Trainers
1. Once the performance data has been collected, reviewed, and evaluated against each Performance Indicator, the number of students meeting expectations can be charted and totaled. An overall percentage can then be determined by a simple division problem: the total number of students meeting expectations divided by the total number of students in the SLO population across all indicators. (A minimal worked sketch follows this list.)
2. The same student may be included across all indicators (which is demonstrated in the slide); however, some variation in student counts across time will exist.
3. Remember, Teacher Expectations and Performance Indicators are two different things:
   a. Performance Indicators are the descriptions of the expected level of achievement for each student in the SLO population on a particular Performance Measure. – Physical Education/3rd grade Model
   b. Teacher Expectations are the four levels of projected performance regarding the PI, reflecting a continuum established by the educator prior to the evaluation period. Each performance level (i.e., Failing, Needs Improvement, Proficient, and Distinguished) is populated with a percentage range such that 0% to 100% meeting expectations is distributed among the levels. – Physical Education/3rd grade Model
      i. Failing – 0% to 60% of students will meet the PI targets.
      ii. Needs Improvement – 61% to 84% of students will meet the PI targets.
      iii. Proficient – 85% to 94% of students will meet the PI targets.
      iv. Distinguished – 95% to 100% of students will meet the PI targets.
4. Another problematic area is where to set the values among the different performance levels. A "trade-off" exists between the rigor of the performance targets and the expectation of teachers that students will meet those standards.
   a. In looking at the Physical Education – 3rd grade Model, the different PI Targets for the focused student group are reflective of the "trade-off" between the rigor of the performance targets and the expectation of teachers that students will meet those standards.
   b. Because the PI Targets are based on both Mastery and Growth, the teacher expectations are different for different students.
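To make the division in Key Point 1 and the rating bands above concrete, here is a minimal worked sketch in Python, offered purely as an illustration and not as part of the SLO templates. The student counts are hypothetical; the percentage bands are those listed for the Physical Education/3rd grade Model above, and educators set their own bands before instruction begins.

```python
# Illustrative sketch only: hypothetical counts, with the Teacher Expectation
# bands taken from the Physical Education/3rd grade Model listed above.

def overall_percentage(met_by_indicator, population_by_indicator):
    """Total students meeting expectations divided by the total SLO population,
    summed across all Performance Indicators."""
    return 100.0 * sum(met_by_indicator) / sum(population_by_indicator)

def expectation_level(pct):
    """Map an overall percentage to the four Teacher Expectation levels."""
    if pct >= 95:
        return "Distinguished"
    if pct >= 85:
        return "Proficient"
    if pct >= 61:
        return "Needs Improvement"
    return "Failing"

# Hypothetical example: three Performance Indicators, 25 students counted per indicator.
met = [22, 20, 23]          # students meeting each PI target
population = [25, 25, 25]   # SLO population counted for each indicator

pct = overall_percentage(met, population)
print(f"{pct:.1f}% -> {expectation_level(pct)}")  # 86.7% -> Proficient
```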

Learning Activity
1. Using the Physical Education – 3rd grade Model, have participants create student results and from that data determine Section 5: Teacher Expectations.

Module 2-SLO Building

26
Section 5: Teacher Expectations (continued)
Concept
This slide provides the definitions for each part in Section 5 of the SLO Process Template.
Key Points for Trainers
1. Even though this chart clarifies each part of Section 5, please refer to the "Help Desk" definition guide (found in SLO/Build/Stuff), as it provides the correct formatting along with examples. It is important that each section is filled out correctly.
2. One thing that needs to be considered is what data will be used.
   a. In the Physical Education – 3rd grade Model, the teachers chose to use data from teacher-developed measures.
      i. It is not clear how many teacher-developed measures are expected to be used in determining if the PI Targets have been met by students in the SLO population. Teachers will need to be clear on what and how many teacher-developed measures were used and how the results were combined to determine the percentages of students who met the PI Targets.
   b. In the Art – 8th grade Model, teachers chose to use data from District-designed Measures and Examinations and Student Projects.
      i. Again, teachers will be gathering information from different performance measures. Will the same weight be given to each measure? Will one measure be weighted more than the other?
3. Another factor to consider is how the data is aggregated (how it is gathered and summarized).
   a. The Physical Education – 3rd grade Model gathers data based on the growth of some students and the proficiency levels of other students. This can make the process a little more complicated when trying to determine and calculate teacher elective ratings.
   b. In the Art – 8th grade Model, the data is gathered based on mastery level (a defined level of achievement). This makes determining a teacher elective rating somewhat easier.
4. Element 5b is not determined until after performance data are collected, reviewed, evaluated, and reported.
5. Element 5b also has a section for Notes/Explanations – this provides an opportunity for teachers to offer additional information related to student outcomes.
   a. Description of the anticipated outcomes vs. the actual outcomes.
   b. In-depth analysis of the data that will provide goals for future implementation and improvement of student achievement through the SLO.
   c. Recommendations as to how analysis of the achievement data will inform future teaching practice as defined by Danielson's Framework for Teaching.
   d. Recommendations for further SLO development to support student achievement of standards in this class/course/content area.

Learning Activity

Module 2-SLO Building

27
Summary & Next Steps
Concept

During this Module 2: Building, teachers will have worked through the development of items,
scoring keys & rubrics, operational forms (tests), and administrative guidelines for each
Performance Measure. Completing the following templates provides guidance through this
phase of the SLO process.
• Template #4-SLO Process Template (found in SLO/Build/Templates)
• Template #5-Performance Task Framework (found in SLO/Build/Templates)
Using the “Help Desk” Definition Guide (found in SLO/Build/Stuff) will assist in the completion of
these templates.
The next and final step of the SLO Process is outlined in Training Module 3: REVIEW. During this
module, educators will conduct an extensive quality assurance review.
Key Points for Trainers
1. Completed Templates #4 and #5 (if applicable) will be needed in order to move ahead to
Module 3: Review, where educators will review the completed SLO and the Performance
Measures associated with each Performance Indicator to ensure that they meet all the
criteria for high-quality SLOs and Performance Measures.
Learning Activity
1. Make sure participants have completed the above-mentioned Templates before leaving
Module 2: Building.
2. Answer any questions related to the Build Phase of the SLO Process.

Module 2-SLO Building

28


M2-Building-SLOs-13NOV13

  • 1. The Student Learning Objective (SLO) Process is comprised of three (3) components: Design, Build, and Review. Student Learning Objectives provide indicator of teacher effectiveness through student performance outcomes based on standards. Welcome to Training Module 2: “Building” Concept Within the Building Module, the trainer will guide participants to create Student Learning Objectives and performance measures that will guide instruction and provide evidence of student mastery or growth. During this phase, the participants will complete the development of SLOs and the Performance Measures associated with each goal. Key Points for Trainers 1. 2. Orient the user to where in the process the “building” activities take place. a. After teachers have completed the Design Phase (which includes the development of a goal statement, identifying underlying content standards, and creating a blueprint) they will move from this thinking/designing stage to actually “building” the Performance Measures needed to support the SLO process. Articulate that using the SLO Process Template requires an initial step of i) reviewing the data definitions, ii) referencing models, and iii) examining the structure of the SLO Template. a. During this module, participants should review the following: 1) SLO Process Template-referenced in Slide 6 (template found in SLO/Build/Templates), 2) The Art and Physical Education Sample Models-referenced in Slide 7 (models found in SLO/Build/Guides), and 3) The following definitions: i. PDE’s SLO Definition: A process to document a measure of educator effectiveness based on student achievement of content standards. ii. iii. iv. v. 3. (Orientation Module Slide 8) Goal Statement: Narrative articulating the “big idea” upon which the SLO is based. (Orientation Module Slide 9) Standards: Targeted content standards used in developing SLOs, which are the foundation of performance measures. (Orientation Module Slide 9) Assessment Literacy: Technical and operational understanding of the assessment “life cycle”, including the critical examination of statistic evidence. (Orientation Module Slide 9) Rationale statement: Narrative providing reasons why the Goal Statement and the aligned standards address important learning. (Orientation Module Slide 10) vi. Blueprint: Visual depiction of the relationship among key SLO components. (Orientation Module Slide 10) vii. Performance Indicator: Statement of the expected level of achievement for each student in the SLO population. (Orientation Module Slide 10) viii. Performance Measures: The various tools/assessments used to measure student achievement of a specific goal. Clarify completion of SLO Process Template, including the development and/or selection of performance measures aligned to the selected content standards. Learning Activity ________________________________________________________________________________________________________________________________________________ Technical Notes “Structure” 1. Concept – “What is this slide telling the audience?” 2. Key Points – “What/Where are the details ‘needed for teaching’?” 3. Learning Activity – “How can the participant’s learning be enhanced?” (This item will not be populated for every slide.) Module 2-SLO Building 1
  • 2. Goal & Objectives Concept The Goal of the Build Module is teachers will create SLOs and identify and/or develop Performance Measures for each Performance Indicator that can be used in guiding instruction and determining student mastery and/or growth as part of Pennsylvania’s Educator Effectiveness System. SLO is the process through which the Elective Data portion in the PA evaluation system is developed. Key Points for Trainers 1. Explain the relationship between the SLO Process Template and the “Help Desk” definitions. (found at SLO/Build/Other Stuff) a. The “Help Desk” definitions define each element within the SLO Process Template. When using the online SLO Process Template, the “Help Desk” document also displays the correct format for each section and element as well as providing an example. 2. Ensure the audience recognizes the performance measures are developed or selected during the “Build” phase. a. This is the time where teachers will take their fully developed blueprints and identify and/or develop the required performance measures for each SLO. Learning Activity Module 2-SLO Building 2
  • 3. Helpful Tools Concept As you work through the Build Module, you will need to access both Guides and Templates. These Guides and Templates will provide focus for the work and help ensure the development of complete SLOs and Performance Measures. All SLO training materials can be found at both the PDE’s SAS portal and Research in Action’s Homeroom learning platform. Go to http://www.pdesas.org. Once there you can login to the Homeroom learning portal. Key Points for Trainers 1. Differentiate how tools can assist in the building task: a. Guides have materials such as handouts, rules of thumb, model SLOs, etc. that reinforce content presented in the videos. In this module, models are provided to guide new users in what a completed SLO, including developed and/or selected performance measures, looks like. b. Templates are used to complete each phase of the process. The key template is SLO Process Template. This template is provided as an online tool. Template #4 is a replication of SLO Process Template, provided as a downloadable Word document. Template #5, the Performance Task Framework Template, is provided here as a Word document for performance tasks that may be culminating events over multiple days (e.g., student projects) or other types of teacher and district developed assessments. Additional development templates are found within the Assessment Literacy Series. c. During the Build Phase, teachers will review the following templates: i. Template #1 – Goal Statement ii. Template #2 – Targeted Content Standards iii. Template #3 – SLO Blueprint d. During the Build Phase, teachers will also use and complete the following templates: i. Template #4 – SLO Process Template (must be completed in this phase) ii. Template #5 – used as a resource when creating performance measures that are culminating events or accumulation of students’ work products. (Recommended for teacher or district developed assessments.) Learning Activity 1. Have participants go to www.pdesas.org to review the resources that are available on the SAS portal and the Homeroom learning portal. Module 2-SLO Building 3
  • 4. Outline of the Build Module Concept This figure is designed to help visualize the various components associated with the “building” phase of the SLO Process. Key Points for Trainers 1. 2. Apply this “learning map” into the beginning of the presentation to provide an outline of both the “building” and SLO process, including some techniques (steps) to begin the work, including the presentation, refinement, and review of created SLOs. SLO Process Components a. BUILD: This component is the “action” step in the process that focuses on completing the SLO Process template and creating and/or selecting performance measures. b. Preview the SLO Process Template (Template #4) and “Help Desk” Definitions. These will provide guidance around the classroom context, SLO goal, Performance Indicators, Performance Measures, and Teacher Expectations. i. Performance Indicators – Description of the expected level of achievement for each student in the SLO population. ii. Performance Measures – The various tools/assessments which will be used to measure student achievement toward a specific goal. iii. Teacher Expectations – As part of the SLO process and the Educator Effectiveness Rating Tool, teachers will be required to set four (4) levels that describe the number of students expected to meet the targets listed for each Performance Indicator. These levels include: Distinguished, Proficient, Needs Improvement, and Failing. Educators determine the percentages within each rating prior to the beginning of instruction. These performance ratings will be examined at the end of the evaluation period and be used to determine the Elective Rating that is applied to the overall final rating for each teacher. Learning Activity *Process Template #4 Module 2-SLO Building 4
  • 5. SLO Template Preview Concept This section of the Module will provide a preview of the materials used in developing SLOs. Key Points for Trainers 1. Make sure the participants have the Templates and Models needed for working through the Building phase of the SLO process. They will need the following: • Template #4 – SLO Process Template (SLO/Build/Templates) • Template #5 – Performance Task Framework (SLO/Build/Templates) • Help Desk Definitions (SLO/Build/Other Stuff) • Art Model (SLO/Build/Guides) • Physical Education Model (SLO/Build/Guides) • Completed Template #1 – Goal Statement (SLO/Design/Templates) • Completed Template #2 – Targeted Content Standards (SLO/Design/Templates) • Completed Template #3 – SLO Blueprint (SLO/Design/Templates) Learning Activity 1. Allow time for participants to gather the above mentioned Templates, Models, and Guides from both their previous work during the Design phase and the Homeroom learning portal link found on the SAS Portal. Module 2-SLO Building 5
  • 6. SLO Template Preview Concept To begin the Build phase of the SLO, it is important to review some previously completed documents. Make sure that teachers review the Goal Statement (Template #1), the Targeted Standards (Template #2), and their completed SLO Blueprint (Template #3). This information is needed because the SLOs are based upon the targeted standards chosen during the Design phase of the process. Key Points for Trainers 1. Orient the user to the SLO Process Template: The Template’s front page is focused on the demographics, goals, and performance indicators; the back page is the performance measure(s) used to measure student achievement, along with articulating teacher expectations. a. Identify what data (demographics) (performance indicators) (performance scores) is needed to complete the template. b. Identify the group of students the SLO will be based upon, including any “focused” target group. 2. Ensure the definitions for each data element (i.e., each data field within the template) are understood with focus on the technical definitions and the examples provided. The SLO Process Template reflects the entirety of the SLO Process: Design (think about the goals and targeted standards and indicators), Build (identify/build performance measures for each performance indicator selected), Review (refine/edit/improve completed SLOs). Learning Activity Module 2-SLO Building 6
  • 7. SLO Process Template Concept Model #1 Grade 8 Art and Model #2 Grade 3 Physical Education provide participants with examples of what a completed SLO Process Template entails. These SLO Models and Performance Measure Templates were developed by PA educators as demonstrations, NOT exemplars. They provide participants with concrete examples of each phase of the SLO process: Design, Build, and Review. Key Points for Trainers 1. 2. 3. Review Handout #1 to link the conceptual framework of a three phase process for SLOs with the operational structure of the material. Ensure the audience understands that this is a modular design. This means that, given work and experience with SLO, individuals can engage in the work at any phase, not necessarily at the “implied” beginning [Design]. Reinforce, via model review, how these models reflect the three C’s of quality: a. Completeness: Template is “completed” according to pre-established business rules. b. Comprehensiveness: The performance measures are “comprehensive” assessments of the targeted content standards. c. Coherency: The SLO’s focus on a “Big Idea” within the PA Standard is aligned with an integrated set of standards, performance indicators and performance measures. Learning Activity 1. Using the Art and Physical Education Models, have participants review and identify the various sections of the completed SLO. (These sections will be unpacked in upcoming slides, most specifically 8-16 and 25-27. Information to support section 4, quality performance measures, is unpacked in slides 17-24. ) Module 2-SLO Building 7
  • 8. Section 1: Classroom Context Concept Section 1 of the SLO Process Template contains basic information and provides focus for the work. Key Points for Trainers 1. Make sure that teachers are using the “Help Desk” Definitions as a guide. This document (found in SLO/Build/Stuff) provides the format and expectations when filling out each section. It is important that the SLO Template is filled out correctly. 2. Emphasize, especially for high school teachers, that the SLO is subject/content specific. This means that the educator teaching multiple subjects, (e.g., Algebra I, Geometry, Integrated Math III) will identify one subject, and then include all as a sample of students enrolled in that particular subject/course. Learning Activity 1. Using Template #4 – SLO Process Template and the “Help Desk” Definitions, guide participants through filling out Section 1 of the SLO Process Template. Module 2-SLO Building 8
  • 9. Section 1: Classroom Context Concept This slide provides the definitions for each part in Section 1 of the SLO Process Template. Completely filling out this section of the template first will provide context applicable to working on the specific performance measures that will be used for each SLO. Key Points for Trainers 1. Even though this chart is here and clarifies each part of Section 1, please make sure that teachers refer to the “Help Desk” definition document (found in SLO/Build/Stuff) as it provides the correct formatting along with examples. It is important that each section is filled out properly. 2. Example (as found in SLO Model 1, Art; SLO/Build/Guides): a. 1d. Class/Course Title – Full Name(s) - Art b. 1e. Grade Level – Numeric Values/Text – 8 c. 1f. Total # of students – Numeric values only – 100 d. 1g. Typical Class Size – Numeric values only – 25-30 e. 1h. Class Frequency – (# of sessions) per (week, 6 day cycle) for (year, semester, 35 day rotation) equaling a total of (#) sessions – daily for one quarter (42 sessions) f. 1i. Typical Class Duration – Numeric values only – 45 minutes Learning Activity Module 2-SLO Building 9
  • 10. Section 2: SLO Goal Concept Referring to Template #1 and Template #2 from the SLO Design module, teachers will be able to refresh themselves about the Goal Statement, Targeted Standards, and the Rationale developed during the Design phase. Using these templates will help teachers complete Section 2 of the SLO Process Template. Key Points for Trainers 1. 2. 3. 4. Remind participants that the Goal Statement is integral to the development of an SLO. It is the narrative that articulates the “big idea” upon which the SLO is based. This statement must be aligned with PA standards. Including national or professional standards is acceptable, but not as a replacement for PA standards. Targeted Standards are those that have been selected for use with the performance measures that will be developed. These selected standards should represent the “big ideas” within the content area. Refer participants to the Curriculum Framework for the Course/Content area for which the SLO is being written. Example from the SAS Portal – Curriculum Framework: Mathematics – 3rd grade a. Big Idea – The likelihood of an event occurring can be described numerically and used to make predictions. b. Essential Question – How can using graphs help us to solve problems and describe data we collect? c. Concepts – Graphical displays of data: Frequency tables, bar graphs, picture graphs, line plots d. Competencies – Construct and analyze frequency tables, bar graphs, picture graphs, and line plots and use them to describe data and solve problems. e. Standards/Eligible Content – 2.6.3.A, 2.6.3.B, 2.6.3.C, 2.6.3.D, 2.6.3.E, 2.7.3.D, 2.8.3.F, CC2.4.3.A.4, M3.E.1.1.1, M3.E.1.1.2, M3.E.1.2.1,M3.E.1.2.2 Teachers can cut and paste the language that has already been created by the Pennsylvania Department of Education when developing goal statements. All the information is easily accessible on the SAS Portal. Learning Activity Module 2-SLO Building 10
  • 11. Section 2: SLO Goal Concept This slide provides the definitions for each part of Section 2 of the SLO Process Template. This section can be filled out using the information from Templates #1 (Goal Statement) and #2 (Targeted Standards). Key Points for Trainers 1. 2. Remind teachers that they should use the “Help Desk” Definition document (found in SLO/Build/Stuff) when filling each Section out as it provides the correct formatting as well as example statements. Teachers can also refer to the Art and Physical Education Models that were completed by PA educators. Step-by-Step process to select “Big Idea” from SAS Portal: a. Go to www.pdesas.org b. Select Curriculum Framework c. Select Subject Area/Grade Level from the drop-down screen. d. Click Search e. “Big Ideas” are listed f. Choose a “Big Idea” g. Click on a specific “Big Idea” h. Complete framework for the “Big Idea” provides teachers with Essential Questions, Concepts, Competencies, and Standards/Eligible Content available for that particular “Big Idea/(Enduring Understanding).” i. Click on specific standard to drill down to materials, resources, and assessments available to specific content standard. Learning Activity 1. Have participants work with the curriculum framework element on the SAS Portal. Choose a particular subject area and grade level. Locate the various statements related to the specific subject and grade level. (Big Idea, Essential Question, Concepts, Competencies, and Standards/Eligible Content). Module 2-SLO Building 11
  • 12. Section 3: Performance Indicators Concept Performance indicators are a description of the expected level of achievement on each measure used in the SLO. An understanding of the scoring tool used to describe student achievement for any given performance measure is necessary to write a performance indicator statement. Sample performance indicator statements, as found in the “Help Desk”, are listed below: Physics (1) Roller Coaster Energy Project (2) achieve 6 out of 9 using the roller coaster project rubric. US History (1) US History Final Exam (2) achieve an 85% or higher on the final exam. 5th Grade ELA (1) DRA (2) Using the DRA text gradient chart, one year of reading growth. Referring to SLO Design/Template #3 (SLO Blueprint), teachers can begin to fill out Section 3: Performance Indicators of the SLO Process Template. When teachers filled out the SLO Blueprint, they identified the targets for each Performance Measure selected and/or to be developed. Key Points for Trainers 1. 2. 3. Remind teachers that Performance Indicators offer a great deal of flexibility in the system. Performance Indicators can be linked – meaning a student must meet a specific achievement level across two or more Performance Measures in order to meet the standard. Also, Performance Indicators can be weighted if there is more than one (1) Performance Indicator. Performance Indicators should be specific, measurable, and ambitious, but attainable. The “Help Desk” definition document provides further details and examples of linked and weighted Performance Indicators. Module 2-SLO Building 12
  • 14. Section 3: Performance Indicators Concept Again, this slide provides the definitions for each part of Section 3 of the SLO Process Template. However, referring to the “Help Desk” Definition Guide will be very useful toward completing this section. [Remind teachers that each Section of the SLO Template builds upon each other. You will not be able to fill out Section 3 if you have not completed the work in order to fill out Section 2.] Key Points for Trainers Participants learning the SLO process often confuse and intertwine the Performance Indicator statement (3a) with the Teacher Expectation statement (5a), especially when the Performance Indicator statement is described in percentages (i.e., achieves an 80% on a test). • • • Right: scoring 4 out of 5 on the “my awesome project rubric” Right: achieving 80% on a final exam Wrong: 80% of the students in the sample will score a 4 on the PM Rubric ALL STUDENT GROUP 1. Ensure teachers are clear that PIs are not “performance expectations” for a group of students but rather as a single indicator of a performance on the assessment. FOCUS STUDENT GROUP 1. This function allows teachers to differentiate their instruction and assessment of various students within a SLO population. a. PI Targets: Focused Student Group i. PI Target #1 – Students who score below 2 on Performance Measure (PM) pre-test will improve a minimum of one level on Performance Measure post-test. Learning Activity Module 2-SLO Building 13
  • 15. Section 4: Performance Measures Concept Each section of the SLO Process has a specific task and each has its own importance. However, Section 4: Performance Measures is the most critical and probably the most challenging work to complete. Selecting and/or developing Performance Measures that are of high-quality is essential for demonstrating student achievement of the selected content standards. Key Points for Trainers 1. Make sure the participants understand that performance measures must allow equitable opportunities for students to demonstrate learning. 2. Discuss the strengths and weaknesses of status (mastery) and growth metrics. a. Status metrics have absolute standards and are easily understood; however, they do not reflect changes (improvement) in student learning. b. Growth metrics are sensitive to changes in learning; however, they are more unstable and can be limited for high performing students. 3. Principles of Well-Developed Measures: a. Be built to achieve the designed purpose b. Produce results that are used for the intended purpose c. Align to targeted content standards d. Contain a balance between depth and breadth of targeted content e. Be standardized, rigorous, and fair f. Be sensitive to testing time and objectivity g. Be valid and reliable Learning Activity Module 2-SLO Building 14
  • 16. Section 4: Performance Measures Concept This slide provides the definitions for each part of Section 4 of the SLO Process Template. This section will take some time to complete and teachers need to consider many aspects of the performance measure to ensure that they select/develop a performance measure that fulfills the purpose of the assessment. Section 4 breaks out each aspect that teachers must consider when selecting/developing performance measures. Key Points for Trainers 1. 2. 3. Referring teachers to the “Help Desk” Definition Guide and the Models will be useful toward filling out this section of the SLO Process Template. Allow teachers time to fully consider all aspects of the Performance Measures and complete Section 4. 4c. Purpose Statement: states “what” the performance task is measuring, “how” the results (scores) can be used, and “why” the performance measure was developed. Here is an example statement: Elementary Pre-Post – Checkpoints in Mathematics are assessments intended to measure student proficiency of grade-level expectations in the sequence of the district’s curriculum, including different depths of knowledge. This grade-level assessment is provided to all students in the fall and spring of each year. Item and strand-level scores are reported to educators. Scores will be used by the district, schools, and teachers to monitor growth in student achievement. Handout #1-Purpose Statement Examples (found in ALS/Design/Templates) can be used to unpack the Purpose Statement process. 4 “4d. Metric – teachers must ensure that the metric used by the Performance Measure is aligned with the Performance Indicator 3a. Is the Performance Indicator determining Growth (change in student performance across two or more points in time), Mastery (attainment of a defined level of achievement), or Growth and Mastery (Mastery for All Student Groups/Growth for Focused Student Group). a. Art/Grade 8 Model i. PI Target #1 – achieve Advanced or Proficient on all four criteria of the Mood Portrait rubric ii. PI Target #2 – achieve Advanced or Proficient on all four criteria of the Demuth Oil Pastel Drawing rubric. iii. PI Target #3 – achieve Advanced or Proficient on all four criteria of the Clay Architectural rubric. i. All three of these PI Targets describe a “Mastery” metric – an attainment of a defined level of achievement. 4e. Administration Frequency – if it is a pre-post performance measure – the frequency could be noted as “at the beginning of the semester/at the end of the semester; if it is a portfolio – the frequency could be noted as “during a six-week period”; if it’s a culminating activity/event – the frequency could be noted as “once a semester.” a. Art/Grade 8 Model i. Performance Measure (PM) #1 – once a semester ii. Performance Measure (PM) #2 – once a semester iii. Performance Measure (PM) #3 – once a semester 4f. Adaptation/Accommodations – unique accommodation needed because of the performance measure’s design. a. Art/Grade 8 Model i. IEP, Gifted IEP, ELL – Additional time out of class is offered for those who need more time to complete the projects. All other adaptations will be developed based on an IEP or specified district policy. Learning Activity Module 2-SLO Building 15
  • 17. Section 4: Performance Measures (continued) Concept This slide continues with the various aspects associated with Performance Measures. Key Points for Trainers 1. 2. Resources and equipment are those materials needed for the assessment, not the instruction of the content. This data element is frequently misunderstood. Scoring personnel (including second scorers) and the assessment developer and administrator are often the same teacher, especially for very unique performance measures. Here, the audience must be aware and articulate ways to mitigate rater/observer bias. This can be helped by having well defined talk and scoring criteria. Learning Activity The next eight slides provide an overview of the assessment literacy process, a process that supports the creation and/or review of performance measures that teachers will build and/or select to be used for the purposes of the SLO process. The activities below are provided to create a scenario in which the participant will want more information and subsequently recognize the purpose of the Assessment Literacy Series Materials. (Adult Learning Theory, Robert Mager: Adults want to know why they should learn.) 1. Ask participants to write performance measure administration instructions for a task that they frequently administer, pretending that they might ask a colleague from their content area and grade level to administer the performance measure. Follow this activity with a tour of the materials found at ALS/Build/Videos and ALS/Build/Guides, Handouts 8 & 9. 2. Ask participants to write scoring rubrics for a task that they frequently administer, pretending that they might ask a colleague from their content area and grade level to score student exemplars. Follow this activity with a tour of the materials found at ALS/Build/Videos and ALS/Build/Guides, Handout 7. Module 2-SLO Building 16
  • 18. What is “Assessment Literacy”? Concept Assessment Literacy means understanding the basic principles of quality assessment practices in order to assess students effectively. “Research suggests that teachers spend from one-quarter to one-third of their professional time on assessmentrelated activities. Yet almost all do so without the benefit of having learning the principles of sound assessment” (Stiggins, 2007). “Governmental agencies and others involved in test development activities must be held to the same high expectations as test publishers and professional assessment companies with regard to following established requirements, adhering to industry standards, and implementing best practices related to established requirements and standards for products and services as test publishers and professional companies. A crucial part of this process is to agree upon standards for Quality Assurance (QA) and Quality Management (QM). Because testing is such an important enterprise with results that impact students, teachers, and schools, every step related to assessment development, administration, and scoring must be clearly documented and correctly implemented.” (Research in Action, Inc., Quality Assurance Techniques for Developing Measures of Student Achievement-Standard Operating Procedures ‘Smart Book’, April 2011). Key Points for Trainers 1. 2. 3. Focus on common understanding and misunderstanding about assessment literacy. Assessment is part of the instructional process; Testing is an event. Assessment literacy is a unitary concept; meaning that educators are never “assessment literate”, they simply have a greater understanding of assessment concepts and procedures. Learning Activity Ask one of the following: 1. How is assessment different than testing? 2. Explain the relationship between curricula, instruction, and assessment. 3. What is the role of formative vs. summative assessment? 4. Which assessment techniques should be avoided in the SLO process and why? Module 2-SLO Building 17
19. Assessment Life Cycle

Concept
Each step of the Assessment Life Cycle outlines the sequence required to ensure high-quality assessments that measure and validate student achievement. Details associated with each step of the Life Cycle will be outlined further in this training module.

Key Points for Trainers
1. Stress the importance of participants understanding and following this step-by-step process, as it ensures the development of high-quality assessments.
2. Ensure that teachers recognize the Assessment Life Cycle as a continuous sequence that allows additional evidence to be gathered and corrective action to be implemented from administration to administration. This process creates a body of evidence about the score inference by which teachers are, in part, regarded as "effective".
3. Clarify the advantages and limitations associated with educators creating their own customized assessments for use in the SLO process.

Learning Activity

Module 2-SLO Building 18
20. Principles of Well-Developed Measures

Concept
Working through the steps associated with the Assessment Life Cycle focuses test developers on the principles of well-developed measures. The Performance Measure Rubric for Teachers (found in ALS/Review/Stuff) will guide participants through the process of selecting/developing high-quality performance measures. The rubric has 18 technical aspects related to the basic principles of quality assessment. They are organized into three (3) strands to align with the Design, Build, and Review components of both the SLO Process and the ALS Process. Using this rubric allows teachers to validate the performance measures they have determined will effectively measure student progress toward the identified goals.

Key Points for Trainers
Measures must:
1. Be built to achieve the designed purpose – "For all types of assessments, the first step is to clearly define the purpose. Teachers must specify exactly what the assessment is intended to measure, characteristics of intended test takers, types of scores to be reported, and how the information derived from the assessment will be used." (RIA "Smart Book", 2011)
   a. "If the purpose is not well defined, there is a high risk that the assessment will not satisfy fundamental measures of test validity. Critical measures of test validity examine whether the test is built to achieve its purpose(s) and whether the results are used for the intended purpose(s)." (RIA, "Smart Book", 2011)
   b. Strand 1: Design of the Performance Measure Rubric
      i. 1.1 The purpose of the performance measure is explicitly stated (who, what, why).
2. Produce the results that are used for the intended purpose – Does the assessment measure what we really want to measure? Evaluate the final assessment against the intended purpose.
   a. Strand 2: Build
      i. 2.2 Items/tasks are created and reviewed in terms of: (a) alignment to the targeted standards, (b) content accuracy, (c) developmental appropriateness, (d) cognitive demand, and (e) bias, sensitivity, and fairness.
   b. Strand 3: Review
      i. 3.1 The performance measures are reviewed in terms of design fidelity:
         • Items/tasks are distributed based upon the design properties found within the specification or blueprint documents.
         • Item/task and form statistics are used to examine levels of difficulty, complexity, distracter quality, and other properties.
         • Items/tasks and forms are rigorous and free of biased, sensitive, or unfair characteristics.
3. Be aligned to targeted content standards – "The desired attributes and characteristics of the assessment need to be identified, specified, and documented. The test design framework and blueprint provide information necessary to guide the item/task development process. This is a critical activity in establishing validity." (RIA "Smart Book", 2011)
   a. Strand 1: Design of the Performance Measure Rubric
      i. 1.2 The performance measure has targeted content standards representing a range of knowledge and skills students are expected to know and demonstrate.
      ii. 1.4 Specification tables articulate the number of items/tasks, item/task types, passage readability, and other information about the performance measure – OR – blueprints are used to align items/tasks to targeted content standards.
   b. Strand 3: Review
      i. 3.3 The performance measures are reviewed in terms of alignment characteristics:
         • Pattern consistency (within specifications and/or blueprints)
         • Matching the targeted content standards
         • Cognitive demand
         • Developmental appropriateness
4. Contain a balance between depth and breadth of targeted content – "Several considerations are taken into account during the assessment design phase. First, to satisfy accepted standards of reliability and validity, a minimum number of items and score points are required within each subtest and for the overall assessment. Second, the assessment must include items with a range of difficulty levels if the assessment's purpose is to provide information about student achievement at different levels on the performance continuum." (RIA, "Smart Book", 2011)
   a. Strand 1: Design of the Performance Measure Rubric
      i. 1.5 Items/tasks are rigorous (designed to measure a range of cognitive demands/higher-order thinking skills at developmentally appropriate levels) and of sufficient quantities to measure the depth and breadth of the targeted content standards.
5. Be standardized, rigorous, and fair – "A number of technical and editorial issues are related to item and test specifications. Obviously, crucial quality processes are needed in assessment development, such as adequate content coverage and the development of items and test forms to meet best practice requirements. The desired psychometric properties for the items, such as difficulty and discrimination, as well as desired test properties such as overall test difficulty and reliability, need to be targeted and met. Detailed item specifications must include: item types, number of items, response options, difficulty levels, language load, and artwork. Test specifications guide the process of developing forms and likewise need to be detailed to address issues of content coverage, overall difficulty level, balance of items, targeted distribution of item difficulties, rules for sequencing items, total number of items, and timing." (RIA "Smart Book", 2011)
   a. Strand 1: Design of the Performance Measure Rubric
      i. 1.3 The performance measure's design is appropriate for the intended audience and reflects challenging material needed to develop higher-order thinking skills.
   b. Strand 2: Build
      i. 2.1 Items/tasks and score keys are developed using standardized procedures, including scoring rubrics for human-scored, open-ended questions (i.e., short constructed response, writing prompts, performance tasks, etc.).
      ii. 2.2 Items/tasks are created and reviewed in terms of: (a) alignment to the targeted standards, (b) content accuracy, (c) developmental appropriateness, (d) cognitive demand, and (e) bias, sensitivity, and fairness.
      iii. 2.3 Administrative guidelines are developed that contain the step-by-step procedures used to administer the performance measure in a consistent manner, including scripts to orally communicate directions to students, day and time constraints, and allowable accommodations/adaptations.
      iv. 2.4 Scoring guidelines are developed for human-scored items/tasks to promote score consistency across items/tasks and among different scorers. These guidelines articulate point values for each item/task used to combine results into an overall score.
   c. Strand 3: Review
      i. 3.2 The performance measures are reviewed in terms of editorial soundness, while ensuring consistency and accuracy of other documents (i.e., administration):
         • Identifies words, text, reading passages, and/or graphics that require copyright permission or acknowledgement
         • Applies Universal Design principles
         • Ensures linguistic demands and/or readability are developmentally appropriate
6. Be sensitive to testing time and objectivity
   a. Strand 2: Build
      i. 2.6 The total time to administer the performance measure is developmentally appropriate for the test-taker. Generally, this is 30 minutes or less for young students and up to 60 minutes per session for older students (high school).
7. Have score validity and reliability evidence – "General scoring criteria and methods for scoring both short-constructed response (SCR) and extended-constructed response (ECR) should be outlined when the guidelines for writing items are developed." A sketch of how internal consistency and the standard error of measurement can be estimated follows these notes.
   a. Strand 2: Build
      i. 2.5 Summary scores are reported using both raw score points and performance levels. Performance levels reflect the range of scores possible on the assessment and use terms or symbols to denote performance levels.
   b. Strand 3: Review
      i. 3.4 Cut scores are established for each performance level. Performance level descriptors describe the achievement continuum using content-based competencies for each assessed content area.
      ii. 3.5 As part of the assessment cycle, post-administration analyses are conducted to examine such aspects as item/task performance, scale functioning, overall score distribution, rater drift, content alignment, etc.
      iii. 3.6 The performance measure has score validity evidence demonstrating that item responses were consistent with content specifications. Data suggest the scores represent the intended construct by using an adequate sample of items/tasks within the targeted content standards. Other sources of validity evidence, such as the interrelationship of items/tasks and alignment characteristics of the performance measure, are collected.
      iv. 3.7 Reliability coefficients are reported for the performance measure, which includes estimating internal consistency. Standard errors are reported for summary scores. When applicable, other reliability statistics such as classification accuracy, rater reliabilities, and others are calculated and reviewed.

Learning Activity

Module 2-SLO Building 19
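The following is a minimal, hypothetical sketch of the kind of post-administration evidence named in aspects 3.5–3.7: it estimates internal consistency using Cronbach's alpha (one common reliability coefficient), a standard error of measurement derived from it, and item difficulty as the proportion correct. The item score matrix is invented for the example; the SLO/ALS materials do not prescribe these particular calculations.

```python
# Hypothetical illustration (not drawn from the SLO/ALS materials): estimating
# internal consistency (Cronbach's alpha), the standard error of measurement (SEM),
# and item difficulty from a small matrix of item scores. All data are invented.
import statistics

def cronbach_alpha(item_scores):
    """item_scores: list of per-item score lists, all over the same set of students."""
    k = len(item_scores)
    item_vars = [statistics.pvariance(item) for item in item_scores]
    totals = [sum(student) for student in zip(*item_scores)]
    total_var = statistics.pvariance(totals)
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

# Rows = items, columns = students (0/1 scores for these sample items).
scores = [
    [1, 1, 0, 1, 1, 0, 1, 1],
    [1, 0, 0, 1, 1, 0, 1, 1],
    [0, 1, 0, 1, 1, 0, 1, 0],
    [1, 1, 1, 1, 0, 0, 1, 1],
]
alpha = cronbach_alpha(scores)
totals = [sum(student) for student in zip(*scores)]
sem = statistics.pstdev(totals) * (1 - alpha) ** 0.5    # standard error of measurement
difficulty = [statistics.mean(item) for item in scores]  # proportion correct per item
print(f"alpha = {alpha:.2f}, SEM = {sem:.2f}, item difficulties = {difficulty}")
```

In practice a vendor report or a spreadsheet would provide these statistics; the point is that the evidence behind aspect 3.7 is computable from ordinary item-level results.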
21. Overview of the Assessment Literacy Series (ALS) Process

Concept
Just like the Student Learning Objective (SLO) Process, the Assessment Literacy Series (ALS) Process is comprised of three (3) phases: Design, Build, and Review.

Key Points for Trainers
1. Clarify that the ALS (full model) guides the development of assessments. For existing measures, the ALS Review section outlines the quality criteria an assessment must meet in order to be considered of sufficient quality for use in the SLO process.
2. Each phase of the process deals with specific steps within the Assessment Life Cycle. Working through the three (3) phases (Design, Build, and Review) will provide participants the opportunity to develop measures that meet crucial criteria at each step and ensure the development of high-quality assessments/measures of student achievement.
3. In many cases, educators will be using vendor-produced assessments for their SLOs. In these instances, the Review section is most applicable.

Learning Activity

Module 2-SLO Building 20
22. ALS Process Components

Concept
The first phase of the ALS Process is the DESIGN Phase. During this phase, participants will decide on a Purpose Statement. The Purpose Statement clearly defines why the assessment is being developed, what the assessment will measure, and how the results will be used. This phase also includes the selection of the specific content standards that will become the focus of the Performance Measure.

"If the purpose is not well defined, there is a high risk that the assessment will not satisfy fundamental measures of test validity. Critical measures of test validity examine whether the test is built to achieve its purpose(s) and whether the results are used for the intended purpose(s). The test design, framework, and/or blueprint are specifically created to maximize the likelihood of meeting acceptable measures of test validity. Therefore, these components cannot be fully developed until the assessment's purpose is clear. Assessment developers need to know the specific purpose(s) of the assessment to make sure the items and forms are constructed to meet the intended use." (RIA, "Smart Book", April 2011)

Key Points for Trainers
1. Have participants review content standards and/or national/professional organization standards and identify/select a set of standards that will become the focus of the Performance Measure (PM). Alignment to the Goal Statement of the SLO should be addressed.
2. Allow time for discussion of why the assessment is being developed, what the assessment is measuring, and how the results will be used. Participants should reach consensus on these key aspects.

Learning Activity
1. Questions to consider for focusing this discussion include:
   a. Which content areas will be focused upon within the various content standards?
   b. What is the assessment intended to measure?
   c. What grade levels?
   d. What are the developmental needs of the students?
   e. How will the results be used?
   f. When will the measure be administered?
2. When the purpose statement is complete, review and refine as necessary.
   a. Does the purpose statement address the why, what, and how of the measure?
   b. Is the statement clear and concise?
   c. Is the purpose aligned with the targeted content standards?
   d. Does it reflect how the results will be used?

Module 2-SLO Building 21
23. ALS Process Components (continued)

Concept
The purpose statement provides the foundation for developing the test specifications and blueprints. These specifications and blueprints in turn provide the information necessary for building the specific test items. Specifications and blueprints are tools that guide teachers when creating/selecting the items for a particular measure. Creating the Purpose Statement and designing the Specifications and Blueprints are completed during the Design Phase (Step One) of the Assessment Literacy Series Process. This becomes the foundation for the Build Phase (Step Two), which begins with item development.

Key Points for Trainers
1. Keep participants focused on the Purpose Statement they developed and the targeted standards chosen. Specifications and blueprints will outline the following:
   a. Targeted Content Standards (found at pdesas.org)
   b. Depth of Knowledge (DoK) – level of difficulty (DoK charts can be found in ALS/Build/Stuff and ALS/Build/Guides; Handout #4)
   c. Item type – multiple choice, extended response, performance indicator, fill-in-the-blank
   d. Number of items

Learning Activity
1. When working through the specifications and blueprints, keep in mind the following (a minimal sketch of these checks follows these notes):
   a. Are there a sufficient number of items for each targeted content standard identified?
   b. Is there a developmentally appropriate distribution of DoK levels?
   c. Were different item types chosen and the appropriate weights assigned?
   d. Was consideration given to the time burden of the assessment for both the teacher and the student?

Module 2-SLO Building 22
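The sketch below is a hypothetical illustration, not one of the ALS templates: it represents a small blueprint as a simple table in code and checks a few of the questions listed above (items per standard, DoK distribution, time burden). The standard codes, item counts, thresholds, and time estimates are invented for the example.

```python
# Hypothetical illustration (not an ALS template): representing a simple test
# blueprint in code and checking a few of the Learning Activity questions above.
# Standard codes, counts, and time estimates are invented for the example.

blueprint = [
    {"standard": "10.4.3.A", "dok": 2, "item_type": "multiple choice",      "items": 6, "minutes_each": 1},
    {"standard": "10.4.3.B", "dok": 3, "item_type": "constructed response", "items": 2, "minutes_each": 8},
    {"standard": "10.5.3.C", "dok": 1, "item_type": "multiple choice",      "items": 4, "minutes_each": 1},
]

MIN_ITEMS_PER_STANDARD = 3
MAX_TOTAL_MINUTES = 30  # young students, per the guidance quoted on slide 20

# Are there enough items for each targeted standard?
for row in blueprint:
    if row["items"] < MIN_ITEMS_PER_STANDARD:
        print(f"Standard {row['standard']}: only {row['items']} item(s); consider adding more.")

# Is the time burden reasonable for the students?
total_minutes = sum(row["items"] * row["minutes_each"] for row in blueprint)
print(f"Estimated administration time: {total_minutes} minutes "
      f"({'within' if total_minutes <= MAX_TOTAL_MINUTES else 'over'} the {MAX_TOTAL_MINUTES}-minute target)")

# What does the DoK distribution look like?
dok_counts = {}
for row in blueprint:
    dok_counts[row["dok"]] = dok_counts.get(row["dok"], 0) + row["items"]
print("Items per DoK level:", dict(sorted(dok_counts.items())))
```

The same checks can be done by eye on the paper templates; the structure is what matters: every targeted standard, its DoK level, its item type, and its item count in one place.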
24. ALS Process Components (continued)

Concept
Once the Specifications and Blueprints are complete, teachers move into the Build Phase of the ALS Process. This phase includes the following steps:
a. Development of specific test items, aligned to the specifications and blueprints
b. Scoring sheets – provide the answers for multiple-choice items
   i. Pros of multiple choice – easy to administer; objective scoring
   ii. Cons of multiple choice – guessing and a single answer (no information about the process used by students)
c. Scoring rubrics – required for any constructed-response and/or performance task
   i. Pros of constructed responses and performance tasks – allow for partial credit, provide more detail about cognitive processes, provide opportunities to evaluate multiple skills, reduce the likelihood of guessing, and create more information to guide instruction
   ii. Cons of constructed responses and performance tasks – greater subjectivity; longer time to administer, respond, and score
d. Operational forms – the test form itself

Key Points for Trainers
1. Teachers will select and/or develop the items, any reading passages, scenarios/performance tasks, and answer options.
2. Make sure the item sample represents the "big idea" noted in the Goal Statement.
3. Make sure the items measure the skills and knowledge associated with the targeted content standards.
4. Make sure consideration is given to the pros/cons of each item type before selecting one.
5. Establish the DoK of each item.
6. Develop scoring rubrics where applicable (a minimal scoring sketch follows these notes).
   a. Create different levels of performance in the rubric's descriptors.
   b. Make scoring as clear as possible.
   c. Provide an example of a fully complete/correct response along with examples of partially correct responses.

Learning Activity
1. Teachers should also consider the following:
   a. Does the item type distribution match the specification table (found in ALS/Design/Templates; Template #3)?
   b. Are there sufficient items to sample the targeted standards (found in ALS/Design/Templates; Template #2)?
   c. Are the items developmentally appropriate?
   d. Is the correct answer and/or expected response clearly identified?
   e. Does the operational form (test) place items in a logical order (found in ALS/Design/Templates; Template #2)?
   f. Is the distribution of item difficulty random throughout the form?
   g. Did you ensure that items do not "key" (give clues) to one another?
   h. Are the directions clear and concise for the students?
   i. Did you make sure that items do not drift from one page to another?

Module 2-SLO Building 23
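Below is a hypothetical sketch, not an ALS template: it shows one way a score key for multiple-choice items and point values from a scoring rubric could be combined into a single raw score, mirroring the "score keys" and "scoring guidelines" described above. The answer key, rubric maximums, and student responses are invented for the example.

```python
# Hypothetical illustration (not an ALS template): combining a multiple-choice
# score key with rubric points for constructed-response tasks into one raw score.
# The key, rubric ranges, and student responses are invented for the example.

ANSWER_KEY = {"Q1": "B", "Q2": "D", "Q3": "A"}   # 1 point each
RUBRIC_MAX = {"CR1": 4, "CR2": 4}                # human-scored tasks on a 0-4 rubric

def raw_score(mc_responses, rubric_points):
    """Return the student's raw score and the maximum possible score."""
    mc_score = sum(1 for q, key in ANSWER_KEY.items() if mc_responses.get(q) == key)
    for task, points in rubric_points.items():
        if not 0 <= points <= RUBRIC_MAX[task]:
            raise ValueError(f"{task}: {points} is outside the rubric range 0-{RUBRIC_MAX[task]}")
    cr_score = sum(rubric_points.values())
    return mc_score + cr_score, len(ANSWER_KEY) + sum(RUBRIC_MAX.values())

score, maximum = raw_score({"Q1": "B", "Q2": "C", "Q3": "A"}, {"CR1": 3, "CR2": 4})
print(f"Raw score: {score}/{maximum}")   # 2 multiple-choice points + 7 rubric points = 9/11
```

Writing the key and the rubric point values down in one place, before administration, is what makes the scoring guidelines consistent across scorers.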
25. ALS Process Components (continued)

Concept
The last step during the BUILD Phase is to create the Administrative Guidelines for the selected measure. These guidelines must provide sufficient detail so that testing conditions are comparable. The Quality Assurance and Form Reviews are conducted during the third and final phase of the ALS Process, the REVIEW Phase. More details about the REVIEW Phase are outlined in Training Module Three.

Key Points for Trainers
1. Teachers must create:
   a. Directions for the students (test-takers).
   b. Directions for teachers (test administrators).
   c. Steps needed to prepare and administer the measure.
   d. Materials needed to administer the measure.
   e. Procedures for collecting the results.
   f. Procedures for completed assessments.
2. Conduct a rigor check (using the Screening Tool).
   a. Is it developmentally appropriate?
   b. Is each item assigned to a targeted content standard?
   c. Is each item assigned the correct cognitive level?
   d. Have items been reviewed for sensitivity, bias, and fairness?
3. Examine the quality of teacher-made performance measures (using the Performance Measure Rubric for Teachers).
   a. Review information, data, and documents associated with the design, development, and review of the selected performance measure.
   b. Assign a value for each aspect within a particular strand.
   c. Reference supporting information associated with each assigned rating.
   d. Add any additional notations and/or comments that articulate any important nuances of the performance measure.

Learning Activity

Module 2-SLO Building 24
26. Section 5: Teacher Expectations

Concept
As part of the SLO Process, educators will be required to set expected levels of student learning based on the Performance Indicators and their Performance Measures. The levels are identified as "Distinguished, Proficient, Needs Improvement, and Failing." This Teacher Expectation rating will determine the overall SLO rating, which will be used as the Elective Rating on the Pennsylvania Educator Effectiveness Rating Tool.

Key Points for Trainers
1. Participants learning the SLO process often confuse and intertwine the Performance Indicator Statement (3a) with the Teacher Expectation Statement (5a), especially when the Performance Indicator statement is described in percentages. Examples:
   a. Performance Indicator Statement: each student achieves at minimum an 80% on the post-test.
   b. Teacher Expectation Statement: 80% of students meet the performance indicator (implying that 80% of students achieve at minimum an 80% on the post-test).
2. The SLO Process allows a great deal of control and flexibility over the performance measures and the expected levels of growth and/or mastery associated with each measure used to fulfill the requirements of the Elective portion of teacher evaluation.
3. These levels are established by educators prior to the evaluation period, and each performance level is populated with a percentage range such that 0% to 100% meeting expectations is distributed among the levels.
4. The Elective Rating for teachers is not completed until after performance data are collected, reviewed, and evaluated against each Performance Indicator.

Learning Activity

Module 2-SLO Building 25
27. Section 5: Teacher Expectations (continued)

Concept
This slide provides one (1) example of how Teacher Expectation levels can be determined.

Key Points for Trainers
1. Once the performance data have been collected, reviewed, and evaluated against each Performance Indicator, the number of students meeting expectations can be charted and totaled. An overall percentage can then be determined by a simple division: the total number of students meeting expectations divided by the total number of students in the SLO population across all indicators. (A minimal sketch of this calculation follows these notes.)
2. The same student may be included across all indicators (as demonstrated in the slide); however, some variation in student counts across time will exist.
3. Remember, Teacher Expectations and Performance Indicators are two different things:
   a. Performance Indicators are descriptions of the expected level of achievement for each student in the SLO population on a particular Performance Measure – Physical Education/3rd grade Model.
   b. Teacher Expectations are the four levels of projected performance regarding the PI, reflecting a continuum established by the educator prior to the evaluation period. Each performance level (i.e., Failing, Needs Improvement, Proficient, and Distinguished) is populated with a percentage range such that 0% to 100% meeting expectations is distributed among the levels. – Physical Education/3rd grade Model:
      i. Failing – 0% to 60% of students will meet the PI targets.
      ii. Needs Improvement – 61% to 84% of students will meet the PI targets.
      iii. Proficient – 85% to 94% of students will meet the PI targets.
      iv. Distinguished – 95% to 100% of students will meet the PI targets.
4. Another problematic area is where to set the values among the different performance levels. A "trade-off" exists between the rigor of the performance targets and the expectation of teachers that students will meet those standards.
   a. In the Physical Education – 3rd grade Model, the different PI Targets for the focused student group are reflective of this trade-off.
   b. Because the PI Targets are based on both Mastery and Growth, the teacher expectations are different for different students.

Learning Activity
1. Using the Physical Education – 3rd grade Model, have participants create student results and, from that data, determine Section 5: Teacher Expectations.

Module 2-SLO Building 26
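The sketch below is a hypothetical, simplified illustration of the calculation described in key point 1: it uses the single 80%-on-the-post-test indicator from slide 26 and the percentage bands quoted above for the Physical Education/3rd grade example. The student scores are invented, and a real SLO may combine several indicators and measures rather than one post-test.

```python
# Hypothetical illustration: computing the Teacher Expectations rating from student
# results, using the 80%-on-the-post-test indicator from slide 26 and the percentage
# bands quoted above. Student scores are invented for the example.

LEVELS = [                       # (level, lowest % of students meeting the PI)
    ("Distinguished", 95),
    ("Proficient", 85),
    ("Needs Improvement", 61),
    ("Failing", 0),
]

def teacher_expectation_rating(post_test_scores, pi_cut=80):
    """Percent of students meeting the PI, mapped onto the expectation levels."""
    meeting = sum(1 for score in post_test_scores if score >= pi_cut)
    percent = 100 * meeting / len(post_test_scores)
    for level, floor in LEVELS:
        if percent >= floor:
            return percent, level

scores = [92, 85, 78, 88, 95, 67, 81, 90, 74, 83]   # post-test scores for the SLO population
percent, level = teacher_expectation_rating(scores)
print(f"{percent:.0f}% of students met the PI -> {level}")   # 70% -> Needs Improvement
```

Seven of the ten students score at or above 80, so 70% meet the indicator, which falls in the 61%–84% band: the expectation level is Needs Improvement.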
28. Section 5: Teacher Expectations (continued)

Concept
This slide provides the definitions for each part in Section 5 of the SLO Process Template.

Key Points for Trainers
1. Even though this chart clarifies each part of Section 5, please refer to the "Help Desk" definition guide (found in SLO/Build/Stuff), as it provides the correct formatting along with examples. It is important that each section is filled out correctly.
2. One thing that needs to be considered is what data will be used.
   a. In the Physical Education – 3rd grade Model, the teachers choose to use data from teacher-developed measures.
      i. It is not clear how many teacher-developed measures are expected to be used in determining whether the PI Targets have been met by students in the SLO population. Teachers will need to be clear on what and how many teacher-developed measures were used and how the results were combined to determine the percentages of students who met the PI Targets.
   b. In the Art – 8th grade Model, teachers choose to use data from District-designed Measures and Examinations and Student Projects.
      i. Again, teachers will be gathering information from different performance measures. Will the same weight be given to each measure? Will one measure be weighted more than the other? (A minimal sketch of a weighted combination follows these notes.)
3. Another factor to consider is how the data are aggregated (how they are gathered and summarized).
   a. The Physical Education – 3rd grade Model gathers data based on the growth of some students and the proficiency levels of others. This can make the process a little more complicated when determining and calculating teacher elective ratings.
   b. In the Art – 8th grade Model, the data are gathered based on mastery level (a defined level of achievement). This makes determining a teacher elective rating somewhat easier.
4. Element 5b is not determined until after performance data are collected, reviewed, evaluated, and reported.
5. Element 5b also has a section for Notes/Explanations, which provides an opportunity for teachers to offer additional information related to student outcomes:
   a. Description of the anticipated outcomes vs. the actual outcomes.
   b. In-depth analysis of the data that will provide goals for future implementation and improvement of student achievement through the SLO.
   c. Recommendations as to how analysis of the achievement data will inform future teaching practice as defined by Danielson's Framework for Teaching.
   d. Recommendations for further SLO development to support student achievement of standards in this class/course/content area.

Learning Activity

Module 2-SLO Building 27
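The following is a hypothetical sketch of one way two differently weighted measures could be combined into a single determination of whether each student met the PI target. The weights, cut score, measure names, and student results are all invented; the SLO materials do not prescribe a weighting scheme, so whatever scheme teachers choose should be documented before data collection.

```python
# Hypothetical illustration: combining two differently weighted performance measures
# (e.g., a district-designed examination and a student project) into a single
# determination of whether each student met the PI target. Weights, cut score, and
# student results are invented; the SLO materials do not prescribe this scheme.

WEIGHTS = {"district_exam": 0.6, "student_project": 0.4}   # must sum to 1.0
PI_CUT = 80   # combined percent score a student needs in order to meet the PI target

def met_pi_target(percent_scores):
    """percent_scores: dict of measure name -> percent score (0-100) for one student."""
    combined = sum(WEIGHTS[measure] * score for measure, score in percent_scores.items())
    return combined >= PI_CUT, round(combined, 1)

students = {
    "Student A": {"district_exam": 85, "student_project": 90},
    "Student B": {"district_exam": 72, "student_project": 95},
    "Student C": {"district_exam": 78, "student_project": 70},
}
for name, results in students.items():
    met, combined = met_pi_target(results)
    print(f"{name}: combined {combined}% -> {'meets' if met else 'does not meet'} the PI target")
```

The point of the sketch is the documentation question raised above: whichever weights are used, the combination rule should be stated in the SLO before results are gathered.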
29. Summary & Next Steps

Concept
During Module 2: Building, teachers will have worked through the development of items, scoring keys and rubrics, operational forms (tests), and administrative guidelines for each Performance Measure. Completing the following templates provides guidance through this phase of the SLO process:
• Template #4-SLO Process Template (found in SLO/Build/Templates)
• Template #5-Performance Task Framework (found in SLO/Build/Templates)
Using the "Help Desk" Definition Guide (found in SLO/Build/Stuff) will assist in the completion of these templates.

The next and final step of the SLO Process is outlined in Training Module 3: REVIEW. During this module, educators will conduct an extensive quality assurance review.

Key Points for Trainers
1. Completed Templates #4 and #5 (if applicable) will be needed in order to move ahead to Module 3: Review, where educators will review the completed SLO and the Performance Measures associated with each Performance Indicator to ensure that they meet all the criteria for high-quality SLOs and Performance Measures.

Learning Activity
1. Make sure participants have completed the above-mentioned templates before leaving Module 2: Building.
2. Answer any questions related to the Build Phase of the SLO Process.

Module 2-SLO Building 28