How to Design a Local
Student Growth Measures Plan
Ohio’s Statewide Education Conference
October 28-29, 2013
Presented By:
Mindy Schulz, Allen County ESC
Director of Curriculum
eTPES, OTES, OPES State Trainer
Agenda
Participants will learn:
• How to use the OTES & OPES student growth measures
(SGM) framework
• The steps for designing a local student growth measures
plan
• How to conduct an inventory of teachers, principals, and
assessments
• How to analyze SGM categories and current assessments
• How to set default SGM percentages
• Considerations for the local education agency (LEA) SGM
plan
Definition of Student Growth
For the purpose of use in Ohio’s
evaluation systems, student growth
is defined as the change in student
achievement for an individual
student between two or more points
in time.
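Under this definition, growth is simply the difference between two achievement scores for the same student. A minimal sketch (the function name and the scale scores are hypothetical, for illustration only):

```python
def student_growth(score_first, score_second):
    """Growth = change in an individual student's achievement
    between two or more points in time (two points shown here)."""
    return score_second - score_first

# Hypothetical scaled scores from a fall and a spring administration
print(student_growth(412, 445))  # 33 scale-score points of growth
```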
SGM Framework – OTES
SGM Framework – OPES
What is a Student Growth Measures Plan?
“Teacher evaluation as required by
O.R.C.3319.111 relies on two key evaluation
components: a rating of teacher performance
and a rating of student academic growth, each
weighted at 50 percent of each evaluation.
The following guidance speaks to the student
growth measures component, specifically
addressing determinations to be made for
using student growth measures within teacher
evaluation.” – K. Harper
Special thanks to Dr. Kathy Harper, Greene Co. ESC, for sharing this definition.
ODE Steps for Designing a Local SGM Plan
1) Conduct an inventory of needs and
resources.
2) Determine and create (if necessary)
student growth measures to be used.
3) Communicate expectations and refine
the entire process.
How can we collaborate with others and who
should we include on our LEA design team?
1. What collaborative opportunities are
available with other LEAs, ESCs, Higher
Ed?
2. Who should we include on our SGM
LEA design team to help plan and
communicate to other stakeholders?
What assessments does our LEA have
available?
1) Is our LEA using any of the ODE approved
vendor assessments?
2) Create an Inventory
• Which grade level(s) and course(s) have current ODE approved vendor assessments
available?
• Which assessments are not on the ODE approved vendor assessment list, but could
be used in SLOs? (See the Checklist for Selecting Assessments.)
3) LEA Considerations:
• Does the manner in which our LEA is using the ODE vendor assessment meet the
definition of student growth?
• Have we secured the vendor assessment growth reports?
Who is required to be evaluated by the
new evaluation systems?
• Any person who is employed under a teaching license
or under a professional or permanent teacher’s
certificate and who spends at least 50 percent of
his/her time employed providing student instruction.
This does not apply to a teacher employed as a
substitute.
This usually excludes:
 Speech pathologists, occupational therapists
 Teachers on assignment
 Nurses, psychologists, guidance counselors
• Create an inventory of teachers in your LEA required to
be evaluated by the new evaluation system.
Categorize teachers into 3 groups
1) Who has teacher-level EVAAS Value-Added data,
grades 4-8 reading and math? (Category A1 & A2)
2) Who does not have teacher-level Value-Added
data, but has data from assessments on the ODE
approved vendor assessment list? (Category B)
3) Who has no Value-Added or approved vendor
data? (Category C)
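The three-way sort above is a simple decision rule, and can be sketched as code. This is illustrative only (the function and parameter names are mine, not ODE's); it also splits Category A into A1/A2 per the framework:

```python
def sgm_category(has_value_added, teaches_va_exclusively, has_vendor_assessment):
    """Classify a teacher into an SGM category.

    A1/A2: teacher-level EVAAS Value-Added data (grades 4-8 reading/math),
           exclusive vs. non-exclusive assignment.
    B:     no Value-Added data, but ODE approved vendor assessment data.
    C:     neither Value-Added nor approved vendor data.
    """
    if has_value_added:
        return "A1" if teaches_va_exclusively else "A2"
    if has_vendor_assessment:
        return "B"
    return "C"

print(sgm_category(True, True, False))    # A1
print(sgm_category(True, False, False))   # A2
print(sgm_category(False, False, True))   # B
print(sgm_category(False, False, False))  # C
```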
Value Added Data Timeline
• Spring 2013: Grades 4-8 Reading & Math OAAs administered.
• Fall 2013: Teacher-level Value-Added reports released (for the Spring 2013
grades 4-8 Reading & Math OAAs).
• 2nd Semester 2013-2014 (through May 2014): Spring 2013 teacher- and
building-level Value-Added reports uploaded into eTPES by ODE.
Spring 2013 teacher-level Value-Added reports (which are actually received in
Fall 2013) for Category A1 & A2 teachers will be used in calculating the SGM
percentage, based on the LEA's default percentages. Spring 2013 building-level
Value-Added results for Category A principals will be used in calculating the
SGM percentage, based on the LEA's default percentages.
Value-Added Data Usage
A1 teacher in the previous year:
Value-Added data from the prior year must be used
at 26-50% in the current year, according to the
LEA's SGM plan.
A2 teacher in the previous year:
Value-Added data from the prior year must be used
at 10-50% in the current year, according to the
LEA's SGM plan.
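As a sketch, an LEA could sanity-check a proposed default split against these required ranges. The function name and the example numbers are hypothetical; the 50% total reflects the student-growth half of the evaluation:

```python
# Required value-added weight ranges by prior-year category
# (percent of the total evaluation)
VA_RANGES = {"A1": (26, 50), "A2": (10, 50)}

def valid_default_split(category, value_added_pct, lea_measures_pct):
    """True if the value-added weight is within the required range and
    the SGM components total the 50% student-growth share."""
    lo, hi = VA_RANGES[category]
    return lo <= value_added_pct <= hi and value_added_pct + lea_measures_pct == 50

print(valid_default_split("A1", 30, 20))  # True
print(valid_default_split("A1", 20, 30))  # False: A1 value-added must be at least 26%
```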
Who instructed in a value added
course in the previous year?
Inventory teachers receiving a teacher-level
value added report (Fall 2013) from courses
instructed in the previous year (2012-13).
• Did the teacher receive a teacher-level value added report
this fall?
• Who instructed all value added courses last year
(exclusively)?
• Who instructed some value added course(s), but not
exclusively? What percent of time was spent instructing in
value added course(s)?
• What is the teacher’s current (2013-2014) SGM category?
• What is the required value-added weight for the current year
(2013-14)?
Who has data from assessments on the
ODE approved vendor assessment list?
Create an Inventory
• Which teachers are using an ODE approved
vendor assessment?
Refer to the LEA’s “Available Assessments Inventory”.
• Are any Category A teachers using an ODE
approved vendor assessment?
If a value-added teacher uses an approved ODE vendor
assessment, that assessment becomes a local measure.
Who has no Value-Added or
approved vendor data?
Create an Inventory
Inventory teachers with no value-added
or ODE-approved vendor assessment
data. (Category C)
Special Considerations: Which Teachers are New to
Value Added Assignment for the Current Year?
• Inventory teachers that did not receive a value added
report (from previous year), in Fall 2013, but have been
assigned to a value added course for 2013-14 (current
year).
This may include:
 New teachers, e.g. Year One Resident Educators, new hire
 Any teacher that changed assignment from the prior year
to the current year, e.g. teacher instructed 3rd grade in
previous year, and currently instructs 6th grade math
• Determine 2013-14 (current year) SGM category,
dependent upon available data.
 Are there ODE-approved vendor assessments available?
(Category B)
 If there are no ODE-approved vendor assessments
available, LEA measures will be used. (Category C)
Step Two: Determine LEA Default
Percentages
What percentages will your LEA attribute
to:
• Value-Added data (Category A1 and A2)?
• Assessments from the ODE approved
vendors (Category B)?
• Local Measures within each category?
(Local Measures may also apply to Category A and B)
2013-2014: How much will our LEA attribute to
Teacher-Level Value-Added Data?
O.R.C. 3319.111, O.R.C. 3319.112
A1. Teacher instructs Value-Added subjects exclusively:
• Teacher Value-Added: 26-50%
• LEA Measures: 0-24%
A2. Teacher instructs Value-Added subjects, but not exclusively:
• Teacher-Level Value-Added (proportional to teaching schedule): 10-50%
• LEA Measures (proportional to teaching schedule): 0-40%
2014-2015: Category A1 = 50% Teacher Value
Added
O.R.C. 3319.111, O.R.C. 3319.112
A1. Teacher instructs Value-Added subjects exclusively:
• Teacher Value-Added: 50%
A2. Teacher instructs Value-Added subjects, but not exclusively:
• Teacher-Level Value-Added (proportional to teaching schedule): 10-50%
• LEA Measures (proportional to teaching schedule): 0-40%
How much will our LEA attribute to Building-
Level Value-Added Data for Principals?
How much will our LEA attribute to the
assessments from the ODE Approved
Vendor List?
B: Approved Vendor Assessment data available
• Vendor Assessment: 10-50%
• LEA Measures: 0-40%
C: No teacher-level Value-Added or Approved Vendor Assessment data available
• LEA Measures: 50%
Category B: Special Considerations
• How many years has the assessment(s) been
administered?
• Is there trend data to analyze?
• Are there variations in the number of vendor
assessments available by course and/or
grade level?
What LEA measures will be used?
Types:
• Student Learning Objectives (SLOs)
• Shared Attribution
Category A Only:
• If LEA measures are used, vendor assessment
data may be used.
What are SLOs?
• A goal that demonstrates a teacher's
impact on student learning within a
given interval of instruction.
• A measurable, long-term academic
target written by an individual teacher
or a teacher team.
What is shared attribution?
• Shared attribution is a collective measure.
• The LEA determines which measure of
shared attribution it would like to use.
• Shared attribution could be:
 A building or district value-added score
Recommended if available
 Building team composite value-added score (e.g.,
the 5th grade VAM score or the middle school
ELA team's combined VAM score)
 Building-level or district-level SLOs
What is shared attribution for
principals?
• These are measures to encourage collaborative
goals.
• The LEA determines which measure of shared
attribution it would like to use.
• Shared attribution for principals may include:
 District Value-Added
Recommended if available
 Groups of schools (such as grade level buildings or
regional areas within a district) may utilize a composite
Value-Added score
 District-based SLOs
LEA Default Percentages
Percentages for Teachers in Category:
• A1
• A2
• B
• C
What Default Percentages will your LEA Set for 2013-14?
*This information may appear differently in eTPES Spring 2014.

Educator Category | Value-Added / Vendor Assessment | LEA Measures* | Total
A1 (exclusive Value-Added) | Value-Added: 26-50% | Remaining % may be split among SLOs and Shared Attribution | 50%
A2 (non-exclusive Value-Added) | Value-Added (representative & proportionate to teacher schedule): 10-50% | Remaining % may be split among SLOs and Shared Attribution | 50%
B (Approved Vendor Assessment) | Vendor Assessment: 10% or greater | Remaining % may be split among SLOs and Shared Attribution | 50%
C (LEA Measures) | n/a | Split among SLOs and Shared Attribution | 50%

*For Category A, teachers with Value-Added may also include
ODE-Approved Vendor Assessment data in the LEA Measures portion.
A special thanks to Dr. Kathy Harper, Greene Co. ESC, for contributing to the contents of this slide.
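The table's one hard constraint is that each category's components must total the 50% student-growth half of the evaluation. A worked check, using hypothetical 2013-14 defaults an LEA might set (the numbers are illustrative only, not ODE defaults):

```python
# Hypothetical 2013-14 default SGM splits (percent of total evaluation);
# the category labels follow the table, the numbers are illustrative.
defaults = {
    "A1": {"value_added": 40, "slos": 6, "shared_attribution": 4},
    "A2": {"value_added": 25, "slos": 15, "shared_attribution": 10},
    "B":  {"vendor_assessment": 30, "slos": 15, "shared_attribution": 5},
    "C":  {"slos": 40, "shared_attribution": 10},
}

# Every category's components must total the 50% student-growth share
for category, parts in defaults.items():
    total = sum(parts.values())
    assert total == 50, f"{category} components must total 50%, got {total}%"
print("All categories total 50%")
```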
What Default Percentages will your LEA Set for 2014-15?
*This information may appear differently in eTPES Spring 2014.

Educator Category | Value-Added / Vendor Assessment | LEA Measures* | Total
A1 (exclusive Value-Added) | Value-Added: 50% | Remaining % may be split among SLOs and Shared Attribution | 50%
A2 (non-exclusive Value-Added) | Value-Added (representative & proportionate to teacher schedule): 10-50% | Remaining % may be split among SLOs and Shared Attribution | 50%
B (Approved Vendor Assessment) | Vendor Assessment: 10% or greater | Remaining % may be split among SLOs and Shared Attribution | 50%
C (LEA Measures) | n/a | Split among SLOs and Shared Attribution | 50%

*For Category A, teachers with Value-Added may also include
ODE-Approved Vendor Assessment data in the LEA Measures portion.
A special thanks to Dr. Kathy Harper, Greene Co. ESC, for contributing to the contents of this slide.
Special Considerations
• Are there any special considerations to include in the LEA
SGM plan?
 Possible Example:
 New Teacher
• If the district decides to allow variation from the default
percentages, it must make manual adjustments within
eTPES.
Districts should try to be as consistent as possible when setting
percentages.
Percentages should not be determined by individual teachers
or based on individual past results.
• Inventory any teachers that require special consideration
for SGMs.
LEA Default Percentages
Percentages for Principals in Category:
• A
• B
• C
SGM Default Percentages for Principals
*This information may appear differently in eTPES Spring 2014.
*For Category A principals, this could also include the average of all
teachers' growth ratings on ODE-Approved Vendor Assessment data.
Determine how the LEA will implement
the local measures process.
• Will shared attribution measures be used?
• Who is required to create SLOs?
• Within the guidelines of 2-4 SLOs, how many SLOs
are required for each teacher?
• Who will be approving the SLOs?
• How will SLOs be tracked, through revisions, and to
final approval?
• What guidance, training, and support will be
provided to teachers and evaluators?
Will shared attribution measures be
used?
• What shared attribution measures are we
using?
• Have we secured the proper reports?
• Will the same shared attribution measures
be used for all teachers within each SGM
category?
Note: Only one shared attribution measure
may be used per teacher.
Will SLOs be used?
• Who is required to create SLOs?
 Which categories of teachers will have LEA measures?
 Did we select SLOs as an LEA measure?
 Which SGM categories will this include?
• Within the guidelines of 2-4 SLOs, how many SLOs are required for each
teacher?
• What assessments will be used?
 Refer to the LEA’s “Available Assessments Inventory”
 If assessments do not exist for certain grade level(s) and/or courses, have we followed the
“SLO Guidelines for Selecting Assessments”?
 Will we have a district-approved list of SLO assessments?
• Inventory teachers writing SLOs, the number of SLOs, and the courses for
which SLOs will be written.
SLO Approval
• Who is approving SLOs in our LEA?
 LEAs are responsible for SLO approval.
 ODE recommends that this process be completed by a
committee.
• Has SLO calibration been completed?
 SLO calibration is the process of ensuring a
thorough and fair review of all SLOs by
systematically requiring high quality and rigor
across SLOs.
SLO Procedures
• How will SLOs be tracked?
 Submission
 Revisions
 Final Approval
• What guidance, training, and support
will be provided to teachers and
evaluators?
SLO Tracking Form
A per-teacher form with columns for Teacher Name, SLO Event, and Date
Completed, tracking these events:
• Original SLO Submission
• Committee Feedback Provided to Teacher
• SLO Approval
• Midpoint Check-In (recommended, not required)
• SLO End-of-Interval Scoring and Conference
• Final SLO Score Entered in eTPES
SLO Professional Development Form (Example)
A grade-by-subject grid with rows for grade levels K-7 and columns for ELA,
Math, Science, Soc. St., P.E., Art, Music, and two fill-in Other subjects.
Step Three: Communicate Expectations
and Refine the Entire Process
• Design communication plans, training, and
professional development opportunities around
requirements and implementation for teachers and
their evaluators.
 How will the SGM plan be communicated to all
stakeholders?
 What training will be provided to educators, e.g. SLO
training, Interpreting Teacher-Level Value Added
Reports, SGM training?
 What ongoing professional development and support
will be needed to sustain and refine the process?
SGM Professional Development Form (Example)
A form with columns for:
• Date
• Agenda Items
• Target Audience (identify which teachers will attend the training)
• Follow-Up Training Date (if applicable)
• Follow-Up Training Agenda Items
• Target Audience (identify which teachers will attend the follow-up training)
Additional Session Resources
To access session resources, including the
step-by-step workbook and templates on
how to design your own LEA SGM Plan, go
to:
http://bit.ly/SGMPlan
Works Cited
• Ohio Department of Education. (2013, May 12). Steps for Designing a Local Student Growth Measures Plan. Retrieved
from Ohio Department of Education: http://education.ohio.gov/Topics/Teaching/Educator-Evaluation-System/Ohio-s-
Teacher-Evaluation-System/Student-Groth-Measures/Additional-Information/Steps-for-Designing-a-Local-Student-
Growth-Measure
• LaWriter Ohio Laws and Rules. (2013, March 22). 3319.111 Applicability of section; evaluating teachers on limited
contracts. Retrieved from LaWriter Ohio Laws and Rules: http://codes.ohio.gov/orc/3319.111
• LaWriter Ohio Laws and Rules. (2013, September 29). 3319.112 [Effective 9/29/2013] Standards-based state framework
for the evaluation of teachers. Retrieved from LaWriter Ohio Laws and Rules: http://codes.ohio.gov/orc/3319.112v2
• Ohio Department of Education. (2013, July 26). Approved Vendor Assessments. Retrieved from Ohio Department of
Education: http://education.ohio.gov/Topics/Teaching/Educator-Evaluation-System/Ohio-s-Teacher-Evaluation-
System/Student-Growth-Measures/Approved-List-of-Assessments#approved
• Ohio Department of Education. (2013, September 13). Local Measures. Retrieved from Ohio Department of Education:
http://education.ohio.gov/getattachment/Topics/Academic-Content-Standards/New-Learning-Standards/Student-
Learning-Objective-Examples/041113-Guidance_on_Selecting_Assessments_for_SLOs.pdf.aspx
• Ohio Department of Education. (2013, September 13). Local Measures. Retrieved from Ohio Department of Education:
http://education.ohio.gov/getattachment/Topics/Academic-Content-Standards/New-Learning-Standards/Student-
Learning-Objective-Examples/112912-SLO-Requirements-and-Recommendations.pdf.aspx
• Ohio Department of Education. (2013, September 25). Student Growth Measures for Teachers. Retrieved from Ohio
Department of Education: http://education.ohio.gov/getattachment/Topics/Teaching/Educator-Evaluation-System/Ohio-
s-Teacher-Evaluation-System/Student-Growth-Measures/091913_Business-rules-for-SGM-FINAL-040913-3.pdf.aspx
• Ohio Department of Education. (2013, September 25). Student Growth Measures for Teachers. Retrieved from Ohio
Department of Education: http://education.ohio.gov/getattachment/Topics/Teaching/Educator-Evaluation-System/Ohio-
s-Teacher-Evaluation-System/Student-Growth-Measures/091913_Combining-the-SGM-scores-one-pager.pdf.aspx
Acknowledgements
A special thank you to the following individuals for
editing, proofreading, and helping shape my
knowledge of the information contained in this
presentation:
• Dr. Kathy Harper, Coordinator/Liaison for State-
School Initiatives, Greene County Educational
Service Center
• Carolyn Everidge-Frey, Assistant Director, Office
of Educator Equity and Talent, Ohio Department
of Education
Questions
Steps for Designing a Local SGM Plan
Mindy Schulz
Director of Curriculum, Allen County ESC
mindy.schulz@allencountyesc.org
ODE Contact Information:
Student Growth Measures
SGM@education.ohio.gov
(P) 614-644-7446
Faculty Profile prashantha K EEE dept Sri Sairam college of EngineeringFaculty Profile prashantha K EEE dept Sri Sairam college of Engineering
Faculty Profile prashantha K EEE dept Sri Sairam college of Engineering
 
ICS2208 Lecture6 Notes for SL spaces.pdf
ICS2208 Lecture6 Notes for SL spaces.pdfICS2208 Lecture6 Notes for SL spaces.pdf
ICS2208 Lecture6 Notes for SL spaces.pdf
 
Mattingly "AI & Prompt Design: Large Language Models"
Mattingly "AI & Prompt Design: Large Language Models"Mattingly "AI & Prompt Design: Large Language Models"
Mattingly "AI & Prompt Design: Large Language Models"
 
Oppenheimer Film Discussion for Philosophy and Film
Oppenheimer Film Discussion for Philosophy and FilmOppenheimer Film Discussion for Philosophy and Film
Oppenheimer Film Discussion for Philosophy and Film
 
How to Fix XML SyntaxError in Odoo the 17
How to Fix XML SyntaxError in Odoo the 17How to Fix XML SyntaxError in Odoo the 17
How to Fix XML SyntaxError in Odoo the 17
 
Visit to a blind student's school🧑‍🦯🧑‍🦯(community medicine)
Visit to a blind student's school🧑‍🦯🧑‍🦯(community medicine)Visit to a blind student's school🧑‍🦯🧑‍🦯(community medicine)
Visit to a blind student's school🧑‍🦯🧑‍🦯(community medicine)
 
4.16.24 Poverty and Precarity--Desmond.pptx
4.16.24 Poverty and Precarity--Desmond.pptx4.16.24 Poverty and Precarity--Desmond.pptx
4.16.24 Poverty and Precarity--Desmond.pptx
 
Paradigm shift in nursing research by RS MEHTA
Paradigm shift in nursing research by RS MEHTAParadigm shift in nursing research by RS MEHTA
Paradigm shift in nursing research by RS MEHTA
 
BIOCHEMISTRY-CARBOHYDRATE METABOLISM CHAPTER 2.pptx
BIOCHEMISTRY-CARBOHYDRATE METABOLISM CHAPTER 2.pptxBIOCHEMISTRY-CARBOHYDRATE METABOLISM CHAPTER 2.pptx
BIOCHEMISTRY-CARBOHYDRATE METABOLISM CHAPTER 2.pptx
 
prashanth updated resume 2024 for Teaching Profession
prashanth updated resume 2024 for Teaching Professionprashanth updated resume 2024 for Teaching Profession
prashanth updated resume 2024 for Teaching Profession
 
Student Profile Sample - We help schools to connect the data they have, with ...
Student Profile Sample - We help schools to connect the data they have, with ...Student Profile Sample - We help schools to connect the data they have, with ...
Student Profile Sample - We help schools to connect the data they have, with ...
 
Concurrency Control in Database Management system
Concurrency Control in Database Management systemConcurrency Control in Database Management system
Concurrency Control in Database Management system
 
INCLUSIVE EDUCATION PRACTICES FOR TEACHERS AND TRAINERS.pptx
INCLUSIVE EDUCATION PRACTICES FOR TEACHERS AND TRAINERS.pptxINCLUSIVE EDUCATION PRACTICES FOR TEACHERS AND TRAINERS.pptx
INCLUSIVE EDUCATION PRACTICES FOR TEACHERS AND TRAINERS.pptx
 
Mythology Quiz-4th April 2024, Quiz Club NITW
Mythology Quiz-4th April 2024, Quiz Club NITWMythology Quiz-4th April 2024, Quiz Club NITW
Mythology Quiz-4th April 2024, Quiz Club NITW
 
Grade Three -ELLNA-REVIEWER-ENGLISH.pptx
Grade Three -ELLNA-REVIEWER-ENGLISH.pptxGrade Three -ELLNA-REVIEWER-ENGLISH.pptx
Grade Three -ELLNA-REVIEWER-ENGLISH.pptx
 
Scientific Writing :Research Discourse
Scientific  Writing :Research  DiscourseScientific  Writing :Research  Discourse
Scientific Writing :Research Discourse
 
Multi Domain Alias In the Odoo 17 ERP Module
Multi Domain Alias In the Odoo 17 ERP ModuleMulti Domain Alias In the Odoo 17 ERP Module
Multi Domain Alias In the Odoo 17 ERP Module
 

How to Design a Local SGM Plan

  • 1. How to Design a Local Student Growth Measures Plan Ohio’s Statewide Education Conference October 28-29, 2013 Presented By: Mindy Schulz, Allen County ESC Director of Curriculum eTPES, OTES, OPES State Trainer
  • 2. Agenda Participants will learn: • How to use the OTES & OPES student growth measures (SGM) framework • The steps for designing a local student growth measures plan • How to conduct an inventory of teachers, principals, and assessments • How to analyze SGM categories and current assessments • How to set default SGM percentages • Considerations for the local education agency (LEA) SGM plan
  • 3. Definition of Student Growth For the purpose of use in Ohio’s evaluation systems, student growth is defined as the change in student achievement for an individual student between two or more points in time.
  • 6. What is a Student Growth Measures Plan? “Teacher evaluation as required by O.R.C. 3319.111 relies on two key evaluation components: a rating of teacher performance and a rating of student academic growth, each weighted at 50 percent of the evaluation. The following guidance speaks to the student growth measures component, specifically addressing determinations to be made for using student growth measures within teacher evaluation.” – K. Harper Special thanks to Dr. Kathy Harper, Greene Co. ESC, for sharing this definition.
  • 7. ODE Steps for Designing a Local SGM Plan 1) Conduct an inventory of needs and resources. 2) Determine and create (if necessary) student growth measures to be used. 3) Communicate expectations and refine the entire process.
  • 8. How can we collaborate with others and who should we include on our LEA design team? 1. What collaborative opportunities are available with other LEAs, ESCs, and Higher Ed? 2. Who should we include on our SGM LEA design team to help plan and communicate to other stakeholders?
  • 9. What assessments does our LEA have available? 1) Is our LEA using any of the ODE approved vendor assessments? 2) Create an Inventory • Which grade level(s) and course(s) have current ODE approved vendor assessments available? • Which assessments are not on the ODE-approved vendor assessment list, but could be used in SLOs? Checklist for Selecting Assessments 3) LEA Considerations: • Does the manner in which our LEA is using the ODE vendor assessment meet the definition of student growth? • Have we secured the vendor assessment growth reports?
  • 10. Who is required to be evaluated by the new evaluation systems? • Any person who is employed under a teaching license or under a professional or permanent teacher’s certificate and who spends at least 50 percent of his/her time providing student instruction. This does not apply to a teacher employed as a substitute. This usually excludes:  Speech pathologists, occupational therapists  Teachers on assignment  Nurses, psychologists, guidance counselors • Create an inventory of teachers in your LEA required to be evaluated by the new evaluation system.
  • 11. Categorize teachers into 3 groups 1) Who has teacher-level EVAAS Value-Added data, grades 4-8 reading and math? (Category A1 & A2) 2) Who does not have teacher-level Value-Added data, but has data from assessments on the ODE approved vendor assessment list? (Category B) 3) Who has no Value-Added or approved vendor data? (Category C)
  • 12. Value-Added Data Timeline (Spring 2013, Fall 2013, 2nd Semester 2013-2014, May 2014)
    - Spring 2013: Grades 4-8 Reading & Math OAAs administered.
    - Fall 2013: Teacher-level value-added reports released (from the Spring 2013 grades 4-8 Reading & Math OAAs); Spring 2013 teacher- and building-level value-added reports uploaded into eTPES by ODE.
    - Spring 2013 teacher-level value-added reports (actually received in Fall 2013) for Category A1 & A2 teachers will be used in calculating the SGM %, based on the LEA’s default percentages.
    - Spring 2013 building-level value-added results for Category A principals will be used in calculating the SGM %, based on the LEA’s default percentages.
  • 13. Value-Added Data Usage A1 teacher in the previous year: Value-Added data from the prior year must be used at 26%-50% in the current year, according to the LEA’s SGM plan. A2 teacher in the previous year: Value-Added data from the prior year must be used at 10%-50% in the current year, according to the LEA’s SGM plan.
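The required usage bands lend themselves to a simple validation check. The following is a minimal, illustrative sketch only (the function name and category labels are assumptions, not an ODE tool); the 26-50% and 10-50% bands come from the slide above.

```python
# Illustrative sketch: check a proposed value-added percentage against the
# required band for a Category A teacher.
# Bands per the slide: A1 = 26-50%, A2 = 10-50%.
REQUIRED_BANDS = {
    "A1": (26, 50),  # instructed value-added courses exclusively last year
    "A2": (10, 50),  # instructed some, but not only, value-added courses
}

def valid_value_added_pct(category: str, pct: int) -> bool:
    """Return True if pct falls inside the band required for the category."""
    low, high = REQUIRED_BANDS[category]
    return low <= pct <= high

print(valid_value_added_pct("A1", 30))  # True
print(valid_value_added_pct("A1", 20))  # False: below the 26% floor for A1
print(valid_value_added_pct("A2", 20))  # True
```

An LEA SGM plan that proposed, say, 20% value-added for an A1 teacher would fail this check and need revision.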
  • 14. Who instructed in a value added course in the previous year? Inventory teachers receiving a teacher-level value added report (Fall 2013) from courses instructed in the previous year (2012-13). • Did the teacher receive a teacher-level value added report this fall? • Who instructed all value added courses last year (exclusively)? • Who instructed some value added course(s), but not exclusively? What percent of time was spent instructing in value added course(s)? • What is the teacher’s current (2013-2014) SGM category? • What is the required value-added weight for the current year (2013-14)?
  • 15. Who has data from assessments on the ODE approved vendor assessment list? Create an Inventory • Which teachers are using an ODE approved vendor assessment? Refer to the LEA’s “Available Assessments Inventory”. • Are any Category A teachers using an ODE approved vendor assessment? If a value-added teacher uses an approved ODE vendor assessment, that assessment becomes a local measure.
  • 16. Who has no Value-Added or approved vendor data? Create an inventory of teachers with no value-added or ODE-approved vendor assessment data. (Category C)
  • 17. Special Considerations: Which Teachers are New to a Value-Added Assignment for the Current Year? • Inventory teachers who did not receive a value-added report in Fall 2013 (from courses instructed in the previous year, 2012-13) but have been assigned to a value-added course for 2013-14 (the current year). This may include:  New teachers, e.g. Year One Resident Educators and new hires  Any teacher whose assignment changed from the prior year to the current year, e.g. a teacher who instructed 3rd grade in the previous year and currently instructs 6th grade math • Determine the 2013-14 (current year) SGM category, dependent upon available data.  Are there ODE-approved vendor assessments available? (Category B)  If there are no ODE-approved vendor assessments available, LEA measures will be used. (Category C)
  • 18. Step Two: Determine LEA Default Percentages What percentages will your LEA attribute to: • Value-Added data (Category A1 and A2)? • Assessments from the ODE approved vendors (Category B)? • Local Measures within each category? (Local Measures may also apply to Category A and B)
  • 19. 2013-2014: How much will our LEA attribute to Teacher-Level Value-Added Data? (O.R.C. 3319.111, O.R.C. 3319.112)
    - A1 (teacher instructs value-added subjects exclusively): Teacher-level value-added 26-50%; LEA measures 0-24%.
    - A2 (teacher instructs value-added subjects, but not exclusively): Teacher-level value-added, proportional to teaching schedule, 10-50%; LEA measures, proportional to teaching schedule, 0-40%.
  • 20. 2014-2015: Category A1 = 50% Teacher Value-Added (O.R.C. 3319.111, O.R.C. 3319.112)
    - A1 (teacher instructs value-added subjects exclusively): Teacher-level value-added 50%.
    - A2 (teacher instructs value-added subjects, but not exclusively): Teacher-level value-added, proportional to teaching schedule, 10-50%; LEA measures, proportional to teaching schedule, 0-40%.
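For Category A2, "proportional to teaching schedule" can be sketched as below. Note the hedge: the clamping to the 10-50% band comes from the slides, but the simple proportion (schedule fraction times the 50% SGM total) is an assumption for illustration only; ODE's business rules define the actual calculation.

```python
# Hypothetical illustration of "proportional to teaching schedule" for a
# Category A2 teacher. The 10-50% band is from the slides; the simple
# proportion used here is an illustrative assumption, not the official
# ODE calculation.
def a2_value_added_weight(value_added_fraction: float) -> int:
    """value_added_fraction: share of the teacher's schedule spent in
    value-added courses (0.0-1.0). Returns a whole-number percentage of
    the evaluation attributed to teacher-level value-added data."""
    raw = round(value_added_fraction * 50)  # proportional share of the 50% SGM total
    return max(10, min(50, raw))            # clamp into the required 10-50% band

# A teacher spending 60% of the day in value-added courses:
print(a2_value_added_weight(0.6))  # 30 (LEA measures would cover the remaining 20%)
print(a2_value_added_weight(0.1))  # 10 (the 10% floor applies)
```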
  • 21. How much will our LEA attribute to Building- Level Value-Added Data for Principals?
  • 22. How much will our LEA attribute to the assessments from the ODE Approved Vendor List?
    - B (approved vendor assessment data available): Vendor assessment 10-50%; LEA measures 0-40%.
    - C (no teacher-level value-added or approved vendor assessment data available): LEA measures 50%.
  • 23. Category B: Special Considerations • How many years has the assessment(s) been administered? • Is there trend data to analyze? • Are there variations in the number of vendor assessments available by course and/or grade level?
  • 24. What LEA measures will be used? Types: • Student Learning Objectives (SLOs) • Shared Attribution Category A Only: • If LEA measures are used, vendor assessment data may be used.
  • 25. What are SLOs? •A goal that demonstrates a teacher’s impact on student learning within a given interval of instruction. •A measurable, long-term academic target written by an individual teacher or a teacher team.
  • 26. What is shared attribution? • Shared attribution is a collective measure. • The LEA determines which measure of shared attribution it would like to use. • Shared attribution could be:  A building or district value-added score Recommended if available  A building team composite value-added score (e.g. the 5th grade VAM score or the middle school reading/ELA team’s combined VAM score)  Building-level or district-level SLOs
  • 27. What is shared attribution for principals? • These are measures to encourage collaborative goals. • The LEA determines which measure of shared attribution it would like to use. • Shared attribution for principals may include:  District Value-Added Recommended if available  Groups of schools (such as grade level buildings or regional areas within a district) may utilize a composite Value-Added score  District-based SLOs
  • 28. LEA Default Percentages Percentages for Teachers in Category •A1 •A2 •B •C
  • 29. What Default Percentages will your LEA Set for 2013-14? *This information may appear differently in eTPES Spring 2014. Each educator category’s student growth measures total 50%:
    - A1 (Value-Added, exclusive): Value-Added 26-50%; the remaining % may be split among SLOs/Other* and Shared Attribution.
    - A2 (Value-Added, non-exclusive): Value-Added representative of and proportionate to the teacher’s schedule, 10-50%; the remaining % may be split among SLOs/Other* and Shared Attribution.
    - B (Approved Vendor Assessment): Vendor assessment 10% or greater; the remaining % may be split among SLOs/Other* and Shared Attribution.
    - C (LEA Measures): The full 50% may be split among SLOs/Other* and Shared Attribution.
    *For Category A, teachers with Value-Added may also include ODE-Approved Vendor Assessment data in LEA Measures. A special thanks to Dr. Kathy Harper, Greene Co. ESC, for contributing to the contents of this slide.
  • 30. What Default Percentages will your LEA Set for 2014-15? *This information may appear differently in eTPES Spring 2014. Each educator category’s student growth measures total 50%:
    - A1 (Value-Added, exclusive): Value-Added 50%.
    - A2 (Value-Added, non-exclusive): Value-Added representative of and proportionate to the teacher’s schedule, 10-50%; the remaining % may be split among SLOs/Other* and Shared Attribution.
    - B (Approved Vendor Assessment): Vendor assessment 10% or greater; the remaining % may be split among SLOs/Other* and Shared Attribution.
    - C (LEA Measures): The full 50% may be split among SLOs/Other* and Shared Attribution.
    *For Category A, teachers with Value-Added may also include ODE-Approved Vendor Assessment data in LEA Measures. A special thanks to Dr. Kathy Harper, Greene Co. ESC, for contributing to the contents of this slide.
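Whatever defaults an LEA chooses, the pieces within each category must total the required 50% of the evaluation. A minimal sketch of that consistency check, with hypothetical field names and example percentages (not recommendations):

```python
# Minimal sketch (field names and percentages are illustrative only):
# verify that each category's default percentages — the growth measure plus
# LEA measures (SLOs/other and shared attribution) — total exactly 50%.
lea_defaults = {
    "A1": {"value_added": 40, "slos": 5, "shared_attribution": 5},
    "B":  {"vendor_assessment": 30, "slos": 15, "shared_attribution": 5},
    "C":  {"slos": 40, "shared_attribution": 10},
}

for category, parts in lea_defaults.items():
    total = sum(parts.values())
    # Any category that does not sum to 50% signals a plan error.
    assert total == 50, f"Category {category} sums to {total}%, not 50%"
print("All categories total 50%")
```

Running a check like this before entering defaults into eTPES can catch arithmetic slips early, since the tool expects the SGM side of each evaluation to account for exactly half.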
  • 31. Special Considerations • Are there any special considerations to include in the LEA SGM plan?  Possible example: a new teacher • If the district decides to allow variation from the default percentages, it must make manual adjustments within eTPES. Districts should try to be as consistent as possible when setting percentages. Percentages should not be determined by individual teachers or based on individual past results. • Inventory any teachers that require special consideration for SGMs.
  • 32. LEA Default Percentages Percentages for Principals in Category •A •B •C
  • 33. SGM Default Percentages for Principals *This information may appear differently in eTPES Spring 2014. *For Category A principals, this could also include the ODE-Approved Vendor Assessment data average of all teachers’ growth ratings.
  • 34. Determine how the LEA will implement the local measures process. • Will shared attribution measures be used? • Who is required to create SLOs? • Within the guidelines of 2-4 SLOs, how many SLOs are required for each teacher? • Who will be approving the SLOs? • How will SLOs be tracked, through revisions, and to final approval? • What guidance, training, and support will be provided to teachers and evaluators?
  • 35. Will shared attribution measures be used? • What shared attribution measures are we using? • Have we secured the proper reports? • Will the same shared attribution measures be used for all teachers within each SGM category? Note: Only one shared attribution measure may be used per teacher.
  • 36. Will SLOs be used? • Who is required to create SLOs?  Which categories of teachers will have LEA measures?  Did we select SLOs as an LEA measure?  Which SGM categories will this include? • Within the guidelines of 2-4 SLOs, how many SLOs are required for each teacher? • What assessments will be used?  Refer to the LEA’s “Available Assessments Inventory”  If assessments do not exist for certain grade level(s) and/or courses, have we followed the “SLO Guidelines for Selecting Assessments”?  Will we have a district-approved list of SLO assessments? • Inventory teachers writing SLOs, the # of SLOs, and the courses for which SLOs will be written.
  • 37. SLO Approval • Who is approving SLOs in our LEA?  LEAs are responsible for SLO approval.  ODE recommends this process be completed by a committee (or committees). • Has SLO calibration been completed?  SLO calibration is the process of ensuring a thorough and fair review of all SLOs by systematically requiring high quality and rigor across SLOs.
  • 38. SLO Procedures •How will SLOs be tracked? Submission Revisions Final Approval •What guidance, training, and support will be provided to teachers and evaluators?
  • 39. SLO Tracking Form (columns: Teacher Name, SLO Event, Date Completed). SLO events to track:
    - Original SLO Submission
    - Committee Feedback Provided to Teacher
    - SLO Approval
    - Midpoint Check-In (recommended, not required)
    - SLO End-of-Interval Scoring and Conference
    - Final SLO Score Entered in eTPES
  • 40. SLO Professional Development Form (Example): a grid with a row for each grade level (K-7) and columns for ELA, Math, Science, Soc. St., P.E., Art, Music, Other (__________), and Other (__________).
  • 41. Step Three: Communicate Expectations and Refine the Entire Process • Design communication plans, training, and professional development opportunities around requirements and implementation for teachers and their evaluators.  How will the SGM plan be communicated to all stakeholders?  What training will be provided to educators, e.g. SLO training, Interpreting Teacher-Level Value Added Reports, SGM training?  What ongoing professional development and support will be needed to sustain and refine the process?
  • 42. SGM Professional Development Form (Example): columns for Date; Agenda Items; Target Audience (identify which teachers will attend the training); Follow-Up Training Date (if applicable); Follow-Up Training Agenda Items; and Target Audience.
  • 43. Additional Session Resources To access session resources, including the step-by-step workbook and templates on how to design your own LEA SGM Plan, go to: http://bit.ly/SGMPlan
  • 44. Works Cited
    • Ohio Department of Education. (2013, May 12). Steps for Designing a Local Student Growth Measures Plan. Retrieved from Ohio Department of Education: http://education.ohio.gov/Topics/Teaching/Educator-Evaluation-System/Ohio-s-Teacher-Evaluation-System/Student-Groth-Measures/Additional-Information/Steps-for-Designing-a-Local-Student-Growth-Measure
    • LaWriter Ohio Laws and Rules. (2013, March 22). 3319.111 Applicability of section; evaluating teachers on limited contracts. Retrieved from LaWriter Ohio Laws and Rules: http://codes.ohio.gov/orc/3319.111
    • LaWriter Ohio Laws and Rules. (2013, September 29). 3319.112 [Effective 9/29/2013] Standards-based state framework for the evaluation of teachers. Retrieved from LaWriter Ohio Laws and Rules: http://codes.ohio.gov/orc/3319.112v2
    • Ohio Department of Education. (2013, July 26). Approved Vendor Assessments. Retrieved from Ohio Department of Education: http://education.ohio.gov/Topics/Teaching/Educator-Evaluation-System/Ohio-s-Teacher-Evaluation-System/Student-Growth-Measures/Approved-List-of-Assessments#approved
    • Ohio Department of Education. (2013, September 13). Local Measures. Retrieved from Ohio Department of Education: http://education.ohio.gov/getattachment/Topics/Academic-Content-Standards/New-Learning-Standards/Student-Learning-Objective-Examples/041113-Guidance_on_Selecting_Assessments_for_SLOs.pdf.aspx
    • Ohio Department of Education. (2013, September 13). Local Measures. Retrieved from Ohio Department of Education: http://education.ohio.gov/getattachment/Topics/Academic-Content-Standards/New-Learning-Standards/Student-Learning-Objective-Examples/112912-SLO-Requirements-and-Recommendations.pdf.aspx
    • Ohio Department of Education. (2013, September 25). Student Growth Measures for Teachers. Retrieved from Ohio Department of Education: http://education.ohio.gov/getattachment/Topics/Teaching/Educator-Evaluation-System/Ohio-s-Teacher-Evaluation-System/Student-Growth-Measures/091913_Business-rules-for-SGM-FINAL-040913-3.pdf.aspx
    • Ohio Department of Education. (2013, September 25). Student Growth Measures for Teachers. Retrieved from Ohio Department of Education: http://education.ohio.gov/getattachment/Topics/Teaching/Educator-Evaluation-System/Ohio-s-Teacher-Evaluation-System/Student-Growth-Measures/091913_Combining-the-SGM-scores-one-pager.pdf.aspx
  • 45. Acknowledgements A special thank you to the following individuals for editing, proofreading, and helping shape my knowledge of the information contained in this presentation: • Dr. Kathy Harper, Coordinator/Liaison for State- School Initiatives, Greene County Educational Service Center • Carolyn Everidge-Frey, Assistant Director, Office of Educator Equity and Talent, Ohio Department of Education
  • 46. Questions Steps for Designing a Local SGM Plan Mindy Schulz Director of Curriculum, Allen County ESC mindy.schulz@allencountyesc.org ODE Contact Information: Student Growth Measures SGM@education.ohio.gov (P) 614-644-7446

Editor's Notes

  1. Ask by a show of hands how many people this graphic looks familiar to. Explain that on the surface the categories seem fairly simple; however, as we progress through the session, you will learn how to delve deeper into each category (more complex thinking required). A teacher can only be in one category. Always start at the top; wherever the teacher meets the criteria, that is their category. Provide a broad overview of each category. A1 = exclusively teaches value added; A2 = some value added, some not; B = no value added, but has ODE-approved vendor assessment(s); C = no value added or vendor assessment data
  2. Acknowledge & thank Kathy for assistance with definition, editing & providing feedback on PPT.
  3. Reference this info is on ODE’s website. This session will elaborate on each step.
  4. 1.) Share about experiences in Allen Cty. last year. All LEAs collaborated over the course of 3 meetings. Mention participants are given a workbook, a step-by-step companion guide for this session, so they can implement these steps in their own LEA. 2.) Share about Allen Cty. LEAs: most sent the supt., principal(s), BOE member, union reps, members of the transformation team, etc. Ultimately, it is up to each LEA to determine who will be on their SGM design team.
  5. Ask how many have seen the ODE-approved vendor assessment list? Last updated 2/21/12. Ask how many are familiar with the checklist for selecting assessments.
  6. Explain that a teacher has to have the value-added report “in hand” in order to be a Category A teacher. Value added is determined by the schedule the teacher instructed the prior year.
  7. Refer to “Business Rules for SGMs” – 9/19/13 edition
  8. Share about Bluffton using STAR Early Literacy for 13 years (began in 2000 with STAR Reading and upgraded as STAR morphed). Compare them with Spencerville, which just purchased STAR Early Literacy last year. When determining how much weight to place on a vendor assessment, it is important to note the length of time the assessment has been in place and the amount of trend data available; this establishes a greater level of confidence for determining the % to apply to vendor assessments. Bluffton placed a higher % on their vendor assessments because they have 13 yrs. of trend data to analyze.
  9. AIR Module 6: share that calibration is threaded throughout the entire OTES & OPES systems. Credentialing training requires calibration on the performance rating rubrics; emphasize the need to calibrate on any type of rubric, including rubrics used in SLOs for assessments, etc.
  10. Explain downloadable, electronic templates of all of the inventory steps are available here.