Building Data Literacy Among Middle School Administrators and Teachers
Data literacy is an essential skill for middle school administrators and teachers. In this session, the Research and Accountability Team from Durham Public Schools discusses how it has expanded its Data-to-Action focus to include building data literacy among its middle school administrators and teachers during 2013-14.
J. Brent Cooper, Terri Mozingo & Karin Beckett, Durham Public Schools, Durham, NC
1. Data-to-Action: Building Middle School Administrators' and Teachers' Data Literacy Capacity in Durham Public Schools
2014 North Carolina Association of Middle Level Educators Conference (NCAMLE)
March 17, 2014
Dr. J. Brent Cooper and Dr. Terri Mozingo
2. Desired Outcomes
• Describe actions, strategies, and trainings within DPS that support the development of a Data-to-Action, data-literate culture among central office/school administrators and teachers.
• Share ways school administrators' data literacy skills have been developed through work with the Deputy Superintendent of Academic Services, Dr. Stacey Wilson-Norman, and Research and Accountability during 2013-14.
• Present the Area Network Model as applied to the work of the Middle School Area Network under the leadership of Dr. Julie Spencer, Middle School Area Superintendent.
• Summarize lessons learned.
• Project the course of data literacy trainings for the rest of 2013-14 and for the 2014-15 school year.
4. Significance of Data Literacy Training
• Increase emphasis on developing high-quality, data-literate school administrators to lead schools.
• Empower principals and central office leaders to develop data-literate assistant principals and teachers.
5. What Does it Mean to Be Data Literate?
A data literate person possesses the knowledge to gather, analyze, synthesize, and convey information graphically and in writing to support decision making.
6. Skills of a Data Literate Person
Five skills spanning finding, evaluating, and using data:
• Data Location
• Data Comprehension
• Data Interpretation
• Instructional Decision Making
• Question Posing
Source: U.S. Department of Education report, Teachers' Ability to Use Data to Inform Instruction: Challenges and Supports (2011)
7. Data Location
• The ability to find relevant data
• The ability to manipulate data from a complex table or graph to support reasoning
Source: U.S. Department of Education report, Teachers' Ability to Use Data to Inform Instruction: Challenges and Supports (2011)
8. Data Interpretation
• Examining score distributions
• Understanding the effects of outliers
• Appreciating limits of generalizability
• Understanding measurement error
Source: U.S. Department of Education report, Teachers' Ability to Use Data to Inform Instruction: Challenges and Supports (2011)
9. Data Use for Instructional Decision Making
• Understanding the value of scale scores and item-level data
• Using student data to plan differentiated instruction based on student needs
• Synthesizing multiple data sources to inform instructional practices
Source: U.S. Department of Education report, Teachers' Ability to Use Data to Inform Instruction: Challenges and Supports (2011)
10. Question Posing
• Aligning questions with purpose and data
• Forming queries that lead to actionable data
• Appreciating the value of multiple measures
Source: U.S. Department of Education report, Teachers' Ability to Use Data to Inform Instruction: Challenges and Supports (2011), in which the research team identified these five skills
11. Four Types of Data
1) Achievement or Assessment Data. Definition: data used to determine the level of student achievement in a particular content area. Examples: performance-based assessments, written exams, or quizzes.
2) Demographic Data. Definition: data that provide descriptive information about the school community and the students served in the community. Examples: enrollment, gender, ethnicity, economic status, student attendance, grade levels, school suspensions, limited English proficiency status, and special education status.
3) Program Data. Definition: data that define the programs, instructional strategies, and classroom practices of the teachers. Examples: assessment/achievement data, attendance, perception data, and implementation records.
4) Perception Data. Definition: data that tell us what students, parents, staff, and others think about the learning environment. Examples: questionnaires, interviews, surveys, and observations.
Source: Guide to Using Data in School Improvement Efforts: A Compilation of Knowledge From Data Retreats and Data Use at Learning Points Associates (December 2004)
12. Assessment and Achievement Data Tiers
Tier IV (Federal). Purpose: annual large-scale testing for federal accountability. Rate of feedback: infrequent. Type of feedback: general and broad. Example: Condition of Education. Primary targets of feedback: federal and state policymakers, taxpayers, administrators.
Tier III (State). Purpose: annual large-scale testing for accountability. Rate of feedback: infrequent. Type of feedback: general and broad; summative. Examples: End-of-Grade exams, End-of-Course exams. Primary targets of feedback: policymakers, board members, community, administrators.
Tier II (District). Purpose: periodic grade-level and subject-area testing. Rate of feedback: infrequent/frequent. Type of feedback: specific and formative. Example: local assessments. Primary targets of feedback: administrators, teachers.
Tier I (Classroom). Purpose: ongoing classroom assessment. Rate of feedback: frequent. Type of feedback: specific and narrow; formative. Examples: unit exams, classroom projects, homework, class work. Primary targets of feedback: teachers, students.
13. Questions to Guide Achievement Data Collections
• What evidence can we collect about our students' learning?
• What evidence do we have that shows the knowledge, skills, and understandings our students have achieved?
• Which data indicate the degree to which our students show the conceptual understandings and generalizations in our standards?
• What evidence shows which students are meeting or exceeding our achievement expectations and which are not?
• What do we know about how each individual student learns?
14. Locating Achievement or Assessment Data
Local (Tier II):
• PAPA Datamining Reports
• Achievement Series
• State Testing Portal
• mClass
State (Tier III):
• NCDPI Data and Statistics website
• University of North Carolina General Administration
• Education Value-Added Assessment System (EVAAS)
National (Tier IV):
• National Assessment of Educational Progress
• Federal Data Warehouse
15. Types of Demographic Data
Student Demographics:
• Gender
• Ethnicity
• Limited English Proficiency
• Exceptional Children
• Parent Characteristics
School Demographics:
• Mobility Patterns
• School Transportation Needs
• LEP and EC Participation Rates
• Suspension/Expulsion Rates
• Promotion/Retention Rates
• Graduation/Dropout Rates
• Free/Reduced Lunch Percentages
• Neighborhood Characteristics
• Parent Involvement
• Behavior and Social Problems of Students
16. What is a Program Evaluation (PE)?
What is a program evaluation?
• A program evaluation is a purposeful, systematic, and careful collection and analysis of information used to document the effectiveness of programs, establish accountability, and identify areas needing change and improvement.
What are the goals of a program evaluation?
• The generic goal of most evaluations is to provide "useful feedback" to a variety of audiences including sponsors, donors, client groups, administrators, staff, and other relevant constituencies.
17. Why Evaluate Programs?
Evaluations are conducted to:
• Gain insight about a program and its operations: to see where we are going and where we are coming from, and to find out what works and what doesn't.
• Improve practice: to modify or adapt practice to enhance the success of activities.
• Assess program effects: to determine how well we are meeting goals and objectives, how the program benefits the community, and to provide evidence of effectiveness.
• Help build capacity by increasing funding, enhancing skills, and strengthening accountability.
18. What are the Types of Evaluations?
Formative evaluations: evaluability assessment, implementation evaluation, needs assessment, process evaluation, structured conceptualization.
Summative evaluations: cost-effectiveness and cost-benefit analysis, impact evaluation, meta-analysis, outcome evaluation, secondary analysis.
19. Examples of the Types of PEs
Formative evaluations: evaluability assessment, implementation evaluation, needs assessment, process evaluation, structured conceptualization.
Examples of formative PEs: afterschool programs, Common Core implementation, dropout programs, helpdesk process, literacy program.
20. Examples of the Types of PEs
Summative evaluations: cost-effectiveness and cost-benefit analysis, impact evaluation, meta-analysis, outcome evaluations, secondary analysis.
Examples of summative PEs: one-to-one laptop initiatives, online instruction, effects of early education on kindergarten readiness, Science, Technology, Engineering, and Mathematics (STEM) program, school management practices on student performance.
21. Constituents of School-Level Perception Data
Internal constituents: students and parents, teachers and staff, administrators, school board.
External constituents: community citizens, community businesses, regional colleges and universities.
22. Guiding Questions for Collecting Perception Data
• How do the members of our school community feel about our school and district?
• How satisfied are school community members with our educational programs?
• What do the members of our school community perceive to be the strengths and needs of our school?
• What do members of our business community think about the skills of our graduates?
23. Educate… Previous vs. Now
Previous:
• ABCs
• Grade-level proficiency focus
• Different state and federal models
Now:
• NC READY
• Career- and college-ready focus
• Closing achievement gaps
• State and federal models share AMOs and targets
• Teacher effectiveness
24. Objectives of Data Literacy Trainings
• Discuss components of the NC READY state model
• Review the federal and state Annual Measurable Objectives (AMOs)
• Review the 2012-13 data reports
• Discuss implications of data and next steps
• Provide additional support to schools
25. Educate… 2012-13 NC READY and AMO Guide
• NC READY indicators: definitions
• State and federal Annual Measurable Objectives (AMOs) indicators (federal, state, and AMO definitions)
• AMO terminology/definitions
26. NC READY Accountability Model Indicators
• Status Indicators ("this year"): End-of-Grade/End-of-Course (reading, mathematics, science), ACT, graduation rates, math course rigor, WorkKeys, Graduation Project.
• Progress Indicators ("trend over time for groups of students"): Δ End-of-Grade/End-of-Course (reading, mathematics, science), Δ ACT, Δ graduation rates, Δ math course rigor, Δ WorkKeys.
• Growth Indicators ("combined individual students' growth"): EVAAS school-wide growth (a simplified sketch of the idea follows).
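EVAAS itself is a proprietary SAS value-added model, so the following is only a minimal sketch of the growth-indicator idea under the assumption that "growth" compares each student's actual score to a prediction from the prior year's score. Every column name, number, and threshold here is hypothetical.

```python
# Toy illustration of a school-wide growth indicator (not the real EVAAS
# model): predict this year's score from last year's with a least-squares
# line, then average each school's residuals.
import numpy as np
import pandas as pd

scores = pd.DataFrame({
    "school":     ["A", "A", "A", "B", "B", "B"],   # hypothetical schools
    "score_2012": [430, 445, 450, 440, 455, 460],
    "score_2013": [442, 450, 452, 441, 452, 463],
})

# District-wide prediction line: expected 2013 score given the 2012 score.
slope, intercept = np.polyfit(scores["score_2012"], scores["score_2013"], 1)
scores["residual"] = scores["score_2013"] - (intercept + slope * scores["score_2012"])

# Mean residual per school; the +/-1 band is an arbitrary threshold.
growth = scores.groupby("school")["residual"].mean()
status = growth.apply(lambda g: "exceeded" if g > 1 else ("not met" if g < -1 else "met"))
print(status)
```

A positive mean residual suggests students scored above expectation given where they started, which is the intuition behind "not met / met / exceeded" school-wide growth labels.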
27. Communicate… Our School's 2012-2013 Data Announcement
• Performance composite
• School-wide growth status
• Strengths
• Areas of opportunity
28. System of Supports
• Communication: Do you need help framing your story?
• Understanding data: Do you need help understanding your data?
• Next steps: Do you need help with next steps?
29. Systems of Support
• Communication: Public Information and Community Engagement (PICE)
• Data support: Research and Accountability
• Next steps: Area Support (Elementary, Middle, High School)
30. Next Steps
• School Improvement Plan review and revisions
• Student intervention: school-level reports in coming weeks
• EVAAS value-added reporting: more information after Thanksgiving
• Communicating with the community: no surprise, baseline year, continued focus on growth
31. Research and Accountability: Academic Services Data Summit, December 4, 2013
"It's not about giving the assessments; it's about doing something about the results." Dr. Doug Reeves (2005)
Summit focus areas: EVAAS growth, AMO targets, proficiencies, research articles.
32. Data Release
After the initial release of the data, we will do the following, as a middle school:
• Review the data with our School Improvement Team
• Review student data to determine if we need to provide intervention
• Provide teachers more information about the EVAAS value-added data that is included in Standard 6 of the teacher evaluation instrument
• Host a parent meeting to go over our school data
34. Essential Questions: AMO Targets
Question to guide the student AMO targets discussion:
1) What are our biggest AMO target challenges?
35. Essential Questions: Proficiencies
Questions to guide the student proficiencies discussion (a data sketch follows this list):
1) Where are our biggest opportunities to move students to proficiency?
2) How have the new cut scores impacted AIG proficiencies?
3) How will the new AMO goals impact AIG identification?
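One hedged way to operationalize question 1 here, and the AMO question on the previous slide, is to rank subgroups by the gap between their proficiency rate and their AMO target. The table layout below (school, subgroup, pct_proficient, amo_target) and all numbers are illustrative assumptions, not a DPS file format.

```python
# Rank the biggest proficiency opportunities as the gap between each
# subgroup's proficiency rate and its AMO target (hypothetical data).
import pandas as pd

amo = pd.DataFrame({
    "school":         ["MS A", "MS A", "MS B", "MS B"],
    "subgroup":       ["All", "EDS", "All", "LEP"],
    "pct_proficient": [46.2, 31.5, 52.8, 24.0],
    "amo_target":     [55.0, 45.0, 55.0, 40.0],
})

amo["gap_to_target"] = amo["amo_target"] - amo["pct_proficient"]

# Largest positive gaps are both the biggest AMO challenges (slide 34)
# and the biggest opportunities to move students to proficiency.
print(amo.sort_values("gap_to_target", ascending=False))
```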
36. Essential Questions: EVAAS Growth
Questions to guide the Education Value-Added Assessment System (EVAAS) discussion (a sketch for questions 1-6 follows this list):
1) In what grade levels are students experiencing academic growth?
2) In what grade levels are students not experiencing academic growth?
3) In what schools are students experiencing academic growth?
4) In what schools are students not experiencing academic growth?
5) In what subjects are students experiencing academic growth?
6) In what subjects are students not experiencing academic growth?
7) In what grades are teachers not demonstrating teaching effectiveness?
8) In what schools are teachers not demonstrating teaching effectiveness?
9) In what subjects are teachers not demonstrating teaching effectiveness?
10) In what schools are minority students experiencing the greatest achievement gaps?
11) In what subjects are minority students experiencing the greatest achievement gaps?
12) In what schools are exceptional students experiencing the greatest achievement gaps?
13) In what subjects are exceptional students experiencing the greatest achievement gaps?
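Assuming EVAAS-style growth values can be exported one row per student, questions 1-6 reduce to a group-and-average over each dimension. This sketch uses hypothetical column names and invented values.

```python
# Find where students are, and are not, experiencing growth by grade,
# school, and subject (hypothetical student-level growth export).
import pandas as pd

growth = pd.DataFrame({
    "school":  ["A", "A", "B", "B", "B", "C"],
    "grade":   [6, 7, 6, 7, 8, 8],
    "subject": ["Math", "Reading", "Math", "Math", "Reading", "Reading"],
    "growth":  [1.8, -0.6, 0.4, -1.2, 2.1, -0.3],
})

for dim in ("grade", "school", "subject"):
    means = growth.groupby(dim)["growth"].mean().sort_values()
    print(f"\nMean growth by {dim}:")
    print(means)
    print("Not experiencing growth:", list(means[means < 0].index))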
37. Elementary Area School-Level EOG Test Proficiency Questions
Essential questions to guide the Elementary Area school-level EOG test proficiency discussion:
1) Are there schools with large opportunities for EOG test proficiency improvements (Focus Schools) for multiple grade levels and/or student groups?
2) Are there schools with declining EOG test proficiency, when analyzing a school's data across grades for all student groups, that could have a negative effect on feeder middle schools' 6th-grade students' proficiency?
3) Are there student groups with opportunities for EOG test proficiency improvements in the Elementary Area?
38. Impact of Data Literacy on Teaching and Learning
• Data literacy capacity building will lead to continuous analysis of NC READY, AMO, and EVAAS data and, in turn, local assessment data.
• Student achievement data will be analyzed for the school and district by grade level, test, gender, student group, and teacher (one way to slice the data is sketched below).
• Program evaluations will be completed internally and externally for many current DPS programs (process developed and approved, Summer 2013).
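A pivot table is one plausible vehicle for the second bullet. This sketch slices percent proficient by grade and test; the same call works for gender, student group, or teacher. All column names and values are hypothetical.

```python
# Percent proficient by grade and test, with overall margins
# (swap index/columns for gender, student group, or teacher).
import pandas as pd

records = pd.DataFrame({
    "grade":      [6, 6, 7, 7, 8, 8],
    "test":       ["EOG Math", "EOG Reading"] * 3,
    "teacher":    ["T1", "T1", "T2", "T2", "T3", "T3"],
    "proficient": [1, 0, 1, 1, 0, 1],   # 1 = student scored proficient
})

table = pd.pivot_table(records, values="proficient", index="grade",
                       columns="test", aggfunc="mean", margins=True) * 100
print(table.round(1))
```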
39. Discussion
Brainstorm…
• How are you building data literacy capacity in your school/district?
• For which groups are you building data literacy capacity this school year?
40. Discussion
Brainstorm…
• Through what types of professional development activities are you building your school's/district's data literacy capacity?
• What are your school's/district's goals in building data literacy capacity?
42. Deputy Superintendent's Support for Building Data Literacy Among All DPS Stakeholders
Teaming for Excellence
Documents were created and Data Release Meetings were planned in a collaborative effort by:
1) the Superintendent,
2) the Deputy Superintendent,
3) Academic Services,
4) Research and Accountability, and
5) Public Information and Community Engagement (PICE).
43. Deputy Superintendent's Support for Building Data Literacy Among All DPS Stakeholders
Teaming for Excellence
The goals of these efforts were to:
• Ensure an understanding of the NC READY, AMO, and EVAAS data, and that principals were well prepared to communicate the accountability results to all DPS stakeholders
• Create a Data Release Toolkit to support the principals
• Ensure that terms were understood and specific points were clarified
• Develop Data Release Guides
44. Deputy Superintendent's Support for Building Data Literacy Among All DPS Stakeholders
• Develop and offer 2013 NC READY and AMO Data Release training sessions for principals by school level (area), October-November 2013
• Data Release Guides developed for school administrators by school level
• Data Release Toolkit developed to educate all DPS stakeholders about the release of NC READY, AMO, and EVAAS data
• Data Release parent and faculty meetings planned, with corresponding ConnectEd messages created and sent, to educate all DPS stakeholders about the release of NC READY, AMO, and EVAAS data
45. Deputy Superintendent's Support for Building Data Literacy Capacity in DPS
• Develop and offer 2013 NC READY and AMO Data Literacy training sessions for principals, October-November 2013
• Develop and offer 2013 Central Office Data Literacy Summit for central office instructional support, December 2013
46. Deputy Superintendent, Area Superintendents, and Research and Accountability, 2013-14
Steps to building data literacy capacity in DPS:
• Central office/school administrator data literacy capacity
• 2012-13 NC READY Data Summits
• 2012-13 NC READY Data Dives
• Identification of Area Focus Schools
• Establishment of Area Networks
47. Deputy Superintendent's Support for Building Data Literacy Capacity in DPS
Area Networks
• Implement Area Networks for the Elementary, Middle, and High School Areas to focus data analysis and data literacy capacity building, December 2013.
• Conduct quarterly Data Dives for school administrators and central office instructional support in our continuous data analysis, modification, and re-teaching of curriculum objectives, Spring 2014-Fall 2015.
49. Area Networks
Purpose
Central Services departments provide rapid-response services to each Area of schools through coordinated collaboration and communication around key issues. The purpose of these meetings is to remove barriers, address issues, and respond to school needs in a timely and effective manner. Each meeting should be results-based, with clear action items and timelines to address barriers for schools.
51. Area Networks
Structure
Each Area Superintendent will lead the assigned network. The Executive Leadership supervisor will assign individuals to the appropriate network to provide leadership and service for their respective departments. Every network will have representation for content areas and departments. All networks will meet for two hours on the third Wednesday of the month. A standing agenda has been designed to ensure efficient, consistent, and effective monthly meetings.
52. Area Networks
Members
• Curriculum, Instruction, and Assessment
• Research and Accountability
• Exceptional Children's Program
• Area Facilitators
• Human Resources
• Student, Family, and Community Services
• Leadership Development
• Title I
• State Transformation Team
53. Middle School Area Network: Team Meeting 1
1. Establish a Middle School Area Network working plan
2. Discuss MSAN data and the purpose of making data-driven decisions
3. Introduce the MSAN purpose
4. Create and administer the Middle School Network Priority Survey to identify the most important problems that need immediate attention from the MSAN
54. Middle School Area Network: Team Meeting 2
1. Updates from the current Middle School Network working plan (planning days, data dives, Focus Middle School support, mentors, etc.)
2. Comprehensive planning: what is the #1 problem to be addressed by the MSAN?
3. Data-driven problem solving: problem solving in a particular area based on a data point, with resulting action items
4. Establish MSAN work groups around major problems that need immediate attention from the MSAN
55. Middle School Area Network: Team Meeting 3
1. Reporting out from work group collaboration
58. Deputy Superintendent's Support for Building Data Literacy Capacity in DPS
Focus Schools
• Implementation of Area Focus School models in an effort to provide extra resources to schools with greater academic needs.
• Focus Schools identified by an analysis of the previous year's NC READY, AMO, and EVAAS data, Fall 2014-15.
• Six Focus Middle Schools were identified to receive additional Central Services support for the remainder of the 2013-14 school year.
59. Middle School Area Superintendent's Support for Building Data Literacy Capacity in Focus Middle Schools
• Three collaborative Focus School team meetings between each Focus Middle School's School Improvement Team and Central Services support staff, held at the Focus School.
• The Focus Middle School team includes representatives from multiple Central Services departments.
60. Focus School Data Analysis
Data sets (responsible Central Services departments):
• NC READY (AS/R&A/CIA)
• AMO (AS/R&A/CIA)
• EVAAS (AS/R&A/CIA)
• AIG/EC/ESL (CIA/EC/ESL)
• Discipline: PBIS/OSS (SFCS)
• Staffing (HR)
62. Focus School Team (School)
• School Principal
• School Improvement Team
• ESL/LEP Facilitator
• RtI Facilitator
• AIG Facilitator
• EC Facilitator
63. Focus School Team (District)
• Middle School Area Superintendent
• MS Area Facilitators (CIA, EC, etc.)
• Executive Director of ESL
• Coordinator of MS Staffing
• District AIG Coordinator
• District PBIS Coordinator
64. Focus Middle School Team Goals
1. Pre-meeting analysis of the data
2. Discussion of the data
3. Introduction to the Focus School purpose
4. Identification of action items for the next meeting
65. Focus Middle School Team Meeting 1
1. Pre-meeting analysis of the data
2. Discussion of the data
3. Introduction to the Focus School purpose
4. Identification of action items for the next meeting
66. Leaders and Learners Field Trips
Leaders & Learners participants:
• MS Principals
• ES Principals
• HS Principals
• Central Services Staff
67. Leaders and Learners Field Trips
L & L field trip sites:
• Piedmont Open IB Middle School (Charlotte-Mecklenburg)
• Stories Creek Elementary School (Person)
• Rock Rest Elementary School (Union)
• Dudley High School (Guilford)
68. Data Literacy Capacity Building: Review
Data literacy capacity building, 2013-14:
• Principals
• Central Office Data Summit
• Area Data Dives
69. Data Literacy Capacity Building: Next Steps
Spring 2013-14 and beyond:
• Leaders and Learners Field Trip (3/13/14), Charlotte-Mecklenburg/Guilford/Person/Union
• Area Networks Symposium (monthly)
• Summer Leadership Retreat, June 2014
70. Lessons Learned
• Time must be protected for developing principals' and central office staff's data literacy skills (monthly principals' meetings, monthly Area principals' academies, Area Networks, school/Area/Curriculum and Instruction data dives).
• Principals must go back to their schools and develop the data literacy capacity of assistant principals and teachers (faculty meetings, grade- and subject-level PLCs, School Improvement Teams).
• Data literacy skills must be applied to a comprehensive school- and district-level analysis of NC READY and EVAAS data.
• Data literacy capacity building is a continuous process.
• Data literacy capacity building will support Data-to-Action, data-driven decisions.
71. "If you want to lift ten pounds, you can do it by yourself. If you want to lift two hundred pounds, you'd better have a team." ~ old adage
T - Take time to build the foundation
E - Engage in data-driven dialogue
A - Act together to improve instruction
M - Monitor implementation and results
72. Discussion
Brainstorm…
• Are there questions or comments concerning data literacy capacity building in DPS, your middle school(s), or your school district?
73. Dr. Stacey Wilson-Norman
Deputy Superintendent of Academic Services
stacey.wilson-norman@dpsnc.net
Dr. Terri Mozingo
Assistant Superintendent of Research and Accountability
terri.mozingo@dpsnc.net
Dr. Julie Spencer
Middle School Area Superintendent
julie.spencer@dpsnc.net
Dr. J. Brent Cooper
Data Analyst, Program Evaluator, Coordinator of Surveys and Research
jason.cooper@dpsnc.net
Editor's notes
There are four types of data: achievement/assessment data, demographic data, program data, and perception data.
Notes: Tier III assessments are designed primarily for accountability purposes. While the primary targets of the feedback are policymakers, board members, community members, and administrators, Tier III data can also be used by curriculum teams to evaluate the effectiveness of curriculum. Teachers, students, and parents can also use Tier III data, although it has limited use for these groups because it is only administered once a year. Tier II assessments provide immediate results of student performance in key standards-based skills in a content area and grade level. These assessments are administered periodically throughout the year and are used to assist teachers and administrators in tracking student progress, strengths, and weaknesses in particular content areas. They should be used to create fluid instructional groups, and the data should be used in Professional Learning Communities. Tier I occurs at the classroom level. Assessing students should be a natural part of every teacher's repertoire, and that repertoire should include multiple measures that assess the depth of each student's conceptual understanding as well as their knowledge and skills.
All data should be collected in five-year increments so that trends can be viewed and predictions made. Data should be organized in a way that facilitates comparisons from year to year; I recommend an Excel spreadsheet.
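As a rough illustration of that year-over-year layout (the note suggests Excel; the same sheet could equally be read into pandas), here is a minimal sketch with invented numbers and a hypothetical metric name:

```python
# Five years of one metric side by side; diff() across columns makes the
# year-over-year trend explicit (all values are invented for illustration).
import pandas as pd

trend = pd.DataFrame(
    {"2009-10": [58.1], "2010-11": [59.4], "2011-12": [61.0],
     "2012-13": [44.9], "2013-14": [46.2]},
    index=["Pct proficient, Grade 6 EOG Reading"],  # hypothetical metric
)

print(trend.diff(axis=1).round(1))  # change from the previous year
```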
The entire testing landscape has changed. The ABC model has been replaced with the new NC READY accountability model. While the ABCs focused on grade-level proficiency, NC READY focuses on proficiency as well as college and career readiness. The AYP performance targets have been replaced with AMO targets. Additionally, this model changes how we evaluate teachers, but we will discuss this further at the Nov. 14 meeting.
Today we hope that you will leave with a clear understanding of the components of the NC READY accountability model as well as how the federal and state AMOs are tracked and calculated. Later today, we will also provide you with your school-level data reports. I have asked PICE, your area superintendent, and R&A to provide on-site assistance with the goal of helping you understand your data, communicate your results, and address any other needs you may have as a school leader.
The NC READY accountability model measures school performance in three ways. Status Indicators answer the question: How well are students doing this year? These indicators make up the performance composite score and give the community an overall impression of school achievement; the performance composite focuses on proficiency. Growth Indicators answer the question: Given where students start, how much was the school able to help them grow? The school growth indicator uses EVAAS to determine whether a school has not met, met, or exceeded school-wide growth levels. Please note we will not discuss teacher effectiveness today. Progress Indicators answer the question: How much progress are cohorts making from one year to the next on the performance indicators? These data allow us to look at results over time to determine trends.
You can announce your school's performance composite and growth status as a school. You can share overall general thoughts regarding your school's specific strengths (e.g., Reading/ELA EOG, English II EOC, attendance); I would focus on 2-3 big rocks. You can share overall general thoughts regarding your school's specific areas of opportunity (e.g., science, math in all grades, participation). I would not go into subgroup or grade/content-specific details at this time. The focus of today's meeting is to ensure that everyone is prepared to hear the release. A data meeting should be scheduled after November 7th.
We are here to offer you support in interpreting the data. We will provide you with a toolkit for both your staff and your parents.
From this point on we will: revisit our SIP; identify specific students who need intervention and attention; talk more about our EVAAS data in the coming weeks; and clearly communicate with our community that we know this is not a surprise, that this is our baseline year, that we look forward to moving the needle, and that we will continue our focus on every child growing.