1. DAI
Instructional Data Analysis
&
Data Teams
Karen Brooks
Coordinator Instructional Data Analysis, Technology & Assessment
May 6, 2014
kbrooks@ulsterboces.org
http://www.karenbrooks.wikispaces.com
http://www.slideshare.net/kbrooks
2. Focus for Today
• Instructional Data Analysis &
Data Teams:
– What are they?
– How do they function?
– How do they relate to me?
3. Breakdown for the Day
• 9:00 AM Begin
• 10:45 – 11:00 Break
• 12:00 – 12:30 Lunch
• 1:45 – 2:00 Break
• 3:00 PM Conclude
8. Data Driven Teams
Data Driven Schools
Three Pillars of NYS Education
• NYS Common Core Learning Standards
• http://www.engageny.org/resource/new-york-state-p-12-common-core-learning-standard
• Data Driven Instruction
• http://www.engageny.org/data-driven-instruction
• APPR
• http://www.capregboces.org/APPR/FAQs.cfm
10. So What is a Data Team?
• A data team can generally be defined
as a group of educators
collaboratively using data to identify
and understand opportunities for
improvement, then working together
to make changes that get
measurable results. Using protocols
for collaborative inquiry, the group
follows a process in which members
prepare, implement, and reflect on
data-informed actionable goals.
• This simple definition can be applied
broadly at many levels within a
district.
• At the classroom level, teachers use
data to identify student learning
problems and work together to plan
instructional changes that will yield
improvements in learning.
• At the school level, principals and
school improvement teams use data
to identify goals to drive
improvements in the ways teachers
collaborate and learn, thereby
improving results for all students.
• Within a district office, many
departments and leaders use data to
make decisions regarding the
management and efficiency of their
particular responsibilities.
Cheat Sheet:
http://www.sde.ct.gov/sde/lib/sde/pdf/curriculum/cali/dddm_desktop_reference_guide.pdf
11. Instructional Data Team Standards
Rubric
• http://www.sde.ct.gov/sde/lib/sde/pdf/curriculum/cali/standards_for_instructional_data_teams.pdf
Sections
• Membership
• Structure
• Process
• Gap Analysis
• Instructional
Action Planning
Always looking at the process, not the people.
17. Taking Multiple Measures Further
What data would we need in
each of these areas?
• Demographics
• Perceptions
• Student Learning
• School Improvement
Next Steps
• What is the source for this
Data?
• Who has the access?
18. District Data Team Samples
• http://www.middletownschools.org/page.cfm?p=7538 – Website
• http://www.sde.ct.gov/sde/lib/sde/pdf/curriculum/cali/3bdtdatateammeetingsteps.pdf – Collecting, Charting and Discussing
• https://docs.google.com/a/ulsterboces.org/viewer?url=http://www.sde.ct.gov/sde/lib/sde/word_docs/cali/data_teams/guidelines_for_data_walls.doc – Data Walls Guidelines
20. Things are not always what they first
appear
We need to stay focused.
21. Data Collected Needs to be Consistent
in order to have Validity
• School Data: A Comedy – Dirty Data – PBIS Example
• https://www.youtube.com/watch?v=XBv95uMFudE
22. Data Teams
5 Essential Questions They Ask
• Where are we now?
• Where do we want to be?
• How did we get to where we are now?
• How are we going to get to where we want to
be?
• Is what we are doing making a difference?
25. Some Start with a
Data Team Survey.
This focuses the
team.
It also helps to
focus the essential
or guiding questions
that keep the team
on track.
Simply put, what is
the problem of
focus?
26. Definition of an Essential Question
"A guiding question is the fundamental query
that directs the search for understanding."
Guiding questions help provide focus.
27. What do Guiding Questions Look Like?
CHARACTERISTICS:
• Good guiding questions are open-ended yet
focus inquiry on a specific topic.
• Guiding questions are non-judgmental, but
answering them requires high-level cognitive
work.
• Good guiding questions contain emotive
force and intellectual bite.
• Guiding questions are succinct. They contain
few words but demand a lot.
ADVICE FOR DEVELOPING GOOD
GUIDING QUESTIONS:
• Determine the theme or concept you want to
explore.
• Brainstorm a list of questions you believe might
cause you to think about the topic but that
don't dictate conclusions or limit possible
directions of investigation. Wait to evaluate and
refine the list until you have several possibilities.
• The question must allow for multiple avenues
and perspectives.
• Consider the six queries that newspapers
answer: who, what, when, where, how, and why.
28. Sample Guiding Questions
SAMPLE GUIDING QUESTIONS for DATA STUDY:
• How do student outcomes differ by demographics,
programs, and schools?
• To what extent have specific programs,
interventions, and services improved outcomes?
• What is the longitudinal progress of a specific
cohort of students?
• What are the characteristics of students who
achieve proficiency and of those who do not?
• Where are we making the most progress in closing
achievement gaps?
• How do absence and mobility affect assessment
results?
• How do student grades correlate with state
assessment results and other measures?
RESOURCES:
Traver, R. (March 1998). What is a good guiding question? Educational Leadership, pp. 70–73.
Ronka, D., Lachat, M., et al. (December 2008/January 2009). Data: Now what? Educational Leadership, pp. 18–24.
30. The Data Coordinator is always a Member of the Data Team
What Impact does this Team Have?
Build Awareness
• Build a vision for data use that is
grounded in positive student
outcomes
• Articulate the vision for district-
wide systemic data use clearly
and repeatedly with all
stakeholders to paint an evident
image of how the future will be
better if all engage in this work
• Develop and communicate a
sense of positive urgency
• Share the structure and function
of the District Data Team with
school-level teams
Understand Concerns
• Talk openly with staff at all levels
in the district about stress they
may experience as change is
implemented
• Actively listen: solicit and act
upon the concerns of staff
members to facilitate the change
process
• Acknowledge losses that people
may feel as they shift established
habits and approach their work in
new ways
31. Continued
Model the Process
• Lead by example, not by
edict
• Publicly demonstrate how
the District Data Team is
moving toward the vision
• Present the district-level
data overview with school-
level participants and other
district stakeholders
• Design district-level action
plans using the Data-Driven
Inquiry and Action Cycle
Manage the Process
• Conduct and maintain a data inventory that includes school-level data
• Coordinate the upload of local data to the Data Warehouse
• Maintain an up-to-date data dissemination schedule
• Disseminate relevant data sets and displays for school-based action
35. Let’s Look at Some Data:
» Frequency Distribution by Subgroup
» Frequency Distribution
» CC Strands Analysis
» P-Value Comparison
» Sample Grade 6
Glossary of Assessment Terms
40. P-Value
• This common view shows
the p-value for each
question. For multiple
choice items, p-value is
the proportion of
students responding
correctly. For constructed
response items, p-value is
the mean raw score
divided by the maximum
number of score points
for an item.
• District performance on
individual questions can
be compared to regional
levels to determine how
similar students
performed on a particular
question.
The larger the sample size, the more
accurate the results.
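The two p-value formulas described above can be sketched in a few lines. This is a minimal illustration, not part of the presentation; the function names and data are hypothetical.

```python
# Sketch of the two p-value formulas from the slide above.
# All names and data below are hypothetical examples.

def p_value_multiple_choice(responses):
    """Proportion of students answering a multiple-choice item correctly."""
    # responses: list of booleans (True = correct answer)
    return sum(responses) / len(responses)

def p_value_constructed_response(scores, max_points):
    """Mean raw score divided by the maximum score points for the item."""
    return (sum(scores) / len(scores)) / max_points

# 8 of 10 students answered the multiple-choice item correctly
print(p_value_multiple_choice([True] * 8 + [False] * 2))  # → 0.8

# Constructed-response item worth 4 points; mean score is 2.5
print(p_value_constructed_response([3, 2, 4, 1, 2, 3], 4))  # → 0.625
```

Comparing these per-item values against regional results, as the slide suggests, simply means computing the same proportions over the regional student population.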
41. P-Value Basics
• Item difficulty is also referred to as the p-value.
• The range is from 0% to 100%, or more
typically written as a proportion of 0.0
to 1.00.
• The higher the value, the easier the
item.
• Calculation: Divide the number of
students who got an item correct by the
total number of students who
answered it.
• Ideal value: Slightly higher than midway
between chance (1.00 divided by the
number of choices) and a perfect score
(1.00) for the item. For example, on a
four-alternative, multiple-choice item,
the random guessing level is 1.00/4 =
0.25; therefore, the optimal difficulty
level is .25 + (1.00 - .25) / 2 = 0.62. On a
true-false question, the guessing level is
(1.00/2 = .50) and, therefore, the
optimal difficulty level is
.50 + (1.00 - .50)/2 = .75.
• P-values above 0.90 are very easy items
and should be carefully reviewed based
on the instructor’s purpose. For example,
if the instructor is using easy “warm-up”
questions or aiming for student mastery,
then some items with p-values above .90
may be warranted. In contrast, if an
instructor is mainly interested in
differences among students, these items
may not be worth testing.
• P-values below 0.20 are very difficult
items and should be reviewed for possible
confusing language, removed from
subsequent exams, and/or identified as an
area for re-instruction. If almost all of the
students get the item wrong, there is
either a problem with the item or
students were not able to learn the
concept. However, if an instructor is trying
to determine the top percentage of
students that learned a certain concept,
this highly difficult item may be necessary.
https://www.utexas.edu/academic/ctl/assessment/iar/students/report/itemanalysis.php
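The optimal-difficulty calculation and the 0.90/0.20 review thresholds above can be expressed as a short sketch. This is an illustration only; the function names and the flag wording are my own, not from the source.

```python
# Sketch of the optimal-difficulty rule and review thresholds
# from the slide above. Function names are hypothetical.

def optimal_p_value(num_choices):
    """Midway between the chance level (1/num_choices) and a perfect 1.00."""
    chance = 1.0 / num_choices
    return chance + (1.0 - chance) / 2

def review_flag(p_value):
    """Label items the slide suggests reviewing (wording is illustrative)."""
    if p_value > 0.90:
        return "very easy - review against the instructor's purpose"
    if p_value < 0.20:
        return "very difficult - check wording or plan re-instruction"
    return "ok"

print(optimal_p_value(4))  # four-choice item → 0.625
print(optimal_p_value(2))  # true/false item → 0.75
print(review_flag(0.95))
print(review_flag(0.10))
```

Note that items flagged "very easy" or "very difficult" are not automatically bad; as the slide says, the instructor's purpose (mastery checks vs. discriminating among students) decides whether to keep them.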
49. Balanced Assessments
• Same exams given
• Common rubrics
• Assessments given during the same time period
• Results are examined through data conversations with building leaders.
• Conversations may also be grade-level or subject-level wide; they may also be building-wide and make their way to the District Data Team.