Presented at the 2013 NPEA conference by: Brigham Nahas Research Associates, The Steppingstone Foundation, Kingsbury Center at NWEA
http://www.educational-access.org/npea_conference_speakers2013.php
The Education Lifecycle of African American and Latino/a Students: From Middl...
College Success Academy: Launching a New Program with Research and Evaluation Partners
1. Launching a new program with
research and evaluation partners
NPEA Conference
April 11, 2013
2. Agenda
• Welcome and introductions
• How we got here
• Program design: the scorecard
• Planning evaluation activities
• What we learned
• The next chapter: measuring non-cognitive skills
3. Who we are
• Robert Theaker, Senior Research Associate
The Kingsbury Center, Northwest Evaluation
Association
• Roblyn Brigham, PhD, Managing Partner
Brigham Nahas Research Associates
• Yully Cha, Chief Program Officer
The Steppingstone Foundation
4. What we believe about our work
• Evaluate early and often
• Culture matters
• Broad view of what we mean by “data”
• Findings inspire action
• Students front and center
5. How we got here
• Environmental context
– Summer learning loss and the achievement gap
– College access to persistence/graduation
• The Steppingstone Foundation’s 2009
strategic plan
– The public school venture
6. Program design
1. Academic achievement: Center for Higher Education Studies, UC Berkeley; Cliff Adelman, U.S. Dept of Ed; National Center for Educational Accountability
2. Socio-emotional competency: Malecki et al. (Measuring Perceived Social Support: Development of the Child and Adolescent Social Support Scale)
• Adult relationships: Learning First Alliance (Every Child Learning: Safe and Supportive Schools)
• Self-efficacy: Robbins et al. (Do Psychosocial and Study Skill Factors Predict College Outcomes? A Meta-Analysis)
3. Positive behavior: Balfanz et al. (Preventing Student Disengagement and Keeping Students on the Graduation Path in Urban Middle Grades Schools)
• Attendance
• School disciplinary action
4. College awareness: The Bridgespan Group (Reclaiming the American Dream); Southern Regional Education Board (Middle Grades to HS: Mending a Weak Link); Choy, U.S. Dept of Ed (Students Whose Parents Did Not Go to College: Postsecondary Access, Enrollment and Persistence)
7.
8. Academic achievement
• The search for the right tool
– Summative
– Formative
– Measure summer learning
– And we want to compare against national norms
Measures of Academic Progress (MAP)
9. National Map
• Partners in 50 states
• Over 5,000 partner districts
• Over 6 million students assessed
• Partners in 100 foreign countries
10.
11. NWEA uses a RIT scale (Rasch unit)
• Equal interval
• Linked to curriculum
• Achievement scale
• Cross-graded
• Shows growth
• Greater score precision
• Like an academic ruler
[Figure: the RIT scale drawn as a vertical ruler running from Beginning Reading to Adult Reading, with sample readers plotted at their scores: Daniel at 184, Devon at 207, Grace at 232]
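For reference, the Rasch model behind the RIT scale relates the probability that a student answers an item correctly to the gap between student ability θ and item difficulty b on a shared logit scale; the RIT score is a linear rescaling of θ (the 200/10 constants below are the commonly cited convention, not taken from this deck):

```latex
P(\text{correct} \mid \theta, b) = \frac{e^{\theta - b}}{1 + e^{\theta - b}},
\qquad \text{RIT} \approx 200 + 10\,\theta
```

Because θ sits on an equal-interval scale, a given RIT gain represents the same amount of growth anywhere on the ruler, which is what makes cross-graded comparison possible.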
13. Planning with external evaluator
• Logic model: articulate "the model," define outcomes and their sequence, build consensus around data
• Implement observations, focus groups, and interviews; hold internal team debrief sessions; build the SPSS database; mid-year observations and report
• Continue evaluation activities; three-year analysis and report
14. Data collection plan
• Qualitative Data
– Observations, interviews, focus groups
– Program staff, teachers, parents, tutors, and
students
• Quantitative Data
– Scorecard and MAP
– Non-cognitive measures
– Surveys to capture parents' perspectives
• Process of sharing what we learn
15. What we learned: qualitative
• Strength: program administration is strong. Challenge: a demanding job for teachers/staff. Response: change the schedule and temperature.
• Strength: the program "culture" is taking root. Challenge: the culture needs to deepen to transform. Response: admission info and interview sessions.
• Strength: Scholars' enthusiasm for the academics. Challenge: defining whom to serve and who can best benefit from the program. Response: IEP info collected in admission; improve faculty orientation.
16. What we learned: quantitative

Class 1 | Class 2 (rates):
• Summer 2011: 93% | n/a
• Academic Year 2011-2012: 87% | n/a
• Summer 2012: 92% | 94%

Class 1 | Class 2 (counts):
• Summer 2011: 42/46 | n/a
• Academic Year 2011-2012: 35/42 | n/a
• Summer 2012: 29/35 = 63% | 26/46 = 78%

MAP results (average Score, SGP*, A/P** for English Language Arts | Math):
• Grade 4 (2011): 230, n/a, 13 | 230, n/a, 11
• Grade 5 (2012): 232, 55, 18 | 238, 65, 19
*SGP = student growth percentile

GPA and school attendance:
• Grade 4 (2011): average GPA 2.7; average days missed from school 8
• Grade 5 (2012): average GPA 2.8; average days missed from school 6
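One plausible reading of the enrollment counts above (an assumption; the slide does not label them) is that each fraction is stage-over-prior-stage enrollment, with the final percentage cumulative from the original cohort: 29 of Class 1's original 46 Scholars is 63%. A minimal check of that reading:

```python
def cumulative_retention(still_enrolled, original_cohort):
    """Percent of the original cohort still enrolled, as a whole number."""
    return round(100 * still_enrolled / original_cohort)

# Class 1: 46 Scholars in Summer 2011, 29 still enrolled by Summer 2012
print(cumulative_retention(29, 46))  # 63
```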
17. What we learned:
measuring academic impact
• Year one
– Correcting mistakes in how the test is administered
– Learning how to read and report the results
• Year two
– Summer learning impact and school year effects
– English Language Learners
– Boys
19. How we create a Virtual Comparison Group (VCG)
• Identify included students
• Identify all matching students from the GRD
– Grade, subject, starting achievement, school income, urban vs. rural classification, etc.
• Randomly select the comparison group
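The VCG construction can be sketched as a filter-then-sample step per program student. This is an illustrative reconstruction, not NWEA's actual implementation; the field names, RIT tolerance, and group size are all assumptions.

```python
import random

def build_vcg(target, candidates, group_size=51, rit_tolerance=1, seed=0):
    """Illustrative VCG selection for one program student: keep candidates
    who match on grade, subject, and school profile and whose starting RIT
    score is within a small tolerance, then randomly sample the group.
    (Field names, tolerance, and group size are assumptions.)"""
    matches = [
        c for c in candidates
        if c["grade"] == target["grade"]
        and c["subject"] == target["subject"]
        and c["school_income"] == target["school_income"]
        and c["urbanicity"] == target["urbanicity"]
        and abs(c["start_rit"] - target["start_rit"]) <= rit_tolerance
    ]
    rng = random.Random(seed)  # fixed seed so the sample is reproducible
    return rng.sample(matches, min(group_size, len(matches)))

# Example: only the candidate matching every field within 1 RIT point survives
target = {"grade": 5, "subject": "math", "school_income": "high",
          "urbanicity": "urban", "start_rit": 230}
pool = [dict(target, start_rit=231), dict(target, start_rit=240),
        dict(target, grade=6)]
print(len(build_vcg(target, pool)))  # 1
```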
20. Virtual Comparison Groups (VCGs)
• Compare your student's growth to similar students in similar schools
21. Next chapter: measuring non-cognitive skills
• Virtual Comparison Group (VCG) analysis
• External evaluator year-three report
• Non-cognitive assessments
– Holistic Student Assessment
– Survey of Academic and Youth Outcomes (SAYO)
Editor's Notes
• Why TSF chose MAP over others: it is adaptive and covers a continuous spectrum (i.e., more questions at the student's level; greater precision in reflecting growth in knowledge than a "single-form test").
• "Single-form test": for example, TerraNova, Iowa, or ACT; a fixed group of questions targeted toward a grade rather than toward specific students (50 questions vs. thousands of questions to draw from; timed vs. untimed, which helps gauge engagement).
• Reliability/validity: MAP has a smaller standard error of measurement.