2. Today’s Objectives
• Explain how CQI supports DHS youth
outcomes
• Use CQI as a way to tell your story
• Explain how to use the PSAYDN self-
assessment for CQI
• Develop an implementation strategy for
the PSAYDN self-assessment
3. Today’s Agenda
• Why? – CQI Rationale
• With What? – Using the PSAYDN Tool
• Who + How? – Strategy for Implementing the Tool
• When + How? – Submitting the Assessment
• Q & A about CQI
12. DHS Mission Statement
The Philadelphia Department of Human Services' mission is to provide and promote safety, permanency, and well-being for children and youth at risk of abuse, neglect, and delinquency.
13. DHS OST Project FY 2015 Vision
Create a more intentional connection between the needs of all children and youth, particularly those served by DHS, and the DHS OST Project as an effective prevention strategy and critical community-based resource.
FY 15 focus areas: Youth Impact • Youth Focus • Program Quality Focus
14. Evaluation Framework (Desired Youth Impact)
Youth who participate in DHS-funded OST
services are less likely to be active within DHS;
and
Youth who were formerly active within DHS are
less likely to re-enter the DHS system of services.
15. DHS OST Goals, Outcomes & Indicators

Goal area: Academic success
• Outcomes: 1. Increased Engagement in Learning; 2. Increased Engagement in School; 3. Aspires to Educational Excellence
• Indicators: 1. Improved Quality of Homework Time; 2. Increased Opportunities for Problem Solving & Critical Thinking; 3. Awareness of High School Choices

Goal area: Develop positive life skills
• Outcomes: 1. Improved Life Skills; 2. Improved Relationships
• Indicators: 1. Goal Setting; 2. Personal Accountability; 3. Work with Others to Accomplish Goals; 4. Improved Interactions with Peers; 5. Improved Interactions with Adults

Goal area: Ready for college and/or career
• Outcomes: 1. Prepared for Higher Education and Employment
• Indicators: 1. Awareness of and Interest in Careers
16. DHS OST Goals, Outcomes & Indicators – Elementary School Priorities
Red = Primary; Blue = Secondary
(Goal areas, outcomes, and indicators repeat the full table above; the original slide color-codes the elementary school priorities.)
17. DHS OST Goals, Outcomes & Indicators – Middle School Priorities
Red = Primary; Blue = Secondary
(Goal areas, outcomes, and indicators repeat the full table above; the original slide color-codes the middle school priorities.)
18. DHS OST Goals, Outcomes & Indicators – High School Priorities
Red = Primary; Blue = Secondary
(Goal areas, outcomes, and indicators repeat the full table above; the original slide color-codes the high school priorities.)
19. CQI and You
• Program quality
• Empowering site-level staff
to lead improvements
• WILL NOT determine FY
2016 funding
20. CQI and You
Data gathered will help:
• PHMC and your program work
together to improve quality
• Determine whether youth
experiences in your program
result in desired DHS youth
outcomes
• DHS make decisions about
system-wide supports
21. The CQI Process - 2014
Training on CQI and Tools → Provider self-assessment → Data collection and analysis → Goal setting w/ Provider, Program Specialist → Program Improvements → Site Visit → Program Improvements → Collect data, Reassess (cycle repeats)
Adapted from the National Institute on Out-of-School Time
22. The CQI Process - 2015
Provider Goal Setting in Reapplication → Program Specialist Reviews → Goal setting w/ Provider, Program Specialist → Program Supports and Improvement → Site Visit → Program Supports and Improvement → Collect data, Reassess (cycle repeats)
Adapted from the National Institute on Out-of-School Time
26. Section A – Structure & Management
• Fundamental to the implementation of a good program
• Will likely require management support to complete
• Executive Director
• Human Resources
• Finance
• Useful for larger strategic
planning discussion within
the program and the agency
• Program Specialist will
provide minimal coaching in
this area
27. Section B – Positive Connections
• Helps us examine the relationships between staff, youth,
and caregivers within our programs
• Research and experience tell us that relationships are
key to successful youth development work
• Also assesses connections to community partners
28. Section C – Safety & Health
• Just like sound
management practices,
safety is critical to success
in other areas
• Focuses on emergency
policies, facilities, and
activities that promote
healthy lifestyles
29. Section D - Activities
• Assesses the extent to which activities are:
• Thoughtfully designed
• Age-appropriate
• Youth-driven and responsive to youth interests
• Supportive of learning and positive youth development
• Varied
• Also examines the
planning process for
activities
30. Addendum for Elementary Programs
• The quality of homework time is a priority indicator for DHS elementary programs.
• The PSAYDN Self-Assessment does not include a homework time section.
• The addendum is a chance to assess how well your program is providing homework support.
• Homework support can be identified as an area for improvement.
31. Additional Questions for ALL Programs
• #46 - Rank the PSAYDN categories from one to five based on where your program could use the most additional support.
• #47 - What additional supports would your program like from PHMC/DHS to assist in the area you ranked number one (most important) in #46?
• #48 - How did program staff feel about the CQI process in FY2015?
• #49 - Why does your program believe the CQI process worked?
Select all that apply.
• (If you select ‘No’ to #49) #50 - Why does your program believe the
CQI process did not work well? Select all that apply.
32. Additional Questions for ALL Programs
• #51 - What communication process did your program use to share
CQI information with your Executive Director and other
stakeholders?
• #52 - What was the first goal you set for FY2015?
• #53 - Did your program achieve the first goal set for FY2015?
• #54 - Provide an explanation for why your program did or did not
achieve the first goal.
• #55 - What was the second goal you set for FY2015?
• #56 - Did your program achieve the second goal set for FY2015?
• #57 - Provide an explanation for why your program did or did not
achieve the second goal.
33. Additional Questions for ALL Programs
• #58 - What was the third goal you set for FY2015? If you only set
two goals, write N/A below.
• #59 - Did your program achieve the third goal set for FY2015?
• #60 - Provide an explanation for why your program did or did not
achieve the third goal.
• For each section, we will ask about your program’s rationale for
the score selected for that section’s questions. Be prepared with
typed text that you can insert into the online survey.
35. A.1.3 – Guiding Questions
• Do staff have access to on-going professional development opportunities such as trainings and conferences?
• What are the yearly expectations for professional
development, in terms of amount and/or content, by
staff’s level of responsibility in your program?
• What types of on-site support do staff have to continue
building skills?
36. Scenario #1: A.1.3
Agency ABC: Staff members feel well supported.
They are invited to attend paid workshops offered
by professional development providers, as well as
occasional on-site in-service trainings. All staff
participate in PD in some form, but agency ABC
does not monitor requirements for annual PD.
Meetings are held once per month and focus on program planning and policy updates. Staff receive annual performance reviews in writing, and some supervision is provided as needed.
37. Scenario #1: A.1.3
Score 1:
• Only some staff attend professional development
• Professional development topics not informed by any assessment of needs
Score 3:
• Staff participate in on-going professional development to address the unique characteristics of youth and families and the content focus of the program
• Staff receive only informal supervision or feedback related to their performance
• Staff meetings held at least monthly to address program issues
Score 5:
• Staff receive written, performance-based assessments of their work, identify areas for needed skill-building, and plan for professional development based on identified priorities
• Professional development opportunities are available in multiple formats (on- and off-site), including workshops, observation, coaching, and peer-to-peer learning
• Staff receive regularly-scheduled one-on-one supervision and feedback
• Staff meetings held more than once a month to address program issues and support staff skill-building
38. B.1.1 – Guiding Questions
• In what ways do staff support positive
communication and conflict resolution with
youth and co-workers?
39. Scenario #2: B.1.1
Agency DEF: The type of communication happening varies a
great deal. Six of eight members of the team get along very
well and have positive communication with the youth.
Among these staff, conflicts with co-workers are resolved
quickly. These staff also use the organization’s
reward/consequence structure to guide youth behavior.
There are two staff members who don’t gel with the rest of
the team. They are quick to snap at the youth and at the
other staff members. The kids have started to call them
Oscar #1 and Oscar #2 (the grouch). There are noticeably
more behavior problems (put downs, teasing, squabbling,
etc.) among the youth these two staff members supervise.
40. Scenario #2: B.1.1
Score 1:
• Staff either ignore youth conflicts or impose negative consequences without taking the time to understand the cause
Score 3:
• Staff relate to all participants in positive ways
• Program establishes, maintains, and communicates a code of conduct to participants, staff, and their families
• Program applies rewards and consequences for participant behavior appropriately and consistently
• Staff communicate with respect and are role models of positive relationships
Score 5:
• Staff relate to all participants in positive ways and intentionally build individual relationships
• Staff approach negative behavior calmly, acknowledging youth’s feelings while stopping hurtful behavior
• Staff facilitate constructive conflict resolution when needed
41. C.1.2 – Guiding Questions
• Does your program have written policies
to ensure equal access and safe
participation for all youth?
• How do staff promote and maintain a safe
physical and emotional environment for all
participating youth?
42. Scenario #3: C.1.2
Agency GHI: Prides itself on creating a safe and inclusive
space for youth. Last year, their 70 year-old building
added a wheelchair ramp and is now ADA compliant.
Program activities focus on celebrating differences and
there is virtually no bullying as a result. Several youth
have special needs and the Site Director met with each
caregiver to understand how to best accommodate their
youth’s needs. Six months ago, agency leadership and the father of one youth, a boy with a severe autism diagnosis, decided the program wasn’t suited to meet his needs. The staff and family worked together to find another program that would work better.
43. Scenario #3: C.1.2
Score 1:
• Program space or offerings are not accessible to all youth
Score 3:
• Program space and operations meet all local, state, and federal regulations regarding accessibility (e.g., ADA)
• Program develops and manages effective arrival and dismissal procedures and plans for safe travel home
• Participant files include emergency contact information, medical release forms, and allergy and health information
• No one in the program feels threatened or unsafe because of who they are
Score 5:
• Program materials and policies emphasize commitment to serving all youth in the community
• Staff make reasonable accommodations to serve youth with learning differences or special needs
• Staff consistently intervene when youth or adults make hurtful or negative comments
• The emotional climate of the program is positive, characterized by mutually respectful, supportive interactions among staff and youth
44. D.1.1 – Guiding Questions
• Describe your planning process - how do
you determine overall goals and daily
activities (e.g., using your logic model,
documented program design)?
45. Scenario #4: D.1.1
Agency KLM: Youth are always busy. The staff plan
their activities the evening before, using a couple of
activity idea books the Site Director bought online.
Examples of activities include friendship bracelets, a
soda bottle terrarium, and multiplication jeopardy.
When asked how activities are chosen, the staff say,
“Whatever sounds fun.” When the staff don’t have the materials they need for a particular activity, they fall back on some board games, jump ropes, and a kickball.
Youth of all ages (K-5) participate in activities.
46. Scenario #4: D.1.1
Score 1:
• Activities are not intentionally planned according to program mission, developmental, or learning objectives
Score 3:
• Activities reflect the mission of the program and promote the physical, mental, and/or social-emotional development of all participants
• All activities planned according to the age and skill level of the participants and identify targeted skills that participants will develop during the program cycle
Score 5:
• Comprehensive written program design and activity plans define learning objectives that build on each other throughout the program cycle and show how activities connect to objectives
• Program emphasizes the connections between learning and real life
48. Self-Assessment Comfort Test
High comfort level with self-assessments
• Complete with team and submit online.
Some experience with self-assessments
• Program Specialist support for questions as needed.
• Work with team to complete and submit online.
Completely new to self-assessments
• Share online training with key individuals.
• Regular email/conversation with Program Specialist.
• Work with team to complete and submit online.
50. IMPORTANT
When entering the survey online:
• Complete it in one sitting – it does not save progress.
Have your answers ready ahead of time:
• Have your scores for each item that applies.
• Have the rationale for each section and additional items written in advance.
Due on or about May 15:
• Do not wait until the last minute – plan it on a calendar.
SCORE YOUR PROGRAM ACCURATELY. DHS IS NOT USING THIS TO JUDGE YOUR PROGRAM. THEY USE IT TO DETERMINE SUPPORTS.
51. How much time will it take?
Minimal Investment:
• CQI Training
• Manager/SD completes form independently
• Enter data
Moderate Investment:
• CQI Training
• Management team completes form with some staff input
• Enter data
Maximum Investment:
• CQI Training
• Exec. staff input for Part A
• Line staff input for Parts B-D
• Time to discuss
• Align with site visit assessment to determine evidence of quality
• Enter data
52. Tips for Getting Started: Program Directors
• Make an appointment with administrative
(managers, directors) staff to ask questions.
• Let your team know that this is coming.
• Consider involving other stakeholders (families, funders, activity partners).
• Try to collect “evidence” (sample parent/family surveys, newsletters, student artwork, rubrics, etc.) that supports your program’s efforts.
53. Tips for Getting Started: Working With
Administration (EDs, Managers)
• Make yourself available on their schedule.
• Set up a brief phone meeting before starting.
• Review the self-assessment before meeting.
• Make suggestions for potential improvement projects and program changes.
55. Timeframe
• Intro to CQI Process – Early April
• Review Tool & Determine Support Needed – Apr-May
• Complete Tool and Submit Data – May 15
• PHMC Analysis of Data – May-June
• Goal Setting / Reapplication – May-June
57. Quiz Question #1
What does the abbreviation CQI stand for?
A. Continuous Quality Improvement
B. Content Quality Improvement
C. City Quality Instrument
58. Quiz Question #2
How many sections are on the
PSAYDN self-assessment?
A. Three
B. Four
C. Five, if you include the homework addendum
D. Six
E. Seven
59. Quiz Question #3
Who should ideally be involved in the self-
assessment process?
A. Executive Staff, Finance & Human Resources
B. Program Directors/Managers
C. Site Directors
D. Direct Line Staff
E. All of the Above
60. Quiz Question #4
Where can I find self-assessment submission
instructions?
A. The Scope of Work
B. PCAPS
C. PHMC Website
D. LearnUpon
Speaker Notes
Hello and welcome to the second round of continuous quality improvement self-assessments for the OST Initiative. Many of you may remember or have heard about the Continuous Quality Improvement (CQI) process in FY 2014, but you may have a lot of questions, and some programs may have been doing this for years – how does this come together? This presentation is designed to answer your questions, give you some timeframes and tell you what you need to know to implement this practice.
Our goal is to provide you with an overview of CQI, its connection to the DHS youth outcomes, and to prepare you to use the PSAYDN Self-Assessment tool.
The objectives of the workshop/presentation:
Participants will be able to explain CQI as the framework to support the efforts of DHS funded OST programs to deliver activities with intentionality to make progress toward the DHS youth outcomes.
Participants will understand how to use CQI to tell a story of quality experiences and resulting youth impact.
Participants will be able to identify at least two tools they currently use as part of their agency’s CQI process.
Participants will be able to explain why assessing for quality is important in relation to achieving desired youth impact.
The primary goal of the Philadelphia Department of Human Services is to prevent the neglect and abuse of children and youth.
The OST project is designed to help support DHS’ mission.
Now you have the DHS OST youth outcomes as a frame for CQI. Next, let’s discuss what the CQI process is and how you will operate within it.
This slide provides an overview of what CQI is. (REVIEW EACH ITEM). Most of you have integrated some type of quality assessments into your program; that may have been parent satisfaction surveys or maybe your site used a debriefing form to get feedback from participants about what worked when the PBL approach was used and made adjustments for improvement for the next time. Additionally, all previous Providers participated in Site Visits with Program Specialist and got feedback to support program quality improvement efforts.
The “story” is the tale of your success in the process. The “tools” are the sources of data to inform the process. The “supports” are the activities you engage in and the resources you draw on to make the improvements. The “process” is just that – a process. It takes time. It requires collaboration internally (within your organization) and externally (between your program and PHMC). It is designed for you to realize improvements throughout the process.
Here is one example of CQI in action …Last year a Program Specialist worked with a new Site Director to understand best practices around managing the work flow, schedule, and team process for the project based learning approach to programming. In the Fall, the Site Director was the only staff member using the PBL strategy because she was the only staff member experienced in the approach (sources of data collection or tools). Following the initial meeting with the Program Specialist, they agreed that other members of the team should receive PBL training and that the Site Director’s role should be to support her team, answer questions, etc., not to facilitate the activities by herself (supports). The Program Specialist arranged for the entire staff to receive training and during their following Spring and Summer site visits, the staff were much more confident in their ability to utilize the PBL approach, even excited about the youth projects (the story).
There are three pillars to help you get a complete picture of your program’s quality.
In previous years, the only type of assessment we did as a system was an external assessment. Program Specialists came to a site, observed for an afternoon, and completed their report. There is real value in external assessments…but they also provide only part of the picture or one chapter in the story.
A self-assessment (internal assessment) provides a different perspective. In this case, the self-assessment is designed to help you understand the building blocks that you have in place to produce the evidence of quality. AGAIN, MOST PROGRAMS WILL USE THE PSAYDN SELF-ASSESSMENT, THOUGH PROVIDERS IN THE APAS PILOT WILL DO THEIR SELF-ASSESSMENT WITH THE APAS TOOLS. PROGRAMS IN THE APAS PILOT WILL NOT BE EXPECTED TO DO BOTH.
A third important perspective is the caregiver or youth perspective, which is not a part of the formal CQI process this year, though sites are encouraged to include it if they can.
These three types of assessment, when used together, give us the best picture of a program’s quality.
The DHS OST Project vision for FY 2015 is to create a more intentional connection between the needs of all children and youth, particularly those served by DHS, and the DHS OST Project as an effective prevention strategy and critical community-based resource.
Current research indicates a strong correlation between the healthy social and emotional development of youth and reduced incidence of child maltreatment and delinquent behavior. Based on this research, the vision for the Project reflects the belief that high quality OST programming supports the work of DHS to prevent child abuse and neglect when there is intentionality in providing activities that promote the overall well-being of children and youth.
The thinking is that when there are intentional strategies put into place, and programs are operating at a high level of quality with trained and caring staff, we will realize the outcomes that have been defined for the system (positive supports and positive changes in the lives of the youth and families served).
… a reduction in the number of youth involved in the DHS system.
All of this is possible when we continuously reflect on the quality of the programming, staff strengths and needs, and the experiences the young people are having, and make necessary, timely adjustments utilizing a range of tools and supports (like accessing professional development with the support of the Program Specialists).
So far, we have presented the big picture and shared how DHS’ youth outcomes relate to OST programming. Now, let’s review the goals, outcomes, and indicators specific to OST. These are also listed in the scope of work and reporting requirements. As you can see there are three goal areas, six outcomes, and several indicators connected with the outcomes.
Providers are expected to demonstrate that they are working toward these outcomes through a wide-range of activities. When program activities are delivered in an intentional way, we anticipate seeing positive changes in youth as defined by the DHS youth outcomes
So for example, Goal Area 1 – Participating youth will have academic success – has 3 associated outcomes (Increased Engagement in Learning, Increased Engagement in School, and Aspires to Educational Excellence). There are also 3 indicators that will help us determine if we are making progress on these outcomes.
Note that ALL of these goals and outcomes are important for ALL youth. However, this year we’ve selected some priority outcomes for each age group to make this work more manageable, targeted, and aligned with the DHS goals. The goals for each age group are as follows:
For elementary youth, we will be focusing on Increased Engagement in Learning. Specifically we want to ensure that all elementary programs are providing a high-quality homework time where youth can receive support as needed.
Additionally, elementary providers may also work on the secondary priorities related to increased engagement in school, improved life skills, and improved relationships.
Middle school providers will focus on improved relationships and improved life skills this year. The indicators of goal setting and personal accountability will relate directly to these outcomes.
Secondary priorities are highlighted in blue.
High school programs will also work on improved life and relationship skills this year. However, the secondary outcome is “prepared for higher education and employment” for this age group.
Again, all 6 youth outcomes and their associated indicators are important and relevant for all age groups. The prioritization is there to help you take on a realistic and manageable piece for this first year of the CQI process.
READ SLIDE.
READ SLIDE.
So here is a picture of what this WHOLE process will look like. You can see that there are various steps along the way. In addition, the steps do not have to flow in sequence.
We start by getting to know your programs and orienting everyone to the CQI process.
Next comes the self-assessment process. This year, we will work with a few tools. The PSAYDN Program Quality Self-Assessment Tool is the one we will use for the self-assessment phase with the majority of programs; however, providers involved in the APAS pilot will use the APAS tools for their self-assessment activities. The process of continuous quality improvement isn't dependent on one specific tool or another, as long as self-assessment and continuous quality improvement are happening. Other tools will include the site visit assessment and the 21st Century Skills Rubric (formerly the PBL Rubric). After a period of goal setting and a first round of program improvements, the Program Specialist will conduct the site visit. This will lead to additional program improvements and some additional analysis at the program and system level, and the process will start again.
We recognize that each program is unique in its goals and mission. The self-assessment will help each program stay ‘true’ to its mission and goals, while helping everyone across the system to meet DHS youth outcomes.
For the purposes of your work with PHMC, we will define some goals based on the DHS goals and youth outcomes, as well as the results from your self-assessment. We will talk about the timeframe and some potential scenarios.
There are three pillars to help you get a complete picture of your program’s quality.
In previous years, the only type of assessment we did as a system was an external assessment. Program Specialists came to a site, observed for an afternoon, and completed their report. There is real value in external assessments…but they also provide only part of the picture or one chapter in the story.
A self-assessment (internal assessment) provides a different perspective. In this case, the self-assessment is designed to help you understand the building blocks that you have in place to produce the evidence of quality. Again, most programs will use the PSAYDN self-assessment, though providers in the APAS pilot will do their self-assessment with the APAS tools. Programs in the APAS pilot will not be expected to do both.
A third important perspective is the caregiver or youth perspective, which is not a part of the formal CQI process this year, though sites are encouraged to include it if they can.
These three types of assessment, when used together, give us the best picture of a program’s quality.
Now that we’ve established the value of a self-assessment process, let’s take some time to look at the PSAYDN Self-Assessment Tool.
First, you’ll see there are 4 sections, or building block areas:
Section A: Structure and Management
Section B: Positive Connections
Section C: Safety and Health
Section D: Activities
DHS and PHMC have added an Addendum (think of it as a Section E) for elementary providers. This addendum addresses the quality of homework assistance time in OST programs. This section does not need to be completed by middle or high school programs.
A separate self-assessment is needed for each program model.
As the names suggest, each section has a specific focus. These areas of focus have been determined as the minimum that high-quality programs are able to demonstrate.
We have cross-walked this tool with the DHS youth outcomes as well as the Site Visit Assessment Tool. As you complete the self-assessment, you can understand how the building block areas affect site visit assessment indicator areas and ultimately how these bring you closer to meeting DHS youth outcomes. For example, a strong staffing structure (well-trained and consistent staff) is expected to yield staff who engage youth through meaningful activities. Engaged youth are expected to have strong relationships with staff and their peers – a DHS outcome.
You can see that this tool uses a rating scale. Descriptions are provided for a “1,” a “3,” and a “5,” with the understanding that a program that exceeds the standards of a 1 but fails to meet the criteria for a 3, for example, could be rated a 2.
We know that some programs are going to rate higher in some areas than others. That’s okay and that’s to be expected. It’s extremely RARE that a program will score ALL 5s throughout the whole assessment. That’s NOT what DHS or PHMC is looking for. We want sites to take an HONEST look at their program in these 4 (or 5) specific areas and set some goals about how to IMPROVE, not be perfect. REMEMBER that these scores are not tied to FY 2015 funding. It is more important to DHS and PHMC that you rate your program honestly so that we can work together toward improvement rather than have you inflate your rating and stay stagnant. If you fail to change, you fail to grow.
Something you’ll want to think about as you complete this assessment is: how do we know? What’s the EVIDENCE that this is accurate? So, for example, in A1 the second question is: What methods does your program use to recruit, screen, hire, and retain qualified staff? If you have a sample job description, be prepared to have it on hand to demonstrate a 2-3 rating, etc. You will have the chance to talk with your Program Specialist about what will make for reliable examples.
Section A focuses on a program’s structure and management. Most of us can probably agree that it is very difficult to build a quality OST program when an agency is facing structural or management challenges.
Areas covered on this portion of the PSAYDN tool include:
Hiring and training practices
Staff retention
Professional development
Policies and procedures
Fiscal management
Strategic planning
Every agency is different, but for most providers, this section of the self-assessment would benefit from input and discussion with executive-level staff, members of the finance and human resource departments, and anyone involved in the development of policies or long-term planning for the program.
Section B addresses the relationships that exist among staff, youth, caregivers, and community partners.
Some of the items include:
Staff communication and conflict resolution skills
Youth social interactions
Communication with parents and caregivers
Relationships with school staff and administrators.
When completing this section, direct line staff often have a great deal of insight to contribute. They are observing youth interactions with one another and often interact with parents. Consider discussing this section of the tool at a staff meeting, or allow staff to provide written comments.
Section C focuses on safety and health. Like Section A, Section C is essential. It is impossible to work on developing other areas if basic safety remains an issue.
Some of the items include:
Emergency plans and procedures
Inclusion practices that help all youth to feel welcome and safe
Well-maintained facilities
Activities or services that promote good health
This section can be completed by a program administrator, site director, or by a team of individuals. Keep in mind that a policy or procedure that exists only on paper – staff don’t know it or follow it – doesn’t really count.
Section D addresses the activities – the “what” of your program day.
Some of the items include:
Are activities intentionally designed?
Are activities age-appropriate?
Do staff incorporate a variety of activities?
Are the activities youth-centered?
Group leaders also have an important perspective to contribute in this section, as they are the ones facilitating the activities. Staff meetings, where staff members have had time to reflect on the items in advance, are a great way to collect line-staff input. If a staff meeting is not possible, try including a sample of the staff that includes different perspectives and levels of experience.
While the original PSAYDN self-assessment tool did not include a section on homework time, it is an important element of the program day – especially for elementary programs. Since the quality of homework time was identified as an indicator by DHS and the provider community, PHMC has developed an addendum to the tool.
This addendum reflects the same three homework-related indicators that Program Specialists score as part of the Site Visit Assessment.
This section only needs to be completed by elementary programs.
READ SLIDE.
Key messages:
- It’s hard to describe a program in a paragraph. You will know your programs much better and will be better able to assess all criteria.
- This program demonstrates all of the level 3 criteria and at least some from level 5 (but not all). Scores of 3 and 4 would be acceptable here. I would probably score a 4.
READ SLIDE.
Key Messages:
Not all staff are operating at the same level. What is the impact on the program as a whole?
On one hand, “we are only as strong as our weakest link.”
On the other, a score of 1 doesn’t accurately reflect some of the strengths of the program.
Most folks settle on a 2 here, though a 1 could be justified. A score of 3 glosses over a serious issue with these two staff.
READ SLIDE.
Key messages:
Reasonable accommodation doesn’t mean a program has to work for all children
The consultation with caregivers about individual youth was key here
Program staff also connected youth to other resources, which is excellent
READ SLIDE.
Key message:
Fun or accidentally educational isn’t the same as intentional
Activities for a K-5 group are probably geared toward the midpoint and are probably not age-appropriate for the older and younger kids.
Some planning, at least the night before
Good flexibility
If staff don’t have the materials they need, that is an agency/leadership issue (remember this is a program, not staff, self-assessment)
Probably would score this a 2, but cases could be made for a 1 or (MAYBE a 3) depending on how the activities are facilitated.
Some Providers may have high levels of experience with self-assessments, and so may all or most of their sites and staff. Other sites may have very little experience with a CQI process.
As a result, DHS and PHMC would like to provide some potential options for how the self-assessment might be implemented. Sites can select the level of support they would like based on their level of comfort with these types of tools. Please keep in mind that whatever implementation scenario you select, your Program Specialist will be with you along the way to support and guide you.
Sites that are very comfortable with self-assessment can complete the self-assessment tool on their own and return to their Program Specialist with minimal assistance. The Program Specialist will still meet with these sites to finalize goals and action steps.
Sites that have some experience, but may have questions should feel free to contact their Program Specialists as needed. A quick phone call or e-mail about 1 or 2 items may be all the support you need. Or, maybe the Program Specialist can provide more support when you get to the goal setting stage.
Finally, we understand that sites that have no prior experience with self-assessments may need more support. This training is a great resource for other individuals in your organization. Additionally, your Program Specialist is available to walk you through the process.
One note for agency leadership: if you or your staff are already pretty comfortable with using a self-assessment, this could be a way for those who are more experienced to take a leadership role and help some less experienced or new staff/Site Directors.
In addition to the self-assessment tool, PSAYDN also provides two other documents which may be useful during the self-assessment process.
The first document is a 1-page statement of program quality developed by PSAYDN. If you feel that your team might have different ideas about what “program quality” is, this document is one way to get everyone on the same page. Consider reviewing the document with your team prior to asking them to provide input.
The second is the discussion guide. This document removes the ratings and instead poses a number of reflection or discussion questions. This guide can be used to collect input from a variety of individuals when a smaller group will ultimately be responsible for the scoring. Discussion guides will not be collected by PHMC staff, so participants should feel safe to be as honest and as open as they would like.
All three documents: the self-assessment, the discussion guide, and the 1-page statement of quality can be found at phillyost.org.
One common question many providers have is “How long will this take?” and the short answer is, “it depends.”
Programs that plan to take the self-assessment process seriously should expect it to take several hours. Discussing these areas and collecting input from a variety of perspectives takes time. You’ll want to make sure everyone understands what CQI is and how it will help your program make an impact on the youth you serve. You may want to gather different points of view on different sections. Finally, you’ll need to enter your scores into the SurveyMonkey link provided later in this presentation.
At minimum, we expect one person from each agency to participate in this training (in person or online). Although we recommend working in a team, he or she may complete the self-assessment alone and submit it via SurveyMonkey.
Here are a few tips for Site Directors and Program Directors.
Make an appointment with your supervisor and/or the ED to review any suggestions, concerns, etc. There are parts of this assessment that they may have a greater ability to complete (especially section A).
Let your team know that this is coming (don’t spring it on them at a staff meeting without a heads up); give them a copy of the assessment (or relevant sections) so that they can become familiar with the questions. Put it in your calendar, set a clear deadline, and stick to it! The 1-pager is helpful for this.
Consider involving other stakeholders (parents, partners, other staff from within the agency).
Remember to try to collect ‘evidence’ (sample parent/family survey, newsletters, student artwork, rubrics, etc.) of the story you are trying to tell. PHMC will not require that you submit your evidence, but it can be helpful to review during the self-assessment and goal setting conversations.
All of the documents you will need are posted at phillyost.org
Executive Directors and Senior Administrators should also keep in mind a few tips that will help staff successfully and confidently utilize the PSAYDN self-assessment. These tips were shared at the September ED meeting.
Make yourself available for questions that come up. There is some agency information that is needed to successfully complete the self-assessment that some staff may not know.
Set up a brief meeting before starting the self-assessment to talk with the key staff person(s) who are going to lead the self-assessment; share tips or suggestions from your previous experience(s).
Review the self-assessment before your staff meets with their Program Specialist to help set goals and identify resources necessary for change/improvements.
Support the staff in making necessary changes/improvements and take note of and acknowledge accomplishments, improvements, etc.
When you have completed the self-assessment tool, use the link above to submit your self-assessment. PHMC is requesting that all self-assessments be completed by October 31st.
In order to submit your self-assessment, you will need your program’s PPSID. This is a number unique to your program (agency, site, and model). PPSID numbers are visible in PCAPS and can be emailed to you upon your request.
If you have any trouble submitting your self-assessment, you may contact your Program Specialist for support.
Here is another way to think about the steps we’ve just discussed.
Step 1: Participate in an Introduction to CQI workshop or online training. Congrats, you’ve almost completed step 1!
Step 2: Review the tool and decide the level of support that you or your staff need or want by mid-October. Don’t wait until the end of the month to inform your Specialist that you need assistance.
Step 3: Complete the tool with or without support from your Program Specialist by October 31st. Submit it via SurveyMonkey when it is complete.
Step 4: PHMC will analyze the data we receive to get a sense of where we are as a system and to inform things like professional development offerings and other quality improvement efforts we can do on a large scale.
Step 5: Meet with your Program Specialist in November or December to identify your goals, including next steps and necessary resources that can help your program to reach those goals.
The amount of information on the web related to program methodologies and project ideas is astonishing. I encourage you to develop a go-to list that fits your mission and programming and share it back with your Program Specialist. We are always looking for a comprehensive list of sites that providers actually use.
Here are some additional resources for how self-assessment tools can help improve program quality. There are LOTS of resources on the internet about the goal of self-assessment, the research on how quality can improve, etc.