6. Planning Hierarchy
(Strategic → Operational; underpinned by the Business Plan)

ACON Strategic Plan (SP)
• 3-5 years
• HIV/LGBTI health split
• High level
• Simple language

Health Outcome Strategies (HOS)
• 6 nominated health areas (TBC)
• Comprehensive Framework
• Population Engagement Strategy
• Strategy Working Groups

Health Program Records (HPR)
• Program specific
• Consolidated Framework
• Cross-divisional

Health Project Plan (HPP)
• Project & activity specific
• Aligned to HPR or HOS
• Consolidated Framework

Business Plan (spans all levels of the hierarchy)
7. Primary Activity Schedule
(tiered from fundamental operational activities at the base through to wider community engagement activities)

Enablers
• Operations
• Fundraising
• Learning & Development

Capacity Building
• Research Partnerships
• External Representation
• Venues Liaison
• Volunteers
• Grants & New Services
• Policy & Advocacy
• Support & Training

Marketing & Communication
• Print & Web Based Communication
• Social Media
• Safe Sex Resources
• Community Events

Peer Support & Education
• Peer Education Workshops
• Peer Education Outreach
• Social Inclusion & Connectedness
• Needle and Syringe Program (NSP)
• Priority Support & Treatments

Client Support
• Case Coordination
• Counselling
• Home-Based Care
• Therapeutic Groups
8. Evaluation Pyramid
(base to apex: data collection method → indicator level)
• Data Collection → Output Indicators
• Feedback Surveys → Quality Indicators
• Pre & Post Assessment → Short-term Impact Indicators
• 6 Month Follow Up → Longer-term Impact Indicators
9. Evaluation Indicator Bands
THEME         PRIMARY ACTIVITIES                    INDICATOR BAND
                                                    Output   Quality   Impact    Impact
                                                    (SVR)              (∆ KSC)   (∆ Behaviour)
PEER SUPPORT  Peer Education Workshops              X        X         X         X
              Peer Education Outreach               X        X
              Social Inclusion and Connectedness    X
              Needle and Syringe Program (NSP)      X        X         X
              Priority Support and Treatments       X        X         X         X
11. Key Benefits and Desired Outcomes
• Strong and adaptable foundation
• Ability to harvest accurate data to inform organisational
activities
• Build ACON’s capacity to develop knowledge in order to
become a learning and sharing organisation
• Ability to monitor and reflect on health trends, and provide
targeted responses to community needs
• Increased quality of service delivery
• Greater operational efficiencies
• Increased community, client and employee satisfaction.
12. Thank You
A special thanks to:
Ryan Jackson
Planning Coordinator
Helen Conway
Monitoring & Evaluation Officer
Alan Brotherton
Director, Policy, Strategy & Research
Editor's notes (speaker notes)
ACON is NSW’s largest community-based HIV/AIDS and GLBT health organisation. An organisational review recommended improved planning and evaluation, and dedicated Planning & Evaluation roles were created in 2012; Ryan and I started in July. As a larger organisation we have the capacity for dedicated roles, which may not be the case in smaller organisations, so we'll try to give practical advice and tips you can take back to your own workplace, and try not to confuse things with ACON jargon.
Current processes and data collection are non-standardised, highly segmented, independent and inconsistent, with reporting and evaluation that are isolated, fragmented and ad hoc. This decreases our potential to:
• have accurate and meaningful data
• have proactive, rather than reactive, reporting
• evaluate and gauge activities
• maximise outcomes
• share knowledge and learnings
• report and communicate feedback, internally and externally.
A consultation process with all teams within ACON established what they are currently doing (if anything) and the challenges they face. This resulted in the PEKM framework.
As a parallel process, we ran an anonymous staff evaluation survey in September to diagnose the culture and practice of evaluation across the organisation. We are lucky that staff responses showed genuine interest in evaluation and a willingness to do it. Staff understand its value, so we are in a good place to implement this framework. Overall, there was strong consensus that evaluation plays a number of important roles at ACON, mainly in improving services, and a majority of staff indicated they have incorporated evaluation results into program planning and delivery.

The biggest weakness was a lack of shared learning and communication of evaluation processes and outcomes. Addressing this has become a core objective of the PEKM framework: making the 'sharing' of evaluation a key aspect in order to improve organisational and individual learning. It's important that we create a 'safe' place where it's OK to talk about strategies and outcomes, even when they don't work as intended. The main thing is that evaluation is planned in such a way that the right information can be captured to inform that process, and that everyone learns from it rather than repeating the same mistakes.

Staff generally want the time to do evaluation properly, to ensure they are capturing the right information and that the information is going somewhere useful. This requires the development of templates and clear guidelines.
The foundation of the PEKM framework is a hybrid of the well-known health promotion and project management cycles, with the added component of 'SHARE', as this is the aspect from which ACON can leverage the greatest benefit. Its purpose is to build the capacity of all health promotion practitioners to do their own planning and evaluation. Many health promotion officers are very good at 'doing', but evaluation of the 'doing' is lacking. Our team is here to support the planning and evaluation components and integrate them so they become a standard part of everyone's role.
The PAS is structured as a thematic model, grouping similar-natured activities and operations into categories. The categories are tiered from a base of fundamental operational activities (the 'Enablers'), through partnership- and human resource-enhancing activities ('Capacity Building'), and finally into wider community engagement activities ('Marketing and Communication', 'Peer Support and Education', and 'Client Support'). This system allows a standardised list of activity types to be used for planning and evaluation mechanisms and reporting requirements, whilst retaining flexibility for program delivery.
So what information do teams need to capture? Output indicators get done the most, Quality fairly regularly. These form your 'process' evaluation: how much did we do, and how well did we do it? If your indicator is to provide 1,000 counselling sessions, that's fairly easy to count, but have you made any difference? That's where impact indicators come in, and where people often get stuck.
From an organisational perspective, it is important to be able to select clear indicators that tell us whether or not we have met our goals. For example, one objective may be to reduce the transmission of HIV. What might be an indicator that tells us whether we have reduced it? Perhaps the number of HIV notifications: an external, credible data source that is collected anyway, creating no extra workload for ACON or health practitioners.

But what if one of the strategies used to deliver on this objective were to increase testing rates? What if the corresponding rise in the number of people getting tested meant we picked up more diagnoses? Does that mean we have failed if the notification numbers increase? It is important to ensure that indicators are not viewed in isolation. Down at the operational end, evaluation is much more concerned with impact- and process-level evaluation.