This 16-slide presentation, Monitoring and Evaluation, is Module 7 of a nine-module online course for adult education policy makers and practitioners, complementing an innovative toolkit to guide adult education policy and practice.
Participation in adult education varies significantly across the states and regions of Europe. Why? Evidence and literature suggest a wide disparity in policy making, programming and implementation skills in the adult education sector across Europe. It is imperative that policy makers and programme managers address this disparity to foster lifelong learning for a smart, sustainable Europe (see EU2020: https://ec.europa.eu/info/business-economy-euro/economic-and-fiscal-policy-coordination/eu-economic-governance-monitoring-prevention-correction/european-semester/framework/europe-2020-strategy_en) and to achieve the European target of 15% of the adult population engaged in learning.
In response to this challenge, the ERASMUS+ DIMA project (see https://dima-project.eu/index.php/en/, 2015 to 2017) developed a practical nine-module online course to complement an innovative toolkit to guide adult education policy and practice. The DIMA toolkit (see https://dima-project.eu/index.php/en/toolkit) introduces tools for developing, implementing and monitoring adult education policies, strategies and practices.
Author: Michael Kenny and DIMA Project partners (https://dima-project.eu/index.php/en/partners)
Module 7: Monitoring and Evaluation Dima course content
1. www.dima-project.eu
This project has been funded with support from the European Commission. This publication [communication]
and all its content reflect the views only of the author, and the Commission cannot be held responsible for any
use which may be made of the information contained therein. Project Number: 2015-1-CY01-KA204-011850
Module 7:
Monitoring and Evaluation
The content of the following presentation was prepared
jointly by the ERASMUS+ DIMA Project consortium partners
working collaboratively. Voice-overs were added on behalf
of the Irish partner by Michael Kenny, Department of Adult &
Community Education, Maynooth University.
(See partner list following …)
2. DIMA Consortium
The consortium consists of 6 partners from 5 European
countries, covering a wide range of expertise related
to the aims of the project:
Cyprus: Ministry of Education and Culture
Cyprus: Centre for Advancement of Research and Development in
Education Technology Ltd – Cardet
Ireland: Department of Adult & Community Education, Maynooth
University
Slovenia: Slovenian Institute for Adult Education
Slovakia: National Institute of Lifelong Learning
Belgium: European Association for the Education
of Adults
3. At the end of this module learners will be
able to:
• Describe the difference between a
benchmark and a quantitative or
qualitative indicator.
• Explain the importance of monitoring and
evaluation for the policy making process.
Learning Outcomes
4. There are various definitions of monitoring:
• A systematic and continuous checking of operations
• The collection of information for evaluation
• Usually utilises measurable / quantitative indicators
• Usually facilitated by ICT (Information and Communication Technology)
• Often used interchangeably with the terms ‘process’ or ‘formative’ evaluation
• An integral part of the evaluation process
What is Monitoring?
5. • Evaluation is used to determine whether objectives have been
achieved.
• The systematic use of research methods to judge different
elements/points of view of the object under study.
• The gathering, analysis and interpretation of information on
different aspects of the object’s efficiency, effectiveness
and other achieved results.
• A process for determining the worth or value of something.
Definitions of Evaluation
6. • Evaluation is “… a process of a systematic and critical analysis
leading to judgements and/or recommendations for
improvement“
(Adapted from Vlăsceanu et al., 2004, p. 37)
• Evaluation is “… the systematic assessment of the worth or
merit of an object“
(Stufflebeam & Shinkfield, 2007, p. 5)
Definitions of Evaluation
7. 1. Inclusion
2. Continuity
3. Proportional breadth and depth
4. Transparency
5. Inbuilt feedback procedure
6. Ethicality
7. Confidentiality
Principles of Monitoring and Evaluation
8. Evaluation is based on the values of:
• Integrity & morality
• Feasibility
• Safety
• Significance
• Equity
• Merit and Worth
(Adapted from Stufflebeam & Shinkfield, 2007, pp. 13-15)
Evaluation Values
9. What is Merit and Worth?
Merit is:
• intended to assess the intrinsic value of the object
• based on the quality standards of the object being evaluated
• focused on the operation of the object in comparison with its purpose
Worth is:
• intended to assess the extrinsic value of the object
• focused on the wider significance of the object, taking account of needs
• an assessment of the object’s importance within a given context
(Adapted from Stufflebeam & Shinkfield, 2007, pp. 9-10)
10. The object in an Evaluation may be:
• A policy, a programme, etc.
• An organisation
• A product, a service, etc.
• A tool, equipment, etc.
• An individual
• A theory
… within different areas, disciplines and sub-disciplines
What is the Object in Evaluation?
11. Formative & Summative Evaluation
Carried out: Formative – during the period of observation; Summative – at the end of the period of observation.
Main purpose: Formative – improvement; Summative – accountability.
Data collection: Formative – continuous; Summative – limited.
Role: Formative – forms the base for summative evaluation; Summative – brings together data gathered in formative evaluation.
Cost: Formative – usually less costly; Summative – usually more expensive.
12. Monitoring and Evaluation Stages
1. Defining the Monitoring and Evaluation Agenda
2. Designing the Monitoring and Evaluation Plan
3. Implementation of the Monitoring and Evaluation Process
4. Reporting and Feedback to Confirm, Reframe or Revise Policy
13. Types of Data Used in Monitoring and Evaluation
Data can be classified:
• by characteristics: qualitative or quantitative
• by mode of acquisition: primary or secondary
(Adapted from Bregar, Ograjenšek & Bavdaž, 2005)
14. • Quantitative data – expressed in numerical form; relatively
cheap to gather, BUT conclusions based solely on these data
can be misleading.
• Qualitative data – expressed in narrative form (descriptions,
interpretations); can provide deeper insight into quantitative
data, BUT can be more expensive to gather.
• Primary data – gathered from informants for the needs of the
evaluation.
• Secondary data – data already available from non-primary
sources, perhaps assembled for other purposes.
Types of Data Used in Evaluation
15. • Use more than one method for data collection.
• Combine methods according to their appropriateness, strengths and weaknesses.
• Refer to the DIMA toolkit Module 3 ‘Stakeholders and Consultation’ for a list of methods.
Examples of common data collection methods are:
• Questionnaires (in person, by telephone, by post, online),
• Interviews (structured, semi-structured, unstructured),
• Focus groups (deliberate or purposeful samples),
• Observation, public engagement, artistic methods,
• Documentary sources (official reports, minutes of meetings, personal diaries, online/offline, etc.).
Methods of Collecting Data
16. www.dima-project.eu
End of Module 7 of 9
Thank you
(Download toolkit from E-Prints
http://eprints.maynoothuniversity.ie/10121/
Or https://dima-project.eu/index.php/en/toolkit)