Alayne da Costa Duarte, Marcos Roberto da Silva Borges, José Orlando Gomes, and Paulo Victor R. de Carvalho on "ASC Model: A Process Model for the Evaluation of Simulated Field Exercises in the Emergency Domain" at ISCRAM 2013 in Baden-Baden.
10th International Conference on Information Systems for Crisis Response and Management
12-15 May 2013, Baden-Baden, Germany
ASC Model: A Process Model for the Evaluation of Simulated Field Exercises in the Emergency Domain
1. ASC Model: A Process Model for the Evaluation of Simulated Field Exercises in the Emergency Domain
Alayne da Costa Duarte
Marcos Roberto da Silva Borges
José Orlando Gomes
Paulo Victor R. de Carvalho
2013
Current Process
Activities and the flow between them are undefined:
data gathering → evaluation meeting → report
Problem
8. A structured model
• Develop a process model
• Activities defined and connected
• Enable the development of an evaluation process
9. Evaluation Process
Evaluation based on the objectives:
Goals → Methodology → Indicators → Evaluation instrument → Observers' selection → Data gathering → Analysis
(former flow: data gathering → evaluation meeting → report)
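The flow above can be sketched as an ordered pipeline in which each activity may run only after its predecessors have produced their outputs. This is a minimal illustrative sketch; the stage names and the `Evaluation` class are assumptions, not an artifact from the paper.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of the ASC activities as an ordered pipeline
# (stage names are illustrative assumptions, not from the paper).
ASC_STAGES = [
    "goals",                  # output: list of objectives
    "methodology",            # output: list of methods
    "indicators",             # output: indicator list
    "evaluation_instrument",  # output: instrument with questions
    "observers_selection",    # output: observers list
    "data_gathering",         # output: completed instrument
    "analysis",               # output: evaluation report
]

@dataclass
class Evaluation:
    """Tracks which ASC activities have produced their outputs."""
    outputs: dict = field(default_factory=dict)

    def complete(self, stage: str, output):
        # Enforce the defined flow: a stage runs only after its predecessors.
        idx = ASC_STAGES.index(stage)
        missing = [s for s in ASC_STAGES[:idx] if s not in self.outputs]
        if missing:
            raise ValueError(f"cannot run {stage!r} before {missing}")
        self.outputs[stage] = output

ev = Evaluation()
ev.complete("goals", ["check radio protocol", "measure triage time"])
ev.complete("methodology", ["observer interviews", "indicator survey"])
```

Making the flow explicit in this way is what distinguishes the model from the former ad hoc process, where the order of activities was undefined.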
11. ASC Model: Goals
• State the objectives of the evaluation
• Avoid misperceptions about what should be evaluated
• Drive the other activities of the process
• Set the conditions for flow continuity
• The coordinator defines goals according to the emergency plan
• Output: a list of objectives
12. ASC Model: Methodology
• Defines and details how the next steps will be executed
• With the goals defined, choose the most appropriate method for each of
the sub-processes or activities that follow:
- the method/criteria for observer selection
- the method for surveying indicators
- etc.
• Output - a list of methods
14. ASC Model: Evaluation Instrument
• To be used by the exercise observers. The evaluation instrument
includes questions:
- about the objectives
- the methodology
- generic and operational indicator lists.
• The evaluator specifies the type of questions to be used: open
(discursive), closed (objective), or mixed (discursive and objective).
• Applied with the support of a computational tool:
- choose and edit questions
- store and display results
• Output: the evaluation instrument
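A computational tool of the kind described above needs little more than a question/answer data model plus a validity check before storing results. The sketch below is a hypothetical illustration; the class and field names are assumptions, not an API from the paper.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical data model for the evaluation instrument: questions can be
# open (discursive), closed (objective), or mixed. Names are illustrative.
@dataclass
class Question:
    text: str
    kind: str                       # "open", "closed", or "mixed"
    options: Optional[list] = None  # answer options for closed/mixed questions

@dataclass
class Answer:
    question: Question
    value: str         # chosen option, or free text for open questions
    remark: str = ""   # observer's discursive comment (open/mixed)

instrument = [
    Question("Was the triage area set up per the emergency plan?",
             kind="closed", options=["yes", "partially", "no"]),
    Question("Describe any communication breakdowns observed.",
             kind="open"),
]

def is_valid(answer: Answer) -> bool:
    """Check a closed/mixed answer names one of the allowed options."""
    q = answer.question
    if q.kind in ("closed", "mixed") and answer.value not in (q.options or []):
        return False
    return True

a = Answer(instrument[0], value="partially")
```

Storing answers only after such a check is one way the tool could "store and display results" reliably.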
15. ASC Model: Observers' Selection
• Observers are the respondents of the evaluation instrument.
• Select suitable and qualified people to collect data about the simulated
field exercise during runtime.
• When choosing observers, the evaluator should consider the skills of each
candidate, linking them to the objectives of the exercise and the selection
criteria defined in the Methodology Details List.
• This sub-process includes interviews with candidates or volunteers for the
observer roles, as well as the identification of individual skills, which are
then compared to the desired skills based on the Simulation Goals List.
• Output: the Observers list.
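The skill comparison described above can be sketched as a simple overlap check between each candidate's identified skills and the desired skills derived from the Simulation Goals List. The function name, candidate data, and minimum-overlap criterion are all assumptions for illustration.

```python
# Hypothetical sketch of observer selection: keep candidates whose skills
# overlap the desired skills by at least min_match (an assumed criterion).
def select_observers(candidates, desired_skills, min_match=2):
    """Return candidates whose skills overlap desired_skills >= min_match."""
    selected = []
    for name, skills in candidates.items():
        overlap = set(skills) & set(desired_skills)
        if len(overlap) >= min_match:
            selected.append(name)
    return sorted(selected)

candidates = {
    "Ana":   ["first aid", "radio operation", "note taking"],
    "Bruno": ["logistics"],
    "Clara": ["first aid", "triage", "note taking"],
}
desired = ["first aid", "triage", "note taking"]
observers = select_observers(candidates, desired)  # the Observers list
```

In practice the selection criteria come from the Methodology Details List, so the matching rule would be configured there rather than hard-coded.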
16. ASC Model: Data Gathering
• This is a sub-process performed by the observers
• The variables that comprise the indicators are represented as
questions to be answered in the evaluation instrument
• Observers judge the questions as they follow the execution of the
simulated exercise script
• Values are recorded alongside their respective questions, as well as
the observers' individual perceptions of the exercise
• Output: the completed evaluation instrument
17. ASC Model: Data Analysis
• Data analysis starts after the execution of the simulated field
exercise
• The evaluator reads the data contained in the evaluation
instrument
• The evaluation team compares the obtained results with the
expected results
• The evaluation team and observers gather to share
considerations about the results and personal perceptions
• Output: a report with evaluation results
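The comparison of obtained versus expected results can be sketched as a per-indicator diff that feeds the evaluation report. The function and the sample indicator names are hypothetical; they only illustrate the step described above.

```python
# Hypothetical sketch of the analysis step: compare obtained results from
# the completed instruments with the expected results per indicator, and
# summarize discrepancies for the report. Field names are assumptions.
def compare_results(expected, obtained):
    """Return per-indicator expected/obtained values and whether they match."""
    report = {}
    for indicator, exp in expected.items():
        obt = obtained.get(indicator)  # None if the indicator was not observed
        report[indicator] = {
            "expected": exp,
            "obtained": obt,
            "met": obt == exp,
        }
    return report

expected = {"triage set up per plan": "yes", "radio protocol followed": "yes"}
obtained = {"triage set up per plan": "partially",
            "radio protocol followed": "yes"}
report = compare_results(expected, obtained)
```

Indicators marked as not met are the natural starting point for the meeting where the evaluation team and observers share their perceptions.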
18. Model evaluation
• Table exercise: how activities are performed
• Evaluation criteria: generality, flexibility,
usability, completeness
• Participants:
– 2 PPGI/UFRJ MSc students
– 4 drill coordinators
20. Evaluation results
• Generality
• Exercises in different areas
• Good results
• Usability
• Easy understanding of the process
• Ease of use
• Time consuming
21. Evaluation results
• Flexibility
• Used without changes in structure
• Able to deal with diverse exercise
specifications
• Completeness
• All activities defined and structured in a
satisfactory way
22. Some comments
"I thought the model is comprehensive and simple!"
"... it appears to be quite broad ... able to cope with the
various features of an exercise ..."
"The model is great, it has great potential to become a tool
for the evaluation of simulated exercises and contribute
effectively to the organizations involved in the process ..."
23. Future work
• Detail the ASC Model, making the evaluation process
easier to develop
• Survey the main challenges and difficulties
related to the development of the evaluation
process
• Develop computerized tools for drill
evaluation
• Evaluate the model in real exercises
• Further development of indicators