Programme Leader Disasters and Environmental Threats; Professor by special appointment of 'Crises, Safety and Health', Rijksuniversiteit Groningen
Disaster exercises to prepare hospitals for Mass Casualty Incidents. Does it contribute to preparedness or is it ritualism?
Verheul, M.L.M.I., Dückers, M.L.A., Visser, B.B., Beerens, R.J.J., Bierens, J.J.L.M. Disaster exercises to prepare hospitals for Mass Casualty Incidents. Does it contribute to preparedness or is it ritualism? Prehospital and Disaster Medicine: 2018, 33(4), p. 287-393.
Introduction
The central question this study sought to answer was whether the team members of Strategic Crisis Teams (SCTs) participating in mass-casualty incident (MCI) exercises in the Netherlands learn from their participation.
Methods
Evaluation reports of exercises held at two different times (T1 and T2) were collected and analyzed against a theoretical model with several dimensions, assessing both the quality of the evaluation methodology (three criteria: whether objectives were described, whether items for improvement were linked to those objectives, and the data-collection method used) and the learning effect of the exercise (one criterion: the change in the number of items for improvement between T1 and T2).
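The learning-effect criterion amounts to comparing, per evaluation category, how many items for improvement appear in a hospital's first (T1) and last (T2) report. The sketch below is purely illustrative and is not the authors' instrument; the category names, the example counts, and the change_per_category helper are hypothetical.

    # Illustrative only: tally items for improvement per evaluation category
    # at T1 and T2 and report the change (positive = more items at T2).
    from collections import Counter

    def change_per_category(items_t1, items_t2):
        """Return {category: count at T2 minus count at T1}."""
        t1, t2 = Counter(items_t1), Counter(items_t2)
        return {cat: t2[cat] - t1[cat] for cat in sorted(t1.keys() | t2.keys())}

    # Hypothetical items for improvement extracted from two evaluation reports.
    t1_items = ["communication", "communication", "logistics", "leadership"]
    t2_items = ["communication", "logistics", "logistics", "registration"]

    print(change_per_category(t1_items, t2_items))
    # {'communication': -1, 'leadership': -1, 'logistics': 1, 'registration': 1}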
Results
Of the 32 evaluation reports, 81% described exercise objectives, 30% of the items for improvement were linked to these objectives, and 22% of the reports used a structured template to describe the items for improvement. In six evaluation categories, the number of items for improvement increased between the first (T1) and the last (T2) evaluation report submitted by hospitals; it remained the same in two categories and decreased in six.
Conclusion
The evaluation reports do not support the ideal-typical disaster exercise process. The authors could not establish that team members participating in MCI exercises in the Netherlands learn from their participation. More time and effort must be spent on the development of a validated evaluation system for these simulations, and more research into the role of the evaluator is needed. (aut. ref.)