
Lund University Publications


Improving disaster response evaluations: Supporting advances in disaster risk management through the enhancement of response evaluation usefulness

Beerens, Ralf Josef Johanna (2021)
Abstract
Future disasters or crises are difficult to predict and therefore hard to prepare for. However, while a specific event might not have happened, it can be simulated in an exercise. The evaluation of performance during such an exercise can provide important information regarding the current state of preparedness, and can be used to improve the response to future events. For this to happen, evaluation products must be perceived as useful by the end user. Unfortunately, it appears that this is not the case. Both evaluations and their products are rarely used to their full extent or, in extreme cases, are regarded as paper-pushing exercises.

The first part of this research characterises current evaluation practice, both in the scientific literature and in Dutch practice, based on a scoping study, document and content analyses, and expert judgements. The findings highlight that, despite a recent increase in research attention, few studies focus on disaster management exercise evaluation. It is unclear whether current evaluations achieve their purpose, or how they contribute to disaster preparedness. Both theory and practice tend to view and present evaluations in isolation. This limited focus creates a fragmented field that lacks coherence and depth. Furthermore, most evaluation documentation fails to justify or discuss the rationale underlying the selected methods, or their link to the overall purpose or context of the exercise. The process of collecting and analysing contextual, evidence-based data, and using it to reach conclusions and make recommendations, lacks methodological transparency and rigour. Consequently, professionals lack reliable guidance when designing evaluations.

Therefore, the second part of this research aimed to gain insight into what makes evaluations useful, and to suggest improvements. In particular, it highlights the value associated with the methodology used to record and present evaluation outcomes to end users. The notion of an ‘evaluation description’ is introduced to support the identification of four components that are assumed to influence the usefulness of an evaluation: its purpose, object description, analysis and conclusion. Survey experiments showed that how these elements – notably, the analysis and/or conclusions – are documented significantly influences the usefulness of the product. Furthermore, different components are more useful depending on the purpose of the report (learning or accountability). Crisis management professionals expect the analysis to go beyond the object of the evaluation and focus on the broader context. They expect a rigorous evaluation to provide them with evidence-based judgements that deliver actionable conclusions and support future learning.

Overall, this research shows that the design and execution of evaluations should provide systematic, rigorous, evidence-based and actionable outcomes. It suggests ways to manage both the process and the products of an evaluation to improve its usefulness. Finally, it underlines that it is not the evaluation itself that leads to improvement, but its use. Evaluation should, therefore, be seen as a means to an end.
author: Beerens, Ralf Josef Johanna
supervisor
opponent: Assoc. Prof. Kruke, Björn Ivar, University of Stavanger, Norway
organization
publishing date: 2021
type: Thesis
publication status: published
subject
keywords: crisis, disaster, emergency, disaster risk management (DRM), preparedness, exercise, simulation, response, performance, evaluation, usefulness, design, The Netherlands
pages: 166
publisher: Division of Risk Management and Societal Safety, Faculty of Engineering, Lund University
defense location: Lecture hall V:B, building V, John Ericssons väg 1, Faculty of Engineering LTH, Lund University, Lund. Zoom: https://lu-se.zoom.us/j/65999762322?pwd=dnJ0Q1pOdlVWdVk2MndEZjg1akpyUT09
defense date: 2021-09-03 10:15:00
ISBN: 978-91-7895-923-5; 978-91-7895-922-8
language: English
LU publication?: yes
id: f7cb9502-de1f-4f7f-bbe6-ad555fed9de7
date added to LUP: 2021-06-10 11:45:03
date last changed: 2022-04-07 08:46:04
@phdthesis{f7cb9502-de1f-4f7f-bbe6-ad555fed9de7,
  abstract     = {{Future disasters or crises are difficult to predict and therefore hard to prepare for. However, while a specific event might not have happened, it can be simulated in an exercise. The evaluation of performance during such an exercise can provide important information regarding the current state of preparedness, and can be used to improve the response to future events. For this to happen, evaluation products must be perceived as useful by the end user. Unfortunately, it appears that this is not the case. Both evaluations and their products are rarely used to their full extent or, in extreme cases, are regarded as paper-pushing exercises.<br/><br/>The first part of this research characterises current evaluation practice, both in the scientific literature and in Dutch practice, based on a scoping study, document and content analyses, and expert judgements. The findings highlight that, despite a recent increase in research attention, few studies focus on disaster management exercise evaluation. It is unclear whether current evaluations achieve their purpose, or how they contribute to disaster preparedness. Both theory and practice tend to view and present evaluations in isolation. This limited focus creates a fragmented field that lacks coherence and depth. Furthermore, most evaluation documentation fails to justify or discuss the rationale underlying the selected methods, or their link to the overall purpose or context of the exercise. The process of collecting and analysing contextual, evidence-based data, and using it to reach conclusions and make recommendations, lacks methodological transparency and rigour. Consequently, professionals lack reliable guidance when designing evaluations.<br/><br/>Therefore, the second part of this research aimed to gain insight into what makes evaluations useful, and to suggest improvements. In particular, it highlights the value associated with the methodology used to record and present evaluation outcomes to end users. The notion of an ‘evaluation description’ is introduced to support the identification of four components that are assumed to influence the usefulness of an evaluation: its purpose, object description, analysis and conclusion. Survey experiments showed that how these elements – notably, the analysis and/or conclusions – are documented significantly influences the usefulness of the product. Furthermore, different components are more useful depending on the purpose of the report (learning or accountability). Crisis management professionals expect the analysis to go beyond the object of the evaluation and focus on the broader context. They expect a rigorous evaluation to provide them with evidence-based judgements that deliver actionable conclusions and support future learning.<br/><br/>Overall, this research shows that the design and execution of evaluations should provide systematic, rigorous, evidence-based and actionable outcomes. It suggests ways to manage both the process and the products of an evaluation to improve its usefulness. Finally, it underlines that it is not the evaluation itself that leads to improvement, but its use. Evaluation should, therefore, be seen as a means to an end.}},
  author       = {{Beerens, Ralf Josef Johanna}},
  isbn         = {{978-91-7895-923-5}},
  keywords     = {{crisis; disaster; emergency; disaster risk management (DRM); preparedness; exercise; simulation; response; performance; evaluation; usefulness; design; The Netherlands}},
  language     = {{eng}},
  publisher    = {{Division of Risk Management and Societal Safety, Faculty of Engineering, Lund University}},
  school       = {{Lund University}},
  title        = {{Improving disaster response evaluations: Supporting advances in disaster risk management through the enhancement of response evaluation usefulness}},
  url          = {{https://lup.lub.lu.se/search/files/99042044/BEERENS_RJJ_2021_Improving_disaster_response_evaluations_E_spikning_THESIS_excl._papers_.pdf}},
  year         = {{2021}},
}
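
For reference, a minimal LaTeX sketch of how the BibTeX entry above could be cited, assuming it is saved in a local file named refs.bib (the file name and surrounding sentence are illustrative, not part of the record):

\documentclass{article}
\begin{document}
% Cite the thesis via its LUP record key, exactly as given in the entry above
The usefulness of response evaluations is examined in
Beerens' thesis~\cite{f7cb9502-de1f-4f7f-bbe6-ad555fed9de7}.

\bibliographystyle{plain}
\bibliography{refs} % refs.bib contains the @phdthesis entry shown above
\end{document}

Compiling with pdflatex, then bibtex, then pdflatex twice resolves the citation and prints the full reference.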