Design and Analysis of Experiments
Gary Glass, Jr.
Director, Monitoring and Evaluation
Blumont Inc.
Vancouver, Washington, United States
Christy Lazicky, MPA/ID
Director of Impact Evaluation
Causal Design, United States
Location: Room 105
Abstract Information: Understanding what works to improve food security and wellbeing in humanitarian contexts is critical for implementers. However, the evidence base on which interventions are most effective in humanitarian settings remains scarce. One reason is that Randomized Controlled Trials (RCTs) are considered the gold standard for impact evaluation design because they can provide definitive evidence of program impact, yet emergency humanitarian assistance interventions take place in operating contexts, and under reduced award lengths, that often preclude the use of a traditional RCT. Evaluators must therefore rely on alternative, often multiple, methods to gather evidence and convincingly tell the story of what is or is not working and how to improve wellbeing in these contexts. We present completed impact evaluation work in Northeast Syria that uses realistic approaches and alternative methods to understand the effectiveness of a series of humanitarian assistance interventions. Specifically, we combine quasi-experimental and non-experimental approaches, including a qualitative lens, to tell the evaluation story that is often missed in a traditional RCT and that is needed to capture the nuance of humanitarian interventions and the experiences of those receiving emergency humanitarian assistance. We discuss the implications of RCT limitations in the humanitarian context and how we designed an alternative methodology to tell our story. We also share how strong implementer-evaluator collaboration can more effectively generate these stories by grounding data in contextual realities. The approaches in this presentation can be considered by donors, implementers, and evaluators who seek to measure impact for humanitarian interventions where a traditional RCT cannot or should not be applied.
Relevance Statement: This presentation will advance the discussion around impact evaluation for humanitarian assistance programming. We will explain how the implementing partner (Blumont) and the evaluator (Causal Design) collaborated to design a realistic approach to measuring humanitarian assistance interventions in Northeast Syria. The presentation will add to the theoretical discussion around applying impact evaluation to humanitarian assistance programming by providing real-world examples from our current collaboration. Evaluators will learn how we chose these approaches and why they will be useful in the coming years as evaluations gain importance to USAID's Bureau for Humanitarian Assistance and similar donors.