Systems in Evaluation
Akugizibwe Byenkya, MA
Monitoring, Evaluation and Learning Advisor
USAID
Nairobi, Kenya
Chris Thompson, MPA
Technical Director, Strategy, Performance, & Learning
Social Impact, United States
Joseph Mungai
Monitoring, Evaluation and Learning Advisor
Social Impact, United States
Marzia Faraz
Monitoring, Evaluation and Learning Specialist
USAID, United States
Stephen Okoth, Mr (he/him/his)
MEL Specialist
USAID Somalia
Location: White River Ballroom D
Abstract Information: When USAID/Somalia began developing its country strategy for 2020-2025, the stories being told centered on the challenges of evaluating programs that contribute to preventing or countering violent extremism (P/CVE). These stories highlighted analytical challenges, such as a lack of data, untested theories of change (TOCs), and difficulty establishing causality, as well as practical challenges, such as collecting reliable and disaggregated data, protecting the safety of beneficiaries and evaluators, ensuring a context-specific lens, and building in flexibility and agility. While P/CVE can be viewed as difficult to measure, this panel will show how USAID/Somalia is working to flip this script by developing a portfolio monitoring system that increases the use of evidence for adaptive management and assesses the effectiveness of programming. To make this system effective, the Mission and its partners considered how they can gather data more rapidly than traditional evaluations allow, regularly test TOCs to better interpret progress, supplement findings with contextual research, and better utilize the considerable learning and program implementation experience of USAID development partners in Somalia.
Relevance Statement: Stories can shape, for better or worse, how organizations choose to learn and manage their progress. When USAID/Somalia commenced designing its country strategy for 2022-2025, the dominant narrative among donors and practitioners focused on the challenges of evaluating programs that contribute to preventing or countering violent extremism (P/CVE). There is no defined set of practices, methods, or systems for evaluating the impact of such programs. Analytical challenges include establishing causality and addressing contextual variation; practical challenges include collecting timely, precise, and high-quality data. Establishing causality in P/CVE programming is especially difficult: it requires showing that violent activity or radicalization would have occurred had there been no intervention, while accounting for the many variables and contextual factors that may have contributed to or affected outcomes in fragile or conflict-prone environments. These challenges contribute to a lack of evidence-based analysis to support program implementation and adaptation, which had the potential to limit USAID/Somalia's ability to test strategic assumptions, address questions of right-fit interventions, and adapt its investments to use development tools appropriately for P/CVE in Somalia. Overcoming a dominant narrative about the challenges of evaluation requires intentionality, reflection, and collaboration. To better understand what causes violent extremism (VE) and to develop effective interventions to prevent and counter it, USAID/Somalia is using a portfolio monitoring system that helps it learn from what has and has not worked previously.
This approach systematically tests and refines the Mission's P/CVE results framework and domains of change, collects data on project contributions toward these intended results at regular intervals using a cluster-based evaluation methodology, and strengthens the use of contextual data from development partners and stakeholders to interpret progress within a dynamic environment. In doing so, the system routinely provides information that helps tell an evidence-based story of whether USAID has the right mix of activities to move the needle on P/CVE, what types of activities affect intended outcomes, in what ways, and why, and how these answers change as the operating environment and context change. This proposal is relevant and important to the field of evaluation because it showcases how evaluation can be adapted to spur learning about a phenomenon as complex and sensitive as VE. It broadens the purview of evaluation to test theories of change (TOCs), incorporate contextual analyses, and meaningfully utilize development partners' learning. The proposal adds to knowledge in the evaluation field about how evaluations can be used as opportunities to start telling a new story: one of generating evidence to address data gaps and to inform the implementation and adaptation of P/CVE programming.
Presenter: Marzia Faraz – USAID
Presenter: Joseph N. Mungai – Social Impact
Presenter: Chris Thompson, MPA – Social Impact
Presenter: Akugizibwe Byenkya, MA – USAID