Nonprofit and Foundations
Kecia Bertermann, n/a (she/her/hers)
Director, Learning and Impact
Luminate, United States
Julia Coffman, n/a
Co-executive Director
Center for Evaluation Innovation, United States
Dorcas Mutheu (she/her/hers)
Manager, Learning & Impact
Luminate, Nairobi Area, Kenya
Alex Ash, MPP (he/him/his)
Senior Research Manager
M&C Saatchi World Services, United States
Location: White River Ballroom I
Abstract Information: Funders that work on systems change face a number of complicated challenges when designing their approaches to learning and evaluation. Paths to change are emergent and non-linear, so a systems-thinking mindset is critical. But traditional approaches to measurement and learning in philanthropy tend to be more linear and metrics-oriented. This session will share the approach that Luminate, a funder that works on democracy and governance issues in complex political systems, has developed and is testing. We revisited what measurement meant to Luminate and developed a new learning framework centered on hypotheses and storytelling rather than a traditional logframe approach. We will share the components of Luminate's learning framework as well as details about "how" we did it: from development to setting up the complementary support structures and integrating the new learning practices across the foundation. We will cover: a) the theory and background of the learning framework approach, b) how we rolled out the learning framework to our teams and encouraged buy-in, c) how we set up the knowledge management system to house confirming and disconfirming evidence, d) how we are structuring our learning conversations and analysis, and e) how we report to the Board in the absence of a logframe. In the discussion we will invite comments and insights on how feasible this approach is for other organizations, thoughts on collecting disconfirming evidence, and examples of how others have centered diverse voices and stories in their learning approaches to build a holistic picture of change.
Relevance Statement: Measuring systems change is challenging, especially when contexts change quickly. Luminate is a democracy and governance funder that works to ensure that everyone – especially those who are underrepresented – has the information, rights, and power to influence the decisions that affect their lives. When addressing these complex challenges, with their emergent and non-linear pathways, it is difficult to pinpoint causes or solutions or to see the broad story of change. A systems-thinking mindset that includes multiple viewpoints and stories of how change happens is therefore critical to effective measurement, and we wanted our learning framework to reflect this. We identified four requirements. The framework should:
- Be grounded in and informed by grantee perspectives and stories
- Develop a holistic understanding of causality by including diverse voices to triangulate findings
- Allow us to respond to new information rather than remaining attached to predetermined impact pathways
- Be manageable for the entire team (not just MEL staff), enabling each team to tell its own story of impact
With these considerations in mind, and recognizing that our changing contexts did not easily allow for fixed objectives and KPIs, we revisited what we meant by "measurement" and developed a new learning framework, centered on hypotheses, that would tell us the story of change.
The framework itself includes:
- Hypotheses: if-then statements linking planned activities to outcomes
- Assumptions: statements about cause and effect, context, and implementation that underlie the hypotheses; we prioritize those that would have serious consequences for our strategy if they did not hold true
- Learning questions: action-oriented questions that define our boundaries of inquiry, allowing us to test and interrogate our assumptions
- Confirming and disconfirming evidence: indications that something may or may not be true, helping us understand whether our hypotheses and assumptions hold up or need to be revised or rejected
These framework components comprise the "what" of our learning approach. Underpinning it is a culture of reflection, conversation, and learning in which we investigate causal pathways and unpick the story that emerges from the tension between confirming and disconfirming evidence. In this session, we will share the learning framework along with details about the "how": from development of the framework to setting up the complementary support structures and integrating new organizational learning practices. We will cover:
- A brief theory and background of the learning framework approach
- How we introduced the learning framework to our teams and encouraged buy-in
- How we set up our knowledge management system to house confirming and disconfirming evidence
- How we are structuring our learning conversations and analysis
- How we report our story of change to the Board in the absence of a logframe
As this is a new approach, we are eager to hear insights from session participants. We will encourage conversation around:
- How feasible or practical is this approach for other organizations?
- How have others tested the practice of collecting disconfirming evidence?
- What insights do participants have about weaving learning into routine conversations?
- How have others centered diverse voices and stories in their learning approaches to build a holistic picture of change?