116 - Navigating complexity: Crafting relevant stories through pragmatism and adapting to the unexpected
Wednesday, October 11, 2023
5:30 PM – 7:00 PM ET
Location: Griffin Hall
Abstract Information: The field of program evaluation demands a nimble, adaptive approach when working within complex partnerships and underserved contexts. This paper shares a narrative on how the evaluation process was shaped by the stories gathered from one such partnership. The partnership between Purdue University and the Indianapolis Public Schools district sought to improve the academic outcomes of students within the largest urban school district in Indiana by preparing culturally competent, highly qualified career teachers to elevate student achievement in middle and high school STEM subjects. As with many programs operating in complex multi-partner settings, unexpected events led to changes in program implementation that necessitated adaptations in the evaluation approach. To meet dynamic program needs, the evaluation employed a pragmatic approach that leveraged collaboration with a wide range of stakeholders (community members, program administrators, and other partners) to identify research questions, modify evaluation plans, collect and analyze data, and disseminate findings in ways that were relevant and useful to the program and the various stakeholder groups. This paper highlights the critical importance of adaptability and collaboration in program evaluation within complex partnerships and underserved contexts. The pragmatic approach provides a framework for achieving this goal; it can facilitate a comprehensive understanding of program outcomes and foster collaboration among diverse partners, ultimately leading to a more effective response to community needs. This paper is relevant to evaluators who work in unpredictable and complex contexts, especially those in underserved communities.
Relevance Statement: Evaluators are often tasked with assessing complex programs, policies, or initiatives and presenting compelling narratives about their effectiveness or impact. However, navigating complexity and uncertainty can be challenging. To account for these challenges, evaluators need to develop a pragmatic approach that allows them to adapt to unexpected program changes or circumstances. This paper contributes to the evaluation field by exploring the practical and theoretical considerations needed to successfully navigate complexity and craft relevant stories. Specifically, the paper describes the need for evaluators to focus on developing a pragmatic approach that allows them to make informed decisions based on multiple sources of available evidence, while also being flexible and adaptable to unexpected changes. This approach can help evaluators tell stories that accurately reflect the complexity of the programs or initiatives they are examining. Using multiple sources of evidence aligns with the principles of triangulation, which involves using different methods and data sources to validate findings, as noted by Bazeley (2021). Engaging stakeholders is also critical, as it ensures that the evaluation is responsive to the needs and concerns of those affected by the program, as noted by Rossi et al. (2021). Additionally, the paper draws on relevant standards of quality in evaluation theory, methods, and practice, which stress the importance of validity, reliability, utility, and feasibility, as outlined by the Joint Committee on Standards for Educational Evaluation (2020). This paper emphasizes pragmatism and adaptability, which align with the standards of feasibility and utility, recognizing that evaluations must be tailored to the unique context and circumstances of each program.
Evaluation is a practical activity that requires evaluators to make informed judgments based on the available evidence, as noted by Scriven (2021). This paper's value to the audience lies in its practical applications for evaluation practice. Its emphasis on pragmatism and adaptability provides evaluators with a framework for making these judgments in complex and unpredictable situations.

References

Bazeley, P. (2021). Integrating analyses in mixed methods research: Challenges and opportunities. Journal of Mixed Methods Research, 15(1), 3-16. https://doi.org/10.1177/1558689819829132

Joint Committee on Standards for Educational Evaluation. (2020). The program evaluation standards: A guide for evaluators and evaluation users (4th ed.). Sage.

Rossi, P. H., Lipsey, M. W., & Freeman, H. E. (2021). Evaluation: A systematic approach (8th ed.). Sage.

Scriven, M. (2021). The nature of evaluation. In D. L. Stufflebeam & C. F. Weisner (Eds.), Evaluation models: Viewpoints on educational and human services evaluation (2nd ed., pp. 13-40). Kluwer-Nijhoff.