Teaching of Evaluation
Susan Tucker, PhD (she/her/hers)
President
Evaluation & Development Associates LLC, United States
Jennifer Villalobos, MA
Doctoral Candidate
Claremont Graduate University, United States
Minji Cho, MSc
Researcher/Evaluator
Claremont Graduate University
Claremont, California, United States
Location: Room 204
Abstract Information: This session provides a timely discussion of the challenges of aligning the AEA Evaluator Competencies with evaluator education and professional development design. In a multi-method case study, we assessed the competencies using a cognitive complexity framework within the context of university evaluator education. Through independent coding, followed by collaborative discussion among three evaluators with diverse practical backgrounds, we used Clinton and Hattie’s (2021) framework of evaluator knowledge dimensions to examine how competencies might best be targeted during evaluator development. Working to understand each competency in relation to its practical function elicited spirited discussions rich with stories from personal evaluation practice. Findings highlight the varying degrees of “knowing” required to apply individual competencies and the importance of creating teaching tools informed by evaluation practice. Using a course syllabus example, this session will demonstrate one method of applying a cognitive complexity framework as a pedagogical tool to strengthen the integration of the AEA Evaluator Competencies into evaluator education and development. A learning framework can facilitate designing, improving, researching, and evaluating evaluator preparation programs.
Relevance Statement: The conference theme, The Power of Story in Evaluation, is relevant to this proposal about integrating the use of competencies within evaluator professionalization and aligning with AEA’s Guiding Principles (Tucker et al., 2022). Since the AEA board’s acceptance of the AEA Evaluator Competencies in 2018, research has slowly emerged regarding their utility for practice and professionalization, and there has been limited research on competency alignment with evaluator education programs. Developing practice-based research on the evaluator competencies is necessary not only for evidence-based evaluator preparation programs but also for designing and delivering a growing continuum of educational pathways, from young and emerging evaluators to highly experienced ones. Moreover, engaging in discussions about using competencies in evaluator professionalization is the next step of that professionalization (Tucker & King, 2020; 2022). The AEA competencies can be used to design and review evaluator education programs, strengthen program credibility, and build the capacity of program managers and commissioners of evaluation. As the field demands more evidence-based professionalization practices, this session will focus on a case study, conducted by three evaluators with diverse practice and academic backgrounds, of how the AEA Evaluator Competencies can be integrated into university preparation programs using a cognitive complexity framework. The session will demonstrate how a learning framework, used alongside practice-based discussions, can support the development of more applied, competency-aligned evaluator training programs. Exploring this framework from multiple perspectives (e.g., research, program management, design, and a range of evaluator expertise) promises to help evaluation educators and training program designers improve their evaluation training content and teaching practices.
As the session is developed from a multi-method case study, we expect to demonstrate our step-by-step process of 1) incorporating evaluator experience into understanding the implementation of the AEA competencies; 2) using a cognitive complexity knowledge framework to explore the teaching and learning of the AEA Evaluator Competencies; and 3) using examples of course syllabi and delivery to explore the alignments and gaps between the AEA Competencies and current evaluator training courses. Finally, session facilitators aim to create a space for participants to reflect on their evaluation practice and teaching, and to share ideas on how they can improve their teaching practice to better support evaluators in developing the competencies.

References:
Clinton, J. M., & Hattie, J. (2021). Cognitive complexity of evaluator competencies. Evaluation and Program Planning, 89. https://doi.org/10.1016/j.evalprogplan.2021.102006
Tucker, S. A., & King, J. A. (2020). Next steps for AEA and the newly created AEA Professionalization and Competencies Working Group. New Directions for Evaluation, 2020(168), 149–162. https://doi.org/10.1002/ev.20436
Tucker, S., Stevahn, L., & King, J. A. (2022). Professionalizing evaluation: A time-bound comparison of the American Evaluation Association’s foundational documents. American Journal of Evaluation, 0(0). https://doi.org/10.1177/10982140221136486