Program Theory and Theory-Driven Evaluation
Huey Chen, PhD
Professor
Department of Public Health, Mercer University
Johns Creek, Georgia, United States
Stewart Donaldson, Ph.D.
Distinguished University Professor
Claremont Graduate University
Claremont, California, United States
Location: White River Ballroom C
Abstract Information: The Campbellian validity typology has made significant contributions by providing guidance and tools for conducting outcome evaluation. One of its influential tenets is the claim that randomized controlled trials (RCTs) are the best method for documenting credible relationships between interventions and their outcomes across the sciences, including evaluation. Because RCTs follow strict protocols, they are recommended as the best design. However, this one-size-fits-all approach does not match the reality of intervention programs. For example, applying RCTs in community settings is difficult, very expensive, and ethically problematic, and the results generated are mixed. The theory-driven evaluation literature points to major shortcomings of such method-driven evaluation. This presentation provides concrete examples to illustrate the major weaknesses of using the Campbellian validity typology and RCTs for evaluating real-world programs. These weaknesses are examined through three paradigms for approaching real-world problems: reductionism, systems thinking, and pragmatic synthesis. Because of their ability to isolate the pure effects of interventions, RCTs may be ideal for evaluating program components that can be examined from a reductionist viewpoint. However, real-world programs are highly interactive and context-dependent, making a reductionist approach very difficult to apply; in such cases, using RCTs can be a waste of money and counterproductive. Different evaluation perspectives are necessary to allow the examination of real-world relationships that are better described by systems thinking or pragmatic synthesis. The presentation focuses on one alternative evaluation perspective, pragmatic synthesis, because the majority of intervention programs require a problem-solving philosophy that is practical, informative, and useful.
Applications of this perspective across different program phases (i.e., planning, implementation, outcome, and dissemination or scale-up) will be explained.
Relevance Statement: The evaluation literature often adopts a universalist perspective in selecting the best approach for evaluating an intervention program. For example, randomized controlled trials (RCTs) are often promoted as the best approach for evaluating the effectiveness of intervention programs. While RCTs are useful for documenting the pure, independent effects of interventions, they do so in controlled environments. Since stakeholders do not work in controlled environments, they report that the information on pure independent effects obtained from RCTs is not useful to them. This paper argues that the selection of an evaluation approach must be contingent on which theory of the program was used to establish the program. There are three such theories: reductionism, systems thinking, and pragmatic synthesis. The contingency perspective holds that different theories of the program require different evaluation approaches. For example, RCTs are excellent for evaluating programs based on reductionism, but not useful for evaluating programs based on systems thinking or pragmatic synthesis. Thus, the selection of an evaluation theory and approach must be contingent on the theory of the program underlying an intervention, to avoid the trap of one-size-fits-all. The paper discusses evaluation approaches suitable for evaluating programs based on systems thinking or pragmatic synthesis.
References:
Chen, H. T. (2015). Practical Program Evaluation: Theory-Driven Evaluation and the Integrated Evaluation Perspective (2nd ed.). Thousand Oaks, CA: Sage.
Chen, H. T. (2016). "Interfacing Theories of Program with Theories of Evaluation for Advancing Evaluation Practice: Reductionism, Systems Thinking, and Pragmatic Synthesis." Evaluation and Program Planning. doi:10.1016/j.evalprogplan.2016.05.012.
Chen, H. T., Donaldson, S., & Mark, M. (Eds.) (2011). Advancing Validity in Outcome Evaluation: Theory and Practice. New Directions for Evaluation, No. 130. San Francisco: Jossey-Bass.