Disabilities and Underrepresented Populations
Crystal Steltenpohl, PhD (she/her/hers)
Research and Evaluation Associate
Dartmouth Center for Program Design and Evaluation
Evansville, Indiana, United States
Catherine Denial, MS
Research and Evaluation Associate
Dartmouth Center for Program Design and Evaluation, United States
Erin Knight-Zhang, PhD
Senior Research Associate
Dartmouth Center for Program Design and Evaluation, United States
Location: Room 203
Abstract Information: Federal service grants often require each grantee to meet a standardized set of metrics. While these metrics serve a purpose and critically support continued grant funding, they are largely used to justify congressional budget expenditures and may or may not be useful to the communities served by the grants. Because federal service grant metrics are designed so that every grantee reports on the same outcomes, they also cannot fully encompass the experiences of participants or communities affected by grant activities. It falls to implementation and evaluation teams, then, to ensure that what is collected and reported beyond the minimum grant-assigned requirements is accurate, nuanced, and respectful of participants’ humanity, and that the knowledge gained from these projects is obtained through ethical, thoughtful methods. The papers in this session outline evaluation strategies used by overlapping implementation and evaluation teams on three projects that share the goal of improving outcomes for vulnerable, trauma-affected youth and their families; each uses mixed methods and reflexive practices to center participants’ voices and experiences. The first paper highlights low-burden strategies for increasing and incorporating input from parents of young children being screened for trauma in pediatric clinics, and shows how the resulting data have been reported to program teams to improve screening implementation practices that directly affect parents and their families. The second paper demonstrates opportunities for using additional data, such as interviews and intervention documentation, to contextualize reporting of grant-required metrics for families affected by substance use disorders (SUDs). The third paper describes our consultative work with a project team whose goal was to increase trauma screening in the context of autism evaluations.
Rather than simply reporting the number of children screened, the team aimed to validate the screener for use with autistic youth through a process modified for this population; this presentation will highlight the evaluative and consultative strategies used to meet that goal. While the strategies vary and speak to different components of the evaluation process, from design to data-based decision-making, they share the goal of contextualizing grant-required metrics through robust mixed methods evaluation. Furthermore, these three papers come from projects that have comparable goals, employ overlapping teams, engage similar professional communities, and address the needs of vulnerable, trauma-affected populations in the same region. They contribute to the field of evaluation by highlighting concrete strategies for ensuring that our work centers the stories of vulnerable populations and leads to change in the community and, ideally, in what is reported to grant funders. We will describe the strategies used, provide an overview of the challenges we have faced in implementing them, and speak more broadly about contextualizing and humanizing the data we collect to center the stories of the communities with which we work.
Relevance Statement: This session will highlight three projects that aimed to go beyond the “letter” of grant reporting requirements to meet the “spirit” of those guidelines, thereby centering the stories of the vulnerable, trauma-affected populations served. To meet the “spirit” of our grants, which are often focused on making life better for those affected by the grants’ activities, evaluators must take active steps to obtain data that speak to community needs. Additionally, it is important to understand the factors that should be considered while working with these populations and to tell their stories to project teams in rigorous, compelling, and usable ways. This proposal focuses on practical strategies that evaluators in the audience can add to their toolbox and use to spark discussions with project teams, particularly those working with vulnerable populations such as trauma-affected youth. It also explores the relationship between grant requirements and participant stories, and could be used to provide context for grantors or even to push for more nuanced, complex understandings of the impacts of these funding mechanisms. This session will provide concrete examples of, and steps toward achieving, high-quality evaluation that considers the needs and goals of communities, in line with the principles of systematic inquiry, respect for people, and common good and equity outlined by the American Evaluation Association (2018). The teams in this session have gone beyond grant reporting requirements, explored their own values in line with the goals of the grants, strategized how to accurately and fairly obtain data that could inform decisions affecting the communities involved, and attempted to address and mitigate bias where possible.
American Evaluation Association. (2018). Guiding principles for evaluators. https://www.eval.org/About/Guiding-Principles
Presenter: Catherine M. Denial, MS – Dartmouth Center for Program Design and Evaluation
Presenter: Erin M. Knight-Zhang, PhD – Dartmouth Center for Program Design and Evaluation
Presenter: Kady Sternberg, BA – Dartmouth Trauma Interventions Research Center
Presenter: Rebecca Parton, MSW, LICSW – Dartmouth Trauma Interventions Research Center
Presenter: Holly Gaspar, MEd, MPH – Population Health at Dartmouth Health
Presenter: Erin Barnett, PhD – Geisel School of Medicine at Dartmouth
Presenter: Mary K. Jankowski, PhD – Dartmouth Trauma Interventions Research Center
Presenter: Erin M. Knight-Zhang, PhD – Dartmouth Center for Program Design and Evaluation
Presenter: Crystal N. Steltenpohl, PhD (she/her/hers) – Dartmouth Center for Program Design and Evaluation