Senior Manager of Analytic Services, BCT Partners, Aurora, Connecticut, United States
Racial and gender bias in data science can have severe consequences, including reinforcing harmful stereotypes, perpetuating systemic inequalities, and producing exclusionary policies and practices. This presentation describes the consequences of gender and racial bias for evaluation and the strategies we have used to integrate storytelling into our evaluation practice, including participatory analytic methods that engage communities and increase data transparency. We will discuss how our approach addresses subjectivity and bias in analytic outcomes and algorithms by centering the perspectives of those whose stories are being told, and we will describe how we identified and addressed these biases through culturally responsive evaluation practices, co-created research agendas, and community-led research methods. Ultimately, our presentation will demonstrate how thoughtful, participatory analytics can promote equity in data science and help break down barriers to greater justice and fairness in our society. By reducing bias and broadening the range of perspectives and experiences included in data analysis, we can develop more accurate and inclusive representations of the world around us.