Albatross County Child Abuse Screening Tool

Date

2025-06

Publisher

Virginia Tech

Abstract

This case study investigates the use of predictive analytics in child welfare services, focusing on the ethical and systemic dilemmas that arise when artificial intelligence is used to assess child abuse and neglect risk. Drawing on the example of the Albatross County Office of Child Welfare (OCW), the case explores how data-driven tools are deployed to assist call screeners in evaluating hotline reports, and whether these tools truly reduce human bias or simply encode it in new forms. By analyzing the integration of the ALCAST risk assessment model, the case raises critical questions about the conflation of poverty with neglect, the limits of data transparency, and the shifting role of professional judgment. It further explores how such technologies may reinforce racial disparities by relying on incomplete and biased public data. The case prompts reflection on human discretion, institutional accountability, and the societal consequences of automating social services. Through this lens, it encourages readers to consider what ethical, inclusive, and community-informed AI design might look like in public-sector decision-making.

Keywords

Child welfare systems, Predictive analytics, Algorithmic bias
