
Over the past seven years, workers at the Allegheny County Department of Human Services have frequently used an AI predictive risk modeling program to help assess the likelihood that children will be placed into the greater Pittsburgh area’s foster care system. In recent months, however, the algorithms underlying the Allegheny Family Screening Tool (AFST) have drawn increased scrutiny for their opaque design, given predictive AI tools’ longstanding racial, class, and gender-based biases.

Previous reporting by the Associated Press on the Allegheny Family Screening Tool’s algorithm revealed that certain data points could function as proxies for racial groups. Now it appears the AFST could also be affecting families in the disabled community, as well as families with a history of mental health conditions. And the Justice Department is taking notice.

[Related: The White House’s new ‘AI Bill of Rights’ plans to tackle racist and biased algorithms.]

According to a new report published today by the Associated Press, multiple formal complaints regarding the AFST have been filed with the Justice Department’s Civil Rights Division, citing the AP’s prior investigations into its potential problems. Anonymous sources within the Justice Department say officials are concerned that the AFST’s overreliance on potentially skewed historical data risks “automating past inequalities,” particularly longstanding biases against people with disabilities and mental health conditions.

The AP explains that the Allegheny Family Screening Tool uses a “pioneering” AI program ostensibly designed to help overworked social workers in the greater Pittsburgh area determine which families require further investigation of child welfare claims. More specifically, the tool was built to help predict the risk of a child being placed into foster care within two years of an investigation into their family environment.

The AFST’s black-box design reportedly takes into account numerous case factors, including “personal data and birth, Medicaid, substance abuse, mental health, jail and probation records, among other government data sets,” to flag families for further investigation for neglect. Although human social service workers ultimately decide whether to follow up on a case after seeing the AFST’s results, critics argue the program’s potentially faulty judgments could sway those decisions.
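To make the critics’ concern concrete, here is a minimal, purely hypothetical sketch of how a predictive risk model of this general kind collapses government records into a single score. The feature names, weights, and score scale below are invented for illustration; this is not the county’s actual model or code.

```python
# Hypothetical illustration only: a toy risk scorer in the general style of
# predictive risk models like the AFST. All feature names and weights are
# invented; a real system would learn weights from historical outcome data.

from dataclasses import dataclass

@dataclass
class FamilyRecord:
    # Each field stands in for one of the government data sets the AP describes.
    medicaid_records: int        # relevant Medicaid entries
    substance_abuse_flags: int   # substance abuse treatment records
    mental_health_flags: int     # mental health records
    jail_probation_records: int  # jail and probation records

# Invented weights -- in a deployed model these would be learned from past
# outcomes, which is exactly where critics say historical bias can creep in.
WEIGHTS = {
    "medicaid_records": 0.5,
    "substance_abuse_flags": 1.5,
    "mental_health_flags": 1.0,
    "jail_probation_records": 2.0,
}

def risk_score(record: FamilyRecord, max_score: int = 20) -> int:
    """Collapse a family's records into a single risk score from 1 to max_score."""
    raw = sum(WEIGHTS[name] * getattr(record, name) for name in WEIGHTS)
    return max(1, min(max_score, round(raw)))

if __name__ == "__main__":
    family = FamilyRecord(medicaid_records=3, substance_abuse_flags=0,
                          mental_health_flags=2, jail_probation_records=1)
    # The screener sees only the final number, not how each record shaped it --
    # the "black box" concern raised in the reporting.
    print(risk_score(family))  # prints 6 on the toy 1-20 scale
```

Even in this toy version, the design choice critics object to is visible: records tied to disability, mental health, or poverty raise the score directly, and the person reviewing the case sees only the aggregate number.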

[Related: The racist history behind using biology in criminology.]

A spokesman for the Allegheny County Department of Human Services told the AP the department was not aware of any Justice Department complaints, and declined to discuss the larger criticisms of the screening tool.

Child protective services systems have long faced extensive criticism of both their overall effectiveness and the disproportionate consequences faced by Black, disabled, poor, and otherwise marginalized families. The AFST’s official website prominently features third-party studies, reports, and articles attesting to the program’s purported reliability and utility.