The Bakersfield Californian

Child welfare algorithm faces scrutiny

BY SALLY HO AND GARANCE BURKE

PITTSBURGH — The Justice Department has been scrutinizing a controversial artificial intelligence tool used by a Pittsburgh-area child protective services agency following concerns that the tool could lead to discrimination against families with disabilities, The Associated Press has learned.

The interest from federal civil rights attorneys comes after an AP investigation revealed potential bias and transparency issues surrounding the increasing use of algorithms within the troubled child welfare system in the U.S. While some see such opaque tools as a promising way to help overwhelmed social workers predict which children may face harm, others say their reliance on historical data risks automating past inequalities.

Several civil rights complaints were filed in the fall about the Allegheny Family Screening Tool, which is used to help social workers decide which families to investigate, AP has learned. The pioneering AI program is designed to assess a family’s risk level when they are reported for child welfare concerns in Allegheny County.

Two sources said that attorneys in the Justice Department’s Civil Rights Division cited the AP investigation when urging them to submit formal complaints detailing their concerns about how the algorithm could harden bias against people with disabilities, including families with mental health issues.

A third person told AP that the same group of federal civil rights attorneys also spoke with them in November as part of a broad conversation about how algorithmic tools could potentially exacerbate disparities, including for people with disabilities. That conversation explored the design and construction of Allegheny’s influential algorithm, though the full scope of the Justice Department’s interest is unknown.

All three sources spoke to AP on the condition of anonymity, saying the Justice Department asked them not to discuss the confidential conversations. Two said they also feared professional retaliation.

Wyn Hornbuckle, a Justice Department spokesman, declined to comment.

Algorithms use pools of information to turn data points into predictions, whether that’s for online shopping, identifying crime hotspots or hiring workers. Many agencies in the U.S. are considering adopting such tools as part of their work with children and families.

Though there’s been widespread debate over the moral consequences of using artificial intelligence in child protective services, the Justice Department’s interest in the Allegheny algorithm marks a significant turn toward possible legal implications.

Robin Frank, a veteran family law attorney in Pittsburgh and vocal critic of the Allegheny algorithm, said she also filed a complaint with the Justice Department in October on behalf of a client with an intellectual disability who is fighting to get his daughter back from foster care. The AP obtained a copy of the complaint, which raised concerns about how the Allegheny Family Screening Tool assesses a family’s risk.

“I think it’s important for people to be aware of what their rights are and to the extent that we don’t have a lot of information when there seemingly are valid questions about the algorithm, it’s important to have some oversight,” Frank said.

Mark Bertolet, spokesman for the Allegheny County Department of Human Services, said by email that the agency had not heard from the Justice Department and declined interview requests.

“We are not aware of any concerns about the inclusion of these variables from research groups’ past evaluation or community feedback on the (Allegheny Family Screening Tool),” the county said, describing previous studies and outreach regarding the tool.

Child protective services workers can face critiques from all sides. They are blamed both for over-surveillance and for not doing enough to support the families who come into their view. The system has long been criticized for disproportionately separating Black, poor, disabled and marginalized families and for insufficiently addressing — let alone eradicating — child abuse and deaths.

Supporters see algorithms as a data-driven solution to make the system both more thorough and efficient, saying child welfare officials should use all tools at their disposal to make sure children aren’t maltreated.

Critics worry that delegating some of that critical work to AI tools powered by data collected largely from people who are poor can bake in discrimination against families based on race, income, disabilities or other external characteristics.

The AP’s previous story highlighted data points used by the algorithm that can be interpreted as proxies for race. Now, federal civil rights attorneys have been considering the tool’s potential impacts on people with disabilities.

The Allegheny Family Screening Tool was specifically designed to predict the risk that a child will be placed in foster care in the two years after the family is investigated.

The county said its algorithm has used data points tied to disabilities in children, parents and other members of local households because they can help predict the risk that a child will be removed from their home after a maltreatment report. The county added that it has updated its algorithm several times and has sometimes removed disability-related data points.
