
Computer program aimed at predicting child abuse debated

  • By Kim Strong/York Daily Record


(York) — When a 4-year-old girl drowned in her swimming pool near Pittsburgh on an August day in 2019, the family was already known to child welfare.

Reports of “lack of supervision” in the home had been called in to Allegheny County’s Department of Human Services six times between December 2017 and July 2019, and five of those reports were labeled “unfounded” or “invalid,” according to a state document. The sixth call was investigated, and child welfare was involved with the family when Ca-Niyah Mitchell slipped into the pool without a life vest while her father was inside their house.

Charles Mitchell pleaded guilty to child endangerment last year and received probation.

This is the horror story of child welfare: red flags signaled that a child was in danger, but they were missed, screened out by the person taking the call and put into the “unfounded” pile.

When serious neglect and abuse cases happen in Allegheny County, “we review them with a fine-tooth comb with tons and tons of stakeholders, and just so often you see these cases where there’d been a call for abuse and neglect, and we screened it out,” said Erin Dalton, director of the county’s Department of Human Services.

Dalton and her predecessor, Marc Cherna, set about changing that six years ago. Working with engineers in New Zealand, they built an algorithm, or predictive analysis tool, to support the decision-making of the person screening those critical calls.

It has its critics, but Dalton remains the algorithm’s biggest cheerleader.


The critical call

There’s a critical moment in child welfare that happens when a phone screener receives a call from someone reporting child neglect or abuse.

The decision has to be made whether to proceed into an investigation or set the case aside. Screeners looking at the facts of a case, even given family background to inform the decision, were often leaning in the wrong direction in Allegheny County.

An analysis of the county’s DHS five years ago found that 27% of the highest risk child welfare cases were being screened out and 48% of the lowest risk cases were being screened in, according to DHS.

“First and foremost, our job is to keep children safe,” Dalton said.

The county decided to turn to data it had been collecting on local families since 1998.

DHS screeners had been using that information to inform their decisions in the critical moment of moving an abuse or neglect case into an investigation or screening it out. It was just a lot of information, though, and screeners had limited time to dig through it all, Dalton said.

DHS decided that the review of that information could be automated. A partnership with the Auckland University of Technology led to the Allegheny Family Screening Tool, which assigns a number to a child based on data fed into the system from publicly available records.

That number predicts the likelihood the family will be referred to DHS again: the higher the number, the more likely a new referral.
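The article doesn’t spell out the model’s internals, but a screening score like this is typically produced by a supervised risk model trained on historical referrals: features derived from a family’s public-records history go in, and out comes a probability of re-referral that can be mapped to a score. The sketch below is a minimal illustration of that general pattern, using synthetic data, hypothetical feature names, and scikit-learn’s logistic regression; it is not the Allegheny Family Screening Tool itself, and the 1-to-20 scale is assumed purely for demonstration.

    # Illustrative sketch only -- not the actual Allegheny Family Screening Tool.
    # Feature names, data, and the score scale are hypothetical assumptions.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split

    # Synthetic training data: one row per past screened call. Columns stand in
    # for public-records features (e.g., count of prior referrals, number of
    # public services used, months since last contact); the label marks whether
    # the family was referred to DHS again within a follow-up window.
    rng = np.random.default_rng(0)
    X = rng.random((500, 3))
    y = (rng.random(500) < 0.3).astype(int)  # 1 = re-referred later, 0 = not

    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    # Fit a simple supervised risk model on the historical cases.
    model = LogisticRegression().fit(X_train, y_train)

    # At call time, the family's feature vector yields a probability of
    # re-referral, which is binned into a coarse screening score.
    prob = model.predict_proba(X_test[:1])[0, 1]
    score = max(1, int(np.ceil(prob * 20)))
    print(f"predicted re-referral probability: {prob:.2f}, screening score: {score}")

In this framing, “the higher the number, the more likely” is simply the predicted probability rounded into a coarser scale a screener can read at a glance; a real deployment would involve far more careful feature selection, validation, and fairness auditing than this toy example suggests.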

“It’s just that humans just aren’t good at this. They have their own biases. And so having a tool like this that can help to provide that kind of information to really talented staff really does just change everything,” Dalton said.

Fixing what’s broken

“I was hired 25 years ago to fix child welfare. We were known as a national disgrace,” Marc Cherna wrote to his colleagues last year when he announced his retirement as head of Allegheny County’s Department of Human Services.

When he took over the department, children were being removed from their homes and placed in foster care at alarmingly high rates in Allegheny County, and Cherna sought to change that.

“He didn’t go far enough (to fix the system), and then he thinks we’re still not finding all of the horror story cases, so we’ll use the algorithm,” said Richard Wexler, executive director of the National Coalition for Child Protection Reform.

Wexler and other critics of the algorithm believe it carries its own biases and pushes more children out of their homes because they’re poor, not because they’re in danger.

“If this is so great, why is that, by and large, the people who are most enthused about it are people whose statements or track record shows a strong propensity for taking away kids (from their homes)? If this is really a way to preserve families, why isn’t the family preservation movement leading the charge for this? We’re not. Why isn’t the racial justice movement leading the charge for it instead of saying: Hey we know what happened in criminal justice, why do we think it’s gonna be any different in child welfare?” he said.

“You are destroying families. You are emotionally traumatizing children, and you are exposing children to high rates of abuse in foster care, and at the same time, you’re making it harder to find those few children who really need to be saved. … Wrongful removal drives all of the problems,” Wexler said.

What’s wrong with algorithms?

One of the issues with algorithms, generally, is the data that’s used to feed them, according to Nicol Turner-Lee, director of the Center for Technology Innovation and senior fellow in governance studies at the Brookings Institution.

“First and foremost, computers do not discriminate; people do. The people behind these models may come with explicit and implicit biases that are baked into the model,” she said.

The data that’s used for algorithms is primarily public data, so the poor family using government services for food, housing, drug and alcohol counseling and mental health treatment will have much more data in the public sphere than a wealthier family using private insurance for counseling and treatment.

“Computer technology that gets deployed takes on the face of those communities, so unfortunately, you can look at any algorithm, like a criminal justice algorithm, and it picks up the historical legacy that are vetted in unfair systems,” Turner-Lee said.

“I’ve had a keen interest in not just addressing the output portion of the problems, which is the ultimate prediction, which can have a disproportionate impact on vulnerable populations, but I’ve also taken an interest in the design and the evaluation of these products. Who’s at the table? How do they get formed? What questions are they trying to solve? And whether or not they’re being done from a diverse perspective,” she said.

Political scientist and author Virginia Eubanks wrote a book about predictive algorithms, “Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor,” which chronicles the issues with these predictive tools in three places, one of them Allegheny County and its Family Screening Tool.

“Faith that big data, algorithmic decision-making, and predictive analytics can solve our thorniest social problems—poverty, homelessness, and violence—resonates deeply with our beliefs as a culture. But that faith is misplaced,” Eubanks wrote in an article in Wired magazine. “These grand hopes rely on the premise that digital decision-making is inherently more transparent, accountable, and fair than human decision-making. But, as data scientist Cathy O’Neil has written, ‘Models are opinions embedded in mathematics.’”

She goes on to say: “Allegheny County has an extraordinary amount of information about the use of public programs. But the county has no access to data about people who do not use public services. Parents accessing private drug treatment, mental health counseling, or financial support are not represented in DHS data. Because variables describing their behavior have not been defined or included in the regression, crucial pieces of the child maltreatment puzzle are omitted from the AFST.”

Reducing racial disparities

The University of Pittsburgh hosts a task force that looks at algorithms being used by government agencies, including the Allegheny Family Screening Tool, as the use of algorithms is becoming more common.

“As I think about that system and others, I’ve had this kind of framing in my mind, of what is the thing that it’s replacing? What was the legacy, human decision-making process? Is this thing offering benefit? And for the screening tool, the county has shown some data that it has reduced racial disparities. We want that in a system like this,” said Chris Deluzio, who works on that task force and is the policy director at Pitt’s Institute for Cyber Law, Policy, and Security.

The task force is currently working on a report about the child welfare algorithm, to be published this year.

In an independent ethical analysis of the tool, two professors “concluded the tool is ethically appropriate, particularly because its accuracy exceeded the alternatives at the time and there would be ethical issues in not using the most accurate measure,” according to Allegheny County’s DHS.

The state backs the tool as well: “The Department of Human Services supports Allegheny County’s efforts to protect children and strengthen families. DHS has taken some initial steps to research predictive risk modeling, but there are no immediate plans to develop a statewide model,” said spokesperson Erin James.
