April 29, 2022 GMT
https://apnews.com/article/child-welfare-algorithm-investigation-9497ee937e0053ad4144a86c68241ef1

Inside a cavernous stone fortress in downtown Pittsburgh, attorney Robin Frank defends parents at one of their lowest points – when they risk losing their children. The job is never easy, but in the past she knew what she was up against when dealing with child protective services in family court. Now, she worries she is fighting something she cannot see: an opaque algorithm whose statistical calculations help social workers decide which families should be investigated first.

“A lot of people don’t even know it’s used,” Frank said. “Families should have the right to have all the information in their file.”

From Los Angeles to Colorado and across Oregon, as child welfare agencies use or test tools similar to the one in Allegheny County, Pennsylvania, an Associated Press review has identified a number of concerns about the technology, including questions about its reliability and its potential to worsen racial disparities in the child welfare system. Related concerns have already torpedoed some jurisdictions’ plans to use predictive models, such as the tool notably dropped by the state of Illinois.

According to new research from a Carnegie Mellon University team obtained exclusively by the AP, Allegheny’s algorithm in its early years showed a pattern of flagging a disproportionate number of Black children for a “mandatory” neglect investigation, compared with white children. The independent researchers, who received data from the county, also found that social workers disagreed with the risk scores the algorithm produced about one-third of the time.

County officials said social workers can always override the tool and called the research “hypothetical.”

Child welfare officials in Allegheny County, the cradle of Mister Rogers’ television neighborhood and the icon’s child-centric innovations, say the cutting-edge tool – which is gaining traction across the country – uses data to support agency workers as they try to protect children from neglect. That nuanced term can include everything from inadequate housing to poor hygiene, but it is a different category from physical or sexual abuse, which is investigated separately in Pennsylvania and is not subject to the algorithm.

“Workers, whoever they are, should not be asked to make, in a given year, 14, 15, 16,000 of these kinds of decisions with incredibly imperfect information,” said Erin Dalton, director of the county’s Department of Human Services and a pioneer in applying predictive algorithms to child welfare.
This story, supported by the Pulitzer Center on Crisis Reporting, is part of an ongoing Associated Press series, “Tracked,” that explores the power and implications of algorithmic decisions in people’s daily lives.
Critics say it gives a program powered by data collected mostly about poor people an outsized role in deciding families’ fates, and they warn against local officials’ growing reliance on artificial intelligence tools.

If the tool had acted on its own to screen in a comparable rate of calls, it would have recommended that two-thirds of Black children reported be investigated, compared with about half of all other children reported, according to another study published last month and co-authored by a researcher who audited the county’s algorithm.

Advocates worry that if similar tools are used in other child welfare systems with minimal or no human intervention – akin to how algorithms have been used to make decisions in the criminal justice system – they could reinforce existing racial disparities in the child welfare system.

“It is not decreasing the impact on Black families,” said Logan Stapleton, a researcher at Carnegie Mellon University. “In terms of accuracy and disparity, (the county is) making strong statements that I think are misleading.”

Because family court hearings are closed to the public and the records are sealed, the AP was unable to identify first-hand any families the algorithm recommended be mandatorily investigated for child neglect, nor any cases that resulted in a child being sent to foster care. Families and their attorneys can never be sure of the algorithm’s role in their lives either, because they are not allowed to know the scores.

SAFER, FASTER

Incidents of potential neglect are reported to Allegheny County’s child protection hotline. The reports go through a screening process in which the algorithm calculates the child’s potential risk and assigns a score. Social workers then use their discretion to decide whether to investigate.

The Allegheny Family Screening Tool is specifically designed to predict the risk that a child will be placed in foster care in the two years after the family is investigated. Drawing on a trove of detailed personal data collected from birth, Medicaid, substance abuse, mental health, jail and probation records, among other government data sets, the algorithm calculates a risk score from 1 to 20: the higher the number, the greater the risk.

Given the high stakes – ignoring a report of neglect could end in a child’s death, but scrutinizing a family’s life could set them up for separation – the county and the developers have suggested their tool can help “course correct” and make the agency’s work more thorough and efficient by weeding out meritless reports, so that social workers can focus on children who truly need protection.

The developers have described using such tools as a moral imperative, saying child welfare officials should use whatever they have at their disposal to make sure children are not neglected.

“There are children in our communities who need protection,” said Emily Putnam-Hornstein, a professor at the University of North Carolina at Chapel Hill’s School of Social Work who helped develop Allegheny’s tool, speaking at a virtual panel hosted by New York University in November.

Dalton said algorithms and other predictive technologies also provide a scientific check on the personal biases of call screening workers, who see the risk scores when deciding whether a case merits investigation. If a case is escalated, Dalton said, a different social worker conducts the full investigation in person, determines whether the allegations are true and helps decide whether the children should be placed in foster care.
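To make the screening flow described above concrete, here is a minimal illustrative sketch in Python. It is not the Allegheny Family Screening Tool itself: the feature names, weights and the mandatory-investigation threshold are placeholder assumptions, used only to show the general pattern the article describes – a hotline report is scored from 1 to 20, a high score can flag a case for mandatory investigation, and a human worker can override the recommendation.

```python
# Illustrative sketch only -- NOT the Allegheny Family Screening Tool.
# The fields, weights and MANDATORY_THRESHOLD below are assumptions made
# for demonstration; the county's actual model and cutoff are not public
# in this form.

from dataclasses import dataclass
from typing import Optional

MANDATORY_THRESHOLD = 17  # assumed cutoff for a "mandatory" screen-in


@dataclass
class HotlineReport:
    """Simplified stand-in for the government records the article lists."""
    prior_referrals: int           # prior child-welfare referrals for the family
    public_benefit_records: int    # e.g., Medicaid touchpoints
    behavioral_health_records: int
    justice_system_records: int


def risk_score(report: HotlineReport) -> int:
    """Toy scoring rule that maps record counts onto the 1-20 scale."""
    raw = (
        2 * report.prior_referrals
        + report.public_benefit_records
        + report.behavioral_health_records
        + 2 * report.justice_system_records
    )
    return max(1, min(20, raw))


def screening_decision(report: HotlineReport,
                       worker_screens_in: Optional[bool] = None) -> str:
    """Combine the algorithmic score with the worker's discretionary call."""
    score = risk_score(report)
    recommendation = ("mandatory investigation recommended"
                      if score >= MANDATORY_THRESHOLD
                      else "worker discretion")
    # As the article notes, workers can override the tool in either direction.
    if worker_screens_in is not None:
        action = "investigate" if worker_screens_in else "screen out"
        return f"score={score}: worker override -> {action}"
    return f"score={score}: {recommendation}"


if __name__ == "__main__":
    report = HotlineReport(prior_referrals=3, public_benefit_records=4,
                           behavioral_health_records=2, justice_system_records=1)
    print(screening_decision(report))                            # recommendation only
    print(screening_decision(report, worker_screens_in=False))   # human override
```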
CMU researchers found that from August 2016 to May 2018, the tool calculated scores suggesting that 32.5% of Black children reported as neglected should be subject to a “mandatory” investigation, compared with 20.8% of white children.

In addition, the county confirmed to the AP that for more than two years, a technical glitch in the tool sometimes presented social workers with incorrect scores, either underestimating or overestimating a child’s risk. County officials said the problem has since been fixed.

The county did not dispute the CMU findings, but Dalton said the research described a “hypothetical” scenario far removed from the way the tool has been implemented to support the county’s workforce. The CMU study found no such disparity in the percentage of Black families actually investigated after the algorithm was adopted; the researchers found that workers were able to reduce the disparity generated by the algorithm’s scores.

The county says social workers are always in the loop and ultimately responsible for deciding which families to investigate, because they can override the algorithm even when it flags a case for mandatory investigation. Dalton said the tool would never be used on its own in Allegheny and doubted that any county would allow fully automated decision-making about family life.

“Of course they could do that,” she said. “I think they are less likely to, because it makes no sense to do it.”

Despite what the county describes as safeguards, a child welfare specialist who worked for an Allegheny County contractor says there is still cause for concern.

“When you have technology designed by humans, the bias is going to show up in the algorithms,” said Nico’Lee Biddle, who has worked for nearly a decade in child welfare, including as a family therapist and foster care placement specialist in Allegheny County. “If they designed a perfect tool, it doesn’t matter, because it was built from very imperfect data systems.”

Biddle is a former foster child who became a therapist, social worker and advocate. In 2020, she resigned, largely out of growing frustration with the child welfare system. She also said officials dismissed her concerns when she asked why families had been referred for investigation in the first place.

“We could see the referral and that decision, but we could never see the actual tool,” she said. “They would meet me with … ‘What does that have to do with it now?’”

In recent years, calls to reform – or abolish – child protective services have grown, as generations of harrowing foster care outcomes have been shown to be rooted in racism. In a memo last year, the U.S. Department of Health and Human Services cited racial inequities “at nearly every major decision-making point” in the child welfare system, an issue that Aysha Schomburg, associate commissioner of the U.S. Children’s Bureau, says leads to more than half of all Black children nationwide being investigated by social workers.

“Over-surveillance leads to mass family separation,” Schomburg wrote in a recent blog post.

With debates over racial equity looming large in child welfare circles, Putnam-Hornstein joined a roundtable of experts convened by the conservative American Enterprise Institute last fall and co-wrote a paper that struck back at supporters who believe child welfare systems are …