Won’t somebody think of the children

Machine learning algorithms are also used by child welfare and family services agencies to predict which children may be at greatest risk of abuse. In 2014, Los Angeles County started testing a system called Approach to Understanding Risk Assessment, or AURA, for that purpose, but dropped it three years later after officials said it was returning thousands of false positives. According to the American Civil Liberties Union, however, child-welfare agencies in at least 11 states last year were still using predictive analytics to help identify children at risk of abuse. One is Allegheny County, Pennsylvania's screening tool, which since 2016 has generated "family screening scores" that help case workers predict the long-term likelihood that children will need to be removed from their homes.

Written by Colin Wood

Colin Wood is the editor in chief of StateScoop and EdScoop. He's reported on government information technology policy for more than a decade, on topics including cybersecurity, IT governance and public safety.