How Child Advocacy Centers Can Lead the Way for AI in Child Welfare

Unlocking the power of artificial intelligence for good.

Child advocacy centers are natural connectors. They bridge the gaps between partner agencies such as law enforcement, child protective services, victim advocates, medical and mental health professionals, and legal services to deliver lifesaving services to children whose experiences of trauma need to be heard. Breaking down silos and weaving this network of collaborating partners is critical to pursuing justice and well-being for the nearly 700,000 children who are victims of child sexual assault, abuse, and neglect in a given year.

Today, with advances in technological horsepower, child advocacy centers are transitioning their historic interagency coalition-building expertise into the digital era. In doing so, they are leading the way to a future where ethical artificial intelligence will unearth new possibilities for the children and families served by the child welfare system.

The state of AI in child welfare:

Jacqueline Schafer’s recently published article, “Harnessing AI Innovation for Struggling Families”, raises insightful points about both the opportunities of pursuing, and the costs of not pursuing, leading-edge technology to advance the child welfare system.

The outcomes of child welfare cases may affect generations of Americans, resulting in an enormous human cost of inherited trauma, cycles of abuse, and lost potential for those who come into contact with the child welfare system.

As Schafer eloquently points out, moving the needle on artificial intelligence in child welfare to drive enhanced outcomes is both possible and complex – it requires investing in new systems, updating data-sharing laws, creating connected datasets, reducing friction between state autonomy and federal support, and balancing appropriate ethical safeguards – but the promise it holds for vulnerable children and families, and the generations that follow, is consequential.

Child advocacy centers adopting state-of-the-art technology:

At Guardify, we see the hunger in child advocacy centers to lean in and digitally transform their practices, knowing it will elevate their ability to serve children. And while systems change is no simple process, these centers are both historically experienced and uniquely positioned to drive it across the multiple agencies serving these children.

The appetite to adopt advancing technology is evidenced by more than 120 centers across the country working with Guardify, along with more than 2,500 collaborating agencies. And the number of centers and multidisciplinary partners embracing collaboration through Guardify is growing daily. This coordination of efforts around centralized information has created a new opportunity to gain insight into practices that advance forensic interviewing, enhance quality assurance, reduce secondary stress and staff burnout, and more fully inform service provisioning to better support pathways toward justice and healing.

The voices of child abuse survivors can shed new light:

Child forensic interviews are grounded in practices focused on documenting children’s experiences and testimonies with integrity, without using leading questions. As a result, these recordings are some of the best sources of direct information before further interpretations come into play. With more than 27,000 child forensic interviews already in Guardify, the potential to use leading technology to remove personally identifiable information and analyze patterns in the remaining de-identified data has the power to unlock a wealth of new insights for the field of child welfare. And the more child advocacy centers see the opportunity that technology like Guardify offers – not only to strengthen the protection of these children’s stories in a secure collaborative environment, but also to elevate the field – the more valuable the AI insights will continue to become.

Relying on the value of survivor-centricity to drive AI ethics:

Child advocacy centers are built on survivor-centricity – interview the child one time and coordinate the system of service delivery around that single instance. Survivor-centricity rejects the alternative, fragmented approach in which each multidisciplinary partner conducts its own separate interview with the child, further exacerbating trauma. This core value of survivor-centricity is critical to driving the requisite privacy, ethical, and accountability considerations for the responsible use of AI in child welfare. Furthermore, child advocacy centers often drive conversations about trauma-informed practices that have the potential to illuminate best practices in ethical AI.

At Guardify, we believe in the power of child advocacy centers, coordinating with their multidisciplinary partners, to continue to be innovators in child protection. We agree with Schafer’s assertion that the opportunity to consider the next generation of advances in child welfare is upon us. We look forward to working with child advocacy centers, thought leaders like Schafer, and other partners in leading the future of this work. The opportunity for greater well-being for generations depends on it.
