Tech tools to reduce harm
New software could reduce the psychological harm investigators face when identifying child sexual abuse material.
Automated biometrics software could link child sexual abuse material (CSAM) across videos without requiring investigators to view the disturbing content themselves.
A paper published by the Australian Institute of Criminology (AIC), titled Advancing child sexual abuse investigations using biometrics and social network analysis, describes how seven experts developed a system called the Biometric Analyser and Network Extractor, which matches faces and voices in CSAM videos. In an early demonstration on a sample of 445 videos, the system identified 222 links between them.
“This modelling makes it possible to draw links between people (victims or offenders) across videos by constructing more complex networks,” the paper says.
“This network visualisation moves beyond simply identifying the same victim or the same offender appearing in multiple videos and can also establish co-offending and co-victimisation relationships across a large holding of videos.”
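The paper does not release the system's code, but the general approach it describes can be sketched: treat each video's detected faces and voices as embedding vectors, match vectors across videos by similarity, and build a network from the matches. In the minimal Python sketch below, the embeddings, the cosine-similarity threshold and the networkx graph are all illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch only: the embeddings, threshold and graph structure
# are assumptions for illustration, not the AIC paper's implementation.
import numpy as np
import networkx as nx

def cosine(a, b):
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def link_videos(video_embeddings, threshold=0.8):
    """Build a graph whose nodes are videos and whose edges mean a face or
    voice embedding in one video matched an embedding in another."""
    graph = nx.Graph()
    graph.add_nodes_from(video_embeddings)
    items = [(vid, emb) for vid, embs in video_embeddings.items() for emb in embs]
    for i, (vid_a, emb_a) in enumerate(items):
        for vid_b, emb_b in items[i + 1:]:
            if vid_a != vid_b and cosine(emb_a, emb_b) >= threshold:
                graph.add_edge(vid_a, vid_b)
    return graph

# Toy data: hand-crafted 4-dimensional "embeddings" standing in for the
# face/voice vectors a real biometric model would produce.
videos = {
    "A": [np.array([1.0, 0.0, 0.0, 0.0])],    # identity X appears here...
    "B": [np.array([0.99, 0.05, 0.0, 0.0])],  # ...and, near-identically, here
    "C": [np.array([0.0, 1.0, 0.0, 0.0])],    # an unrelated identity
}
graph = link_videos(videos)
print(sorted(graph.edges()))  # [('A', 'B')]
```

The same graph could also carry person-level nodes, which is how a network moves beyond video-to-video links toward the co-offending and co-victimisation relationships the paper describes.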
As the paper points out, manually processing these videos can lead to burnout, significant psychological harm and unmanageable workloads for investigators. Previous studies have shown that vicarious trauma can affect public service staff who must view disturbing images as part of their jobs.
To avoid these harms, researchers from institutions including the University of Adelaide, San Jose State University, Michigan State University, South Australia Police, Deakin University and the Defence Science and Technology Group established relationships between people in the CSAM without having to view the content themselves.
The software used biometric analysis and social network analysis to match faces and voices across the videos; at no point could the researchers access the CSAM itself.
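That separation can be made concrete. In the minimal sketch below, the record type and stub extraction step are assumptions for illustration: only derived embedding vectors, never frames or audio, would leave the secure law-enforcement environment.

```python
# Illustrative sketch: the record type and stub extractor are assumptions,
# not the authors' implementation. Raw media stays inside the secure
# environment; researchers receive only derived biometric vectors.
from dataclasses import dataclass
from typing import Tuple

import numpy as np

@dataclass(frozen=True)
class BiometricRecord:
    """The only data that crosses to researchers: no frames, no audio."""
    video_id: str
    face_embeddings: Tuple[np.ndarray, ...]
    voice_embeddings: Tuple[np.ndarray, ...]

def export_record(video_id: str) -> BiometricRecord:
    """Runs inside the secure environment. A real system would call face
    and speaker embedding models here; zero vectors stand in for their output."""
    return BiometricRecord(
        video_id=video_id,
        face_embeddings=(np.zeros(128),),
        voice_embeddings=(np.zeros(192),),
    )

record = export_record("video-001")
print(record.video_id, record.face_embeddings[0].shape)  # video-001 (128,)
```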
The software has yet to be evaluated for accuracy; the researchers suggest future studies will need to incorporate labelled data with an established ground truth to do so.
“Given the graphic nature of the content, and the legal implications of possessing CSAM, such activities will need to be completed in partnership with law enforcement,” the paper says.
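To make that suggested evaluation concrete: with labelled ground-truth links in hand, predicted links could be scored with standard retrieval metrics. The precision and recall choice below is an assumption for illustration; the paper does not specify metrics.

```python
# Illustrative sketch: the paper calls for evaluation against labelled ground
# truth but does not specify metrics; precision and recall over predicted
# video links are assumed here as one standard choice.

def link_metrics(predicted, ground_truth):
    """Precision and recall for sets of links, each link a frozenset pair."""
    true_positives = len(predicted & ground_truth)
    precision = true_positives / len(predicted) if predicted else 0.0
    recall = true_positives / len(ground_truth) if ground_truth else 0.0
    return precision, recall

predicted = {frozenset(p) for p in [("A", "B"), ("B", "C")]}
labelled = {frozenset(p) for p in [("A", "B"), ("A", "C")]}
print(link_metrics(predicted, labelled))  # (0.5, 0.5)
```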