New Features

Thorn’s CSAM Image Classifier Now Integrated With Magnet AXIOM

With the release of Magnet AXIOM 7.0, we’re excited to introduce the integration of Thorn’s advanced AI model: the CSAM Image Classifier.

This new feature will help investigators spend less time reviewing and categorizing files, quickly arming them with relevant information so they can focus their investigations and identify victims of child sexual abuse faster.

Addressing the CSAM Data Volume Problem

One of the most significant challenges in child sexual abuse material (CSAM) investigations is the sheer volume of data that investigators must analyze.

Terabytes of data need to be examined thoroughly to identify any potential evidence, and the time required to review, categorize, and report on CSAM can be lengthy—contributing to backlogs in the lab.

Beyond the backlog, there is the mental and physical toll this review work can take on officers.

Since Magnet Forensics’ inception, we’ve been committed to developing and evolving solutions to help investigators bring justice to those who victimize children. For example, with Magnet AXIOM 2.0, we introduced Magnet.AI—a feature that uses machine learning to comb through evidence and automatically categorize chat and pictures for things like drugs, guns, nudity, abuse, luring, or sexual conversations.

And now we’ve added another tool to help in that mission: Thorn’s CSAM Image Classifier.

How Thorn Fights Against CSAM

Thorn was founded in 2012 to revolutionize the fight against child sexual abuse through technology.

Thorn has spent the last decade building technology to defend children from sexual abuse. This work has led to innovations in platform safety through scalable detection and reporting of CSAM; in parent and youth education to prevent online sexual abuse; and, perhaps most profoundly, in victim identification and recovery.

Thorn’s data science team developed the CSAM Image Classifier—a machine learning classification model that predicts the likelihood that a file contains imagery of child sexual abuse.

The CSAM Image Classifier can identify CSAM at scale. This allows trusted partners across law enforcement, NGOs, and the tech industry to identify victims of child sexual abuse faster, stop the revictimization caused by the further spread of abuse content, and protect the mental health of the people responsible for reviewing it.

As a result, investigators spend less time reviewing and categorizing files and instead can focus on identifying victims and advancing their investigations.

The CSAM Image Classifier has not only been instrumental in helping tech platforms identify unknown CSAM; it is also changing how law enforcement agents investigate and intervene in child sexual abuse cases around the globe.

If you haven’t already, update AXIOM to the latest version, either in-product or within the Customer Portal, to leverage Thorn’s technology to help find illicit content faster.

