
Why nudity & age classifiers are not a substitute for a CSAM classifier

Maria Allgaier

Online platforms use a variety of AI classifiers to assist with moderation; for example, classifiers that help detect violence and weapons. How a classifier is trained varies depending on the company as well as the classifier in question, and some classifiers are 'easier' to train than others. A CSAM classifier is among the most challenging to train: because it is illegal for a company or individual to possess or distribute CSAM, most companies do not have the data needed for training, and only a select few are able to build these classifiers through partnerships with law enforcement. Because this classifier is so challenging to build, many moderation companies offer nudity or age classifiers instead. These are often marketed, or assumed, to help with CSAM prevention and takedown. However, this is not the case, and there are stark differences between a CSAM classifier, a nudity classifier, and an age classifier. This article explains the difference between the three:


Age classifier:


  • This is a type of AI classifier trained to estimate the age of individuals depicted in images or videos. It typically analyses facial features, body proportions, and other visual cues to predict the age range of a person, such as child, adolescent, adult, or elderly.

  • Age estimates can be accurate for certain age brackets but unreliable for others, particularly around the boundary between adolescence and adulthood, which is exactly the range that matters most for CSAM detection.

  • This leads to both false negatives (minors classified as adults) and false positives (adults classified as minors).

  • Age classifiers are designed to estimate age; they are not designed to identify sexually explicit content.

  • It should also be noted that combining a nudity classifier with an age classifier does not meaningfully improve your ability to find CSAM: the errors of the two classifiers compound, and at realistic base rates most of what the combined system flags will not be CSAM (see the sketch after this list).

  • Nor does this approach reduce moderators' risk of exposure to CSAM, as they will still have to sift through the flagged content to verify it.
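
To make the point about compounded error concrete, here is a minimal sketch in Python. Every rate below (the classifier error rates and the base rate of CSAM) is an illustrative assumption, not a measurement from any real system; the structure of the calculation is the point.

    # Illustrative only: every rate below is an assumption, not a measurement.
    # The sketch shows why chaining an age classifier and a nudity classifier
    # gives poor precision for CSAM at realistic base rates.

    base_rate = 0.0001        # assumed fraction of uploads that are CSAM
    age_recall = 0.80         # assumed P(flags minor | minor depicted)
    age_fp_rate = 0.05        # assumed P(flags minor | adult depicted)
    nudity_recall = 0.95      # assumed P(flags nudity | nudity present)
    nudity_fp_rate = 0.10     # assumed P(flags nudity | no nudity)

    # Require both classifiers to fire before content is routed to the
    # CSAM queue, treating their errors as independent.
    p_flag_given_csam = age_recall * nudity_recall
    p_flag_given_benign = age_fp_rate * nudity_fp_rate

    p_true_positive = base_rate * p_flag_given_csam
    p_false_positive = (1 - base_rate) * p_flag_given_benign

    precision = p_true_positive / (p_true_positive + p_false_positive)
    print(f"Precision of the combined flag: {precision:.2%}")  # ~1.5% here

Even with generous assumptions for both classifiers, the vast majority of flagged items are benign, so moderators still face a queue dominated by content that is not CSAM.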


Nudity classifier:


  • This is a classifier that is trained to detect nudity. It works by identifying specific patterns, shapes, and skin tones associated with nude or partially nude bodies (see the inference sketch after this list).

  • This is quite a standard classifier that most moderation companies will have.

  • This classifier cannot specifically detect CSAM, only nudity in general. It therefore produces large moderation queues and does not help you identify CSAM quickly.

  • This still puts your moderators at risk of being exposed to CSAM, as they will have to filter through nude content (much of which may not even violate your terms of service).
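
For readers who want a picture of what such a classifier looks like in practice, below is a minimal inference sketch. It assumes a binary image classifier (here a ResNet-18 with a two-class head) has already been fine-tuned on nudity data; the checkpoint name and image path are hypothetical placeholders, and the architecture choice is purely illustrative.

    # A minimal sketch of nudity-classifier inference, assuming a binary
    # image classifier has already been fine-tuned for the task.
    # "nudity_classifier.pt" and "upload.jpg" are hypothetical placeholders.
    import torch
    from torchvision import models, transforms
    from PIL import Image

    preprocess = transforms.Compose([
        transforms.Resize(256),
        transforms.CenterCrop(224),
        transforms.ToTensor(),
        transforms.Normalize(mean=[0.485, 0.456, 0.406],
                             std=[0.229, 0.224, 0.225]),
    ])

    model = models.resnet18()
    model.fc = torch.nn.Linear(model.fc.in_features, 2)  # [safe, nudity]
    model.load_state_dict(torch.load("nudity_classifier.pt"))  # hypothetical weights
    model.eval()

    image = preprocess(Image.open("upload.jpg").convert("RGB")).unsqueeze(0)
    with torch.no_grad():
        probs = torch.softmax(model(image), dim=1)[0]

    # Note: a high nudity score says nothing about the subject's age,
    # which is why this output alone cannot identify CSAM.
    print(f"P(nudity) = {probs[1].item():.2f}")
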

CSAM classifier:


  • CSAM classifiers are specifically designed to identify and categorise images or videos that depict child sexual abuse or exploitation.

  • These classifiers use advanced algorithms to detect visual indicators of abuse, such as explicit sexual acts involving children.

  • A CSAM classifier may only be developed by companies that have a strict agreement with law enforcement.

Overall, there are significant differences between age, nudity, and CSAM classifiers. They serve distinct purposes and are trained on different datasets tailored to their respective tasks. Moreover, a CSAM classifier requires more specialised training and stricter ethical safeguards due to the sensitive nature of the content being analysed. In sum, nudity and age classifiers are not a substitute for a CSAM classifier. To learn more about how CSAM classifiers can help protect your platforms and moderators, reach out to Orthus Ai today.


