NudeNet: An ensemble of Neural Nets for Nudity Detection and Censoring

Praneeth Bedapudi
Mar 30, 2019


Note: This post can also be read from here

Please note that NudeDetector has changed a lot since this post was written. The major changes are:

  1. 10x more data and detection of more parts (check out https://github.com/notAI-tech/NudeNet/)
  2. Automatic downloading of the checkpoint files, and Windows support.
  3. A 2x faster default model and a 6x faster “fast” detection mode.
  4. Support for video detection using smart frame selection.

Part 2: Exposed part detection and censoring.

The Why: A major drawback of image classification is the lack of fine-grained control. If we want to blur the exposed parts, or have fine-grained control over which types of images we want to allow, we need object detection. This is a first-of-its-kind open-source project, which I hope will be helpful to the community.

If someone wants to allow images with exposed chest or buttocks but not images with exposed genitalia, or some other combination, doing this solely with image classification is not possible. Since there are no creative ways of obtaining the dataset for this task, I make use of the data collected by Jae Jin and team. For more information, or to contribute to the development of the dataset, please contact Jae Jin or join their team’s Discord server (Discord tags: 0131#2628 or Jae Jin#0256).

With a total of 5789 images, the distribution of labels is as follows:

Total data available for training

With this data, I use the image augmentation library albumentations to add random blur, flips, etc. to the dataset.
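As an illustration, here is a minimal sketch of such an augmentation pipeline (the specific transforms and parameters are my assumptions, not the exact ones used for NudeNet); albumentations transforms the bounding boxes along with the image.

# a minimal augmentation sketch (assumed transforms/parameters, not NudeNet's exact pipeline)
import numpy as np
import albumentations as A

transform = A.Compose(
    [A.HorizontalFlip(p=0.5), A.Blur(blur_limit=3, p=0.3)],
    bbox_params=A.BboxParams(format='pascal_voc', label_fields=['labels']),
)

image = np.zeros((480, 640, 3), dtype=np.uint8)  # placeholder image
bboxes = [[100, 120, 300, 400]]                  # placeholder box: [x_min, y_min, x_max, y_max]
augmented = transform(image=image, bboxes=bboxes, labels=['BUTTOCKS'])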

Since there is significant class imbalance in the data, I chose to use RetinaNet by FAIR for object detection. RetinaNet uses a variation of cross-entropy loss called Focal Loss, which is designed to increase the performance of one-stage object detectors.
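To make the idea concrete, here is a small generic sketch of focal loss for binary classification (an illustration only, not this project's implementation): the (1 - p_t)^gamma factor down-weights easy, well-classified examples so that training focuses on the hard ones.

# generic focal loss sketch (illustration only, not this project's implementation)
import numpy as np

def focal_loss(p, y, gamma=2.0, alpha=0.25):
    # p: predicted probability of the positive class, y: 0/1 ground-truth label
    p_t = np.where(y == 1, p, 1 - p)              # probability assigned to the true class
    alpha_t = np.where(y == 1, alpha, 1 - alpha)  # class-balancing weight
    return -alpha_t * (1 - p_t) ** gamma * np.log(p_t)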

Using ResNet-101 as the backbone, the model achieves the following scores on the test data.

Evaluating the model:

Since this model can be used for both nudity detection and censoring, I test it with the same data I used to test the classifier. If any of the labels “BUTTOCKS_EXPOSED”, “*_GENITALIA_EXPOSED”, or “F_BREAST_EXPOSED” is found in an image, we label the image as “nude”; if none of these are found, we label it as “safe”. This label mapping is chosen based on the test data. For example, in the test dataset, images with an exposed male breast or an exposed belly are present in the “sfw” category.
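A minimal sketch of this mapping, assuming the detector returns a list of detections with a 'label' field (the exact output format, and the expanded genitalia label names, are assumptions):

# labels that make an image "nude"; anything else is treated as "safe"
NUDE_LABELS = {'BUTTOCKS_EXPOSED', 'F_GENITALIA_EXPOSED', 'M_GENITALIA_EXPOSED', 'F_BREAST_EXPOSED'}

def image_label(detections):
    # detections: e.g. [{'label': 'F_BREAST_EXPOSED', 'score': 0.9, 'box': [x1, y1, x2, y2]}, ...]
    found = {d['label'] for d in detections}
    return 'nude' if found & NUDE_LABELS else 'safe'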

Precision and Recall of NudeNet’s Detector

NudeNet’s Detector performs better than Yahoo’s Open NSFW, GantMan’s nsfw_model, and NudeNet’s own classifier at identifying porn.

I also include a censor function in this class, which censors the private parts in an NSFW image.

For example, for the image (NSFW) https://i.imgur.com/rga6845.jpg, the following is the output of NudeNet’s censor function.

Censored image, using NudeDetector.

The project can be found at https://github.com/bedapudi6788/NudeNet

The pre-trained models at https://github.com/bedapudi6788/NudeNet-models/

To install and use NudeNet, take a look at the following snippet.

# installing the project
pip install git+https://github.com/bedapudi6788/NudeNet
# Using the classifier
from NudeNet import NudeClassifier
classifier = NudeClassifier('classifier_checkpoint_path')
classifier.classify('path_to_nude_image')
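
The detector can be used in the same way. The snippet below is a sketch that mirrors the classifier's interface; the NudeDetector class name, the detect/censor method signatures, and the checkpoint path are assumptions based on the pattern above and may differ from the exact API.

# Using the detector and its censor function (interface assumed to mirror the classifier)
from NudeNet import NudeDetector
detector = NudeDetector('detector_checkpoint_path')
detector.detect('path_to_nude_image')    # list of detected exposed parts with boxes and scores
detector.censor('path_to_nude_image', out_path='censored_image.jpg')    # writes a censored copy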
