Content Moderation Pretrained Classifiers

With Nyckel's content moderation pretrained classifiers, you can moderate your text and images in just minutes.
Oscar Beijbom
Jan 2024

Last year, Nyckel announced the launch of our pretrained classifiers. These let anyone launch their own machine learning model in just minutes.

Today we’re proud to announce our suite of content moderation pretrained classifiers.

With these out-of-the-box tools, you can moderate your images and text in real time. The available classifiers include:

  1. NSFW (Image) - Identify if content is NSFW or not (across 5 categories).
  2. Pornography (Image) - Identify if an image is pornographic (yes/no).
  3. Weapons (Image) - Identify if the image contains a gun or knife.
  4. Offensive Language (Text) - Identify if a given text is profane, offensive, or hate speech.
  5. Smoking Use (Image) - Identify if someone is smoking in the image.
  6. Alcohol Use (Image) - Identify if someone is drinking in the image.
  7. Image Quality (Image) - Check image quality (blurry, normal, too dark, too bright).
  8. Blurry Images (Image) - Check whether image is blurry or not.

These classifiers exist as public-facing pages that anyone can interact with. We also offer API access to the models for free.
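To show the shape of an API call, here is a minimal sketch in Python. The invoke URL, the function ID, and the bearer-token auth are all illustrative assumptions, not Nyckel's documented endpoint — check the API docs for the real values. The sketch only builds the HTTP request (no network call is made):

```python
import json
from urllib import request

# Hypothetical invoke-style URL for a pretrained function. Replace with
# the actual endpoint and function ID from Nyckel's API documentation.
INVOKE_URL = "https://www.nyckel.com/v1/functions/{function_id}/invoke"

def build_invoke_request(function_id: str, image_url: str, token: str) -> request.Request:
    """Construct a POST request asking a pretrained classifier to label an image."""
    url = INVOKE_URL.format(function_id=function_id)
    body = json.dumps({"data": image_url}).encode("utf-8")
    return request.Request(
        url,
        data=body,
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

# Build (but don't send) a request against a placeholder function ID:
req = build_invoke_request("nsfw-identifier", "https://example.com/photo.jpg", "YOUR_TOKEN")
```

Sending the request with `urllib.request.urlopen(req)` (or any HTTP client) would return the classifier's predicted label for the image.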

Not-Safe-For-Work (NSFW) Images

Our NSFW identifier classifies whether an image is NSFW or not, breaking results into five categories:

  • NSFW (Porn) - Images containing porn or partial nudity.
  • NSFW (Pornographic Drawings) - Illustrations / drawings of pornographic nature.
  • SFW (Normal Image) - Images with nothing to worry about.
  • SFW (Drawings) - Drawings or illustrations that are SFW.
  • SFW (Mildly Suggestive) - Images that may be considered risqué on some sites (such as kid-friendly ones) but don't qualify as nudity or pornography.
[Image: NSFW classifier]

This classifier is a good fit if you want a nuanced view of the NSFW/SFW spectrum. For example, kid-friendly sites may want to block images that are otherwise SFW but not suitable for children.
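Once the classifier returns one of the five labels above, the moderation decision itself is just a policy lookup. A small sketch of a strict, kid-friendly policy (the label strings match the list above; which labels you block is entirely your choice):

```python
# A strict policy for a kid-friendly site: block both NSFW categories
# and also block "mildly suggestive" images that other sites might allow.
BLOCKED_LABELS = {
    "NSFW (Porn)",
    "NSFW (Pornographic Drawings)",
    "SFW (Mildly Suggestive)",  # safe elsewhere, blocked here by policy
}

def should_block(label: str) -> bool:
    """Return True if an image with this predicted label should be blocked."""
    return label in BLOCKED_LABELS

print(should_block("NSFW (Porn)"))         # True
print(should_block("SFW (Normal Image)"))  # False
```

A more permissive site would simply drop "SFW (Mildly Suggestive)" from the blocked set — the classifier's extra granularity is what makes that per-site tuning possible.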

Pornography (Images)

Our porn identifier tells you if an image is pornographic or not. It’s a binary yes/no.

[Image: porn classifier]

Choose this classifier when you only need a simple binary decision rather than a breakdown into categories.

Weapons Identifier (Images)

Our weapons identifier detects whether an image contains a gun or knife.

[Image: weapon identifier]

This is a useful tool for auto-blocking images that contain weapons (specifically, guns or knives).

Smoking Use Identifier (Image)

Our smoking use identifier categorizes whether an image contains someone smoking.

[Image: smoking identifier]

If images containing cigarettes or smoking use are not allowed on your platform, this tool will make it easy to flag such photos.

Alcohol Use Identifier (Image)

Our alcohol use classifier categorizes whether an image contains someone drinking.

[Image: alcohol identifier]

If images of alcohol or alcohol use are not allowed on your platform, this tool will flag offending photos.

Offensive Language (Text)

Our offensive language classifier will ingest text and tell you if it’s offensive or not. It specifically looks for text that is profane, offensive, or hate speech.

[Image: offensive language identifier]

This is a good text classifier for easily flagging any content that could be considered offensive. It flags not just curse words or erotica, but hate speech as well.
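In practice, classifier predictions usually come back with a confidence score, and you decide a threshold above which content is auto-flagged. The response keys (`labelName`, `confidence`) and the label string (`"Offensive"`) in this sketch are assumptions for illustration — consult the API docs for the actual response shape:

```python
# Sketch: turn an assumed classifier response into a moderation decision.
# The "labelName"/"confidence" keys and the "Offensive" label are
# hypothetical -- verify the real response format in the API docs.
def flag_text(prediction: dict, min_confidence: float = 0.8) -> bool:
    """Return True when the text should be held for review."""
    return (
        prediction.get("labelName") == "Offensive"
        and prediction.get("confidence", 0.0) >= min_confidence
    )

# High-confidence offensive predictions are flagged automatically;
# low-confidence ones can be routed to a human review queue instead.
print(flag_text({"labelName": "Offensive", "confidence": 0.95}))  # True
```

Tuning `min_confidence` trades false positives against false negatives: a lower threshold flags more borderline text, a higher one flags only the clearest cases.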

Image Quality

Our image quality classifier tells you whether an image is blurry, normal, too dark, or too bright.

[Image: image quality identifier]

Maintaining high-quality user-generated images on your site can be tough. This tool tells you if a given image is blurry or abnormally dark or bright, so you can auto-flag such images as needed.

Blurry Images

Our blurry images classifier tells you whether an image is blurry or not (yes/no).

[Image: blurry image identifier]

If you want an easy way to identify whether an image is blurry or not, this tool is for you.

Wrap Up

If you’re interested in API access to these pretrained classifiers, please sign up for a free account and reach out to us at feedback@nyckel.com.

Want to build your own classifier in just minutes?

You get 1,000 classifications a month for free. Pay only when you scale. No credit card required.
