Filter (remove) NSFW (not-safe-for-work) images in a directory recursively, backing up every image beforehand.
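The recursive backup-then-filter workflow described above can be sketched in a few lines of Python. This is a minimal illustration, not the repository's actual implementation: `is_nsfw` here is a hypothetical placeholder predicate that you would replace with a real classifier (e.g. NudeNet or NSFW.js behind an API).

```python
import shutil
from pathlib import Path

IMAGE_EXTS = {".jpg", ".jpeg", ".png", ".gif", ".webp"}

def is_nsfw(path: Path) -> bool:
    """Hypothetical predicate: swap in a real NSFW classifier here.
    The filename heuristic below exists only so the sketch is runnable."""
    return "nsfw" in path.stem.lower()

def filter_nsfw(root: str, backup: str) -> list[str]:
    """Back up every image under `root`, then delete the ones flagged NSFW."""
    root_dir, backup_dir = Path(root), Path(backup)
    removed = []
    for img in sorted(root_dir.rglob("*")):
        if not img.is_file() or img.suffix.lower() not in IMAGE_EXTS:
            continue
        # Back up first, preserving the relative directory layout.
        dest = backup_dir / img.relative_to(root_dir)
        dest.parent.mkdir(parents=True, exist_ok=True)
        shutil.copy2(img, dest)
        # Only after the copy succeeds is the flagged original removed.
        if is_nsfw(img):
            img.unlink()
            removed.append(str(img))
    return removed
```

The key design point is ordering: every image is copied to the backup tree before the classifier verdict is acted on, so a false positive can always be restored.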
Remove adult content from Discord channels more effectively with artificial intelligence.
A comprehensive NSFW image detection npm package, equipped with advanced algorithms to protect the safety and integrity of online platforms by swiftly identifying and filtering explicit content.
NSFW.js implementation for images, GIFs, and video. NSFW detection on the client side via TensorFlow.js.
A machine learning-based NSFW filter specifically designed to detect underage content in Tavern V1/V2 cards.
An application for reading PornHub dump files and extracting files and tags for building AI models.
Containerized self-hosted REST API for vision classification, utilizing Hugging Face transformers.
Common vision tasks solved with on-device machine learning.
Express.js interface for infinitered/nsfwjs
An NSFW detection server that works over HTTP requests.
A browser interface for NudeNet classifier.
A tool for detecting viruses and NSFW material in WARC files
[READ-ONLY] CLI tool that uses machine learning to detect nudity in images.
This repo trains a neural net on NSFW and SFW images as a binary classifier, then explores whether the net classifies optical illusions as NSFW or SFW.
A keyword-based anti-NSFW classifier for Twitter, with a suspicious twist 🧐
A sample that runs the NSFW image detection model (open_nsfw_android) on Colaboratory.
A Neural Net for Nudity Detection. Classifier only.
Python package to apply the Safety Checker from Stable Diffusion.