Trained models & code to predict toxic comments on all 3 Jigsaw Toxic Comment Challenges. Built using ⚡ Pytorch Lightning and 🤗 Transformers. For access to our API, please email us at contact@unitary.ai.
The world's largest social media toxicity dataset.
SFU Opinion and Comments Corpus
Detect text toxicity in a simple way, using React. Based on a Keras model, loaded with TensorFlow.js.
A static code analysis tool for JavaScript and TypeScript.
A Python package to compute HONEST, a score to measure hurtful sentence completions in language models. Published at NAACL 2021.
The toxEval R package includes a set of functions to analyze, visualize, and organize measured concentration data as it relates to chosen biological effects benchmarks. See https://doi-usgs.github.io/toxEval/ for more details.
Highlight refactors and efforts to keep code base lean
Smash molecules and obtain significant fragments
Data from "Crowdsourcing of Parallel Corpora: the Case of Style Transfer for Detoxification" paper
The official code to reproduce results from the NAACL 2019 paper: White-to-Black: Efficient Distillation of Black-Box Adversarial Attacks
Datasets used in the tox21 challenge
A collection of transformer-based models and developmental scripts presented in the publication "Transformers enable accurate prediction of acute and chronic chemical toxicity in aquatic organisms".
In silico platform to analyze MD trajectories using metrics, clustering, and machine and deep learning techniques
Module for predicting toxicity messages in Russian and English
Prediction and mechanistic analysis of Drug-Induced Liver Injury (DILI) based on chemical structure
Telegram bot that detects toxic comments based on Perspective API
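A bot like this one works by forwarding each incoming message to Perspective API's `comments:analyze` endpoint and acting on the returned toxicity score. As a minimal sketch, the snippet below only builds the request body; the payload shape (`comment.text` plus `requestedAttributes`) follows Perspective's public documentation, while the function name is illustrative and actually sending the request would need an API key and an HTTP POST, which are omitted here.

```python
import json

def build_analyze_request(text: str, attributes=("TOXICITY",)) -> str:
    """Build the JSON body for a Perspective API comments:analyze call.

    Hypothetical helper: constructs the documented payload shape only;
    authentication and the HTTP POST to the commentanalyzer endpoint
    are intentionally left out of this sketch.
    """
    payload = {
        "comment": {"text": text},
        # One empty object per requested attribute, e.g. TOXICITY.
        "requestedAttributes": {attr: {} for attr in attributes},
    }
    return json.dumps(payload)

print(build_analyze_request("example comment"))
```

A bot would then compare the `TOXICITY` score in the response against a chosen threshold before flagging or deleting the message.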