Toxic-Comment-Classifier

This is a Toxic Comment Classifier model that classifies text according to whether it exhibits offensive attributes (e.g. Insult, Obscene, Severe Toxicity).

Classifying all comments:

(screenshot)

Result after classifying:

(screenshot)
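The repository does not state how the model is implemented, so the following is only a minimal sketch of one common approach to multi-label toxicity classification: TF-IDF features with one logistic-regression classifier per attribute. The label names follow the attributes listed above; the training comments are tiny, made-up examples for illustration.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.multiclass import OneVsRestClassifier
from sklearn.pipeline import make_pipeline
import numpy as np

# Attributes mentioned in this README.
LABELS = ["Insult", "Obscene", "Severe Toxicity"]

# Hypothetical toy training comments (illustrative only, not from the repo).
comments = [
    "you are a complete idiot",
    "what a lovely day",
    "this filth is obscene, you fool",
    "thanks for the helpful answer",
    "i will hurt you, you worthless moron",
    "great work on the project",
]
# One binary column per attribute (rows align with the comments above).
y = np.array([
    [1, 0, 0],
    [0, 0, 0],
    [1, 1, 0],
    [0, 0, 0],
    [1, 0, 1],
    [0, 0, 0],
])

# Shared TF-IDF features feeding one binary classifier per attribute.
model = make_pipeline(
    TfidfVectorizer(),
    OneVsRestClassifier(LogisticRegression(max_iter=1000)),
)
model.fit(comments, y)

# Classify a new comment: a 0/1 flag per attribute.
preds = model.predict(["you are a useless moron"])
print(dict(zip(LABELS, preds[0])))
```

With more realistic training data (e.g. a labeled toxic-comment corpus), the same pipeline shape works unchanged; `OneVsRestClassifier` accepts a binary indicator matrix directly, so adding attributes only means adding columns to `y`.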
