
Toxic-Comment-Classification

The Toxic Comment Classification Challenge is a Kaggle challenge to build a model capable of detecting different types of toxicity, such as threats, obscenity, insults, and identity-based hate, using a dataset of comments from Wikipedia's talk page edits.

Link to the challenge: https://www.kaggle.com/c/jigsaw-toxic-comment-classification-challenge

The code includes all the tested approaches.

Link to the datasets used for train and test: https://www.kaggle.com/c/jigsaw-toxic-comment-classification-challenge/data
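As a rough illustration of a common baseline for this multi-label task (not necessarily one of the approaches tested in this repository), the sketch below uses TF-IDF features with one logistic regression classifier per label. It assumes the Kaggle train.csv/test.csv layout, i.e. a comment_text column plus the six label columns, and that the files sit in the working directory.

```python
# Hypothetical baseline sketch: TF-IDF features + one-vs-rest logistic regression.
import pandas as pd
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

LABELS = ["toxic", "severe_toxic", "obscene", "threat", "insult", "identity_hate"]

# Assumed file locations: the Kaggle train/test CSVs in the working directory.
train = pd.read_csv("train.csv")
test = pd.read_csv("test.csv")

# Word-level TF-IDF features, fit on the combined comment text.
vectorizer = TfidfVectorizer(sublinear_tf=True, strip_accents="unicode",
                             stop_words="english", max_features=50000)
vectorizer.fit(pd.concat([train["comment_text"], test["comment_text"]]))
X_train = vectorizer.transform(train["comment_text"])
X_test = vectorizer.transform(test["comment_text"])

# One independent binary classifier per toxicity label (multi-label setup).
submission = pd.DataFrame({"id": test["id"]})
for label in LABELS:
    clf = LogisticRegression(C=1.0, solver="liblinear")
    y = train[label]
    auc = cross_val_score(clf, X_train, y, cv=3, scoring="roc_auc").mean()
    print(f"{label}: CV ROC-AUC = {auc:.4f}")
    clf.fit(X_train, y)
    submission[label] = clf.predict_proba(X_test)[:, 1]

submission.to_csv("submission.csv", index=False)
```

The competition is scored by the mean column-wise ROC-AUC, which is why the sketch reports a per-label cross-validated AUC and submits per-label probabilities rather than hard predictions.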

Our team was called "Lets have Fun" and our final score was 0.9807.

Link to the challenge leaderboard: https://www.kaggle.com/c/jigsaw-toxic-comment-classification-challenge/leaderboard
