ankit1281998/Sentiment_Analysis

Implemented pre-trained, Transformer-based DistilBERT and multilingual BERT models to classify sentiments as positive or negative and to rate them on a scale of 1 to 5. The Hugging Face pipeline API is used to load and call the BERT and DistilBERT models.
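Since the models are invoked through the Hugging Face pipeline API, a minimal sketch of that setup is shown below. The specific checkpoints (`distilbert-base-uncased-finetuned-sst-2-english` for the binary labels and `nlptown/bert-base-multilingual-uncased-sentiment` for the 1-to-5 rating) and the sample reviews are assumptions for illustration, not necessarily the exact ones used in this repository.

```python
# Sketch: sentiment classification with two Hugging Face pipelines.
# Checkpoints and inputs are assumed examples, not the repository's exact configuration.
from transformers import pipeline

# Binary (positive/negative) sentiment with a DistilBERT checkpoint.
binary_classifier = pipeline(
    "sentiment-analysis",
    model="distilbert-base-uncased-finetuned-sst-2-english",
)

# 1-to-5 star rating with a multilingual BERT checkpoint.
star_rater = pipeline(
    "sentiment-analysis",
    model="nlptown/bert-base-multilingual-uncased-sentiment",
)

reviews = [
    "The product arrived on time and works perfectly.",
    "Terrible experience, I want my money back.",
]

print(binary_classifier(reviews))  # e.g. labels POSITIVE / NEGATIVE with scores
print(star_rater(reviews))         # e.g. labels "5 stars" / "1 star" with scores
```

Each pipeline call returns a list of dictionaries with a `label` and a confidence `score`, so the binary classifier and the 1-to-5 rater can be run over the same inputs and compared directly.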