
mT5 Translator

This project was completed in Week 12 of the Data Science bootcamp at SPICED Academy.

In this project I built a Deep Learning Translator using a Transfer Learning approach. I used the Hugging Face mT5 Base model (a multilingual pretrained text-to-text transformer) and fine-tuned it on 4 downstream tasks (English-German, German-English, Russian-English, English-Russian) using data from The Tatoeba Translation Challenge.
The model was trained in Google Colab using PyTorch; a fine-tuning sketch is shown below.
Using the fine-tuned model, I created a simple Flask webapp and hosted it on Google Colab to visualize the final results.
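
The snippet below is a minimal sketch of one fine-tuning step with the standard Hugging Face seq2seq API, not the project's exact training loop. The task prefix, learning rate, and sentence pair are illustrative; the real loop iterates over batches of the Tatoeba data for all four directions.

```python
import torch
from transformers import MT5ForConditionalGeneration, AutoTokenizer

model_name = "google/mt5-base"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = MT5ForConditionalGeneration.from_pretrained(model_name)

device = "cuda" if torch.cuda.is_available() else "cpu"
model.to(device)

optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)

# Illustrative sentence pair; the project draws its pairs from
# The Tatoeba Translation Challenge data.
source = "translate English to German: How are you?"
target = "Wie geht es dir?"

inputs = tokenizer(source, return_tensors="pt", truncation=True).to(device)
labels = tokenizer(target, return_tensors="pt", truncation=True).input_ids.to(device)

# One training step: the model returns the seq2seq cross-entropy
# loss when labels are supplied.
model.train()
loss = model(**inputs, labels=labels).loss
loss.backward()
optimizer.step()
optimizer.zero_grad()
```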

Tools used:

  • PyTorch
  • Hugging Face Transformers
  • Colab
  • Flask
  • Bootstrap
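
For serving, the fine-tuned checkpoint can be wrapped in a small Flask app along these lines. This is a sketch under assumptions, not the project's exact code: the checkpoint path ./mt5_translator and the page template are hypothetical, and hosting from Colab additionally requires a tunnel (e.g. ngrok), which is omitted here.

```python
from flask import Flask, request, render_template_string
from transformers import MT5ForConditionalGeneration, AutoTokenizer

# Hypothetical path to the checkpoint saved after fine-tuning.
MODEL_DIR = "./mt5_translator"
tokenizer = AutoTokenizer.from_pretrained(MODEL_DIR)
model = MT5ForConditionalGeneration.from_pretrained(MODEL_DIR)

app = Flask(__name__)

# Minimal Bootstrap-styled page; the real app's template differs.
PAGE = """
<link rel="stylesheet"
      href="https://cdn.jsdelivr.net/npm/bootstrap@5.3.3/dist/css/bootstrap.min.css">
<div class="container mt-4">
  <form method="post">
    <textarea class="form-control" name="text"></textarea>
    <button class="btn btn-primary mt-2" type="submit">Translate</button>
  </form>
  <p class="mt-3">{{ translation }}</p>
</div>
"""

@app.route("/", methods=["GET", "POST"])
def index():
    translation = ""
    if request.method == "POST":
        inputs = tokenizer(request.form["text"], return_tensors="pt",
                           truncation=True)
        ids = model.generate(**inputs, max_length=128, num_beams=4)
        translation = tokenizer.decode(ids[0], skip_special_tokens=True)
    return render_template_string(PAGE, translation=translation)

if __name__ == "__main__":
    app.run()
```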

BLEU score (compared to the Tatoeba score)
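
A sketch of how the BLEU score can be computed with sacreBLEU; the sentences here are placeholders, whereas the real evaluation runs over the model's translations of the Tatoeba test split.

```python
import sacrebleu

# Placeholder outputs and references; in practice these come from
# translating the Tatoeba test split with the fine-tuned model.
hypotheses = ["Wie geht es dir?"]
references = [["Wie geht es dir heute?"]]  # one reference stream

bleu = sacrebleu.corpus_bleu(hypotheses, references)
print(f"BLEU: {bleu.score:.2f}")
```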
