In this project I built a deep learning translator using a transfer learning approach. I took the Hugging Face mT5-Base model (a multilingual pretrained text-to-text Transformer) and fine-tuned it on four downstream tasks (English-German, German-English, Russian-English, English-Russian) using data from
The Tatoeba Translation Challenge.
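
Below is a minimal data-preparation sketch, not the exact project code. It assumes the Tatoeba Challenge pairs are available as parallel lists of source and target sentences, and shows one common way to handle several translation directions with a single mT5 checkpoint: prefixing each source sentence with its direction before tokenization. The prefix wording and sequence lengths are illustrative assumptions.

```python
# Sketch: tokenizing parallel sentences for mT5 fine-tuning (assumed setup).
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("google/mt5-base")

def build_examples(src_sentences, tgt_sentences,
                   direction="translate English to German"):
    """Turn parallel sentences into model inputs and labels.

    `direction` is an assumed task prefix so one model can serve all
    four language pairs; the project's actual prefix format may differ.
    """
    inputs = [f"{direction}: {s}" for s in src_sentences]
    model_inputs = tokenizer(inputs, max_length=128,
                             truncation=True, padding="max_length")
    labels = tokenizer(text_target=tgt_sentences, max_length=128,
                       truncation=True, padding="max_length")
    model_inputs["labels"] = labels["input_ids"]
    return model_inputs
```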
The model was trained in Google Colab using PyTorch.
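The sketch below shows what such a fine-tuning loop can look like in plain PyTorch with the Transformers library. The README does not state the actual hyperparameters (epochs, batch size, learning rate), so the values here are placeholders, and the dataloader is assumed to yield the tokenized batches produced above.

```python
# Sketch: fine-tuning mT5-Base with a plain PyTorch loop (illustrative values).
import torch
from torch.utils.data import DataLoader
from transformers import MT5ForConditionalGeneration

device = "cuda" if torch.cuda.is_available() else "cpu"
model = MT5ForConditionalGeneration.from_pretrained("google/mt5-base").to(device)
optimizer = torch.optim.AdamW(model.parameters(), lr=3e-4)  # assumed learning rate

def train_epoch(dataloader: DataLoader):
    model.train()
    for batch in dataloader:  # batches contain input_ids, attention_mask, labels
        batch = {k: v.to(device) for k, v in batch.items()}
        loss = model(**batch).loss  # seq2seq cross-entropy computed by the model
        loss.backward()
        optimizer.step()
        optimizer.zero_grad()
```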
Using the fine-tuned model, I created a simple Flask web app and hosted it from Google Colab to demo the final translations.
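
A minimal sketch of such a Flask endpoint is shown below, assuming the fine-tuned checkpoint was saved locally under a hypothetical "mt5-translator" directory. The route name and JSON request format are illustrative, not the project's exact interface.

```python
# Sketch: serving the fine-tuned translator behind a Flask endpoint (assumed layout).
from flask import Flask, request, jsonify
from transformers import AutoTokenizer, MT5ForConditionalGeneration

app = Flask(__name__)
tokenizer = AutoTokenizer.from_pretrained("mt5-translator")          # assumed local path
model = MT5ForConditionalGeneration.from_pretrained("mt5-translator")

@app.route("/translate", methods=["POST"])
def translate():
    data = request.get_json()
    # e.g. {"direction": "translate English to German", "text": "Hello"}
    prompt = f"{data['direction']}: {data['text']}"
    input_ids = tokenizer(prompt, return_tensors="pt").input_ids
    output_ids = model.generate(input_ids, max_length=128, num_beams=4)
    translation = tokenizer.decode(output_ids[0], skip_special_tokens=True)
    return jsonify({"translation": translation})

if __name__ == "__main__":
    app.run()
```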
Tools used:
- PyTorch
- Hugging Face Transformers
- Colab
- Flask
- Bootstrap