# mini seq2seq

Minimal Seq2Seq model with attention for neural machine translation in PyTorch.

This implementation focuses on the following features:

  • Modular structure that can be reused in other projects
  • Minimal code for readability
  • Full utilization of batches and the GPU

It relies on torchtext to keep dataset loading and preprocessing code to a minimum.
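The core of any attention-based seq2seq model is an encoder that produces one hidden state per source token and a decoder that, at each step, attends over those states. The sketch below is a minimal illustration of that structure, not the repository's actual code; the module names, dimensions, and the dot-product (Luong-style) attention variant are assumptions for the example.

```python
import torch
import torch.nn as nn

class Encoder(nn.Module):
    def __init__(self, vocab_size, embed_dim, hidden_dim):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.gru = nn.GRU(embed_dim, hidden_dim, batch_first=True)

    def forward(self, src):
        # src: (batch, src_len) -> outputs: (batch, src_len, hidden), hidden: (1, batch, hidden)
        return self.gru(self.embed(src))

class AttnDecoder(nn.Module):
    def __init__(self, vocab_size, embed_dim, hidden_dim):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.gru = nn.GRU(embed_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim * 2, vocab_size)

    def forward(self, tgt_step, hidden, enc_outputs):
        # tgt_step: (batch, 1), one target token per batch element
        out, hidden = self.gru(self.embed(tgt_step), hidden)
        # dot-product attention over all encoder outputs (whole batch at once)
        scores = torch.bmm(out, enc_outputs.transpose(1, 2))   # (batch, 1, src_len)
        weights = torch.softmax(scores, dim=-1)
        context = torch.bmm(weights, enc_outputs)               # (batch, 1, hidden)
        logits = self.out(torch.cat([out, context], dim=-1))    # (batch, 1, vocab)
        return logits, hidden

# toy batch to check shapes
enc = Encoder(vocab_size=100, embed_dim=32, hidden_dim=64)
dec = AttnDecoder(vocab_size=100, embed_dim=32, hidden_dim=64)
src = torch.randint(0, 100, (4, 7))     # batch of 4 source sentences, length 7
enc_out, hidden = enc(src)
step = torch.randint(0, 100, (4, 1))    # first decoder input token
logits, hidden = dec(step, hidden, enc_out)
print(logits.shape)  # torch.Size([4, 1, 100])
```

Because every operation is batched (`bmm`, batch-first GRUs), the same code runs unchanged on GPU after a `.to(device)` on the modules and tensors.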

## Model description
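The repository's model diagram did not survive extraction here. As a hedged summary of the standard attention mechanism such a model uses (the exact score function in this repo is not specified; a dot-product variant is assumed), the decoder state $s_t$ is scored against each encoder state $h_i$ and a context vector is formed from the normalized weights:

```latex
e_{t,i} = s_t^\top h_i, \qquad
\alpha_{t,i} = \frac{\exp(e_{t,i})}{\sum_j \exp(e_{t,j})}, \qquad
c_t = \sum_i \alpha_{t,i}\, h_i
```

The context $c_t$ is then combined with $s_t$ to predict the next target token.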

## Requirements

  • GPU & CUDA
  • Python 3
  • PyTorch
  • torchtext
  • spaCy
  • NumPy
  • Visdom (optional)

Download the spaCy tokenizer models:

```shell
sudo python3 -m spacy download de
sudo python3 -m spacy download en
```
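After downloading, the models above would be loaded with `spacy.load('de')` / `spacy.load('en')` (note that newer spaCy releases replace these shortcuts with full names such as `de_core_news_sm`). As a dependency-free illustration of the tokenizer interface, the sketch below uses `spacy.blank`, which ships a language-specific tokenizer without any download; the `tokenize` helper is hypothetical, not from the repo:

```python
import spacy

# blank pipelines include a rule-based tokenizer and need no downloaded model;
# the repo itself would use spacy.load('de') / spacy.load('en') instead
nlp_de = spacy.blank('de')
nlp_en = spacy.blank('en')

def tokenize(nlp, text):
    # split raw text into a list of token strings
    return [tok.text for tok in nlp.tokenizer(text)]

print(tokenize(nlp_en, "Hello world."))  # ['Hello', 'world', '.']
```

torchtext's field/dataset machinery accepts such a callable as its tokenizer, which is how the preprocessing stays minimal.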

## References

Based on the following implementations: