Minimalist NMT for educational purposes
Updated Jan 29, 2024 - Python
Keras, PyTorch, and NumPy Implementations of Deep Learning Architectures for NLP
A PyTorch implementation of the hierarchical recurrent encoder-decoder (HRED) architecture introduced in Sordoni et al. (2015), built for modeling conversation triples from the MovieTriples dataset.
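The HRED idea can be sketched in a few lines: an utterance-level RNN encodes each turn into a vector, a context-level RNN summarizes the sequence of turn vectors, and a decoder generates the next turn from the context state. This is a minimal illustrative sketch, not code from the repository; all layer sizes and names are assumptions.

```python
import torch
import torch.nn as nn

class HRED(nn.Module):
    """Minimal hierarchical encoder-decoder sketch (illustrative sizes)."""

    def __init__(self, vocab_size, emb_dim=64, hid_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.utt_enc = nn.GRU(emb_dim, hid_dim, batch_first=True)   # encodes one turn
        self.ctx_enc = nn.GRU(hid_dim, hid_dim, batch_first=True)   # encodes turn sequence
        self.decoder = nn.GRU(emb_dim, hid_dim, batch_first=True)   # generates response
        self.out = nn.Linear(hid_dim, vocab_size)

    def forward(self, turns, target):
        # turns:  (batch, n_turns, turn_len) token ids for the dialogue so far
        # target: (batch, tgt_len) token ids of the response (teacher forcing)
        b, n, t = turns.shape
        emb = self.embed(turns.view(b * n, t))        # (b*n, t, emb)
        _, h = self.utt_enc(emb)                      # final state per turn: (1, b*n, hid)
        utt_states = h.view(b, n, -1)                 # one vector per turn
        _, ctx = self.ctx_enc(utt_states)             # dialogue summary: (1, b, hid)
        dec_out, _ = self.decoder(self.embed(target), ctx)
        return self.out(dec_out)                      # (b, tgt_len, vocab)
```

The key design point is the two-level recurrence: the context RNN sees one vector per utterance, so it models turn-to-turn structure rather than a flat token stream.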
Neural Machine Translation using LSTMs and an attention mechanism. Two models were implemented: one without attention, using a repeat vector, and one using an encoder-decoder architecture with attention.
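The attention side of such a model usually scores every encoder state against the current decoder state and blends them into a context vector. Below is a sketch of one decoder step with Luong-style dot-product attention, written in PyTorch; the class name and layer sizes are assumptions, not from the repository.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class AttentionDecoderStep(nn.Module):
    """One decoder step with dot-product (Luong-style) attention, illustrative only."""

    def __init__(self, emb_dim, hid_dim, vocab_size):
        super().__init__()
        self.gru = nn.GRU(emb_dim, hid_dim, batch_first=True)
        self.combine = nn.Linear(2 * hid_dim, hid_dim)
        self.out = nn.Linear(hid_dim, vocab_size)

    def forward(self, tok_emb, hidden, enc_outputs):
        # tok_emb: (b, 1, emb)  hidden: (1, b, hid)  enc_outputs: (b, src_len, hid)
        dec_out, hidden = self.gru(tok_emb, hidden)                # (b, 1, hid)
        scores = torch.bmm(dec_out, enc_outputs.transpose(1, 2))   # (b, 1, src_len)
        weights = F.softmax(scores, dim=-1)                        # attention distribution
        context = torch.bmm(weights, enc_outputs)                  # weighted sum: (b, 1, hid)
        fused = torch.tanh(self.combine(torch.cat([dec_out, context], dim=-1)))
        return self.out(fused), hidden, weights
```

By contrast, the repeat-vector approach simply copies the encoder's final state once per output timestep, so the decoder sees the same fixed summary at every step; attention lets each step look back at different source positions.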
Paper implementation of attention mechanisms in neural networks
Repository containing the code to my bachelor thesis about Neural Machine Translation
ICLR_2018_Reproducibility_Challenge : Sketch-RNN
A sequence-to-sequence model implemented in PyTorch
REST API for training and prediction with a seq2seq model
French to English neural machine translation trained on multi30k dataset.
Interpretation for an English autoencoder (seq2seq model).
🔬 Sequence to Sequence from Scratch Using PyTorch
Implementation of Selected Published Papers from AI, RL, NLP Conferences and reputed Journals
A study of natural language processing models that translate Korean into English.