
Implementations of LSTM and GRU in PyTorch

Implementation of recurrent neural networks (RNNs) from "scratch" in PyTorch. The only PyTorch module used is nn.Linear.

I wrote this for a research project: I needed to make internal changes to RNNs for my experiments, but found that PyTorch's built-in RNNs are imported as compiled C libraries, which makes such changes impractical. Hopefully this saves you a few hours or days in your own work :-)

The RNNs implemented are (a minimal cell sketch follows the list):

  • Long short-term memory (LSTM)
  • Gated recurrent unit (GRU)
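
The following is a minimal sketch of the general approach: a single LSTM cell built only from nn.Linear. The class name, constructor arguments, and tensor shapes are illustrative assumptions and need not match the exact code in this repository.

```python
import torch
import torch.nn as nn


class LSTMCell(nn.Module):
    """A single LSTM step built from one nn.Linear layer (illustrative sketch)."""

    def __init__(self, input_size, hidden_size):
        super().__init__()
        self.hidden_size = hidden_size
        # One linear map produces all four gate pre-activations at once,
        # from the concatenation of the input and the previous hidden state.
        self.gates = nn.Linear(input_size + hidden_size, 4 * hidden_size)

    def forward(self, x, state):
        h_prev, c_prev = state
        i, f, g, o = self.gates(torch.cat([x, h_prev], dim=1)).chunk(4, dim=1)
        i = torch.sigmoid(i)      # input gate
        f = torch.sigmoid(f)      # forget gate
        g = torch.tanh(g)         # candidate cell state
        o = torch.sigmoid(o)      # output gate
        c = f * c_prev + i * g    # new cell state
        h = o * torch.tanh(c)     # new hidden state
        return h, c
```

A GRU cell follows the same pattern, with a reset gate, an update gate, and a candidate hidden state in place of the four LSTM gates.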

Dependency

PyTorch (torch)
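
Usage

A hypothetical example of how a cell like the sketch above can be unrolled over a sequence with a plain Python loop (names and shapes are assumptions; the actual interface in this repo may differ):

```python
# Unroll the sketched cell over a sequence of shape (seq_len, batch, input_size).
cell = LSTMCell(input_size=10, hidden_size=20)
x = torch.randn(5, 3, 10)   # 5 time steps, batch of 3, 10 input features
h = torch.zeros(3, 20)      # initial hidden state
c = torch.zeros(3, 20)      # initial cell state
for t in range(x.size(0)):
    h, c = cell(x[t], (h, c))
print(h.shape)  # torch.Size([3, 20])
```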
