PyTorch LMU

A PyTorch implementation of Legendre Memory Units (LMUs) and their FFT-based variant.

This repository contains PyTorch implementations of the following papers:

  • Legendre Memory Units: Continuous-Time Representation in Recurrent Neural Networks (Voelker et al., NeurIPS 2019)
  • Parallelizing Legendre Memory Unit Training (Chilkuri & Eliasmith, 2021)

Performance on the psMNIST dataset is demonstrated in examples/.

Usage

torch, numpy, and scipy are the only requirements.
src/lmu.py contains the implementations of LMUCell, LMU, and LMUFFT.

Examples:

  • LMU

    import torch
    from lmu import LMU # src/lmu.py must be importable (e.g. run from src/ or add it to the path)

    model = LMU(
        input_size = 1,    # features per timestep
        hidden_size = 212, # dimension of the hidden state
        memory_size = 256, # dimension of the memory vector
        theta = 784        # length of the sliding window, in timesteps
    )

    x = torch.rand(100, 784, 1) # [batch_size, seq_len, input_size]
    output, (h_n, m_n) = model(x) # h_n, m_n: final hidden and memory states
  • LMUFFT

    import torch
    from lmu import LMUFFT

    model = LMUFFT(
        input_size = 1,    # features per timestep
        hidden_size = 346, # dimension of the hidden state
        memory_size = 468, # dimension of the memory vector
        seq_len = 784,     # input sequence length (fixed in advance for the FFT variant)
        theta = 784        # length of the sliding window, in timesteps
    )

    x = torch.rand(100, 784, 1) # [batch_size, seq_len, input_size]
    output, h_n = model(x) # h_n: final hidden state
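
  • LMUCell

    src/lmu.py also exports LMUCell, which processes one timestep at a time. The sketch below is a hedged assumption: it reuses the LMU constructor arguments above and assumes a single-step forward of the form cell(x_t, (h, m)) that returns the updated (h, m); check src/lmu.py for the exact signatures.

    import torch
    from lmu import LMUCell

    # Assumed constructor, mirroring the LMU example above
    cell = LMUCell(
        input_size = 1,
        hidden_size = 212,
        memory_size = 256,
        theta = 784
    )

    x = torch.rand(100, 784, 1) # [batch_size, seq_len, input_size]
    h = torch.zeros(100, 212)   # initial hidden state [batch_size, hidden_size]
    m = torch.zeros(100, 256)   # initial memory state [batch_size, memory_size]

    for t in range(x.shape[1]):
        h, m = cell(x[:, t, :], (h, m)) # assumed single-step forward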

Running on psMNIST

  • Clone this repository and open the example notebook in examples/ (running it in Google Colab is preferred).

  • examples/permutation.pt contains the permutation tensor used while creating the psMNIST data; it's included for reproducibility. Alternatively, torch.randperm(784) can be used to test with a new permutation.
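
    A hedged sketch of applying that permutation (the actual preprocessing lives in the example notebook; the random images below stand in for MNIST):

    import torch

    # Load the saved permutation, or generate a fresh one with torch.randperm(784)
    perm = torch.load("examples/permutation.pt")

    # A batch of MNIST-sized images: [batch_size, 1, 28, 28] (dummy data here)
    images = torch.rand(100, 1, 28, 28)

    # Flatten each image into a 784-step sequence and apply the fixed permutation
    x = images.view(100, 784)[:, perm]
    x = x.unsqueeze(-1) # [batch_size, seq_len, input_size], ready for LMU / LMUFFT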

References

  • Voelker, A., Kajić, I., & Eliasmith, C. (2019). Legendre Memory Units: Continuous-Time Representation in Recurrent Neural Networks. Advances in Neural Information Processing Systems (NeurIPS).
  • Chilkuri, N., & Eliasmith, C. (2021). Parallelizing Legendre Memory Unit Training. International Conference on Machine Learning (ICML).
