Neural Arithmetic Logic Units

[WIP]

This is a Keras implementation of "Neural Arithmetic Logic Units" by Andrew Trask, Felix Hill, Scott Reed, Jack Rae, Chris Dyer, and Phil Blunsom.

[Figure: architecture drawing]
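
For reference, the paper defines two building blocks: the NAC, a linear layer whose weights are constrained towards {-1, 0, 1}, and the NALU, which gates between an additive NAC path and a multiplicative log-space path (ε is a small constant for numerical stability):

  a = W x,  with  W = tanh(W_hat) * sigmoid(M_hat)    (NAC: addition/subtraction)
  m = exp(W (log(|x| + ε)))                           (log-space path: multiplication/division)
  g = sigmoid(G x)                                    (learned gate)
  y = g * a + (1 - g) * m                             (NALU output)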

Usage

Import the layers from nalu.py or nac.py and add them to a model like any other Keras layer.

NALU has several additional parameters, the most important of which is whether to apply the gating mechanism:

  • use_gating is True by default, enabling the behaviour from the paper.
  • Setting use_gating to False disables the gate, which allows the layer to model more complex expressions.
from keras.layers import Input
from nalu import NALU

ip = Input(...)  # fill in your input shape, e.g. Input(shape=(2,))
x = NALU(10, use_gating=True)(ip)  # 10 output units, gated as in the paper
...
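
As a fuller illustration, here is a minimal end-to-end sketch that trains a small NALU stack to learn addition. Only NALU(units, use_gating=...) comes from the snippet above; the synthetic data, layer sizes, and training settings are illustrative assumptions, not part of this repository:

import numpy as np
from keras.layers import Input
from keras.models import Model
from nalu import NALU

# Synthetic data: learn y = a + b from pairs of random scalars.
X = np.random.uniform(0.0, 10.0, size=(1024, 2))
y = X.sum(axis=1, keepdims=True)

ip = Input(shape=(2,))             # two operands per sample
h = NALU(8, use_gating=True)(ip)   # hidden NALU layer (8 units is an arbitrary choice)
out = NALU(1, use_gating=True)(h)  # single output unit produces the prediction
model = Model(ip, out)

model.compile(optimizer='adam', loss='mse')
model.fit(X, y, epochs=50, batch_size=64, verbose=0)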

Requirements

  • TensorFlow (tested) or Theano
  • Keras 2+
