
NALU-Tensorflow-Applications

Neural Arithmetic Logic Unit (NALU) implementation in Tensorflow and some of its applications, based on the paper by Trask et al. (2018). The architecture of the proposed model is depicted in the figure below.

Each notebook explores the application of this model to a different task.
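
For reference, the sketch below shows one way the NALU can be written as a Keras layer following the equations in the paper: an additive NAC path gated against a multiplicative path computed in log space. This is a minimal illustration with my own class and variable names; the notebooks may organize the code differently.

```python
import tensorflow as tf

class NALU(tf.keras.layers.Layer):
    """Minimal Neural Arithmetic Logic Unit sketch (Trask et al., 2018)."""

    def __init__(self, units, epsilon=1e-7, **kwargs):
        super().__init__(**kwargs)
        self.units = units
        self.epsilon = epsilon  # keeps log(|x|) finite near zero

    def build(self, input_shape):
        dim = int(input_shape[-1])
        # NAC weights: W = tanh(W_hat) * sigmoid(M_hat), biased toward {-1, 0, 1}
        self.W_hat = self.add_weight(name="W_hat", shape=(dim, self.units),
                                     initializer="glorot_uniform")
        self.M_hat = self.add_weight(name="M_hat", shape=(dim, self.units),
                                     initializer="glorot_uniform")
        # Gate that mixes the additive and multiplicative paths
        self.G = self.add_weight(name="G", shape=(dim, self.units),
                                 initializer="glorot_uniform")

    def call(self, x):
        W = tf.tanh(self.W_hat) * tf.sigmoid(self.M_hat)
        a = tf.matmul(x, W)                                                # additive path (NAC)
        m = tf.exp(tf.matmul(tf.math.log(tf.abs(x) + self.epsilon), W))   # multiplicative path
        g = tf.sigmoid(tf.matmul(x, self.G))                              # learned gate
        return g * a + (1.0 - g) * m
```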


The first task tests whether the network can extrapolate addition, subtraction, multiplication, and division to number ranges it has not been trained on; the trained network achieves essentially zero error on these out-of-range inputs.
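
As an illustration of that setup, the snippet below sketches how training data and a held-out extrapolation range could be generated, using the NALU layer sketched above. The ranges, operation, and hyperparameters are placeholders, not necessarily those used in the notebook.

```python
import numpy as np
import tensorflow as tf

def make_data(n, low, high, op=np.add, seed=0):
    """Pairs (a, b) sampled uniformly in [low, high) with target op(a, b)."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(low, high, size=(n, 2)).astype("float32")
    y = op(x[:, :1], x[:, 1:])
    return x, y

# Train inside [0, 10), then test on the unseen range [100, 1000) to probe extrapolation.
x_train, y_train = make_data(10_000, 0, 10)
x_test, y_test = make_data(1_000, 100, 1000, seed=1)

model = tf.keras.Sequential([NALU(8), NALU(1)])
model.compile(optimizer="adam", loss="mse")
model.fit(x_train, y_train, epochs=20, batch_size=64, verbose=0)
print("extrapolation MSE:", model.evaluate(x_test, y_test, verbose=0))
```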


The second task is to translate a series of calculations written in natural language into their numerical results. Unfortunately the result is not satisfactory, so I will not add it here, but the data generation part might be useful; a sketch of it follows.
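
The sketch below shows one simple way such (sentence, result) pairs might be produced; the vocabulary, operators, and value range are illustrative only and not taken from the repository.

```python
import random

NUM_WORDS = ["zero", "one", "two", "three", "four", "five",
             "six", "seven", "eight", "nine", "ten"]
OPS = {"plus": lambda a, b: a + b, "minus": lambda a, b: a - b}

def make_example(n_terms=3, seed=None):
    """One (sentence, result) pair, e.g. ('three plus five minus two', 6)."""
    rng = random.Random(seed)
    value = rng.randint(0, 10)
    words = [NUM_WORDS[value]]
    for _ in range(n_terms - 1):
        op_name = rng.choice(list(OPS))
        operand = rng.randint(0, 10)
        value = OPS[op_name](value, operand)
        words += [op_name, NUM_WORDS[operand]]
    return " ".join(words), value

print(make_example(seed=42))
```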


This is a work in progress, and I will explore other tasks soon.
