
Neural Fields – Old Idea, New Glory


About

In 1977, Shun-ichi Amari introduced neural fields, a class of potential-based recurrent neural networks [1]. This architecture was developed as a simplistic model of the activity of neurons in a (human) brain. Its main characteristic is the lateral inhibition and excitation of neurons through their accumulated potential. Due to its simplicity and expressiveness, Amari's work was highly influential and led to several follow-up papers, such as [2-6], to name only a few.

Support

If you use code or ideas from this repository for your projects or research, please cite it.

@misc{Muratore_neuralfields,
  author = {Fabio Muratore},
  title = {neuralfields - A type of potential-based recurrent neural networks implemented with PyTorch},
  year = {2023},
  publisher = {GitHub},
  journal = {GitHub repository},
  howpublished = {\url{https://github.com/famura/neuralfields}}
}

Features

  • There are two variants of neural fields implemented in this repository: one called NeuralField, which closely matches Amari's model using 1D convolutions, and another called SimpleNeuralField, which replaces the convolutions with a custom potential dynamics function.
  • Both implementations have, by modern standards, very few parameters (typically fewer than 1000). I suggest that you start with the NeuralField class, since it is more expressive. However, SimpleNeuralField has the benefit of operating with typically fewer than 20 parameters, which allows you to use optimizers that otherwise might not scale.
  • Both model classes, NeuralField and SimpleNeuralField, are subclasses of torch.nn.Module and are hence able to process batched data and run on GPUs.
  • The examples contain a script for time series learning. However, it is also possible to use neural fields as generative models.
  • This repository is a spin-off from SimuRLacra, where the neural fields have been used as the backbone for control policies. In SimuRLacra, the focus is on reinforcement learning for sim-to-real transfer. The goal of this repository, however, is to make the implementation as general as possible, such that it could, for example, be used as a generative model.
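To give an intuition for what the features above describe, here is a minimal, self-contained sketch of discretized Amari dynamics with a 1D convolution acting as the lateral interaction kernel. This is an illustration of the underlying idea, not the package's actual API; the neuron count, time constant, kernel size, and Euler discretization below are all assumptions made for the example.

```python
import torch

# Illustrative constants (assumptions, not package defaults):
# a ring of n neurons, time constant tau, Euler step size dt.
n, tau, dt = 32, 1.0, 0.1

# Lateral interaction kernel realized as a learnable 1D convolution,
# with padding chosen so the output has the same length as the input.
conv = torch.nn.Conv1d(1, 1, kernel_size=9, padding=4, bias=False)

def step(u: torch.Tensor, inp: torch.Tensor) -> torch.Tensor:
    """One Euler integration step of the potentials u given an external input."""
    # Lateral in-/excitation: convolve the neurons' activations (sigmoid of
    # the potentials) with the interaction kernel.
    lateral = conv(torch.sigmoid(u).unsqueeze(1)).squeeze(1)
    du = (-u + lateral + inp) / tau  # leaky potential dynamics
    return u + dt * du

# Batched usage, as supported by any torch.nn.Module:
u = torch.zeros(8, n)               # batch of 8 potential vectors
u = step(u, torch.randn(8, n))      # one update with random external input
```

Running such a step repeatedly rolls the recurrent dynamics out over time, which is how a neural field can be driven by (or generate) a time series.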

Time series learning example

Time series generation example

Getting Started

To install this package, simply run

pip install neuralfields

For further information, please have a look at the getting started guide. In the documentation, you can also find the complete reference of the source code.


References

[1] S-I. Amari. Dynamics of pattern formation in lateral-inhibition type neural fields. Biological Cybernetics, 1977.
[2] K. Kishimoto and S-I. Amari. Existence and stability of local excitations in homogeneous neural fields. Journal of Mathematical Biology, 1979.
[3] W. Erlhagen and G. Schöner. Dynamic field theory of movement preparation. Psychological Review, 2002.
[4] S-I. Amari, H. Park, and T. Ozeki. Singularities affect dynamics of learning in neuromanifolds. Neural Computation, 2006.
[5] T. Luksch, M. Gineger, M. Mühlig, and T. Yoshiike. Adaptive Movement Sequences and Predictive Decisions based on Hierarchical Dynamical Systems. International Conference on Intelligent Robots and Systems, 2012.
[6] C. Kuehn and J. M. Tölle. A gradient flow formulation for the stochastic Amari neural field model. Journal of Mathematical Biology, 2019.