Reset of snn.Leaky(beta=0.8) is S(t)*U_thr and not beta*S(t)*U_thr #341

Open
alexbababu opened this issue Jul 30, 2024 · 1 comment
alexbababu commented Jul 30, 2024

  • snntorch version: 0.9.1
  • Python version: tested on python 3.9, 3.10, and 3.11
  • torch: 2.3.1, 2.3.1+cu121 and 2.4.0
  • Operating System: Windows 10, Ubuntu 22.04, Rocky 8.9

Description

I was working through snnTorch Tutorial 3 on LIF neurons in feed-forward networks.
In Chapter 2, an snn.Leaky neuron is created and its behaviour under a constant current input is tested by plotting the membrane potential.
The tutorial explains that a soft-reset mechanism is implemented: in contrast to the neuron implemented in Chapter 1.4, the reset subtracts beta*U_threshold rather than the full U_threshold.
Therefore, after a spike the membrane potential should fall to ~0.2 and not to ~0.0.
This does not happen for me; I tested on Windows, Linux, and Google Colab.
My colleague has a Mac, and for him it works as explained and shown here:
https://raw.githubusercontent.com/jeshraghian/snntorch/master/docs/_static/img/examples/tutorial3/_static/snn.leaky_step.png
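For reference, here is a minimal sketch of a single update step as I understand it from the tutorial (this is not snnTorch's internal code; beta and the threshold come from the tutorial, and the constant input of 0.21 is inferred from the mem_rec output below):

# Sketch of one membrane update right after a spike, comparing the two resets.
beta, u_thr, i_in = 0.8, 1.0, 0.21
u_prev, spk = 1.0044, 1.0          # membrane just crossed threshold, spike fired

# Reset described in the tutorial: subtract beta * S(t) * U_thr
u_expected = beta * u_prev + i_in - spk * beta * u_thr   # ~0.2135
# Reset I actually observe: subtract S(t) * U_thr
u_observed = beta * u_prev + i_in - spk * u_thr          # ~0.0135

print(u_expected, u_observed)

The ~0.0135 value matches the post-spike entries in my mem_rec below, while ~0.21 matches the expected plot.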

What I Did

Ran all cells in snntorch_tutorial_3.ipynb
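Roughly, the relevant cell looks like the following (a reconstruction for reference, not a verbatim copy of snntorch_tutorial_3.ipynb; the step current of 0.21 after 10 zero-input steps and the 200 total steps are inferred from the mem_rec output below):

import torch
import snntorch as snn

# Leaky neuron from the tutorial, driven by a step current
lif1 = snn.Leaky(beta=0.8)

num_steps = 200
# zero input for the first 10 steps, then a constant 0.21
cur_in = torch.cat((torch.zeros(10), torch.ones(num_steps - 10) * 0.21), 0)

mem = lif1.init_leaky()
mem_rec = []
for step in range(num_steps):
    spk, mem = lif1(cur_in[step], mem)
    mem_rec.append(mem)
mem_rec = torch.stack(mem_rec)
print(mem_rec)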

I get this:
[plot: snntorch_leaky_reset]

I expect this:
[plot: snn.leaky_step]

mem_rec
tensor([0.0000, 0.0000, 0.0000, 0.0000, 0.0000, 0.0000, 0.0000, 0.0000, 0.0000,
        0.0000, 0.2100, 0.3780, 0.5124, 0.6199, 0.7059, 0.7747, 0.8298, 0.8738,
        0.9091, 0.9373, 0.9598, 0.9778, 0.9923, 1.0038, 0.0131, 0.2204, 0.3864,
        0.5191, 0.6253, 0.7102, 0.7782, 0.8325, 0.8760, 0.9108, 0.9387, 0.9609,
        0.9787, 0.9930, 1.0044, 0.0135, 0.2208, 0.3867, 0.5193, 0.6255, 0.7104,
        0.7783, 0.8326, 0.8761, 0.9109, 0.9387, 0.9610, 0.9788, 0.9930, 1.0044,
        0.0135, 0.2208, 0.3867, 0.5193, 0.6255, 0.7104, 0.7783, 0.8326, 0.8761,
        0.9109, 0.9387, 0.9610, 0.9788, 0.9930, 1.0044, 0.0135, 0.2208, 0.3867,
        0.5193, 0.6255, 0.7104, 0.7783, 0.8326, 0.8761, 0.9109, 0.9387, 0.9610,
        0.9788, 0.9930, 1.0044, 0.0135, 0.2208, 0.3867, 0.5193, 0.6255, 0.7104,
        0.7783, 0.8326, 0.8761, 0.9109, 0.9387, 0.9610, 0.9788, 0.9930, 1.0044,
        0.0135, 0.2208, 0.3867, 0.5193, 0.6255, 0.7104, 0.7783, 0.8326, 0.8761,
        0.9109, 0.9387, 0.9610, 0.9788, 0.9930, 1.0044, 0.0135, 0.2208, 0.3867,
        0.5193, 0.6255, 0.7104, 0.7783, 0.8326, 0.8761, 0.9109, 0.9387, 0.9610,
        0.9788, 0.9930, 1.0044, 0.0135, 0.2208, 0.3867, 0.5193, 0.6255, 0.7104,
        0.7783, 0.8326, 0.8761, 0.9109, 0.9387, 0.9610, 0.9788, 0.9930, 1.0044,
        0.0135, 0.2208, 0.3867, 0.5193, 0.6255, 0.7104, 0.7783, 0.8326, 0.8761,
        0.9109, 0.9387, 0.9610, 0.9788, 0.9930, 1.0044, 0.0135, 0.2208, 0.3867,
        0.5193, 0.6255, 0.7104, 0.7783, 0.8326, 0.8761, 0.9109, 0.9387, 0.9610,
        0.9788, 0.9930, 1.0044, 0.0135, 0.2208, 0.3867, 0.5193, 0.6255, 0.7104,
        0.7783, 0.8326, 0.8761, 0.9109, 0.9387, 0.9610, 0.9788, 0.9930, 1.0044,
        0.0135, 0.2208, 0.3867, 0.5193, 0.6255, 0.7104, 0.7783, 0.8326, 0.8761,
        0.9109, 0.9387])
alexbababu (Author) commented:
My colleague on the Mac was using snntorch 0.7.0 and torch 2.2.1.
I installed a venv with the same versions and now I get the expected behaviour.

So either torch>=2.3.1 or snntorch 0.9.1 broke the soft reset.
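For anyone else checking their environment, a quick way to see which combination is installed (the "known-good" pair is only what happened to work in my colleague's setup, not a confirmed fix):

import torch, snntorch
print("snntorch", snntorch.__version__, "| torch", torch.__version__)
# expected behaviour:  snntorch 0.7.0 with torch 2.2.1
# broken soft reset:   snntorch 0.9.1 with torch >= 2.3.1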
