Dear Benedikt,
When it comes to ResetNetwork(), there was never really a definite definition of what the
function should do, that is, of what resetting the network truly means. Does it mean
resetting all model parameters, all synapse parameters, or both? Do we reset to default
values, or to the initial values the user set? The function was never complete, and using
it could lead to strange results, because some parameters were reset while others were not.
To illustrate the complexity and the problems we had in defining what resetting the
network means, imagine you initialize your neurons with random membrane potentials and
then simulate for a while before resetting your network. Are we supposed to go back to
the initial random potentials, or should you get the default membrane potential of the
model? It probably makes most sense to return to the initial random values. But then we
need to store them for all neurons in the network, in case someone calls ResetNetwork(),
so that we can apply them again. This can become quite complex, because you might set a
lot of variables, and you might have a very large network.
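To make this concrete, here is a minimal sketch (NEST 3 syntax; the model and the value
range are just placeholders) of what storing and restoring such random initial values by
hand could look like:

    import nest

    nest.ResetKernel()
    neurons = nest.Create("iaf_psc_alpha", 100)

    # draw random initial membrane potentials and remember them
    neurons.V_m = nest.random.uniform(min=-70.0, max=-55.0)
    initial_vm = neurons.get("V_m")

    nest.Simulate(1000.0)

    # "reset" by hand: restore the remembered initial potentials; note
    # that this does not touch synaptic weights, spike buffers, or the
    # kernel time, which is exactly where the ambiguity lies
    neurons.set(V_m=initial_vm)

And this only covers the one state variable the user stored; everything else keeps its
post-simulation value.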
It would be very helpful if you created an issue describing exactly what you need to have
reset for your application, i.e. what you envision a ResetNetwork() to do. We might be
able to restore some of the functionality to make things easier for you.
Best regards,
Stine Vennemo
________________________________
From: Alexey Serenko <serenko@phystech.edu>
Sent: 12 May 2020 16:12
To: NEST User Mailing List; benedikt.s.vogler@tum.de
Subject: [NEST Users] Re: Reset Kernel + STDP Performance
Dear Benedikt,
Regarding your second question, from a quick glance at that paper it seems the authors do
not state explicitly that their computation is faster than NEST's. They probably mean that
computing the weight change only when a presynaptic spike arrives is faster than
evaluating their update code at each time step (which is not what NEST does, but what a
straightforward simulation of the authors' hardware would do; see the line "in
contrast to the implementation on BSS2"). As for why the authors chose to implement
plasticity outside of NEST rather than pick an existing model of reward-modulated STDP,
this is not explained in the paper, but one might suppose the reason was to ensure that
the plasticity is identical to their hardware setup.
Anyway, synaptic weight change computation in NEST is also triggered by a presynaptic
spike (simply because the up-to-date weight value is only needed when a spike has to be
transmitted to the postsynaptic neuron). So, there does not seem to be anything that needs
improvement in this matter. Please correct me if I am mistaken.
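To illustrate the difference, here is a conceptual sketch in plain Python (my own
illustration, not NEST's actual source; the constants are arbitrary and I use simple
nearest-neighbour pairing) of an event-driven pair-based STDP update:

    import math

    TAU_PLUS, TAU_MINUS = 20.0, 20.0   # trace time constants (ms); illustrative
    A_PLUS, A_MINUS = 0.01, 0.012      # learning amplitudes; illustrative

    def on_presynaptic_spike(weight, t_pre, t_last_pre, post_spike_times):
        """Bring the weight up to date only when a presynaptic spike arrives."""
        for t_post in post_spike_times:
            if t_last_pre < t_post <= t_pre:
                # facilitation: the previous pre spike preceded this post spike
                weight += A_PLUS * math.exp(-(t_post - t_last_pre) / TAU_PLUS)
                # depression: the current pre spike follows this post spike
                weight -= A_MINUS * math.exp(-(t_pre - t_post) / TAU_MINUS)
        return weight

Between presynaptic spikes the synapse is never visited, which is why a per-time-step
reimplementation is the slow path, not the event-driven one.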
Sincerely yours,
Alex Serenko
Graduate student at Kurchatov Institute, Moscow, Russia
Tue, 12 May 2020 at 16:07, Benedikt S. Vogler <benedikt.s.vogler@tum.de>:
Hello everyone!
I am writing to you regarding two matters:
Reset Network/Kernel in NEST 2 -> NEST 3
At the last developer conference, Daphne Cornelisse talked about how she used
ResetNetwork() to solve her problem.
ResetNetwork() is marked as deprecated. No one said anything, so I was confused as to why
this is apparently the recommended, or at least an approved, way. She showed me that it
works (in her case). Since it is deprecated, there is probably some good reasoning behind
it. The documentation says: "ResetNetwork is deprecated and will be removed in NEST 3,
because this function is not fully able to reset network and simulator state." What are
the edge cases where its use causes problems?
In this ticket, it is stated that the feature is simply removed without any replacement:
https://github.com/nest/nest-simulator/issues/525
Thus, in NEST 3 there is only ResetKernel().
This means that you have to rebuild the network in any application where you run multiple
simulations with different inputs or parameter changes. I am using NEST 3 for
reinforcement learning, and in each training episode I have to extract all the weights and
save them, reset the kernel, reconstruct the network, and then load all the weights back.
This adds a lot of performance overhead and bloats my code: I basically have another
front end that stores the network and talks to the NEST back end, as in the sketch below.
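A stripped-down sketch of one training run (NEST 3 syntax; build_network() is a
placeholder for my actual construction code, and I am assuming the rebuild is
deterministic so that GetConnections() returns the connections in the same order each
time):

    import nest

    def build_network():
        # placeholder for the real construction code
        pre = nest.Create("iaf_psc_alpha", 32)
        post = nest.Create("iaf_psc_alpha", 32)
        nest.Connect(pre, post, "all_to_all",
                     syn_spec={"synapse_model": "stdp_synapse"})

    saved_weights = None
    for episode in range(100):
        nest.ResetKernel()
        build_network()                      # rebuild the net from scratch
        conns = nest.GetConnections(synapse_model="stdp_synapse")
        if saved_weights is not None:
            conns.set(weight=saved_weights)  # load last episode's weights
        nest.Simulate(500.0)                 # one training episode
        saved_weights = conns.get("weight")  # extract and save all weights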
Therefore, the update to NEST 3 is a downgrade for many applications. I don’t have a
solution for this issue, but I want to spark some discussion, as I have learned that I am
not the only NEST user to stumble over this issue.
STDP performance boost by manual computation in Python
In the paper "Demonstrating Advantages of Neuromorphic Computation: A Pilot Study“ by
Wunderlich et al. (
https://www.frontiersin.org/articles/10.3389/fnins.2019.00260/full)
some performance improvement on STDP was reported.
"The synaptic weight updates in each iteration were restricted to those synapses
which transmitted spikes, i.e., the synapses from the active input unit to all output
units (32 out of the 1,024 synapses), as the correlation a+ of all other synapses is zero
in a perfect simulation without fixed-pattern noise. This has the effect of reducing the
overall time required to simulate one iteration[…]“
The provided source code
(
https://github.com/electronicvisions/model-sw-pong/blob/976e0778ca05cfd96c4…)
indeed contains a manual computation of STDP. When using the NEST library, I would not
expect a manual computation in Python to be faster. It appears to me that the NEST
implementation is computing STDP at every time step, even without spikes; is that right?
Maybe someone can comment on whether this can be improved in NEST.
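For concreteness, the restriction described in the quote above would look roughly like
this (my own NumPy sketch, not the authors' code; the function name and learning rate are
made up, only the dimensions follow the quote):

    import numpy as np

    n_in, n_out = 32, 32                          # 32 x 32 = 1,024 synapses
    weights = np.random.uniform(0.0, 1.0, (n_in, n_out))

    def update_weights(active_input, reward, correlations, lr=0.05):
        # Only synapses leaving the single active input unit can have a
        # nonzero correlation, so only that row (32 of the 1,024 synapses)
        # is updated instead of the full weight matrix.
        weights[active_input] += lr * reward * correlations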
Kind regards,
Benedikt S. Vogler
--
Benedikt S. Vogler
benedikt.s.vogler@tum.de
Student M.Sc. Robotics, Cognition, Intelligence
_______________________________________________
NEST Users mailing list -- users@nest-simulator.org
To unsubscribe send an email to users-leave@nest-simulator.org