Hello everyone!
I am writing to you regarding two matters:
Reset Network/Kernel in nest2 -> nest3
At the last developer conference, Daphne Cornelisse mentioned that she used
ResetNetwork() to solve her problem.
ResetNetwork() is marked as deprecated. Nobody objected, so I was confused as to why this
is apparently the recommended, or at least an approved, way. She showed me that it works (in
her case). Since it is deprecated, there is probably some good reasoning behind that. The
documentation says: "ResetNetwork is deprecated and will be removed in NEST 3,
because this function is not fully able to reset network and simulator state." What are the
edge cases where using it causes problems?
In this ticket, it is stated that the feature is simply removed without any replacement:
https://github.com/nest/nest-simulator/issues/525
Thus, in nest3 there is only ResetKernel().
This means that you have to rebuild the network for any application where you run multiple
simulations with different inputs or parameter changes. I am using nest3 for reinforcement
learning, and in each training episode I have to extract all the weights and save them,
reset the kernel, reconstruct the net, and then load all the weights again. This adds a lot
of performance overhead and bloats my code: I basically maintain a second front end that
stores the net and talks to the nest back end.
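To illustrate, here is a minimal sketch of the per-episode bookkeeping I mean (toy network;
the sizes, durations, and the build_network() helper are hypothetical stand-ins, not my
actual code):

    import nest

    N_EPISODES = 10
    EPISODE_DURATION = 200.0  # ms

    def build_network():
        # Hypothetical toy network: 32 inputs fully connected to
        # 32 outputs via plastic synapses.
        pre = nest.Create("iaf_psc_alpha", 32)
        post = nest.Create("iaf_psc_alpha", 32)
        nest.Connect(pre, post, "all_to_all",
                     syn_spec={"synapse_model": "stdp_synapse"})
        return pre, post

    pre, post = build_network()
    for episode in range(N_EPISODES):
        nest.Simulate(EPISODE_DURATION)
        # 1) extract and save all plastic weights
        weights = nest.GetConnections(source=pre, target=post).get("weight")
        # 2) wipe the whole simulator state; nest3 no longer has ResetNetwork()
        nest.ResetKernel()
        # 3) reconstruct the net from scratch
        pre, post = build_network()
        # 4) restore the saved weights (assumes identical Connect calls
        #    yield the connections in the same order after the rebuild)
        nest.GetConnections(source=pre, target=post).set(weight=weights)

Steps 1-4 are pure bookkeeping that nest2's ResetNetwork() made unnecessary.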
Therefore, the update to nest3 is a downgrade for many applications. I don’t have a
solution for this issue, but I want to spark some discussion, as I have learned that I am
not the only nest user to run into it.
STDP performance boost by manual computation in Python
In the paper "Demonstrating Advantages of Neuromorphic Computation: A Pilot Study“ by
Wunderlich et al. (
https://www.frontiersin.org/articles/10.3389/fnins.2019.00260/full)
some performance improvement on STDP was reported.
"The synaptic weight updates in each iteration were restricted to those synapses
which transmitted spikes, i.e., the synapses from the active input unit to all output
units (32 out of the 1,024 synapses), as the correlation a+ of all other synapses is zero
in a perfect simulation without fixed-pattern noise. This has the effect of reducing the
overall time required to simulate one iteration[…]“
The provided source code
(https://github.com/electronicvisions/model-sw-pong/blob/976e0778ca05cfd96c4c1f5abf3b7a352b87cb72/pang.py#L231)
indeed contains a manual computation of STDP. When using the nest library, I would not
expect manual computation in Python to be faster. Is the nest implementation computing the
STDP update at every step, even when no spikes are transmitted? Maybe someone can comment
on whether this can be improved in nest?
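For context, here is roughly what I understand the manual update to be doing, as a minimal
NumPy sketch (parameter names and values are hypothetical stand-ins, not the authors'
exact code):

    import numpy as np

    N_IN, N_OUT = 32, 32                # 32 * 32 = 1,024 synapses in total
    A_PLUS, W_MAX = 0.01, 1.0           # hypothetical STDP parameters
    w = np.random.uniform(0.0, W_MAX, (N_IN, N_OUT))

    def stdp_update_active_only(w, active_input, corr):
        # Only the row of synapses leaving the one active input unit has
        # a non-zero correlation, so only these 32 weights are touched;
        # the remaining 992 synapses are skipped entirely.
        w[active_input] += A_PLUS * corr
        np.clip(w[active_input], 0.0, W_MAX, out=w[active_input])
        return w

    # Per iteration: one input unit fired; corr stands in for the measured
    # correlations a+ between that unit and the 32 output units.
    active_input = 5
    corr = np.random.rand(N_OUT)
    w = stdp_update_active_only(w, active_input, corr)

A naive update would instead touch all 1,024 entries in every iteration, which is where I
assume the reported speedup comes from.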
Kind regards,
Benedikt S. Vogler
--
Benedikt S. Vogler
benedikt.s.vogler(a)tum.de
Student M.Sc. Robotics, Cognition, Intelligence