Dear Colleagues,
I would like to remind you that the deadline for abstract submission to the virtual NEST Conference 2022 is approaching fast: You have until next Wednesday, 11 May 2022, to submit your contribution (talk, "poster", or workshop) via the conference website https://nest-simulator.org/conference.
The NEST Conference provides an opportunity for the NEST Community to meet, exchange success stories, swap advice, and learn about current developments in and around NEST spiking network simulation and its applications. We explicitly encourage young scientists to participate in the conference!
This year's conference will again take place as a virtual conference on Thursday/Friday 23/24 June 2022.
Important dates
11 May 2022 - Deadline for submission of contributions
03 June 2022 - Notification of acceptance
17 June 2022 - Registration deadline
23 June 2022 - NEST Conference 2022 starts
We are looking forward to seeing you all in June!
Hans Ekkehard Plesser and the conference organizing committee
--
Prof. Dr. Hans Ekkehard Plesser
Head, Department of Data Science
Faculty of Science and Technology
Norwegian University of Life Sciences
PO Box 5003, 1432 Aas, Norway
Phone +47 6723 1560
Email hans.ekkehard.plesser@nmbu.no
Home http://arken.nmbu.no/~plesser
Dear NEST developers,
In our group, we are working on a model of the primary visual cortex and use step_current_generator devices to simulate the input current of the LGN neurons. We noticed that the simulation time of our model is very sensitive to the number of step_current_generators. When trying to narrow down the cause, we found that this might be due to an issue with the parallelization of the step_current_generators. A simple system in which the problem can be observed is attached below as simple_example.py. It essentially creates NS step_current_generators and connects them to NL neurons (model iaf_cond_exp) with fixed indegree. Increasing the number of step_current_generators does not benefit from multithreading as much as one would expect, in contrast to the speedup obtained when increasing the number of neurons; see the technical details below. Our rough estimate is that the speedup from 1 to 32 threads is 10 to 20 times smaller than ideal parallelization would suggest.
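For reference, here is a minimal PyNEST sketch of the kind of setup described above (assuming NEST 3.x; this is not the attached simple_example.py, and all sizes and parameter values are illustrative):

import time
import nest

# Rough sketch: NS step_current_generators drive NL iaf_cond_exp neurons
# with fixed indegree; all numbers are illustrative.
NL = 10000        # number of neurons
NS = 2000         # number of step current generators
INDEGREE = 10     # generators per neuron
THREADS = 32      # compare e.g. 1 vs. 32 threads

nest.ResetKernel()
nest.SetKernelStatus({"local_num_threads": THREADS})

neurons = nest.Create("iaf_cond_exp", NL)
sources = nest.Create("step_current_generator", NS)
for src in sources:
    # identical step protocol for every generator (times in ms, amplitudes in pA)
    src.set(amplitude_times=[100.0, 200.0], amplitude_values=[300.0, 0.0])

nest.Connect(sources, neurons, {"rule": "fixed_indegree", "indegree": INDEGREE})

t_start = time.time()
nest.Simulate(1000.0)
print(f"wall-clock simulation time: {time.time() - t_start:.2f} s")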
Technical details:
The relative slowdown due to the parallelization of the step_current_generators was measured using a linear regression of
simulation time = a * NL + b * NS.
See slowdown_example.png.
The ratio b/a was then calculated and measured as a function of the number of threads. A larger difference between the ratio for 1 thread and the ratio for 32 threads indicates a bigger parallelization problem in the step_current_generators.
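For completeness, the fit can be computed along these lines (a NumPy least-squares sketch; the function name and arguments are mine, and the measured values have to be supplied):

import numpy as np

def slowdown_ratio(NL_values, NS_values, sim_times):
    """Fit sim_time ~ a*NL + b*NS by least squares and return b/a.

    NL_values, NS_values, sim_times: equal-length 1-D sequences with one
    entry per measured run (number of neurons, number of generators, and
    measured wall-clock simulation time).
    """
    A = np.column_stack([np.asarray(NL_values, float),
                         np.asarray(NS_values, float)])
    (a, b), *_ = np.linalg.lstsq(A, np.asarray(sim_times, float), rcond=None)
    return b / a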
Some additional results:
- interval_dependence.png - the slowdown does not depend on the amplitude_times of the step_current_generators
- indegree_dependence.png - the slowdown depends on the indegree in nest.Connect(source, neurons). Specifically, the slowdown is worse for low indegree values. This shows that the slowdown depends on the number of step_current_generators created, not on the current injections themselves.
Are you aware of any lack of parallelization in the step_current_generator or in the current injection itself? If so, are there any plans to improve it?
best regards,
Jan Střeleček
Dear all,
In the past weeks there has been some on- and off-list discussion around the Tsodyks models and examples in NEST, and I want to put some files out here that might be interesting for more people.
Around 2019 we were trying to port some old scripts to more recent versions of NEST and did some basic plots. There are descriptions and PyNEST examples available in the docs Model Directory [1, 2, 3], and the example Notebook in the docs runs with NEST 3.x.
Additionally, the short-term burst behavior can be reproduced with the attached scripts (taken from the pre-3.x repository and updated to 3.x syntax, still a bit untidy), but to my knowledge this has not yet been transferred into a PyNEST version.
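In case it is useful as a starting point, here is a minimal NEST 3.x sketch of a connection using tsodyks2_synapse (this is not one of the attached scripts; the neuron model, drive, and parameter values are illustrative assumptions):

import nest

# Minimal sketch: two iaf_psc_exp neurons connected with a tsodyks2_synapse.
# The presynaptic neuron is driven to fire regularly so the short-term
# depression of the postsynaptic potentials becomes visible.
nest.ResetKernel()

pre = nest.Create("iaf_psc_exp")
post = nest.Create("iaf_psc_exp")
drive = nest.Create("dc_generator", params={"amplitude": 400.0})  # pA
vm = nest.Create("voltmeter")

nest.Connect(drive, pre)
nest.Connect(
    pre,
    post,
    syn_spec={
        "synapse_model": "tsodyks2_synapse",
        "U": 0.5,          # utilization of synaptic resources per spike
        "tau_rec": 800.0,  # depression recovery time constant [ms]
        "tau_fac": 0.0,    # no facilitation
        "weight": 500.0,   # PSC amplitude scale [pA]
    },
)
nest.Connect(vm, post)

nest.Simulate(1000.0)
V_post = vm.get("events")["V_m"]  # membrane trace of 'post': PSP amplitudes shrink as the synapse depresses
print(len(V_post), "samples recorded")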
There was also some discussion on the mailing list a while ago [4], but I couldn't find any results that surfaced on the list, though I know of work going on in this direction in different groups.
It would be interesting to hear about your perspective and see where we can combine experience!
best,
Dennis
[1]: https://nest-simulator.readthedocs.io/en/v3.3/models/index.html#short-term-…
[2]: https://nest-simulator.readthedocs.io/en/v3.3/models/tsodyks_synapse.html
[3]: https://nest-simulator.readthedocs.io/en/v3.3/models/tsodyks2_synapse.html
[4]: https://nest-simulator.org/mailinglist/hyperkitty/list/users@nest-simulator…
--
Dipl.-Phys. Dennis Terhorst
Coordinator Software Development
Institute of Neuroscience and Medicine (INM-6)
Computational and Systems Neuroscience &
Theoretical Neuroscience,
Institute for Advanced Simulation (IAS-6)
Jülich Research Centre, Member of the Helmholtz Association and JARA
52425 Jülich, Germany
Building 15.22 Room 4004
Phone +49 2461 61-85062
Fax +49 2461 61-9460
d.terhorst@fz-juelich.de
Dear NEST Users,
As discussed a long time ago (see towards the end of the discussion in https://github.com/nest/nest-simulator/issues/329), the hh_cond_exp_traub model included in NEST is far from the model by Traub and Miles (1991) which inspired it. The model in NEST was implemented as part of the work on the Brette et al (2007) review paper.
I am considering creating a pull request that will
1. rename the models to hh_brette_2007* in keeping with NEST naming guidelines for paper-specific models
2. change the threshold parameter V_T to be the actual threshold for spike emission (currently the criterion is V_m >= V_T + 30 mV with V_m falling; see the short snippet below)
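For concreteness, a small PyNEST sketch that only inspects the existing model (nothing here is part of the proposed change):

import nest

# With the current implementation, a spike is registered only once V_m has
# exceeded V_T + 30 mV and is falling again, so V_T itself is not the
# emission threshold.
v_t = nest.GetDefaults("hh_cond_exp_traub")["V_T"]
print(f"V_T = {v_t} mV, effective spike criterion near {v_t + 30.0} mV")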
I was therefore wondering if any of you
1. use the hh_cond_* models at all
2. would object to those changes?
Depending on whether the models are being used or not, I'd implement the renaming either via deprecation or directly.
Best,
Hans Ekkehard
--
Prof. Dr. Hans Ekkehard Plesser
Head, Department of Data Science
Faculty of Science and Technology
Norwegian University of Life Sciences
PO Box 5003, 1432 Aas, Norway
Phone +47 6723 1560
Email hans.ekkehard.plesser@nmbu.no
Home http://arken.nmbu.no/~plesser
Dear NEST Users & Developers!
I would like to invite you to our next fortnightly Open NEST Developer
Video Conference, today
Monday April 25, 11.30-12.30 CEST (UTC+2).
A special feature today will be a discussion of the available, but perhaps not so widely known, astrocyte model implementation in NEST.
Feel free to join the meeting even if it is just to bring your own questions for direct discussion in the in-depth section.
As usual, in the Project team round a contact person from each team will give a short statement summarizing ongoing work in the team and cross-cutting points that need discussion among the teams. In the remainder of the meeting we will go into a more in-depth discussion of topics that came up on the mailing list or that are suggested by the teams.
Agenda
* Welcome
* Review of NEST User Mailing List
* Project team round
* In-depth discussion
* Astrocyte models in NEST
The agenda for this meeting is also available online, see
https://github.com/nest/nest-simulator/wiki/2022-04-25-Open-NEST-Developer-…
Looking forward to seeing you soon!
Cheers,
Dennis Terhorst
------------------
Log-in information
------------------
We use a virtual conference room provided by DFN (Deutsches Forschungsnetz).
You can use the web client to connect. However, we encourage everyone to use a headset for better audio quality, or even a proper video conferencing system (see below) or software where available.
Web client
* Visit https://conf.dfn.de/webapp/conference/97938800
* Enter your name and allow your browser to use camera and microphone
* The conference does not need a PIN to join; just click "join" and you're in.
In case you see a dfnconf logo and the phrase "Auf den Meetingveranstalter warten" ("waiting for the meeting host"), just be patient; the meeting host needs to join first (a voice will tell you).
VC system/software
How to log in with a video conferencing system depends on your VC system or software.
- Using the H.323 protocol (e.g. Polycom): vc.dfn.net##97938800 or
194.95.240.2##97938800
- Using the SIP protocol: 97938800@vc.dfn.de
- By telephone: +49-30-200-97938800
For those who do not have a video conference system or suitable
software, Polycom provides a pretty good free app for iOS and Android,
so you can join from your tablet (Polycom RealPresence Mobile, available
from AppStore/PlayStore). Note that firewalls may interfere with
videoconferencing in various and sometimes confusing ways.
For more technical information on logging in from various VC systems,
please see http://vcc.zih.tu-dresden.de/index.php?linkid=1.1.3.4
Dear NEST Developers!
If you are just using NEST, you can ignore this message.
If you are coding models at the C++ level, I just want to share this information to reduce the risk of confusion:
The good old calibrate() method found in all neuron models has now been renamed to pre_run_hook(), see https://github.com/nest/nest-simulator/pull/2336. Thanks to Pooja for implementing this.
I'd suggest that you merge the master branch into any open development branches to bring them up to date on this change.
Best,
Hans Ekkehard
--
Prof. Dr. Hans Ekkehard Plesser
Head, Department of Data Science
Faculty of Science and Technology
Norwegian University of Life Sciences
PO Box 5003, 1432 Aas, Norway
Phone +47 6723 1560
Email hans.ekkehard.plesser@nmbu.no
Home http://arken.nmbu.no/~plesser
Hello Rachael,
Could you post a minimal script or Jupyter notebook reproducing the problem? Does the problem only occur on the Neurorobotics platform or can you reproduce it locally on your computer as well?
Best regards,
Hans Ekkehard
--
Prof. Dr. Hans Ekkehard Plesser
Head, Department of Data Science
Faculty of Science and Technology
Norwegian University of Life Sciences
PO Box 5003, 1432 Aas, Norway
Phone +47 6723 1560
Email hans.ekkehard.plesser@nmbu.no
Home http://arken.nmbu.no/~plesser
On 13/04/2022, 13:20, "Rachael Stentiford" <rachael.stentiford@brl.ac.uk> wrote:
Apologies for so many emails. It looks like I jumped the gun a little and I am still having this issue. So if anyone does have any ideas, I would welcome the help.
Cheers,
Rachael
________________________________
From: Rachael Stentiford <rachael.stentiford@brl.ac.uk>
Sent: Wednesday, April 13, 2022 11:46 AM
To: users@nest-simulator.org
Subject: Re: Kernel death when recording from 300+ cells for 2mins+
It looks like I have solved this issue now.
For those interested in the solution: I have switched from the efficient method for splitting a simulation into multiple intervals described in the docs, which has the clean-up functions outside of the loop, to simply calling nest.Simulate() in the loop, so that the clean-up functions are called each time.
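Roughly, the difference between the two variants (just a sketch with illustrative interval lengths; it assumes the network and recorders have already been set up):

import nest

# Docs method: Prepare/Run/Cleanup, with the clean-up outside the loop.
nest.Prepare()
for _ in range(30):
    nest.Run(60000.0)          # one-minute chunks (ms)
    # ... fetch and clear the recorded spikes here ...
nest.Cleanup()

# What solved it for me: plain Simulate() inside the loop, so the
# preparation and clean-up steps run for every interval.
for _ in range(30):
    nest.Simulate(60000.0)
    # ... fetch and clear the recorded spikes here ...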
Cheers,
Rachael
________________________________
From: Rachael Stentiford <rachael.stentiford@brl.ac.uk>
Sent: Tuesday, April 12, 2022 6:32 PM
To: users@nest-simulator.org
Subject: [NEST Users] Kernel death when recording from 300+ cells for 2mins+
Dear all,
I hope you are doing well. I am having an issue using PyNEST which I haven't come across before and have been unable to find an answer to elsewhere. I am working with Python 3.8 and NEST 2.18 (which is the version currently used by the Neurorobotics Platform).
I have a network of 6 x 360 neurons, and I am trying to record spikes from 360 of these. However, I am finding that if I try to simulate for more than ~2 minutes I run into problems. In Jupyter notebooks I get a kernel death error, and even when just running a standard Python script NEST dies. I can simulate for longer if I only record from a single neuron.
Previously I have used networks of 4 x 180 neurons and recorded from 180 of them for 10+ minutes without an issue.
I have tried splitting up the simulation into multiple intervals, as explained in the docs, and saving spikes in between. This should mean there are fewer spikes to save each time, but it doesn't appear to help.
Ideally I would like to record from 360 cells for around 30 minutes. Have any of you come across a similar issue in the past, or do you have any suggestions for how I could fix this?
Thanks for your help!
Rachael
-------------------------
Dr Rachael Stentiford (she/her)
Research Associate in Neurobotics
Bristol Robotics Laboratory
University of the West of England
Bristol
BS16 1QY
Dear NEST users,
After trying multiple ideas, I am still facing an issue where my installation completes successfully but produces the following warning message:
-- Configuring done
CMake Warning at cmake/UseCython.cmake:100 (add_library):
  Cannot generate a safe runtime search path for target pynestkernel because
  files in some directories may conflict with libraries in implicit
  directories:

    runtime library [libgomp.so.1] in /usr/lib/gcc/x86_64-linux-gnu/9 may
    be hidden by files in:
      /home/<user>/miniconda3/envs/nestConda/lib

  Some of these libraries may not be found correctly.
Call Stack (most recent call first):
  cmake/UseCython.cmake:283 (python_add_module)
  pynest/CMakeLists.txt:28 (cython_add_module)
------------------------------------------------------------------------------------------------
This leads to the failure of multiple tests. The summary of the tests and the logs for the failed tests are attached.
I should mention that I tested the installation by creating a neuron, and that works fine.
It would be nice to get rid of the warning; please see the attached error_msg file to see where it fails.
--
Thanks and Regards
Maryada
Dear NEST Users & Developers!
I would like to invite you to our next fortnightly Open NEST Developer
Video Conference, today
Monday April 11, 11.30-12.30 CEST (UTC+2).
Feel free to join the meeting even if it is just to bring your own questions for direct discussion in the in-depth section.
As usual, in the Project team round a contact person from each team will give a short statement summarizing ongoing work in the team and cross-cutting points that need discussion among the teams. In the remainder of the meeting we will go into a more in-depth discussion of topics that came up on the mailing list or that are suggested by the teams.
Agenda
* Welcome
* Review of NEST User Mailing List
* Project team round
* In-depth discussion
The agenda for this meeting is also available online, see
https://github.com/nest/nest-simulator/wiki/2022-04-11-Open-NEST-Developer-…
Looking forward to seeing you soon!
Cheers,
Jessica Mitchell
Dennis Terhorst
------------------
Log-in information
------------------
We use a virtual conference room provided by DFN (Deutsches Forschungsnetz).
You can use the web client to connect. However, we encourage everyone to use a headset for better audio quality, or even a proper video conferencing system (see below) or software where available.
Web client
* Visit https://conf.dfn.de/webapp/conference/97938800
* Enter your name and allow your browser to use camera and microphone
* The conference does not need a PIN to join; just click "join" and you're in.
In case you see a dfnconf logo and the phrase "Auf den Meetingveranstalter warten" ("waiting for the meeting host"), just be patient; the meeting host needs to join first (a voice will tell you).
VC system/software
How to log in with a video conferencing system depends on your VC system or software.
- Using the H.323 protocol (e.g. Polycom): vc.dfn.net##97938800 or
194.95.240.2##97938800
- Using the SIP protocol: 97938800@vc.dfn.de
- By telephone: +49-30-200-97938800
For those who do not have a video conference system or suitable
software, Polycom provides a pretty good free app for iOS and Android,
so you can join from your tablet (Polycom RealPresence Mobile, available
from AppStore/PlayStore). Note that firewalls may interfere with
videoconferencing in various and sometimes confusing ways.
For more technical information on logging in from various VC systems,
please see http://vcc.zih.tu-dresden.de/index.php?linkid=1.1.3.4