Dear NEST Users & Developers!
I would like to invite you to our next fortnightly Open NEST Developer Video Conference, today
Monday December 19, 11.30-12.30 CET (UTC+1).
Feel free to join the meeting even if it's just to bring your own questions for direct discussion in the in-depth section.
As usual, in the Project team round, a contact person from each team will give a short statement summarizing ongoing work in the team and cross-cutting points that need discussion among the teams. In the remainder of the meeting we will go into a more in-depth discussion of topics that came up on the mailing list or that were suggested by the teams.
Agenda
* Welcome
* Review of NEST User Mailing List
* Project team round
* In-depth discussion
The agenda for this meeting is also available online, see
https://github.com/nest/nest-simulator/wiki/2022-12-19-Open-NEST-Developer-…
As I will not be joining today, I have delegated chairing of the meeting.
Looking forward to seeing you again next year!
Merry Christmas and a happy new year!
Cheers,
Dennis Terhorst :*)
------------------
Log-in information
------------------
We use a virtual conference room provided by DFN (Deutsches Forschungsnetz).
You can use the web client to connect. However, we encourage everyone to use a headset for better audio quality, or even a proper video conferencing system or software (see below) when available.
Web client
* Visit https://conf.dfn.de/webapp/conference/97938800
* Enter your name and allow your browser to use camera and microphone
* The conference does not require a PIN to join; just click join and you're in.
If you see a dfnconf logo and the phrase "Auf den Meetingveranstalter warten" ("waiting for the meeting host"), just be patient; the meeting host needs to join first (a voice will tell you).
VC system/software
How to log in with a video conferencing system depends on your VC system or software.
- Using the H.323 protocol (e.g. Polycom): vc.dfn.net##97938800 or 194.95.240.2##97938800
- Using the SIP protocol: 97938800@vc.dfn.de
- By telephone: +49-30-200-97938800
For those who do not have a video conference system or suitable
software, Polycom provides a pretty good free app for iOS and Android,
so you can join from your tablet (Polycom RealPresence Mobile, available
from AppStore/PlayStore). Note that firewalls may interfere with
videoconferencing in various and sometimes confusing ways.
For more technical information on logging in from various VC systems,
please see http://vcc.zih.tu-dresden.de/index.php?linkid=1.1.3.4
Dear nest community,
I wonder if any of you know how to customize the ion channels and receptor types of the "cm_default" neuron model through NESTML.
I noticed some information on "Extending NESTML: Running NESTML with custom templates", but I still found myself confused due to my limited understanding.
Is there any way to acquire the .nestml file of the cm_default neuron model?
Thank you.
Best,
Zirui
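For what it is worth, the receptors of the built-in cm_default model can at least be configured directly from PyNEST without NESTML. The following is only a rough sketch based on the cm_default documentation for NEST 3.x; parameter names such as C_m, g_L, e_L, g_C and the receptor labels may differ between versions.
import nest

nest.ResetKernel()

# Compartmental neuron with a soma and one dendritic compartment.
cm = nest.Create("cm_default")
cm.compartments = [
    {"parent_idx": -1, "params": {"C_m": 1.0, "g_L": 0.1, "e_L": -70.0}},  # soma
    {"parent_idx": 0, "params": {"g_C": 0.01}},                            # dendrite
]

# Attach built-in receptor types to the compartments.
cm.receptors = [
    {"comp_idx": 0, "receptor_type": "GABA"},
    {"comp_idx": 1, "receptor_type": "AMPA_NMDA"},
]

# Synapses target a receptor via its index in the receptors list above.
sg = nest.Create("spike_generator", params={"spike_times": [10.0, 20.0]})
nest.Connect(sg, cm, syn_spec={"weight": 0.1, "delay": 1.0, "receptor_type": 1})

nest.Simulate(100.0)
Ion channels or receptor kinetics beyond these built-ins would, as far as I know, indeed require generating a new model with the NESTML compartmental toolchain rather than configuring cm_default.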
Dear NEST Users & Developers!
I would like to invite you to our next fortnightly Open NEST Developer Video Conference, today
Monday December 5, 11.30-12.30 CET (UTC+1).
Feel free to join the meeting even if it's just to bring your own questions for direct discussion in the in-depth section.
The special topic for the in-depth discussion today is
/Hackathon Results/
As usual, in the Project team round, a contact person from each team will give a short statement summarizing ongoing work in the team and cross-cutting points that need discussion among the teams. In the remainder of the meeting we will go into a more in-depth discussion of topics that came up on the mailing list or that were suggested by the teams.
Agenda
* Welcome
* Review of NEST User Mailing List
* Project team round
* In-depth discussion
The agenda for this meeting is also available online, see
https://github.com/nest/nest-simulator/wiki/2022-12-05-Open-NEST-Developer-…
Looking forward to seeing you again soon!
Cheers,
Dennis Terhorst
------------------
Log-in information
------------------
We use a virtual conference room provided by DFN (Deutsches Forschungsnetz).
You can use the web client to connect. However, we encourage everyone to use a headset for better audio quality, or even a proper video conferencing system or software (see below) when available.
Web client
* Visit https://conf.dfn.de/webapp/conference/97938800
* Enter your name and allow your browser to use camera and microphone
* The conference does not require a PIN to join; just click join and you're in.
If you see a dfnconf logo and the phrase "Auf den Meetingveranstalter warten" ("waiting for the meeting host"), just be patient; the meeting host needs to join first (a voice will tell you).
VC system/software
How to log in with a video conferencing system depends on your VC system or software.
- Using the H.323 protocol (e.g. Polycom): vc.dfn.net##97938800 or 194.95.240.2##97938800
- Using the SIP protocol: 97938800@vc.dfn.de
- By telephone: +49-30-200-97938800
For those who do not have a video conference system or suitable
software, Polycom provides a pretty good free app for iOS and Android,
so you can join from your tablet (Polycom RealPresence Mobile, available
from AppStore/PlayStore). Note that firewalls may interfere with
videoconferencing in various and sometimes confusing ways.
For more technical information on logging in from various VC systems,
please see http://vcc.zih.tu-dresden.de/index.php?linkid=1.1.3.4
I use Docker to create a NEST environment; the version is v2.16.0.
But when I execute the following code:
vt_dummy = nest.Create("global_volume_transmitter", 1, {'deliver_interval': 10})
I encounter this error:
pynestkernel.NESTError: UnknownModelName in GetDefaults_l: /global_volume_transmitter is not a known model name. Please check the modeldict for a list of available models.
However, after I install NEST 2.16 from the source code, the same code executes successfully. I don't know why.
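A quick way to check whether the Docker image simply ships without that model is to list the available models. A diagnostic sketch for NEST 2.x follows; the nest.Install line and the module name are only a hypothetical example, in case the model comes from a separately compiled extension module in the source build.
import nest

print(nest.version())

# List all node models whose names contain 'volume'; a stock NEST 2.16
# installation typically provides only 'volume_transmitter'.
print([m for m in nest.Models(mtype="nodes") if "volume" in m])

# If the model comes from a separately compiled extension module, it only
# becomes available after loading that module, e.g.:
# nest.Install("mymodule")  # hypothetical module name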
Dear NEST community:
We all know that NEST supports distributed computing, and that neurons are distributed across processes in a round-robin fashion. However, considering load balancing and the sparsity of communication, optimal partitioning is an NP-hard problem. Should other methods be adopted, such as directed graph partitioning? Will this be considered in NEST, and is graph partitioning helpful for SNN simulation in general?
If you can spare the time to reply, thank you very much.
Dear NEST Users & Developers!
I would like to invite you to our next fortnightly Open NEST Developer Video Conference, today
Monday November 21, 11.30-12.30 CET (UTC+1).
Feel free to join the meeting even if it's just to bring your own questions for direct discussion in the in-depth section.
The special topic for the in-depth discussion today is mechanisms around
/Tsodyks-style synapses/
As usual, in the Project team round, a contact person from each team will give a short statement summarizing ongoing work in the team and cross-cutting points that need discussion among the teams. In the remainder of the meeting we will go into a more in-depth discussion of topics that came up on the mailing list or that were suggested by the teams.
Agenda
* Welcome
* Review of NEST User Mailing List
* Project team round
* In-depth discussion
The agenda for this meeting is also available online, see
https://github.com/nest/nest-simulator/wiki/2022-11-21-Open-NEST-Developer-…
Looking forward to seeing you again soon!
Cheers,
Dennis Terhorst
------------------
Log-in information
------------------
We use a virtual conference room provided by DFN (Deutsches Forschungsnetz).
You can use the web client to connect. However, we encourage everyone to use a headset for better audio quality, or even a proper video conferencing system or software (see below) when available.
Web client
* Visit https://conf.dfn.de/webapp/conference/97938800
* Enter your name and allow your browser to use camera and microphone
* The conference does not require a PIN to join; just click join and you're in.
If you see a dfnconf logo and the phrase "Auf den Meetingveranstalter warten" ("waiting for the meeting host"), just be patient; the meeting host needs to join first (a voice will tell you).
VC system/software
How to log in with a video conferencing system depends on your VC system or software.
- Using the H.323 protocol (e.g. Polycom): vc.dfn.net##97938800 or 194.95.240.2##97938800
- Using the SIP protocol: 97938800@vc.dfn.de
- By telephone: +49-30-200-97938800
For those who do not have a video conference system or suitable
software, Polycom provides a pretty good free app for iOS and Android,
so you can join from your tablet (Polycom RealPresence Mobile, available
from AppStore/PlayStore). Note that firewalls may interfere with
videoconferencing in various and sometimes confusing ways.
For more technical information on logging in from various VC systems,
please see http://vcc.zih.tu-dresden.de/index.php?linkid=1.1.3.4
Dear NEST community,
Thank you so much for your previous suggestions on writing my own neuron model with NESTML.
I constructed a multiple-synapse version of the simple integrate-and-fire neuron ("iaf_cond_beta_multisynapse") based on the existing "iaf_cond_beta" neuron model. I did this by simply adding two synaptic channels (AMPA and NMDA) without changing the structure of the neuron model. But when I used this model in NEST and tried to set the initial membrane potentials from a uniform distribution (from -80 to -55 mV), for example: d1_pop.set({"V_m": nest.random.uniform(-80.0, -55.0)}), an error appeared: NESTErrors.TypeMismatch: "TypeMismatch in SLI function SetStatus_id: Expected datatype: doubletype Provided datatype: parametertype".
So I wonder whether there are any coding errors in my own neuron model, since it worked just fine when I set the initial membrane potential to a single number like -80, and how I can fix this error.
Best,
Zirui
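A possible workaround for the TypeMismatch above, sketched here under the assumption that the NESTML-generated model accepts plain lists of doubles even though it rejects nest.Parameter objects, is to draw the random values with NumPy and pass one value per neuron explicitly:
import numpy as np
import nest

nest.ResetKernel()

# 'iaf_psc_alpha' only stands in for the NESTML-generated model name here.
d1_pop = nest.Create("iaf_psc_alpha", 100)

# Draw one initial membrane potential per neuron and pass the values as a
# plain list of doubles instead of a nest.Parameter object.
v_init = np.random.uniform(-80.0, -55.0, size=len(d1_pop))
d1_pop.set({"V_m": v_init.tolist()})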
Hello Alice,
Setting repeated recording windows is currently not possible (but maybe create a feature request on GitHub). What you can do is the following:
import nest

n1 = nest.Create('iaf_psc_delta', params={'I_e': 1000.})
n2 = nest.Create('iaf_psc_delta')
sr = nest.Create('spike_recorder')   # records spikes of the postsynaptic neuron
wr = nest.Create('weight_recorder', params={'origin': 0., 'start': 0., 'stop': 1000.})
nest.CopyModel('stdp_synapse', 'stdp_synapse_rec', {'weight_recorder': wr})
nest.Connect(n1, n2, syn_spec={'synapse_model': 'stdp_synapse_rec', 'weight': 20.})
nest.Connect(n2, sr)

t_block = 5000.
for k in range(10):
    wr.origin = k * t_block
    nest.Simulate(t_block)
This will simulate in 5s blocks and record weights only during the first 1s of each block. The actual recording windows are given by
(origin+start, origin+stop]. You need to do this with Simulate() calls since you need to change the origin parameter; therefore RunManager + run() does not work.
You could also record to file if that is an option for your workflow, by setting wr.record_to = 'ascii'. If you want to combine that with a loop, you need to set wr.label to a different value for each loop iteration, otherwise data will be overwritten for each round through the loop.
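Continuing the snippet above, a sketch of that file-based variant could look like this (the label values are only illustrative):
wr.record_to = 'ascii'
for k in range(10):
    wr.origin = k * t_block
    wr.label = 'weights_block_{:02d}'.format(k)  # separate file name per block
    nest.Simulate(t_block)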
Best,
Hans Ekkehard
--
Prof. Dr. Hans Ekkehard Plesser
Head, Department of Data Science
Faculty of Science and Technology
Norwegian University of Life Sciences
PO Box 5003, 1432 Aas, Norway
Phone +47 6723 1560
Email hans.ekkehard.plesser@nmbu.no
Home http://arken.nmbu.no/~plesser
From: Alice Geminiani <alice.geminiani(a)polimi.it>
Reply to: NEST User Mailing List <users(a)nest-simulator.org>
Date: Friday, 11 November 2022 at 18:47
To: "users(a)nest-simulator.org" <users(a)nest-simulator.org>
Subject: [NEST Users] weight_recorder with large number of synapses
Dear NEST users,
I am trying to record weights from a network with about 1.5 million synapses, for simulations of a 30-minute protocol, and using the weight_recorder causes memory errors. Is there a way to record only in specific time instants of the simulation (e.g. every 5 minutes)?
Thanks a lot!
Cheers,
Alice
----------------------------
Alice Geminiani, PhD
PostDoctoral Researcher
NeuroEngineering And medical Robotics Laboratory - NEARLab
Department of Electronics, Information and Bioengineering - DEIB
Politecnico di Milano
Neurocomputational laboratory
Department of Brain and Behavioral Sciences
University of Pavia
Hi,
I planned to use the simple integrate-and-fire model (iaf_cond_beta) to construct a cortico-striatal network, but some issues appeared. Since striatal neurons have multiple synapse types (AMPA and NMDA, for example), 'iaf_cond_beta' does not seem to support multiple synaptic inputs. I searched the whole model directory and found multiple-synapse neuron models like "aeif_cond_beta_multisynapse", "aeif_cond_alpha_multisynapse" and "aeif_psc_alpha_multisynapse".
So I wondered whether there is a multiple-synapse version of the simple integrate-and-fire model (iaf_cond_beta), or whether I can construct a multiple-synapse simple integrate-and-fire model through NESTML.
Thank you
Best,
Zirui
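For reference, this is roughly how the existing multisynapse models are used from PyNEST. The sketch below uses aeif_cond_alpha_multisynapse, and the two ports merely stand in for AMPA- and NMDA-like inputs (the NMDA voltage dependence is not modelled here):
import nest

nest.ResetKernel()

# One neuron with two receptor ports: a fast (AMPA-like) and a slow
# (NMDA-like) conductance, both with reversal potential 0 mV.
nrn = nest.Create("aeif_cond_alpha_multisynapse")
nest.SetStatus(nrn, {"E_rev": [0.0, 0.0], "tau_syn": [2.0, 100.0]})

sg = nest.Create("spike_generator", params={"spike_times": [10.0, 50.0]})

# receptor_type selects the port (1-based index into E_rev/tau_syn).
nest.Connect(sg, nrn, syn_spec={"weight": 1.0, "receptor_type": 1})
nest.Connect(sg, nrn, syn_spec={"weight": 1.0, "receptor_type": 2})

nest.Simulate(200.0)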