Hi!
After using some PyNEST functions and modifying the C++ implementation of DumpLayerConnections(), I have been thinking about a possible further improvement of this function (or about creating a new function). But since I do not know the design and architecture of the C++ nestkernel code, I prefer to ask first.
In my code I normalize the presynaptic connections of every neuron. To do this, I originally did something like

    for i in range(len(target_layer)):
        # all connections into the i-th target neuron
        conn[i] = nest.GetConnections(source_layer, target_layer[i], synapse_model)
        normalize(conn[i])
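The email does not show what normalize() does, so here is a minimal stand-alone sketch of one plausible reading: rescaling a neuron's incoming weights so they sum to a target value. The function name and the plain weight list are illustrative only; in PyNEST one would read the weights from a SynapseCollection with conn.get("weight") and write them back with conn.set().

```python
# Hypothetical sketch, not PyNEST API: rescale the incoming (presynaptic)
# weights of one neuron so that they sum to a given total.

def normalize_weights(weights, total=1.0):
    """Return weights rescaled so that sum(result) == total."""
    s = sum(weights)
    if s == 0:
        return list(weights)  # nothing to rescale
    return [w * total / s for w in weights]

incoming = [0.2, 0.3, 0.5, 1.0]   # illustrative weights of one neuron
scaled = normalize_weights(incoming, total=1.0)
```

Whatever the real normalization is, the point of the loop stays the same: it must run once per target neuron, which is why the per-neuron GetConnections() call dominates the cost.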
Since I am using MPI and the loop iterates over all the neurons, I modified the previous code to

    local_nodes = nest.GetLocalNodeCollection(target_layer)
    local_nodes = nest.NodeCollection(local_nodes)
    for i in range(len(local_nodes)):
        conn[i] = nest.GetConnections(source_layer, local_nodes[i], synapse_model)
        normalize(conn[i])
This last code is faster than the previous one (I guess that the local_nodes variable is local to each MPI process and, as a consequence, GetConnections() is more efficient because it only works with the local nodes instead of all of them).
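That guess matches how NEST distributes neurons, as I understand it: node IDs are assigned to MPI processes round-robin, so each rank's local loop only touches roughly N / num_processes neurons. A stand-alone illustration (plain Python, no MPI and no NEST; the rank and process count are made up for the example):

```python
# Illustration of round-robin node distribution: each MPI rank owns the
# node IDs whose value modulo the number of processes equals its rank,
# so iterating over local nodes shrinks the loop by a factor of P.

def local_node_ids(all_node_ids, rank, num_processes):
    """Node IDs owned by `rank` under round-robin distribution."""
    return [nid for nid in all_node_ids if nid % num_processes == rank]

layer = list(range(1, 101))                            # 100 neurons, IDs 1..100
parts = [local_node_ids(layer, r, 4) for r in range(4)]  # 4 hypothetical ranks
```

Each of the four parts holds 25 of the 100 neurons, which is why the per-neuron GetConnections() loop over a local collection does a quarter of the work per process.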
Using the idea behind GetLocalNodeCollection(), I was thinking it could be used in the C++ implementation of DumpLayerConnections(). Presently, this function obtains all the connections between the source and target layers. Would it be possible to call the C++ equivalent of GetLocalNodeCollection()? I understand that the problem is that, since DumpLayerConnections() needs the spatial information of the layers, the node collections obtained with GetLocalNodeCollection() (and, I guess, its C++ equivalent) do not carry this spatial information.
As a second possibility, I was thinking of adding a new method (GetSpatialInformation()) and enhancing DumpLayerConnections() (or adding a new function). GetSpatialInformation() would return the spatial information of a collection of nodes (the ones obtained by GetLocalNodeCollection()). DumpLayerConnections() (or a new function) could then be enhanced by replacing the target layer parameter with a pair of parameters containing the local node collection and its spatial information, something like DumpLayerConnections(source_layer, local_nodes_collection, spatial_information, synapse_model). This way DumpLayerConnections() would only use local nodes and would be much faster.
Sorry for the technical email.
Xavier
I'm trying to install and use NEST on Google Colab, but it's not importing:
!pip install -Uqq nest
!pip install -Uqq nestml
!pip install -Uqq pysilsub
import nest
import nestml
import numpy as np
import matplotlib.pyplot as plt
from pysilsub import observers
from nestml.network import Network
from nestml.models import ConeNakaRushton
Am I able to use NEST on Google Colab?
--
Alvin J. Spivey, Ph.D.
(c) 843.267.8055
Dear NEST Users & Developers!
I would like to invite you to our next fortnightly Open NEST Developer Video Conference today
*Monday April 22, at 11:30 CEST (UTC+0200).*
As usual, in the Project team round, a contact person from each team will give a short statement summarizing ongoing and planned work in the team and highlight cross-cutting points that need discussion among the teams. In the remainder of the meeting, we will go into a more in-depth discussion of topics that came up on the mailing list or that are suggested by the teams.
Feel free to join the meeting also if it’s just to bring your own quick questions for direct discussion in the in-depth section.
Agenda
* Welcome
* Review of NEST User Mailing List
* Project team round
* In-depth discussion
The agenda for this meeting is also available online, see https://github.com/nest/nest-simulator/wiki/2024-04-22-Open-NEST-Developer-…
Looking forward to seeing you soon!
Cheers,
Dennis Terhorst
Log-in information
We use a virtual conference room provided by DFN <https://www.dfn.de/en/> (Deutsches Forschungsnetz).
You can use the web client to connect. However, we encourage everyone to use a headset for better audio quality, or even a proper video conferencing system (see below) or software when available.
Web client
* Visit https://conf.dfn.de/webapp/conference/97938800
* Enter your name and allow your browser to use camera and microphone
* The conference does not need a PIN to join; just click join and you’re in.
In case you see a dfnconf logo and the phrase “Auf den Meetingveranstalter warten” (“Waiting for the meeting host”), just be patient; the meeting host needs to join first (a voice will tell you).
VC system/software
How to log in with a video conferencing system depends on your VC system or software.
* Using the H.323 protocol (e.g. Polycom): vc.dfn.net##97938800 or 194.95.240.2##97938800
* Using the SIP protocol: 97938800@vc.dfn.de
* By telephone: +49-30-200-97938800
For those who do not have a video conference system or suitable software, Polycom provides a pretty good free app for iOS and Android, so you can join from your tablet (Polycom RealPresence Mobile, available from AppStore/PlayStore). Note that firewalls may interfere with videoconferencing in various and sometimes confusing ways.
For more technical information on logging in from various VC systems, please see
http://vcc.zih.tu-dresden.de/index.php?linkid=1.1.3.4
--
Dipl.-Phys. Dennis Terhorst
Coordinator Software Development
Institute for Advanced Simulation (IAS-6), Computational and Systems Neuroscience &
JARA-Institute Brain Structure-Function Relationships (INM-10)
Institute of Neuroscience and Medicine
Jülich Research Center, Member of the Helmholtz Association
52425 Jülich, Germany
http://www.csn.fz-juelich.de/
Phone +49 2461 61-85062
----------------------------------------------------------------------
Forschungszentrum Juelich GmbH
52425 Juelich
Sitz der Gesellschaft: Juelich
Eingetragen im Handelsregister des Amtsgerichts Dueren Nr. HR B 3498
Vorsitzender des Aufsichtsrats: MinDir Stefan Müller
Geschaeftsfuehrung: Prof. Dr. Astrid Lambrecht (Vorsitzende),
Karsten Beneke (stellv. Vorsitzender), Dr. Ir. Pieter Jansens
----------------------------------------------------------------------
Dear all,
The abstract submission deadline for the NEST Conference 2024 has been extended to 30 April. We are looking forward to your contributions!
The NEST Conference provides an opportunity for the NEST Community to meet, exchange success stories, swap advice, and learn about current developments in and around NEST spiking network simulation and its applications.
This year's conference will again take place as a virtual conference on Monday/Tuesday 17/18 June 2024.
We are inviting contributions to the conference, including talks, "posters" and workshops on specific topics.
For more information on how to submit your contribution, register and participate, please visit the conference website
https://nest-simulator.org/conference
Important dates
26 April 2024 - Deadline for NEST Initiative membership applications eligible for fee reduction
30 April 2024 - Extended deadline for submission of contributions
08 May 2024 - Notification of acceptance
07 June 2024 - Registration deadline
17 June 2024 - NEST Conference 2024 starts
We are looking forward to seeing you all in June!
Hans Ekkehard Plesser and the conference organizing committee
--
Prof. Dr. Hans Ekkehard Plesser
Department of Data Science
Faculty of Science and Technology
Norwegian University of Life Sciences
PO Box 5003, 1432 Aas, Norway
Phone +47 6723 1560
Email hans.ekkehard.plesser@nmbu.no
Home http://arken.nmbu.no/~plesser
Dear all,
The abstract submission deadline for the virtual NEST Conference 2024 is approaching (15 April 2024). We are looking forward to your contributions!
The NEST Conference provides an opportunity for the NEST Community to meet, exchange success stories, swap advice, and learn about current developments in and around NEST spiking network simulation and its applications.
This year's conference will again take place as a virtual conference on Monday/Tuesday 17/18 June 2024.
We are inviting contributions to the conference, including talks, "posters" and workshops on specific topics.
Important dates
15 April 2024 - Deadline for submission of contributions
26 April 2024 - Deadline for NEST Initiative membership applications eligible for fee reduction
08 May 2024 - Notification of acceptance
07 June 2024 - Registration deadline
17 June 2024 - NEST Conference 2024 starts
For more information on how to submit your contribution, register and participate, please visit the conference website
https://nest-simulator.org/conference
We are looking forward to seeing you all in June!
Hans Ekkehard Plesser and the conference organizing committee
--
Prof. Dr. Hans Ekkehard Plesser
Department of Data Science
Faculty of Science and Technology
Norwegian University of Life Sciences
PO Box 5003, 1432 Aas, Norway
Phone +47 6723 1560
Email hans.ekkehard.plesser@nmbu.no
Home http://arken.nmbu.no/~plesser
Dear NEST Users & Developers!
I would like to invite you to our next fortnightly Open NEST Developer Video Conference today
Monday April 8, at 11:30 CEST (UTC+0200).
As usual, in the Project team round, a contact person from each team will give a short statement summarizing ongoing and planned work in the team and highlight cross-cutting points that need discussion among the teams. In the remainder of the meeting, we will go into a more in-depth discussion of topics that came up on the mailing list or that are suggested by the teams.
Special topic today is the status of the upcoming NEST 3.7 release.
Feel free to join the meeting also if it’s just to bring your own quick questions for direct discussion in the in-depth section.
Agenda
* Welcome
* Review of NEST User Mailing List
* Project team round
* In-depth discussion
* NEST Conference 2024 (http://www.nest-initiative.org/conference): Submission Deadline 15 April 2024
* NEST 3.7 RC1 Status
* Reduce number of warnings (see, e.g., https://github.com/nest/nest-simulator/actions/runs/8566897766)
* Porting tests from SLI to Py
The agenda for this meeting is also available online, see https://github.com/nest/nest-simulator/wiki/2024-03-25-Open-NEST-Developer-…
Looking forward to seeing you soon!
Cheers,
Hans Ekkehard Plesser
Log-in information
We use a virtual conference room provided by DFN<https://www.dfn.de/en/> (Deutsches Forschungsnetz).
You can use the web client to connect. However, we encourage everyone to use a headset for better audio quality, or even a proper video conferencing system (see below) or software when available.
Web client
* Visit https://conf.dfn.de/webapp/conference/97938800
* Enter your name and allow your browser to use camera and microphone
* The conference does not need a PIN to join; just click join and you’re in.
In case you see a dfnconf logo and the phrase “Auf den Meetingveranstalter warten” (“Waiting for the meeting host”), just be patient; the meeting host needs to join first (a voice will tell you).
VC system/software
How to log in with a video conferencing system depends on your VC system or software.
* Using the H.323 protocol (e.g. Polycom): vc.dfn.net##97938800 or 194.95.240.2##97938800
* Using the SIP protocol: 97938800@vc.dfn.de
* By telephone: +49-30-200-97938800
For those who do not have a video conference system or suitable software, Polycom provides a pretty good free app for iOS and Android, so you can join from your tablet (Polycom RealPresence Mobile, available from AppStore/PlayStore). Note that firewalls may interfere with videoconferencing in various and sometimes confusing ways.
For more technical information on logging in from various VC systems, please see
http://vcc.zih.tu-dresden.de/index.php?linkid=1.1.3.4
--
Prof. Dr. Hans Ekkehard Plesser
Department of Data Science
Faculty of Science and Technology
Norwegian University of Life Sciences
PO Box 5003, 1432 Aas, Norway
Phone +47 6723 1560
Email hans.ekkehard.plesser@nmbu.no
Home http://arken.nmbu.no/~plesser
Hello,
I have some questions regarding the iaf_psc_exp model. When it receives current from a current generator (with a static synapse), does that current go to receptor_type 0 without being filtered through an exponential kernel?
Secondly, I would like to confirm whether the exponential PSCs follow the function i_syn(t) = (q / tau_syn) * exp(-t / tau_syn) * Θ(t), where Θ(t) is the Heaviside step function defined as Θ(t) = 1 for t > 0 and Θ(t) = 0 otherwise. Additionally, is q chosen to be equal to tau_syn?
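For what it is worth, the role of q in that kernel can be checked numerically without NEST: integrating (q / tau_syn) * exp(-t / tau_syn) over t > 0 gives exactly q, so q is the total charge delivered by one PSC, and the current jumps by q / tau_syn at spike arrival. A quick stand-alone check (the values of q and tau_syn are illustrative, not NEST defaults):

```python
import math

def psc(t, q=1.0, tau_syn=2.0):
    """Kernel from the question: i_syn(t) = (q / tau_syn) * exp(-t / tau_syn) for t > 0."""
    return (q / tau_syn) * math.exp(-t / tau_syn) if t > 0 else 0.0

# Riemann-sum check that the kernel integrates to q (the total charge).
q, tau = 1.5, 2.0
dt = 1e-3
charge = sum(psc(i * dt, q, tau) * dt for i in range(1, 20001))  # 0 .. 10 * tau
```

The sum comes out equal to q up to discretization error, and psc(t) approaches q / tau_syn as t goes to 0 from above, i.e. the peak amplitude is q / tau_syn rather than q.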
Thank you very much!
Best regards,
Anh Phan