Dear all,
I am a new NEST user and have a question about the range of neuron/synapse models that NEST can express.
I would like to implement my own neuron/synapse model with NESTML, but I am unsure whether it is possible.
Indeed, in my model, synaptic currents do not depend only on pre-synaptic spikes. Computing them also requires the opening probabilities of pre-synaptic channel receptors.
These opening probabilities evolve according to differential equations with second-order dynamics and specific decay constants, taking into account the arrival times of pre-synaptic spikes at that specific synapse.
The differential equations for the opening probabilities use different parameter sets depending on the neurotransmitter receptor type (GABA_A, GABA_B, NMDA, AMPA).
Furthermore, in addition to the input spikes and the pre-synaptic receptor opening probabilities, the current membrane potential of the post-synaptic neuron is also required to compute the synaptic currents.
Do you know whether any of the NEST models implements similar dynamics? Is it possible to express such synaptic dynamics in NESTML by creating a synapse and/or a neuron model, or do specific limitations prevent this?
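For concreteness, here is a pure-Python sketch (forward Euler, all parameter values invented for illustration) of the kind of second-order opening-probability dynamics I have in mind, for a single receptor type:

```python
# Sketch: receptor opening probability p driven by pre-synaptic spike times.
# Coupled first-order system, equivalent to one second-order ODE for p:
#   tau_decay * dp/dt = -p + s
#   tau_rise  * ds/dt = -s,   with s jumping by 1 at each spike arrival.
# All parameters are made up; a real model would use proper channel kinetics
# (and keep p bounded in [0, 1]).

def simulate_opening_probability(spike_times, tau_rise=2.0, tau_decay=10.0,
                                 dt=0.1, t_max=100.0):
    p, s = 0.0, 0.0
    spikes = sorted(spike_times)
    idx = 0
    trace = []
    for step in range(int(t_max / dt)):
        t = step * dt
        # add an impulse for every spike that has arrived by this time step
        while idx < len(spikes) and spikes[idx] <= t:
            s += 1.0
            idx += 1
        # forward-Euler update of the coupled system
        p += dt * (-p + s) / tau_decay
        s += dt * (-s) / tau_rise
        trace.append(p)
    return trace

trace = simulate_opening_probability([10.0, 30.0])
print(max(trace))
```

In the full model, the synaptic current would then additionally depend on the post-synaptic membrane potential, e.g. I_syn = g * p * (V_m - E_rev).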
Thank you,
Best regards,
JB
Dear NEST users,
I am trying to debug my network and found that my spike_recorder with the *ascii* backend is not working as expected.
I simulate a network for a first phase of a few milliseconds (say 1000. ms), then make some structural changes, and then re-run for another phase (say 4000. ms).
As a compact version, I am trying to see how this sample code behaves.
import nest

nest.ResetKernel()
nest.SetKernelStatus({
    "overwrite_files": True,
    "data_path": './Debug-Log',
    "data_prefix": 'DEBUG-SPIKE-REC',
})

noise_device = nest.Create("poisson_generator",
                           params={'start': 0., 'rate': 100.})

sd_params = {
    "record_to": 'ascii',
    "label": "Input_spike_recorder",
}
Input_SD = nest.Create("spike_recorder", 1, params=sd_params)
nest.Connect(noise_device, Input_SD)

nest.Simulate(100.)
# do something here
nest.Simulate(100.)
*Issue:* with the ascii backend I only get the data recorded from the 100th millisecond onward, but with the memory backend I get all the data from 0-200 ms.
This behavior is the same whether overwrite_files is True or False.
In the previous version (v2.20) I used the same approach and had no issues. Is there a newer way to set up such a simulation paradigm?
I did not find anything about this on the page comparing NEST 2.x and NEST 3.x.
Am I missing something? This is really important for my work, and I hope to find a solution.
--
Thanks and Regards
*Maryada*
Hi, I visited the page https://nest-simulator.readthedocs.io/en/latest/models/ looking for the FitzHugh-Nagumo model keyword, because I want to create this kind of neuron in NEST, but I cannot find it. Can anyone help me?
Thanks,
Salvo
Dear NEST users,
As I understood from the documentation, nest.random.normal (for instance) should return different values when the seed is changed via nest.rng_seed.
nest.ResetKernel()
nest.rng_seed = 21  # or 69696
v_m = nest.random.normal(mean=-51., std=10.)
v_m.GetValue()
In this code, I always receive the same v_m value in both cases, whether the seed is set to 21 or 69696. It only changes if I remove the ResetKernel() call, in which case different values are returned irrespective of rng_seed, as expected.
With this code below, I also got the same results irrespective of rng_seed
value
nest.ResetKernel()
nest.rng_seed = 3333  # or 69696
for _ in range(10):
    v_m = nest.random.normal(mean=-51., std=10.)
    print(v_m.GetValue())
So maybe rng_seed does not affect the nest.random.normal distribution. If so, how can I make sure it draws a different set of values?
--
Thanks and Regards
*Maryada*
Dear all,
Just a reminder about Luiz Gadelha's trial lecture tomorrow at 9 in Alfa. Same procedure as before.
Oliver and Habib: Will you be available for chats from ca 11.15 on? Then campus tour and lunch from ca 12. Luiz will have to take the 14.30 bus from Korsegården to Gardermoen.
Best,
Hans Ekkehard
--
Prof. Dr. Hans Ekkehard Plesser
Head, Department of Data Science
Faculty of Science and Technology
Norwegian University of Life Sciences
PO Box 5003, 1432 Aas, Norway
Phone +47 6723 1560
Email hans.ekkehard.plesser(a)nmbu.no<mailto:hans.ekkehard.plesser@nmbu.no>
Home http://arken.nmbu.no/~plesser
Dear NEST Users,
Has anyone used any of the NEST synapse models to generate gamma rhythms in a simulation with the NEST Izhikevich neuron model?
My simulation, which uses the stdp_synapse model (additive type), does not appear to converge to gamma rhythms (with a bimodal weight distribution) under super-threshold Poisson input (ref: Polychronization: Computation with Spikes (Izhikevich, 2006),
DOI: 10.1162/089976606775093882
<https://doi.org/10.1162/089976606775093882>).
Thanks for any suggestions,
Best Regards,
--Allen
Dear NEST Users & Developers!
I would like to invite you to our next fortnightly Open NEST Developer Video Conference, today
Monday 22 November, 11.30-12.30 CET (UTC+1).
Feel free to join the meeting also just to bring your own questions for direct discussion in the in-depth section.
As usual, in the project team round, a contact person from each team will give a short statement summarizing ongoing work in the team and cross-cutting points that need discussion among the teams. In the remainder of the meeting, we will go into a more in-depth discussion of topics that came up on the mailing list or that were suggested by the teams.
Agenda
* Welcome
* Review of NEST User Mailing List
* Project team round
* In-depth discussion
The agenda for this meeting is also available online, see https://github.com/nest/nest-simulator/wiki/2021-11-22-Open-NEST-Developer-…
Looking forward to seeing you soon!
best,
Dennis Terhorst
------------------
Log-in information
------------------
We use a virtual conference room provided by DFN (Deutsches Forschungsnetz).
You can use the web client to connect. We however encourage everyone to use a headset for better audio quality or even a proper video conferencing system (see below) or software when available.
Web client
* Visit https://conf.dfn.de/webapp/conference/97938800
* Enter your name and allow your browser to use camera and microphone
* The conference does not need a PIN to join, just click join and you're in.
In case you see a dfnconf logo and the phrase "Auf den Meetingveranstalter warten", just be patient, the meeting host needs to join first (a voice will tell you).
VC system/software
How you log in with a video conferencing system depends on your VC system or software.
- Using the H.323 protocol (eg Polycom): vc.dfn.net##97938800 or 194.95.240.2##97938800
- Using the SIP protocol:97938800@vc.dfn.de
- By telephone: +49-30-200-97938800
For those who do not have a video conference system or suitable
software, Polycom provides a pretty good free app for iOS and Android,
so you can join from your tablet (Polycom RealPresence Mobile, available
from AppStore/PlayStore). Note that firewalls may interfere with
videoconferencing in various and sometimes confusing ways.
For more technical information on logging in from various VC systems,
please see
http://vcc.zih.tu-dresden.de/index.php?linkid=1.1.3.4
Dear Madam, Dear Sir,
As a doctoral student starting out in computational neuroscience, I am particularly interested in using NEST in my research. However, I am a bit confused about the procedure for installing NEST on an M1 MacBook Pro, so I prefer to ask you before doing anything. From what I understood from your website, here are the steps that, I suppose, I should follow to install NEST properly:
- As I use an M1 Max Mac (macOS 12.0), I cannot just install NEST; I have to build it. To do so:
- Download Miniforge for arm64 (from which I will be able to use Conda, right?)
- Install Python 3.9.8 for Apple Silicon Macs (macOS 64-bit universal2 installer: https://www.python.org/ftp/python/3.9.8/python-3.9.8-macos11.pkg)
- Download the NEST source code directly from GitHub (https://github.com/nest/nest-simulator)
- Create a conda environment with: conda env create -f extras/conda-nest-simulator-dev.yml
- Activate this environment (conda activate ...)
- Create a build directory outside the NEST source tree and go into it
- Install the cmake package using conda (conda install -c anaconda cmake)
- Configure NEST by running: cmake -DCMAKE_INSTALL_PREFIX:PATH=<nest_install_dir> <nest_source_dir>
- Once done, run make, then make install, then make installcheck
Here are my questions:
- Is this installation procedure correct?
- Once installed, how can I launch NEST from a Python interpreter such as Spyder? Or is it only possible to launch it within a terminal window?
- I am quite new to managing packages, so correct me if I am wrong, but do I have to install everything exclusively with Conda?
- What is the purpose of creating a new environment to build NEST?
- Is there anything I missed or misunderstood?
I realize how many questions I am asking, and I am very grateful for any advice you can give me.
If it is more convenient for you, we can have a phone call whenever you like.
--
*Julien BALLBÉ Y SABATÉ*
*Étudiant en thèse - PhD student*
*Neurophysiology of Visual Computation*
Centre Giovanni Borelli - CNRS UMR 9010
Université de Paris
45 rue des Saint-Pères, 75006 Paris, France
Tel: +33 6 79 03 40 35