Hi all,
I am currently using NEST to build a Liquid State Machine (LSM) as a preprocessing step for my work. The input is read from .npy files in batches of size 32 and fed into the network by setting the amplitude_times and amplitude_values of multiple step_current_generators.
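For context, this is roughly how one batch is loaded (the file name and the (batch, channel, time) layout are placeholders for my actual data, not fixed by anything in NEST):

```python
import numpy as np

# Hypothetical file name and layout: one file holds one batch,
# shaped (batch_size, n_channels, n_steps) with batch_size == 32.
batch = np.load("batch_000.npy")
batch_size, n_channels, n_steps = batch.shape
```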
However, NEST does not treat the first dimension of the input data (32 in this example) as a batch dimension. To work around this, I currently do the following:
1. I read the shape of the input array and take the size of its first dimension, 32 in this case.
2. I set up a for-loop over those 32 samples.
3. In each iteration, I set the amplitude_times and amplitude_values of the step_current_generators to the current sample and then simulate for a fixed duration with the Simulate function (e.g. Simulate(3000)); a minimal sketch of this loop follows below.
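Here is a minimal sketch of that loop (the LSM construction is omitted; the 1 ms step spacing, the data layout, and the offsetting of amplitude_times by the already-simulated time are assumptions of this sketch rather than requirements):

```python
import nest
import numpy as np

SIM_TIME = 3000.0   # ms of biological time simulated per sample
DT = 1.0            # assumed spacing of the amplitude steps, in ms
                    # (assuming n_steps * DT <= SIM_TIME)

batch = np.load("batch_000.npy")              # loaded as above (placeholder name)
batch_size, n_channels, n_steps = batch.shape

nest.ResetKernel()
generators = nest.Create("step_current_generator", n_channels)
# ... build the liquid and connect the generators to it here ...

for i in range(batch_size):
    # amplitude_times are absolute simulation times, so each sample's
    # step times are shifted by the biological time already simulated
    offset = i * SIM_TIME
    times = offset + DT * np.arange(1, n_steps + 1)
    params = [{"amplitude_times": times.tolist(),
               "amplitude_values": batch[i, c].astype(float).tolist()}
              for c in range(n_channels)]
    nest.SetStatus(generators, params)  # one parameter dict per generator
    nest.Simulate(SIM_TIME)
```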
While this approach works, it is not ideal because every data sample has to be simulated individually. For example, with 100 batches of 32 samples each, I would need to simulate 100 x 32 x 3000 ms = 9,600,000 ms of biological time to process all the samples.
Therefore, I am wondering whether there is a way to feed the input data to NEST directly in batches, as is typically done in tools like PyTorch.
Thank you very much for your assistance!
Best,
Yin