Description
Biological neurons exhibit a large degree of heterogeneity. In recent work, we have shown that heterogeneity in the timescales of rate-based neurons can be exploited for a richer input representation in networks, leading to better performance on various tasks that comprise nonlinear transformations of time-shifted input. More specifically, we used a recurrent balanced network driven by multidimensional chaotic input as a dynamic reservoir, and determined the best linear readout approximation for each of these tasks.
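The reservoir-computing setup described above can be sketched as follows. This is a minimal illustration, not the authors' actual code: the reservoir states and task target are stand-in random arrays, and the readout is fitted by ridge regression, a common choice for determining a linear readout.

```python
import numpy as np

# Hypothetical sketch of a linear readout on reservoir states.
# Shapes, regularization strength, and data are all assumed for illustration.
rng = np.random.default_rng(0)
T, N = 1000, 200                         # time steps, reservoir neurons
states = rng.standard_normal((T, N))     # stand-in for recorded reservoir activity
target = rng.standard_normal(T)          # stand-in for the task target, e.g. a
                                         # nonlinear transform of time-shifted input

# Ridge regression: w = (S^T S + lam * I)^{-1} S^T y
lam = 1e-3
w = np.linalg.solve(states.T @ states + lam * np.eye(N), states.T @ target)
prediction = states @ w                  # linear readout approximation
```

The readout is the only trained component; the recurrent reservoir itself stays fixed.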
Here, we employ the widely used Brian 2 library to extend our previous studies to networks of spiking neurons. Compared to conventional networks of rate neurons, spiking neural networks have the benefit of being more biologically realistic as well as more energy efficient due to their sparse signal transmission. To efficiently train our spiking network, we use so-called surrogate gradient descent methods, which entail, for example, convolving the spikes with an exponentially decaying kernel. The biological realism and the larger number of parameters of the spiking model enable us to investigate in more detail the benefits of heterogeneous timescales for predicting, memorizing, and processing chaotic time series in biological and artificial intelligence systems. Next, we exploit the compatibility of our spiking model with neuromorphic hardware systems. Using our recently developed Brian2Lava package, which connects Brian 2 to the neuromorphic computing framework Lava, we implement the spiking model on Intel's neuromorphic chip Loihi 2. This enables us to test the performance of the model on a system that promises to be highly energy efficient and scalable. Furthermore, we explore the implementation of our model on other neuromorphic systems such as SpiNNaker 2 and memristive devices.
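The exponential-kernel convolution mentioned above can be sketched in a few lines. This is an assumed, simplified illustration (the time step, time constant, and spike train are invented), showing how a binary spike train is turned into a smooth trace via leaky integration, which is equivalent to convolution with a causal exponentially decaying kernel; such traces are what surrogate-gradient methods differentiate through.

```python
import numpy as np

# Assumed parameters for illustration only
dt, tau = 1e-3, 20e-3            # time step and decay time constant (s)
decay = np.exp(-dt / tau)        # per-step decay factor of the kernel

rng = np.random.default_rng(1)
spikes = (rng.random(500) < 0.05).astype(float)   # stand-in binary spike train

# Leaky integration: trace[t] = decay * trace[t-1] + spikes[t],
# i.e. convolution of the spike train with exp(-t / tau)
trace = np.zeros_like(spikes)
acc = 0.0
for t, s in enumerate(spikes):
    acc = decay * acc + s
    trace[t] = acc
```

Between spikes the trace decays smoothly toward zero, so it admits well-defined gradients even though the underlying spikes are binary events.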