Description
Heterogeneity is an inherent feature of biological systems that is often treated as a source of noise or disregarded in analyses of network function. In a series of computational studies, we systematically examine task-independent, intrinsic heterogeneity within neuronal systems and evaluate its role in solving a wide range of tasks of varying complexity. Our findings demonstrate that biologically consistent variability in synaptic and neuronal properties substantially enhances the performance and robustness of both rate- and spike-based networks, as well as conventional machine learning architectures. Moreover, by implementing our spike-based networks on cutting-edge neuromorphic hardware platforms and quantifying their energy consumption, we provide evidence that heterogeneity could be a fundamental design principle for next-generation energy-efficient computing hardware. Taken together, these results illustrate the potential of insights from cellular neuroscience to deepen our understanding of network function and to inform novel design principles for advancing artificial intelligence and its hardware.
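The abstract summarizes results rather than methods, so as a purely illustrative aid, the sketch below shows one simple way intrinsic heterogeneity can be introduced into a spiking model: per-neuron membrane time constants drawn from a skewed (gamma) distribution rather than a single shared value. Everything here is an assumption for illustration (plain NumPy, leaky integrate-and-fire dynamics, the function name simulate_lif_population, and all parameter values); it is not the networks, tasks, or hardware used in the studies described above.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_lif_population(tau_m, n_steps=1000, dt=1e-3, i_ext=1.5,
                            v_thresh=1.0, v_reset=0.0):
    """Simulate independent leaky integrate-and-fire neurons, one membrane
    time constant per neuron (tau_m, shape [n_neurons]). Returns each
    neuron's spike count over the run (1 s here, so counts ~ rates in Hz)."""
    v = np.zeros(tau_m.shape[0])
    spikes = np.zeros(tau_m.shape[0], dtype=int)
    for _ in range(n_steps):
        # Forward-Euler step of tau_m * dv/dt = -v + i_ext
        v += dt * (-v + i_ext) / tau_m
        fired = v >= v_thresh
        spikes[fired] += 1
        v[fired] = v_reset
    return spikes

n_neurons = 100

# Homogeneous population: one shared 20 ms time constant.
tau_homog = np.full(n_neurons, 20e-3)

# Heterogeneous population: gamma-distributed time constants with the same
# 20 ms mean, loosely mimicking the right-skewed spread seen in biology.
# Clipped from below so the Euler step stays stable (dt < tau).
tau_heter = np.clip(rng.gamma(shape=3.0, scale=20e-3 / 3.0, size=n_neurons),
                    2e-3, None)

rates_homog = simulate_lif_population(tau_homog)
rates_heter = simulate_lif_population(tau_heter)

print("homogeneous population, distinct rates (Hz):", np.unique(rates_homog))
print("heterogeneous population, rate range (Hz): "
      f"{rates_heter.min()}-{rates_heter.max()}")
```

Under identical drive, the homogeneous population collapses onto a single firing rate, while the heterogeneous one spans a range of rates and hence of intrinsic timescales; this kind of task-independent diversity is what the studies above report as beneficial for performance and robustness.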