Form Versus Function: Theory and Models for Neuronal Substrates
This thesis addresses one of the most fundamental challenges of modern science: how can the brain, as a network of neurons, process information, create and store internal models of our world, and draw conclusions from ambiguous data? The author tackles these questions in the rigorous language of mathematics and theoretical physics, an approach that requires a high degree of abstraction to transfer results from wet-lab biology to formal models. The thesis opens with an in-depth account of the state of the art in theoretical neuroscience, which it then uses as a basis for developing several new and original ideas.

Throughout the text, the author connects the form and function of neuronal networks. The goal is to reproduce the functional performance of biological brains by transferring their form to synthetic electronic substrates, an approach known as neuromorphic computing. Since this transfer can never be perfect, it necessarily leads to performance differences; this issue is substantiated and explored in detail.

The author also introduces a novel interpretation of the firing activity of neurons. He proposes a probabilistic reading of this activity and shows, by means of formal derivations, that stochastic neurons can sample from internally stored probability distributions. This is corroborated by the author's recent findings, which confirm that biological features such as the high-conductance state of networks enable this mechanism. He goes on to show that neural sampling can be implemented on synthetic neuromorphic circuits, paving the way for future applications in machine learning and cognitive computing, for example as energy-efficient implementations of deep learning networks.

The thesis offers an essential resource for newcomers to the field and an inspiration for scientists working on theoretical neuroscience and the future of computing.
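The sampling mechanism the blurb alludes to is easiest to picture in its abstract form: binary random variables standing for neurons ("firing" vs. "silent"), whose joint stochastic dynamics perform Gibbs sampling from a Boltzmann distribution. The sketch below illustrates only this abstract idea; the network size, weights, and biases are arbitrary placeholders rather than values from the thesis, and the LIF-specific dynamics (high-conductance state, refractory periods) treated in the book are deliberately abstracted away.

```python
import numpy as np

rng = np.random.default_rng(seed=42)

# Illustrative placeholder network: 5 binary units z_k in {0, 1} with
# symmetric couplings W ("synaptic weights") and biases b ("intrinsic
# excitabilities"), defining p(z) ~ exp(z^T W z / 2 + b^T z).
n = 5
W = rng.normal(0.0, 1.0, size=(n, n))
W = (W + W.T) / 2.0          # symmetric couplings
np.fill_diagonal(W, 0.0)     # no self-connections
b = rng.normal(0.0, 0.5, size=n)

def gibbs_sample(W, b, n_steps=50_000):
    """Draw states z from the Boltzmann distribution by Gibbs sampling."""
    n = len(b)
    z = rng.integers(0, 2, size=n).astype(float)
    samples = np.empty((n_steps, n))
    for t in range(n_steps):
        for k in range(n):
            # The conditional p(z_k = 1 | rest) is a logistic function of
            # the unit's total input -- the abstract analogue of a
            # stochastic neuron's activation function.
            u_k = W[k] @ z + b[k]
            z[k] = float(rng.random() < 1.0 / (1.0 + np.exp(-u_k)))
        samples[t] = z
    return samples

samples = gibbs_sample(W, b)
# Empirical marginals p(z_k = 1), estimated from the sampled states:
print("estimated firing probabilities:", samples.mean(axis=0).round(3))
```

In the LIF-sampling framework developed in the thesis, a logistic activation of this kind is not imposed by hand but emerges from the spiking dynamics of leaky integrate-and-fire neurons in the high-conductance state.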
Contents

Simulation and Emulation of Neural Networks
4 Dynamics and Statistics of Poisson-Driven LIF Neurons
5 Cortical Models on Neuromorphic Hardware
6 Probabilistic Inference in Neural Networks