Tuesday 05 May 2020: Dynamics Zoominar: How neural network size affects learning performance
Timothy O'Leary - University of Cambridge
Neuronal networks have many tunable parameters, such as synaptic strengths, that are shaped during learning of a task. The number of degrees of freedom available for representing a task can vastly exceed the minimum required for good performance. I will describe recent work that explores the consequences of such additional ‘redundant’ degrees of freedom for learning and for task representation. We find that additional redundancy in network parameters can make a fixed task easier to learn and can compensate for deficiencies in learning rules. However, we also find that in a biologically relevant setting, where synapses are subject to unavoidable noise, there is an upper limit to the level of useful redundancy in a network, suggesting an optimal network size for a given task.
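The trade-off described in the abstract can be sketched in a minimal toy model. This is an illustration of the general idea, not the speaker's actual model: a "network" whose output is the sum of n redundant weights, trained toward a target of 1 with a deficient learning rule (each weight's gradient carries a fixed per-synapse bias, standing in for an imperfect learning rule) plus unavoidable per-update synaptic noise. All names and parameter values below are assumptions chosen for the sketch.

```python
import random

def toy_error(n, eta=0.1, sigma_rule=1.0, sigma_noise=0.01,
              steps=300, measure=100, seed=0):
    """Toy redundant network: output is the sum of n weights, target is 1.
    The learning rule is deficient (each weight's gradient carries a fixed
    bias b_i) and every update adds independent synaptic noise.
    Returns the squared task error averaged over the last `measure` steps."""
    rng = random.Random(seed)
    b = [rng.gauss(0.0, sigma_rule) for _ in range(n)]  # fixed per-synapse rule bias
    w = [0.0] * n
    total = 0.0
    for t in range(steps):
        err = sum(w) - 1.0  # scalar task error
        # corrupted gradient for weight i, plus unavoidable synaptic noise
        w = [wi - (eta / n) * (2.0 * err + bi) + rng.gauss(0.0, sigma_noise)
             for wi, bi in zip(w, b)]
        if t >= steps - measure:
            total += (sum(w) - 1.0) ** 2
    return total / measure

def mean_error(n, seeds=12):
    # average over random draws of rule bias and noise
    return sum(toy_error(n, seed=s) for s in range(seeds)) / seeds

# With more redundancy, the per-synapse biases in the learning rule average
# out (error falls roughly as 1/n), but the accumulated synaptic noise in the
# output grows with n, so an intermediate network size does best.
errors = {n: mean_error(n) for n in (2, 32, 1000)}
```

In this sketch the steady-state error is roughly a bias term shrinking as 1/n plus a noise term growing with n, so the intermediate size (n = 32 here) outperforms both the small and the very large network, mirroring the abstract's claim of an optimal network size for a given task.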