Study of complex dynamical neural networks and its application to brain development and emergent synchronization phenomena

Author:
  1. Millán Vidal, Ana Paula
Supervised by:
  1. Joaquín Javier Torres Agudo (Supervisor)

Defending university: Universidad de Granada

Date of defense: 16 September 2019

Examination committee:
  1. Rosa María Benito Zafrilla (Chair)
  2. Pedro Luis Garrido Galera (Secretary)
  3. Lucilla De Arcangelis (Member)
  4. Javier Martín Buldú (Member)
  5. María Teresa Bajo Molina (Member)
Department:
  1. ELECTROMAGNETISMO Y FÍSICA DE LA MATERIA

Type: Thesis

Abstract

The brain is a paradigmatic example of a complex system, with a wide and rich dynamical repertoire arising from the non-linear dynamics of its billions of constituents, connected in a non-trivial manner. Early approaches to understanding brain dynamics were behavioral; over time they have become increasingly precise, down to the microscopic scale. It has become clear, however, that an integrated view of brain dynamics is necessary if one seeks to explain how memory, speech, or consciousness may emerge from these cells and the connections among them. In this thesis we tackle the problem of the interplay between brain structure and function, and how this interplay may shape the brain's emergent cognitive abilities. We work within the framework of biologically inspired neural networks, which have long provided a means of relating cognitive processes, such as memory or brain rhythms, to biophysical dynamics at the cellular level. In particular, we study two fundamental problems. First, how the complex structure of brain networks might develop from simple rules based on microscopic activity, and how this developing structure in turn affects neuronal activity. Second, the link between the inherent geometrical structure of cortical networks and brain dynamics, synchronization dynamics in particular.

Regarding the first point, we study brain development and, in particular, the process of synaptic pruning. A fundamental question in neuroscience is why brain development proceeds via severe synaptic pruning – that is, with an initial overgrowth of synapses, followed by the atrophy of approximately half of them throughout infancy. It is clear that fewer synapses require less metabolic energy, but why not start with the optimal synaptic density?
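The kind of activity-dependent pruning at issue here can be sketched with a toy model. The following is a minimal illustration only, not the model developed in the thesis: a Hopfield-style attractor network in which the synapses carrying the least current while stored memories are recalled are the ones eliminated. All parameters and the usage proxy are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 200, 3          # neurons, stored memory patterns
prune_frac = 0.5       # fraction of synapses eliminated in total

# Hopfield-style Hebbian weights for P random binary patterns.
patterns = rng.choice([-1, 1], size=(P, N))
W = patterns.T @ patterns / N
np.fill_diagonal(W, 0.0)
mask = np.ones_like(W, dtype=bool)   # which synapses still exist

def retrieve(state, steps=20):
    """Deterministic parallel dynamics on the (pruned) network."""
    for _ in range(steps):
        state = np.where((W * mask) @ state >= 0, 1, -1)
    return state

# Activity-dependent pruning: cue each stored memory, accumulate the
# magnitude of the current each synapse carries, and delete the
# least-used half of the synapses.
usage = np.zeros_like(W)
for p in patterns:
    s = retrieve(p.copy())
    usage += np.abs(W * np.outer(s, s))

n_remove = int(prune_frac * mask.sum())
usage[~mask] = np.inf                 # never re-rank missing synapses
order = np.argsort(usage, axis=None)[:n_remove]
mask[np.unravel_index(order, mask.shape)] = False

# The pruned network should still retrieve a memory from a noisy cue.
cue = patterns[0].copy()
cue[rng.choice(N, size=N // 10, replace=False)] *= -1   # flip 10% of bits
overlap = retrieve(cue) @ patterns[0] / N
```

In this sketch the surviving half of the synapses are precisely those most used during recall, so retrieval quality is largely preserved at half the synaptic density – the qualitative point the abstract makes.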
In this thesis we present an adaptive neural network model showing that the memory performance of the system does indeed depend on whether it passed through a transient period of relatively high synaptic density. The model also provides a simple demonstration of how network structure can be optimized by pruning with a rule that depends only on local information at each synapse – the intensity of the electrical current it carries – and that is consistent with empirical results on synaptic growth and death. In this view, a neural network begins life as a more or less random structure with a synaptic density high enough for memory performance. Throughout infancy, certain memories are learnt, and pruning gradually eliminates the synapses experiencing less electrical activity. Eventually, a network architecture emerges that has a lower mean synaptic density but is still capable, thanks to a more efficient structure, of retrieving memories. Moreover, the network structure is optimized for the specific patterns it stored. This seems consistent with the fact that young children can acquire memory patterns (such as languages or artistic skills) which remain with them indefinitely, yet as adults they struggle to learn new ones. Interestingly, the reported feedback loop between form and function might be relevant not only to neural networks but also to other biological and engineered systems, such as protein interaction networks, that also evolve in a time- and activity-dependent manner.

Subsequently, we have studied how the highly complex structure of biological neural networks may affect their synchronization properties. In particular, we show that non-trivial synchronization states can emerge even in small-world networks with an infinite Hausdorff dimension, provided that the spectral dimension is finite.
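The notion of spectral dimension can be illustrated with a minimal, self-contained calculation (a sketch with assumed parameters, not the analysis carried out in the thesis). For a ring lattice the graph-Laplacian eigenvalues are known in closed form, and the spectral dimension d_s follows from the low-eigenvalue scaling of the cumulative spectral density, N(λ) ~ λ^(d_s/2):

```python
import numpy as np

N = 1000
# Graph-Laplacian eigenvalues of a ring lattice (closed form):
# lambda_m = 2 - 2 cos(2 pi m / N), m = 0, ..., N - 1.
m = np.arange(N)
lam = np.sort(2 - 2 * np.cos(2 * np.pi * m / N))[1:]  # drop the zero mode

# Spectral dimension d_s from the low-eigenvalue scaling of the
# cumulative spectral density, N(lambda) ~ lambda**(d_s / 2).
n_fit = 100                      # fit only the small-eigenvalue tail
counts = np.arange(1, n_fit + 1)
slope = np.polyfit(np.log(lam[:n_fit]), np.log(counts), 1)[0]
d_s = 2 * slope                  # ~1 for a one-dimensional ring
```

Here d_s comes out close to 1, the Euclidean dimension of the ring. The point made above is that adding small-world shortcuts makes the Hausdorff dimension diverge while the spectral dimension can remain finite, and it is the latter that controls whether non-trivial synchronization states emerge.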
These results reveal deep connections between geometry and synchronization dynamics and might provide a useful approach to further investigate, for instance, the relation between structural and functional brain networks. Overall, in this thesis we have combined mathematical analysis and computational simulations to investigate the collective behavior that may emerge in neural networks from the symbiotic relationship between their complex heterogeneous structure and the non-linear physiological dynamics of their components.