PhD thesis
Physical Constraints and Functional Demands Shape Modular Neuromorphic Intelligence
My doctoral thesis, defended at Imperial College London (Neural Reckoning Group, supervised by Prof. Dan Goodman), explores modularity and self-organisation in neural networks.
The first half of the thesis investigates the structure–function relationship in neural networks: how structural modularity, resource constraints, and input statistics jointly shape functional specialisation, and how compositional learning can be grounded in physically embedded, energy-constrained substrates such as memristive neuromorphic hardware.
The second half steps a level deeper, into the foundations of self-organisation itself. It explores how continuous Neural Cellular Automata can be sculpted into a universal computational medium via gradient descent, and how a closely related local-message-passing policy can grow and self-repair digital Boolean circuits — bridging biological resilience and reconfigurable hardware.
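The core NCA mechanic behind this line of work can be summarised in a few lines: each cell perceives its neighbourhood through fixed filters, runs the perception vector through a small per-cell network, and applies the result as a stochastic residual update. Below is a minimal numpy sketch of one continuous NCA step; the identity-plus-Sobel perception filters and zero-initialised output layer follow common NCA practice and stand in for the thesis's actual trained model, and all weights here are placeholders rather than learned parameters.

```python
import numpy as np

def conv3x3(img, k):
    """Correlate a (H, W) grid with a 3x3 kernel, wrapping at the borders."""
    out = np.zeros_like(img)
    for di in (-1, 0, 1):
        for dj in (-1, 0, 1):
            out += k[di + 1, dj + 1] * np.roll(np.roll(img, -di, axis=0), -dj, axis=1)
    return out

def nca_step(state, w1, b1, w2, b2, fire_rate=0.5, rng=None):
    """One continuous Neural Cellular Automaton update on a (H, W, C) grid.

    Perception: identity + x/y Sobel filters per channel -> (H, W, 3C).
    Update: per-cell two-layer MLP, added residually to the state.
    A random per-cell mask makes the updates asynchronous.
    """
    rng = rng or np.random.default_rng(0)
    H, W, C = state.shape
    ident = np.zeros((3, 3)); ident[1, 1] = 1.0
    sx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float) / 8.0
    kernels = [ident, sx, sx.T]
    perc = np.concatenate(
        [np.stack([conv3x3(state[..., c], k) for k in kernels], axis=-1)
         for c in range(C)], axis=-1)                 # (H, W, 3C)
    hidden = np.maximum(perc @ w1 + b1, 0.0)          # per-cell ReLU layer
    delta = hidden @ w2 + b2                          # per-cell state update
    mask = rng.random((H, W, 1)) < fire_rate          # asynchronous firing
    return state + mask * delta

# Placeholder weights: zero-initialising the last layer makes the very
# first step an identity map, a common trick for stable NCA training.
rng = np.random.default_rng(0)
C, hidden_dim = 4, 16
w1 = rng.normal(0.0, 0.1, (3 * C, hidden_dim)); b1 = np.zeros(hidden_dim)
w2 = np.zeros((hidden_dim, C));                 b2 = np.zeros(C)
state = rng.random((8, 8, C))
new_state = nca_step(state, w1, b1, w2, b2, rng=rng)
```

In a trained NCA, `w1, b1, w2, b2` would be optimised end-to-end by gradient descent through many unrolled `nca_step` applications; the sketch shows only the forward local rule.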
Together, these chapters frame modularity less as a fixed architectural property and more as an emergent phenomenon arising from the tension between constraints and goals, between local rules and global behaviour.