The overarching aim of research in my team is to understand how thousands of neurons in the brain work together to implement computations that underlie behavior. To this end, we develop mathematical models based on recurrent neural networks and perform population analyses of neural activity recorded in behaving animals.
Recurrent interactions between neurons generate dynamic patterns of activity that serve as the physical substrate for performing behaviorally relevant computations. Understanding how collective dynamics and computations emerge from recurrent interactions is a central goal of my team's research. We investigate the interplay between collective dynamics and computations in recurrent neural networks by combining methods from machine learning with mathematical analyses inspired by statistical physics.
In a key advance, my lab has developed a novel class of network models, low-rank recurrent networks, that allows us to directly understand how connectivity determines the low-dimensional dynamics of neural activity that implement computations (Mastrogiuseppe and Ostojic 2018). This class of models provides a rich and tractable theoretical framework for unraveling how dynamics enable specific computations, and for interpreting neural recordings in behaving animals.
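To make the idea concrete, the following is a minimal sketch (not code from the paper) of a rank-one recurrent network in the spirit of Mastrogiuseppe and Ostojic (2018): connectivity is an outer product of two vectors, so the recurrent input is confined to a single direction and the network activity collapses onto a low-dimensional subspace. All parameter choices here (network size, the overlap between the two connectivity vectors, integration settings) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 200                          # number of neurons (illustrative)
m = rng.standard_normal(N)       # "output" connectivity vector
n = 2.0 * m + rng.standard_normal(N)  # "input" vector, overlapping with m
                                      # (overlap > 1 yields nontrivial activity)
J = np.outer(m, n) / N           # rank-one connectivity matrix

# Leaky firing-rate dynamics: dx/dt = -x + J @ tanh(x), Euler integration
dt, steps = 0.1, 200
x = rng.standard_normal(N)       # random initial condition
for _ in range(steps):
    x = x + dt * (-x + J @ np.tanh(x))

# Because J x is always parallel to m, any component of x orthogonal
# to m decays away; the steady-state activity lies along m.
residual = x - (x @ m) / (m @ m) * m   # part of x orthogonal to m
```

After the transient, `np.linalg.norm(residual)` is negligible compared to `np.linalg.norm(x)`: the collective dynamics are effectively one-dimensional, parameterized by the scalar overlap between the activity and `m`, which is what makes this model class analytically tractable.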