Rainer Engelken
Bridging dynamical systems and neuroscience to reveal principles of learning in neural circuits.
Welcome! We are a computational neuroscience lab working at the interface of dynamical systems, neuroscience, and machine learning. Our mission is to uncover the theoretical principles that govern how neural circuits learn, process information, and compute.
Our research spans scales, from the biophysics of single neurons to the collective behavior of large-scale circuits. Using and developing tools from statistical physics and dynamical systems theory, we investigate how neural circuits learn, process information, and maintain stability.
Research Focus Areas
- The Dynamics of Learning: How do neural representations and synaptic weights evolve during learning, and how does network stability enable credit assignment over long time scales?
- The Role of Chaos and Stability in Computation: Characterizing the rich dynamics of recurrent neural networks (RNNs), from sparse chaos in spiking circuits to the full Lyapunov spectra of rate networks. We seek to understand when chaos is beneficial for computation and when it must be tamed for stable learning (see the Lyapunov-spectrum sketch after this list).
- Building Brain-Inspired, Efficient Algorithms: Developing highly scalable, event-based simulation and training algorithms (like SparseProp) to enable the study of large, brain-inspired systems (see the event-based sketch after this list).
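
To make the Lyapunov-spectrum analysis above concrete, here is a minimal sketch of the standard Benettin QR method applied to a random rate network dx/dt = -x + J tanh(x). This is a didactic illustration, not our lab's code; the network size, gain g, integration step, and simulation time are illustrative choices.

```python
import numpy as np

def lyapunov_spectrum(N=200, g=2.0, dt=0.05, t_sim=200.0, k=10, seed=0):
    """Estimate the k largest Lyapunov exponents of the rate network
    dx/dt = -x + J tanh(x) with the standard Benettin QR method."""
    rng = np.random.default_rng(seed)
    J = g * rng.standard_normal((N, N)) / np.sqrt(N)  # random coupling, gain g
    x = rng.standard_normal(N)                        # network state
    Q = np.linalg.qr(rng.standard_normal((N, k)))[0]  # orthonormal perturbation frame
    log_r = np.zeros(k)
    for _ in range(int(t_sim / dt)):
        r = np.tanh(x)
        # Jacobian of one Euler step at the current state:
        # D = (1 - dt) I + dt * J diag(1 - tanh(x)^2)
        D = (1.0 - dt) * np.eye(N) + dt * J * (1.0 - r**2)
        x = x + dt * (-x + J @ r)            # advance the dynamics
        Q, R = np.linalg.qr(D @ Q)           # evolve and re-orthonormalize the frame
        log_r += np.log(np.abs(np.diag(R)))  # accumulate local stretching rates
    return log_r / t_sim                     # exponents in units of 1/time

if __name__ == "__main__":
    print(lyapunov_spectrum()[:3])  # leading exponents; > 0 signals chaos for g > 1
```

Positive leading exponents diagnose chaos; the full spectrum additionally yields quantities such as the attractor dimension and entropy rate that characterize the network's dynamical repertoire.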
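
In the same spirit, the sketch below illustrates the idea behind event-based simulation: between spikes, a leaky integrate-and-fire neuron's membrane potential evolves analytically, so the simulation can jump from spike to spike instead of taking fixed time steps. This is a toy with assumed parameters (function name, drive, coupling, and thresholds are all illustrative), not the SparseProp algorithm itself, which additionally uses a change of variables and a heap-based data structure to reduce the cost per network spike.

```python
import numpy as np

# Didactic event-based simulation of a sparse inhibitory LIF network:
# tau dV/dt = -V + I_ext, with exact analytic evolution between spikes.
# NOT the SparseProp implementation; a minimal sketch of the principle.
def simulate(N=1000, K=50, J=-1.0, I_ext=2.0, tau=1.0, V_th=1.0, V_re=0.0,
             n_spikes=10_000, seed=0):
    rng = np.random.default_rng(seed)
    # K random postsynaptic targets per neuron (sparse connectivity)
    targets = [rng.choice(N, size=K, replace=False) for _ in range(N)]
    V = rng.uniform(V_re, V_th, size=N)
    t, spikes = 0.0, []
    for _ in range(n_spikes):
        # time for each neuron to reach threshold under constant drive I_ext
        with np.errstate(divide="ignore", invalid="ignore"):
            dt = tau * np.log((I_ext - V) / (I_ext - V_th))
        dt[~np.isfinite(dt)] = np.inf        # neurons that never reach threshold
        j = int(np.argmin(dt))               # next neuron to spike
        t += dt[j]
        V = I_ext + (V - I_ext) * np.exp(-dt[j] / tau)  # exact jump to spike time
        V[j] = V_re                          # reset the spiking neuron
        V[targets[j]] += J / np.sqrt(K)      # deliver sparse synaptic kicks
        spikes.append((t, j))
    return spikes
```

Here each event still costs O(N) because every neuron's next spike time is recomputed from scratch; replacing that scan with a priority queue of precomputed spike times is what makes the event-based approach scale to large networks.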
Quick Links
- 🔬 Research
- 📚 Publications
- 👨‍🏫 Teaching
- 🚀 Open Positions
Selected Publications
- Gradient Flossing: Improving Gradient Descent through Dynamic Control of Jacobians. In Advances in Neural Information Processing Systems, 2023. NeurIPS 2023, Poster.
- SparseProp: Efficient Event-Based Simulation and Training of Sparse Recurrent Spiking Neural Networks. In Advances in Neural Information Processing Systems, 2023. NeurIPS 2023, Poster.