Rainer Engelken
Bridging dynamical systems and neuroscience to reveal principles of learning in neural circuits.
Welcome! We are a computational neuroscience lab working at the interface of dynamical systems, neuroscience, and machine learning. Our mission is to uncover the theoretical principles that govern how neural circuits learn, process information, and compute.
PhD Recruitment: Fall 2026
I am recruiting 1–3 fully funded PhD students to join my lab at UIUC.
Application paths and deadlines:
- ECE (Electrical & Computer Engineering):
  Priority deadline December 15, 2025 · Final deadline January 15, 2026
- CS (Siebel School of Computing and Data Science):
  Deadline December 15, 2025
I particularly encourage applications from students with strong backgrounds in physics, mathematics, computer science, or engineering who enjoy thinking deeply with equations and code.
Details on how to apply and what I'm looking for → Join the Lab
Our research bridges the biophysics of single neurons and the collective behavior of large-scale circuits. Using and developing tools from statistical physics and dynamical systems theory, we investigate how biological and artificial neural networks learn, process information, and maintain stability.
Research Focus Areas
- The Dynamics of Learning: How do neural representations and synaptic weights evolve during learning, and how does network stability enable credit assignment over long time scales?
- The Role of Chaos and Stability in Computation: Characterizing the rich dynamics of recurrent neural networks (RNNs), from sparse chaos in spiking circuits to the full Lyapunov spectra of rate networks. We seek to understand when chaos is beneficial for computation and when it must be tamed for stable learning.
- Building Brain-Inspired, Efficient Algorithms: Developing highly scalable, event-based simulation and training algorithms (like SparseProp) to enable the study of large, brain-inspired systems.
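To make the Lyapunov-spectrum bullet concrete, here is a minimal sketch (not lab code) of estimating the largest Lyapunov exponent of a classic random tanh rate network, dx/dt = -x + g J tanh(x), using the standard Benettin method of evolving and renormalizing a tangent vector. All parameter values (network size N, gain g, step size) are illustrative.

```python
import numpy as np

def largest_lyapunov_exponent(N=200, g=2.0, dt=0.1, steps=5000, seed=0):
    """Estimate the largest Lyapunov exponent of a random tanh rate network
    dx/dt = -x + g * J @ tanh(x), discretized with Euler steps, by evolving
    a tangent vector under the Jacobian and renormalizing at every step."""
    rng = np.random.default_rng(seed)
    J = rng.normal(0.0, 1.0 / np.sqrt(N), (N, N))  # random coupling matrix
    x = rng.normal(0.0, 1.0, N)                    # network state
    v = rng.normal(0.0, 1.0, N)                    # tangent (perturbation) vector
    v /= np.linalg.norm(v)
    log_growth = 0.0
    for _ in range(steps):
        phi = np.tanh(x)
        # Jacobian of the Euler map at the *current* state:
        # (1 - dt) I + dt * g * J @ diag(1 - tanh(x)^2)
        v = (1 - dt) * v + dt * g * (J @ ((1 - phi**2) * v))
        x = x + dt * (-x + g * J @ phi)
        norm = np.linalg.norm(v)
        log_growth += np.log(norm)   # accumulate log of tangent growth
        v /= norm                    # renormalize to avoid overflow
    return log_growth / (steps * dt)
```

In the mean-field theory of such networks (Sompolinsky, Crisanti, and Sommers, 1988), the dynamics are chaotic for gain g > 1 and relax to a stable fixed point for g < 1, so the estimate should come out positive for g = 2 and negative for g = 0.5.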
Quick Links
- Research
- Publications
- Teaching
- Open Positions
- CV
Selected Publications
- Gradient Flossing: Improving Gradient Descent through Dynamic Control of Jacobians. In Advances in Neural Information Processing Systems, 2023. NeurIPS 2023, Poster.
- SparseProp: Efficient Event-Based Simulation and Training of Sparse Recurrent Spiking Neural Networks. In Advances in Neural Information Processing Systems, 2023. NeurIPS 2023, Poster.