Rainer Engelken

Bridging dynamical systems and neuroscience to reveal principles of learning in neural circuits.

Welcome! We are a computational neuroscience lab working at the interface of dynamical systems, neuroscience, and machine learning. Our mission is to uncover the theoretical principles that govern how neural circuits learn, process information, and compute.

Our research spans scales, from the biophysics of single neurons to the collective behavior of large-scale circuits. Using and developing tools from statistical physics and dynamical systems theory, we investigate how neural circuits learn, process information, and maintain stability.

Research Focus Areas

  • The Dynamics of Learning: How do neural representations and synaptic weights evolve during learning, and how does network stability enable credit assignment over long time scales?
  • The Role of Chaos and Stability in Computation: Characterizing the rich dynamics of recurrent neural networks (RNNs), from sparse chaos in spiking circuits to the full Lyapunov spectra of rate networks (see the sketch below this list). We seek to understand when chaos is beneficial for computation and when it must be tamed for stable learning.
  • Building Efficient, Brain-Inspired Algorithms: Developing highly scalable, event-based simulation and training algorithms (such as SparseProp) that make large spiking network models practical to simulate and train.
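
As a concrete illustration of the kind of analysis behind the Lyapunov-spectrum work, here is a minimal sketch (not code from our papers; the network size, gain, step size, and number of exponents are arbitrary placeholder values) that estimates the leading Lyapunov exponents of a random rate network by propagating an orthonormal frame with the network Jacobian and re-orthonormalizing it with QR decompositions:

```python
# Minimal sketch: leading Lyapunov exponents of a random rate RNN via QR reorthonormalization.
# All parameter values below are illustrative placeholders, not values from our publications.
import numpy as np

rng = np.random.default_rng(0)
N, g, dt = 200, 2.0, 0.1          # network size, coupling gain, Euler time step
steps, k = 5000, 20               # number of simulation steps, number of exponents
J = g * rng.standard_normal((N, N)) / np.sqrt(N)   # random recurrent coupling matrix

h = rng.standard_normal(N)                         # network state
Q, _ = np.linalg.qr(rng.standard_normal((N, k)))   # orthonormal frame of k perturbations
lyap_sum = np.zeros(k)

for _ in range(steps):
    r = np.tanh(h)
    # Jacobian of the Euler update h -> h + dt*(-h + J*tanh(h)), evaluated at the current state
    D = (1.0 - dt) * np.eye(N) + dt * J * (1.0 - r**2)
    h = h + dt * (-h + J @ r)                      # advance the network state
    Q, R = np.linalg.qr(D @ Q)                     # push the frame forward and re-orthonormalize
    lyap_sum += np.log(np.abs(np.diag(R)))         # accumulate local expansion rates

lyapunov_exponents = lyap_sum / (steps * dt)       # leading k exponents per unit time
print(lyapunov_exponents[:5])
```

A positive largest exponent signals chaotic dynamics; the full spectrum also gives access to quantities such as attractor dimensionality and dynamical entropy rate.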

Selected Publications

  1. Sparse chaos in cortical circuits
     Rainer Engelken, Michael Monteforte, and Fred Wolf
     2024
  2. Gradient Flossing: Improving Gradient Descent through Dynamic Control of Jacobians
     Rainer Engelken
     In Advances in Neural Information Processing Systems (NeurIPS), 2023 (Poster)
  3. SparseProp: Efficient Event-Based Simulation and Training of Sparse Recurrent Spiking Neural Networks
     Rainer Engelken
     In Advances in Neural Information Processing Systems (NeurIPS), 2023 (Poster)
  4. Lyapunov spectra of chaotic recurrent neural networks
     Rainer Engelken, Fred Wolf, and L. F. Abbott
     Physical Review Research, 2023