Bubbles

Event details

June 26, 2024

Colin Scheibner

Princeton University
Spiking at the Edge: Excitability at interfaces in reaction-diffusion systems

Abstract: Spiking is a general phenomenon, crucial to the firing of neurons, the beating of hearts, and the spread of disease. In homogeneous media, spiking arises from a local competition between amplifying and suppressing forces. But most real-world systems are far from homogeneous. In this talk, I will discuss how inhomogeneities such as interfaces and boundaries that spatially segregate these two forces can promote spiking, even if the system does not spike when the forces are evenly mixed. The underlying mathematics reveals a counterintuitive spiking phase diagram, in which increasing the system size or decreasing the diffusive coupling can give rise to spiking. These insights apply to chemical reactions, predator–prey dynamics, and recent electrophysiology experiments in which localized action potentials were observed at the interface of distinct, nonspiking bioelectric tissues.
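One claim in the abstract lends itself to a toy demonstration: segregating the amplifying and suppressing reactions into separate compartments can switch spiking on as the diffusive coupling between them is decreased. Below is a minimal sketch, assuming a FitzHugh-Nagumo-type excitable model with two diffusively coupled compartments; the parameters, initial conditions, and the 0.5 spike threshold are illustrative choices, not taken from the talk, and this is a cartoon of the coupling trend rather than of the interface mechanism itself.

```python
def simulate(D, eps=0.08, a_amp=0.7, a_sup=1.5, T=400.0, dt=0.01):
    """Two diffusively coupled FitzHugh-Nagumo compartments (Euler steps).

    Compartment A is amplification-dominated (a_amp < 1: it spikes on its
    own); compartment B is suppression-dominated (a_sup > 1: quiescent on
    its own). Evenly mixing the two forces gives an effective
    a = (a_amp + a_sup) / 2 = 1.1 > 1, which is also quiescent, so any
    spiking here comes from the spatial segregation.
    """
    vA, wA, vB, wB = -1.2, -0.6, -1.2, -0.6
    peak = float("-inf")
    for _ in range(int(T / dt)):
        dvA = vA - vA**3 / 3 - wA + D * (vB - vA)  # fast activator, A
        dvB = vB - vB**3 / 3 - wB + D * (vA - vB)  # fast activator, B
        dwA = eps * (vA + a_amp)                   # slow suppressor, A
        dwB = eps * (vB + a_sup)                   # slow suppressor, B
        vA += dt * dvA; vB += dt * dvB
        wA += dt * dwA; wB += dt * dwB
        peak = max(peak, vA)
    return peak

# Decreasing the diffusive coupling D (sharpening the segregation)
# switches the spiking on:
for D in (5.0, 0.1):
    peak = simulate(D)
    status = "spiking" if peak > 0.5 else "quiescent"
    print(f"D={D}: peak activator {peak:+.2f} ({status})")
```

At strong coupling the two compartments behave like the well-mixed, quiescent average; at weak coupling the amplifying side escapes the suppressor's reach and fires.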

Speaker Bio: 

Colin received a BA in physics and mathematics from St. Olaf College in 2017 and completed his PhD in physics at the University of Chicago in 2023. His research typically involves using statistical physics (e.g., hydrodynamics and coarse-graining) and mathematics (e.g., geometry and topology) to understand emergent behavior in soft matter and biology. Colin is currently a postdoc at Princeton University in the Center for the Physics of Biological Function and the Princeton Center for Theoretical Science.

Date: June 26, 2024

Time: 12 pm (Eastern Time)

Via Zoom: join our mailing list to get the link


June 26, 2024

Samuel Dillavou

University of Pennsylvania
Emergent Machine Learning in a Nonlinear Electronic Metamaterial

Abstract: Machine learning methods typically use gradient descent – a centralized, top-down algorithm – to optimally modify every parameter. In this talk I'll discuss our recently realized electronic learning metamaterials, which perform machine learning differently, in an entirely bottom-up manner and without help from a processor. Each element of the system follows local update rules, and global learning emerges from these dynamics – a feature shared with the brain, albeit with different rules and dynamics. I'll discuss the construction and operation of these systems, the breadth of tasks they can accomplish even within a single architecture choice, and their similarities to biological systems, including the brain. Further, I'll show that the system learns complex, nonlinear tasks in a predictable sequence, by lowering polynomial modes of the error; this ordering persists regardless of the relative sizes of the modes. Our system trains in seconds and performs learned tasks in microseconds, dissipating picojoules of energy across each element. Future versions have enormous potential to be faster and more efficient than state-of-the-art machine learning solutions, while providing additional benefits such as robustness to manufacturing defects, much as biological systems can endure damage yet retain functionality.
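For readers unfamiliar with processor-free learning, the locality of the update rule is the key idea, and it is easy to demonstrate in simulation. Below is a minimal Python sketch in the spirit of the coupled-learning schemes these circuits build on: a four-node resistor network is trained so that, with fixed input voltages applied, a chosen output node settles at a target voltage. The topology, task, and parameter values are invented for illustration and are not the speakers' hardware or their exact rule.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy resistor network: 4 nodes, 5 edges whose conductances are the
# learning degrees of freedom (all values here are illustrative).
edges = [(0, 1), (1, 2), (2, 3), (0, 2), (1, 3)]
k = rng.uniform(0.5, 1.5, size=len(edges))

def solve(k, fixed):
    """Node voltages at electrical steady state, with some nodes held at
    fixed voltages (Kirchhoff's laws -> weighted graph Laplacian solve)."""
    n = 4
    L = np.zeros((n, n))
    for (i, j), g in zip(edges, k):
        L[i, i] += g; L[j, j] += g
        L[i, j] -= g; L[j, i] -= g
    V = np.zeros(n)
    fix = sorted(fixed)
    free = [i for i in range(n) if i not in fixed]
    for node, volt in fixed.items():
        V[node] = volt
    V[free] = np.linalg.solve(L[np.ix_(free, free)],
                              -L[np.ix_(free, fix)] @ V[fix])
    return V

# Task: with node 0 driven at 1 V and node 3 grounded, train the free
# voltage at node 1 to settle at 0.3 V.
inputs, out, target = {0: 1.0, 3: 0.0}, 1, 0.3
eta, lr = 0.1, 0.2          # clamping nudge amplitude, learning rate

for step in range(500):
    Vf = solve(k, inputs)                       # free state
    clamp = dict(inputs)
    clamp[out] = Vf[out] + eta * (target - Vf[out])
    Vc = solve(k, clamp)                        # weakly clamped state
    # Local rule: each edge updates from its own two voltage drops only.
    for e, (i, j) in enumerate(edges):
        dP = (Vf[i] - Vf[j]) ** 2 - (Vc[i] - Vc[j]) ** 2
        k[e] = max(k[e] + (lr / eta) * dP, 1e-3)

print(f"trained output: {solve(k, inputs)[out]:.3f}  (target {target})")
```

Each edge's update uses only the voltage drops it measures across itself in the two electrical states; no element ever sees the global error, which is the bottom-up character the abstract emphasizes.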

Speaker Bio: 

Sam Dillavou is a postdoctoral fellow in the Department of Physics and Astronomy at the University of Pennsylvania. He completed his PhD in physics at Harvard, where he studied memory effects in frictional interfaces. He is now interested in the overlap between experimental physics and (machine) learning, and in how the two fields can inform and support each other. This includes building physical systems that perform machine learning tasks (learn) without a processor, studying complex systems such as granular flows that have resisted standard statistical treatment, and using machine learning to make experimental science easier and more accessible.
