MIT PhD student Miranda Schwacke is developing neuromorphic computing devices that mimic the brain’s energy-efficient processing to tackle artificial intelligence’s massive power consumption problem. Her research on electrochemical ionic synapses could help create AI systems that process and store information in the same location, dramatically reducing the energy required for machine learning compared to traditional computing architectures.
Why this matters: Training large AI models consumes enormous amounts of energy, whereas the human brain learns new information far more efficiently, in part because it processes and stores data in the same neural locations.
How it works: Schwacke’s devices replicate brain synapses using materials that can be “tuned” to adjust conductivity, similar to how neurons strengthen or weaken connections.
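To make the idea concrete, here is a minimal conceptual sketch in Python/NumPy, not a model of Schwacke's actual devices: weights are stored as conductances, a "read" is a vector-matrix multiply performed where the weights live, and a "tune" step nudges conductances up or down the way strengthening or weakening a synapse would. The class name, conductance range, and update step are illustrative assumptions.

```python
import numpy as np


class IonicSynapseArray:
    """Toy model of an analog synaptic crossbar.

    Weights live as conductances G (siemens); the multiply-accumulate
    happens where the weights are stored, so no separate memory fetch
    is modeled. Values and update rule are illustrative, not device data.
    """

    def __init__(self, n_inputs, n_outputs, g_min=1e-6, g_max=1e-4, rng=None):
        self.g_min, self.g_max = g_min, g_max
        rng = np.random.default_rng() if rng is None else rng
        # Start each synapse at a random conductance inside its tunable window.
        self.G = rng.uniform(g_min, g_max, size=(n_inputs, n_outputs))

    def forward(self, voltages):
        """Read step: output currents are the sum of V * G along each column,
        i.e. a vector-matrix product computed in place via Ohm's law."""
        return voltages @ self.G

    def tune(self, delta_G):
        """Write step: nudge conductances up or down, standing in for
        electrochemically strengthening or weakening each synapse.
        Clipped to the device's physical conductance window."""
        self.G = np.clip(self.G + delta_G, self.g_min, self.g_max)


# Tiny usage example: one read, then one incremental weight adjustment.
rng = np.random.default_rng(0)
array = IonicSynapseArray(n_inputs=4, n_outputs=3, rng=rng)
v_in = rng.uniform(0.0, 0.2, size=4)               # input voltages (V), illustrative
i_out = array.forward(v_in)                        # output currents (A)
array.tune(rng.normal(0.0, 1e-6, array.G.shape))   # small conductance update
print(i_out)
```

The point of the sketch is the architectural one from the article: the same physical element both stores the weight and performs the computation, so learning does not require shuttling data between separate memory and processor.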
The technical challenge: Schwacke is bridging two distinct scientific fields—electrochemistry and semiconductor physics—to create these brain-inspired devices.
What her advisor says: “This is electrochemistry for brain-inspired computing,” explains Bilge Yildiz, Schwacke’s advisor and the Breene M. Kerr Professor at MIT.
Recognition and impact: Schwacke received MathWorks Fellowships from MIT’s School of Engineering in both 2023 and 2024, recognizing her use of MATLAB for critical data analysis and visualization in her research.
Beyond the lab: Schwacke actively engages in science communication through Kitchen Matters, a graduate student group that explains scientific concepts using food and cooking.