Key research themes
1. How can neuromorphic computing hardware architectures balance brain-inspired efficiency with flexibility for diverse neural network implementations?
This research theme explores design principles for neuromorphic hardware that emulates biological neural efficiency, notably by co-locating memory and computation and by using spike-based, event-driven processing, while retaining enough programmability to support varied neuron models, learning algorithms, and connectivity schemes. Resolving the tension between architectural specialization for energy efficiency and computational flexibility is critical if neuromorphic processors are to run complex, multi-model neural applications in practice.
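The event-driven processing mentioned above can be illustrated with a minimal sketch of a leaky integrate-and-fire (LIF) neuron that updates its state only when an input spike arrives, rather than on a fixed clock. All parameters here (time constant, threshold) are illustrative, not taken from any particular hardware platform.

```python
import math
from dataclasses import dataclass

@dataclass
class LIFNeuron:
    """Minimal leaky integrate-and-fire neuron (illustrative parameters)."""
    tau: float = 20.0       # membrane time constant (ms)
    v_thresh: float = 1.0   # spike threshold
    v: float = 0.0          # membrane potential
    t_last: float = 0.0     # time of last update (ms)

    def receive(self, t: float, weight: float) -> bool:
        """Process one input spike event at time t; return True if the neuron fires."""
        # Event-driven update: decay the potential only when an event arrives.
        # This sparse, activity-proportional computation is what gives
        # spiking hardware its energy advantage over clocked dense updates.
        self.v *= math.exp(-(t - self.t_last) / self.tau)
        self.t_last = t
        self.v += weight
        if self.v >= self.v_thresh:
            self.v = 0.0    # reset after spiking
            return True
        return False

neuron = LIFNeuron()
events = [(1.0, 0.6), (2.0, 0.6)]   # (time in ms, synaptic weight)
spikes = [t for t, w in events if neuron.receive(t, w)]
```

A single 0.6-weight input stays below threshold; two inputs arriving close together push the decayed potential past it, so only the second event elicits a spike.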
2. What strategies enable robust and efficient learning on analog and mixed-signal neuromorphic devices with inherent device variability and noise?
Analog and mixed-signal neuromorphic systems promise ultra-low-power, real-time spike-based processing by emulating neurobiological mechanisms at the circuit level. However, inherent device mismatch, stochasticity, and manufacturing variations introduce noise and variability, posing challenges for reliable and precise computation. This theme focuses on understanding, modeling, and mitigating the effects of such non-idealities through brain-inspired population coding, balanced network architectures, surrogate gradient training, and self-calibrating learning algorithms. Progress here is crucial for hardware implementations capable of real-world online learning and generalization.
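Surrogate gradient training, one of the mitigation strategies named above, replaces the non-differentiable spike function with a smooth stand-in during the backward pass. A minimal sketch, assuming a fast-sigmoid-style surrogate (the sharpness parameter `beta` and threshold are illustrative):

```python
import numpy as np

def spike_forward(v, threshold=1.0):
    """Forward pass: non-differentiable Heaviside spike function."""
    return (v >= threshold).astype(v.dtype)

def spike_surrogate_grad(v, threshold=1.0, beta=10.0):
    """Backward pass: a fast-sigmoid-style surrogate derivative.
    It is largest near the threshold, so gradients flow through neurons
    that were close to spiking, even though the forward step function
    has zero derivative almost everywhere."""
    return 1.0 / (beta * np.abs(v - threshold) + 1.0) ** 2

v = np.array([0.2, 0.9, 1.1])       # membrane potentials
spikes = spike_forward(v)           # binary spike outputs
grads = spike_surrogate_grad(v)     # surrogate gradients for training
```

In a full training loop the surrogate is substituted only in backpropagation; the forward pass keeps the exact binary spikes, which is what lets trained networks tolerate the device-level discretization.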
3. How can neuromorphic computing devices emulate biological synaptic functions to advance brain-inspired learning and memory capabilities?
This theme investigates the physical implementation of biological synapses using memristors, resistive switching devices, and analog VLSI circuits that emulate plasticity mechanisms such as spike-timing-dependent plasticity (STDP) and synaptic weight modulation. It examines device-level innovations, analog circuit designs that exploit device mismatch constructively, and the integration of synaptic dynamics into scalable hardware architectures. Advances in this area are critical for enabling real-time adaptive learning on neuromorphic platforms and for bridging the gap between artificial neural models and biological neural computation.
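The STDP rule referenced above can be sketched as a pair-based weight update: the synapse is potentiated when the presynaptic spike precedes the postsynaptic one and depressed otherwise, with exponentially decaying magnitude in the spike-timing difference. The learning rates and time constants below are illustrative placeholders, not values from any specific device.

```python
import math

def stdp_update(w, t_pre, t_post, a_plus=0.01, a_minus=0.012,
                tau_plus=20.0, tau_minus=20.0, w_min=0.0, w_max=1.0):
    """Pair-based STDP rule (illustrative constants).

    dt > 0 (pre before post): potentiation (LTP), decaying with dt.
    dt < 0 (post before pre): depression (LTD), decaying with |dt|.
    The weight is clipped to [w_min, w_max], mirroring the bounded
    conductance range of memristive synaptic devices.
    """
    dt = t_post - t_pre
    if dt > 0:
        w += a_plus * math.exp(-dt / tau_plus)
    elif dt < 0:
        w -= a_minus * math.exp(dt / tau_minus)
    return min(max(w, w_min), w_max)

w_ltp = stdp_update(0.5, t_pre=10.0, t_post=15.0)  # pre leads post: weight grows
w_ltd = stdp_update(0.5, t_pre=15.0, t_post=10.0)  # post leads pre: weight shrinks
```

In memristive hardware this update is not computed explicitly; overlapping pre- and post-synaptic voltage pulses modulate the device conductance so that it approximates the same timing dependence.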