Sparse Spatiotemporal Codes
Parallel Computing in 3D
Retinomorphic Vision Sensors
Scaling & Programming Neuromorphic Systems
“Today’s deep learning solutions scale poorly. Emitting 10,000-fold more carbon to train a deep net only doubles its performance on benchmark tasks. The next doubling in performance will emit as much CO2 as New York City does in a month. In contrast, the brain learns extremely efficiently. A child masters language by the age of 6, having heard at most 65 million words. That’s 15,000-fold less than the trillion words used to train GPT-3. Equivalently, a child that learns language at the same rate as GPT-3 would be 90,000 years old before it could converse fluently. By reverse-engineering how the brain uses so little data to learn, we hope to invent solutions that enable a sustainable technological future.”
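The arithmetic behind these comparisons is simple to verify. Below is a quick back-of-the-envelope check in Python, using only the figures quoted above (65 million words heard by age 6; roughly one trillion training words for GPT-3); the variable names and rounding are illustrative, not from the original statement.

```python
# Back-of-the-envelope check of the data-efficiency figures quoted above.
# All input numbers come from the quote itself; everything else is ratio math.

CHILD_WORDS = 65e6        # words a child hears by age 6 (from the quote)
GPT3_WORDS = 1e12         # ~1 trillion words used to train GPT-3 (from the quote)
CHILD_AGE_YEARS = 6

# How many times more data GPT-3 consumed than the child
data_ratio = GPT3_WORDS / CHILD_WORDS
print(f"GPT-3 trained on ~{data_ratio:,.0f}x more words than a 6-year-old has heard")
# -> ~15,385x, the "15,000-fold" in the quote

# Age a child would reach if it needed GPT-3's word count to master language
equivalent_age = CHILD_AGE_YEARS * data_ratio
print(f"At GPT-3's data rate, fluency would take ~{equivalent_age:,.0f} years")
# -> ~92,308 years, the "90,000 years" in the quote
```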
Y Shi, N A Steinmetz, T Moore, K Boahen and T A Engel, Cortical state dynamics and selective attention define the spatial pattern of correlated variability in neocortex, Nature Communications, vol 13, no 44, 2022. PDF
B Benjamin, N A Steinmetz, N N Oza, J J Aguayo and K Boahen, Neurogrid simulates cortical cell-types, active dendrites, and top-down attention, Neuromorphic Computing and Engineering, 2021. PDF