Boahen Lab

# people
5
Area of Focus 1

Sparse Spatiotemporal Codes

Area of Focus 2

Parallel Computing in 3D

Area of Focus 3

Retinomorphic Vision Sensors

Area of Focus 4

Scaling & Programming Neuromorphic Systems

Commercial Readiness?
Needs work
Description

“Today’s deep learning solutions scale poorly. Emitting 10,000-fold more carbon to train a deep net only doubles its performance on benchmark tasks. The next doubling in performance will emit as much CO2 as New York City does in a month. In contrast, the brain learns extremely efficiently. A child masters language by the age of 6, having heard at most 65 million words. That’s 15,000-fold less than the trillion words used to train GPT-3. Equivalently, a child that learns language at the same rate as GPT-3 would be 90,000 years old before it could converse fluently. By reverse-engineering how the brain uses so little data to learn, we hope to invent solutions that enable a sustainable technological future.”
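The data-efficiency figures quoted above can be verified with a quick back-of-the-envelope calculation. This sketch uses only the numbers stated in the description (a trillion training words for GPT-3, 65 million words heard by age 6):

```python
# Back-of-the-envelope check of the figures in the description above.
gpt3_words = 1e12    # ~1 trillion words used to train GPT-3 (as stated)
child_words = 65e6   # ~65 million words heard by a child by age 6 (as stated)

# Data-efficiency ratio: how many times more text GPT-3 consumes.
ratio = gpt3_words / child_words
print(f"GPT-3 uses ~{ratio:,.0f}x more words")  # ~15,385, i.e. "15,000-fold"

# Age of a child who hears words at the same 6-year rate but must
# accumulate GPT-3's full training corpus before conversing fluently.
years = 6 * ratio
print(f"Equivalent age: ~{years:,.0f} years")   # ~92,308, i.e. "90,000 years"
```

Both quoted figures (15,000-fold and 90,000 years) are round-number versions of these ratios.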

Institution
Stanford University
New/Existing Tech
Recent Publication 1

Y Shi, N A Steinmetz, T Moore, K Boahen, and T A Engel, “Cortical state dynamics and selective attention define the spatial pattern of correlated variability in neocortex,” Nature Communications, vol 13, no 44, 2022.

Recent Publication 2

B Benjamin, N A Steinmetz, N N Oza, J J Aguayo, and K Boahen, “Neurogrid simulates cortical cell-types, active dendrites, and top-down attention,” Neuromorphic Computing and Engineering, 2021.

Techs used
neuromorphic software
Treatments
AI