100x Improvements in Deep Learning Performance with Sparsity, w/ Subutai Ahmad - #562
Publication Date: Mar 07, 2022
Episode Duration: 00:50:57
Today we’re joined by Subutai Ahmad, VP of research at Numenta. While we’ve had numerous conversations about the biological inspirations of deep learning models with folks working at the intersection of deep learning and neuroscience, we dig into uncharted territory with Subutai. We set the stage by exploring some of the fundamental ideas behind Numenta’s research and the present landscape of neuroscience, before turning to our first big topic of the podcast: the cortical column. Cortical columns are groups of neurons in the cortex of the brain that have nearly identical receptive fields; we discuss the behavior of these columns, why they’re a structure worth mimicking computationally, how far along we are in understanding the cortical column, and how these columns relate to neurons.
We also discuss what it means for a model to have inherent 3D understanding and for computational models to be inherently sensorimotor, and where we are with these lines of research. Finally, we dig into our other big idea, sparsity. We explore the fundamental ideas behind sparsity, the differences between sparse and dense networks, and how sparsity and optimization can drive greater efficiency in current deep learning networks, including transformers and other large language models.
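To make the sparse-versus-dense contrast concrete, here is a minimal NumPy sketch of magnitude-based weight pruning: a dense layer where every weight participates, and a sparse version where only the top 5% of weights by magnitude are kept. This is a generic illustration of sparsity, not Numenta's specific method; the 5% keep-ratio and layer sizes are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# A dense linear layer: all 128x128 weights participate in the matmul.
x = rng.standard_normal(128)
w_dense = rng.standard_normal((128, 128))

# A sparse version: keep only the top 5% of weights by magnitude
# (simple magnitude pruning; real sparse-training schemes are more elaborate).
k = int(0.05 * w_dense.size)
threshold = np.sort(np.abs(w_dense), axis=None)[-k]
w_sparse = w_dense * (np.abs(w_dense) >= threshold)

y_dense = w_dense @ x
y_sparse = w_sparse @ x

# The sparse layer performs ~5% of the multiplies (with suitable kernels),
# which is the source of the potential efficiency gains discussed here.
print(f"nonzero weights: dense={np.count_nonzero(w_dense)}, "
      f"sparse={np.count_nonzero(w_sparse)}")
```

In practice the speedup depends on hardware and kernels that can actually skip the zeroed weights, which is part of what makes exploiting sparsity in deep learning an engineering challenge as well as an algorithmic one.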
The complete show notes for this episode can be found at
twimlai.com/go/562.