BI 139 Marc Howard: Compressed Time and Memory
Podcast | Brain Inspired
Publisher | Paul Middlebrooks
Media Type | audio
Categories Via RSS | Education, Natural Sciences, Science, Technology
Publication Date | Jun 20, 2022
Episode Duration | 01:20:11

Check out my free video series about what's missing in AI and Neuroscience

Support the show to get full episodes and join the Discord community.

Marc Howard runs his Theoretical Cognitive Neuroscience Lab at Boston University, where he develops mathematical models of cognition constrained by psychological and neural data. In this episode, we discuss the idea that the Laplace transform and its inverse may serve as a unified framework for memory. In short, our memories are compressed on a continuous log scale: as memories get older, their representations "spread out" in time. This kind of representation turns out to be ubiquitous in the brain, suggesting it is likely a canonical computation the brain uses across a wide variety of cognitive functions. We also discuss some of the ways Marc is incorporating this mathematical operation into deep learning networks to improve their ability to handle information at different time scales.
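As a rough illustration of the compression idea described above, here is a minimal sketch in Python (NumPy only; all parameter choices and names are my own assumptions, not taken from the episode or from Marc Howard's code). A bank of leaky integrators holds a running Laplace transform of the input, and a Post-style approximate inverse reads it out as activity over log-spaced past times, so older events are represented with coarser resolution:

import numpy as np
from math import factorial

# Log-spaced "remembered times" tau* and paired decay rates s = k / tau*.
k = 4                                  # order of the Post-style inverse approximation
tau_star = np.logspace(0, 2, 50)       # 1 to 100 time steps into the past
s = k / tau_star
dt = 1.0

F = np.zeros_like(tau_star)            # Laplace-domain memory state F(s)

def step(F, f_t):
    # Exact decay of dF/dt = -s*F + f(t) over one time step.
    return F * np.exp(-s * dt) + f_t * dt

def inverse(F):
    # Approximate inverse Laplace transform (Post's formula with finite k):
    # f_tilde(tau*) = (-1)^k / k! * s^(k+1) * d^k F / ds^k,
    # with the k-th derivative roughly estimated by repeated finite differences.
    dF = F.copy()
    for _ in range(k):
        dF = np.gradient(dF, s)
    return ((-1) ** k) * s ** (k + 1) * dF / factorial(k)

# Drive the memory with a single impulse, then let about 60 steps pass.
for t in range(60):
    F = step(F, 1.0 if t == 0 else 0.0)

f_tilde = inverse(F)
# f_tilde shows a broad bump around the elapsed delay (~60 steps here); the bump
# is wider, in absolute time, than it would be for a recent event -- the
# "spreading out" of older memories described above.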

0:00 - Intro
4:57 - Main idea: Laplace transforms
12:00 - Time cells
20:08 - Laplace, compression, and time cells
25:34 - Everywhere in the brain
29:28 - Episodic memory
35:11 - Randy Gallistel's memory idea
40:37 - Adding Laplace to deep nets
48:04 - Reinforcement learning
1:00:52 - Brad Wyble Q: What gets filtered out?
1:05:38 - Replay and complementary learning systems
1:11:52 - Howard Goldowsky Q: Gyorgy Buzsaki
1:15:10 - Obstacles
