BI 065 Thomas Serre: How Recurrence Helps Vision
Podcast | Brain Inspired
Publisher | Paul Middlebrooks
Media Type | Audio
Categories Via RSS | Education, Natural Sciences, Science, Technology
Publication Date | Apr 05, 2020
Episode Duration | 01:40:13

Thomas and I discuss the role of recurrence in visual cognition: how brains excel with so few “layers” compared to deep nets; how feedback recurrence can underlie visual reasoning; how LSTM gate-like processing could explain the function of canonical cortical microcircuits; and the current limitations of deep learning networks, such as adversarial examples. We also cover a bit of the history of modeling our hierarchical visual system, including his work on the HMAX model and his interactions with the deep learning community as convolutional neural networks were being developed.

Show Notes:

Visit the Serre Lab website.
Follow Thomas on Twitter: @tserre.
Good reviews that reference all the work we discussed, including the HMAX model:
- Beyond the feedforward sweep: feedback computations in the visual cortex.
- Deep learning: the good, the bad and the ugly.
Papers about the topics we discuss:
- Complementary Surrounds Explain Diverse Contextual Phenomena Across Visual Modalities.
- Recurrent neural circuits for contour detection.
- Learning long-range spatial dependencies with horizontal gated-recurrent units.

