026 – Gordon Wilson
Publisher | Loup Ventures
Media Type | audio
Categories Via RSS | Tech News, Technology
Publication Date | Nov 03, 2019
Episode Duration | 00:35:08

Gordon Wilson is the CEO of Rain Neuromorphics, a company developing neuromorphic computer chips to enable brain-like artificial intelligence. Gordon holds a B.S. in Statistics and Mathematics from the University of Florida.

Top 3 Takeaways

  1. Training deep learning algorithms is expensive.
  2. To understand the brain, you need to build one.
  3. Modern computing hardware doesn’t have the parallelism and energy efficiency of the brain.

Show Notes

  • [1:12] Building a processor for brain math.
  • [2:40] The cost of artificial neural networks.
  • [3:36] What is “brain-inspired hardware”?
  • [4:50] Nanowires and memristors.
  • [6:25] Cross-disciplinary chip design.
  • [7:30] Size of the brain vs. size of artificial neural networks.
  • [9:05] Research vs. development.
  • [12:00] Bridging brain science and AI.
  • [13:54] Neuromorphics vs. GPUs.
  • [18:00] Chips on the market.
  • [20:40] Go-to-market: matrix multiplication.
  • [22:22] Cost and energy of Rain’s hardware.
  • [23:33] Does chip design impact software development?
  • [24:08] Fusing training and inference.
  • [26:26] Wide learning vs. deep learning.
  • [29:30] Sparse learning.
  • [32:10] Gordon’s book recommendations.

Selected Links

  • A talk by Gordon
  • An article about Rain’s technology
  • OpenAI’s blog
  • On Intelligence, by Jeff Hawkins
  • WaitButWhy, a blog by Tim Urban

Related Podcasts

  • 002 – Jeff Hawkins
  • 020 – Mary Beth Henderson
  • 024 – Brian Pepin
