KL Divergence
Media Type | audio
Categories Via RSS | Technology
Publication Date | Aug 07, 2017
Episode Duration | 00:25:38
Kullback-Leibler divergence, or KL divergence, measures the information lost when you approximate one distribution with another. It comes to us originally from information theory, but today it underpins other, more machine-learning-focused algorithms like t-SNE. And boy oh boy can it be tough to explain. But we're trying our hardest in this episode!
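
For the discrete case, KL divergence works out to D_KL(P || Q) = sum over x of p(x) * log(p(x) / q(x)). Below is a minimal sketch of that computation in Python with NumPy; the function name and example distributions are illustrative, not taken from the episode.

import numpy as np

def kl_divergence(p, q):
    """Discrete KL divergence D_KL(P || Q), in nats: the expected extra
    information cost of using Q as an approximation of the true P."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    # Only outcomes with p(x) > 0 contribute; p * log(p/q) -> 0 as p -> 0.
    mask = p > 0
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))

# Example: approximating a skewed distribution with a uniform one.
p = [0.7, 0.2, 0.1]
q = [1/3, 1/3, 1/3]
print(kl_divergence(p, q))  # positive; it is 0 only when p and q match

Note that the measure is asymmetric: kl_divergence(p, q) generally differs from kl_divergence(q, p), which is one reason it is a "divergence" rather than a true distance.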
