[MINI] The Vanishing Gradient
From Data Skeptic
Length:
15 minutes
Released:
Jun 30, 2017
Format:
Podcast episode
Description
This episode discusses the vanishing gradient: a problem that arises when training deep neural networks, in which nearly all of the gradients are very close to zero by the time back-propagation reaches the first hidden layer. This makes learning virtually impossible without some clever trick or improved methodology to help the earlier layers begin to learn.
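The mechanism is easy to see numerically. Below is a minimal Python sketch (not from the episode; the layer count, width, and weight initialization are arbitrary assumptions) that pushes a unit gradient backward through a stack of sigmoid layers and prints its shrinking norm at each step.

```python
import numpy as np

# Illustrative sketch of the vanishing gradient: depth, width, and the
# weight scale below are arbitrary assumptions chosen for demonstration.
rng = np.random.default_rng(0)
n_layers, width = 20, 64

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Forward pass: record each layer's activation so we can form the local
# derivative sigmoid'(z) = s * (1 - s) during back-propagation.
weights = [rng.normal(size=(width, width)) / np.sqrt(width)
           for _ in range(n_layers)]
h = rng.normal(size=width)
activations = []
for W in weights:
    h = sigmoid(W @ h)
    activations.append(h)

# Backward pass: start with a unit gradient at the output and apply the
# chain rule layer by layer. Each step multiplies by sigmoid'(z) <= 0.25,
# so the gradient norm tends to decay geometrically with depth.
grad = np.ones(width)
for W, s in zip(reversed(weights), reversed(activations)):
    grad = W.T @ (grad * s * (1.0 - s))
    print(f"gradient norm: {np.linalg.norm(grad):.3e}")
```

Because the sigmoid's derivative never exceeds 0.25, each layer scales the gradient by a factor typically well below one, which is why the earliest layers receive almost no learning signal; this is the problem the "clever tricks" mentioned above (for example, different activations, careful initialization, or architectural changes) are meant to address.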
Titles in the series (100)
[MINI] Experimental Design, by Data Skeptic: This episode loosely explores the topic of experimental design, including hypothesis testing, the importance of statistical tests, and an everyday and business example.