[MINI] Backpropagation
From Data Skeptic
Length:
15 minutes
Released:
Apr 7, 2017
Format:
Podcast episode
Description
Backpropagation is a common algorithm for training a neural network. It works by computing the gradient of the overall error with respect to each weight, and using stochastic gradient descent to iteratively fine-tune the weights of the network. In this episode, we compare this concept to finding a location on a map, marble maze games, and golf.
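The idea in the description can be sketched in a few lines. This is a minimal illustration (not code from the episode), assuming a single sigmoid neuron trained on the logical OR function: for each example we compute the gradient of the squared error with respect to each weight via the chain rule, then take a small step downhill, like a marble rolling toward the lowest point of the maze.

```python
import math
import random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Toy dataset: logical OR (hypothetical example, chosen for illustration).
data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]

random.seed(0)
w = [random.uniform(-1, 1), random.uniform(-1, 1)]
b = 0.0
lr = 0.5  # learning rate: the size of each downhill step

for epoch in range(2000):
    for (x1, x2), target in data:
        # Forward pass: compute the neuron's output.
        y = sigmoid(w[0] * x1 + w[1] * x2 + b)
        # Backward pass: chain rule for squared error E = (y - t)^2 / 2
        # gives dE/dw_i = (y - t) * y * (1 - y) * x_i.
        grad = (y - target) * y * (1 - y)
        # Stochastic gradient descent: step each weight downhill.
        w[0] -= lr * grad * x1
        w[1] -= lr * grad * x2
        b -= lr * grad

predictions = [round(sigmoid(w[0] * x1 + w[1] * x2 + b))
               for (x1, x2), _ in data]
print(predictions)  # should match the OR targets [0, 1, 1, 1]
```

A full network applies the same chain-rule step layer by layer, propagating the error gradient backward from the output; this single-neuron version keeps only the core update.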
Titles in the series (100)
[MINI] Experimental Design: This episode loosely explores the topic of experimental design, including hypothesis testing, the importance of statistical tests, and everyday and business examples. by Data Skeptic