PLAY PODCASTS
Make Stochastic Gradient Descent Fast Again (Ep. 113)

Data Science at Home

July 22, 2020 · 20m 35s


Show Notes

There is definitely room for improvement in the family of stochastic gradient descent algorithms. In this episode I explain a relatively simple method that has been shown to improve on the Adam optimizer. But watch out: this approach does not generalize well.
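As background for the discussion, here is a minimal sketch contrasting the plain SGD update with Adam's moving-average update. The toy one-dimensional quadratic loss and all hyperparameter values are illustrative assumptions; the specific improvement discussed in the episode is not shown here.

```python
# Textbook update rules for SGD and Adam on a toy loss f(w) = (w - 3)^2.
# This is generic background, not the method from the episode.
import math

def grad(w):
    # Gradient of the toy loss f(w) = (w - 3)^2, minimized at w = 3.
    return 2.0 * (w - 3.0)

def sgd(w, steps=200, lr=0.1):
    # Vanilla SGD: w <- w - lr * g
    for _ in range(steps):
        w -= lr * grad(w)
    return w

def adam(w, steps=200, lr=0.1, b1=0.9, b2=0.999, eps=1e-8):
    # Adam keeps exponential moving averages of the gradient (m)
    # and of its square (v), with bias correction for early steps.
    m = v = 0.0
    for t in range(1, steps + 1):
        g = grad(w)
        m = b1 * m + (1 - b1) * g
        v = b2 * v + (1 - b2) * g * g
        m_hat = m / (1 - b1 ** t)
        v_hat = v / (1 - b2 ** t)
        w -= lr * m_hat / (math.sqrt(v_hat) + eps)
    return w

print(sgd(0.0), adam(0.0))  # both should end up near the minimum at 3.0
```

Note that Adam's effective step size is roughly bounded by the learning rate regardless of gradient scale, which is what the moving-average normalization buys you over plain SGD.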

Join our Discord channel and chat with us.


References