The AI Concepts Podcast
Sheetal ’Shay’ Dhar
38 episodes · latest episode 3 months ago
Categories: Technology, Education, Courses, Science
Deep Learning Series: Advanced Optimizers - SGD and SGDM
4 minutes 40 seconds · 6 months ago
Welcome to the AI Concepts Podcast, where host Shay unravels the intricate world of AI through relatable examples and easy-to-understand analogies. In this episode, we continue our deep learning series by addressing the challenges and solutions of gradient descent. Learn how traditional gradient descent, the workhorse of neural network training, sometimes falls short because of its slow speed and susceptibility to getting stuck. Explore enhancements like Stochastic Gradient Descent, which speeds up training by estimating the gradient from random data subsets, and discover how momentum helps overcome noisy gradients. Then meet Adagrad, an adaptive optimizer that adjusts each parameter's learning rate based on its update history, keeping learning efficient even with sparse data, though watch out for its tendency to become overly cautious over time. The episode closes by laying the groundwork for future discussions of advanced optimizers like RMSprop and Adam, along with the crucial art of hyperparameter tuning.
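For reference, the three update rules the episode covers can be stated compactly in code. This is a minimal NumPy sketch, not code from the episode; the function names, hyperparameter defaults, and toy example are illustrative.

```python
import numpy as np

def sgd_step(w, grad, lr=0.01):
    # Vanilla SGD: step directly down the (mini-batch) gradient.
    return w - lr * grad

def sgdm_step(w, grad, velocity, lr=0.01, beta=0.9):
    # SGD with momentum: an exponential moving average of past
    # gradients damps noise and accelerates consistent directions.
    velocity = beta * velocity + grad
    return w - lr * velocity, velocity

def adagrad_step(w, grad, accum, lr=0.01, eps=1e-8):
    # Adagrad: accumulate squared gradients per parameter and shrink
    # the effective step where gradients have been large. The
    # accumulator only grows, which is why Adagrad becomes overly
    # cautious over long runs, as noted in the episode.
    accum = accum + grad ** 2
    return w - lr * grad / (np.sqrt(accum) + eps), accum

# Toy usage: minimize f(w) = (w - 3)^2, whose gradient is 2 * (w - 3).
w, v = np.array([0.0]), np.array([0.0])
for _ in range(100):
    grad = 2 * (w - 3)
    w, v = sgdm_step(w, grad, v, lr=0.1)
print(w)  # approaches 3.0
```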