Simulated Annealing x SGD x Mini-batch | Machine Learning w TensorFlow & scikit-learn #9

📚About
This lecture is dedicated to variations of the gradient descent algorithm. We cover Stochastic and Mini-batch Gradient Descent, along with Simulated Annealing for the step size. Python implementations are done in Jupyter.

⏲Outline⏲
00:00 Introduction
00:26 Simulated Annealing
02:42 Stochastic Gradient Descent with variable step size
10:12 Mini-batch Gradient Descent
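The techniques named in the outline can be combined in a few lines of Python: mini-batch gradient descent whose learning rate decays over time, in the spirit of simulated annealing. This is a minimal sketch, not the lecture's own notebook; the schedule `lr0 / (1 + decay * step)` and the names `lr0`, `decay`, and `batch_size` are illustrative choices.

```python
import random

# Minimal sketch (not the lecture's code): fit the slope of y = 3x with
# mini-batch gradient descent and an annealed (decaying) step size.
random.seed(0)
xs = [random.uniform(-1, 1) for _ in range(200)]
ys = [3.0 * x for x in xs]            # true slope is 3.0

w = 0.0                               # parameter to learn
lr0, decay, batch_size = 0.5, 0.01, 16  # illustrative hyperparameters

for step in range(500):
    # simulated-annealing-style schedule: large steps early, small steps late
    lr = lr0 / (1.0 + decay * step)
    batch = random.sample(range(len(xs)), batch_size)
    # gradient of the mean squared error over the mini-batch only
    grad = sum(2 * (w * xs[i] - ys[i]) * xs[i] for i in batch) / batch_size
    w -= lr * grad

print(w)  # converges toward the true slope 3.0
```

With `batch_size = 1` this reduces to plain stochastic gradient descent, and with `batch_size = len(xs)` to full-batch gradient descent; the decaying step size damps the noise that small batches introduce near the optimum.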