W8_L6: Gradient descent & Taylor series
Author: IIT Madras - B.S. Degree Programme
Uploaded: 2025-01-30
Views: 14117
Description:
Welcome to Week 8 Lecture 6 of the course "Machine Learning Foundations" by Profs. Harish Guruprasad Ramaswamy, Arun Rajkumar, and Prashanth LA.
Full Course: https://study.iitm.ac.in/ds/course_pa...
Video Overview
This lecture explains the intuition behind using the negative gradient direction in the gradient descent optimization algorithm. Using a Taylor series expansion, we see why moving in the direction of the negative derivative decreases the function value, provided the step size is small. We then ask and answer an important question: what is so special about -f'(x_t)? What makes the negative gradient direction unique in ensuring descent of the objective function? This session bridges calculus intuition with algorithmic understanding.
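A minimal sketch of the Taylor-series argument (the step-size symbol \eta and the iterate notation below are standard conventions, not quoted from the lecture): for the update x_{t+1} = x_t - \eta f'(x_t) with a small step size \eta > 0,

f(x_{t+1}) \approx f(x_t) - \eta \, f'(x_t)^2 \le f(x_t),

since f'(x_t)^2 \ge 0 and the first-order approximation is accurate for small \eta. More generally, for any direction d of the same magnitude, the first-order change f(x_t + \eta d) - f(x_t) \approx \eta \, f'(x_t) \, d is most negative when d points opposite f'(x_t), which is what makes the negative gradient direction special.

A hypothetical one-dimensional demo (my own illustration, not code from the course) showing that the function value decreases at every step when the step size is small:

    # Minimize f(x) = x^2 with gradient descent; here f'(x) = 2x.
    def f(x):
        return x * x

    def f_prime(x):
        return 2 * x

    x, eta = 5.0, 0.1              # illustrative starting point and step size
    for t in range(10):
        x = x - eta * f_prime(x)   # move opposite the derivative
        print(t, x, f(x))          # f(x) shrinks toward the minimum at 0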
About IIT Madras' online Bachelor of Science programme
IIT Madras offers four-year BS programmes that aim to provide quality education to all, irrespective of age, educational background, or location. The BS programme is structured in multiple levels, giving students the flexibility to exit at any level. Depending on the courses completed and credits earned, the learner can receive a Foundation Certificate from IITM CODE (Centre for Outreach and Digital Education), Diploma(s) from IIT Madras, or a BSc/BS Degree from IIT Madras.
For more details, visit: https://www.iitm.ac.in/academics/stud...
#optimization #gradientdescent #taylorseries #algorithm #machinelearning #deeplearning #calculus #derivatives #unconstrainedoptimization #stepsize #learningrate #functionminimization #descentdirection #negativegradient #intuition #mathematics #mlfoundations