Lecture 4 Part 2: Nonlinear Root Finding, Optimization, and Adjoint Gradient Methods
Author: MIT OpenCourseWare
Uploaded: 2023-10-23
Views: 7541
Description:
MIT 18.S096 Matrix Calculus For Machine Learning And Beyond, IAP 2023
Instructors: Alan Edelman, Steven G. Johnson
View the complete course: https://ocw.mit.edu/courses/18-s096-m...
YouTube Playlist: • MIT 18.S096 Matrix Calculus For Machine Le...
Description: Nonlinear root finding by Newton’s method and optimization by gradient descent. “Adjoint” methods (reverse-mode differentiation/backpropagation) let us compute gradients efficiently for large-scale engineering optimization.
License: Creative Commons BY-NC-SA
More information at https://ocw.mit.edu/terms
More courses at https://ocw.mit.edu
Support OCW at http://ow.ly/a1If50zVRlQ
We encourage constructive comments and discussion on OCW’s YouTube and other social media channels. Personal attacks, hate speech, trolling, and inappropriate comments are not allowed and may be removed. More details at https://ocw.mit.edu/comments.
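As a rough illustration of the two iterations named in the description, here is a minimal Python sketch (not code from the lecture itself): Newton's method applied to a scalar root-finding problem, and plain gradient descent on a simple quadratic. The functions, step size, and iteration counts are illustrative choices, not values from the course.

```python
import math

def newton(f, fprime, x0, tol=1e-12, max_iter=50):
    """Newton's method for a scalar root: x_{k+1} = x_k - f(x_k)/f'(x_k)."""
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            break
        x -= fx / fprime(x)
    return x

def gradient_descent(grad, x0, lr=0.1, steps=200):
    """Plain gradient descent: x_{k+1} = x_k - lr * grad(x_k)."""
    x = x0
    for _ in range(steps):
        x = x - lr * grad(x)
    return x

# Root of f(x) = x^2 - 2, i.e. sqrt(2):
root = newton(lambda x: x * x - 2, lambda x: 2 * x, x0=1.0)

# Minimize g(x) = (x - 3)^2, whose gradient is 2*(x - 3):
xmin = gradient_descent(lambda x: 2 * (x - 3), x0=0.0)
```

Newton's method converges quadratically near a simple root, while gradient descent contracts the error by a constant factor per step; the lecture's adjoint/reverse-mode material concerns computing `grad` itself efficiently when `x` is high-dimensional.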