9. Lagrangian Duality and Convex Optimization
Author: Inside Bloomberg
Uploaded: 2018-07-11
Views: 72218
Description:
We introduce the basics of convex optimization and Lagrangian duality. We discuss weak and strong duality and Slater's constraint qualification, and we derive the complementary slackness conditions. As far as this course is concerned, there are really only two reasons for discussing Lagrangian duality: 1) the complementary slackness conditions will imply that SVM solutions are "sparse in the data" (next lecture), which has important practical implications for kernelized SVMs (see the kernel methods lecture); 2) strong duality is a sufficient condition for the equivalence between the penalty and constraint forms of regularization (see Hwk 4 Problem 8). A brief sketch of the duality setup is given below.
This mathematically intense lecture may be safely skipped.
Access the full course at https://bloom.bg/2ui2T4q
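
For reference, here is a minimal sketch of the standard Lagrangian duality setup the description refers to, written in generic notation (not taken verbatim from the lecture). For the primal problem
\[
  \min_{x} \; f_0(x) \quad \text{subject to} \quad f_i(x) \le 0, \; i = 1, \dots, m,
\]
the Lagrangian and the dual function are
\[
  L(x, \lambda) = f_0(x) + \sum_{i=1}^{m} \lambda_i f_i(x),
  \qquad
  g(\lambda) = \inf_{x} L(x, \lambda).
\]
Weak duality always gives \( g(\lambda) \le p^* \) for \( \lambda \ge 0 \), where \( p^* \) is the primal optimal value. Under Slater's condition (the problem is convex and some feasible point satisfies every inequality strictly), strong duality holds, \( d^* = p^* \), and any primal-dual optimal pair \( (x^*, \lambda^*) \) satisfies the complementary slackness conditions
\[
  \lambda_i^* \, f_i(x^*) = 0, \qquad i = 1, \dots, m.
\]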