Karolina Dziugaite on Nonvacuous Generalization Bounds for Deep Neural Networks via PAC-Bayes
Author: Borealis AI
Uploaded: 2018-07-24
Views: 1330
Description:
Abstract: Karolina presents her recent work on constructing generalization bounds, both to understand existing learning algorithms and to propose new ones. Generalization bounds relate empirical performance to future expected performance. The tightness of these bounds varies widely: it depends on the complexity of the learning task and the amount of data available, but also on how much information the bounds take into account. Her work is particularly concerned with data- and algorithm-dependent bounds that are quantitatively nonvacuous. She presents bounds built from solutions obtained by stochastic gradient descent (SGD) on MNIST. By formalizing the notion of flat minima using PAC-Bayes generalization bounds, she obtains nonvacuous generalization bounds for stochastic classifiers built by randomly perturbing SGD solutions.
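For context, the PAC-Bayes bound this line of work builds on (the kl form usually attributed to Langford and Seeger, as used in arXiv:1703.11008; the exact constants below should be checked against the paper) states that for a prior P fixed before seeing the m training examples and any δ > 0, with probability at least 1 − δ over the sample, every posterior Q satisfies

\[
\mathrm{kl}\!\left(\hat{e}(Q)\,\middle\|\,e(Q)\right) \;\le\; \frac{\mathrm{KL}(Q\,\|\,P) + \ln\frac{m}{\delta}}{m-1},
\]

where \hat{e}(Q) and e(Q) are the empirical and expected error rates of the stochastic classifier Q, and kl is the KL divergence between Bernoulli distributions. Inverting the left-hand side in its second argument turns this into an explicit upper bound on e(Q); the bound is nonvacuous when that upper bound sits meaningfully below the trivial value of 1.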
Joint work with Daniel M. Roy based on https://arxiv.org/abs/1703.11008, https://arxiv.org/abs/1712.09376, and https://arxiv.org/abs/1802.09583
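As a concrete illustration, below is a minimal NumPy sketch of how such a bound can be evaluated for a Gaussian posterior Q = N(w, diag(s^2)) centered at an SGD solution w and a Gaussian prior P = N(w0, sigma0^2 I) centered at, say, the random initialization. This is an assumption-laden sketch, not the authors' pipeline: arXiv:1703.11008 additionally optimizes the bound over the posterior mean and variances, handles the prior variance with a union bound, and corrects the Monte Carlo estimate of the empirical error with a further concentration step, none of which appears here. All names and numbers are illustrative.

    import numpy as np

    def gaussian_kl(mu_q, var_q, mu_p, var_p):
        # KL( N(mu_q, diag(var_q)) || N(mu_p, var_p * I) ), in nats.
        d = mu_q.size
        return 0.5 * (np.sum(var_q) / var_p
                      + np.sum((mu_q - mu_p) ** 2) / var_p
                      - d + d * np.log(var_p) - np.sum(np.log(var_q)))

    def kl_bernoulli(q, p):
        # Binary KL divergence kl(q || p) for error rates in (0, 1).
        eps = 1e-12
        q = min(max(q, eps), 1 - eps)
        p = min(max(p, eps), 1 - eps)
        return q * np.log(q / p) + (1 - q) * np.log((1 - q) / (1 - p))

    def invert_kl(q_hat, budget, tol=1e-9):
        # Largest p >= q_hat with kl(q_hat || p) <= budget, by bisection;
        # kl(q_hat || p) is increasing in p on [q_hat, 1).
        lo, hi = q_hat, 1.0
        while hi - lo > tol:
            mid = 0.5 * (lo + hi)
            if kl_bernoulli(q_hat, mid) <= budget:
                lo = mid
            else:
                hi = mid
        return hi

    def pac_bayes_error_bound(emp_err, kl_qp, m, delta=0.05):
        # Plug the kl-form PAC-Bayes bound stated above into invert_kl to
        # get an explicit upper bound on the expected error of Q.
        budget = (kl_qp + np.log(m / delta)) / (m - 1)
        return invert_kl(emp_err, budget)

    # Illustrative stand-in values only, not results from the papers.
    rng = np.random.default_rng(0)
    d = 1000
    w_sgd = 0.1 * rng.standard_normal(d)   # stand-in for a flattened SGD solution
    w_init = np.zeros(d)                   # stand-in for the prior mean
    kl_qp = gaussian_kl(w_sgd, np.full(d, 1e-2), w_init, 1e-1)
    # emp_err would be a Monte Carlo estimate of Q's training error from
    # sampled weight perturbations; here it is simply a made-up 3%.
    print(pac_bayes_error_bound(emp_err=0.03, kl_qp=kl_qp, m=60000))
    # Prints roughly 0.07 for these inputs: nonvacuous, i.e. well below 1.

The bisection in invert_kl is the standard way to turn the implicit kl-form bound into an explicit error bound; the looser explicit form e(Q) <= ê(Q) + sqrt(budget / 2), obtained via Pinsker's inequality, avoids the search at the cost of a weaker bound.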
Lecture given in June 2018 in Toronto, ON, at the Borealis AI lab.