Pierre-Alexandre Mattei - Ensembles in machine learning: (simple) theory and (simple) practice
Author: One world theoretical machine learning
Uploaded: 2026-03-14
Views: 13
Description: Abstract: Ensemble methods combine predictions from several statistical learning models. Their most famous representatives are random forests and neural network ensembles. This talk will center around the question: "How many models should I aggregate?" We will see that the answer depends on the chosen performance metric. Specifically, in the case of convex losses (such as cross-entropy in classification or mean squared error in regression), the error is a decreasing function of the number of models. In the case of non-convex losses (such as classification error, or the Fréchet Inception distance in generative modelling), things are more nuanced, and the error can sometimes be non-monotonic. These results will be illustrated with examples of neural network ensembles. This work is notably based on the JMLR paper "Are Ensembles Getting Better All the Time?" (http://jmlr.org/papers/v26/24-0408.html), joint work with Damien Garreau (Julius-Maximilians-Universität Würzburg). It will also feature recent work with Raphaël Razafindralambo, Rémy Sun, and Frédéric Precioso (Inria, Université Côte d'Azur).
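The convex-loss claim in the abstract can be illustrated with a small simulation (a hypothetical toy setup, not code from the talk or paper): each "model" predicts the true signal plus independent noise, and averaging more models yields a non-increasing mean squared error, as Jensen's inequality suggests for convex losses.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: each "model" is the true signal corrupted by independent noise.
y_true = rng.normal(size=1000)
models = [y_true + rng.normal(scale=1.0, size=1000) for _ in range(50)]

def ensemble_mse(k):
    """MSE of the average of the first k model predictions."""
    avg = np.mean(models[:k], axis=0)
    return float(np.mean((avg - y_true) ** 2))

mses = [ensemble_mse(k) for k in (1, 5, 50)]
# For a convex loss like MSE, averaging independent predictors shrinks the
# error roughly like 1/k, so larger ensembles do at least as well here.
print(mses)
```

For a non-convex metric such as 0-1 classification error, no such monotonicity argument applies, which is the nuance the talk explores.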