EUSIPCO 2020 Tutorial 1-1: Nonconvex Optimization for High-Dimensional Signal Estimation
Author: EURASIP
Uploaded: 2021-03-04
Views: 137
Description:
T1 - Title: Nonconvex Optimization for High-Dimensional Signal Estimation: Spectral and Iterative Methods
Presenters: Yuejie Chi (Carnegie Mellon University), Yuxin Chen (Princeton University), Cong Ma (Princeton University)
Abstract: Recent years have seen tremendous progress in the development of provable nonconvex approaches to information processing, with the aid of proper statistical models of data. Examples include recommendation systems, structured matrix recovery, phase retrieval, community detection, and ranking, among others. The focal point of this tutorial is two approaches that have received much attention recently.
(1) Spectral methods, which refer to a collection of algorithms built upon the eigenanalysis of properly arranged data matrices. While the study of spectral methods can be traced back to classical matrix perturbation theory, the past decade has witnessed tremendous theoretical advances with the aid of statistical modeling, particularly in the development of entrywise perturbation bounds that remain less known to the signal processing community.
(2) Nonconvex iterative methods, which attempt to solve the nonconvex problems of interest via iterative optimization algorithms such as gradient descent. Despite a high degree of nonconvexity, it has been shown that the loss surfaces of many high-dimensional signal processing problems possess benign geometric structures, either locally or globally, which in turn allow one to find optimal solutions efficiently. As a result, for a broad range of low-rank matrix estimation problems, nonconvex iterative methods are capable of achieving optimal statistical accuracy and computational efficiency simultaneously.
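To make the spectral idea concrete, here is a minimal sketch (not code from the tutorial) of estimating a planted rank-one signal from a noisy symmetric data matrix by extracting its leading eigenvector with power iteration. The spiked model, the signal strength, and all variable names are illustrative choices, not the tutorial's specific setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "spiked" model: M = lambda * x x^T + symmetric noise, with unit-norm signal x.
n, signal_strength = 500, 3.0
x = rng.standard_normal(n)
x /= np.linalg.norm(x)
noise = rng.standard_normal((n, n)) / np.sqrt(n)
M = signal_strength * np.outer(x, x) + (noise + noise.T) / 2

# Spectral method: estimate x by the leading eigenvector of M, via power iteration.
v = rng.standard_normal(n)
for _ in range(100):
    v = M @ v
    v /= np.linalg.norm(v)

# Accuracy up to a global sign: |<v, x>| approaches 1 when the signal dominates the noise.
correlation = abs(v @ x)
```

When the signal strength falls below the noise level, the leading eigenvector carries no information about x; the entrywise perturbation bounds mentioned above sharpen this picture by controlling each coordinate of v - x rather than just the overall angle.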
This tutorial will focus on both spectral methods and iterative optimization algorithms for estimating low-complexity models in high dimension, highlighting the following aspects:
The role of statistical models: with the help of statistical models, one can leverage the population risk / loss functions, which may be easier to characterize in terms of optimization geometry. The main challenge lies in how the optimization geometry translates back to the empirical loss. The tutorial will discuss common techniques and tools for developing such an analysis framework, for both global approaches (where a random initialization suffices) and local approaches (where a careful initialization close to the desired solution is needed).
Entrywise performance bounds: the tutorial will emphasize entrywise eigenvector (or singular vector) perturbation analysis, a fundamentally important topic that has only recently become tractable thanks to advances in statistical tools. Such performance bounds, which were previously rarely available, are of great interest in applications such as ranking, community detection, and localization.
Case studies: the tutorial will feature several key applications and analysis techniques, and highlight their broad applicability.
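The two-stage recipe running through these topics (a spectral estimate used to initialize a nonconvex iterative method) can be sketched on a toy symmetric matrix-sensing problem. This is an illustrative example under assumed settings, not an implementation from the tutorial: the model, measurement ensemble, step size, and all names below are the author's illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy low-rank matrix sensing: recover X* = U* U*^T from m linear measurements
# y_i = <A_i, X*> by gradient descent on the nonconvex least-squares loss
# f(U) = (1/4m) * sum_i (<A_i, U U^T> - y_i)^2.
n, r, m = 20, 2, 1000
U_star, _ = np.linalg.qr(rng.standard_normal((n, r)))
U_star *= 3.0                                # well-conditioned rank-r ground truth
X_star = U_star @ U_star.T
A = rng.standard_normal((m, n, n))           # Gaussian measurement matrices
y = np.einsum("mij,ij->m", A, X_star)

# Stage 1 -- spectral initialization: top-r eigenpairs of the symmetrized surrogate
# Y = (1/m) sum_i y_i A_i, whose expectation is X*.
Y = np.einsum("m,mij->ij", y, A) / m
Y = (Y + Y.T) / 2
vals, vecs = np.linalg.eigh(Y)
U = vecs[:, -r:] * np.sqrt(np.maximum(vals[-r:], 0))

# Stage 2 -- gradient descent, step size scaled by the estimated top eigenvalue.
step = 0.25 / vals[-1]
for _ in range(500):
    residual = np.einsum("mij,ij->m", A, U @ U.T) - y
    grad = np.einsum("m,mij->ij", residual, A) / m
    U -= step * ((grad + grad.T) / 2) @ U    # chain rule through U U^T

rel_error = np.linalg.norm(U @ U.T - X_star, "fro") / np.linalg.norm(X_star, "fro")
```

The spectral stage lands inside a neighborhood where the loss is benign, after which plain gradient descent contracts linearly toward the ground truth, mirroring the local-geometry analysis framework described above.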