Gradient Flows, Stochastic Control and Robust pricing and hedging via neural SDEs
Author: Fields Institute
Uploaded: 2020-11-26
Views: 1512
Description:
Speaker: Lukasz Szpruch, University of Edinburgh and Alan Turing Institute
Quantitative Finance Seminar
http://www.fields.utoronto.ca/activit...
Abstract: There is overwhelming empirical evidence that deep neural networks trained with stochastic gradient descent perform extremely well in high-dimensional settings. Nonetheless, a complete mathematical theory providing guarantees for why and when these methods work so well has been elusive. In this talk, I will demonstrate how one may leverage control theory and the theory of statistical sampling to study the convergence of the stochastic gradient algorithms used in deep learning. Conversely, I will show that the machine learning perspective leads to new algorithms for (stochastic) control problems and offers a fresh perspective on classical quantitative finance problems. Indeed, deep generative modelling is opening the door to more robust and data-driven model selection mechanisms. By combining neural networks with risk models based on classical stochastic differential equations (SDEs), we find robust bounds for the prices of derivatives and the corresponding hedging strategies while incorporating relevant market data. Neural SDEs allow consistent calibration under both the risk-neutral and the real-world measures, so the model can be used to simulate the market scenarios needed for assessing risk profiles and hedging strategies. We develop and analyse novel algorithms needed for the efficient use of neural SDEs, and we validate our approach with numerical experiments using both local and stochastic volatility models. We will also show that neural SDEs can be used to calibrate to SPX/VIX options.
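To make the idea of "combining neural networks with risk models based on classical SDEs" concrete, here is a minimal sketch of a neural local volatility model: a small feed-forward network sigma_theta(t, S) plays the role of the diffusion coefficient in dS = r S dt + sigma_theta(t, S) S dW, which is simulated with Euler–Maruyama to Monte Carlo price a call option. All names, the network architecture, and the parameters are illustrative assumptions, not the speaker's implementation; in practice theta would be trained so that model prices calibrate to market quotes.

```python
import numpy as np

def mlp_sigma(params, t, s):
    # Hypothetical tiny network: inputs (t, S) -> positive volatility.
    W1, b1, W2, b2 = params
    x = np.stack([np.full_like(s, t), s / 100.0], axis=-1)
    h = np.tanh(x @ W1 + b1)
    out = h @ W2 + b2
    return np.log1p(np.exp(out[..., 0]))  # softplus keeps sigma > 0

def simulate_neural_sde(params, s0, r, T, n_steps, n_paths, rng):
    # Euler-Maruyama for dS = r S dt + sigma_theta(t, S) S dW
    # under the risk-neutral measure.
    dt = T / n_steps
    s = np.full(n_paths, s0)
    for k in range(n_steps):
        t = k * dt
        sig = mlp_sigma(params, t, s)
        dw = rng.normal(0.0, np.sqrt(dt), n_paths)
        s = s + r * s * dt + sig * s * dw
    return s

rng = np.random.default_rng(0)
params = (rng.normal(0.0, 0.3, (2, 8)), np.zeros(8),
          rng.normal(0.0, 0.3, (8, 1)), np.zeros(1))
s_T = simulate_neural_sde(params, s0=100.0, r=0.01, T=1.0,
                          n_steps=100, n_paths=20000, rng=rng)
# Monte Carlo price of an at-the-money European call, discounted at r.
call_price = np.exp(-0.01 * 1.0) * np.maximum(s_T - 100.0, 0.0).mean()
print(call_price)
```

Calibration would then minimise a loss between such Monte Carlo prices and observed market prices over theta; the robust bounds mentioned in the abstract arise from optimising over the family of neural SDEs consistent with the data.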