
Enforcing asymptotic behavior with DNNs for approximation and regression in finance

ArXiv ID: 2411.05257 · View on arXiv · Authors: Unknown

Abstract: We propose a simple methodology to approximate functions with given asymptotic behavior using specifically constructed terms together with an unconstrained deep neural network (DNN). The methodology extends to various asymptotic behaviors and multiple dimensions and is easy to implement; in this work we demonstrate it for linear asymptotic behavior on one-dimensional examples. We apply it to function approximation and regression problems in which we measure approximation of function values only ("Vanilla Machine Learning", VML) or of both function and derivative values ("Differential Machine Learning", DML) on several examples. We find that enforcing the given asymptotic behavior leads to better approximation and faster convergence. ...

November 8, 2024 · 2 min · Research Team
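The paper's exact construction of the asymptotic terms is not reproduced in this summary. As a rough illustration of the idea, the sketch below blends an unconstrained DNN with a prescribed linear asymptote a*x + b in one dimension; the sigmoid-based blending weight, the network size, and the `scale` cutoff are assumptions for illustration, not the authors' construction.

```python
import torch
import torch.nn as nn


class AsymptoticNet(nn.Module):
    """Unconstrained DNN blended with a prescribed linear asymptote a*x + b.

    The blending weight w(x) is close to 1 near the origin (DNN dominates)
    and decays to 0 as |x| grows (linear asymptote dominates). The sigmoid
    form is a hypothetical choice, not necessarily the paper's construction.
    """

    def __init__(self, a: float, b: float, width: int = 64, scale: float = 5.0):
        super().__init__()
        self.a, self.b, self.scale = a, b, scale
        self.dnn = nn.Sequential(
            nn.Linear(1, width), nn.Tanh(),
            nn.Linear(width, width), nn.Tanh(),
            nn.Linear(width, 1),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        w = torch.sigmoid(self.scale - x.abs())  # ~1 in the interior, ~0 in the tails
        return w * self.dnn(x) + (1.0 - w) * (self.a * x + self.b)


# Usage sketch: VML fits function values only; DML would additionally
# penalize the error in the derivative, which autograd provides directly.
x = torch.linspace(-10.0, 10.0, 256).unsqueeze(1).requires_grad_(True)
model = AsymptoticNet(a=2.0, b=1.0)
y = model(x)
dy_dx, = torch.autograd.grad(y.sum(), x, create_graph=True)
```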

Large (and Deep) Factor Models

ArXiv ID: 2402.06635 · View on arXiv · Authors: Unknown

Abstract: We open up the black box behind Deep Learning for portfolio optimization and prove that a sufficiently wide and arbitrarily deep neural network (DNN) trained to maximize the Sharpe ratio of the Stochastic Discount Factor (SDF) is equivalent to a large factor model (LFM): a linear factor pricing model that uses many non-linear characteristics. The nature of these characteristics depends on the architecture of the DNN in an explicit, tractable fashion. This makes it possible to derive end-to-end trained DNN-based SDFs in closed form for the first time. We evaluate LFMs empirically and show how various architectural choices impact SDF performance. We document the virtue of depth complexity: with enough data, the out-of-sample performance of the DNN-SDF is increasing in the NN depth, saturating at huge depths of around 100 hidden layers. ...

January 20, 2024 · 2 min · Research Team
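The closed-form LFM equivalence is the paper's contribution and is not reproduced here. As a rough sketch of the training objective the abstract refers to, the code below maps asset characteristics to portfolio weights with a deep network and maximizes the in-sample Sharpe ratio of the resulting managed portfolio; the architecture, the SDF parametrization M_t = 1 - w(X_t)'R_t, the data shapes, and the hyperparameters are placeholder assumptions rather than the paper's specification.

```python
import torch
import torch.nn as nn


class SDFNet(nn.Module):
    """Deep network mapping characteristics to portfolio weights.

    The managed portfolio F_t = sum_i w(X_it) R_it defines the SDF via
    M_t = 1 - F_t (one common parametrization; an assumption here).
    """

    def __init__(self, n_chars: int, width: int = 128, depth: int = 4):
        super().__init__()
        layers, d = [], n_chars
        for _ in range(depth):
            layers += [nn.Linear(d, width), nn.ReLU()]
            d = width
        layers.append(nn.Linear(d, 1))
        self.net = nn.Sequential(*layers)

    def forward(self, X: torch.Tensor) -> torch.Tensor:
        # X: (T, N, n_chars) characteristics -> (T, N) portfolio weights
        return self.net(X).squeeze(-1)


def neg_sharpe(w: torch.Tensor, R: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    """Negative in-sample Sharpe ratio of the managed portfolio."""
    F = (w * R).sum(dim=1)  # (T,) portfolio returns
    return -(F.mean() / (F.std() + eps))


# Usage sketch on synthetic data (T periods, N assets, K characteristics).
T, N, K = 240, 50, 10
X, R = torch.randn(T, N, K), 0.01 * torch.randn(T, N)
model = SDFNet(K)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for _ in range(200):
    opt.zero_grad()
    loss = neg_sharpe(model(X), R)
    loss.backward()
    opt.step()
```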