
Overparametrized models with posterior drift

ArXiv ID: 2506.23619
Authors: Guillaume Coqueret, Martial Laguerre

Abstract: This paper investigates the impact of posterior drift on out-of-sample forecasting accuracy in overparametrized machine learning models. We document the loss in performance when the loadings of the data generating process change between the training and testing samples. This matters crucially in settings in which regime changes are likely to occur, for instance in financial markets. Applied to equity premium forecasting, our results underline the sensitivity of a market timing strategy to sub-periods and to the bandwidth parameters that control the complexity of the model. For the average investor, we find that focusing on holding periods of 15 years can generate very heterogeneous returns, especially for small bandwidths. Large bandwidths yield much more consistent outcomes, but are far less appealing from a risk-adjusted return standpoint. All in all, our findings tend to recommend caution when resorting to large linear models for stock market predictions. ...
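The drift mechanism the abstract describes — the loadings of the data generating process changing between the training and testing samples — can be illustrated with a minimal simulation. Everything below (the sample sizes, the drift scale, and the ridge fit standing in for a regularized large linear model) is an assumed sketch, not the paper's actual setup:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 100, 50  # illustrative sizes, not the paper's configuration

# Linear DGP whose loadings drift between the two samples
beta_train = rng.normal(size=p)
beta_test = beta_train + rng.normal(size=p)  # drifted loadings

X_tr = rng.normal(size=(n, p))
y_tr = X_tr @ beta_train + rng.normal(scale=0.5, size=n)
X_te = rng.normal(size=(n, p))
y_te_stable = X_te @ beta_train + rng.normal(scale=0.5, size=n)  # no drift
y_te_drift = X_te @ beta_test + rng.normal(scale=0.5, size=n)    # drift

# Ridge fit: a simple stand-in for a regularized large linear model
lam = 1.0
beta_hat = np.linalg.solve(X_tr.T @ X_tr + lam * np.eye(p), X_tr.T @ y_tr)

def oos_r2(y, X):
    """Out-of-sample R^2 of the fitted model on (X, y)."""
    resid = y - X @ beta_hat
    return 1.0 - resid.var() / y.var()

print("stable loadings:", oos_r2(y_te_stable, X_te))
print("drifted loadings:", oos_r2(y_te_drift, X_te))
```

Because the model was fitted under the training loadings, the out-of-sample fit deteriorates sharply once the test sample is generated under the drifted loadings, which is the loss in performance the paper documents.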

June 30, 2025 · 2 min · Research Team

Double Descent in Portfolio Optimization: Dance between Theoretical Sharpe Ratio and Estimation Accuracy

ArXiv ID: 2411.18830
Authors: Unknown

Abstract: We study the relationship between model complexity and out-of-sample performance in the context of mean-variance portfolio optimization. Representing model complexity by the number of assets, we find that the performance of low-dimensional models initially improves with complexity but then declines due to overfitting. As model complexity becomes sufficiently high, the performance improves with complexity again, resulting in a double ascent Sharpe ratio curve similar to the double descent phenomenon observed in artificial intelligence. The underlying mechanisms involve an intricate interaction between the theoretical Sharpe ratio and estimation accuracy. In high-dimensional models, the theoretical Sharpe ratio approaches its upper limit, and the overfitting problem is reduced because there are more parameters than data restrictions, which allows us to choose well-behaved parameters based on inductive bias. ...
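The double ascent described above can be sketched in a stylized plug-in mean-variance experiment: identity true covariance, equal true means, and a pseudo-inverse for the regime where assets outnumber observations (the minimum-norm choice, one instance of the inductive bias the abstract mentions). All numbers here are illustrative assumptions, not the paper's calibration:

```python
import numpy as np

rng = np.random.default_rng(1)
T = 60  # number of return observations (illustrative)

def realized_sharpe(p, n_trials=200):
    """Average out-of-sample Sharpe ratio of a plug-in mean-variance
    portfolio built on p assets from T sample returns, in a stylized
    setting with true covariance I and equal true means."""
    mu = np.full(p, 0.1)  # true mean returns (assumed)
    sharpes = []
    for _ in range(n_trials):
        R = mu + rng.normal(size=(T, p))  # sample returns, true Sigma = I
        mu_hat = R.mean(axis=0)
        Sigma_hat = np.cov(R, rowvar=False)
        # The pseudo-inverse yields the minimum-norm weights when p > T,
        # a simple stand-in for choosing well-behaved parameters by
        # inductive bias in the overparametrized regime.
        w = np.linalg.pinv(Sigma_hat) @ mu_hat
        # Realized Sharpe under the true moments (Sigma = I)
        sharpes.append((w @ mu) / np.sqrt(w @ w))
    return float(np.mean(sharpes))

for p in (5, 30, 55, 120, 300):
    print(p, round(realized_sharpe(p), 3))
```

In this sketch the realized Sharpe ratio typically deteriorates as the number of assets approaches the sample size, where the sample covariance becomes ill-conditioned, and recovers beyond it, tracing the double ascent curve the paper describes.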

November 28, 2024 · 2 min · Research Team