
Decision by Supervised Learning with Deep Ensembles: A Practical Framework for Robust Portfolio Optimization

ArXiv ID: 2503.13544 “View on arXiv” Authors: Unknown Abstract We propose Decision by Supervised Learning (DSL), a practical framework for robust portfolio optimization. DSL reframes portfolio construction as a supervised learning problem: models are trained to predict optimal portfolio weights, using a cross-entropy loss and target portfolios constructed by maximizing the Sharpe or Sortino ratio. To further enhance stability and reliability, DSL employs Deep Ensemble methods, substantially reducing the variance of portfolio allocations. Through comprehensive backtesting across diverse market universes and neural architectures, DSL shows superior performance compared to both traditional strategies and leading machine-learning-based methods, including Prediction-Focused Learning and End-to-End Learning. We show that increasing the ensemble size leads to higher median returns and more stable risk-adjusted performance. The code is available at https://github.com/DSLwDE/DSLwDE. ...
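
A minimal sketch of the ensembling step described above, under stated assumptions: synthetic features, simplex-normalized weight targets, and sklearn's MLPRegressor (squared-error loss) standing in for the paper's cross-entropy-trained networks. Averaging over `n_models` independently seeded members is the variance-reduction device the abstract credits.

```python
# Hypothetical sketch of the Deep Ensemble step in DSL: train several
# independently seeded networks to predict portfolio weights, then average
# their outputs to reduce allocation variance. The synthetic data, the
# simplex-normalized targets, and the use of sklearn's MLPRegressor
# (squared-error loss instead of the paper's cross-entropy) are assumptions.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
n_obs, n_features, n_assets, n_models = 500, 20, 5, 10

X = rng.normal(size=(n_obs, n_features))        # market features per date
# Stand-in labels: per-date weights of some Sharpe-maximizing portfolio,
# normalized to the simplex (the paper builds these from realized returns).
raw = rng.random(size=(n_obs, n_assets))
y = raw / raw.sum(axis=1, keepdims=True)

ensemble = [MLPRegressor(hidden_layer_sizes=(64, 32), random_state=seed,
                         max_iter=500).fit(X, y)
            for seed in range(n_models)]

def ensemble_weights(x_new):
    """Average member predictions, then clip and renormalize to valid weights."""
    pred = np.mean([m.predict(x_new) for m in ensemble], axis=0)
    pred = np.clip(pred, 0.0, None)
    return pred / pred.sum(axis=1, keepdims=True)

print(ensemble_weights(X[:3]).round(3))  # lower-variance than any one member
```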

March 16, 2025 · 2 min · Research Team

Hierarchical Minimum Variance Portfolios: A Theoretical and Algorithmic Approach

ArXiv ID: 2503.12328 “View on arXiv” Authors: Unknown Abstract We introduce a novel approach to portfolio optimization that leverages hierarchical graph structures and the Schur complement method to systematically reduce computational complexity while preserving full covariance information. Inspired by Lopez de Prado’s hierarchical risk parity and Cotton’s Schur complement methods, our framework models the covariance matrix as an adjacency-like structure of a hierarchical graph. We demonstrate that portfolio optimization can be recursively reduced across hierarchical levels, allowing optimal weights to be computed efficiently by inverting only small submatrices regardless of portfolio size. Moreover, we translate our results into a recursive algorithm that constructs optimal portfolio allocations. Our results reveal a transparent and mathematically rigorous connection between classical Markowitz mean-variance optimization, hierarchical clustering, and the Schur complement method. ...
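
A sketch of one level of the reduction the abstract describes, using the standard identity that minimum-variance weights are proportional to $\Sigma^{-1}\mathbf{1}$: eliminating the lower block via the Schur complement means only the two sub-blocks are ever factorized. The halving split is an arbitrary choice for this demo, not the paper's hierarchy.

```python
# One level of the Schur-complement reduction for minimum-variance weights.
# For cov = [[A, B], [B.T, D]], the solution of cov @ z = 1 is obtained by
# solving with the Schur complement S = A - B D^{-1} B.T and with D only,
# never factorizing the full matrix. The even split is an assumption.
import numpy as np

def min_variance_block(cov, k):
    """Solve cov @ z = ones via block elimination at split point k."""
    n = cov.shape[0]
    A, B, D = cov[:k, :k], cov[:k, k:], cov[k:, k:]
    ones_a, ones_d = np.ones(k), np.ones(n - k)
    Dinv_ones = np.linalg.solve(D, ones_d)
    S = A - B @ np.linalg.solve(D, B.T)           # Schur complement of D
    u = np.linalg.solve(S, ones_a - B @ Dinv_ones)
    v = np.linalg.solve(D, ones_d - B.T @ u)
    return np.concatenate([u, v])

rng = np.random.default_rng(1)
G = rng.normal(size=(8, 8))
cov = G @ G.T / 8 + np.eye(8)                     # well-conditioned covariance
z = min_variance_block(cov, 4)
w = z / z.sum()                                   # min-variance weights
z_full = np.linalg.solve(cov, np.ones(8))
assert np.allclose(w, z_full / z_full.sum())      # matches direct inversion
print(w.round(4))
```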

March 16, 2025 · 2 min · Research Team

Tactical Asset Allocation with Macroeconomic Regime Detection

ArXiv ID: 2503.11499 “View on arXiv” Authors: Unknown Abstract This paper extends the tactical asset allocation literature by incorporating regime modeling using techniques from machine learning. We propose a novel model that classifies current regimes, forecasts the distribution of future regimes, and integrates these forecasts with the historical performance of individual assets to optimize portfolio allocations. Utilizing a macroeconomic data set from the FRED-MD database, our approach employs a modified k-means algorithm to ensure consistent regime classification over time. We then leverage these regime predictions to estimate expected returns and volatilities, which are subsequently mapped into portfolio allocations using various sizing schemes. Our method outperforms traditional benchmarks such as equal-weight, buy-and-hold, and random regime models. Additionally, we are the first to apply a regime detection model built from a large macroeconomic dataset to tactical asset allocation, demonstrating significant improvements in portfolio performance. Our work presents several key contributions, including a novel data-driven regime detection algorithm tailored to uncertainty in forecasted regimes and the application of the FRED-MD data set to tactical asset allocation. ...
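
A hypothetical end-to-end miniature of this pipeline: cluster macro data into regimes, estimate regime-conditional moments, and map them to weights. Plain KMeans replaces the paper's modified, temporally consistent variant, the data is synthetic rather than FRED-MD, and the Sharpe-proportional sizing rule is just one of the "various sizing schemes" mentioned.

```python
# Hypothetical miniature of the regime pipeline: cluster macro features,
# compute regime-conditional return/volatility, and size positions. KMeans,
# the synthetic data, and the sizing rule are illustrative assumptions.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(2)
T, n_macro, n_assets, n_regimes = 600, 12, 4, 3

macro = rng.normal(size=(T, n_macro))              # stand-in for FRED-MD
asset_rets = rng.normal(0.0005, 0.01, size=(T, n_assets))

labels = KMeans(n_clusters=n_regimes, n_init=10, random_state=0).fit_predict(
    StandardScaler().fit_transform(macro))

weights_by_regime = {}
for r in range(n_regimes):
    rets = asset_rets[labels == r]                 # regime-conditional sample
    sharpe = rets.mean(axis=0) / rets.std(axis=0)
    w = np.clip(sharpe, 0.0, None)                 # long-only sizing
    weights_by_regime[r] = (w / w.sum() if w.sum() > 0
                            else np.full(n_assets, 1.0 / n_assets))

current = labels[-1]                               # today's classified regime
print("regime:", current, "weights:", weights_by_regime[current].round(3))
```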

March 14, 2025 · 2 min · Research Team

Synthetic Data for Portfolios: A Throw of the Dice Will Never Abolish Chance

ArXiv ID: 2501.03993 “View on arXiv” Authors: Unknown Abstract Simulation methods have always been instrumental in finance, and data-driven methods with minimal model specification, commonly referred to as generative models, have attracted increasing attention, especially after the success of deep learning in a broad range of fields. However, the adoption of these models in financial applications has not matched the growing interest, probably due to the unique complexities and challenges of financial markets. This paper contributes to a deeper understanding of the limitations of generative models, particularly in portfolio and risk management. To this end, we begin by presenting theoretical results on the importance of initial sample size, and point out the potential pitfalls of generating far more data than originally available. We then highlight the inseparable nature of model development and the intended use by pointing out a paradox: common generative models inherently care less about what is important for constructing portfolios (in particular long-short ones). Based on these findings, we propose a pipeline for the generation of multivariate returns that meets conventional evaluation standards on a large universe of US equities while remaining compliant with the stylized facts observed in asset returns and avoiding the pitfalls we previously identified. Moreover, we stress the need for more accurate evaluation methods, and suggest, through an example of mean-reversion strategies, a method designed to identify poor models for a given application based on regurgitative training, i.e. retraining the model on the data it has itself generated, a check closely related to the statistical notion of identifiability. ...
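
A toy illustration of the regurgitative-training check mentioned at the end of the abstract: refit the generator on its own output and watch the fitted parameters drift away from the true data-generating process. A multivariate Gaussian stands in for a deep generative model, and the drift metric and loop length are illustrative choices, not the paper's.

```python
# Toy regurgitative-training check: refit the generator on its own output
# and track how far the fitted parameters drift from the true process.
# The Gaussian "generator" and Frobenius drift metric are assumptions.
import numpy as np

rng = np.random.default_rng(3)
n_obs, n_assets = 250, 10                   # deliberately small initial sample
G = rng.normal(size=(n_assets, n_assets))
true_cov = G @ G.T / n_assets + np.eye(n_assets)
real = rng.multivariate_normal(np.zeros(n_assets), true_cov, size=n_obs)

def fit(data):
    """The 'generator': just the sample mean and covariance."""
    return data.mean(axis=0), np.cov(data, rowvar=False)

mu, cov = fit(real)
drift = []
for _ in range(5):                          # regurgitative loop
    synthetic = rng.multivariate_normal(mu, cov, size=n_obs)
    mu, cov = fit(synthetic)
    drift.append(np.linalg.norm(cov - true_cov, "fro"))

print(np.round(drift, 2))  # tends to grow: each refit identifies the
                           # previous fit, not the original process
```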

January 7, 2025 · 3 min · Research Team

Leveraging Time Series Categorization and Temporal Fusion Transformers to Improve Cryptocurrency Price Forecasting

ArXiv ID: 2412.14529 “View on arXiv” Authors: Unknown Abstract Organizing and managing cryptocurrency portfolios, and making transaction decisions, are crucial in this market. Optimal asset selection is one of the main challenges, and it requires accurate prediction of cryptocurrency prices. In this work, we categorize financial time series into several similar subseries to increase prediction accuracy by learning each category of similarly behaving subseries separately. For each category of subseries, we create a deep learning model based on the attention mechanism to predict the next step of each subseries. Because the amount of cryptocurrency data available for training is limited, increasing the number of categories decreases the amount of training data per model, and some complex models will not be trained well due to their large number of parameters. To overcome this challenge, we propose combining the time series data of other cryptocurrencies to increase the amount of data per category, hence increasing the accuracy of the model for each category. ...
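
A hedged sketch of the categorize-then-predict idea: pool fixed-length subseries from several coins, cluster them by shape, and fit one next-step model per category. KMeans and gradient boosting stand in for the paper's categorization and attention-based models; every series below is synthetic.

```python
# Sketch: pool z-scored windows across coins, cluster by shape, fit one
# next-step regressor per category. KMeans + gradient boosting replace the
# paper's categorization and attention models; all data is synthetic.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(4)
window, n_categories = 24, 4

def make_windows(series):
    """Slice a series into z-scored subseries plus a z-scored next step."""
    X, y = [], []
    for i in range(len(series) - window):
        seg = series[i:i + window]
        mu, sd = seg.mean(), seg.std() + 1e-9
        X.append((seg - mu) / sd)
        y.append((series[i + window] - mu) / sd)
    return np.array(X), np.array(y)

# Pooling windows across coins enlarges each category's training set.
coins = [np.cumsum(rng.normal(0, 1, 2000)) + 100 for _ in range(3)]
parts = [make_windows(c) for c in coins]
X = np.vstack([p[0] for p in parts])
y = np.concatenate([p[1] for p in parts])

labels = KMeans(n_clusters=n_categories, n_init=10, random_state=0).fit_predict(X)
models = {c: GradientBoostingRegressor().fit(X[labels == c], y[labels == c])
          for c in range(n_categories)}
print({c: int((labels == c).sum()) for c in models})   # samples per category
```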

December 19, 2024 · 2 min · Research Team

Expressions of Market-Based Correlations Between Prices and Returns of Two Assets

ArXiv ID: 2412.13172 “View on arXiv” Authors: Unknown Abstract This paper derives the expressions of correlations between the prices of two assets, the returns of two assets, and the price-return correlations of two assets that depend on the statistical moments and correlations of the current values, past values, and volumes of their market trades. The usual frequency-based expressions for correlations of price and return time series describe a special case of our model in which all trade volumes and past trade values are constant. Such assumptions are rather far from market reality, and their use results in excess losses and erroneous forecasts. Traders, banks, and funds that perform multi-million market transactions or manage billion-valued portfolios should consider the impact of large trade volumes on market prices and returns, and for them the use of market-based correlations of prices and returns of two assets is mandatory. The development of macroeconomic models and market forecasts, like those being created by BlackRock’s Aladdin, JP Morgan, and the U.S. Fed, is impossible without the use of market-based correlations of prices and returns of two assets. ...
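
A simplified numeric contrast in the spirit of the abstract's claim: the usual frequency-based correlation weights every observation equally, while a volume-weighted analogue lets large trades dominate the moments. The weighting below is an illustrative stand-in, not the paper's derived expressions.

```python
# Contrast equal-weight (frequency-based) correlation with a volume-weighted
# analogue on synthetic prices. The weighting scheme is an illustrative
# assumption, not the market-based formulas derived in the paper.
import numpy as np

rng = np.random.default_rng(5)
T = 500
p1 = 100 + np.cumsum(rng.normal(0, 1, T))        # price series, asset 1
p2 = 50 + 0.4 * p1 + np.cumsum(rng.normal(0, 1, T))
vol = rng.lognormal(mean=0.0, sigma=1.0, size=T) # strongly varying volumes

def weighted_corr(x, y, w):
    """Correlation computed from volume-weighted means and covariances."""
    w = w / w.sum()
    mx, my = w @ x, w @ y
    cov = w @ ((x - mx) * (y - my))
    return cov / np.sqrt((w @ (x - mx) ** 2) * (w @ (y - my) ** 2))

print("frequency-based:", round(float(np.corrcoef(p1, p2)[0, 1]), 4))
print("volume-weighted:", round(float(weighted_corr(p1, p2, vol)), 4))
```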

December 17, 2024 · 2 min · Research Team

MILLION: A General Multi-Objective Framework with Controllable Risk for Portfolio Management

ArXiv ID: 2412.03038 “View on arXiv” Authors: Unknown Abstract Portfolio management is an important yet challenging task in AI for FinTech, which aims to allocate investors’ budgets among different assets to balance the risk and return of an investment. In this study, we propose a general Multi-objectIve framework with controLLable rIsk for pOrtfolio maNagement (MILLION), which consists of two main phases, i.e., return-related maximization and risk control. Specifically, in the return-related maximization phase, we introduce two auxiliary objectives, i.e., return rate prediction and return rate ranking, combined with portfolio optimization to mitigate overfitting and improve the generalization of the trained model to future markets. Subsequently, in the risk control phase, we propose two methods, i.e., portfolio interpolation and portfolio improvement, to achieve fine-grained risk control and fast risk adaptation to a user-specified risk level. For the portfolio interpolation method, we theoretically prove that the risk can be perfectly controlled if the to-be-set risk level lies in a proper interval. In addition, we show that the return rate of the adjusted portfolio after portfolio interpolation is no less than that of min-variance optimization, as long as the model in the reward maximization phase is effective. Furthermore, the portfolio improvement method can achieve greater return rates at the same risk level compared to portfolio interpolation. Extensive experiments are conducted on three real-world datasets, and the results demonstrate the effectiveness and efficiency of the proposed framework. ...
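
A sketch of the portfolio-interpolation idea in isolation: blend the learned portfolio with the minimum-variance portfolio until the blend hits a user-specified risk level. Along this line the variance is monotone in the blend parameter, so any target inside the feasible interval can be matched, consistent with the interval condition the paper proves. The random stand-in portfolio and the root-finding detail are my own.

```python
# Portfolio interpolation sketch: find alpha so that the blend
# alpha*w_model + (1-alpha)*w_minvar has exactly the target volatility.
# w_model is a random stand-in for the learned portfolio (an assumption).
import numpy as np
from scipy.optimize import brentq

rng = np.random.default_rng(6)
n = 6
G = rng.normal(size=(n, n))
cov = G @ G.T / n + 0.02 * np.eye(n)

z = np.linalg.solve(cov, np.ones(n))
w_minvar = z / z.sum()                           # lowest-risk anchor
w_model = rng.dirichlet(np.ones(n))              # stand-in learned portfolio

def blend_vol(alpha):
    """Volatility of alpha*w_model + (1-alpha)*w_minvar; monotone in alpha."""
    w = alpha * w_model + (1.0 - alpha) * w_minvar
    return float(np.sqrt(w @ cov @ w))

lo, hi = blend_vol(0.0), blend_vol(1.0)
target = 0.5 * (lo + hi)                         # any level in [lo, hi] works
alpha = brentq(lambda a: blend_vol(a) - target, 0.0, 1.0)
print(f"alpha={alpha:.3f}  vol={blend_vol(alpha):.4f}  target={target:.4f}")
```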

December 4, 2024 · 2 min · Research Team

Multiscale Markowitz

ArXiv ID: 2411.13792 “View on arXiv” Authors: Unknown Abstract Traditional Markowitz portfolio optimization constrains daily portfolio variance to a target value, optimising returns, Sharpe or variance within this constraint. However, this approach overlooks the relationship between variance at different time scales, typically described by $\sigma(\Delta t) \propto (\Delta t)^{H}$ where $H$ is the Hurst exponent, most of the time assumed to be $\frac{1}{2}$. This paper introduces a multifrequency optimization framework that allows investors to specify target portfolio variance across a range of frequencies, characterized by a target Hurst exponent $H_{\text{target}}$, or optimize the portfolio at multiple time scales. By incorporating this scaling behavior, we enable a more nuanced and comprehensive risk management strategy that aligns with investor preferences at various time scales. This approach effectively manages portfolio risk across multiple frequencies and adapts to different market conditions, providing a robust tool for dynamic asset allocation. This overcomes some of the traditional limitations of Markowitz, when it comes to dealing with crashes, regime changes, volatility clustering or multifractality in markets. We illustrate this concept with a toy example and discuss the practical implementation for assets with varying scaling behaviors. ...
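
A minimal sketch of the scaling diagnostic underlying the framework: estimate a portfolio's Hurst exponent from $\sigma(\Delta t) \propto (\Delta t)^{H}$ by regressing log aggregated-return volatility on log horizon. The horizon grid is an arbitrary choice, and i.i.d. synthetic returns should recover $H \approx \frac{1}{2}$.

```python
# Estimate the Hurst exponent H in sigma(dt) ~ dt**H from the slope of a
# log-log regression of aggregated-return volatility on horizon. Synthetic
# i.i.d. returns should give H close to 1/2; the horizon grid is arbitrary.
import numpy as np

rng = np.random.default_rng(7)
rets = rng.normal(0.0, 0.01, size=20000)          # daily portfolio returns

scales = np.array([1, 2, 5, 10, 21, 63])          # horizons (days)
sigmas = []
for dt in scales:
    trimmed = rets[: len(rets) // dt * dt]
    agg = trimmed.reshape(-1, dt).sum(axis=1)     # non-overlapping dt-returns
    sigmas.append(agg.std())

H, _ = np.polyfit(np.log(scales), np.log(sigmas), 1)
print(f"estimated H = {H:.3f} (about 0.5 for i.i.d. returns)")
```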

November 21, 2024 · 2 min · Research Team

Schur Complementary Allocation: A Unification of Hierarchical Risk Parity and Minimum Variance Portfolios

ArXiv ID: 2411.05807 “View on arXiv” Authors: Unknown Abstract Despite many attempts to make optimization-based portfolio construction in the spirit of Markowitz robust and approachable, it is far from universally adopted. Meanwhile, the collection of more heuristic divide-and-conquer approaches was revitalized by Lopez de Prado, who introduced Hierarchical Risk Parity (HRP). This paper reveals the hidden connection between these seemingly disparate approaches. ...
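
A hedged sketch of the interpolation this unification suggests: a parameter γ that moves block-wise allocation from ignoring inter-block covariance (γ = 0, HRP-like) to using full Schur complements (γ = 1, which reproduces the global minimum-variance portfolio exactly). The augmented blocks and vectors below follow Cotton's construction as I understand it; treat the details as assumptions.

```python
# Gamma-interpolated block allocation: augmented blocks A_g = A - g*B D^-1 B.T
# (and symmetrically for D) with matching augmented ones-vectors. g=0 yields
# block-local min-variance portfolios; g=1 recovers global min-variance.
# The exact augmentation is my reading of the construction, an assumption.
import numpy as np

def schur_weights(cov, k, gamma):
    n = cov.shape[0]
    A, B, D = cov[:k, :k], cov[:k, k:], cov[k:, k:]
    A_g = A - gamma * B @ np.linalg.solve(D, B.T)
    D_g = D - gamma * B.T @ np.linalg.solve(A, B)
    b_a = np.ones(k) - gamma * B @ np.linalg.solve(D, np.ones(n - k))
    b_d = np.ones(n - k) - gamma * B.T @ np.linalg.solve(A, np.ones(k))
    w = np.concatenate([np.linalg.solve(A_g, b_a), np.linalg.solve(D_g, b_d)])
    return w / w.sum()

rng = np.random.default_rng(8)
G = rng.normal(size=(6, 6))
cov = G @ G.T / 6 + 0.1 * np.eye(6)
z = np.linalg.solve(cov, np.ones(6))
print("gamma=0:", schur_weights(cov, 3, 0.0).round(3))  # block-local only
print("gamma=1:", schur_weights(cov, 3, 1.0).round(3))  # full Schur complement
print("min-var:", (z / z.sum()).round(3))               # matches gamma=1
```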

October 29, 2024 · 2 min · Research Team

Sample Average Approximation for Portfolio Optimization under CVaR constraint in an (re)insurance context

ArXiv ID: 2410.10239 “View on arXiv” Authors: Unknown Abstract We consider optimal allocation problems with a Conditional Value-at-Risk (CVaR) constraint. We prove, under very mild assumptions, the convergence of the Sample Average Approximation (SAA) method applied to this problem, exhibit a convergence rate, and discuss the uniqueness of the solution. These results give (re)insurers a practical way to optimize portfolios under market regulatory constraints, i.e. a prescribed level of risk. ...
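
A sketch of the sampled program the abstract studies, using the standard Rockafellar–Uryasev reformulation to express the sample-average CVaR constraint as linear inequalities. The scenario set, confidence level β, and CVaR cap are illustrative; the paper's contribution is the convergence analysis, not this LP itself.

```python
# SAA sketch: maximize expected return subject to a sampled CVaR constraint,
# linearized with the Rockafellar-Uryasev auxiliary variables (t, u_i).
# Scenario returns, beta, and the CVaR cap are illustrative assumptions.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(9)
N, n = 1000, 5                                   # scenarios, assets
R = rng.normal(0.001, 0.02, size=(N, n))         # sampled return scenarios
beta, cap = 0.95, 0.03                           # CVaR level and budget

# Variables x = (w_1..w_n, t, u_1..u_N); objective: minimize -E[return].
c = np.concatenate([-R.mean(axis=0), [0.0], np.zeros(N)])

# u_i >= loss_i - t with loss_i = -r_i.w  <=>  -r_i.w - t - u_i <= 0
A1 = np.hstack([-R, -np.ones((N, 1)), -np.eye(N)])
# Sampled CVaR:  t + (1 / ((1-beta) N)) * sum_i u_i <= cap
A2 = np.concatenate([np.zeros(n), [1.0], np.full(N, 1.0 / ((1 - beta) * N))])
A_ub, b_ub = np.vstack([A1, A2]), np.concatenate([np.zeros(N), [cap]])

A_eq = np.concatenate([np.ones(n), [0.0], np.zeros(N)]).reshape(1, -1)
bounds = [(0, None)] * n + [(None, None)] + [(0, None)] * N  # long-only, t free

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0], bounds=bounds)
print("weights:", res.x[:n].round(3), " E[return]:", round(-(c @ res.x), 5))
```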

October 14, 2024 · 2 min · Research Team