
On Accelerating Large-Scale Robust Portfolio Optimization

On Accelerating Large-Scale Robust Portfolio Optimization ArXiv ID: 2408.07879 Authors: Unknown Abstract Solving large-scale robust portfolio optimization problems is challenging due to the high computational demands associated with an increasing number of assets, the amount of data considered, and market uncertainty. To address this issue, we propose an extended supporting hyperplane approximation approach for efficiently solving a class of distributionally robust portfolio problems for a general class of additively separable utility functions and a polyhedral ambiguity set of distributions, applied to a large-scale set of assets. Our technique is validated using a large-scale portfolio of the S&P 500 index constituents, demonstrating robust out-of-sample trading performance. More importantly, our empirical studies show that this approach significantly reduces computational time compared to traditional concave Expected Log-Growth (ELG) optimization, with running times decreasing from several thousand seconds to just a few. This method provides a scalable and practical solution to large-scale robust portfolio optimization, addressing both theoretical and practical challenges. ...
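The speed-up rests on a classical idea: because log is concave, every tangent line lies above it, so the concave ELG objective can be replaced by the pointwise minimum of supporting hyperplanes and attacked with linear programming machinery. A minimal sketch of that approximation (the tangent grid and tolerances are illustrative assumptions, not the paper's exact construction):

```python
import numpy as np

def log_hyperplane_approx(x, tangent_points):
    """Outer approximation of log(x) by supporting hyperplanes.
    Each tangent at x0 satisfies log(x) <= log(x0) + (x - x0)/x0,
    so the pointwise minimum over a grid of tangents is a
    piecewise-linear upper bound that an LP solver can handle."""
    x0 = np.asarray(tangent_points, dtype=float)
    x = np.asarray(x, dtype=float)
    planes = np.log(x0) + (x[:, None] - x0) / x0   # (len(x), len(x0))
    return planes.min(axis=1)

grid = np.linspace(0.5, 1.5, 50)    # tangent points over plausible gross returns
x = np.array([0.9, 1.0, 1.1])
approx = log_hyperplane_approx(x, grid)
assert np.all(approx >= np.log(x) - 1e-12)   # never below the true log
assert np.max(approx - np.log(x)) < 1e-3     # tight on a fine grid
```

The approximation error shrinks quadratically with the tangent spacing, which is why a modest grid already gives near-exact ELG values at LP speed.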

August 15, 2024 · 2 min · Research Team

Operator Deep Smoothing for Implied Volatility

Operator Deep Smoothing for Implied Volatility ArXiv ID: 2406.11520 Authors: Unknown Abstract We devise a novel method for nowcasting implied volatility based on neural operators. Better known as implied volatility smoothing in the financial industry, nowcasting of implied volatility means constructing a smooth surface that is consistent with the prices presently observed on a given option market. Option price data arises highly dynamically in ever-changing spatial configurations, which poses a major limitation to foundational machine learning approaches using classical neural networks. While large models in language and image processing deliver breakthrough results on vast corpora of raw data, in financial engineering the generalization from big historical datasets has been hindered by the need for considerable data pre-processing. In particular, implied volatility smoothing has remained an instance-by-instance, hands-on process both for neural network-based and traditional parametric strategies. Our general operator deep smoothing approach, instead, directly maps observed data to smoothed surfaces. We adapt the graph neural operator architecture to do so with high accuracy on ten years of raw intraday S&P 500 options data, using a single model instance. The trained operator adheres to critical no-arbitrage constraints and is robust with respect to subsampling of inputs (occurring in practice in the context of outlier removal). We provide extensive historical benchmarks and showcase the generalization capability of our approach in a comparison with classical neural networks and SVI, an industry standard parametrization for implied volatility. The operator deep smoothing approach thus opens up the use of neural networks on large historical datasets in financial engineering. ...
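For reference, SVI — the industry-standard benchmark the paper compares against — parametrizes each maturity's total implied variance smile with five parameters. A minimal sketch of the raw SVI formula (the parameter values below are illustrative, not fitted to any market data):

```python
import numpy as np

def svi_total_variance(k, a, b, rho, m, sigma):
    """Raw SVI parametrization of the total implied variance smile
    (Gatheral): w(k) = a + b * (rho*(k - m) + sqrt((k - m)^2 + sigma^2)),
    where k is log-moneyness."""
    k = np.asarray(k, dtype=float)
    return a + b * (rho * (k - m) + np.sqrt((k - m) ** 2 + sigma ** 2))

k = np.linspace(-1.0, 1.0, 201)
w = svi_total_variance(k, a=0.04, b=0.4, rho=-0.4, m=0.0, sigma=0.1)
assert np.all(w > 0)                    # positive total variance
assert np.all(np.diff(w, 2) > -1e-12)   # convex in k for this parameter set
```

The convexity check here is only a sanity check on this particular fit, not the full no-butterfly-arbitrage condition that the paper's trained operator is required to respect.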

June 17, 2024 · 2 min · Research Team

Subset second-order stochastic dominance for enhanced indexation with diversification enforced by sector constraints

Subset second-order stochastic dominance for enhanced indexation with diversification enforced by sector constraints ArXiv ID: 2404.16777 Authors: Unknown Abstract In this paper we apply second-order stochastic dominance (SSD) to the problem of enhanced indexation with asset subset (sector) constraints. The problem we consider is how to construct a portfolio that is designed to outperform a given market index whilst having regard to the proportion of the portfolio invested in distinct market sectors. In our approach, subset SSD, the portfolio associated with each sector is treated in an SSD manner. In other words, in subset SSD we actively try to find sector portfolios that SSD-dominate their respective sector indices. However, the proportion of the overall portfolio invested in each sector is not pre-specified; rather, it is decided via optimisation. Our subset SSD approach involves the numeric solution of a multivariate second-order stochastic dominance problem. Computational results are given for our approach as applied to the S&P 500 over the period 3rd October 2018 to 29th December 2023. This period of over five years includes the Covid pandemic, which had a significant effect on stock prices. The S&P 500 data that we have used is made publicly available for the benefit of future researchers. Our computational results indicate that the scaled version of our subset SSD approach outperforms the S&P 500. Our approach also outperforms the standard SSD-based approach to the problem. Our results show that, for the S&P 500 data considered, including sector constraints improves out-of-sample performance, irrespective of the SSD approach adopted. Results are also given for Fama-French data involving 49 industry portfolios, and these confirm the effectiveness of our subset SSD approach. ...
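For equiprobable return scenarios, testing whether one portfolio SSD-dominates another reduces to comparing cumulated sorted returns ("tails"): the dominating portfolio's partial sums of worst outcomes must all be at least as large. A minimal sketch of that test (the toy scenarios are made up for illustration):

```python
import numpy as np

def ssd_dominates(x, y):
    """Scenario-based SSD test for equiprobable scenarios: x dominates y
    iff every partial sum of the ascending-sorted returns of x is >= the
    corresponding partial sum for y (the cumulated-tails characterization)."""
    x_tails = np.cumsum(np.sort(np.asarray(x, dtype=float)))
    y_tails = np.cumsum(np.sort(np.asarray(y, dtype=float)))
    return bool(np.all(x_tails >= y_tails - 1e-12))

index = np.array([-0.02, 0.00, 0.01, 0.03])      # sector index scenarios
portfolio = np.array([-0.01, 0.00, 0.02, 0.03])  # candidate sector portfolio
assert ssd_dominates(portfolio, index)
assert not ssd_dominates(index, portfolio)
```

In the paper's subset SSD formulation this kind of condition is imposed per sector inside an optimisation model, rather than checked after the fact as here.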

April 25, 2024 · 3 min · Research Team

Portfolio management using graph centralities: Review and comparison

Portfolio management using graph centralities: Review and comparison ArXiv ID: 2404.00187 Authors: Unknown Abstract We investigate an application of network centrality measures to portfolio optimization, generalizing the method in [Pozzi, Di Matteo and Aste, “Spread of risks across financial markets: better to invest in the peripheries”, Scientific Reports 3:1665, 2013], which had significant limitations with respect to the state of the art in network theory. In this paper, we systematically compare many possible variants of the originally proposed method on S&P 500 stocks. We use twenty-seven years of daily data as the training set and the following year as the test set. We then select the best network-based methods according to different criteria, including the highest Sharpe ratio and the highest expected return. We place particular emphasis on new centrality measures and conduct a thorough analysis, which reveals significantly stronger results than those obtained with more traditional methods. According to our analysis, this graph-theoretical approach to investment can be used successfully by investors with different investment profiles, leading to high risk-adjusted returns. ...
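A bare-bones version of the periphery-based pipeline — build a correlation network, score assets by eigenvector centrality, invest in the least central ones — might look like the sketch below. This is a simplified illustration under made-up data, not any specific variant compared in the paper:

```python
import numpy as np

def peripheral_assets(returns, n_select):
    """Rank assets by eigenvector centrality of the correlation network
    and return indices of the most peripheral ones, in the spirit of the
    'invest in the peripheries' heuristic. returns: (T, N) array."""
    corr = np.corrcoef(returns, rowvar=False)
    adj = np.abs(corr) - np.eye(corr.shape[0])   # off-diagonal weights only
    v = np.ones(adj.shape[0])
    for _ in range(200):                         # power iteration
        v = adj @ v
        v /= np.linalg.norm(v)
    return np.argsort(v)[:n_select]              # lowest centrality = periphery

rng = np.random.default_rng(0)
f = rng.normal(size=500)                  # common "core" factor
a0 = f + 0.1 * rng.normal(size=500)       # core asset, highly connected
a1 = f + 0.1 * rng.normal(size=500)       # core asset, highly connected
a2 = rng.normal(size=500)                 # independent, peripheral asset
a3 = rng.normal(size=500)                 # independent, peripheral asset
returns = np.column_stack([a0, a1, a2, a3])
assert set(peripheral_assets(returns, 2)) == {2, 3}
```

The paper's variants differ precisely in the pieces hard-coded here: the network construction (e.g. filtered graphs rather than the full correlation matrix) and the centrality measure used for ranking.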

March 29, 2024 · 2 min · Research Team

Bitcoin versus S&P 500 Index: Return and Risk Analysis

Bitcoin versus S&P 500 Index: Return and Risk Analysis ArXiv ID: 2310.02436 Authors: Unknown Abstract The S&P 500 index is considered the most popular trading instrument in financial markets. With the rise of cryptocurrencies over the past years, Bitcoin has also grown in popularity and adoption. The paper aims to analyze the daily return distributions of Bitcoin and the S&P 500 index and assess their tail probabilities through two financial risk measures. As a methodology, we use Bitcoin and S&P 500 index daily return data to fit the seven-parameter General Tempered Stable (GTS) distribution, using an enhanced Fast Fractional Fourier Transform (FRFT) scheme that combines the FRFT algorithm with the 12-point composite Newton-Cotes quadrature rule. The findings show that peakedness is the main characteristic of the S&P 500 return distribution, whereas heavy-tailedness is the main characteristic of the Bitcoin return distribution. The GTS distribution shows that 80.05% of S&P 500 returns fall between -1.06% and 1.23%, against only 40.32% of Bitcoin returns. At a risk level α, the severity of the loss AVaR_α(X) on the left side of the distribution is larger than the severity of the profit AVaR_{1-α}(X) on the right side of the distribution. Compared to the S&P 500 index, Bitcoin is 39.73% more likely to produce high daily returns (more than 1.23% or less than -1.06%). The severity analysis shows that, at a given risk level α, the average value-at-risk AVaR(X) of Bitcoin returns is, to one significant figure, four times larger than that of S&P 500 index returns at the same risk level. ...
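The AVaR (average value-at-risk, also called expected shortfall) measure used above has a simple empirical counterpart: the mean loss over the worst α-fraction of observed returns. A minimal sketch (the sample returns are illustrative, not market data):

```python
import numpy as np

def avar(returns, alpha):
    """Empirical average value-at-risk at level alpha: the mean loss over
    the worst alpha-fraction of returns, reported as a positive number."""
    r = np.sort(np.asarray(returns, dtype=float))   # ascending: worst first
    k = max(1, int(np.ceil(alpha * len(r))))        # size of the worst tail
    return -r[:k].mean()

sample = np.array([-0.05, -0.02, -0.01, 0.00, 0.01, 0.02, 0.03, 0.04])
assert np.isclose(avar(sample, 0.25), 0.035)   # mean of the worst 2 of 8
```

The paper instead computes AVaR from the fitted parametric GTS distribution, which is what makes the tail comparison between Bitcoin and the S&P 500 meaningful beyond the observed sample.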

October 3, 2023 · 2 min · Research Team

Media Moments and Corporate Connections: A Deep Learning Approach to Stock Movement Classification

Media Moments and Corporate Connections: A Deep Learning Approach to Stock Movement Classification ArXiv ID: 2309.06559 Authors: Unknown Abstract The financial industry faces great challenges in risk modeling and profit generation, both of which are intricately tied to the prediction of stock movements. A stock forecaster must untangle the randomness and ever-changing behavior of the stock market. Stock movements are influenced by a myriad of factors, including company history, performance, and economic-industry connections. However, other factors are not traditionally included, such as social media activity and correlations between stocks. Social platforms such as Reddit, Facebook, and X (Twitter) create opportunities for niche communities to share their sentiment on financial assets. By aggregating these opinions from social media across various mediums such as posts, interviews, and news updates, we propose a more holistic approach that incorporates these “media moments” into stock market movement prediction. We introduce a method that combines financial data, social media, and correlated stock relationships via a graph neural network in a hierarchical temporal fashion. Through numerous trials on current S&P 500 index data, with results showing a 28% improvement in cumulative returns, we provide empirical evidence of our tool’s applicability to investment decisions. ...

September 8, 2023 · 2 min · Research Team

Are there Dragon Kings in the Stock Market?

Are there Dragon Kings in the Stock Market? ArXiv ID: 2307.03693 Authors: Unknown Abstract We undertake a systematic study of historic market volatility spanning roughly the five preceding decades. We focus specifically on the time series of realized volatility (RV) of the S&P 500 index and its distribution function. As expected, the largest values of RV coincide with the largest economic upheavals of the period: the Savings and Loan Crisis, the Tech Bubble, the Financial Crisis, and the Covid Pandemic. We address the question of whether these values belong to one of three categories: Black Swans (BS), that is, values lying on the scale-free, power-law tails of the distribution; Dragon Kings (DK), defined as statistically significant upward deviations from BS; or Negative Dragon Kings (nDK), defined as statistically significant downward deviations from BS. In analyzing the tails of the distribution with RV > 40, we observe the appearance of “potential” DK which eventually terminate in an abrupt plunge to nDK. This phenomenon becomes more pronounced as the number of days over which the average RV is calculated increases – here from daily, n=1, to “monthly,” n=21. We fit the entire distribution with a modified Generalized Beta (mGB) distribution function, which terminates at a finite value of the variable but exhibits a long power-law stretch prior to that, as well as with a Generalized Beta Prime (GB2) distribution function, which has a power-law tail. We also fit the tails directly with a straight line on a log-log scale. In order to ascertain BS, DK, or nDK behavior, all fits include confidence intervals, and p-values are evaluated for the data points to check whether they can come from the respective distributions. ...
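The straight-line log-log tail fit mentioned above can be illustrated on synthetic power-law data, where the slope of the empirical CCDF recovers the tail exponent. This is only a sketch of the basic technique; the paper's actual fits add confidence intervals and p-values to distinguish BS, DK, and nDK behavior:

```python
import numpy as np

def tail_exponent(samples, threshold):
    """Least-squares slope of the empirical CCDF on log-log axes for
    samples above a threshold. A straight line is the Black Swan
    (power-law) signature; the slope estimates the tail exponent."""
    x = np.sort(samples[samples > threshold])
    ccdf = 1.0 - np.arange(x.size) / x.size   # empirical P(X > x) in the tail
    slope, _ = np.polyfit(np.log(x), np.log(ccdf), 1)
    return slope

rng = np.random.default_rng(1)
# Pareto tail with exponent 2: P(X > x) = x**-2 for x >= 1 (inverse CDF sampling)
samples = (1.0 - rng.random(200_000)) ** -0.5
assert abs(tail_exponent(samples, 5.0) + 2.0) < 0.3
```

Deviations from this straight line — upward (DK) or downward (nDK) — are exactly what the paper tests for statistical significance at the extreme end of the RV distribution.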

July 7, 2023 · 2 min · Research Team

On Unified Adaptive Portfolio Management

On Unified Adaptive Portfolio Management ArXiv ID: 2307.03391 Authors: Unknown Abstract This paper introduces a unified framework for adaptive portfolio management, integrating dynamic Black-Litterman (BL) optimization with a general factor model, Elastic Net regression, and mean-variance portfolio optimization, which allows us to generate investors’ views and systematically mitigate potential estimation errors. Specifically, we propose an innovative dynamic sliding-window algorithm to respond to constantly changing market conditions. This algorithm allows the window size to be adjusted flexibly based on market volatility, generating robust estimates for factor modeling, time-varying BL estimation, and optimal portfolio weights. Through extensive ten-year empirical studies using the top 100 capitalized assets in the S&P 500 index, accounting for turnover transaction costs, we demonstrate that this combined approach leads to computational advantages and promising trading performance. ...
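The volatility-dependent window adjustment can be sketched as follows. The specific scaling rule, bounds, and reference volatility below are illustrative assumptions for exposition, not the paper's algorithm:

```python
import numpy as np

def adaptive_window(returns, base=252, lo=63, hi=504, ref_vol=0.01):
    """Illustrative volatility-scaled estimation window (an assumption,
    not the paper's exact rule): shrink the lookback when recent
    volatility is high so estimates react faster, and lengthen it in
    calm regimes to reduce estimation error."""
    recent_vol = np.std(returns[-21:])            # last month's volatility
    w = int(base * ref_vol / max(recent_vol, 1e-8))
    return int(np.clip(w, lo, hi))                # keep within sane bounds

rng = np.random.default_rng(2)
calm = rng.normal(0.0, 0.005, size=252)       # low-volatility regime
stressed = rng.normal(0.0, 0.03, size=252)    # high-volatility regime
assert adaptive_window(calm) > adaptive_window(stressed)
```

Whatever the exact rule, the resulting window length then feeds every downstream estimate — factor loadings, BL views, and the mean-variance weights — which is what makes the framework "unified."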

July 7, 2023 · 2 min · Research Team

ESG Rating Disagreement and Stock Returns

ESG Rating Disagreement and Stock Returns ArXiv ID: ssrn-3433728 Authors: Unknown Abstract Using ESG ratings from seven different data providers for a sample of S&P 500 firms between 2010 and 2017, we study the relation between ESG rating disagreement and stock returns. ...

Keywords: ESG Ratings, Corporate Governance, Sustainability Disclosure, Firm Performance, S&P 500

Complexity vs Empirical Score
Math Complexity: 3.0/10
Empirical Rigor: 7.5/10
Quadrant: Street Traders
Why: The paper relies heavily on empirical data analysis (correlations, panel regressions, firm characteristics) with a focus on backtest-ready financial metrics like stock returns and equity cost of capital, but the mathematical modeling is limited to standard econometric techniques without advanced theory or derivations.

flowchart TD
    A["Research Goal: Impact of ESG Rating Disagreement<br>on Stock Returns for S&P 500 Firms"] --> B["Data Inputs<br>2010-2017, S&P 500, 7 ESG Providers"]
    B --> C["Methodology: Calculate ESG Disagreement<br>across providers"]
    C --> D["Methodology: Regression Analysis<br>ESG Disagreement vs. Stock Returns"]
    D --> E{"Key Findings"}
    E --> F["Higher ESG Disagreement<br>associated with Lower Stock Returns"]
    E --> G["Disagreement mediates<br>the ESG-Performance relationship"]
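The disagreement measure in the flowchart — dispersion of a firm's ratings across providers after putting the providers on a comparable scale — can be sketched as follows. The z-score normalization is an illustrative assumption, not necessarily the paper's construction:

```python
import numpy as np

def disagreement(ratings):
    """ESG rating disagreement per firm. ratings: (providers, firms) array.
    Each provider's scale is z-scored across firms so providers are
    comparable; disagreement is then the cross-provider standard
    deviation of the normalized ratings for each firm."""
    r = np.asarray(ratings, dtype=float)
    z = (r - r.mean(axis=1, keepdims=True)) / r.std(axis=1, keepdims=True)
    return z.std(axis=0)

# three hypothetical providers rating three firms on 0-100 scales;
# providers agree on firm 0 but split sharply on firm 2
ratings = np.array([
    [90.0, 50.0, 10.0],
    [88.0, 52.0, 40.0],
    [92.0, 48.0, 70.0],
])
d = disagreement(ratings)
assert d[2] > d[0]
```

A firm-level disagreement series like this is then the right-hand-side variable in the return regressions the flowchart describes.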

August 10, 2019 · 1 min · Research Team