
AlphaAgents: Large Language Model based Multi-Agents for Equity Portfolio Constructions

AlphaAgents: Large Language Model based Multi-Agents for Equity Portfolio Constructions ArXiv ID: 2508.11152 “View on arXiv” Authors: Tianjiao Zhao, Jingrao Lyu, Stokes Jones, Harrison Garber, Stefano Pasquali, Dhagash Mehta Abstract The field of artificial intelligence (AI) agents is evolving rapidly, driven by the capabilities of Large Language Models (LLMs) to autonomously perform and refine tasks with human-like efficiency and adaptability. In this context, multi-agent collaboration has emerged as a promising approach, enabling multiple AI agents to work together to solve complex challenges. This study investigates the application of role-based multi-agent systems to support stock selection in equity research and portfolio management. We present a comprehensive analysis performed by a team of specialized agents and evaluate their stock-picking performance against established benchmarks under varying levels of risk tolerance. Furthermore, we examine the advantages and limitations of employing multi-agent frameworks in equity analysis, offering critical insights into their practical efficacy and implementation challenges. ...
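The paper's agent implementation is not reproduced in this excerpt, so the following is only a minimal sketch of how a role-based multi-agent stock screener can be wired together: three hypothetical analyst roles (fundamental, sentiment, valuation) score a ticker and their views are blended under a risk-tolerance parameter. The role names, prompt, scoring scale, and aggregation rule are assumptions for illustration, not the authors' design.

```python
"""Hypothetical sketch of a role-based multi-agent stock screener.

The agent roles, scoring convention, and risk-weighted aggregation below are
assumptions for illustration; the paper's actual prompts and workflow differ.
"""
from dataclasses import dataclass
from typing import Callable, Dict, List


@dataclass
class AgentOpinion:
    role: str
    ticker: str
    score: float   # -1 (strong sell) .. +1 (strong buy), assumed convention
    rationale: str


def make_agent(role: str, llm: Callable[[str], str]) -> Callable[[str, str], AgentOpinion]:
    """Wrap an LLM callable into a role-conditioned analyst agent."""
    def analyze(ticker: str, context: str) -> AgentOpinion:
        prompt = (f"You are a {role} analyst. Given the following material on "
                  f"{ticker}, return a score in [-1, 1] and a one-line rationale.\n{context}")
        reply = llm(prompt)                      # expected format: "0.4 | improving margins"
        score_text, _, rationale = reply.partition("|")
        return AgentOpinion(role, ticker, float(score_text), rationale.strip())
    return analyze


def aggregate(opinions: List[AgentOpinion], risk_tolerance: float) -> float:
    """Blend role scores; higher risk tolerance upweights the sentiment view (assumed rule)."""
    weights: Dict[str, float] = {
        "fundamental": 1.0,
        "valuation": 1.0,
        "sentiment": 0.5 + risk_tolerance,       # assumption: risk-seeking investors trust sentiment more
    }
    total = sum(weights[o.role] for o in opinions)
    return sum(weights[o.role] * o.score for o in opinions) / total


if __name__ == "__main__":
    # Stub LLM so the sketch runs without an API; swap in a real client in practice.
    def fake_llm(prompt: str) -> str:
        return "0.2 | placeholder rationale"

    roles = ["fundamental", "sentiment", "valuation"]
    agents = [make_agent(r, fake_llm) for r in roles]
    opinions = [agent("ACME", "filings, news, and price history would go here") for agent in agents]
    print("aggregate score:", round(aggregate(opinions, risk_tolerance=0.3), 3))
```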

August 15, 2025 · 2 min · Research Team

Artificially Intelligent, Naturally Inefficient? Service Quality Investments and the Efficiency Trap in Australian Banking

Artificially Intelligent, Naturally Inefficient? Service Quality Investments and the Efficiency Trap in Australian Banking ArXiv ID: ssrn-5379457 “View on arXiv” Authors: Unknown Abstract This paper questions whether the current surge in artificial intelligence (AI) investment within the Australian banking sector will achieve the efficiency gains ...
Keywords: Artificial Intelligence, Banking Efficiency, AI Investment, Digital Transformation, Equities
Complexity vs Empirical Score · Math Complexity: 1.0/10 · Empirical Rigor: 2.0/10 · Quadrant: Philosophers. Why: The paper focuses on economic theory and qualitative assessment of AI investments in banking, with no advanced mathematics or quantitative modeling presented. Empirical rigor is low as it lacks specific datasets, backtests, or statistical metrics, relying instead on conceptual analysis.
Paper flow (from the article's diagram): Research Question (Will AI investments in Australian banks achieve expected efficiency gains?) → Methodology → Data: ASX-listed banks, 2015-2023 → Computational Analysis (DEA + regression models) → Key Findings: (1) diminishing returns on AI investment; (2) an efficiency trap; (3) a quality-service trade-off that offsets automation gains.
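The article's diagram mentions DEA (data envelopment analysis) plus regression models; since the paper itself is not reproduced here, the snippet below is only a generic input-oriented DEA efficiency score computed by linear programming over invented bank data. Nothing about the actual ASX dataset or model specification is taken from the paper.

```python
"""Generic input-oriented DEA (CCR) efficiency scores via linear programming.

Illustrative only: the bank inputs/outputs below are invented, not the paper's
ASX data, and the paper may use a different DEA specification.
"""
import numpy as np
from scipy.optimize import linprog

# Rows = decision-making units (banks); columns = inputs / outputs (made-up numbers).
inputs = np.array([[3.0, 5.0],    # e.g. operating cost, AI/IT spend
                   [2.5, 7.0],
                   [4.0, 4.0],
                   [3.5, 6.5]])
outputs = np.array([[10.0],       # e.g. an income proxy
                    [11.0],
                    [9.0],
                    [12.0]])

n, m = inputs.shape
s = outputs.shape[1]

def dea_efficiency(o: int) -> float:
    """Input-oriented CCR score for unit o: minimise theta such that a peer mix dominates unit o."""
    # Decision vector: [theta, lambda_1, ..., lambda_n]
    c = np.r_[1.0, np.zeros(n)]
    A_ub, b_ub = [], []
    for i in range(m):                      # sum_j lambda_j * x_ij <= theta * x_io
        A_ub.append(np.r_[-inputs[o, i], inputs[:, i]])
        b_ub.append(0.0)
    for r in range(s):                      # sum_j lambda_j * y_rj >= y_ro
        A_ub.append(np.r_[0.0, -outputs[:, r]])
        b_ub.append(-outputs[o, r])
    bounds = [(0, None)] * (n + 1)
    res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub), bounds=bounds, method="highs")
    return float(res.x[0])

for o in range(n):
    print(f"bank {o}: efficiency = {dea_efficiency(o):.3f}")
```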

August 12, 2025 · 1 min · Research Team

DiffVolume: Diffusion Models for Volume Generation in Limit Order Books

DiffVolume: Diffusion Models for Volume Generation in Limit Order Books ArXiv ID: 2508.08698 “View on arXiv” Authors: Zhuohan Wang, Carmine Ventre Abstract Modeling limit order book (LOB) dynamics is a fundamental problem in market microstructure research. In particular, generating high-dimensional volume snapshots with strong temporal and liquidity-dependent patterns remains a challenging task, despite recent work exploring the application of Generative Adversarial Networks to LOBs. In this work, we propose a conditional Diffusion model for the generation of future LOB Volume snapshots (DiffVolume). We evaluate our model across three axes: (1) Realism, where we show that DiffVolume, conditioned on past volume history and time of day, better reproduces statistical properties such as marginal distribution, spatial correlation, and autocorrelation decay; (2) Counterfactual generation, allowing for controllable generation under hypothetical liquidity scenarios by additionally conditioning on a target future liquidity profile; and (3) Downstream prediction, where we show that the synthetic counterfactual data from our model improves the performance of future liquidity forecasting models. Together, these results suggest that DiffVolume provides a powerful and flexible framework for realistic and controllable LOB volume generation. ...
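DiffVolume's architecture is not detailed in the excerpt; as a minimal sketch of the general idea, a denoising diffusion model for a volume vector conditioned on past volume history and time of day, here is a toy DDPM-style training step in PyTorch. The network, conditioning scheme, noise schedule, and hyperparameters are placeholders, not the authors'.

```python
"""Toy conditional DDPM training step for LOB volume snapshots (illustrative only).

Assumptions: the snapshot is a flat vector of L price levels and the condition is
[flattened past-volume history, time-of-day]; DiffVolume's real design may differ.
"""
import torch
import torch.nn as nn

L, HIST, BATCH, T = 20, 5, 64, 200             # levels, history length, batch size, diffusion steps

betas = torch.linspace(1e-4, 2e-2, T)           # linear noise schedule (placeholder)
alphas_bar = torch.cumprod(1.0 - betas, dim=0)

class EpsNet(nn.Module):
    """Predicts the injected noise from (noisy volume, condition, timestep)."""
    def __init__(self) -> None:
        super().__init__()
        cond_dim = L * HIST + 1                 # past volumes + time-of-day scalar
        self.net = nn.Sequential(
            nn.Linear(L + cond_dim + 1, 256), nn.SiLU(),
            nn.Linear(256, 256), nn.SiLU(),
            nn.Linear(256, L),
        )

    def forward(self, x_t, cond, t):
        t_feat = (t.float() / T).unsqueeze(1)   # crude timestep embedding
        return self.net(torch.cat([x_t, cond, t_feat], dim=1))

model = EpsNet()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

# Fake batch: future volume snapshot x0, past history, and time of day in [0, 1).
x0 = torch.rand(BATCH, L)
cond = torch.cat([torch.rand(BATCH, L * HIST), torch.rand(BATCH, 1)], dim=1)

t = torch.randint(0, T, (BATCH,))
eps = torch.randn_like(x0)
a_bar = alphas_bar[t].unsqueeze(1)
x_t = a_bar.sqrt() * x0 + (1.0 - a_bar).sqrt() * eps     # forward (noising) process

loss = nn.functional.mse_loss(model(x_t, cond, t), eps)  # standard epsilon-prediction objective
opt.zero_grad(); loss.backward(); opt.step()
print("one training step, loss =", float(loss))
```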

August 12, 2025 · 2 min · Research Team

Identification of phase correlations in Financial Stock Market Turbulence

Identification of phase correlations in Financial Stock Market Turbulence ArXiv ID: 2508.20105 “View on arXiv” Authors: Kiran Sharma, Abhijit Dutta, Rupak Mukherjee Abstract The basis of arbitrage methods depends on the circulation of information within the framework of the financial market. Following the work of Modigliani and Miller, this circulation of information has become a vital part of discussions on financial networks and prediction. The emergence of the efficient market hypothesis by Fama, Fisher, Jensen and Roll in the early 1970s opened the door to discussion of information affecting prices in the market and thereby creating asymmetries and price distortion. Whenever micro- and macroeconomic factors change, there is a high probability of information asymmetry in the market, and this asymmetry of information creates turbulence. Analyzing and interpreting the turbulence caused by such differences in information is crucial to understanding the nature of the stock market through price patterns and fluctuations. Even so, traditional approaches are not capable of analyzing cyclical price fluctuations outside the realm of wave structures in securities prices, and they lack a proper and effective technique for assessing the nature of the financial market. Consequently, analyzing price fluctuations with the theories and computational techniques of mathematical physics allows such cycles to be decomposed, and the decomposed cycles to be interpreted, in order to understand the imprint of information on price formation and discovery and to assess the nature of stock market turbulence. In this regard, the paper provides a spectrum-analysis framework that decomposes pricing patterns and determines pricing behavior, ultimately assisting in examining the nature of turbulence in the National Stock Exchange of India. ...
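The abstract's spectrum-analysis framework is not specified in detail here; as a generic illustration of the first step in such an analysis, the snippet below runs a discrete Fourier decomposition of a synthetic log-price series and reports the dominant cycle lengths. The series and the embedded cycles are invented, and the paper's actual decomposition may differ.

```python
"""FFT-based cycle detection on a synthetic log-price series (illustrative).

This is generic spectral decomposition, not the paper's specific framework; the
embedded 20-day and 63-day cycles are invented for demonstration.
"""
import numpy as np

rng = np.random.default_rng(0)
n_days = 1024
t = np.arange(n_days)

# Synthetic log-price: drift + two hidden cycles + random-walk noise.
log_price = (0.0004 * t
             + 0.02 * np.sin(2 * np.pi * t / 20)
             + 0.03 * np.sin(2 * np.pi * t / 63)
             + 0.01 * rng.standard_normal(n_days).cumsum())

returns = np.diff(log_price)
returns = returns - returns.mean()             # remove drift before spectral analysis

spectrum = np.abs(np.fft.rfft(returns)) ** 2   # power spectrum
freqs = np.fft.rfftfreq(returns.size, d=1.0)   # cycles per day

# Report the dominant periodicities (ignoring the zero-frequency bin).
top = np.argsort(spectrum[1:])[::-1][:5] + 1
for k in top:
    print(f"period ~ {1.0 / freqs[k]:6.1f} days, power = {spectrum[k]:.4g}")
```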

August 12, 2025 · 3 min · Research Team

A Heterogeneous Spatiotemporal GARCH Model: A Predictive Framework for Volatility in Financial Networks

A Heterogeneous Spatiotemporal GARCH Model: A Predictive Framework for Volatility in Financial Networks ArXiv ID: 2508.20101 “View on arXiv” Authors: Atika Aouri, Philipp Otto Abstract We introduce a heterogeneous spatiotemporal GARCH model for geostatistical data or processes on networks, e.g., for modelling and predicting financial return volatility across firms in a latent spatial framework. The model combines classical GARCH(p, q) dynamics with spatially correlated innovations and spatially varying parameters, estimated using local likelihood methods. Spatial dependence is introduced through a geostatistical covariance structure on the innovation process, capturing contemporaneous cross-sectional correlation. This dependence propagates into the volatility dynamics via the recursive GARCH structure, allowing the model to reflect spatial spillovers and contagion effects in a parsimonious and interpretable way. In addition, this modelling framework allows for spatial volatility predictions at unobserved locations. In an empirical application, we demonstrate how the model can be applied to financial stock networks. Unlike other spatial GARCH models, our framework does not rely on a fixed adjacency matrix; instead, spatial proximity is defined in a proxy space constructed from balance sheet characteristics. Using daily log returns of 50 publicly listed firms over a one-year period, we evaluate the model’s predictive performance in a cross-validation study. ...
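As a simplified illustration of the model class described, GARCH dynamics with spatially correlated innovations defined by a geostatistical covariance in a proxy space, the sketch below simulates a homogeneous GARCH(1,1) cross-section whose shocks are correlated through an exponential covariance built from balance-sheet-style coordinates. The spatially varying parameters and local-likelihood estimation of the actual model are omitted, and all parameter values are assumed.

```python
"""Simulate GARCH(1,1) volatilities with spatially correlated innovations.

Simplified illustration: parameters are common across firms and invented; the
paper's model has spatially varying parameters fitted by local likelihood.
"""
import numpy as np

rng = np.random.default_rng(1)
n_firms, n_days = 50, 250

# Proxy-space coordinates from balance-sheet-style characteristics (standardised, invented).
coords = rng.standard_normal((n_firms, 2))
dist = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
Sigma = np.exp(-dist / 0.8)                        # exponential geostatistical covariance
L_chol = np.linalg.cholesky(Sigma + 1e-8 * np.eye(n_firms))

omega, alpha, beta = 1e-6, 0.08, 0.90              # common GARCH(1,1) parameters (assumed)

h = np.full(n_firms, omega / (1 - alpha - beta))   # start at the unconditional variance
returns = np.zeros((n_days, n_firms))
for t in range(n_days):
    z = L_chol @ rng.standard_normal(n_firms)      # spatially correlated shocks
    eps = np.sqrt(h) * z
    returns[t] = eps
    h = omega + alpha * eps**2 + beta * h          # volatility recursion propagates the spatial dependence

print("cross-sectional corr of |returns| (spillover proxy):",
      round(float(np.corrcoef(np.abs(returns).T)[0, 1]), 3))
```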

August 11, 2025 · 2 min · Research Team

Unwitting Markowitz' Simplification of Portfolio Random Returns

Unwitting Markowitz’ Simplification of Portfolio Random Returns ArXiv ID: 2508.08148 “View on arXiv” Authors: Victor Olkhov Abstract In his famous paper, Markowitz (1952) derived the dependence of portfolio random returns on the random returns of its securities. This result allowed Markowitz to obtain his famous expression for portfolio variance. We show that Markowitz’s equation for portfolio random returns and the expression for portfolio variance that results from it describe a simplified approximation of real markets in which the volumes of all consecutive trades with the securities are assumed to be constant during the averaging interval. To show this, we consider an investor who doesn’t trade the shares of the securities in his portfolio. The investor only observes the trades made in the market with his securities and derives the time series that model the trades with his portfolio as with a single security. These time series describe the portfolio return and variance in exactly the same way as the time series of trades with securities describe their returns and variances. The portfolio time series reveal the dependence of portfolio random returns on the random returns of securities and on the ratio of the random volumes of trades with the securities to the random volumes of trades with the portfolio. If we assume that all volumes of the consecutive trades with securities are constant, we obtain Markowitz’s equation for the portfolio’s random returns. The market-based variance of the portfolio accounts for the effects of random fluctuations of the volumes of the consecutive trades. The use of Markowitz variance may give significantly higher or lower estimates than the market-based portfolio variance. ...
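To make the contrast concrete, the toy computation below compares the standard Markowitz portfolio variance w'Σw (which implicitly assumes constant trade volumes) with a variance computed from a volume-weighted portfolio price series in which trade volumes fluctuate. The simulated data and the specific volume-weighting scheme are illustrative assumptions, a crude stand-in for the author's market-based variance rather than his derivation.

```python
"""Toy contrast between Markowitz portfolio variance and a volume-aware estimate.

Illustrative only: the trade data are simulated and the volume-weighted
construction is a simplified stand-in for the paper's market-based variance.
"""
import numpy as np

rng = np.random.default_rng(2)
n_periods, shares = 250, np.array([100.0, 50.0])    # fixed share holdings of two securities

# Simulated per-period trade prices and (random) trade volumes for each security.
prices = 100.0 * np.exp(np.cumsum(0.01 * rng.standard_normal((n_periods, 2)), axis=0))
volumes = rng.integers(50, 500, size=(n_periods, 2)).astype(float)

sec_returns = np.diff(np.log(prices), axis=0)

# (a) Markowitz: fixed value weights, i.e. the constant-trade-volume assumption.
w = shares * prices[0] / np.sum(shares * prices[0])
markowitz_var = float(w @ np.cov(sec_returns.T) @ w)

# (b) Volume-aware: portfolio priced off volume-weighted average trade prices.
vwap_portfolio = np.sum(prices * volumes * shares, axis=1) / np.sum(volumes * shares, axis=1)
market_based_var = float(np.var(np.diff(np.log(vwap_portfolio)), ddof=1))

print(f"Markowitz variance:    {markowitz_var:.6e}")
print(f"volume-aware variance: {market_based_var:.6e}")
```

With fluctuating volumes the two numbers generally differ, which is the qualitative point of the abstract; the direction and size of the difference depend on the simulated volume process.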

August 11, 2025 · 2 min · Research Team

AlphaEval: A Comprehensive and Efficient Evaluation Framework for Formula Alpha Mining

AlphaEval: A Comprehensive and Efficient Evaluation Framework for Formula Alpha Mining ArXiv ID: 2508.13174 “View on arXiv” Authors: Hongjun Ding, Binqi Chen, Jinsheng Huang, Taian Guo, Zhengyang Mao, Guoyi Shao, Lutong Zou, Luchen Liu, Ming Zhang Abstract Formula alpha mining, which generates predictive signals from financial data, is critical for quantitative investment. Although various algorithmic approaches, such as genetic programming, reinforcement learning, and large language models, have significantly expanded the capacity for alpha discovery, systematic evaluation remains a key challenge. Existing evaluation metrics predominantly include backtesting and correlation-based measures. Backtesting is computationally intensive, inherently sequential, and sensitive to specific strategy parameters. Correlation-based metrics, though efficient, assess only predictive ability and overlook other crucial properties such as temporal stability, robustness, diversity, and interpretability. Additionally, the closed-source nature of most existing alpha mining models hinders reproducibility and slows progress in this field. To address these issues, we propose AlphaEval, a unified, parallelizable, and backtest-free evaluation framework for automated alpha mining models. AlphaEval assesses the overall quality of generated alphas along five complementary dimensions: predictive power, stability, robustness to market perturbations, financial logic, and diversity. Extensive experiments across representative alpha mining algorithms demonstrate that AlphaEval achieves evaluation consistency comparable to comprehensive backtesting, while providing more comprehensive insights and higher efficiency. Furthermore, AlphaEval effectively identifies superior alphas compared to traditional single-metric screening approaches. All implementations and evaluation tools are open-sourced to promote reproducibility and community engagement. ...
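The excerpt names five evaluation dimensions but not their formulas; the snippet below sketches simple, commonly used proxies for four of them on synthetic data: predictive power via daily rank IC, temporal stability via rolling-IC dispersion, robustness via IC under perturbed returns, and diversity via correlation against a reference alpha. The financial-logic dimension is omitted, and all formulas here are assumptions rather than AlphaEval's actual metrics.

```python
"""Simple proxy metrics for four of AlphaEval's five dimensions (assumed formulas).

Uses synthetic data; the framework's actual metric definitions may differ, and
the 'financial logic' dimension is omitted here.
"""
import numpy as np
import pandas as pd

rng = np.random.default_rng(3)
n_days, n_stocks = 120, 200
dates = pd.date_range("2024-01-01", periods=n_days, freq="B")

fwd_ret = pd.DataFrame(rng.standard_normal((n_days, n_stocks)) * 0.02, index=dates)
alpha = 0.1 * fwd_ret + pd.DataFrame(rng.standard_normal((n_days, n_stocks)), index=dates)
reference_alpha = pd.DataFrame(rng.standard_normal((n_days, n_stocks)), index=dates)

def daily_rank_ic(signal: pd.DataFrame, target: pd.DataFrame) -> pd.Series:
    """Cross-sectional Spearman IC per day."""
    return signal.rank(axis=1).corrwith(target.rank(axis=1), axis=1)

ic = daily_rank_ic(alpha, fwd_ret)
predictive_power = ic.mean()                                   # (1) predictive power
stability = ic.rolling(20).mean().std()                        # (2) temporal stability (lower = steadier)

perturbed = fwd_ret + 0.005 * rng.standard_normal(fwd_ret.shape)
robustness = daily_rank_ic(alpha, perturbed).mean()            # (3) IC under perturbed returns

diversity = 1.0 - alpha.corrwith(reference_alpha, axis=1).abs().mean()  # (4) distance from an existing alpha

print(f"IC={predictive_power:.3f}  rolling-IC std={stability:.3f}  "
      f"perturbed IC={robustness:.3f}  diversity={diversity:.3f}")
```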

August 10, 2025 · 2 min · Research Team

Event-Aware Sentiment Factors from LLM-Augmented Financial Tweets: A Transparent Framework for Interpretable Quant Trading

Event-Aware Sentiment Factors from LLM-Augmented Financial Tweets: A Transparent Framework for Interpretable Quant Trading ArXiv ID: 2508.07408 “View on arXiv” Authors: Yueyi Wang, Qiyao Wei Abstract In this study, we wish to showcase the unique utility of large language models (LLMs) in financial semantic annotation and alpha signal discovery. Leveraging a corpus of company-related tweets, we use an LLM to automatically assign multi-label event categories to high-sentiment-intensity tweets. We align these labeled sentiment signals with forward returns over 1-to-7-day horizons to evaluate their statistical efficacy and market tradability. Our experiments reveal that certain event labels consistently yield negative alpha, with Sharpe ratios as low as -0.38 and information coefficients exceeding 0.05, all statistically significant at the 95% confidence level. This study establishes the feasibility of transforming unstructured social media text into structured, multi-label event variables. A key contribution of this work is its commitment to transparency and reproducibility; all code and methodologies are made publicly available. Our results provide compelling evidence that social media sentiment is a valuable, albeit noisy, signal in financial forecasting and underscore the potential of open-source frameworks to democratize algorithmic trading research. ...
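As a sketch of the alignment step described, mapping LLM-assigned event labels on high-sentiment tweets to forward returns and scoring each label's IC and Sharpe ratio, here is a toy pandas pipeline. The event taxonomy, the 5-day horizon, and the simulated labels and returns are placeholders; the paper's LLM labelling step is replaced by random assignment.

```python
"""Align per-tweet event labels with forward returns and score each label (toy data).

The label taxonomy, horizon, and statistics below are placeholders; the paper's
LLM labelling is replaced here by randomly assigned labels.
"""
import numpy as np
import pandas as pd

rng = np.random.default_rng(4)
labels = ["earnings", "litigation", "product_launch", "guidance_cut"]

# One row per high-sentiment tweet: LLM-assigned event label, sentiment score, forward return.
n = 5000
tweets = pd.DataFrame({
    "label": rng.choice(labels, size=n),
    "sentiment": rng.uniform(-1, 1, size=n),
    "fwd_ret_5d": rng.standard_normal(n) * 0.03,   # simulated 5-day forward return
})
# Inject a weak negative effect for one label so the scoring has something to find.
tweets.loc[tweets["label"] == "guidance_cut", "fwd_ret_5d"] -= 0.004

def score(group: pd.DataFrame) -> pd.Series:
    """Per-label IC (sentiment vs forward return) and annualised Sharpe of signed positions."""
    ic = group["sentiment"].corr(group["fwd_ret_5d"], method="spearman")
    pnl = np.sign(group["sentiment"]) * group["fwd_ret_5d"]
    sharpe = pnl.mean() / pnl.std() * np.sqrt(252 / 5)
    return pd.Series({"n": len(group), "ic": ic, "sharpe": sharpe})

print(tweets.groupby("label").apply(score).round(3))
```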

August 10, 2025 · 2 min · Research Team

Empirical Analysis of the Model-Free Valuation Approach: Hedging Gaps, Conservatism, and Trading Opportunities

Empirical Analysis of the Model-Free Valuation Approach: Hedging Gaps, Conservatism, and Trading Opportunities ArXiv ID: 2508.16595 “View on arXiv” Authors: Zixing Chen, Yihan Qi, Shanlan Que, Julian Sester, Xiao Zhang Abstract In this paper we study the quality of model-free valuation approaches for financial derivatives by systematically evaluating the difference between model-free super-hedging strategies and the realized payoff of financial derivatives, using historical option prices from several constituents of the S&P 500 between 2018 and 2022. Our study allows us, in particular, to describe empirically the realized gap between payoff and model-free hedging strategy, so that we can quantify to which degree model-free approaches are overly conservative. Our results imply that the model-free hedging approach is only marginally more conservative than industry-standard models such as the Heston model, while being model-free at the same time. This finding, its statistical description, and the model-independence of the hedging approach enable us to construct an explicit trading strategy which, as we demonstrate, can be profitably applied in financial markets, and which additionally possesses the desirable feature of explicit downside-risk control, since its model-free construction prevents losses pathwise. ...
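The super-hedging construction itself is involved and not reproduced here; the snippet below only illustrates the empirical read-off described in the abstract, tabulating the realized gap between a hedge's terminal value and the derivative's payoff for a model-free bound versus a model-based hedge. All inputs are simulated stand-ins, not the paper's S&P 500 option data.

```python
"""Empirical 'hedging gap' summary: hedge terminal value minus realized payoff.

All numbers are simulated stand-ins; the paper builds super-hedging strategies
from historical S&P 500 constituent option prices, which is not reproduced here.
"""
import numpy as np

rng = np.random.default_rng(5)
n_trades = 2000

payoff = np.maximum(rng.standard_normal(n_trades) * 8.0 + 2.0, 0.0)          # realized option payoffs (toy)
superhedge_value = payoff + np.abs(rng.standard_normal(n_trades)) * 1.5      # pathwise >= payoff by construction
model_hedge_value = payoff + np.abs(rng.standard_normal(n_trades)) * 1.3     # e.g. a Heston-style hedge (toy)

gap_model_free = superhedge_value - payoff
gap_model = model_hedge_value - payoff

for name, gap in [("model-free", gap_model_free), ("model-based", gap_model)]:
    print(f"{name:11s} gap: mean={gap.mean():.3f}  median={np.median(gap):.3f}  "
          f"95th pct={np.quantile(gap, 0.95):.3f}")

# 'Conservatism' read-off: the extra cushion the model-free bound carries on average.
print("average extra conservatism of the model-free hedge:",
      round(float((gap_model_free - gap_model).mean()), 3))
```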

August 9, 2025 · 2 min · Research Team

Sizing the Risk: Kelly, VIX, and Hybrid Approaches in Put-Writing on Index Options

Sizing the Risk: Kelly, VIX, and Hybrid Approaches in Put-Writing on Index Options ArXiv ID: 2508.16598 “View on arXiv” Authors: Maciej Wysocki Abstract This paper examines systematic put-writing strategies applied to S&P 500 Index options, with a focus on position sizing as a key determinant of long-term performance. Despite the well-documented volatility risk premium, where implied volatility exceeds realized volatility, the practical implementation of short-dated volatility-selling strategies remains underdeveloped in the literature. This study evaluates three position sizing approaches: the Kelly criterion, VIX-based volatility regime scaling, and a novel hybrid method combining both. Using SPXW options with expirations from 0 to 5 days, the analysis explores a broad design space, including moneyness levels, volatility estimators, and memory horizons. Results show that ultra-short-dated, far out-of-the-money options deliver superior risk-adjusted returns. The hybrid sizing method consistently balances return generation with robust drawdown control, particularly under low-volatility conditions such as those seen in 2024. The study offers new insights into volatility harvesting, introducing a dynamic sizing framework that adapts to shifting market regimes. It also contributes practical guidance for constructing short-dated option strategies that are robust across market environments. These findings have direct applications for institutional investors seeking to enhance portfolio efficiency through systematic exposure to volatility premia. ...
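The paper's exact sizing formulas are not in the excerpt; the sketch below shows one common way to combine a Kelly-style fraction estimated from trailing trade P&L with a VIX-regime scaler, plus a hybrid that takes the more conservative of the two. The continuous-Kelly approximation, the regime breakpoints, the cap, and the "take the smaller size" hybrid rule are all assumptions, not the paper's calibrated rules.

```python
"""Position sizing for a put-writing strategy: Kelly, VIX-regime, and a hybrid rule.

Illustrative assumptions throughout: continuous-Kelly approximation from trailing
trade P&L, arbitrary VIX breakpoints, and a conservative 'min of both' hybrid.
"""
import numpy as np

rng = np.random.default_rng(6)

# Trailing per-unit-notional P&L of recent short-put trades (simulated: small wins, rare large losses).
pnl_history = np.where(rng.uniform(size=250) < 0.95,
                       rng.uniform(0.002, 0.005, size=250),
                       -rng.uniform(0.02, 0.06, size=250))

def kelly_fraction(pnl: np.ndarray, cap: float = 0.5) -> float:
    """Continuous-Kelly approximation f* ~ mean/variance of trade P&L, capped for safety."""
    f = pnl.mean() / pnl.var()
    return float(np.clip(f, 0.0, cap))

def vix_scale(vix: float) -> float:
    """Regime scaler: size down as implied volatility rises (breakpoints assumed)."""
    if vix < 15.0:
        return 1.0
    if vix < 25.0:
        return 0.5
    return 0.25

def hybrid_size(pnl: np.ndarray, vix: float, base_fraction: float = 0.25) -> float:
    """Use the smaller of the Kelly size and the VIX-scaled size (conservative combination)."""
    return min(kelly_fraction(pnl), base_fraction * vix_scale(vix))

for vix in (12.0, 20.0, 32.0):
    print(f"VIX={vix:5.1f}  kelly={kelly_fraction(pnl_history):.3f}  "
          f"vix-scaled={0.25 * vix_scale(vix):.3f}  hybrid={hybrid_size(pnl_history, vix):.3f}")
```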

August 9, 2025 · 2 min · Research Team