
Multiscale Causal Analysis of Market Efficiency via News Uncertainty Networks and the Financial Chaos Index

Multiscale Causal Analysis of Market Efficiency via News Uncertainty Networks and the Financial Chaos Index ArXiv ID: 2505.01543 “View on arXiv” Authors: Masoud Ataei Abstract This study evaluates the scale-dependent informational efficiency of stock markets using the Financial Chaos Index, a tensor-eigenvalue-based measure of realized volatility. Incorporating Granger causality and network-theoretic analysis across a range of economic, policy, and news-based uncertainty indices, we assess whether public information is efficiently incorporated into asset price fluctuations. Based on a 34-year time period from 1990 to 2023, at the daily frequency, the semi-strong form of the Efficient Market Hypothesis is rejected at the 1% level of significance, indicating that asset price changes respond predictably to lagged news-based uncertainty. In contrast, at the monthly frequency, such predictive structure largely vanishes, supporting informational efficiency at coarser temporal resolutions. A structural analysis of the Granger causality network reveals that fiscal and monetary policy uncertainties act as core initiators of systemic volatility, while peripheral indices, such as those related to healthcare and consumer prices, serve as latent bridges that become activated under crisis conditions. These findings underscore the role of time-scale decomposition and structural asymmetries in diagnosing market inefficiencies and mapping the propagation of macro-financial uncertainty. ...
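The pairwise tests behind such a causality network can be sketched in a few lines. Below is a minimal bivariate Granger F-test in plain NumPy; the lag order, the synthetic data, and the lack of small-sample corrections are all simplifications and not the paper's actual setup:

```python
import numpy as np

def granger_f_test(y, x, p=2):
    """One-directional Granger test: do lags of x help predict y?

    Compares a restricted AR(p) model of y against an unrestricted model
    that adds p lags of x, via the standard F statistic.
    """
    T = len(y)
    rows = T - p
    Y = y[p:]
    # Design matrices: intercept plus lags 1..p of y (restricted), then of x too.
    Xr = np.column_stack([np.ones(rows)] + [y[p - j - 1:T - j - 1] for j in range(p)])
    Xu = np.column_stack([Xr] + [x[p - j - 1:T - j - 1] for j in range(p)])
    rss = lambda A: np.sum((Y - A @ np.linalg.lstsq(A, Y, rcond=None)[0]) ** 2)
    rss_r, rss_u = rss(Xr), rss(Xu)
    df_num, df_den = p, rows - Xu.shape[1]
    return ((rss_r - rss_u) / df_num) / (rss_u / df_den)

rng = np.random.default_rng(0)
x = rng.standard_normal(600)
y = np.zeros(600)
for t in range(1, 600):            # y responds to lagged x, so x Granger-causes y
    y[t] = 0.3 * y[t - 1] + 0.8 * x[t - 1] + 0.1 * rng.standard_normal()

F_xy = granger_f_test(y, x)        # large: lagged x predicts y
F_yx = granger_f_test(x, y)        # small: lagged y does not predict x
```

Running such tests over every ordered pair of uncertainty indices, and keeping the significant edges, yields the directed network the paper then analyzes structurally.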

May 2, 2025 · 2 min · Research Team

Towards modelling lifetime default risk: Exploring different subtypes of recurrent event Cox-regression models

Towards modelling lifetime default risk: Exploring different subtypes of recurrent event Cox-regression models ArXiv ID: 2505.01044 “View on arXiv” Authors: Arno Botha, Tanja Verster, Bernard Scheepers Abstract In the pursuit of modelling a loan’s probability of default (PD) over its lifetime, repeat default events are often ignored when using Cox Proportional Hazard (PH) models. Excluding such events may produce biased and inaccurate PD-estimates, which can compromise financial buffers against future losses. Accordingly, we investigate a few subtypes of Cox-models that can incorporate recurrent default events. Using South African mortgage data, we explore both the Andersen-Gill (AG) and the Prentice-Williams-Peterson (PWP) spell-time models. These models are compared against a baseline that deliberately ignores recurrent events, called the time to first default (TFD) model. Models are evaluated using Harrell’s c-statistic, adjusted Cox-Snell residuals, and a novel extension of time-dependent receiver operating characteristic (ROC) analysis. From these Cox-models, we demonstrate how to derive a portfolio-level term-structure of default risk, which is a series of marginal PD-estimates at each point of the average loan’s lifetime. While the TFD- and PWP-models do not differ significantly across all diagnostics, the AG-model underperformed expectations. Depending on the prevalence of recurrent defaults, one may therefore safely ignore them when estimating lifetime default risk. Accordingly, our work enhances the current practice of using Cox-modelling in producing timeous and accurate PD-estimates under IFRS 9. ...
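Both the AG and PWP models consume survival data in counting-process form, where each loan's history is split at every default into (start, stop, event) risk intervals. A minimal sketch of that layout (field names are illustrative, not taken from the paper):

```python
# Build counting-process rows for one loan with recurrent defaults, in the
# layout used by Andersen-Gill (AG) and Prentice-Williams-Peterson (PWP)
# Cox models. AG pools all intervals; PWP additionally stratifies by the
# event number ("spell"), which is why each row records it.

def to_counting_process(default_times, horizon):
    """Split one loan's history at each default into risk intervals."""
    rows, start = [], 0.0
    for k, t in enumerate(default_times, start=1):
        rows.append({"start": start, "stop": t, "event": 1, "spell": k})
        start = t
    if start < horizon:                      # censored final interval
        rows.append({"start": start, "stop": horizon, "event": 0,
                     "spell": len(default_times) + 1})
    return rows

# A loan defaulting at months 12 and 30, observed for 48 months:
rows = to_counting_process([12.0, 30.0], horizon=48.0)
# Three intervals: two ending in default, one censored at the horizon.
```

The TFD baseline simply truncates this table after the first event row, which is exactly the information loss the paper quantifies.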

May 2, 2025 · 2 min · Research Team

A new architecture of high-order deep neural networks that learn martingales

A new architecture of high-order deep neural networks that learn martingales ArXiv ID: 2505.03789 “View on arXiv” Authors: Syoiti Ninomiya, Yuming Ma Abstract A new deep-learning neural network architecture based on high-order weak approximation algorithms for stochastic differential equations (SDEs) is proposed. The architecture enables the efficient learning of martingales by deep learning models. The behaviour of deep neural networks based on this architecture, when applied to the problem of pricing financial derivatives, is also examined. The core of this new architecture lies in the high-order weak approximation algorithms of the explicit Runge–Kutta type, wherein the approximation is realised solely through iterative compositions and linear combinations of vector fields of the target SDEs. ...
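The martingale property the architecture is designed to preserve can be checked numerically even for the simplest scheme. As a rough illustration (plain Euler-Maruyama, not the paper's high-order Runge-Kutta construction), simulate the driftless SDE dX = X dW and confirm that the terminal mean stays at the initial value:

```python
import numpy as np

# Weak approximation of dX = X dW, X_0 = 1. Any scheme that respects the
# martingale property should keep the Monte Carlo mean of X_T near 1.
rng = np.random.default_rng(42)
paths, steps, T, x0 = 50_000, 10, 1.0, 1.0
dt = T / steps

X = np.full(paths, x0)
for _ in range(steps):
    # Euler step: X_{k+1} = X_k (1 + dW), with dW ~ N(0, dt)
    X *= 1.0 + np.sqrt(dt) * rng.standard_normal(paths)

mean_XT = X.mean()   # close to 1.0 up to Monte Carlo error
```

Higher-order schemes of the Runge-Kutta type described in the abstract keep this property while also improving the weak error in expectations of smooth payoffs, which is what matters for derivative pricing.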

May 1, 2025 · 2 min · Research Team

Numerical analysis on locally risk-minimizing strategies for Barndorff-Nielsen and Shephard models

Numerical analysis on locally risk-minimizing strategies for Barndorff-Nielsen and Shephard models ArXiv ID: 2505.00255 “View on arXiv” Authors: Takuji Arai Abstract We develop a numerical method for locally risk-minimizing (LRM) strategies for Barndorff-Nielsen and Shephard (BNS) models. Arai et al. (2017) derived a mathematical expression for LRM strategies in BNS models using Malliavin calculus for Lévy processes and presented some numerical results only for the case where the asset price process is a martingale. Subsequently, Arai and Imai (2024) developed the first Monte Carlo (MC) method available for non-martingale BNS models with infinite active jumps. Here, we modify the expression obtained by Arai et al. (2017) into a numerically tractable form, and, using the MC method developed by Arai and Imai (2024), propose a numerical method for LRM strategies that is available for non-martingale BNS models with infinite active jumps. In the final part of this paper, we conduct some numerical experiments. ...

May 1, 2025 · 2 min · Research Team

Approximation and regularity results for the Heston model and related processes

Approximation and regularity results for the Heston model and related processes ArXiv ID: 2504.21658 “View on arXiv” Authors: Edoardo Lombardo Abstract This Ph.D. thesis explores approximations and regularity for the Heston stochastic volatility model through three interconnected works. The first work focuses on developing high-order weak approximations for the Cox-Ingersoll-Ross (CIR) process, essential for financial modelling but challenging due to the square root diffusion term preventing standard methods. By employing the random grid technique (Alfonsi & Bally, 2021) built upon Alfonsi’s (2010) second-order scheme, the work proves that weak approximations of any order can be achieved for smooth test functions. This holds under a condition that is less restrictive than the well-known Feller condition. Numerical results confirm convergence for both CIR and Heston models and show significant computational time improvements. The second work extends the random grid technique to the log-Heston process. Two second-order schemes are introduced (one using exact volatility simulation, the other using Ninomiya-Victoir splitting under the same restriction used above). Convergence to any desired order is rigorously proven. Numerical experiments validate the schemes’ effectiveness for pricing European and Asian options and suggest potential applicability to multifactor/rough Heston models. The third work investigates the partial differential equation (PDE) associated with the log-Heston model. It extends classical solution results and establishes the existence and uniqueness of viscosity solutions without relying on the Feller condition. Uniqueness is proven even for certain discontinuous initial data, relevant for pricing instruments like digital options. Furthermore, the convergence of a hybrid numerical scheme to the viscosity solution is shown under relaxed regularity (continuity) for the initial data. An appendix includes supplementary results for the CIR process. ...
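For readers unfamiliar with why the CIR square-root term is awkward: a naive Euler step can push the variance negative, making the next square root undefined. A standard workaround is the full-truncation Euler scheme, sketched below; note this is only a first-order stand-in, not the Alfonsi second-order scheme the thesis builds on:

```python
import numpy as np

def cir_full_truncation(v0, kappa, theta, sigma, T, steps, paths, seed=0):
    """Full-truncation Euler for dV = kappa(theta - V)dt + sigma sqrt(V) dW.

    The positive part of V is taken before both the drift and the diffusion,
    so sqrt() is always well defined even when an Euler step dips negative.
    """
    rng = np.random.default_rng(seed)
    dt = T / steps
    v = np.full(paths, v0)
    for _ in range(steps):
        vp = np.maximum(v, 0.0)              # truncate before using v
        v = v + kappa * (theta - vp) * dt \
              + sigma * np.sqrt(vp * dt) * rng.standard_normal(paths)
    return np.maximum(v, 0.0)

# Started at its long-run mean theta, the simulated mean should stay near it.
vT = cir_full_truncation(v0=0.04, kappa=1.5, theta=0.04, sigma=0.5,
                         T=1.0, steps=200, paths=20_000)
```

The thesis's random grid technique attacks the same process from the other direction: instead of patching a first-order scheme, it composes second-order steps to reach arbitrarily high weak order.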

April 30, 2025 · 3 min · Research Team

ClusterLOB: Enhancing Trading Strategies by Clustering Orders in Limit Order Books

ClusterLOB: Enhancing Trading Strategies by Clustering Orders in Limit Order Books ArXiv ID: 2504.20349 “View on arXiv” Authors: Yichi Zhang, Mihai Cucuringu, Alexander Y. Shestopaloff, Stefan Zohren Abstract In the rapidly evolving world of financial markets, understanding the dynamics of the limit order book (LOB) is crucial for unraveling market microstructure and participant behavior. We introduce ClusterLOB as a method to cluster individual market events in a stream of market-by-order (MBO) data into different groups. To do so, each market event is augmented with six time-dependent features. By applying the K-means++ clustering algorithm to the resulting order features, we are then able to assign each new order to one of three distinct clusters, which we identify as directional, opportunistic, and market-making participants, each capturing unique trading behaviors. Our experiments are performed on one year of MBO data containing small-tick, medium-tick, and large-tick stocks from NASDAQ. To validate the usefulness of our clustering, we compute order flow imbalances across each cluster within 30-minute buckets during the trading day. We treat each cluster’s imbalance as a signal that provides insights into trading strategies and participants’ responses to varying market conditions. To assess the effectiveness of these signals, we identify the trading strategy with the highest Sharpe ratio in the training dataset, and demonstrate that its performance in the test dataset is superior to benchmark trading strategies that do not incorporate clustering. We also evaluate trading strategies based on order flow imbalance decompositions across different market event types, including add, cancel, and trade events, to assess their robustness in various market conditions.
This work establishes a robust framework for clustering market participant behavior, which helps us to better understand market microstructure, and inform the development of more effective predictive trading signals with practical applications in algorithmic trading and quantitative finance. ...
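The signal-construction step described above reduces to a simple aggregation: signed event volume per (cluster, bucket) cell. A toy sketch with synthetic events and random cluster labels standing in for the paper's six-feature K-means++ step:

```python
import numpy as np

# Order flow imbalance (OFI) per cluster and 30-minute bucket. Synthetic
# events; the feature engineering and K-means++ assignment from the paper
# are replaced by random labels purely for illustration.
rng = np.random.default_rng(1)
n = 10_000
sign = rng.choice([-1, 1], size=n)          # +1 buy-side, -1 sell-side event
size = rng.integers(1, 100, size=n).astype(float)
bucket = rng.integers(0, 13, size=n)        # 13 half-hour buckets per session
cluster = rng.integers(0, 3, size=n)        # directional / opportunistic / MM

ofi = np.zeros((3, 13))
for c in range(3):
    for b in range(13):
        m = (cluster == c) & (bucket == b)
        ofi[c, b] = np.sum(sign[m] * size[m]) / max(np.sum(size[m]), 1.0)
# ofi[c, b] in [-1, 1]: normalized signed volume of cluster c in bucket b.
```

Each row of `ofi` is then a candidate predictive signal, and the paper's backtests compare strategies built from these per-cluster rows against ones built from the undecomposed aggregate imbalance.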

April 29, 2025 · 3 min · Research Team

Scaling and shape of financial returns distributions modeled as conditionally independent random variables

Scaling and shape of financial returns distributions modeled as conditionally independent random variables ArXiv ID: 2504.20488 “View on arXiv” Authors: Hernán Larralde, Roberto Mota Navarro Abstract We show that if returns are assumed to be independent when conditioned on the value of their variance (volatility), which itself varies randomly in time, then the distribution of returns is well described by the statistics of a sum of conditionally independent random variables. In particular, we show that the distribution of returns can be cast in a simple scaling form, and that its functional form is directly related to the distribution of the volatilities. This approach explains the presence of power-law tails in the returns as a direct consequence of the presence of a power-law tail in the distribution of volatilities. It also provides the form of the distribution of Bitcoin returns, which behaves as a stretched exponential, as a consequence of the fact that the distribution of Bitcoin volatilities is also closely described by a stretched exponential. We test our predictions with data from the S&P 500 index, Apple and Paramount stocks, and Bitcoin. ...
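The mechanism is easy to see in simulation: draw r = σ·z with z Gaussian, and the tails of r inherit the tails of σ. A quick sketch with an illustrative Pareto volatility distribution (parameters are arbitrary, not fitted to any of the paper's data):

```python
import numpy as np

# Returns as a volatility mixture of Gaussians: r = sigma * z.
# A power-law-tailed sigma produces far heavier return tails than a
# constant sigma of the same average scale.
rng = np.random.default_rng(7)
n = 200_000
sigma_pareto = rng.pareto(3.0, n) + 1.0     # heavy-tailed volatilities
sigma_const = np.full(n, sigma_pareto.mean())
z = rng.standard_normal(n)

r_mixture = sigma_pareto * z                # conditionally Gaussian mixture
r_gauss = sigma_const * z                   # same mean scale, fixed volatility

def tail(r):
    """Fraction of returns beyond 5 standard deviations."""
    return np.mean(np.abs(r) > 5 * r.std())
```

For the pure Gaussian, events beyond five standard deviations are essentially absent at this sample size, while the mixture produces them routinely, which is the qualitative signature the paper formalizes through its scaling form.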

April 29, 2025 · 2 min · Research Team

A high-order recombination algorithm for weak approximation of stochastic differential equations

A high-order recombination algorithm for weak approximation of stochastic differential equations ArXiv ID: 2504.19717 “View on arXiv” Authors: Syoiti Ninomiya, Yuji Shinozaki Abstract This paper presents an algorithm for applying the high-order recombination method, originally introduced by Lyons and Litterer in “High-order recombination and an application to cubature on Wiener space” (Ann. Appl. Probab. 22(4):1301–1327, 2012), to practical problems in mathematical finance. A refined error analysis is provided, yielding a sharper condition for space partitioning. Based on this condition, a computationally feasible recursive partitioning algorithm is developed. Numerical examples are also included, demonstrating that the proposed algorithm effectively avoids the explosive growth in the cardinality of the support required to achieve high-order approximations. ...
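The core idea of recombination is replacing a large weighted point set by a much smaller one with the same low-order moments. A bare-bones sketch: reduce 50 points to 3 while matching moments 0, 1 and 2 exactly via a Vandermonde solve. (The actual algorithm selects support points and partitions adaptively, and guarantees nonnegative weights; the fixed support here does neither and is purely illustrative.)

```python
import numpy as np

# Reduce a 50-point uniform measure to a 3-point measure with identical
# first three moments: solve sum_i w_i * s_i**k = m_k for k = 0, 1, 2.
rng = np.random.default_rng(3)
x = rng.standard_normal(50)
w = np.full(50, 1 / 50)
moments = np.array([np.sum(w * x**k) for k in range(3)])   # m0, m1, m2

support = np.array([x.min(), np.median(x), x.max()])       # 3 retained points
V = np.vander(support, 3, increasing=True).T               # row k = support**k
w_new = np.linalg.solve(V, moments)                        # new weights

# (support, w_new) now integrates 1, x, x**2 exactly as the original measure.
```

Applied recursively over a cubature tree, this kind of reduction is what keeps the support size from exploding as the weak-approximation order grows, which is the cost the paper's partitioning condition controls.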

April 28, 2025 · 2 min · Research Team

Compounding Effects in Leveraged ETFs: Beyond the Volatility Drag Paradigm

Compounding Effects in Leveraged ETFs: Beyond the Volatility Drag Paradigm ArXiv ID: 2504.20116 “View on arXiv” Authors: Chung-Han Hsieh, Jow-Ran Chang, Hui Hsiang Chen Abstract A common belief is that leveraged ETFs (LETFs) suffer long-term performance decay due to “volatility drag”. We show that this view is incomplete: LETF performance depends fundamentally on return autocorrelation and return dynamics. In markets with independent returns, LETFs exhibit positive expected compounding effects on their target multiples. In serially correlated markets, trends enhance returns, while mean reversion induces underperformance. With a unified framework incorporating AR(1) and AR-GARCH models, continuous-time regime switching, and flexible rebalancing frequencies, we demonstrate that return dynamics – including return autocorrelation, volatility clustering, and regime persistence – determine whether LETFs outperform or underperform their targets. Empirically, using about 20 years of SPDR S&P 500 ETF and Nasdaq-100 ETF data, we confirm these theoretical predictions. Daily-rebalanced LETFs enhance returns in momentum-driven markets, whereas infrequent rebalancing mitigates losses in mean-reverting regimes. ...
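A two-period toy example makes the path dependence concrete: with daily rebalancing, a trend amplifies the leveraged compounding while a reversal drags it below the target multiple. The numbers below are illustrative, not drawn from the paper:

```python
# Two-period illustration of path-dependent LETF compounding.
# Leverage L = 2, rebalanced each period.
L = 2.0
r = 0.05

# Trend: +5% then +5%.
trend_index = (1 + r) ** 2 - 1                 # index: +10.25%
trend_letf = (1 + L * r) ** 2 - 1              # LETF:  +21.00% > 2 x 10.25%

# Reversal: +5% then -5%.
rev_index = (1 + r) * (1 - r) - 1              # index: -0.25%
rev_letf = (1 + L * r) * (1 - L * r) - 1       # LETF:  -1.00% < 2 x -0.25%
```

Under the trend the LETF beats twice the index return (positive compounding); under the reversal it loses more than twice the index loss (drag). The paper's AR(1) and regime-switching analysis generalizes exactly this dichotomy to persistent versus mean-reverting return dynamics.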

April 28, 2025 · 2 min · Research Team

Deep Declarative Risk Budgeting Portfolios

Deep Declarative Risk Budgeting Portfolios ArXiv ID: 2504.19980 “View on arXiv” Authors: Manuel Parra-Diaz, Carlos Castro-Iragorri Abstract Recent advances in deep learning have spurred the development of end-to-end frameworks for portfolio optimization that utilize implicit layers. However, many such implementations are highly sensitive to neural network initialization, undermining performance consistency. This research introduces a robust end-to-end framework tailored for risk budgeting portfolios that effectively reduces sensitivity to initialization. Importantly, this enhanced stability does not compromise portfolio performance, as our framework consistently outperforms the risk parity benchmark. ...
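For context, the classical (non-deep) risk budgeting problem the implicit layer replaces can be solved directly: find weights whose risk contributions w_i(Σw)_i are proportional to prescribed budgets b. A minimal sketch via the standard log-barrier formulation, with an illustrative covariance matrix and budgets (not the paper's data):

```python
import numpy as np
from scipy.optimize import minimize

# Risk budgeting: minimize 0.5 w'Sigma w - sum_i b_i log w_i. Substituting
# w = exp(x) keeps weights positive and makes the problem unconstrained;
# at the optimum the risk contributions equal the budgets.
Sigma = np.array([[0.04, 0.01, 0.00],
                  [0.01, 0.09, 0.02],
                  [0.00, 0.02, 0.16]])
b = np.array([0.5, 0.3, 0.2])                  # target risk-budget shares

def objective(x):
    w = np.exp(x)
    return 0.5 * w @ Sigma @ w - b @ x         # -b'x is -sum b_i log w_i

res = minimize(objective, np.zeros(3), method="BFGS")
w = np.exp(res.x)
w /= w.sum()                                   # normalized portfolio weights

rc = w * (Sigma @ w)
rc /= rc.sum()                                 # realized risk-contribution shares
# rc matches b up to solver tolerance; b = (1/3, 1/3, 1/3) gives risk parity.
```

The end-to-end frameworks discussed in the abstract embed a solve like this as a differentiable layer inside a network, which is where the reported sensitivity to initialization arises.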

April 28, 2025 · 1 min · Research Team