
End-to-End Policy Learning of a Statistical Arbitrage Autoencoder Architecture

End-to-End Policy Learning of a Statistical Arbitrage Autoencoder Architecture ArXiv ID: 2402.08233 “View on arXiv” Authors: Unknown Abstract In Statistical Arbitrage (StatArb), classical mean reversion trading strategies typically hinge on asset-pricing or PCA-based models to identify the mean of a synthetic asset. Once such a (linear) model is identified, a separate mean reversion strategy is then devised to generate a trading signal. With a view to generalising such an approach and turning it truly data-driven, we study the utility of Autoencoder architectures in StatArb. As a first approach, we employ a standard Autoencoder trained on US stock returns to derive trading strategies based on the Ornstein-Uhlenbeck (OU) process. To further enhance this model, we take a policy-learning approach and embed the Autoencoder network into a neural network representation of a space of portfolio trading policies. This integration outputs portfolio allocations directly and is end-to-end trainable by backpropagation of the risk-adjusted returns of the neural policy. Our findings demonstrate that this innovative end-to-end policy learning approach not only simplifies the strategy development process, but also yields superior gross returns over its competitors, illustrating the potential of end-to-end training over classical two-stage approaches. ...

February 13, 2024 · 2 min · Research Team
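
The end-to-end idea above can be illustrated with a short sketch: an autoencoder over cross-sectional returns whose reconstruction residuals feed a policy head that outputs portfolio weights, trained by backpropagating a Sharpe-style risk-adjusted return. This is a minimal PyTorch sketch of the general approach, not the paper's architecture; the layer sizes, the tanh policy head and the leverage normalisation are our own illustrative choices.

```python
# Hypothetical sketch (not the paper's exact architecture): an autoencoder over
# cross-sectional returns whose reconstruction residuals feed a policy head that
# outputs portfolio weights; the whole pipeline is trained end-to-end by
# maximising a Sharpe-like risk-adjusted return.
import torch
import torch.nn as nn

class AutoencoderPolicy(nn.Module):
    def __init__(self, n_assets: int, latent_dim: int = 8):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(n_assets, 64), nn.ReLU(),
                                     nn.Linear(64, latent_dim))
        self.decoder = nn.Sequential(nn.Linear(latent_dim, 64), nn.ReLU(),
                                     nn.Linear(64, n_assets))
        # Policy head maps reconstruction residuals to (long/short) weights.
        self.policy = nn.Linear(n_assets, n_assets)

    def forward(self, returns: torch.Tensor) -> torch.Tensor:
        recon = self.decoder(self.encoder(returns))
        residual = returns - recon               # "mispricing" signal
        weights = torch.tanh(self.policy(residual))
        # Normalise to unit gross leverage per period.
        return weights / (weights.abs().sum(dim=-1, keepdim=True) + 1e-8)

def negative_sharpe(weights: torch.Tensor, next_returns: torch.Tensor) -> torch.Tensor:
    pnl = (weights * next_returns).sum(dim=-1)   # per-period portfolio return
    return -(pnl.mean() / (pnl.std() + 1e-8))    # maximise Sharpe = minimise its negative

# Toy training loop on synthetic data (T periods, N assets).
T, N = 512, 50
rets = torch.randn(T, N) * 0.01
model = AutoencoderPolicy(N)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for _ in range(200):
    w = model(rets[:-1])                 # weights decided at t ...
    loss = negative_sharpe(w, rets[1:])  # ... evaluated on returns at t+1
    opt.zero_grad()
    loss.backward()
    opt.step()
```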

Portfolio Optimization under Transaction Costs with Recursive Preferences

Portfolio Optimization under Transaction Costs with Recursive Preferences ArXiv ID: 2402.08387 “View on arXiv” Authors: Unknown Abstract The Merton investment-consumption problem is fundamental, both in the field of finance, and in stochastic control. An important extension of the problem adds transaction costs, which is highly relevant from a financial perspective but also challenging from a control perspective because the solution now involves singular control. A further significant extension takes us from additive utility to stochastic differential utility (SDU), which allows time preferences and risk preferences to be disentangled. In this paper, we study this extended version of the Merton problem with proportional transaction costs and Epstein-Zin SDU. We fully characterise all parameter combinations for which the problem is well posed (which may depend on the level of transaction costs) and provide a full verification argument that relies on no additional technical assumptions and uses primal methods only. The case with SDU requires new mathematical techniques as duality methods break down. Even in the special case of (additive) power utility, our arguments are significantly simpler, more elegant and more far-reaching than the ones in the extant literature. This means that we can easily analyse aspects of the problem which previously have been very challenging, including comparative statics, boundary cases which heretofore have required separate treatment and the situation beyond the small transaction cost regime. A key and novel idea is to parametrise consumption and the value function in terms of the shadow fraction of wealth, which may be of much wider applicability. ...

February 13, 2024 · 2 min · Research Team
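
For readers unfamiliar with the setup, the two standard ingredients are Epstein-Zin stochastic differential utility (via a Duffie-Epstein aggregator) and wealth dynamics in which proportional transaction costs make trading a singular control. The display below is a hedged sketch in one common normalisation; the notation ($\gamma$, $\psi$, $\delta$, $\lambda$, $\mu$, $L$, $M$) is ours and need not match the paper's.

```latex
% A hedged sketch of the standard ingredients (notation ours, not necessarily the paper's):
% Epstein--Zin stochastic differential utility via a normalised Duffie--Epstein aggregator,
% and wealth dynamics under proportional transaction costs with singular controls L, M.
\[
V_t = \mathbb{E}_t\!\left[\int_t^\infty f(C_s, V_s)\,ds\right],
\qquad
f(c,v) = \frac{\delta}{1-\tfrac{1}{\psi}}\,(1-\gamma)\,v
\left[\left(\frac{c}{\big((1-\gamma)v\big)^{\frac{1}{1-\gamma}}}\right)^{1-\frac{1}{\psi}} - 1\right],
\]
% where $\gamma$ is relative risk aversion, $\psi$ the elasticity of intertemporal
% substitution and $\delta$ the discount rate ($\psi = 1/\gamma$ recovers additive power utility).
% With proportional costs $\lambda$ on purchases and $\mu$ on sales, cash $X$ and stock $Y$ evolve as
\[
dX_t = \big(r X_t - C_t\big)\,dt - (1+\lambda)\,dL_t + (1-\mu)\,dM_t,
\qquad
dY_t = Y_t\big(\alpha\,dt + \sigma\,dW_t\big) + dL_t - dM_t,
\]
% where the nondecreasing processes $L$ and $M$ (cumulative purchases and sales) are the singular controls.
```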

The Euler Scheme for Fractional Stochastic Delay Differential Equations with Additive Noise

The Euler Scheme for Fractional Stochastic Delay Differential Equations with Additive Noise ArXiv ID: 2402.08513 “View on arXiv” Authors: Unknown Abstract In this paper we consider the Euler-Maruyama scheme for a class of stochastic delay differential equations driven by a fractional Brownian motion with index $H\in(0,1)$. We establish the consistency of the scheme and study the rate of convergence of the normalized error process. This is done by checking that the generic rate of convergence of the error process with stepsize $\Delta_n$ is $\Delta_n^{\min\{H+\frac{1}{2},\,3H,\,1\}}$. It turns out that such a rate is suboptimal when the delay is smooth and $H>1/2$. In this context, and in contrast to the non-delayed framework, we show that a convergence of order $H+1/2$ is achievable. ...

February 13, 2024 · 2 min · Research Team
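
For context, the scheme in question takes the following form; the drift specification $b(X_t, X_{t-\tau})$ and the notation are our assumptions rather than the paper's exact setup, and with additive noise the fractional Brownian increment enters the recursion unscaled.

```latex
% A hedged sketch of an Euler-Maruyama scheme for an SDDE with additive fractional noise:
% dX_t = b(X_t, X_{t-\tau})\,dt + dB^H_t, with initial segment X_t = \eta(t) on [-\tau, 0].
\[
X^n_{t_{k+1}} = X^n_{t_k} + b\big(X^n_{t_k},\, X^n_{t_k-\tau}\big)\,\Delta_n
+ \big(B^H_{t_{k+1}} - B^H_{t_k}\big),
\qquad t_k = k\Delta_n,\quad \tau \in \Delta_n\mathbb{N},
\]
% and, per the abstract, the error process normalised by \Delta_n^{\min\{H+\frac{1}{2},\,3H,\,1\}}
% admits a limit, with the improved rate H + 1/2 attainable when the delay is smooth and H > 1/2.
```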

Analyzing Currency Fluctuations: A Comparative Study of GARCH, EWMA, and IV Models for GBP/USD and EUR/GBP Pairs

Analyzing Currency Fluctuations: A Comparative Study of GARCH, EWMA, and IV Models for GBP/USD and EUR/GBP Pairs ArXiv ID: 2402.07435 “View on arXiv” Authors: Unknown Abstract In this study, we examine the fluctuation in the value of the Great Britain Pound (GBP). We focus particularly on its relationship with the United States Dollar (USD) and the Euro (EUR) currency pairs. Utilizing data from June 15, 2018, to June 15, 2023, we apply various mathematical models to assess their effectiveness in predicting the 20-day variation in the pairs’ daily returns. Our analysis involves the implementation of Exponentially Weighted Moving Average (EWMA), Generalized Autoregressive Conditional Heteroskedasticity (GARCH) models, and Implied Volatility (IV) models. To evaluate their performance, we compare the accuracy of their predictions using Root Mean Square Error (RMSE) and Mean Absolute Error (MAE) metrics. We delve into the intricacies of GARCH models, examining their statistical characteristics when applied to the provided dataset. Our findings suggest the existence of asymmetric returns in the EUR/GBP pair, while such evidence is inconclusive for the GBP/USD pair. Additionally, we observe that GARCH-type models better fit the data when assuming residuals follow a standard t-distribution rather than a standard normal distribution. Furthermore, we investigate the efficacy of different forecasting techniques within GARCH-type models. Comparing rolling window forecasts to expanding window forecasts, we find no definitive superiority in either approach across the tested scenarios. Our experiments reveal that for the GBP/USD pair, the most accurate volatility forecasts stem from the utilization of GARCH models employing a rolling window methodology. Conversely, for the EUR/GBP pair, optimal forecasts are derived from GARCH models and Ordinary Least Squares (OLS) models incorporating the annualized implied volatility of the exchange rate as an independent variable. ...

February 12, 2024 · 2 min · Research Team
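
As a rough illustration of the kind of forecasters being compared, here is a minimal sketch (not the paper's code) of a RiskMetrics-style EWMA variance and a one-step GARCH(1,1) forecast with pre-set illustrative parameters, scored by RMSE against squared returns as a noisy realised-variance proxy; the paper itself targets a 20-day variation horizon and estimates parameters from data.

```python
# A minimal sketch of two of the compared forecasters on a return series.
# Parameters and the synthetic data are illustrative, not estimates from the paper.
import numpy as np

def ewma_variance(returns: np.ndarray, lam: float = 0.94) -> np.ndarray:
    var = np.empty_like(returns)
    var[0] = returns[:20].var()                      # seed with an initial sample variance
    for t in range(1, len(returns)):
        var[t] = lam * var[t - 1] + (1 - lam) * returns[t - 1] ** 2
    return var

def garch11_forecast(returns, omega=1e-6, alpha=0.08, beta=0.90):
    # One-step-ahead sigma^2_t = omega + alpha * r_{t-1}^2 + beta * sigma^2_{t-1}.
    var = np.empty_like(returns)
    var[0] = omega / (1 - alpha - beta)              # unconditional variance as the seed
    for t in range(1, len(returns)):
        var[t] = omega + alpha * returns[t - 1] ** 2 + beta * var[t - 1]
    return var

rng = np.random.default_rng(0)
r = 0.006 * rng.standard_normal(1250)                # ~5 years of daily FX-like returns
proxy = r ** 2                                       # realised-variance proxy
for name, v in [("EWMA", ewma_variance(r)), ("GARCH(1,1)", garch11_forecast(r))]:
    rmse = np.sqrt(np.mean((v - proxy) ** 2))
    print(f"{name}: RMSE vs squared returns = {rmse:.2e}")
```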

Contagion on Financial Networks: An Introduction

Contagion on Financial Networks: An Introduction ArXiv ID: 2402.08071 “View on arXiv” Authors: Unknown Abstract This mini-project models the propagation of shocks, at a point in time, through the links of a network of connected banks. In particular, a financial network of 100 banks, of which 15 are shocked to default (that is, 85.00% of the banks are solvent), is modelled as an Erdos-Renyi network: a directed, weighted and randomly generated network. Shocking some banks in a financial network means removing their assets and redistributing their liabilities to other connected banks in the network. The banks are nodes, and two ranges of probability values determine the tendency of a link existing between a pair of banks. Our major finding is that the ranges of probability values and the banks' percentage solvency are positively correlated. ...

February 12, 2024 · 2 min · Research Team
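
A toy version of the described experiment is easy to sketch. The snippet below builds a directed, weighted Erdos-Renyi exposure matrix over 100 banks, shocks 15 of them to default, and propagates losses by having creditors write off claims on defaulted banks until no new defaults appear; the balance-sheet sizes, the link probability and the simplified write-off mechanism are illustrative assumptions, not the paper's calibration.

```python
# A hedged toy of the described setup: 100 banks on a directed, weighted
# Erdos-Renyi network; 15 are shocked to default and losses cascade to creditors.
import numpy as np

rng = np.random.default_rng(42)
n_banks, p_link = 100, 0.05

# Interbank exposure matrix: exposures[i, j] is what bank i owes bank j.
adjacency = rng.random((n_banks, n_banks)) < p_link
np.fill_diagonal(adjacency, False)
exposures = adjacency * rng.uniform(1.0, 5.0, size=(n_banks, n_banks))

external_assets = rng.uniform(20.0, 40.0, size=n_banks)
equity = external_assets + exposures.sum(axis=0) - exposures.sum(axis=1)

defaulted = np.zeros(n_banks, dtype=bool)
defaulted[rng.choice(n_banks, size=15, replace=False)] = True

# Propagate: a defaulted bank's interbank liabilities are not repaid, so each
# creditor writes off what it was owed; iterate until no new defaults appear.
while True:
    losses = exposures[defaulted].sum(axis=0)        # write-offs hitting each creditor
    new_defaults = (~defaulted) & (equity - losses < 0)
    if not new_defaults.any():
        break
    defaulted |= new_defaults

print(f"solvent banks after contagion: {100 * (~defaulted).mean():.1f}%")
```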

Do Weibo platform experts perform better at predicting stock market?

Do Weibo platform experts perform better at predicting stock market? ArXiv ID: 2403.00772 “View on arXiv” Authors: Unknown Abstract Sentiment analysis can be used for stock market prediction. However, existing research has not studied the impact of a user’s financial background on sentiment-based forecasting of the stock market using artificial neural networks. In this work, a novel combination of neural networks is used for the assessment of sentiment-based stock market prediction, based on the financial background of the population that generated the sentiment. The state-of-the-art language processing model Bidirectional Encoder Representations from Transformers (BERT) is used to classify the sentiment and a Long Short-Term Memory (LSTM) model is used for time-series-based stock market prediction. For evaluation, the Weibo social networking platform is used as a sentiment data collection source. Weibo users (and their comments respectively) are divided into Authorized Financial Advisor (AFA) and Unauthorized Financial Advisor (UFA) groups according to their background information, as collected by Weibo. The Hong Kong Hang Seng index is used to extract historical stock market change data. The results indicate that stock market prediction learned from the AFA group users is 39.67% more precise than that learned from the UFA group users and shows the highest accuracy (87%) when compared to existing approaches. ...

February 12, 2024 · 2 min · Research Team
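
The overall pipeline shape can be sketched briefly: daily sentiment scores, assumed to be already produced by a BERT classifier and separated into AFA and UFA groups, are stacked with lagged index returns and fed to an LSTM that predicts next-day direction. The PyTorch sketch below is a hedged illustration; the window length, feature set and network sizes are our own placeholder choices, not the paper's configuration.

```python
# A hedged, minimal sketch of the pipeline shape (not the paper's code): 20-day
# windows of [daily sentiment score, daily index return] feed an LSTM classifier
# for the next-day direction of the Hang Seng index.
import torch
import torch.nn as nn

class SentimentLSTM(nn.Module):
    def __init__(self, n_features: int = 2, hidden: int = 32):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)        # logit for "index up tomorrow"

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        out, _ = self.lstm(x)                   # x: (batch, window, features)
        return self.head(out[:, -1])            # use the last hidden state

# Toy inputs: sentiment scores here are placeholders for BERT classifier outputs.
batch, window = 64, 20
x = torch.randn(batch, window, 2)
y = (torch.rand(batch, 1) > 0.5).float()        # placeholder up/down labels
model = SentimentLSTM()
loss = nn.BCEWithLogitsLoss()(model(x), y)
loss.backward()
```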

Finding Moving-Band Statistical Arbitrages via Convex-Concave Optimization

Finding Moving-Band Statistical Arbitrages via Convex-Concave Optimization ArXiv ID: 2402.08108 “View on arXiv” Authors: Unknown Abstract We propose a new method for finding statistical arbitrages that can contain more assets than just the traditional pair. We formulate the problem as seeking a portfolio with the highest volatility, subject to its price remaining in a band and a leverage limit. This optimization problem is not convex, but can be approximately solved using the convex-concave procedure, a specific sequential convex programming method. We show how the method generalizes to finding moving-band statistical arbitrages, where the price band midpoint varies over time. ...

February 12, 2024 · 1 min · Research Team
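
The convex-concave procedure mentioned in the abstract can be sketched in a few lines: the variance of the portfolio price series is convex in the weights, so maximising it is non-convex, and each CCP pass replaces that objective by its linearisation at the current iterate while keeping the convex band and leverage constraints. The cvxpy sketch below uses toy price data and illustrative band and leverage values; it is a sketch of the technique, not the paper's implementation.

```python
# A hedged sketch of convex-concave (CCP) iterations for a stat-arb search:
# maximise Var(prices @ w) subject to the portfolio price staying in a band
# and an L1 leverage limit; each pass linearises the convex objective at w_k.
import cvxpy as cp
import numpy as np

rng = np.random.default_rng(1)
T, n = 250, 6
prices = 100 + np.cumsum(rng.standard_normal((T, n)), axis=0)   # toy asset prices

midpoint, band, leverage = 0.0, 1.0, 2.0
w = cp.Variable(n)
w_k = rng.standard_normal(n) * 0.01                              # initial iterate
centred = prices - prices.mean(axis=0)

for _ in range(20):
    p = prices @ w                                               # portfolio price path
    # CCP step: replace the convex objective Var(p) by its linearisation at w_k.
    grad = 2.0 / T * centred.T @ (centred @ w_k)                 # d Var / d w at w_k
    objective = cp.Maximize(grad @ w)
    constraints = [cp.abs(p - midpoint) <= band, cp.norm1(w) <= leverage]
    cp.Problem(objective, constraints).solve()
    if np.linalg.norm(w.value - w_k) < 1e-6:
        break
    w_k = w.value

print("stat-arb weights:", np.round(w_k, 3))
```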

Blockchain Metrics and Indicators in Cryptocurrency Trading

Blockchain Metrics and Indicators in Cryptocurrency Trading ArXiv ID: 2403.00770 “View on arXiv” Authors: Unknown Abstract The objective of this paper is the construction of new indicators that can be useful for operating in the cryptocurrency market. These indicators are based on public data obtained from the blockchain network, specifically from the nodes that make up Bitcoin mining. Therefore, our analysis is unique to that network. The results obtained with numerical simulations of algorithmic trading and prediction via statistical models and Machine Learning demonstrate the importance of variables such as the hash rate, the difficulty of mining or the cost per transaction when it comes to trading Bitcoin assets or predicting the direction of the price. Variables obtained from the blockchain network will be called here blockchain metrics. The corresponding indicators (inspired by the “Hash Ribbon”) perform well in locating buy signals. From our results, we conclude that such blockchain indicators allow obtaining information with a statistical advantage in the highly volatile cryptocurrency market. ...

February 11, 2024 · 2 min · Research Team
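
As an example of turning a blockchain metric into an indicator, here is a hedged sketch of a Hash-Ribbon-style signal: a buy is flagged when a short moving average of the network hash rate crosses back above a long one after a capitulation phase. The 30/60-day windows and the synthetic hash-rate series are illustrative assumptions, not the paper's exact indicator definition.

```python
# A hedged sketch of a Hash-Ribbon-style indicator on a (synthetic) hash-rate series.
import numpy as np
import pandas as pd

rng = np.random.default_rng(7)
idx = pd.date_range("2023-01-01", periods=400, freq="D")
hash_rate = pd.Series(np.cumsum(rng.standard_normal(400)) + 300.0, index=idx)

ma_short = hash_rate.rolling(30).mean()
ma_long = hash_rate.rolling(60).mean()

# Buy when the short MA crosses back above the long MA (recovery after capitulation).
crossed_up = (ma_short > ma_long) & (ma_short.shift(1) <= ma_long.shift(1))
buy_signals = hash_rate.index[crossed_up]
print(f"buy signals: {len(buy_signals)} days, first few: {list(buy_signals[:3])}")
```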

RiskMiner: Discovering Formulaic Alphas via Risk Seeking Monte Carlo Tree Search

RiskMiner: Discovering Formulaic Alphas via Risk Seeking Monte Carlo Tree Search ArXiv ID: 2402.07080 “View on arXiv” Authors: Unknown Abstract Formulaic alphas are mathematical formulas that transform raw stock data into indicative signals. In the industry, a collection of formulaic alphas is combined to enhance modeling accuracy. Existing alpha-mining approaches employ only neural network agents and are unable to utilize the structural information of the solution space. Moreover, they do not consider the correlation between alphas in the collection, which limits synergistic performance. To address these problems, we propose a novel alpha mining framework, which formulates the alpha mining problem as a reward-dense Markov Decision Process (MDP) and solves the MDP by risk-seeking Monte Carlo Tree Search (MCTS). The MCTS-based agent fully exploits the structural information of the discrete solution space, and the risk-seeking policy explicitly optimizes the best-case performance rather than average outcomes. Comprehensive experiments are conducted to demonstrate the efficiency of our framework. Our method outperforms all state-of-the-art benchmarks on two real-world stock sets under various metrics. Backtest experiments show that our alphas achieve the most profitable results under a realistic trading setting. ...

February 11, 2024 · 2 min · Research Team
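
One way to read the "risk-seeking" ingredient is that node values are backed up as an upper quantile of simulated returns rather than their mean, so the search is steered toward best-case outcomes. The sketch below illustrates that idea with a minimal UCT-style selection rule; the quantile level, exploration constant and toy actions are our own assumptions, not the paper's algorithm.

```python
# A hedged illustration of risk-seeking value backup in MCTS: score nodes by an
# upper quantile of observed returns (mean would be the risk-neutral choice).
import math
import numpy as np

class Node:
    def __init__(self):
        self.returns: list[float] = []           # rewards backed up through this node
        self.children: dict[str, "Node"] = {}

    def value(self, q: float = 0.85) -> float:
        # Risk-seeking value: an upper quantile of the returns seen so far.
        return float(np.quantile(self.returns, q)) if self.returns else 0.0

    def visits(self) -> int:
        return len(self.returns)

def select_child(parent: Node, c_uct: float = 1.4) -> str:
    # UCT selection with the risk-seeking value in place of the usual mean.
    def score(child: Node) -> float:
        bonus = c_uct * math.sqrt(math.log(parent.visits() + 1) / (child.visits() + 1))
        return child.value() + bonus
    return max(parent.children, key=lambda a: score(parent.children[a]))

# Toy usage: two candidate alpha-building actions with simulated rewards.
root = Node()
root.returns = [0.1] * 20
root.children = {"add_ts_rank": Node(), "add_log_volume": Node()}
root.children["add_ts_rank"].returns = [0.05, 0.06, 0.30]      # volatile, high best case
root.children["add_log_volume"].returns = [0.12, 0.11, 0.13]   # steady
print("risk-seeking pick:", select_child(root))
```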

A monotone piecewise constant control integration approach for the two-factor uncertain volatility model

A monotone piecewise constant control integration approach for the two-factor uncertain volatility model ArXiv ID: 2402.06840 “View on arXiv” Authors: Unknown Abstract Option contracts on two underlying assets within uncertain volatility models have their worst-case and best-case prices determined by a two-dimensional (2D) Hamilton-Jacobi-Bellman (HJB) partial differential equation (PDE) with cross-derivative terms. This paper introduces a novel “decompose and integrate, then optimize” approach to tackle this HJB PDE. Within each timestep, our method applies piecewise constant control, yielding a set of independent linear 2D PDEs, each corresponding to a discretized control value. Leveraging closed-form Green’s functions, these PDEs are efficiently solved via 2D convolution integrals using a monotone numerical integration method. The value function and optimal control are then obtained by synthesizing the solutions of the individual PDEs. For enhanced efficiency, we implement the integration via Fast Fourier Transforms, exploiting the Toeplitz matrix structure. The proposed method is $\ell_\infty$-stable, consistent in the viscosity sense, and converges to the viscosity solution of the HJB equation. Numerical results show excellent agreement with benchmark solutions obtained by finite differences, tree methods, and Monte Carlo simulation, highlighting its robustness and effectiveness. ...

February 9, 2024 · 2 min · Research Team
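
The "decompose and integrate, then optimize" idea can be illustrated in one dimension: freeze the volatility control at each value of a discrete set, advance the value function one timestep by convolving it with the corresponding Gaussian Green's function (here via FFT-based convolution), and then take the pointwise maximum over controls for the worst-case price. The sketch below is a 1D toy with an illustrative grid and volatility band, ignoring drift and discounting; the paper treats the 2D problem with a monotone integration rule.

```python
# A hedged 1D toy of "decompose and integrate, then optimise" for one backward timestep.
import numpy as np
from scipy.signal import fftconvolve

x = np.linspace(-3.0, 3.0, 601)                      # log-price grid
dt = 1.0 / 100.0
strike = 1.0
value = np.maximum(np.exp(x) - strike, 0.0)          # payoff at maturity (call)

sigmas = np.linspace(0.15, 0.35, 9)                  # discretised volatility controls
candidates = []
for sigma in sigmas:
    # Green's function (heat kernel) of the frozen-coefficient PDE over one step.
    kernel = np.exp(-x**2 / (2.0 * sigma**2 * dt))
    kernel /= kernel.sum()                           # normalise on the grid
    candidates.append(fftconvolve(value, kernel, mode="same"))

value_worst_case = np.max(np.stack(candidates), axis=0)   # optimise over controls
print("worst-case value at the money:", float(value_worst_case[np.argmin(np.abs(x))]))
```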