
Using quantile time series and historical simulation to forecast financial risk multiple steps ahead

Using quantile time series and historical simulation to forecast financial risk multiple steps ahead ArXiv ID: 2502.20978 “View on arXiv” Authors: Unknown Abstract A method for quantile-based, semi-parametric historical simulation estimation of multiple-step-ahead Value-at-Risk (VaR) and Expected Shortfall (ES) models is developed. It uses the quantile loss function, analogous to how the quasi-likelihood is employed by standard historical simulation methods. The returns data are scaled by the estimated quantile series, then resampling is employed to estimate the forecast distribution one and multiple steps ahead, allowing tail risk forecasting. The proposed method is applicable to any data or model where the relationship between VaR and ES does not change over time and can be extended to allow a measurement equation incorporating realized measures, thus including Realized GARCH and Realized CAViaR type models. Its finite-sample properties, and its comparison with existing historical simulation methods, are evaluated via a simulation study. A forecasting study assesses the relative accuracy of the 1% and 2.5% VaR and ES one-day-ahead and ten-day-ahead forecasting results for the proposed class of models compared to several competitors. ...
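A minimal sketch of the quantile-scaled historical simulation idea the abstract describes: standardize returns by an estimated quantile series, then resample the standardized values to build one- and multi-step-ahead forecast distributions for VaR and ES. The crude constant quantile estimate and all function names here are illustrative assumptions, not the paper's fitted quantile model.

```python
import numpy as np

rng = np.random.default_rng(0)

def hs_var_es_multi_step(returns, quantile_series, alpha=0.01, horizon=10, n_paths=5000):
    # Standardize returns by the magnitude of the estimated quantile series,
    # analogous to dividing by volatility in filtered historical simulation.
    z = returns / np.abs(quantile_series)
    # Resample standardized values to simulate horizon-day cumulative returns,
    # rescaling each day by the latest quantile estimate (held constant here
    # for simplicity -- an assumption, not the paper's dynamic recursion).
    q_last = np.abs(quantile_series[-1])
    sims = q_last * rng.choice(z, size=(n_paths, horizon), replace=True)
    cum = sims.sum(axis=1)
    var = -np.quantile(cum, alpha)      # VaR reported as a positive loss
    es = -cum[cum <= -var].mean()       # ES: mean loss beyond the VaR level
    return var, es

# Toy data: heavy-tailed daily returns with a constant empirical 1% quantile.
returns = rng.standard_t(df=5, size=1000) * 0.01
q = np.full(returns.size, np.quantile(returns, 0.01))
var1, es1 = hs_var_es_multi_step(returns, q, horizon=1)
var10, es10 = hs_var_es_multi_step(returns, q, horizon=10)
```

Because ES averages losses beyond the VaR level, the sketch always yields ES at least as large as VaR, and the ten-day figures exceed the one-day ones as the abstract's multi-step setting implies.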

February 28, 2025 · 2 min · Research Team

Combining Deep Learning and GARCH Models for Financial Volatility and Risk Forecasting

Combining Deep Learning and GARCH Models for Financial Volatility and Risk Forecasting ArXiv ID: 2310.01063 “View on arXiv” Authors: Unknown Abstract In this paper, we develop a hybrid approach to forecasting the volatility and risk of financial instruments by combining common econometric GARCH time series models with deep learning neural networks. For the latter, we employ Gated Recurrent Unit (GRU) networks, while four different specifications are used as the GARCH component: standard GARCH, EGARCH, GJR-GARCH and APARCH. Models are tested using daily logarithmic returns on the S&P 500 index as well as gold and Bitcoin prices, with the three assets representing quite distinct volatility dynamics. As the main volatility estimator, also underlying the target function of our hybrid models, we use the price-range-based Garman-Klass estimator, modified to incorporate the opening and closing prices. Volatility forecasts resulting from the hybrid models are employed to evaluate the assets’ risk using the Value-at-Risk (VaR) and Expected Shortfall (ES) at two different tolerance levels of 5% and 1%. Gains from combining the GARCH and GRU approaches are discussed in the contexts of both the volatility and risk forecasts. In general, it can be concluded that the hybrid solutions produce more accurate point volatility forecasts, although this does not necessarily translate into superior VaR and ES forecasts. ...
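The Garman-Klass range-based estimator used as the volatility target can be sketched as below. The open/close modification shown, adding the squared overnight log-return from the previous close, is one common variant and only an assumption about the paper's exact choice.

```python
import numpy as np

def garman_klass(open_, high, low, close, prev_close=None):
    # Classic Garman-Klass daily variance from the intraday high-low range
    # and the open-to-close return.
    hl = np.log(high / low)
    co = np.log(close / open_)
    var = 0.5 * hl ** 2 - (2.0 * np.log(2.0) - 1.0) * co ** 2
    # Optional modification (an assumption here): add the squared overnight
    # log-return so gaps between sessions contribute to measured variance.
    if prev_close is not None:
        var = var + np.log(open_ / prev_close) ** 2
    return np.sqrt(var)  # daily volatility

vol = garman_klass(100.0, 101.0, 99.0, 100.5)
vol_with_gap = garman_klass(100.0, 101.0, 99.0, 100.5, prev_close=99.5)
```

Including the overnight term can only increase the estimate, which matches the intuition that gaps carry additional price risk.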

October 2, 2023 · 2 min · Research Team

Machine Learning and Hamilton-Jacobi-Bellman Equation for Optimal Decumulation: a Comparison Study

Machine Learning and Hamilton-Jacobi-Bellman Equation for Optimal Decumulation: a Comparison Study ArXiv ID: 2306.10582 “View on arXiv” Authors: Unknown Abstract We propose a novel data-driven neural network (NN) optimization framework for solving an optimal stochastic control problem under stochastic constraints. Customized activation functions for the output layers of the NN are applied, which permits training via standard unconstrained optimization. The optimal solution yields a multi-period asset allocation and decumulation strategy for a holder of a defined contribution (DC) pension plan. The objective function of the optimal control problem is based on expected wealth withdrawn (EW) and expected shortfall (ES) that directly targets left-tail risk. The stochastic bound constraints enforce a guaranteed minimum withdrawal each year. We demonstrate that the data-driven approach is capable of learning a near-optimal solution by benchmarking it against the numerical results from a Hamilton-Jacobi-Bellman (HJB) Partial Differential Equation (PDE) computational framework. ...
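The customized output-layer activations the abstract mentions can be illustrated as follows: a scaled sigmoid keeps each withdrawal inside its admissible band, so the guaranteed-minimum-withdrawal constraint holds by construction and training remains unconstrained, while a softmax keeps asset allocation weights nonnegative and summing to one. The bounds `q_min`/`q_max`, the NumPy setting, and both function names are illustrative assumptions, not the paper's exact architecture.

```python
import numpy as np

def bounded_withdrawal_activation(raw_output, q_min, q_max):
    # Scaled sigmoid: maps any unconstrained NN output into (q_min, q_max),
    # so the guaranteed minimum withdrawal is satisfied by construction.
    s = 1.0 / (1.0 + np.exp(-raw_output))
    return q_min + (q_max - q_min) * s

def allocation_activation(raw_logits):
    # Softmax: long-only portfolio weights that sum to one.
    e = np.exp(raw_logits - np.max(raw_logits))
    return e / e.sum()

withdrawals = bounded_withdrawal_activation(np.array([-5.0, 0.0, 5.0]),
                                            q_min=35.0, q_max=60.0)
weights = allocation_activation(np.array([1.0, 2.0, 3.0]))
```

Encoding the stochastic bound constraints in the activation, rather than via penalties, is what lets the authors use standard unconstrained optimizers.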

June 18, 2023 · 2 min · Research Team

The FRTB-IMA computational challenge for Equity Autocallables

The FRTB-IMA computational challenge for Equity Autocallables ArXiv ID: 2305.06215 “View on arXiv” Authors: Unknown Abstract When the Orthogonal Chebyshev Sliding Technique was introduced, it was applied to a portfolio of swaps and swaptions within the context of the FRTB-IMA capital calculation. The computational cost associated with computing the ES values - an essential component of the capital calculation under FRTB-IMA - was reduced by more than 90% while passing PLA tests. This paper extends the use of the Orthogonal Chebyshev Sliding Technique to portfolios of equity autocallables defined over a range of spot underlyings. Results are very positive: computational reductions of about 90% are achieved with passing PLA metrics. Since equity autocallables are a commonly traded exotic trade type with significant FRTB-IMA computational costs, the extension presented in this paper constitutes an important step forward in tackling the computational challenges associated with an efficient FRTB-IMA implementation. ...
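The core idea behind Chebyshev proxy techniques such as the one applied here can be sketched in one dimension: call the expensive pricer only at Chebyshev nodes over the spot range, then revalue all ES scenarios from the cheap interpolant. The toy pricer, the one-dimensional setting, and the function names are illustrative assumptions; the paper's sliding technique adds machinery beyond this sketch.

```python
import numpy as np
from numpy.polynomial import chebyshev as Cheb

def build_chebyshev_proxy(pricer, lo, hi, degree=16):
    # Chebyshev nodes on [-1, 1], mapped onto the spot range [lo, hi].
    nodes = np.cos(np.pi * (np.arange(degree + 1) + 0.5) / (degree + 1))
    spots = lo + 0.5 * (nodes + 1.0) * (hi - lo)
    values = np.array([pricer(s) for s in spots])  # the only expensive calls
    coeffs = Cheb.chebfit(nodes, values, degree)
    def proxy(spot):
        x = 2.0 * (spot - lo) / (hi - lo) - 1.0    # map spot back to [-1, 1]
        return Cheb.chebval(x, coeffs)
    return proxy

def expected_shortfall(pnl, alpha=0.025):
    # Average of the worst alpha-fraction of P&L outcomes, as a positive loss.
    k = max(1, int(np.ceil(alpha * len(pnl))))
    return -np.sort(pnl)[:k].mean()

# Toy smooth pricer standing in for an expensive autocallable valuation.
pricer = lambda s: np.log(s) + 0.05 * s
proxy = build_chebyshev_proxy(pricer, 80.0, 120.0)

rng = np.random.default_rng(1)
scenarios = rng.uniform(85.0, 115.0, size=10_000)
pnl = proxy(scenarios) - proxy(100.0)
es = expected_shortfall(pnl)
```

With 17 pricer calls replacing 10,000, the roughly 90% cost reductions reported in the abstract become plausible whenever the pricing function is smooth enough in the risk factors for the interpolant to pass PLA-style accuracy checks.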

May 10, 2023 · 2 min · Research Team