Gradient-enhanced sparse Hermite polynomial expansions for pricing and hedging high-dimensional American options
arXiv ID: 2405.02570
Authors: Unknown
Abstract
We propose an efficient and easy-to-implement gradient-enhanced least squares Monte Carlo method for computing the price and Greeks (i.e., derivatives of the price function) of high-dimensional American options. It employs a sparse Hermite polynomial expansion as a surrogate model for the continuation value function, and crucially exploits the fast evaluation of the expansion's gradients. The expansion coefficients are computed by solving a linear least squares problem enhanced with gradient information from the simulated paths. We analyze the convergence of the proposed method and establish an error estimate in terms of the best approximation error in the weighted $H^1$ space, the statistical error of solving the discrete least squares problem, and the time step size. Comprehensive numerical experiments illustrate the method's performance: it outperforms the state-of-the-art least squares Monte Carlo method, producing more accurate prices, Greeks, and optimal exercise strategies in high dimensions at nearly identical computational cost, and it delivers results comparable to those of recent neural network-based methods up to dimension 100.
Keywords: American options, Least squares Monte Carlo, Hermite polynomials, Greeks, High-dimensional pricing
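The abstract's core step, fitting the expansion to both simulated values and their gradients by least squares, can be sketched in one dimension. The following is a minimal illustration, not the paper's implementation: it assumes NumPy's probabilists' Hermite basis (numpy.polynomial.hermite_e.hermevander), a synthetic put-like target standing in for the continuation value, and a hypothetical gradient weight lam; the actual method uses a sparse multi-index expansion over many assets, with value and gradient data coming from simulated paths.

```python
import numpy as np
from numpy.polynomial.hermite_e import hermevander

rng = np.random.default_rng(0)

# Synthetic 1-D stand-in for the continuation value and its derivative.
# In the paper's setting, x would be a simulated asset state and (y, dy)
# pathwise value/gradient data; this smooth put-like target is an assumption.
f = lambda x: np.maximum(1.0 - x, 0.0) ** 2
df = lambda x: -2.0 * np.maximum(1.0 - x, 0.0)

n, deg = 200, 6
x = rng.standard_normal(n)
y, dy = f(x), df(x)

# Vandermonde matrix of probabilists' Hermite polynomials He_0..He_deg,
# and its derivative via the identity He_n'(x) = n * He_{n-1}(x).
V = hermevander(x, deg)                      # shape (n, deg + 1)
dV = np.zeros_like(V)
dV[:, 1:] = V[:, :-1] * np.arange(1, deg + 1)

# Gradient-enhanced least squares: stack value and gradient equations.
lam = 1.0                                    # gradient-block weight (illustrative)
A = np.vstack([V, lam * dV])
b = np.concatenate([y, lam * dy])
coef, *_ = np.linalg.lstsq(A, b, rcond=None)

# The fitted expansion approximates f (and f') on fresh points.
xt = np.linspace(-2.0, 2.0, 5)
print(np.abs(hermevander(xt, deg) @ coef - f(xt)).max())
```

Stacking the derivative rows adds extra equations per path at little extra cost, which is the sense in which the regression is "gradient-enhanced" and why the abstract can claim nearly identical cost to plain least squares Monte Carlo.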
Complexity vs Empirical Score
- Math Complexity: 8.5/10
- Empirical Rigor: 7.0/10
- Quadrant: Holy Grail
- Why: The paper involves advanced mathematics including weighted Sobolev spaces, backward stochastic differential equations, and Malliavin calculus for convergence analysis, while also providing comprehensive numerical experiments with performance comparisons and error estimates.
Method Flowchart
flowchart TD
Start(["Research Goal<br>Efficient Pricing & Greeks for<br>High-Dim American Options"]) --> Inputs
subgraph Inputs ["Data/Inputs"]
S(["Simulated Asset Paths"])
H(["Hermite Polynomials"])
O(["American Option Payoffs"])
end
Inputs --> Methodology
subgraph Methodology ["Methodology Steps"]
G(["Compute Gradient Information<br>from Paths"])
LS(["Solve Gradient-Enhanced<br>Least Squares Problem"])
C(["Construct Sparse Hermite<br>Surrogate Model"])
end
Methodology --> Processes
subgraph Processes ["Computational Processes"]
P1(["Price via Backward Induction<br>using Surrogate Model"])
P2(["Compute Greeks<br>Numerically"])
P3(["Determine Optimal<br>Exercise Strategy"])
end
Processes --> Outcomes
subgraph Outcomes ["Key Findings/Outcomes"]
F1(["High Accuracy in Prices & Greeks"])
F2(["Efficient High-Dim Scaling<br>up to d=100"])
F3(["Outperforms Standard LS-MC<br>and Neural Networks"])
end
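The flowchart's pricing step is the least squares Monte Carlo backward induction into which the gradient-enhanced regression plugs. Below is a minimal sketch of the classical value-only baseline for a one-dimensional Bermudan put, with illustrative dynamics and parameters (geometric Brownian motion, S0 = K = 1, r = 0.05, sigma = 0.2) that do not come from the paper; the paper's variant would additionally stack gradient equations as in the earlier sketch and use a sparse basis up to d = 100.

```python
import numpy as np
from numpy.polynomial.hermite_e import hermevander

rng = np.random.default_rng(1)

# Illustrative parameters for a 1-D Bermudan put (all values are assumptions).
S0, K, r, sigma, T = 1.0, 1.0, 0.05, 0.2, 1.0
n_paths, n_steps, deg = 20_000, 50, 4
dt = T / n_steps
disc = np.exp(-r * dt)

# Simulate geometric Brownian motion paths on the exercise grid.
z = rng.standard_normal((n_paths, n_steps))
S = np.exp(np.log(S0)
           + np.cumsum((r - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * z, axis=1))

payoff = lambda s: np.maximum(K - s, 0.0)

# Backward induction: start from the payoff at maturity, then at each earlier
# date regress the discounted future value on a Hermite basis of the current
# state to estimate the continuation value, and exercise where it is beaten.
V = payoff(S[:, -1])
for t in range(n_steps - 2, -1, -1):
    V *= disc                                 # discount one step back
    itm = payoff(S[:, t]) > 0                 # regress on in-the-money paths only
    if not itm.any():
        continue
    x = np.log(S[itm, t] / S0) / (sigma * np.sqrt(T))   # crude standardization
    A = hermevander(x, deg)
    coef, *_ = np.linalg.lstsq(A, V[itm], rcond=None)
    ex = payoff(S[itm, t])
    V[itm] = np.where(ex > A @ coef, ex, V[itm])

print("LSM price:", disc * V.mean())
```

Restricting the regression to in-the-money paths follows standard LSM practice; the gradient-enhanced variant would replace the plain fit inside the loop with the stacked value-plus-gradient system from the previous sketch.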