
Multilevel Monte Carlo in Sample Average Approximation: Convergence, Complexity and Application

ArXiv ID: 2407.18504 · View on arXiv · Authors: Unknown

Abstract: In this paper, we examine the Sample Average Approximation (SAA) procedure within a framework where the Monte Carlo estimator of the expectation is biased. We also introduce Multilevel Monte Carlo (MLMC) in the SAA setup to enhance the computational efficiency of solving optimization problems. In this context, we conduct a thorough analysis, exploiting Cramér's large deviation theory, to establish uniform convergence, quantify the convergence rate, and determine the sample complexity for both standard Monte Carlo and MLMC paradigms. Additionally, we perform a root-mean-squared error analysis utilizing tools from empirical process theory to derive sample complexity without relying on the finite moment condition typically required for uniform convergence results. Finally, we validate our findings and demonstrate the advantages of the MLMC estimator through numerical examples, estimating Conditional Value-at-Risk (CVaR) in the Geometric Brownian Motion and nested expectation framework. ...
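
To make the MLMC-within-SAA idea concrete, here is a minimal Python sketch of CVaR estimation under a Geometric Brownian Motion: the Rockafellar-Uryasev formulation serves as the SAA objective, and the expectation inside it is replaced by a multilevel sample average over Euler discretisations coupled by shared Brownian increments. This is only an illustrative reading of the abstract, not the paper's algorithm; the model parameters, the level-wise sample allocation, and the helper names `coupled_gbm` and `mlmc_cvar_saa` are assumptions.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def coupled_gbm(n_paths, level, M=2, S0=1.0, mu=0.05, sigma=0.2, T=1.0, rng=None):
    """Terminal GBM values from Euler schemes on a fine grid (M**level steps)
    and the next-coarser grid, driven by the same Brownian increments."""
    rng = np.random.default_rng() if rng is None else rng
    nf = M**level
    dt = T / nf
    dW = rng.normal(0.0, np.sqrt(dt), size=(n_paths, nf))
    Sf = np.full(n_paths, S0)
    for k in range(nf):                          # fine Euler scheme
        Sf = Sf * (1.0 + mu * dt + sigma * dW[:, k])
    if level == 0:
        return Sf, None
    dWc = dW.reshape(n_paths, nf // M, M).sum(axis=2)
    Sc = np.full(n_paths, S0)
    for k in range(nf // M):                     # coarse Euler scheme
        Sc = Sc * (1.0 + mu * (M * dt) + sigma * dWc[:, k])
    return Sf, Sc

def mlmc_cvar_saa(alpha=0.9, L=4, N0=20000, loss=lambda s: -s, seed=0):
    """SAA of CVaR_alpha(loss(S_T)) via the Rockafellar-Uryasev objective,
    with the inner expectation replaced by a multilevel sample average."""
    rng = np.random.default_rng(seed)
    samples = []                                 # per level: (fine, coarse) losses
    for l in range(L + 1):
        Nl = max(200, int(N0 * 2 ** (-1.5 * l)))  # heuristic sample allocation
        Sf, Sc = coupled_gbm(Nl, l, rng=rng)
        samples.append((loss(Sf), None if Sc is None else loss(Sc)))

    def saa_objective(t):
        # telescoping MLMC estimate of E[(loss - t)^+], then t + (.)/(1 - alpha)
        est = 0.0
        for Xf, Xc in samples:
            est += np.maximum(Xf - t, 0.0).mean()
            if Xc is not None:
                est -= np.maximum(Xc - t, 0.0).mean()
        return t + est / (1.0 - alpha)

    res = minimize_scalar(saa_objective, bounds=(-5.0, 5.0), method="bounded")
    return res.x, res.fun                        # (VaR-like minimiser, CVaR estimate)

if __name__ == "__main__":
    var_hat, cvar_hat = mlmc_cvar_saa()
    print(f"VaR ~ {var_hat:.4f}, CVaR ~ {cvar_hat:.4f}")
```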

July 26, 2024 · 2 min · Research Team

Nested Multilevel Monte Carlo with Biased and Antithetic Sampling

ArXiv ID: 2308.07835 · View on arXiv · Authors: Unknown

Abstract: We consider the problem of estimating a nested structure of two expectations taking the form $U_0 = E[\max\{U_1(Y), \pi(Y)\}]$, where $U_1(Y) = E[X \mid Y]$. Terms of this form arise in financial risk estimation and option pricing. When $U_1(Y)$ requires approximation, but exact samples of $X$ and $Y$ are available, an antithetic multilevel Monte Carlo (MLMC) approach has been well studied in the literature. Under general conditions, the antithetic MLMC estimator obtains a root mean squared error $\varepsilon$ with order $\varepsilon^{-2}$ cost. If, additionally, $X$ and $Y$ require approximate sampling, careful balancing of the various aspects of approximation is required to avoid a significant computational burden. Under strong convergence criteria on approximations to $X$ and $Y$, randomised multilevel Monte Carlo techniques can be used to construct unbiased Monte Carlo estimates of $U_1$, which can be paired with an antithetic MLMC estimate of $U_0$ to recover order $\varepsilon^{-2}$ computational cost. In this work, we instead consider biased multilevel approximations of $U_1(Y)$, which require less strict assumptions on the approximate samples of $X$. Extensions of the method consider approximate and antithetic sampling of $Y$. Analysis shows the resulting estimator has order $\varepsilon^{-2}$ asymptotic cost under the conditions required by randomised MLMC and order $\varepsilon^{-2}|\log\varepsilon|^3$ cost under more general assumptions. ...
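
For intuition about the antithetic nested estimator of $U_0$, the sketch below implements one common form of the level-wise antithetic correction (fine inner average versus the average of two half-sized inner averages) on a toy Gaussian model in which $Y \sim N(0,1)$, $X \mid Y \sim N(Y,1)$, and $\pi \equiv 0$. The model, the sample allocations, and the function names `antithetic_nested_level` and `nested_mlmc` are illustrative assumptions, not the paper's exact construction.

```python
import numpy as np

def antithetic_nested_level(l, n_outer, pi=lambda y: 0.0, N0=2, rng=None):
    """One level of an antithetic nested MLMC estimator for
    U_0 = E[max{U_1(Y), pi(Y)}] with U_1(Y) = E[X | Y].
    Toy model: Y ~ N(0,1), X | Y ~ N(Y,1), so U_1(Y) = Y."""
    rng = np.random.default_rng() if rng is None else rng
    Nl = N0 * 2**l                                   # inner samples at level l
    Y = rng.normal(size=n_outer)
    X = rng.normal(loc=Y[:, None], scale=1.0, size=(n_outer, Nl))
    piY = np.array([pi(y) for y in Y])
    fine = np.maximum(X.mean(axis=1), piY)
    if l == 0:
        return fine.mean()
    # antithetic coarse term: reuse the same inner samples, split into halves
    half = Nl // 2
    coarse_a = np.maximum(X[:, :half].mean(axis=1), piY)
    coarse_b = np.maximum(X[:, half:].mean(axis=1), piY)
    return (fine - 0.5 * (coarse_a + coarse_b)).mean()

def nested_mlmc(L=5, n_outer0=100000, rng=None):
    """Telescoping sum over levels; outer sample sizes decay geometrically
    (a heuristic allocation, not an optimised one)."""
    rng = np.random.default_rng(0) if rng is None else rng
    return sum(antithetic_nested_level(l, max(1000, n_outer0 // 2**l), rng=rng)
               for l in range(L + 1))

if __name__ == "__main__":
    # Exact U_0 for the toy model is E[max(Y, 0)] = 1/sqrt(2*pi) ~ 0.3989;
    # the level-L estimator carries a small inner-approximation bias.
    print(nested_mlmc())
```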

August 15, 2023 · 2 min · Research Team