The Multilevel Monte Carlo (MLMC) method is a variance reduction technique that significantly accelerates Monte Carlo simulations by using a hierarchy of models with different levels of accuracy and computational cost.
Use Case
It is ideal for models that can be solved at multiple levels of discretization, such as finite element models where the mesh resolution can be varied to trade accuracy against computational cost.
Core Idea
It breaks the problem down using a “telescoping sum” identity (written out below the list), so that:
- It runs a large number of simulations on the cheapest, coarsest model to get a baseline estimate.
- It runs a small number of simulations on the finer, more expensive models to estimate the difference (or correction) between successive levels.
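With $P_\ell$ denoting the quantity of interest computed at discretization level $\ell$ (level $0$ the coarsest, level $L$ the finest; this notation is introduced here for concreteness), the identity reads

$$\mathbb{E}[P_L] = \mathbb{E}[P_0] + \sum_{\ell=1}^{L} \mathbb{E}[P_\ell - P_{\ell-1}]$$

and each expectation on the right-hand side is estimated with its own Monte Carlo sample.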
Why it’s effective
The method’s power comes from the fact that the models at successive levels are strongly correlated when each correction is estimated with coupled samples, i.e. the same random inputs are fed to both levels.
- The variance of the difference between two adjacent levels is much smaller than the variance of either model individually.
- Because this variance is low, very few samples are needed to accurately estimate the correction terms.
- This drastically reduces the number of required expensive, high-fidelity simulations, leading to a large gain in computational efficiency for the same overall accuracy, as the short numerical sketch below illustrates.
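As a quick numerical illustration (a toy example with hypothetical names, not taken from the text above): let the “fine” model be exact evaluation of $\exp(u)$ for $u \sim \mathrm{Uniform}(0,1)$, and the “coarse” model a cheap piecewise-linear interpolant of $\exp$ on a 5-point grid. Evaluating both on the same inputs shows how small the variance of the difference is compared with the variance of the fine model itself:

```python
import numpy as np

rng = np.random.default_rng(0)
u = rng.random(1_000_000)                     # shared random inputs (the coupling)

# "Fine" model: exact exp(u). "Coarse" model: piecewise-linear interpolant of exp
# on a 5-point grid, standing in for a cheap, low-resolution solver.
grid = np.linspace(0.0, 1.0, 5)
p_fine = np.exp(u)
p_coarse = np.interp(u, grid, np.exp(grid))

print("Var[fine]          :", p_fine.var())               # roughly 0.24
print("Var[fine - coarse] :", (p_fine - p_coarse).var())  # orders of magnitude smaller
```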
Estimator
Similar to the control variates method, we split the target into a cheap baseline and a low-variance correction:

$$\mathbb{E}[P_L] = \mathbb{E}[P_0] + \mathbb{E}[P_L - P_0]$$

Where:
- $\mathbb{E}[P_0]$: The cheap, coarse baseline estimate.
- $\mathbb{E}[P_L]$: The desired, expensive expected value.
- $\mathbb{E}[P_L - P_0]$: The correction term, which has a very low variance and therefore needs only a few samples to estimate.
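Continuing the same toy model (exact $\exp$ playing the role of the expensive fine model $P_L$, the coarse interpolant playing $P_0$; a minimal sketch, not the original author's example), the two-level split turns into a few lines: many cheap samples for the baseline, a handful of coupled samples for the correction.

```python
import numpy as np

rng = np.random.default_rng(1)
grid = np.linspace(0.0, 1.0, 5)

def coarse(u):
    """Cheap baseline model P_0: piecewise-linear interpolant of exp on a coarse grid."""
    return np.interp(u, grid, np.exp(grid))

u_baseline = rng.random(1_000_000)    # many cheap samples to estimate E[P_0]
u_correction = rng.random(2_000)      # few coupled samples to estimate E[P_L - P_0]

estimate = coarse(u_baseline).mean() + (np.exp(u_correction) - coarse(u_correction)).mean()
print("two-level estimate:", estimate, " reference E[exp(U)] =", np.e - 1.0)
```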
The MLMC estimator approximates the expected value of the finest-level model, $\mathbb{E}[P_L]$, by summing the expected value of the coarsest model with a series of corrections from progressively finer model levels:

$$\hat{P}^{\mathrm{MLMC}} = \frac{1}{N_0} \sum_{i=1}^{N_0} P_0^{(i)} + \sum_{\ell=1}^{L} \frac{1}{N_\ell} \sum_{i=1}^{N_\ell} \left( P_\ell^{(i)} - P_{\ell-1}^{(i)} \right)$$

Where:
- First Term: Represents the standard Monte Carlo estimate using $N_0$ samples of the coarsest, cheapest model ($P_0$).
- Second Term (Summation): Represents the sum of the estimated corrections. Each term in the sum estimates the difference between a finer level ($P_\ell$) and the next-coarsest level ($P_{\ell-1}$) using $N_\ell$ samples.
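For completeness, here is a minimal sketch of the full estimator on a more typical hierarchy: Euler–Maruyama discretizations of a geometric Brownian motion, where level $\ell$ uses $2^\ell$ times as many time steps as level $0$, and each correction couples the fine and coarse paths through the same Brownian increments. All function names and parameter values are illustrative assumptions, not taken from the original text.

```python
import numpy as np

rng = np.random.default_rng(2)

def level_samples(level, n_samples, T=1.0, mu=0.05, sigma=0.2, s0=1.0, m0=4):
    """Samples of P_0 when level == 0, otherwise samples of P_l - P_{l-1},
    where P_l is the Euler-Maruyama terminal value of a GBM using m0 * 2**l
    time steps. Fine and coarse paths share the same Brownian increments."""
    n_fine = m0 * 2 ** level
    dt_f = T / n_fine
    dW = rng.normal(0.0, np.sqrt(dt_f), size=(n_samples, n_fine))

    s_fine = np.full(n_samples, s0)
    for k in range(n_fine):
        s_fine = s_fine + mu * s_fine * dt_f + sigma * s_fine * dW[:, k]
    if level == 0:
        return s_fine                              # baseline samples P_0

    dt_c = 2.0 * dt_f
    dW_c = dW[:, 0::2] + dW[:, 1::2]               # pair up the fine increments
    s_coarse = np.full(n_samples, s0)
    for k in range(n_fine // 2):
        s_coarse = s_coarse + mu * s_coarse * dt_c + sigma * s_coarse * dW_c[:, k]
    return s_fine - s_coarse                       # correction samples P_l - P_{l-1}

# N_0 >> N_1 >> ... : many cheap coarse samples, few expensive fine ones.
samples_per_level = [200_000, 20_000, 2_000, 500, 100]
estimate = sum(level_samples(l, n).mean() for l, n in enumerate(samples_per_level))
print("MLMC estimate of E[S_T]:", estimate, " reference s0*exp(mu*T) =", np.exp(0.05))
```

Note how the sample counts shrink as the level index grows: almost all of the work goes into the cheap coarse level, while the expensive fine levels only need enough samples to pin down their small corrections.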