lighthouse

A Strong Invariance Principle for the Logarithmic Average of Sample Maxima

09 June 2001 | Paper

From convergence to approximation: how Brownian motion underpins the statistics of extreme events

Knowing that a statistic converges is not the same as knowing how it converges. For practitioners who build risk models on extreme value theory, the gap matters enormously. A convergence result tells you that, eventually, the logarithmic average of normalized maxima approaches a theoretical limit. An invariance principle tells you the precise stochastic structure of the deviations along the way—and that structure turns out to be Brownian motion. This paper by Dr. Ingo Fahrner, published in Stochastic Processes and their Applications (2001), establishes that strong invariance principle, connecting extreme value statistics to the full toolkit of Wiener process theory.

The result in plain terms. Prior work had shown that the logarithmic average (1/log n) ∑_{k≤n} (1/k) f((M_k − b_k)/a_k) converges almost surely to the integral of f against the extreme value distribution. What this paper adds is the precision: the centered and scaled version of that average, namely the sum ∑_{k≤n} (1/k) g((M_k − b_k)/a_k) − m log n for a test function g with limiting mean m, can be almost surely approximated by σW(log n), where W is a standard Wiener process and σ is an explicitly computed constant. The error is o((log n)^(1/2−η)) almost surely, making the approximation not just asymptotically valid but quantitatively sharp.
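As a quick numerical illustration (ours, not the paper's), the almost sure convergence can be checked by simulation. The sketch below assumes i.i.d. Exp(1) data, for which the normalizing constants are a_k = 1, b_k = log k and the limit law is the Gumbel distribution Λ, and uses the bounded test function f(x) = 1{x ≤ t}:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
x = rng.exponential(size=n)                 # i.i.d. Exp(1): a_k = 1, b_k = log k
k = np.arange(1, n + 1)
z = np.maximum.accumulate(x) - np.log(k)    # (M_k - b_k) / a_k along one path
t = 0.5
f_vals = (z <= t).astype(float)             # bounded test function f = 1{x <= t}
log_avg = np.sum(f_vals / k) / np.log(n)    # (1/log n) * sum_k (1/k) f(...)
gumbel_cdf = np.exp(-np.exp(-t))            # limit: Lambda(t) = exp(-e^{-t})
print(log_avg, gumbel_cdf)
```

Because the fluctuations only die off at rate 1/√(log n), the simulated average sits near the Gumbel limit but converges slowly, which is exactly why the quantitative error bound matters.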

Why this matters for tail risk measurement. The Wiener process approximation is what enables classical statistical inference to be applied to extreme value averages. Once it is established that the logarithmic average of maxima behaves like Brownian motion at the scale of √(log n), a suite of practical consequences follows immediately. The functional central limit theorem (Corollary 5) gives convergence of the whole trajectory, not just the endpoint, enabling path-based tests and monitoring. The law of the iterated logarithm (Corollary 6) gives precise almost sure upper and lower bounds on the fluctuations, directly usable as stress thresholds or backtesting benchmarks. For risk managers, these are not theoretical refinements: they are the tools needed to attach quantitative uncertainty to extreme value estimates.
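For instance, the law-of-the-iterated-logarithm bounds translate into an explicit fluctuation envelope. The sketch below is illustrative only: `sigma` stands for the paper's long-run variance constant (whose value must come from the explicit integral formula), and the envelope is simply the standard LIL bound for σW(t) evaluated at t = log n:

```python
import math

def lil_envelope(sigma: float, n: int) -> float:
    """Almost-sure fluctuation bound for the centered logarithmic-average
    sum: by the law of the iterated logarithm for sigma * W(log n), the
    limsup envelope is sigma * sqrt(2 * t * log(log(t))) with t = log n.
    Requires n > e^e so that the inner logarithms are defined and positive."""
    t = math.log(n)
    return sigma * math.sqrt(2.0 * t * math.log(math.log(t)))

# e.g. a stress threshold for one million observations, with a
# hypothetical sigma = 1 (a calibrated sigma would come from the data):
print(lil_envelope(1.0, 10**6))
```

The envelope scales linearly in σ and grows only like √(log n · log log log n), which is what makes it usable as a slowly widening backtesting band.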

A key technical advance: unbounded test functions. Earlier strong approximation results for extremes required the test function f to be bounded and of bounded variation with compact support. This paper works with a substantially broader class that admits unbounded functions, including |x|^r for any r ≥ 1, exponentials, and other functions of controlled growth. In financial applications this is the difference between treating only indicators of exceedances (bounded) and treating moment-based risk measures, expected shortfall, and power-law tail statistics (unbounded). The theoretical machinery needed to handle unbounded functions, working through the Markovian structure of the transformed extremal process, is a substantial part of the paper's contribution.
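To see why the extension matters in practice, here is a small simulation of our own with the unbounded test function f(x) = x. For i.i.d. Exp(1) data (a_k = 1, b_k = log k) the logarithmic average should approach ∫ x dΛ(x) = γ ≈ 0.5772, the mean of the Gumbel distribution, a moment-type quantity that a bounded-indicator theory cannot reach directly:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 1_000_000
x = rng.exponential(size=n)                # i.i.d. Exp(1)
k = np.arange(1, n + 1)
z = np.maximum.accumulate(x) - np.log(k)   # normalized running maxima
# Unbounded test function f(x) = x, admitted by the paper's broader class
log_avg = np.sum(z / k) / np.log(n)
print(log_avg)   # should be in the vicinity of Euler-Mascheroni, ~0.5772
```

The same pattern with f(x) = x² or f(x) = e^x would feed directly into moment-based or exponential-tilt risk statistics.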

The variance constant and its structural interpretation. The long-run variance σ² that governs the Wiener process approximation is given by an explicit integral formula involving the covariance structure of the underlying extremal process. This is not a nuisance parameter: it encodes the serial dependence of the extreme value statistics over time. For calibration purposes, σ² can be computed from the distribution of the data, and the resulting Wiener approximation then determines the correct width of confidence bands around logarithmic extreme value averages. Firms that use historical simulation or block maxima methods for VaR calibration are implicitly estimating something related to this variance, and this paper provides the theoretical foundation that justifies that practice.
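Concretely, once σ² is in hand, Gaussian band widths follow from the Wiener scaling: the centered sum behaves like N(0, σ² log n), so the logarithmic average itself carries a standard error of σ/√(log n). A minimal sketch, with a hypothetical σ rather than a calibrated one:

```python
import math

def band_halfwidth(sigma: float, n: int, z: float = 1.96) -> float:
    """Approximate two-sided confidence half-width for the logarithmic
    average: the centered sum behaves like sigma * W(log n), i.e.
    N(0, sigma^2 * log n); dividing by log n leaves the average with
    standard deviation sigma / sqrt(log n)."""
    return z * sigma / math.sqrt(math.log(n))

# Bands shrink like 1/sqrt(log n), i.e. very slowly in n:
print(band_halfwidth(1.0, 10**4), band_halfwidth(1.0, 10**8))
```

The 1/√(log n) rate is the quantitative version of the claim that extreme value averages need very large samples before their uncertainty becomes small.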

Connection to sums of minima. An instructive application of the main results is to the asymptotic behavior of S_n = ∑_{k=1}^n m_k, the cumulative sum of running minima of an i.i.d. sequence. The paper shows that S_n − log n is approximated almost surely by W(2 log n) with error o((log n)^(1/4+ε)). This recovers and sharpens earlier invariance principle results for sums of minima, and illustrates how the extremal process framework unifies the treatment of both upper and lower tail behavior.
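A quick Monte Carlo sketch of this example (ours; the paper's exact setup may differ in normalization) uses i.i.d. Uniform(0,1) variables, for which E[m_k] = 1/(k+1) and hence E[S_n] ≈ log n. If S_n − log n is well approximated by W(2 log n), then (S_n − log n)/√(2 log n) should look roughly standard normal across independent paths:

```python
import numpy as np

rng = np.random.default_rng(1)
n, reps = 100_000, 200
devs = np.empty(reps)
for i in range(reps):
    x = rng.random(n)                 # i.i.d. Uniform(0, 1)
    m = np.minimum.accumulate(x)      # running minima m_k
    s_n = m.sum()                     # S_n = sum_{k=1}^n m_k
    devs[i] = (s_n - np.log(n)) / np.sqrt(2.0 * np.log(n))
print(devs.mean(), devs.std())        # roughly 0 and 1 at this sample size
```

The spread of the normalized deviations is the empirical fingerprint of the W(2 log n) approximation.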

The practical bottom line. This paper establishes the rigorous foundation for treating the logarithmic average of extreme value statistics as a Brownian motion at logarithmic time scale. The consequences are concrete: confidence intervals for tail risk estimates derived from historical maxima can be constructed using standard Wiener process formulas, the rate at which those intervals shrink as sample size grows is quantitatively determined, and functional limit theorems justify monitoring those statistics over time rather than only at a fixed horizon. For institutions that use extreme value methods in market risk, insurance pricing, or operational risk, the invariance principle is the missing link between asymptotic theory and statistically valid applied inference.

Dr. Ingo Fahrner
Senior Manager | ingo.fahrner@ucg.de

With decades of experience in the financial sector, Ingo has developed valuation models for trading desks and insurers. He is an expert in meeting regulatory requirements and has both the theoretical knowledge and the practical experience to develop and implement the optimal valuation method for trading and risk management alike.