A mathematical perspective.
We’ve just witnessed Musetti’s match—an extremely talented player, full of flair, who struggled today under tough conditions: swirling wind, an uneven court, lots of surrounding noise. A difficult environment, and ultimately, a loss. But perhaps it’s not just psychological—it might also be mathematical. Players like Djokovic tend to perform more consistently even under stress, while others, like Federer in his day, were more susceptible to external factors.
This short piece explores why, from a mathematical perspective, players with more consistent baseline performance—those we call “steady” or “regular”—are less likely to collapse when external conditions become difficult.
In many precision sports—from golf to tennis—people often refer to a player’s consistency: how much their performance varies around their average level. This article explores how tail inequalities offer a mathematical framework to explain why a more consistent player is less exposed to sudden performance crashes caused by external disruptions.
Basic model: signal plus noise
- Stable signal \(X\), with:
  \(\mathbb{E}[X] = \mu\), \(\mathrm{Var}(X) = \sigma_X^2\)
- External noise \(N\), with:
  \(\mathbb{E}[N] = 0\), \(\mathrm{Var}(N) = \sigma_N^2\)
The overall performance on a single point is the sum:
$$ Y = X + N,\quad \mathrm{Var}(Y) = \sigma_X^2 + \sigma_N^2, $$
where the variances add because \(X\) and \(N\) are assumed independent.
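As a quick sanity check, the additivity of variances is easy to simulate. All numbers below (the player's mean level and the two spreads) are made-up illustration values, and Gaussians stand in for whatever the real distributions are:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Hypothetical parameters: mean level mu, intrinsic spread, external spread
mu, sigma_X, sigma_N = 70.0, 5.0, 8.0

X = rng.normal(mu, sigma_X, n)    # intrinsic performance
N = rng.normal(0.0, sigma_N, n)   # zero-mean external noise, independent of X
Y = X + N

# Empirical variance should be close to sigma_X**2 + sigma_N**2 = 25 + 64 = 89
print(Y.var())
```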
1. Hoeffding’s inequality (1963)
If \(X\) and \(N\) are sub-Gaussian (for instance, bounded) with variance proxies \(\sigma_X^2\) and \(\sigma_N^2\), then for any \(\varepsilon>0\):
$$ P\bigl(Y - \mathbb{E}[Y] \le -\varepsilon \bigr) \;\le\; \exp\!\Bigl(-\frac{\varepsilon^2}{2(\sigma_X^2 + \sigma_N^2)}\Bigr). $$
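This bound can be checked numerically. In the sketch below the two components are Gaussian (for which the variance proxy equals the variance), and the spreads are assumed illustration values, not measurements:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000
sigma_X, sigma_N = 5.0, 8.0          # assumed spreads
sigma2 = sigma_X**2 + sigma_N**2

# Centered performance Y - E[Y], simulated with Gaussian X and N
Z = rng.normal(0, sigma_X, n) + rng.normal(0, sigma_N, n)

for eps in (10.0, 20.0, 30.0):
    empirical = (Z <= -eps).mean()
    bound = np.exp(-eps**2 / (2 * sigma2))
    print(f"eps={eps}: empirical={empirical:.5f}  bound={bound:.5f}")
```

For each \(\varepsilon\), the empirical frequency of a drop of that size sits below the exponential bound, as the inequality promises.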
2. Bernstein’s inequality (1924)
If a random variable \(Z\) has zero mean, variance \(\tau^2\) and is bounded by \(\lvert Z\rvert\le M\), then for any \(\varepsilon>0\):
$$ P(Z \le -\varepsilon) \;\le\; \exp\!\Bigl(-\frac{\varepsilon^2}{2\tau^2 + \tfrac{2}{3}M\,\varepsilon}\Bigr). $$
Applying this to \(Z = Y - \mathbb{E}[Y]\) with \(\tau^2 = \sigma_X^2 + \sigma_N^2\), we get:
$$ P\bigl(Y - \mathbb{E}[Y] \le -\varepsilon \bigr) \;\le\; \exp\!\Bigl(-\frac{\varepsilon^2} {2(\sigma_X^2+\sigma_N^2) + \tfrac{2}{3}M\,\varepsilon}\Bigr). $$
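To see how the intrinsic variance enters this bound, one can plug in illustrative numbers; \(M\), \(\varepsilon\), and all variances below are assumptions chosen for the sketch, not measured values:

```python
import numpy as np

sigma_N2 = 64.0  # assumed strong external noise (harsh conditions)
M = 40.0         # assumed almost-sure bound on |Y - E[Y]|
eps = 25.0       # size of the collapse considered

def bernstein_bound(sigma_X2):
    """Bernstein upper bound on P(Y - E[Y] <= -eps)."""
    tau2 = sigma_X2 + sigma_N2
    return np.exp(-eps**2 / (2 * tau2 + (2 / 3) * M * eps))

steady = bernstein_bound(4.0)    # consistent player: low intrinsic variance
erratic = bernstein_bound(49.0)  # volatile player: high intrinsic variance
print(steady, erratic)
```

The steady player always gets the smaller bound; note that the \(M\varepsilon\) term in the denominator dampens the gap, so Bernstein's advantage over the plain sub-Gaussian bound shows up mainly when \(M\) is small relative to \(\varepsilon\).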
3. Implications for tennis
- “Steady” player: low \(\sigma_X^2\) → reduced intrinsic variability
- Adverse conditions: high \(\sigma_N^2\) → strong external noise
- Total variance: \(\sigma_X^2 + \sigma_N^2\)
Both tail inequalities guarantee that, even under significant noise, the probability of a sudden “collapse” (a performance far below average) stays exponentially small, and the exponent improves as the intrinsic variance \(\sigma_X^2\) decreases.
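The same effect is visible directly in the Hoeffding-type bound above: holding the noise fixed and shrinking \(\sigma_X^2\) tightens the guarantee. The variance values below are illustrative assumptions:

```python
import numpy as np

sigma_N2, eps = 64.0, 25.0  # fixed strong noise and collapse size (assumed values)

variances = (49.0, 25.0, 9.0, 1.0)  # progressively "steadier" players
bounds = [np.exp(-eps**2 / (2 * (s2 + sigma_N2))) for s2 in variances]
for s2, b in zip(variances, bounds):
    print(f"sigma_X^2={s2}: bound={b:.4f}")
# The bound shrinks monotonically as the intrinsic variance drops
```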
“Those who maintain consistency under ideal conditions withstand external shocks better.”
References
- Hoeffding, W. (1963). "Probability Inequalities for Sums of Bounded Random Variables." Journal of the American Statistical Association, 58(301), 13–30.
- Bernstein, S. (1924). "On a Modification of Chebyshev's Inequality and of the Error Formula of Laplace."