## Martingales, Large Deviations and Bounds on Adaptive Algorithms

#### Eyal Lubetzky (Microsoft Research)

Optional stopping theorems for martingales are a simple yet powerful tool for analyzing stochastic processes.
We will discuss a useful large deviation inequality derived from these, and demonstrate its application to the analysis of adaptive algorithms through the following online perfect-matching problem.
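As a point of reference (not necessarily the exact inequality of the talk), a standard large deviation bound of this flavor is the Azuma–Hoeffding inequality for martingales with bounded increments:

```latex
% Azuma--Hoeffding: if (M_t) is a martingale with |M_t - M_{t-1}| \le c_t for all t, then
\Pr\bigl(\,|M_n - M_0| \ge \lambda\,\bigr)
  \;\le\; 2\exp\!\left(-\frac{\lambda^2}{2\sum_{t=1}^{n} c_t^2}\right).
```

Applied to the Doob martingale of a quantity tracked by an adaptive algorithm (e.g. the running number of collisions), bounds of this type control the probability of large fluctuations around the mean.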

Consider $n$ machines and $n/2$ rounds of the following: in each round, the online scheduling algorithm receives a request that must be assigned to one machine out of an independently and uniformly chosen subset of $k=k(n)$ machines. The goal of the algorithm is to produce a perfect allocation (no collisions between the assignments). It is easy and well known that $k$ of order $\log n$ is the threshold for the algorithm to succeed with high probability (w.h.p.). However, under a memory constraint -- the algorithm has only $m=m(n)$ bits of memory at its disposal, a question posed by Benjamini -- it is unclear whether any adaptive algorithm can achieve a perfect allocation w.h.p. For instance, is this possible with $k = O(\log n)$ choices and $m = n^{1-\epsilon}$ bits of memory?
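To make the unconstrained setting concrete, here is a minimal simulation sketch (not the algorithm of the talk) of the natural full-memory greedy strategy: assign each request to any still-free machine among its $k$ options, which succeeds w.h.p. once $k$ is of order $\log n$. The function name and parameters are illustrative.

```python
import random

def greedy_online_matching(n, k, rng):
    """Full-memory greedy baseline for the online allocation problem.

    Runs n/2 rounds; in each round a request arrives with an
    independent uniform subset of k machines, and greedy assigns it
    to the first still-free option.  Returns the number of
    collisions (requests whose k options were all already used).
    """
    used = [False] * n
    collisions = 0
    for _ in range(n // 2):
        choices = rng.sample(range(n), k)  # uniform k-subset of machines
        free = [m for m in choices if not used[m]]
        if free:
            used[free[0]] = True           # successful assignment
        else:
            collisions += 1                # all k options taken
    return collisions
```

For example, with $n = 1000$, a single choice ($k=1$) typically incurs on the order of $n/8$ collisions, while $k \approx 2\log_2 n$ choices typically incur none; note the greedy strategy uses $n$ bits of state, far above the memory budgets considered in the talk.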

In a recent work we obtain tight lower and upper bounds on the performance of such algorithms, and pinpoint the threshold in $k$ for a perfect allocation at order $n/m$. We show that for some absolute constants $c, C > 0$, any adaptive algorithm with $km < cn$ incurs order $n$ collisions w.h.p., whereas for $km > Cn$ there exists an adaptive algorithm that w.h.p. has no collisions.

Talk based on a joint work with N. Alon and O. Gurel-Gurevich.

Workshop I: Probabilistic Techniques and Applications