Virtual Talk: Learning low-degree functions in a logarithmic number of random queries.

Paata Ivanisvili
University of California, Irvine (UCI)

Perhaps the most basic question one asks in learning theory is the following: we have an unknown function f on the hypercube {-1,1}^n, and we are allowed to query samples (X, f(X)), where X is uniformly distributed on {-1,1}^n. After receiving the samples (X_1, f(X_1)), ..., (X_N, f(X_N)), we would like to construct a function h which approximates f up to an error epsilon (say, in L^2). Of course, h is a random function, as its construction involves the i.i.d. random variables X_1, ..., X_N; therefore, we ask that h approximate f with probability at least 1 - delta. So, given parameters epsilon, delta in (0,1), the goal is to minimize the number of random queries N. I will show that around log(n) random queries are sufficient (up to a multiplicative factor depending on d, epsilon, and delta) to learn bounded functions of degree at most d. Based on joint work with Alexandros Eskenazis.
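To make the query model concrete, here is a minimal Python sketch of the classical low-degree (Fourier) learning approach: estimate each Fourier coefficient hat_f(S) with |S| <= d by an empirical average over the random samples, and output the corresponding low-degree polynomial h. This only illustrates the sampling setup; it is not the algorithm from the talk, whose point is precisely that a sharper estimator needs only O_{d,epsilon,delta}(log n) queries, whereas the naive estimator below requires polynomially many in n. All names below are hypothetical.

```python
import itertools
import numpy as np

def learn_low_degree(X, y, d):
    """Empirically estimate the Fourier coefficients hat_f(S) for all |S| <= d.

    X : (N, n) array with entries in {-1, +1}, the random query points.
    y : (N,) array of observed values f(X_i).
    Returns a dict mapping each subset S (a tuple of coordinates) to
        hat_f(S) ~ (1/N) * sum_i y_i * prod_{j in S} X_i[j],
    an unbiased estimate of the Fourier coefficient of f on S.
    """
    N, n = X.shape
    coeffs = {}
    for k in range(d + 1):
        for S in itertools.combinations(range(n), k):
            chi_S = X[:, list(S)].prod(axis=1)  # character chi_S(X_i); empty product is 1
            coeffs[S] = float(y @ chi_S) / N
    return coeffs

def h(coeffs, x):
    """Evaluate the hypothesis h(x) = sum_{|S| <= d} hat_f(S) * prod_{j in S} x[j]."""
    return sum(c * np.prod(x[list(S)]) for S, c in coeffs.items())

# Example: learn a degree-2 function of n = 10 variables from N random queries.
rng = np.random.default_rng(0)
n, N = 10, 5000
f = lambda x: 0.5 * x[0] * x[1] - 0.25 * x[2]   # unknown degree-2 target, bounded by 1
X = rng.choice([-1, 1], size=(N, n))
y = np.apply_along_axis(f, 1, X)
coeffs = learn_low_degree(X, y, d=2)
x_test = rng.choice([-1, 1], size=n)
print(h(coeffs, x_test), f(x_test))              # h(x_test) should be close to f(x_test)
```

In this toy run, the empirical coefficient coeffs[(0, 1)] should be close to 0.5 and coeffs[(2,)] close to -0.25, so h is a good L^2 approximation of f with high probability; the talk concerns how small N can be made while retaining the (epsilon, delta) guarantee.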

