Estimating the Minimizer and the Minimum Value of a Regression Function under Passive Design


Date/Time

22 January 2024
11:00 - 12:00


Speaker: Arya Akhavan (CMAP, École Polytechnique de Paris)

We propose a new method for estimating the minimizer $x^*$ and the minimum value $f^*$ of a smooth and strongly convex regression function $f$ from observations contaminated by random noise. Our estimator $z_n$ of the minimizer $x^*$ is based on a version of projected gradient descent, with the gradient estimated by a regularized local polynomial algorithm. We then propose a two-stage procedure for estimating the minimum value $f^*$ of the regression function $f$. In the first stage, we construct a sufficiently accurate estimator of $x^*$, which can be, for example, $z_n$. In the second stage, we estimate the function value at the point obtained in the first stage using a rate-optimal nonparametric procedure. We derive non-asymptotic upper bounds for the quadratic risk and the optimization risk of $z_n$, as well as for the risk of estimating $f^*$. We establish minimax lower bounds showing that, under a certain choice of parameters, the proposed algorithms achieve the minimax optimal rates of convergence on the class of smooth and strongly convex functions.
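As a rough illustration of the first step, the following minimal Python sketch runs projected gradient descent driven by a generic noisy gradient oracle. The step-size schedule, the projection set (a Euclidean ball), and the toy gradient oracle are illustrative assumptions; in particular, the oracle stands in for the regularized local polynomial gradient estimator described in the abstract, which is not reproduced here.

```python
import numpy as np

def project_to_ball(x, radius=1.0):
    """Euclidean projection onto a ball of the given radius
    (a stand-in for the projection step; the actual constraint
    set used in the talk may differ)."""
    norm = np.linalg.norm(x)
    return x if norm <= radius else x * (radius / norm)

def projected_gradient_descent(grad_estimator, x0, n_iter=200, step=0.5, radius=1.0):
    """Generic projected gradient descent with an estimated gradient.

    `grad_estimator(x, k)` is assumed to return a (noisy) estimate of the
    gradient of the regression function at x at iteration k."""
    z = np.array(x0, dtype=float)
    for k in range(1, n_iter + 1):
        g_hat = grad_estimator(z, k)
        # Decreasing step size and projection back onto the constraint set.
        z = project_to_ball(z - (step / k) * g_hat, radius)
    return z

# Toy usage: estimate the minimizer of f(x) = ||x - 0.3||^2 from noisy gradients.
rng = np.random.default_rng(0)
noisy_grad = lambda x, k: 2.0 * (x - 0.3) + 0.1 * rng.standard_normal(x.shape)
z_n = projected_gradient_descent(noisy_grad, x0=np.zeros(2))
print(z_n)  # close to (0.3, 0.3)
```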