How much gradient noise does a gradient-based linesearch method tolerate?

             by S. Gratton, Ph. L. Toint and A. Troeltzsch

                          Report NAXYS-04-2012

Among numerical methods for smooth unconstrained optimization, gradient-based
linesearch methods, such as quasi-Newton methods, may work quite well even in
the presence of noise of relatively high amplitude in the gradient of the
objective function.  We present a bound on the amplitude of this noise which
ensures a descent direction for such a method. Exploiting this bound, we also
discuss conditions under which global convergence can be guaranteed.
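To make the descent-direction idea concrete, here is a minimal numerical sketch
(not the report's actual analysis) for the simplest linesearch case, steepest
descent: if the noisy gradient is g_tilde = g + e with noise amplitude
||e|| < ||g||, then the direction d = -g_tilde still satisfies the descent
condition g^T d <= -||g||(||g|| - ||e||) < 0. All numbers below are
illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# True gradient at the current iterate (hypothetical values); ||g|| = 5.
g = np.array([3.0, -4.0])

# Gradient noise e, rescaled so that ||e|| = 0.9 * ||g|| < ||g||.
# By Cauchy-Schwarz, g^T d = -||g||^2 - g^T e <= -||g||(||g|| - ||e||) < 0,
# so the noisy steepest-descent direction remains a descent direction.
e = rng.standard_normal(2)
e *= 0.9 * np.linalg.norm(g) / np.linalg.norm(e)

d = -(g + e)  # noisy steepest-descent direction

# Descent check: the directional derivative g^T d must be negative.
print("g^T d =", g @ d)
assert g @ d < 0
```

With a general quasi-Newton direction d = -B^{-1} g_tilde the admissible noise
level also depends on the conditioning of B, which is the kind of refinement
the report's bound addresses.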