Gradient methods for convex minimization: better rates under weaker conditions
H. Zhang, W. Yin
Submitted for publication

Overview

The convergence behavior of gradient methods for minimizing convex differentiable functions is one of the core questions in convex optimization.
This paper shows that their well-known complexities can be achieved under conditions weaker than the commonly accepted ones.
We relax the common gradient Lipschitz-continuity condition and strong convexity condition to ones that hold only over certain line segments.
Specifically, we establish the well-known complexities O(1/ε) and O(1/√ε) for the ordinary and accelerated gradient methods, respectively, assuming the gradient is Lipschitz continuous only over certain line segments rather than globally. Then we improve the rates above to linear rates, O(log(1/ε)), for functions that additionally satisfy a relaxed, restricted form of strong convexity. Not only are the relaxed conditions met by more functions, the restrictions also give smaller Lipschitz-type constants and larger strong-convexity-type constants, and thus lead to better complexity bounds.

Citation

H. Zhang and W. Yin, Gradient methods for convex minimization: better rates under weaker conditions, Rice CAAM technical report 13-04, 2013.
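For readers unfamiliar with the two schemes whose rates are discussed above, the following is a minimal, self-contained sketch (Python/NumPy, not taken from the paper) of the ordinary and Nesterov-accelerated gradient methods applied to a least-squares problem. It uses the standard global Lipschitz step size 1/L, not the relaxed, segment-restricted constants analyzed in the paper; the problem instance and all names are illustrative only.

```python
# Illustrative sketch (not the paper's algorithm or analysis): ordinary vs.
# Nesterov-accelerated gradient methods on f(x) = 0.5 * ||A x - b||^2,
# whose gradient has global Lipschitz constant L = ||A||_2^2.
import numpy as np

def gradient_method(grad, x0, L, iters):
    """Ordinary gradient method with step size 1/L: x_{k+1} = x_k - (1/L) grad(x_k)."""
    x = x0.copy()
    for _ in range(iters):
        x = x - grad(x) / L
    return x

def accelerated_gradient_method(grad, x0, L, iters):
    """Nesterov's accelerated gradient method for smooth convex f (the O(1/k^2) scheme)."""
    x, y, t = x0.copy(), x0.copy(), 1.0
    for _ in range(iters):
        x_new = y - grad(y) / L                        # gradient step from the extrapolated point
        t_new = (1 + np.sqrt(1 + 4 * t**2)) / 2        # momentum parameter update
        y = x_new + ((t - 1) / t_new) * (x_new - x)    # extrapolation (momentum) step
        x, t = x_new, t_new
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A, b = rng.standard_normal((60, 20)), rng.standard_normal(60)
    f = lambda x: 0.5 * np.linalg.norm(A @ x - b) ** 2
    grad = lambda x: A.T @ (A @ x - b)                 # gradient of f
    L = np.linalg.norm(A, 2) ** 2                      # global Lipschitz constant of the gradient
    x0 = np.zeros(20)
    print("ordinary   :", f(gradient_method(grad, x0, L, 200)))
    print("accelerated:", f(accelerated_gradient_method(grad, x0, L, 200)))
```

With the same iteration budget, the accelerated variant typically reaches a noticeably smaller objective value, reflecting the O(1/√ε) versus O(1/ε) complexities mentioned in the overview.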