Convergence analysis of smoothed stochastic gradient-type algorithm

Nadav Berman, Arie Feuer, Elias Wahnon

Research output: Contribution to journal › Article › peer-review


Abstract

Stochastic gradient (SG) algorithms are widely used mainly because of their simplicity and ease of implementation. However, their performance, both in convergence rate and in steady-state behaviour, is often unsatisfactory. While maintaining the basic simplicity of gradient methods, the smoothed stochastic gradient (SSG) algorithm includes some additional processing of the data. There are strong indications that this additional processing in many cases results in improved performance. However, the convergence of this algorithm has remained an open problem. In this paper we present a rigorous analysis which shows, under very mild assumptions on the data, that the algorithm converges almost everywhere. The main tool of our analysis is the so-called 'associated differential equation', and we make use of a related theorem introduced by Kushner and Clark.
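The abstract does not reproduce the exact SSG recursion analysed in the paper. As an illustrative sketch only, the snippet below shows a common form of "additional processing": the instantaneous gradient estimate of a plain SG update is passed through an exponential smoothing stage before the parameter update. All names and constants (theta, g_smooth, beta, step) are assumptions for illustration, not the authors' notation.

```python
import numpy as np

# Hedged sketch: a generic smoothed stochastic gradient (SSG)-style update
# on a toy linear regression problem. This is NOT the exact algorithm of
# Berman, Feuer and Wahnon; it only illustrates the idea of smoothing the
# noisy gradient estimate while keeping a simple gradient-type update.

rng = np.random.default_rng(0)

w_true = np.array([1.0, -2.0])   # target parameters of the toy model
theta = np.zeros(2)              # parameter estimate
g_smooth = np.zeros(2)           # smoothed gradient estimate
beta = 0.9                       # smoothing factor (assumed value)
step = 0.05                      # step size (assumed value)

for k in range(5000):
    x = rng.normal(size=2)
    y = x @ w_true + 0.1 * rng.normal()

    # Instantaneous (noisy) gradient of the squared error, as in plain SG
    g_inst = (x @ theta - y) * x

    # Additional processing: exponential smoothing of the gradient estimate
    g_smooth = beta * g_smooth + (1.0 - beta) * g_inst

    # Gradient-type parameter update driven by the smoothed estimate
    theta = theta - step * g_smooth

print("estimate:", theta, "target:", w_true)
```

In ODE-method analyses of this kind, such a recursion is studied through its associated differential equation, whose stable equilibria characterise the possible limit points of the iterates; this is the role played here by the theorem of Kushner and Clark.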

Original language: English
Pages (from-to): 1061-1078
Number of pages: 18
Journal: International Journal of Systems Science
Volume: 18
Issue number: 6
DOIs
State: Published - 1 Jan 1987
Externally published: Yes

ASJC Scopus subject areas

  • Control and Systems Engineering
  • Theoretical Computer Science
  • Computer Science Applications
