Fast and Robust Online Inference with Stochastic Gradient Descent via Random Scaling

Cited 4 times in Web of Science; cited 9 times in Scopus
Authors

Lee, Sokbae; Liao, Yuan; Seo, Myung Hwan; Shin, Youngki

Issue Date
2022-06
Publisher
Association for the Advancement of Artificial Intelligence
Citation
Proceedings of the AAAI Conference on Artificial Intelligence, Vol. 36, pp. 7381-7389
Abstract
We develop a new method of online inference for a vector of parameters estimated by the Polyak-Ruppert averaging procedure of stochastic gradient descent (SGD) algorithms. We leverage insights from time series regression in econometrics and construct asymptotically pivotal statistics via random scaling. Our approach is fully operational with online data and is rigorously underpinned by a functional central limit theorem. The proposed inference method has two key advantages over existing methods. First, the test statistic is computed in an online fashion using only SGD iterates, and the critical values can be obtained without any resampling, allowing efficient implementation on massive online data. Second, there is no need to estimate the asymptotic variance, and in simulation experiments with synthetic data the method is shown to be robust to changes in the tuning parameters of the SGD algorithms.
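
The following is a minimal sketch of the idea on a simulated linear regression: run SGD, maintain the Polyak-Ruppert average online, accumulate the random-scaling matrix from the running averages, and form coordinate-wise confidence intervals. This is not the authors' reference code; the step-size schedule, variable names, and the 6.747 critical value for the coordinate-wise pivotal statistic are assumptions for illustration and should be checked against the paper.

import numpy as np

rng = np.random.default_rng(0)

d = 5                        # number of parameters
theta_true = np.ones(d)      # hypothetical true coefficient vector
n = 100_000                  # number of streaming observations

theta = np.zeros(d)          # current SGD iterate
theta_bar = np.zeros(d)      # Polyak-Ruppert average of the iterates
A = np.zeros((d, d))         # running sum of s^2 * bar_s bar_s'
b = np.zeros(d)              # running sum of s^2 * bar_s
c = 0.0                      # running sum of s^2

for t in range(1, n + 1):
    # One streaming observation from a linear model y = x'theta + noise.
    x = rng.standard_normal(d)
    y = x @ theta_true + rng.standard_normal()
    lr = 0.5 * t ** (-0.505)              # step size t^{-a}, a in (1/2, 1)
    theta -= lr * (x @ theta - y) * x     # SGD step on the squared loss
    theta_bar += (theta - theta_bar) / t  # online averaging update
    A += t**2 * np.outer(theta_bar, theta_bar)
    b += t**2 * theta_bar
    c += t**2

# Random-scaling matrix, assembled from the running sums:
# V_n = n^{-2} * sum_{s<=n} s^2 (bar_s - bar_n)(bar_s - bar_n)'
V = (A - np.outer(b, theta_bar) - np.outer(theta_bar, b)
     + c * np.outer(theta_bar, theta_bar)) / n**2

# Coordinate-wise 95% confidence intervals. 6.747 is taken here as the
# 97.5% critical value of the pivotal limit distribution (an assumption;
# verify against the paper's tables before use).
cv = 6.747
half_width = cv * np.sqrt(np.diag(V) / n)
for j in range(d):
    print(f"theta[{j}]: {theta_bar[j]:.4f} +/- {half_width[j]:.4f}")

Note that every quantity is updated in O(d^2) work per observation with no stored history, no resampling, and no estimate of the asymptotic variance, which is how the sketch reflects the two advantages claimed in the abstract.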
ISSN
2159-5399
URI
https://hdl.handle.net/10371/190033
Files in This Item:
There are no files associated with this item.
Items in S-Space are protected by copyright, with all rights reserved, unless otherwise indicated.
