A Stochastic Quasi-Newton Method for Large-Scale Nonconvex Optimization with Applications

Huiming Chen, Ho Chun Wu, Shing Chow Chan, Wong Hing Lam

Research output: Contribution to journal › Article › peer-review

30 Citations (Scopus)

Abstract

Ensuring positive definiteness and avoiding ill conditioning of the Hessian update in the stochastic Broyden-Fletcher-Goldfarb-Shanno (BFGS) method are critical when solving nonconvex problems. This article proposes a novel stochastic version of a damped and regularized BFGS method to address these problems. While the proposed regularization strategy prevents the BFGS matrix from approaching singularity, the new damped parameter further ensures the positivity of the product of the correction pairs. To reduce the computational cost of the stochastic limited-memory BFGS (LBFGS) updates and to improve their robustness, the curvature information is updated using the averaged iterate at spaced intervals. The effectiveness of the proposed method is evaluated on logistic regression and Bayesian logistic regression problems in machine learning. Numerical experiments are conducted on both a synthetic data set and several real data sets. The results show that the proposed method generally outperforms the stochastic damped LBFGS (SdLBFGS) method. In particular, for problems with small sample sizes, our method shows superior performance and is capable of mitigating ill conditioning. Furthermore, our method is more robust than SdLBFGS to variations in batch size and memory size.

Original language: English
Article number: 8946904
Pages (from-to): 4776-4790
Number of pages: 15
Journal: IEEE Transactions on Neural Networks and Learning Systems
Volume: 31
Issue number: 11
DOIs
Publication status: Published - Nov 2020
Externally published: Yes

Keywords

  • Damped parameter
  • limited memory BFGS (LBFGS)
  • nonconjugate exponential models
  • nonconvex optimization
  • stochastic quasi-Newton (SQN) method
  • variational inference
