A Greedy Anytime Algorithm for Sparse PCA

Guy Holtzman, Adam Soffer, Dan Vilenchik

Research output: Contribution to conference › Paper › peer-review

Abstract

The taxing computational effort involved in solving some high-dimensional statistical problems, in particular problems involving non-convex optimization, has popularized the development and analysis of algorithms that run efficiently (in polynomial time) but offer no general guarantee of statistical consistency. In light of ever-increasing compute power and decreasing costs, a more useful characterization of algorithms is by their ability to calibrate the invested computational effort to the characteristics of the input at hand and to the available computational resources. We propose a new greedy algorithm for the ℓ0-sparse PCA problem that supports this calibration principle. We provide a rigorous analysis of our algorithm in the spiked covariance model, as well as simulation results and a comparison with existing methods. Our findings show that our algorithm recovers the spike in SNR regimes where all polynomial-time algorithms fail, while running in reasonable parallel time on a cluster.
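For orientation only, the following is a minimal sketch (Python/NumPy) of a generic greedy forward-selection heuristic for ℓ0-sparse PCA: at each step it adds the coordinate that most increases the leading eigenvalue of the restricted sample covariance. This is not the algorithm proposed in the paper; the function name greedy_sparse_pca, the sparsity parameter k, and the synthetic spiked-covariance example are assumptions made purely for illustration.

    # Illustrative sketch: a generic greedy forward-selection heuristic for
    # l0-sparse PCA. NOT the algorithm analyzed in the paper; it only shows
    # the kind of combinatorial search the abstract refers to.
    import numpy as np

    def greedy_sparse_pca(X, k):
        """Greedily pick k coordinates maximizing the leading eigenvalue of the
        restricted sample covariance; returns the support and that eigenvalue."""
        sigma = np.cov(X, rowvar=False)          # p x p sample covariance
        support, best_val = [], -np.inf
        candidates = set(range(sigma.shape[0]))
        for _ in range(k):
            best_j, best_val = None, -np.inf
            for j in candidates:
                idx = support + [j]
                # leading eigenvalue of the principal submatrix on idx
                val = np.linalg.eigvalsh(sigma[np.ix_(idx, idx)])[-1]
                if val > best_val:
                    best_j, best_val = j, val
            support.append(best_j)
            candidates.remove(best_j)
        return sorted(support), best_val

    if __name__ == "__main__":
        # Hypothetical synthetic data from a spiked covariance model
        rng = np.random.default_rng(0)
        p, n, k, snr = 50, 200, 5, 3.0
        v = np.zeros(p)
        v[:k] = 1.0 / np.sqrt(k)                 # planted k-sparse spike
        cov = np.eye(p) + snr * np.outer(v, v)
        X = rng.multivariate_normal(np.zeros(p), cov, size=n)
        print(greedy_sparse_pca(X, k))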
Original language: English
Pages: 1939-1956
Number of pages: 18
State: Published - 2020
Event: The 33rd Annual Conference on Learning Theory (COLT 2020) - Graz, Austria
Duration: 9 Jul 2020 - 12 Jul 2020
Conference number: 33rd
Internet address: http://learningtheory.org/colt2020/cfp.html
