# jemdoc: menu{MENU}{pubs_nips2011.html}, showsource
= Large-Scale Sparse Principal Component Analysis with Application to Text Data

- *Download:* [Pubs/nips2011.zhang.elghaoui.pdf .pdf]

- *Authors:* Youwei Zhang and Laurent El Ghaoui.

- *Status:* Accepted for publication in /Proc. Advances in Neural Information Processing Systems (NIPS)/, December 2011.

- *Abstract:* Sparse PCA provides a linear combination of a small number of features that maximizes variance across data. Although sparse PCA has apparent advantages over PCA, such as better interpretability, it is generally thought to be computationally much more expensive. In this paper, we demonstrate the surprising fact that sparse PCA can be easier than PCA in practice, and that it can be reliably applied to very large data sets. This result follows from a rigorous feature elimination pre-processing step, coupled with the favorable fact that features in real-life data typically have exponentially decreasing variances, which allows many features to be eliminated. We introduce a fast block coordinate ascent algorithm with much better computational complexity than existing first-order methods. We provide experimental results obtained on text corpora involving millions of documents and hundreds of thousands of features. These results illustrate how sparse PCA can help organize a large corpus of text data in a user-interpretable way, providing an attractive alternative to topic models. A rough sketch of the feature elimination step appears at the end of this page.

#- *Related links:*

- *Bibtex reference:*
~~~
{}{}
@inproceedings{ZE:11,
  Author = {Youwei Zhang and Laurent {El Ghaoui}},
  Booktitle = {Advances in Neural Information Processing Systems (NIPS)},
  Title = {Large-Scale Sparse Principal Component Analysis with Application to Text Data},
  Year = {2011}}
~~~
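
- *Illustrative sketch:* The snippet below is a rough illustration (not code from the paper) of the variance-based safe feature elimination the abstract describes: assuming the test takes the form "drop any feature whose squared column norm, which is proportional to its variance after centering, is at most the sparsity penalty rho", such features cannot enter the optimal sparse principal component and can be removed before solving. The helper name +eliminate_features+ and all numerical values are assumptions made for the demo.
~~~
{}{python}
import numpy as np

def eliminate_features(A, rho):
    # A: (n_samples, n_features) column-centered data matrix; rho: sparsity penalty.
    # Assumed safe elimination test: a feature i with ||a_i||^2 <= rho cannot
    # appear in the optimal sparse principal component, so drop its column.
    col_sq_norms = np.einsum('ij,ij->j', A, A)  # ||a_i||^2 for every column
    keep = np.flatnonzero(col_sq_norms > rho)
    return A[:, keep], keep

# Toy usage with exponentially decaying feature variances, as the paper
# observes in real text data: most features then fall below the threshold.
rng = np.random.default_rng(0)
n, m = 1000, 500
scales = np.exp(-0.05 * np.arange(m))           # exponentially decaying std devs
A = rng.standard_normal((n, m)) * scales
A -= A.mean(axis=0)                             # center each column
A_reduced, kept = eliminate_features(A, rho=5.0)
print(f"kept {kept.size} of {m} features")
~~~

With these assumed settings only a few dozen of the 500 synthetic features survive, which is the effect the abstract relies on: exponentially decaying variances let the pre-processing shrink the problem dramatically before the block coordinate ascent algorithm runs.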