Dimension Reduction
Distributed matrix completion and robust factorization.
L. Mackey, A. Talwalkar, and M. I. Jordan.
Journal of Machine Learning Research, 16, 913-960, 2015.
Iterative discovery of multiple alternative clustering views.
D. Niu, J. Dy, and M. I. Jordan.
IEEE Transactions on Pattern Analysis and Machine Intelligence,
36, 1340-1353, 2014.
Distributed low-rank subspace segmentation.
L. Mackey, A. Talwalkar, Y. Mu, S-F. Chang, and M. I. Jordan.
IEEE International Conference on Computer Vision (ICCV), Sydney, Australia, 2013.
Divide-and-conquer matrix factorization.
L. Mackey, A. Talwalkar, and M. I. Jordan.
In P. Bartlett, F. Pereira, J. Shawe-Taylor, and R. Zemel (Eds.)
Advances in Neural Information Processing Systems (NIPS) 25, 2012.
Dimensionality reduction for spectral clustering.
D. Niu, J. Dy, and M. I. Jordan.
In G. Gordon and D. Dunson (Eds.)
Proceedings of the Fourteenth Conference on Artificial Intelligence
and Statistics (AISTATS), Ft. Lauderdale, FL, 2011.
A unified probabilistic model for global and local unsupervised feature selection.
Y. Guan, J. Dy, and M. I. Jordan.
In L. Getoor and T. Scheffer (Eds.),
Proceedings of the 28th International Conference on Machine
Learning (ICML), Bellevue, WA, 2011.
Unsupervised kernel dimension reduction.
M. Wang, F. Sha, and M. I. Jordan.
In J. Shawe-Taylor, R. Zemel, J. Lafferty, and C. Williams (Eds.)
Advances in Neural Information Processing Systems (NIPS) 24, 2011.
Regularized discriminant analysis, ridge regression and beyond.
Z. Zhang, G. Dai, C. Xu, and M. I. Jordan.
Journal of Machine Learning Research, 11, 2141-2170, 2010.
Kernel dimension reduction in regression.
K. Fukumizu, F. R. Bach, and M. I. Jordan.
Annals of Statistics, 37, 1871-1905, 2009.
DiscLDA: Discriminative learning for dimensionality reduction and classification.
S. Lacoste-Julien, F. Sha, and M. I. Jordan.
In D. Koller, Y. Bengio, D. Schuurmans, and L. Bottou (Eds.),
Advances in Neural Information Processing Systems (NIPS) 22, 2009.
Latent variable models for dimensionality reduction.
Z. Zhang and M. I. Jordan.
Proceedings of the Twelfth Conference on Artificial Intelligence
and Statistics (AISTATS), Clearwater Beach, FL, 2009.
Regression on manifolds using kernel dimension reduction.
J. Nilsson, F. Sha, and M. I. Jordan.
Proceedings of the 24th International Conference on Machine
Learning (ICML), 2007.
Semiparametric latent factor models.
Y. W. Teh, M. Seeger, and M. I. Jordan.
Proceedings of the Tenth Conference on Artificial Intelligence
and Statistics (AISTATS), 2005.
Dimensionality reduction for supervised learning with reproducing kernel
Hilbert spaces.
K. Fukumizu, F. R. Bach, and M. I. Jordan.
Journal of Machine Learning Research, 5, 73-99, 2004.
Kernel dimensionality reduction for supervised learning.
K. Fukumizu, F. R. Bach, and M. I. Jordan.
In S. Thrun, L. Saul, and B. Schoelkopf (Eds.),
Advances in Neural Information Processing Systems (NIPS) 17, 2004.
Beyond independent components: Trees and clusters.
F. R. Bach and M. I. Jordan.
Journal of Machine Learning Research, 4, 1205-1233, 2003.
Latent Dirichlet allocation.
D. M. Blei, A. Y. Ng, and M. I. Jordan.
Journal of Machine Learning Research, 3, 993-1022, 2003.
[C code].
Finding clusters in independent component analysis.
F. R. Bach and M. I. Jordan.
Fourth International Symposium on Independent Component Analysis
and Blind Signal Separation (ICA), 2003.
Tree-dependent component analysis.
F. R. Bach and M. I. Jordan.
In D. Koller and A. Darwiche (Eds.), Uncertainty in Artificial Intelligence (UAI),
Proceedings of the Eighteenth Conference, 2002.
Kernel independent component analysis.
F. R. Bach and M. I. Jordan.
Journal of Machine Learning Research, 3, 1-48, 2002.
[Matlab code]