Bayesian Nonparametrics

  • Evaluating sensitivity to the stick breaking prior in Bayesian nonparametrics. R. Liu, R. Giordano, M. I. Jordan, and T. Broderick. arXiv:1810.06587, 2018.

  • Posteriors, conjugacy, and exponential families for completely random measures. T. Broderick, A. Wilson, and M. I. Jordan. Bernoulli, 24, 3181-3221, 2018.

  • Combinatorial clustering and the beta negative binomial process. T. Broderick, L. Mackey, J. Paisley and M. I. Jordan. IEEE Transactions on Pattern Analysis and Machine Intelligence, 37, 290-306, 2015.

  • Nested hierarchical Dirichlet processes. J. Paisley, C. Wang, D. Blei, and M. I. Jordan. IEEE Transactions on Pattern Analysis and Machine Intelligence, 37, 256-270, 2015.

  • Joint modeling of multiple time series via the beta process with application to motion capture segmentation. E. Fox, M. Hughes, E. Sudderth, and M. I. Jordan. Annals of Applied Statistics, 8, 1281-1313, 2014.

  • Mixed membership models for time series. E. Fox and M. I. Jordan. In E. Airoldi, D. Blei, E. A. Erosheva, and S. E. Fienberg (Eds.), Handbook of Mixed Membership Models and Their Applications, Chapman & Hall/CRC, 2014.

  • Mixed membership matrix factorization. L. Mackey, D. Weiss, and M. I. Jordan. In E. Airoldi, D. Blei, E. A. Erosheva, and S. E. Fienberg (Eds.), Handbook of Mixed Membership Models and Their Applications, Chapman & Hall/CRC, 2014.

  • Bayesian nonnegative matrix factorization with stochastic variational inference. J. Paisley, D. Blei, and M. I. Jordan. In E. Airoldi, D. Blei, E. A. Erosheva, and S. E. Fienberg (Eds.), Handbook of Mixed Membership Models and Their Applications, Chapman & Hall/CRC, 2014.

  • Matrix-variate Dirichlet process priors with applications. Z. Zhang, D. Wang, G. Dai, and M. I. Jordan. Bayesian Analysis, 9, 259-286, 2014.

  • Clusters and features from combinatorial stochastic processes. T. Broderick, M. I. Jordan, and J. Pitman. Statistical Science, 28, 289-312, 2013.

  • MAD-Bayes: MAP-based asymptotic derivations from Bayes. T. Broderick, B. Kulis, and M. I. Jordan. In S. Dasgupta and D. McAllester (Eds.), Proceedings of the 30th International Conference on Machine Learning (ICML), Atlanta, GA, 2013. [Supplementary information].

  • Mixed membership models for time series. E. Fox and M. I. Jordan. arXiv:1309.3533, 2013.

  • Feature allocations, probability functions, and paintboxes. T. Broderick, J. Pitman, and M. I. Jordan. Bayesian Analysis, 8, 801-836, 2013.

  • Small-variance asymptotics for exponential family Dirichlet process mixture models. K. Jiang, B. Kulis, and M. I. Jordan. In P. Bartlett, F. Pereira, L. Bottou and C. Burges (Eds.), Advances in Neural Information Processing Systems (NIPS) 26, 2013.

  • Revisiting k-means: New algorithms via Bayesian nonparametrics. B. Kulis and M. I. Jordan. In J. Langford and J. Pineau (Eds.), Proceedings of the 29th International Conference on Machine Learning (ICML), Edinburgh, UK, 2012.

  • Beta processes, stick-breaking, and power laws. T. Broderick, M. I. Jordan and J. Pitman. Bayesian Analysis, 7, 439-476, 2012.

  • Stick-breaking beta processes and the Poisson process. J. Paisley, D. Blei, and M. I. Jordan. In N. Lawrence and M. Girolami (Eds.), Proceedings of the Fifteenth Conference on Artificial Intelligence and Statistics (AISTATS), Canary Islands, Spain, 2012.

  • Bayesian bias mitigation for crowdsourcing. F. L. Wauthier and M. I. Jordan. In P. Bartlett, F. Pereira, J. Shawe-Taylor and R. Zemel (Eds.), Advances in Neural Information Processing Systems (NIPS) 25, 2012.

  • Nonparametric combinatorial sequence models. F. Wauthier, M. I. Jordan, and N. Jojic. 15th Annual International Conference on Research in Computational Molecular Biology (RECOMB), Vancouver, BC, 2011.

  • A sticky HDP-HMM with application to speaker diarization. E. Fox, E. Sudderth, M. I. Jordan, and A. Willsky. Annals of Applied Statistics, 5, 1020-1056, 2011.

  • Nonparametric Bayesian co-clustering ensembles. P. Wang, K. B. Laskey, C. Domeniconi, and M. I. Jordan. SIAM International Conference on Data Mining (SDM), Phoenix, AZ, 2011.

  • Tree-structured stick breaking for hierarchical data. R. Adams, Z. Ghahramani, and M. I. Jordan. In J. Shawe-Taylor, R. Zemel, J. Lafferty, and C. Williams (Eds.), Advances in Neural Information Processing Systems (NIPS) 24, 2011.

  • Heavy-tailed processes for selective shrinkage. F. Wauthier and M. I. Jordan. In J. Shawe-Taylor, R. Zemel, J. Lafferty, and C. Williams (Eds.), Advances in Neural Information Processing Systems (NIPS) 24, 2011.

  • Learning low-dimensional signal models. L. Carin, R. G. Baraniuk, V. Cevher, D. Dunson, M. I. Jordan, G. Sapiro, and M. B. Wakin. IEEE Signal Processing Magazine, 28, 39-51, 2011.

  • Bayesian nonparametric inference of switching linear dynamical models. E. Fox, E. Sudderth, M. I. Jordan, and A. Willsky. IEEE Transactions on Signal Processing, 59, 1569-1585, 2011.

  • Hierarchical models, nested models and completely random measures. M. I. Jordan. In M.-H. Chen, D. Dey, P. Mueller, D. Sun, and K. Ye (Eds.), Frontiers of Statistical Decision Making and Bayesian Analysis: In Honor of James O. Berger, New York: Springer, 2010.

  • Bayesian nonparametric learning: Expressive priors for intelligent systems. M. I. Jordan. In R. Dechter, H. Geffner, and J. Halpern (Eds.), Heuristics, Probability and Causality: A Tribute to Judea Pearl, College Publications, 2010.

  • Bayesian nonparametric methods for learning Markov switching processes. E. Fox, E. Sudderth, M. I. Jordan, and A. Willsky. IEEE Signal Processing Magazine, 27, 43-54, 2010.

  • Hierarchical Bayesian nonparametric models with applications. Y. W. Teh and M. I. Jordan. In N. Hjort, C. Holmes, P. Mueller, and S. Walker (Eds.), Bayesian Nonparametrics: Principles and Practice, Cambridge, UK: Cambridge University Press, 2010.

  • Probabilistic grammars and hierarchical Dirichlet processes. P. Liang, M. I. Jordan, and D. Klein. In A. O'Hagan and M. West (Eds.), The Handbook of Applied Bayesian Analysis, Oxford University Press, 2010.

  • Sharing features among dynamical systems with beta processes. E. Fox, E. Sudderth, M. I. Jordan, and A. S. Willsky. In Y. Bengio, D. Schuurmans, J. Lafferty and C. Williams (Eds.), Advances in Neural Information Processing Systems (NIPS) 23, 2010.

  • Nonparametric latent feature models for link prediction. K. Miller, T. Griffiths, and M. I. Jordan. In Y. Bengio, D. Schuurmans, J. Lafferty and C. Williams (Eds.), Advances in Neural Information Processing Systems (NIPS) 23, 2010.

  • The nested Chinese restaurant process and Bayesian inference of topic hierarchies. D. M. Blei, T. Griffiths, and M. I. Jordan. Journal of the ACM, 57, 1-30, 2010. [Software].

  • Shared segmentation of natural scenes using dependent Pitman-Yor processes. E. Sudderth and M. I. Jordan. In D. Koller, Y. Bengio, D. Schuurmans and L. Bottou (Eds.), Advances in Neural Information Processing Systems (NIPS) 22, 2009.

  • Nonparametric Bayesian identification of jump systems with sparse dependencies. E. Fox, E. Sudderth, M. I. Jordan, and A. Willsky. 15th IFAC Symposium on System Identification (SYSID), St. Malo, France, 2009.

  • Posterior consistency of the Silverman g-prior in Bayesian model choice. Z. Zhang and M. I. Jordan. In D. Koller, Y. Bengio, D. Schuurmans and L. Bottou (Eds.), Advances in Neural Information Processing Systems (NIPS) 22, 2009.

  • Nonparametric Bayesian learning of switching linear dynamical systems. E. B. Fox, E. Sudderth, M. I. Jordan, and A. S. Willsky. In D. Koller, Y. Bengio, D. Schuurmans and L. Bottou (Eds.), Advances in Neural Information Processing Systems (NIPS) 22, 2009.

  • The phylogenetic Indian buffet process: A non-exchangeable nonparametric prior for latent features. K. Miller, T. Griffiths and M. I. Jordan. In Uncertainty in Artificial Intelligence (UAI), Proceedings of the Twenty-Fourth Conference, 2008.

  • An HDP-HMM for systems with state persistence. E. Fox, E. Sudderth, M. I. Jordan, and A. Willsky. Proceedings of the 25th International Conference on Machine Learning (ICML), 2008. [Long version].

  • The infinite PCFG using hierarchical Dirichlet processes. P. Liang, S. Petrov, M. I. Jordan, and D. Klein. Empirical Methods in Natural Language Processing (EMNLP), 2007.

  • A permutation-augmented sampler for DP mixture models. P. Liang, M. I. Jordan, and B. Taskar. Proceedings of the 24th International Conference on Machine Learning (ICML), 2007.

  • Hierarchical beta processes and the Indian buffet process. R. Thibaux and M. I. Jordan. Proceedings of the Conference on Artificial Intelligence and Statistics (AISTATS), 2007.

  • Learning multiscale representations of natural scenes using Dirichlet processes. J. J. Kivinen, E. B. Sudderth, and M. I. Jordan. IEEE International Conference on Computer Vision (ICCV), 2007.

  • Bayesian haplotype inference via the Dirichlet process. E. P. Xing, M. I. Jordan and R. Sharan. Journal of Computational Biology, 14, 267-284, 2007.

  • Image denoising with nonparametric hidden Markov trees. J. J. Kivinen, E. B. Sudderth, and M. I. Jordan. IEEE International Conference on Image Processing (ICIP), 2007.

  • Hierarchical Dirichlet processes. Y. W. Teh, M. I. Jordan, M. J. Beal and D. M. Blei. Journal of the American Statistical Association, 101, 1566-1581, 2006. [Software].

  • Bayesian multi-population haplotype inference via a hierarchical Dirichlet process mixture. E. P. Xing, K.-A. Sohn, M. I. Jordan, and Y. W. Teh. Proceedings of the 23rd International Conference on Machine Learning (ICML), 2006.

  • Bayesian multicategory support vector machines. Z. Zhang and M. I. Jordan. In Uncertainty in Artificial Intelligence (UAI), Proceedings of the Twenty-Second Conference, 2006.

  • Nonparametric empirical Bayes for the Dirichlet process mixture model. J. D. McAuliffe, D. M. Blei and M. I. Jordan. Statistics and Computing, 16, 5-14, 2006.

  • Variational inference for Dirichlet process mixtures. D. M. Blei and M. I. Jordan. Bayesian Analysis, 1, 121-144, 2005.

  • Dirichlet processes, Chinese restaurant processes and all that. M. I. Jordan. Tutorial presentation at the NIPS Conference, 2005.

  • Gaussian processes and the null-category noise model. N. D. Lawrence and M. I. Jordan. In O. Chapelle, B. Schoelkopf & A. Zien (Eds), Semi-Supervised Learning, Cambridge, MA: MIT Press, 2005.

  • Semi-supervised learning via Gaussian processes. N. D. Lawrence and M. I. Jordan. In L. Saul, Y. Weiss, and L. Bottou (Eds.), Advances in Neural Information Processing Systems (NIPS) 18, 2005.

  • Sharing clusters among related groups: Hierarchical Dirichlet processes. Y. W. Teh, M. I. Jordan, M. J. Beal and D. M. Blei. In L. Saul, Y. Weiss, and L. Bottou (Eds.), Advances in Neural Information Processing Systems (NIPS) 18, 2005. [Long version]. [Software].

  • Semiparametric latent factor models. Y. W. Teh, M. Seeger, and M. I. Jordan. Proceedings of the Conference on Artificial Intelligence and Statistics (AISTATS), 2004.

  • Hierarchical topic models and the nested Chinese restaurant process. D. M. Blei, T. Griffiths, M. I. Jordan, and J. Tenenbaum. In S. Thrun, L. Saul, and B. Schoelkopf (Eds.), Advances in Neural Information Processing Systems (NIPS) 17, 2004.

  • Variational methods for the Dirichlet process. D. M. Blei and M. I. Jordan. Proceedings of the 21st International Conference on Machine Learning (ICML), 2004. [Long version].

  • Bayesian haplotype inference via the Dirichlet process. E. P. Xing, R. Sharan, and M. I. Jordan. Proceedings of the 21st International Conference on Machine Learning (ICML), 2004.