Publications
2024
-
Functional protein mining with conformal guarantees.
R. S. Boger, S. Chithrananda, A. N. Angelopoulos, P. H. Yoon,
M. I. Jordan, and J. A. Doudna.
Nature Communications, 16, 85, 2025.
-
Relying on the metrics of evaluated agents.
S. Wang, M. I. Jordan, K. Ligett, and R. P. McAfee.
arXiv:2402.14005, 2024.
-
Desiderata for representation learning: A causal perspective.
Y. Wang and M. I. Jordan.
Journal of Machine Learning Research, 25, 1-65, 2024.
-
A continuous-time perspective on optimal methods for monotone equation problems.
T. Lin and M. I. Jordan.
Communications in Optimization Theory, 40, 1-25, 2024.
-
Enhancing feature-specific data protection via Bayesian coordinate differential privacy.
M. Aliakbarpour, S. Chaudhuri, T. Courtade, A. Fallah, and M. I. Jordan.
arXiv:2410.18404, 2024.
-
An optimistic algorithm for online convex optimization with adversarial constraints.
J. Lekeufack and M. I. Jordan.
arXiv:2412.08060, 2024.
-
Active-dormant attention heads: Mechanistically demystifying extreme-token phenomena in LLMs.
T. Guo, D. Pai, Y. Bai, J. Jiao, M. I. Jordan, and S. Mei.
arXiv:2410.13835, 2024.
-
Learning variational inequalities from data: Fast generalization rates under strong monotonicity.
E. Zhao, T. Chavdarova, and M. I. Jordan.
arXiv:2410.20649, 2024.
-
Unravelling in collaborative learning.
A. Capitaine, E. Boursier, A. Scheid, E. Moulines,
M. I. Jordan, E.-M. El Mhamdi, and A. Durmus.
In A. Fan, C. Zhang, D. Belgrave, J. Tomczak, and U. Paquet (Eds),
Advances in Neural Information Processing Systems (NeurIPS) 37, 2024.
-
Fairness-aware meta-learning via Nash bargaining.
Y. Zeng, X. Yang, L. Chen, C. C. Ferrer, M. Jin, M. I. Jordan, and R. Jia.
In A. Fan, C. Zhang, D. Belgrave, J. Tomczak, and U. Paquet (Eds),
Advances in Neural Information Processing Systems (NeurIPS) 37, 2024.
-
Fair allocation in dynamic mechanism design.
A. Fallah, M. I. Jordan, and A. Ulichney.
In A. Fan, C. Zhang, D. Belgrave, J. Tomczak, and U. Paquet (Eds),
Advances in Neural Information Processing Systems (NeurIPS) 37, 2024.
-
DAVED: Data acquisition via experimental design for decentralized data markets.
C. Lu, B. Huang, S. P. Karimireddy, P. Vepakomma, M. I. Jordan, and R. Raskar.
In A. Fan, C. Zhang, D. Belgrave, J. Tomczak, and U. Paquet (Eds),
Advances in Neural Information Processing Systems (NeurIPS) 37, 2024.
-
Dimension-free private mean estimation for anisotropic distributions.
Y. Dagan, M. I. Jordan, X. Yang, L. Zakynthinou, and N. Zhivotovskiy.
In A. Fan, C. Zhang, D. Belgrave, J. Tomczak, and U. Paquet (Eds),
Advances in Neural Information Processing Systems (NeurIPS) 37, 2024.
-
Learning to mitigate externalities: The Coase theorem with hindsight rationality.
A. Scheid, A. Capitaine, E. Boursier, E. Moulines, M. I. Jordan, and A. Durmus.
In A. Fan, C. Zhang, D. Belgrave, J. Tomczak, and U. Paquet (Eds),
Advances in Neural Information Processing Systems (NeurIPS) 37, 2024.
-
Towards a theoretical understanding of the 'reversal curse' via training dynamics.
H. Zhu, B. Huang, S. Zhang, M. I. Jordan, J. Jiao, Y. Tian, and S. Russell.
In A. Fan, C. Zhang, D. Belgrave, J. Tomczak, and U. Paquet (Eds),
Advances in Neural Information Processing Systems (NeurIPS) 37, 2024.
-
Safety vs. performance: How multi-objective learning reduces barriers to market entry.
M. Jagadeesan, M. I. Jordan, and J. Steinhardt.
arXiv:2409.03734, 2024.
-
Valid inference after causal discovery.
P. Gradu, T. Zrnic, Y. Wang, and M. I. Jordan.
Journal of the American Statistical Association, 1-12,
https://doi.org/10.1080/01621459.2024.2402089, 2024.
-
Amid advancement, apprehension, and ambivalence: AI in the human ecosystem.
F. Berman, D. Banks, M. I. Jordan, S. Leonelli, and M. Minow.
Harvard Data Science Review, https://doi.org/10.1162/99608f92.2be2c754, 2024.
-
Automatically adaptive conformal risk control.
V. Blot, A. Angelopoulos, M. I. Jordan, and N. Brunel.
arXiv:2406.17819, 2024.
-
Incentive-aware recommender systems in two-sided markets.
X. Dai, W. Xu, Y. Qi, and M. I. Jordan.
ACM Transactions on Recommender Systems, to appear.
-
Privacy can arise endogenously in an economic system with learning agents.
T. Ding, N. Ananthakrishnan, M. Werner, P. Karimireddy, and M. I. Jordan.
Symposium on Foundations of Responsible Computing (FORC), 2024.
-
Contract design with safety inspections.
A. Fallah and M. I. Jordan.
ACM Conference on Economics and Computation (EC), New Haven, CT, 2024.
-
Chatbot Arena: An open platform for evaluating LLMs by human preference.
W.-L. Chiang, L. Zheng, Y. Sheng, A. N. Angelopoulos, T. Li, D. Li, H. Zhang, B. Zhu, M. I. Jordan, J. Gonzalez, and I. Stoica.
In A. Weller, K. Heller, N. Oliver, and Z. Kolter (Eds.),
International Conference on Machine Learning (ICML), 2024.
-
Incentivized learning in principal-agent bandit games.
A. Scheid, D. Tiapkin, E. Boursier, A. Capitaine, E.-M. El Mhamdi, E. Moulines,
M. I. Jordan, and A. Durmus.
In A. Weller, K. Heller, N. Oliver, and Z. Kolter (Eds.),
International Conference on Machine Learning (ICML), 2024.
-
Iterative data smoothing: Mitigating reward overfitting and overoptimization in RLHF.
B. Zhu, M. I. Jordan, and J. Jiao.
In A. Weller, K. Heller, N. Oliver, and Z. Kolter (Eds.),
International Conference on Machine Learning (ICML), 2024.
-
Collaborative heterogeneous causal inference beyond meta-analysis.
T. Guo, P. Karimireddy, and M. I. Jordan.
In A. Weller, K. Heller, N. Oliver, and Z. Kolter (Eds.),
International Conference on Machine Learning (ICML), 2024.
-
AutoEval done right: Using synthetic data for model evaluation.
P. Boyeau, A. N. Angelopoulos, N. Yosef, J. Malik, and M. I. Jordan.
arXiv:2403.07008, 2024.
-
On three-layer data markets.
A. Fallah, M. I. Jordan, A. Makhdoumi, and A. Malekian.
arXiv:2402.09697, 2024.
-
The limits of price discrimination under privacy constraints.
A. Fallah, M. I. Jordan, A. Makhdoumi, and A. Malekian.
arXiv:2402.08223, 2024.
-
Conformal triage for medical imaging AI deployment.
A. Angelopoulos, S. Pomerantz, S. Do, S. Bates, C. Bridge, D. Elton, M. Lev, R. G. Gonzalez,
M. I. Jordan, and J. Malik.
medrxiv.org/content/10.1101/2024.02.09.24302543v1, 2024.
-
Perseus: A simple high-order regularization method for variational inequalities.
T. Lin and M. I. Jordan.
Mathematical Programming, https://doi.org/10.1007/s10107-024-02075-2, 2024.
-
On learning rates and Schrödinger operators.
B. Shi, W. Su, and M. I. Jordan.
Journal of Machine Learning Research, 24, 1-53, 2024.
-
Instance-dependent confidence and early stopping for reinforcement learning.
K. Khamaru, E. Xia, M. Wainwright, and M. I. Jordan.
Journal of Machine Learning Research, 24, 1-43, 2024.
-
Conformal decision theory: Safe autonomous decisions from imperfect predictions.
J. Lekeufack, A. Angelopoulos, A. Bajcsy, M. I. Jordan, and J. Malik.
IEEE International Conference on Robotics and Automation (ICRA), Yokohama, Japan, 2024.
-
Towards optimal statistical watermarking.
B. Huang, H. Zhu, B. Zhu, K. Ramchandran, M. I. Jordan, J. Lee, and J. Jiao.
arXiv:2312.07930, 2024.
-
Operationalizing counterfactual metrics: Incentives, ranking, and information asymmetry.
S. Wang, S. Bates, P. M. Aronow, and M. I. Jordan.
Proceedings of the Twenty-Seventh Conference on Artificial Intelligence and
Statistics (AISTATS), 2024.
-
Classifier calibration with ROC-regularized isotonic regression.
E. Berta, F. Bach, and M. I. Jordan.
Proceedings of the Twenty-Seventh Conference on Artificial Intelligence and
Statistics (AISTATS), 2024.
-
Delegating data collection in decentralized machine learning.
N. Ananthakrishnan, S. Bates, M. I. Jordan, and N. Haghtalab.
Proceedings of the Twenty-Seventh Conference on Artificial Intelligence and
Statistics (AISTATS), 2024.
-
A specialized semismooth Newton method for kernel-based optimal transport.
T. Lin, M. Cuturi, and M. I. Jordan.
Proceedings of the Twenty-Seventh Conference on Artificial Intelligence and
Statistics (AISTATS), 2024.
-
A primal-dual method for solving variational inequalities with general constraints.
T. Chavdarova, M. Pagliardini, T. Yang, and M. I. Jordan.
International Conference on Learning Representations (ICLR), 2024.
-
A diffusion process perspective on posterior contraction rates for parameters.
W. Mou, N. Ho, M. Wainwright, P. Bartlett, and M. I. Jordan.
SIAM Journal on Mathematics of Data Science, to appear.
-
Adaptive, doubly optimal no-regret learning in games with gradient feedback.
M. I. Jordan, T. Lin, and Z. Zhou.
Operations Research, to appear.
-
Reinforcement learning with heterogeneous data: Estimation and inference.
E. Chen, R. Song, and M. I. Jordan.
Journal of the American Statistical Association, to appear.
2023
-
Prediction-powered inference.
A. Angelopoulos, S. Bates, C. Fannjiang, M. I. Jordan, and T. Zrnic.
Science, 382, 669-674, 2023.
[arXiv version]
-
Post-selection inference via algorithmic stability.
T. Zrnic and M. I. Jordan.
Annals of Statistics, 51, 1666-1691, 2023.
-
A quadratic speedup in finding Nash equilibria of quantum zero-sum games.
F. Vasconcelos, E.-V. Vlatakis-Gkaragkounis, P. Mertikopoulos, G. Piliouras, and M. I. Jordan.
Conference on Quantum Techniques in Machine Learning, 2023.
-
A unifying perspective on multi-calibration: Game dynamics for multi-objective learning.
N. Haghtalab, M. I. Jordan, and E. Zhao.
In A. Globerson, K. Saenko, M. Hardt, and S. Levine (Eds),
Advances in Neural Information Processing Systems (NeurIPS) 36, 2023.
-
Improved Bayes risk can yield reduced social welfare under competition.
M. Jagadeesan, M. I. Jordan, J. Steinhardt, and N. Haghtalab.
In A. Globerson, K. Saenko, M. Hardt, and S. Levine (Eds),
Advances in Neural Information Processing Systems (NeurIPS) 36, 2023.
-
Class-conditional conformal prediction with many classes.
T. Ding, A. Angelopoulos, S. Bates, M. I. Jordan, and R. Tibshirani.
In A. Globerson, K. Saenko, M. Hardt, and S. Levine (Eds),
Advances in Neural Information Processing Systems (NeurIPS) 36, 2023.
-
Optimal extragradient-based algorithms for stochastic variational inequalities
with separable structure.
A. Yuan, J. Li, G. Gidel, M. I. Jordan, Q. Gu, and S. Du.
In A. Globerson, K. Saenko, M. Hardt, and S. Levine (Eds),
Advances in Neural Information Processing Systems (NeurIPS) 36, 2023.
-
On learning necessary and sufficient causal graphs.
H. Cai, Y. Wang, M. I. Jordan, and R. Song.
In A. Globerson, K. Saenko, M. Hardt, and S. Levine (Eds),
Advances in Neural Information Processing Systems (NeurIPS) 36, 2023.
-
Doubly robust self-training.
B. Zhu, M. Ding, P. Jacobson, M. Wu, W. Zhan, M. I. Jordan, and J. Jiao.
In A. Globerson, K. Saenko, M. Hardt, and S. Levine (Eds),
Advances in Neural Information Processing Systems (NeurIPS) 36, 2023.
-
On optimal caching and model multiplexing for large model inference.
B. Zhu, Y. Sheng, L. Zheng, C. Barrett, M. I. Jordan, and J. Jiao.
In A. Globerson, K. Saenko, M. Hardt, and S. Levine (Eds),
Advances in Neural Information Processing Systems (NeurIPS) 36, 2023.
-
Skilful nowcasting of extreme precipitation with NowcastNet.
Y. Zhang, M. Long, K. Chen, L. Xing, R. Jin, M. I. Jordan, and J. Wang.
Nature, 619(7970), 526-532, 2023.
-
A gentle introduction to gradient-based optimization and variational inequalities for
machine learning.
N. Wadia, Y. Dandi, and M. I. Jordan.
arXiv:2309.04877, 2023.
-
Incentive-theoretic Bayesian inference for collaborative science.
S. Bates, M. I. Jordan, M. Sklar, and J. A. Soloff.
arXiv:2307.03748, 2023.
-
Scaff-PD: Communication-efficient fair and robust federated learning.
Y. Yu, S. P. Karimireddy, Y. Ma, and M. I. Jordan.
arXiv:2307.13381, 2023.
-
Curvature-independent last-iterate convergence for games on Riemannian manifold.
Y. Cai, M. I. Jordan, T. Lin, A. Oikonomou, and E.-V. Vlatakis-Gkaragkounis.
arXiv:2306.16617, 2023.
-
Accelerating inexact hypergradient descent for bilevel optimization.
H. Yang, L. Luo, J. Li, and M. I. Jordan.
arXiv:2307.00126, 2023.
-
Provably personalized and robust federated learning.
M. Werner, L. He, S. P. Karimireddy, M. I. Jordan, and M. Jaggi.
Transactions on Machine Learning Research, 2023.
-
Evaluating and incentivizing diverse data contributions in collaborative learning.
B. Huang, S. P. Karimireddy, and M. I. Jordan.
arXiv:2306.05592, 2023.
-
Incentivizing high-quality content in online recommender systems.
X. Hu, M. Jagadeesan, M. I. Jordan, and J. Steinhardt.
arXiv:2306.07479, 2023.
-
Fine-tuning language models with advantage-induced policy alignment.
B. Zhu, H. Sharma, F. Vieira Frujeri, S. Ding, C. Zhu, M. I. Jordan, and J. Jiao.
arXiv:2306.02231, 2023.
-
Deterministic nonsmooth nonconvex optimization.
M. I. Jordan, G. Kornowski, T. Lin, O. Shamir, and E. Zampetakis.
In G. Neu and L. Rosasco (Eds.),
Proceedings of the Thirty-Sixth Conference on Learning Theory (COLT),
Bengaluru, India, 2023.
-
Online learning in a creator economy.
B. Zhu, S. P. Karimireddy, J. Jiao, and M. I. Jordan.
arXiv:2305.11381, 2023.
-
The sample complexity of online contract design.
B. Zhu, S. Bates, Z. Yang, Y. Wang, J. Jiao, and M. I. Jordan.
In J. Hartline and L. Samuelson (Eds.),
ACM Conference on Economics and Computation (EC), London, UK, 2023.
-
Last-iterate convergence of saddle point optimizers via high-resolution differential equations.
T. Chavdarova, M. I. Jordan, and E. Zampetakis.
Minimax Theory and its Applications, 8, 333-380, 2023.
-
Bayesian robustness: A nonasymptotic viewpoint.
K. Bhatia, Y.-A. Ma, A. Dragan, P. Bartlett, and M. I. Jordan.
Journal of the American Statistical Association,
doi.org/10.1080/01621459.2023.2174121, 2023.
-
Recommendation systems with distribution-free reliability guarantees.
A. Angelopoulos, K. Krauth, S. Bates, Y. Wang, and M. I. Jordan.
In H. Papadopoulos and K. An (Eds.),
12th Symposium on Conformal and Probabilistic Prediction with Applications (COPA),
Limassol, Cyprus, 2023. [Alexey Chervonenkis Best Paper Award].
-
Federated conformal predictors for distributed uncertainty quantification.
C. Lu, Y. Yu, S. P. Karimireddy, M. I. Jordan, and R. Raskar.
In B. Engelhardt, E. Brunskill, and K. Cho (Eds.),
International Conference on Machine Learning (ICML), 2023.
-
Principled reinforcement learning with human feedback from pairwise or K-wise comparisons.
B. Zhu, J. Jiao, and M. I. Jordan.
In B. Engelhardt, E. Brunskill, and K. Cho (Eds.),
International Conference on Machine Learning (ICML), 2023.
-
Nesterov meets optimism: Rate-optimal optimistic-gradient-based method for
stochastic bilinearly-coupled minimax optimization.
J. Li, A. Yuan, G. Gidel, and M. I. Jordan.
In B. Engelhardt, E. Brunskill, and K. Cho (Eds.),
International Conference on Machine Learning (ICML), 2023.
-
Nonconvex stochastic scaled-gradient descent and generalized eigenvector problems.
J. Li and M. I. Jordan.
In R. Evans and I. Shpitser (Eds.), Uncertainty in Artificial Intelligence (UAI),
Proceedings of the Thirty-Ninth Conference, Pittsburgh, PA, 2023.
-
Cilantro: A framework for performance-aware resource allocation for general
objectives via online feedback.
R. Bhardwaj, K. Kandasamy, A. Biswal, W. Guo, B. Hindman, J. Gonzalez, M. I. Jordan, and I. Stoica.
17th USENIX Symposium on Operating Systems Design and Implementation (OSDI),
Boston, MA, 2023.
-
MultiVI: deep generative model for the integration of multi-modal data.
T. Ashuach, M. Gabitto, R. Koodli, M. I. Jordan, G.-A. Saldi, and N. Yosef.
Nature Methods, 20, 1222–1231, 2023.
-
Accelerated first-order optimization under nonlinear constraints.
M. Muehlebach and M. I. Jordan.
arXiv:2302.00316, 2023.
-
An empirical Bayes method for differential expression analysis of
single cells with deep generative models.
P. Boyeau, J. Regier, A. Gayoso, M. I. Jordan, R. Lopez, and N. Yosef.
Proceedings of the National Academy of Sciences, https://doi.org/10.1073/pnas.2209124120, 2023.
-
VCG mechanism design with unknown agent values under stochastic bandit feedback.
K. Kandasamy, J. Gonzalez, M. I. Jordan, and I. Stoica.
Journal of Machine Learning Research, 24, 1-45, 2023.
-
A Bayesian perspective on convolutional neural networks through a
deconvolutional generative model.
T. Nguyen, N. Ho, A. Patel, A. Anandkumar, M. I. Jordan, and R. G. Baraniuk.
Journal of Machine Learning Research, to appear.
-
Online learning in Stackelberg games with an omniscient follower.
G. Zhao, B. Zhu, J. Jiao, and M. I. Jordan.
arXiv:2301.11518, 2023.
-
Neural dependencies emerging from learning massive categories.
R. Feng, K. Zheng, K. Zhu, Y. Shen, J. Zhao, Y. Huang, D. Zhao, J. Zhou, M. I. Jordan, and Z.-J. Zha.
arXiv:2301.12339, 2023.
-
On the complexity of deterministic nonsmooth and nonconvex optimization.
M. I. Jordan, T. Lin, and E. Zampetakis.
arXiv:2301.12463, 2023.
-
Competition, alignment, and equilibria in digital marketplaces.
M. Jagadeesan, M. I. Jordan, and N. Haghtalab.
Thirty-Seventh AAAI Conference on Artificial Intelligence (AAAI-23), 2023.
-
First-order algorithms for nonlinear generalized Nash equilibrium problems.
M. I. Jordan, T. Lin, and E. Zampetakis.
Journal of Machine Learning Research, 24, 1-46, 2023.
-
Learning equilibria in matching markets with bandit feedback.
M. Jagadeesan, A. Wei, Y. Wang, M. I. Jordan, and J. Steinhardt.
Journal of the ACM, https://doi.org/10.1145/3583681, 2023.
-
Solving constrained variational inequalities via a first-order interior point-based method.
T. Yang, M. I. Jordan, and T. Chavdarova.
International Conference on Learning Representations (ICLR), 2023.
-
A general framework for sample-efficient function approximation in reinforcement learning.
Z. Chen, J. Li, A. Yuan, Q. Gu, and M. I. Jordan.
International Conference on Learning Representations (ICLR), 2023.
-
Modeling content creator incentives on algorithm-curated platforms.
J. Hron, K. Krauth, N. Kilbertus, M. I. Jordan, and S. Dean.
International Conference on Learning Representations (ICLR), 2023.
-
Byzantine-robust federated learning with optimal statistical rates and privacy guarantees.
B. Zhu, L. Wang, Q. Pang, J. Jiao, D. Song, and M. I. Jordan.
Proceedings of the Twenty-Sixth Conference on Artificial Intelligence and
Statistics (AISTATS), 2023.
-
A statistical analysis of Polyak-Ruppert-averaged Q-learning.
X. Li, W. Yang, X. Liang, Z. Zhang, and M. I. Jordan.
Proceedings of the Twenty-Sixth Conference on Artificial Intelligence and
Statistics (AISTATS), 2023.
-
Finding regularized competitive equilibria of heterogeneous agent macroeconomic
models via reinforcement learning.
R. Xu, Y. Min, T. Wang, M. I. Jordan, Z. Wang, and Z. Yang.
Proceedings of the Twenty-Sixth Conference on Artificial Intelligence and
Statistics (AISTATS), 2023.
-
An instance-dependent analysis for the cooperative multi-player multi-armed bandit.
A. Pacchiano, P. Bartlett, and M. I. Jordan.
Algorithmic Learning Theory (ALT), 2023.
-
Evaluating sensitivity to the stick-breaking prior in Bayesian nonparametrics.
R. Giordano, R. Liu, M. I. Jordan, and T. Broderick.
Bayesian Analysis, 18, 287-366, 2023.
-
Can reinforcement learning find Stackelberg-Nash equilibria in general-sum
Markov games with myopic followers?
H. Zhong, Z. Yang, Z. Wang, and M. I. Jordan.
Journal of Machine Learning Research, 24, 1-52, 2023.
-
Monotone inclusions, acceleration and closed-loop control.
T. Lin and M. I. Jordan.
Mathematics of Operations Research, https://doi.org/10.1287/moor.2022.1343, 2023.
-
Provably efficient reinforcement learning with linear function approximation.
C. Jin, Z. Yang, Z. Wang, and M. I. Jordan.
Mathematics of Operations Research, https://doi.org/10.1287/moor.2022.1309, 2023.
-
Local exchangeability.
T. Campbell, S. Syed, C.-Y. Yang, M. I. Jordan, and T. Broderick.
Bernoulli, 29, 2084-2100, 2023.
2022
-
Empirical Gateaux derivatives for causal inference.
M. I. Jordan, Y. Wang, and A. Zhou.
In A. Agarwal, A. Oh, D. Belgrave, and K. Cho (Eds),
Advances in Neural Information Processing Systems (NeurIPS) 35, 2022.
-
On-demand sampling: Learning optimally from multiple distributions.
N. Haghtalab, M. I. Jordan, and E. Zhao.
In A. Agarwal, A. Oh, D. Belgrave, and K. Cho (Eds),
Advances in Neural Information Processing Systems (NeurIPS) 35, 2022.
[Outstanding Paper Award].
-
Learning two-player Markov games: Neural function approximation and correlated equilibrium.
J. Li, D. Zhou, Q. Gu, and M. I. Jordan.
In A. Agarwal, A. Oh, D. Belgrave, and K. Cho (Eds),
Advances in Neural Information Processing Systems (NeurIPS) 35, 2022.
-
Off-policy evaluation with policy-dependent optimization response.
W. Guo, M. I. Jordan, and A. Zhou.
In A. Agarwal, A. Oh, D. Belgrave, and K. Cho (Eds),
Advances in Neural Information Processing Systems (NeurIPS) 35, 2022.
-
Learn to match with no regret: Reinforcement learning in Markov matching markets.
Y. Min, T. Wang, R. Xu, Z. Wang, M. I. Jordan, and Y. Wang.
In A. Agarwal, A. Oh, D. Belgrave, and K. Cho (Eds),
Advances in Neural Information Processing Systems (NeurIPS) 35, 2022.
-
First-order algorithms for min-max optimization in geodesic metric spaces.
M. I. Jordan, T. Lin, and E.-V. Vlatakis-Gkaragkounis.
In A. Agarwal, A. Oh, D. Belgrave, and K. Cho (Eds),
Advances in Neural Information Processing Systems (NeurIPS) 35, 2022.
-
Gradient-free methods for deterministic and stochastic nonsmooth nonconvex optimization.
T. Lin, Z. Zheng, and M. I. Jordan.
In A. Agarwal, A. Oh, D. Belgrave, and K. Cho (Eds),
Advances in Neural Information Processing Systems (NeurIPS) 35, 2022.
-
TCT: Convexifying federated learning using bootstrapped neural tangent kernels.
Y. Yu, A. Wei, S. P. Karimireddy, Y. Ma, and M. I. Jordan.
In A. Agarwal, A. Oh, D. Belgrave, and K. Cho (Eds),
Advances in Neural Information Processing Systems (NeurIPS) 35, 2022.
-
Robust calibration with multi-domain temperature scaling.
Y. Yu, S. Bates, Y. Ma, and M. I. Jordan.
In A. Agarwal, A. Oh, D. Belgrave, and K. Cho (Eds),
Advances in Neural Information Processing Systems (NeurIPS) 35, 2022.
-
Rank diminishing in deep neural networks.
R. Feng, K. Zheng, Y. Huang, D. Zhao, M. I. Jordan, and Z.-J. Zha.
In A. Agarwal, A. Oh, D. Belgrave, and K. Cho (Eds),
Advances in Neural Information Processing Systems (NeurIPS) 35, 2022.
-
Deep generative modeling for quantifying sample-level heterogeneity in single-cell omics.
P. Boyeau, J. Hong, A. Gayoso, M. I. Jordan, E. Azizi, and N. Yosef.
bioRxiv, 2022.
-
Conformal prediction under feedback covariate shift for biomolecular design.
C. Fannjiang, S. Bates, A. Angelopoulos, J. Listgarten, and M. I. Jordan.
Proceedings of the National Academy of Sciences,
https://doi.org/10.1073/pnas.2204569119, 2022.
-
Explicit second-order min-max optimization methods with optimal convergence guarantee.
T. Lin, P. Mertikopoulos, and M. I. Jordan.
arXiv:2210.12860, 2022.
-
Multi-source causal inference using control variates.
W. Guo, S. Wang, P. Ding, Y. Wang, and M. I. Jordan.
Transactions on Machine Learning Research, https://openreview.net/forum?id=CrimIjBa64, 2022.
-
On constraints in first-order optimization: A view from non-smooth dynamical systems.
M. Muehlebach and M. I. Jordan.
Journal of Machine Learning Research, 23, 1-47, 2022.
-
Instability, computational efficiency and statistical accuracy.
N. Ho, K. Khamaru, R. Dwivedi, M. Wainwright, M. I. Jordan, and B. Yu.
Journal of Machine Learning Research, 23, 1-81, 2022.
-
A nonasymptotic analysis of gradient descent ascent for nonconvex-concave minimax problems.
T. Lin, C. Jin, and M. I. Jordan.
https://ssrn.com/abstract=4181867, 2022.
-
Learning two-player mixture Markov games: Kernel function approximation
and correlated equilibrium.
J. Li, D. Zhou, Q. Gu, and M. I. Jordan.
arXiv:2208.05363, 2022.
-
A reinforcement learning approach in multi-phase second-price auction design.
R. Ai, B. Lyu, Y. Wang, Z. Yang, and M. I. Jordan.
arXiv:2210.10278, 2022.
-
Principal-agent hypothesis testing.
S. Bates, M. I. Jordan, M. Sklar, and J. A. Soloff.
arXiv:2205.06812, 2022.
-
Optimal extragradient-based bilinearly-coupled saddle-point optimization.
S. Du, G. Gidel, M. I. Jordan, and J. Li.
arXiv:2206.08573, 2022.
-
Mechanisms that incentivize data sharing in federated learning.
S. P. Karimireddy, W. Guo, and M. I. Jordan.
arXiv:2207.04557, 2022.
[Outstanding Paper Award].
-
Continuous-time analysis for variational inequalities: An overview and desiderata.
T. Chavdarova, Y.-P. Hsieh, and M. I. Jordan.
arXiv:2207.07105, 2022.
-
Breaking feedback loops in recommender systems with causal inference.
K. Krauth, Y. Wang, and M. I. Jordan.
arXiv:2207.01616, 2022.
-
NumS: Scalable array programming for the cloud.
M. Elibol, V. Benara, S. Yagati, L. Zheng, A. Cheung, M. I. Jordan, and I. Stoica.
arXiv:2206.14276, 2022.
-
The sky above the clouds.
S. Chasins, A. Cheung, N. Crooks, A. Ghodsi, K. Goldberg, J. E. Gonzalez, J. M. Hellerstein, M. I. Jordan, A. D. Joseph, M. Mahoney, A. Parameswaran, D. Patterson, R. A. Popa, K. Sen, S. Shenker, D. Song, and I. Stoica.
arXiv:2205.07147, 2022.
-
Optimal mean estimation without a variance.
Y. Cherapanamjeri, N. Tripuraneni, P. Bartlett, and M. I. Jordan.
Proceedings of the Thirty-Fifth Conference on Learning Theory (COLT), 2022.
-
ROOT-SGD: Sharp nonasymptotics and asymptotic efficiency in a single algorithm.
J. Li, W. Mou, M. Wainwright, and M. I. Jordan.
Proceedings of the Thirty-Fifth Conference on Learning Theory (COLT), 2022.
-
Image-to-image regression with distribution-free uncertainty quantification and
applications in imaging.
A. Angelopoulos, A. Kohli, S. Bates, M. I. Jordan, J. Malik, T. Alshaabi,
S. Upadhyayula, and Y. Romano.
In C. Szepesvari, L. Song, and S. Jegelka (Eds.),
International Conference on Machine Learning (ICML), 2022.
-
No-regret learning in partially-informed auctions.
W. Guo, M. I. Jordan, and E. Vitercik.
In C. Szepesvari, L. Song, and S. Jegelka (Eds.),
International Conference on Machine Learning (ICML), 2022.
-
Online nonsubmodular minimization with delayed costs: From full information to bandit feedback.
T. Lin, A. Pacchiano, Y. Yu, and M. I. Jordan.
In C. Szepesvari, L. Song, and S. Jegelka (Eds.),
International Conference on Machine Learning (ICML), 2022.
-
Welfare maximization in competitive equilibrium: Reinforcement learning for
Markov exchange economy.
Z. Liu, M. Lu, Z. Wang, M. I. Jordan, and Z. Yang.
In C. Szepesvari, L. Song, and S. Jegelka (Eds.),
International Conference on Machine Learning (ICML), 2022.
-
Markov persuasion processes and reinforcement learning.
J. Wu, Z. Zhang, Z. Feng, Z. Wang, Z. Yang, M. I. Jordan, and H. Xu.
ACM Conference on Economics and Computation (EC), Boulder, CO, 2022.
-
On the complexity of approximating multimarginal optimal transport.
T. Lin, N. Ho, M. Cuturi, and M. I. Jordan.
Journal of Machine Learning Research, 23, 1-43, 2022.
-
SOUL: An energy-efficient unsupervised online learning seizure detection classifier.
A. Chua, M. I. Jordan, and R. Muller.
IEEE Journal of Solid-State Circuits, 57, 2532-2544, 2022.
-
On the efficiency of entropic regularized algorithms for optimal transport.
T. Lin, N. Ho, and M. I. Jordan.
Journal of Machine Learning Research, 23, 1-42, 2022.
-
Learning dynamic mechanisms in unknown environments: A reinforcement learning approach.
B. Lyu, Q. Meng, S. Qiu, Z. Wang, Z. Yang, and M. I. Jordan.
arXiv:2202.12797, 2022.
-
Geometric methods for sampling, optimisation, inference and adaptive agents.
A. Barp, L. Da Costa, G. França, K. Friston, M. Girolami, M. I. Jordan, and G. A. Pavliotis.
In F. Nielsen, A. Srinivasa Rao, and C. R. Rao (Eds.),
Geometry and Statistics, Academic Press, 2022.
-
Improving generalization via uncertainty driven perturbations.
M. Pagliardini, G. Manunza, M. Jaggi, M. I. Jordan, and T. Chavdarova.
arXiv:2202.05737, 2022.
-
Robust estimation for nonparametric families via generative adversarial networks.
B. Zhu, J. Jiao, and M. I. Jordan.
International Symposium on Information Theory (ISIT), Espoo, Finland, 2022.
-
Multi-resolution deconvolution of spatial transcriptomics data reveals continuous
patterns of inflammation.
R. Lopez, B. Li, H. Keren-Shaul, P. Boyeau, M. Kedmi, D. Pilzer, A. Jelinski,
E. David, A. Wagner, Y. Addad, M. I. Jordan, I. Amit, and N. Yosef.
Nature Biotechnology, 40, 1360-1369, 2022.
-
Transferred Q-learning.
E. Chen, M. I. Jordan, and S. Li.
arXiv:2202.04709, 2022.
-
Online active learning with dynamic marginal gain thresholding.
E. Chen, R. Song, and M. I. Jordan.
arXiv:2201.08536, 2022.
-
Optimal variance-reduced stochastic approximation in Banach spaces.
W. Mou, K. Khamaru, M. Wainwright, P. Bartlett, and M. I. Jordan.
arXiv:2201.08518, 2022.
-
Private prediction sets.
A. Angelopoulos, S. Bates, T. Zrnic, and M. I. Jordan.
Harvard Data Science Review, https://doi.org/10.1162/99608f92.16c71dad, 2022.
-
Adaptivity of stochastic gradient methods for nonconvex optimization.
S. Horváth, L. Lei, P. Richtárik, and M. I. Jordan.
SIAM Journal on Mathematics of Data Science, 4, 634-648, 2022.
-
scvi-tools: A library for deep probabilistic analysis of single-cell omics data.
A. Gayoso, R. Lopez, G. Xing, P. Boyeau, K. Wu, M. Jayasuriya, E. Mehlman,
M. Langevin, Y. Liu, J. Samaran, G. Misrachi, A. Nazaret, O. Clivio,
C. Xu, T. Ashuach, M. Lotfollahi, V. Svensson, E. Da Veiga Beltrame, C. Talavera-López,
L. Pachter, F. Theis, A. Streets, M. I. Jordan, J. Regier, and N. Yosef.
Nature Biotechnology, 40, 163-166, 2022.
-
Ranking and tuning pre-trained models: A new paradigm of exploiting model hubs.
K. You, Y. Liu, J. Wang, M. I. Jordan, and M. Long.
Journal of Machine Learning Research, 23, 1-47, 2022.
-
Active learning for nonlinear system identification with guarantees.
H. Mania, M. I. Jordan, and B. Recht.
Journal of Machine Learning Research, 23, 1-30, 2022.
-
First-order constrained optimization: Non-smooth dynamical system viewpoint.
S. Schechtman, D. Tiapkin, E. Moulines, M. I. Jordan, and M. Muehlebach.
18th IFAC Workshop on Control Applications of Optimization, Gif-sur-Yvette, France, 2022.
-
On the convergence of stochastic extragradient for bilinear games
with restarted iteration averaging.
J. Li, Y. Yu, N. Loizou, G. Gidel, Y. Ma, N. Le Roux, and M. I. Jordan.
Proceedings of the Twenty-Fifth Conference on Artificial Intelligence and
Statistics (AISTATS), 2022.
-
Online learning of competitive equilibria in exchange economies.
W. Guo, K. Kandasamy, J. Gonzalez, M. I. Jordan, and I. Stoica.
Proceedings of the Twenty-Fifth Conference on Artificial Intelligence and
Statistics (AISTATS), 2022.
-
Fast distributionally robust learning with variance reduced min-max optimization.
Y. Yu, T. Lin, E. Mazumdar, and M. I. Jordan.
Proceedings of the Twenty-Fifth Conference on Artificial Intelligence and
Statistics (AISTATS), 2022.
-
On structured filtering-clustering: Global error bound and optimal first-order algorithms.
T. Lin, N. Ho, and M. I. Jordan.
Proceedings of the Twenty-Fifth Conference on Artificial Intelligence and
Statistics (AISTATS), 2022.
-
Partial identification with noisy covariates: A robust optimization approach.
W. Guo, M. Yin, Y. Wang, and M. I. Jordan.
1st Conference on Causal Learning and Reasoning (CLeaR), 2022.
-
Identifying systematic variation at the single-cell level by leveraging
low-resolution population-level data.
E. Rahmani, M. I. Jordan, and N. Yosef.
26th Annual International Conference on Research in Computational Molecular
Biology (RECOMB), 2022.
2021
-
Distribution-free, risk-controlling prediction sets.
A. Angelopoulos, S. Bates, J. Malik, and M. I. Jordan.
Journal of the ACM, 68, 1-34, 2021.
-
A control-theoretic perspective on optimal high-order optimization.
T. Lin and M. I. Jordan.
Mathematical Programming, 195, 929-975, 2021.
-
Is temporal difference learning optimal? An instance-dependent analysis.
K. Khamaru, A. Pananjady, F. Ruan, M. Wainwright, and M. I. Jordan.
SIAM Journal on the Mathematics of Data Science, 3, https://doi.org/10.1137/20M1331524, 2021.
-
Assessment of treatment effect estimators for heavy-tailed data.
N. Tripuraneni, D. Madeka, D. Foster, D. Perrault-Joncas, and M. I. Jordan.
arxiv.org/abs/2112.07602, 2021.
-
Multi-stage decentralized matching markets: Uncertain preferences and strategic behaviors.
X. Dai and M. I. Jordan.
Journal of Machine Learning Research, 22, 1-50, 2021.
-
A Bayesian nonparametric approach to super-resolution single-molecule localization.
M. Gabitto, H. Marie-Nelly, A. Pakman, A. Pataki, X. Darzacq, and M. I. Jordan.
Annals of Applied Statistics, 15, 1742-1766, 2021.
-
How AI fails us.
D. Siddarth, D. Acemoglu, D. Allen, K. Crawford, J. Evans, M. I. Jordan, and G. Weyl.
Edmond J. Safra Center for Ethics, 2021.
-
The Turing Test is bad for business: Technology should focus on the complementarity game,
not the imitation game.
D. Acemoglu, M. I. Jordan, and E. Glen Weyl.
WIRED Magazine, 2021.
-
On the self-penalization phenomenon in feature selection.
M. I. Jordan, K. Liu, and F. Ruan.
arxiv.org/abs/2110.05852, 2021.
-
Learn then test: Calibrating predictive algorithms to achieve risk control.
A. Angelopoulos, S. Bates, E. Candès, M. I. Jordan, and L. Lei.
arxiv.org/abs/2110.01052, 2021.
-
Who leads and who follows in strategic classification?
T. Zrnic, E. Mazumdar, S. S. Sastry, and M. I. Jordan.
In M. Ranzato, A. Beygelzimer, P. Liang, J. Wortman Vaughan, and Y. Dauphin (Eds),
Advances in Neural Information Processing Systems (NeurIPS) 34, 2021.
-
Learning equilibria in matching markets from bandit feedback.
M. Jagadeesan, A. Wei, M. I. Jordan, and J. Steinhardt.
In M. Ranzato, A. Beygelzimer, P. Liang, J. Wortman Vaughan, and Y. Dauphin (Eds),
Advances in Neural Information Processing Systems (NeurIPS) 34, 2021.
-
On the theory of reinforcement learning with once-per-episode feedback.
N. Chatterji, A. Pacchiano, P. Bartlett, and M. I. Jordan.
In M. Ranzato, A. Beygelzimer, P. Liang, J. Wortman Vaughan, and Y. Dauphin (Eds),
Advances in Neural Information Processing Systems (NeurIPS) 34, 2021.
-
Robust learning of optimal auctions.
W. Guo, M. I. Jordan, and E. Zampetakis.
In M. Ranzato, A. Beygelzimer, P. Liang, J. Wortman Vaughan, and Y. Dauphin (Eds),
Advances in Neural Information Processing Systems (NeurIPS) 34, 2021.
-
Wasserstein flow meets replicator dynamics: A mean-field analysis of
representation learning in actor-critic.
Y. Zhang, S. Chen, Z. Yang, M. I. Jordan, and Z. Wang.
In M. Ranzato, A. Beygelzimer, P. Liang, J. Wortman Vaughan, and Y. Dauphin (Eds),
Advances in Neural Information Processing Systems (NeurIPS) 34, 2021.
-
On component interactions in two-stage recommender systems.
J. Hron, K. Krauth, M. I. Jordan, and N. Kilbertus.
In M. Ranzato, A. Beygelzimer, P. Liang, J. Wortman Vaughan, and Y. Dauphin (Eds),
Advances in Neural Information Processing Systems (NeurIPS) 34, 2021.
-
Test-time collective prediction.
C. Mendler-Dünner, W. Guo, S. Bates, and M. I. Jordan.
In M. Ranzato, A. Beygelzimer, P. Liang, J. Wortman Vaughan, and Y. Dauphin (Eds),
Advances in Neural Information Processing Systems (NeurIPS) 34, 2021.
-
Learning in multi-stage decentralized matching markets.
X. Dai and M. I. Jordan.
In M. Ranzato, A. Beygelzimer, P. Liang, J. Wortman Vaughan, and Y. Dauphin (Eds),
Advances in Neural Information Processing Systems (NeurIPS) 34, 2021.
-
Tactical optimism and pessimism for deep reinforcement learning.
T. Moskovitz, J. Parker-Holder, A. Pacchiano, M. Arbel, and M. I. Jordan.
In M. Ranzato, A. Beygelzimer, P. Liang, J. Wortman Vaughan, and Y. Dauphin (Eds),
Advances in Neural Information Processing Systems (NeurIPS) 34, 2021.
-
Optimization on manifolds: A symplectic approach.
G. França, A. Barp, M. Girolami, and M. I. Jordan.
arxiv.org/abs/2107.11231, 2021.
-
Bandit learning in decentralized matching markets.
L. Liu, F. Ruan, H. Mania, and M. I. Jordan.
Journal of Machine Learning Research, 22, 1-34, 2021.
-
On nonconvex optimization for machine learning: Gradients, stochasticity,
and saddle points.
C. Jin, P. Netrapalli, R. Ge, S. Kakade, and M. I. Jordan.
Journal of the ACM, 68, doi.org/10.1145/3418526, 2021.
-
Elastic hyperparameter tuning on the cloud.
L. Dunlap, K. Kandasamy, U. Misra, R. Liaw, J. Gonzalez, I. Stoica, and M. I. Jordan.
ACM Symposium on Cloud Computing (SoCC), Seattle, WA, 2021.
-
The stereotyping problem in collaboratively filtered recommender systems.
W. Guo, K. Krauth, M. I. Jordan, and N. Garg.
Equity and Access in Algorithms, Mechanisms, and Optimization (EAAMO), 2021.
-
A variational inequality approach to Bayesian regression games.
W. Guo, M. I. Jordan, and T. Lin.
Proceedings of the 60th IEEE Conference on Decision and Control (CDC), Austin, TX, 2021.
-
On the stability of nonlinear receding horizon control: A geometric perspective.
T. Westenbroek, M. Simchowitz, M. I. Jordan, and S. S. Sastry.
Proceedings of the 60th IEEE Conference on Decision and Control (CDC), Austin, TX, 2021.
-
Data sharing markets.
M. Rasouli and M. I. Jordan.
arxiv.org/abs/2107.08630, 2021.
-
Instance-optimality in optimal value estimation: Adaptivity via variance-reduced Q-learning.
K. Khamaru, E. Xia, M. Wainwright, and M. I. Jordan.
arxiv.org/abs/2106.14352, 2021.
-
Cluster-and-conquer: A framework for time-series forecasting.
R. Pathak, R. Sen, N. Rao, N. B. Erichson, M. I. Jordan, and I. Dhillon.
arxiv.org/abs/2110.14011, 2021.
-
Taming nonconvexity in kernel feature selection—Favorable properties of the Laplace kernel.
F. Ruan, K. Liu, and M. I. Jordan.
arxiv.org/abs/2106.09387, 2021.
-
Parallelizing contextual linear bandits.
J. Chan, A. Pacchiano, N. Tripuraneni, Y. Song, P. Bartlett, and M. I. Jordan.
arxiv.org/abs/2105.10590, 2021.
-
Stochastic approximation for online tensorial independent component analysis.
J. Li and M. I. Jordan.
Proceedings of the Conference on Learning Theory (COLT), Boulder, CO, 2021.
-
Reconstructing unobserved cellular states from paired single-cell lineage tracing
and transcriptomics data.
K. Ouardini, R. Lopez, M. G. Jones, S. Prillo, R. Zhang, M. I. Jordan, and N. Yosef.
www.biorxiv.org/content/10.1101/2021.05.28.446021v1, 2021.
-
Provable meta-learning of linear representations.
N. Tripuraneni, C. Jin, and M. I. Jordan.
In M. Meila and T. Zhang (Eds.), International Conference on Machine Learning (ICML), 2021.
-
Resource allocation in multi-armed bandit exploration: Overcoming nonlinear
scaling with adaptive parallelism.
B. Thananjeyan, K. Kandasamy, I. Stoica, M. I. Jordan, K. Goldberg, and J. Gonzalez.
In M. Meila and T. Zhang (Eds.), International Conference on Machine Learning (ICML), 2021.
-
Representation matters: Assessing the importance of subgroup allocations in training data.
E. Rolf, T. Worledge, B. Recht, and M. I. Jordan.
In M. Meila and T. Zhang (Eds.), International Conference on Machine Learning (ICML), 2021.
-
Variational refinement for importance sampling using the forward Kullback-Leibler divergence.
G. Jerfel, S. Wang, C. Fannjiang, K. Heller, Y. Ma, and M. I. Jordan.
In C. de Campos and M. Maathuis (Eds.), Uncertainty in Artificial Intelligence (UAI),
Proceedings of the Thirty-Seventh Conference, 2021.
-
A Lyapunov analysis of momentum methods in optimization.
A. Wilson, B. Recht, and M. I. Jordan.
Journal of Machine Learning Research, 22, 1-34, 2021.
-
Is there an analog of Nesterov acceleration for MCMC?
Y.-A. Ma, N. Chatterji, X. Cheng, N. Flammarion, P. Bartlett, and M. I. Jordan.
Bernoulli, 27, 1942-1992, 2021.
-
Learning strategies in decentralized matching markets under uncertain preferences.
X. Dai and M. I. Jordan.
Journal of Machine Learning Research, 22, 1-50, 2021.
-
Optimization with momentum: Dynamical, control-theoretic, and symplectic perspectives.
M. Muehlebach and M. I. Jordan.
Journal of Machine Learning Research, 22, 1-50, 2021.
-
PAC best arm identification under a deadline.
B. Thananjeyan, K. Kandasamy, I. Stoica, M. I. Jordan, K. Goldberg, and J. Gonzalez.
arxiv.org/abs/2106.03221, 2021.
-
Deep generative models for detecting differential expression in single cells.
P. Boyeau, R. Lopez, J. Regier, A. Gayoso, M. I. Jordan, and N. Yosef.
www.biorxiv.org/content/10.1101/794289v1, 2021.
-
On dissipative symplectic integration with applications to gradient-based optimization.
G. França, M. I. Jordan, and R. Vidal.
Journal of Statistical Mechanics: Theory and Experiment, 043402, 2021.
-
Generalized momentum-based methods: A Hamiltonian perspective.
J. Diakonikolas and M. I. Jordan.
SIAM Journal on Optimization, 31, 915-944, 2021.
-
Understanding the acceleration phenomenon via high-resolution differential equations.
B. Shi, S. Du, M. I. Jordan, and W. Su.
Mathematical Programming, doi.org/10.1007/s10107-021-01681-8, 2021.
-
Interleaving computational and inferential thinking: Data Science
for undergraduates at Berkeley.
A. Adhikari, J. DeNero, and M. I. Jordan.
Harvard Data Science Review, doi.org/10.1162/99608f92.cb0fa8d2, 2021.
-
Asynchronous online testing of multiple hypotheses.
T. Zrnic, A. Ramdas, and M. I. Jordan.
Journal of Machine Learning Research, 22, 1-39, 2021.
-
High-order Langevin diffusion yields an accelerated MCMC algorithm.
W. Mou, Y.-A. Ma, M. Wainwright, P. Bartlett, and M. I. Jordan.
Journal of Machine Learning Research, 22, 1-41, 2021.
-
Unsupervised online learning classifier for seizure detection.
A. Chua, M. I. Jordan, and R. Muller.
2021 Symposium on VLSI Circuits, Kyoto, Japan, 2021.
-
Efficient methods for structured nonconvex-nonconcave min-max optimization.
J. Diakonikolas, C. Daskalakis, and M. I. Jordan.
Proceedings of the Twenty-Fourth Conference on Artificial Intelligence and
Statistics (AISTATS), 2021.
-
On projection robust optimal transport: Sample complexity and model misspecification.
T. Lin, Z. Zheng, E. Chen, M. Cuturi, and M. I. Jordan.
Proceedings of the Twenty-Fourth Conference on Artificial Intelligence and
Statistics (AISTATS), 2021.
-
Uncertainty sets for image classifiers using conformal prediction.
A. Angelopoulos, S. Bates, J. Malik, and M. I. Jordan.
International Conference on Learning Representations (ICLR), 2021.
-
Learning from eXtreme bandit feedback.
R. Lopez, I. Dhillon, and M. I. Jordan.
Thirty-Fifth AAAI Conference on Artificial Intelligence (AAAI-21), 2021.
[Best Paper Award Honorable Mention].
-
Robustness guarantees for mode estimation with an application to bandits.
A. Pacchiano, H. Jiang, and M. I. Jordan.
Thirty-Fifth AAAI Conference on Artificial Intelligence (AAAI-21), 2021.
-
Probabilistic harmonization and annotation of single-cell transcriptomics data with
deep generative models.
X. Xu, R. Lopez, E. Mehlman, J. Regier, M. I. Jordan, and N. Yosef.
Molecular Systems Biology, 17, e9620, 2021.
2020
-
On the adaptivity of stochastic gradient-based optimization.
L. Lei and M. I. Jordan.
SIAM Journal on Optimization, 30, 1473-1500, 2020.
-
Fundamental limits of detection in the spiked Wigner model.
A. El Alaoui, F. Krzakala, and M. I. Jordan.
Annals of Statistics, 48, 863-885, 2020.
-
On identifying and mitigating bias in the estimation of the COVID-19 case fatality rate.
A. Angelopoulos, R. Pathak, R. Varma, and M. I. Jordan.
Harvard Data Science Review, Special Issue 1, 2020.
-
Optimal rates and tradeoffs in multiple testing.
M. Rabinovich, A. Ramdas, M. I. Jordan, and M. Wainwright.
Statistica Sinica, 30, 741-762, 2020.
-
Function-specific mixing times and concentration away from equilibrium.
M. Rabinovich, A. Ramdas, M. I. Jordan, and M. Wainwright.
Bayesian Analysis, 15, 505-532, 2020.
-
Greedy Attack and Gumbel Attack: Generating adversarial examples for
discrete data.
P. Yang, J. Chen, C.-J. Hsieh, J.-L. Wang, and M. I. Jordan.
Journal of Machine Learning Research, 21, 1-36, 2020.
-
Optimal mean estimation without a variance.
Y. Cherapanamjeri, N. Tripuraneni, P. Bartlett, and M. I. Jordan.
arxiv.org/abs/2011.12433, 2020.
-
Online learning demands in max-min fairness.
K. Kandasamy, G.-E. Sela, J. Gonzalez, M. I. Jordan, and I. Stoica.
arxiv.org/abs/2012.08648, 2020.
-
Manifold learning via manifold deflation.
D. Ting and M. I. Jordan.
arxiv.org/abs/2007.03315, 2020.
-
Do offline metrics predict online performance in recommender systems?
K. Krauth, S. Dean, A. Zhao, W. Guo, M. Curmei, B. Recht, and M. I. Jordan.
arxiv.org/abs/2011.07931, 2020.
-
Optimal robust linear regression in nearly linear time.
Y. Cherapanamjeri, E. Aras, N. Tripuraneni, M. I. Jordan, N. Flammarion, and P. Bartlett.
arxiv.org/abs/2007.08137, 2020.
-
Bridging exploration and general function approximation in reinforcement learning:
Provably efficient kernel and neural value iterations.
Z. Wang, C. Jin, Z. Yang, M. Wang, and M. I. Jordan.
arxiv.org/abs/2011.04622, 2020.
-
Robust optimization for fairness with noisy protected groups.
S. Wang, W. Guo, H. Narasimhan, A. Cotter, M. Gupta, and M. I. Jordan.
In H. Larochelle, M. Ranzato, R. Hadsell, M. F. Balcan, and H.-T. Lin (Eds),
Advances in Neural Information Processing Systems (NeurIPS) 33, 2020.
-
On the theory of transfer learning: The importance of task diversity.
N. Tripuraneni, M. I. Jordan, and C. Jin.
In H. Larochelle, M. Ranzato, R. Hadsell, M. F. Balcan, and H.-T. Lin (Eds),
Advances in Neural Information Processing Systems (NeurIPS) 33, 2020.
-
Decision-making with auto-encoding variational Bayes.
R. Lopez, P. Boyeau, N. Yosef, M. I. Jordan, and J. Regier.
In H. Larochelle, M. Ranzato, R. Hadsell, M. F. Balcan, and H.-T. Lin (Eds),
Advances in Neural Information Processing Systems (NeurIPS) 33, 2020.
-
Projection robust Wasserstein distance and Riemannian optimization.
T. Lin, C. Fan, N. Ho, M. Cuturi, and M. I. Jordan.
In H. Larochelle, M. Ranzato, R. Hadsell, M. F. Balcan, and H.-T. Lin (Eds),
Advances in Neural Information Processing Systems (NeurIPS) 33, 2020.
-
Fixed-support Wasserstein barycenter: Computational hardness
and efficient algorithms.
T. Lin, N. Ho, X. Chen, M. Cuturi, and M. I. Jordan.
In H. Larochelle, M. Ranzato, R. Hadsell, M. F. Balcan, and H.-T. Lin (Eds),
Advances in Neural Information Processing Systems (NeurIPS) 33, 2020.
-
Provably efficient reinforcement learning with kernel and neural function approximation.
Z. Wang, C. Jin, Z. Yang, M. Wang, and M. I. Jordan.
In H. Larochelle, M. Ranzato, R. Hadsell, M. F. Balcan, and H.-T. Lin (Eds),
Advances in Neural Information Processing Systems (NeurIPS) 33, 2020.
-
Transferable calibration with lower bias and variance in domain adaptation.
X. Wang, M. Long, J. Wang, and M. I. Jordan.
In H. Larochelle, M. Ranzato, R. Hadsell, M. F. Balcan, and H.-T. Lin (Eds),
Advances in Neural Information Processing Systems (NeurIPS) 33, 2020.
-
Posterior distribution for the number of clusters in Dirichlet process mixture models.
C.-Y. Yang, E. Xia, N. Ho, and M. I. Jordan.
arxiv.org/abs/1905.09959, 2020.
-
A higher-order Swiss army infinitesimal jackknife.
R. Giordano, M. I. Jordan, and T. Broderick.
arxiv.org/abs/1907.12116, 2020.
-
Covariance estimation with nonnegative partial correlations.
J. A. Soloff, A. Guntuboyina, and M. I. Jordan.
arxiv.org/abs/2007.15252, 2020.
-
Finding equilibrium in multi-agent games with payoff uncertainty.
W. Guo, M. Curmei, S. Wang, B. Recht, and M. I. Jordan.
arxiv.org/abs/2007.05647, 2020.
-
High-confidence sets for trajectories of stochastic time-varying nonlinear systems.
E. Mazumdar, T. Westenbroek, M. I. Jordan, and S. Sastry.
Proceedings of the 59th IEEE Conference on Decision and Control (CDC), Jeju Island, Korea,
2020.
-
Singularity, misspecification, and the convergence rate of EM.
R. Dwivedi, N. Ho, K. Khamaru, M. Wainwright, M. I. Jordan, and B. Yu.
Annals of Statistics, 48, 3161-3182, 2020.
-
On Thompson sampling with Langevin algorithms.
E. Mazumdar, A. Pacchiano, Y.-A. Ma, P. Bartlett, and M. I. Jordan.
In H. Daumé III and A. Singh (Eds.),
International Conference on Machine Learning (ICML), 2020.
-
Continuous-time lower bounds for gradient-based algorithms.
M. Muehlebach and M. I. Jordan.
In H. Daumé III and A. Singh (Eds.),
International Conference on Machine Learning (ICML), 2020.
-
On gradient descent ascent for nonconvex-concave minimax problems.
T. Lin, C. Jin, and M. I. Jordan.
In H. Daumé III and A. Singh (Eds.),
International Conference on Machine Learning (ICML), 2020.
-
Learning to score behaviors for guided policy optimization.
A. Pacchiano, J. Parker-Holder, Y. Tang, K. Choromanski, A. Choromanska, and M. I. Jordan.
In H. Daumé III and A. Singh (Eds.),
International Conference on Machine Learning (ICML), 2020.
-
Finite-time last-iterate convergence for multi-agent learning in games.
T. Lin, Z. Zhou, P. Mertikopoulos, and M. I. Jordan.
In H. Daumé III and A. Singh (Eds.),
International Conference on Machine Learning (ICML), 2020.
-
What is local optimality in nonconvex-nonconcave minimax optimization?
C. Jin, P. Netrapalli, and M. I. Jordan.
In H. Daumé III and A. Singh (Eds.),
International Conference on Machine Learning (ICML), 2020.
-
Accelerated message passing for entropy-regularized MAP inference.
J. Lee, A. Pacchiano, P. Bartlett, and M. I. Jordan.
In H. Daumé III and A. Singh (Eds.),
International Conference on Machine Learning (ICML), 2020.
-
Stochastic gradient and Langevin processes.
X. Cheng, D. Yin, P. Bartlett, and M. I. Jordan.
In H. Daumé III and A. Singh (Eds.),
International Conference on Machine Learning (ICML), 2020.
-
Provably efficient reinforcement learning with linear function approximation.
C. Jin, Z. Yang, Z. Wang, and M. I. Jordan.
Proceedings of the Conference on Learning Theory (COLT), Graz, Austria, 2020.
-
On linear stochastic approximation: Fine-grained Polyak-Ruppert and non-asymptotic concentration.
W. Mou, J. Li, M. Wainwright, P. Bartlett, and M. I. Jordan.
Proceedings of the Conference on Learning Theory (COLT), Graz, Austria, 2020.
-
Near-optimal algorithms for minimax optimization.
T. Lin, C. Jin, and M. I. Jordan.
Proceedings of the Conference on Learning Theory (COLT), Graz, Austria, 2020.
-
Lower bounds in multiple testing: A framework based on derandomized proxies.
M. Rabinovich, M. I. Jordan, and M. Wainwright.
arxiv.org/abs/2005.03725, 2020.
-
Detecting zero-inflated genes in single-cell transcriptomics data.
O. Clivio, R. Lopez, J. Regier, A. Gayoso, M. I. Jordan, and N. Yosef.
biorxiv.org/content/10.1101/794875v3, 2020.
-
Policy-gradient algorithms have no guarantees of convergence in continuous
action and state multi-agent settings.
E. Mazumdar, L. Ratliff, M. I. Jordan, and S. S. Sastry.
International Conference on Autonomous Agents and Multi-Agent Systems (AAMAS),
Auckland, New Zealand, 2020.
-
Improved sample complexity for stochastic compositional variance reduced gradient.
T. Lin, C. Fan, M. Wang, and M. I. Jordan.
American Control Conference (ACC), Denver, CO, 2020.
-
HopSkipJumpAttack: Query-efficient decision-based adversarial attack.
J. Chen, M. I. Jordan, and M. Wainwright.
41st IEEE Symposium on Security and Privacy (SP), San Francisco, CA, 2020.
-
Unsupervised online learning for long-term high sensitivity seizure detection.
A. Chua, M. I. Jordan, and R. Muller.
42nd Annual International Conference of the IEEE Engineering in Medicine and
Biology Society (EMBC), Montreal, Canada, 2020.
-
Competing bandits in matching markets.
L. Liu, H. Mania, and M. I. Jordan.
In R. Calandra and S. Chiappa (Eds.),
Proceedings of the Twenty-Third Conference on Artificial Intelligence and
Statistics (AISTATS), Palermo, Italy, 2020.
-
Langevin Monte Carlo without smoothness.
N. Chatterji, J. Diakonikolas, M. I. Jordan, and P. Bartlett.
In R. Calandra and S. Chiappa (Eds.),
Proceedings of the Twenty-Third Conference on Artificial Intelligence and
Statistics (AISTATS), Palermo, Italy, 2020.
-
The power of batching in multiple hypothesis testing.
T. Zrnic, D. Jiang, A. Ramdas, and M. I. Jordan.
In R. Calandra and S. Chiappa (Eds.),
Proceedings of the Twenty-Third Conference on Artificial Intelligence and
Statistics (AISTATS), Palermo, Italy, 2020.
-
Sharp analysis of expectation-maximization for weakly identifiable models.
R. Dwivedi, N. Ho, K. Khamaru, M. Wainwright, M. I. Jordan, and B. Yu.
In R. Calandra and S. Chiappa (Eds.),
Proceedings of the Twenty-Third Conference on Artificial Intelligence and
Statistics (AISTATS), Palermo, Italy, 2020.
-
Fast algorithms for computational optimal transport and Wasserstein barycenter.
W. Guo, N. Ho, and M. I. Jordan.
In R. Calandra and S. Chiappa (Eds.),
Proceedings of the Twenty-Third Conference on Artificial Intelligence and
Statistics (AISTATS), Palermo, Italy, 2020.
-
Convergence rates of smooth message passing with rounding in entropy-regularized
MAP inference.
J. Lee, A. Pacchiano, and M. I. Jordan.
In R. Calandra and S. Chiappa (Eds.),
Proceedings of the Twenty-Third Conference on Artificial Intelligence and
Statistics (AISTATS), Palermo, Italy, 2020.
-
Post-estimation smoothing: A simple baseline for learning with side information.
E. Rolf, M. I. Jordan, and B. Recht.
In R. Calandra and S. Chiappa (Eds.),
Proceedings of the Twenty-Third Conference on Artificial Intelligence and
Statistics (AISTATS), Palermo, Italy, 2020.
-
ML-LOO: Detecting adversarial examples with feature attribution.
P. Yang, J. Chen, C.-J. Hsieh, J.-L. Wang, and M. I. Jordan.
Thirty-Fourth AAAI Conference on Artificial Intelligence (AAAI-20), 2020.
-
Cost-effective incentive allocation via structured counterfactual inference.
R. Lopez, C. Li, X. Yan, J. Xiong, M. I. Jordan, Y. Qi, and L. Song.
Thirty-Fourth AAAI Conference on Artificial Intelligence (AAAI-20), 2020.
-
LS-Tree: Model interpretation when the data are linguistic.
J. Chen and M. I. Jordan.
Thirty-Fourth AAAI Conference on Artificial Intelligence (AAAI-20), 2020.
-
Variance reduction with sparse gradients.
M. Elibol, L. Lei, and M. I. Jordan.
International Conference on Learning Representations (ICLR), Addis Ababa, Ethiopia, 2020.
2019
-
A unified treatment of multiple testing with prior knowledge.
A. Ramdas, R. Foygel Barber, M. Wainwright, and M. I. Jordan.
Annals of Statistics, 47, 2790-2821, 2019.
-
Decoding from pooled data: Sharp information-theoretic bounds.
A. El Alaoui, A. Ramdas, F. Krzakala, L. Zdeborova, and M. I. Jordan.
SIAM Journal on the Mathematics of Data Science, 1, 161-188, 2019.
-
Artificial intelligence: The revolution hasn't happened yet.
M. I. Jordan.
Harvard Data Science Review, 1, 2019. [With commentary. Originally published in Medium].
-
Dr. AI or: How I learned to stop worrying and love economics.
M. I. Jordan.
Harvard Data Science Review, 1, 2019. [Response to commentary].
-
Sampling can be faster than optimization.
Y.-A. Ma, Y. Chen, C. Jin, N. Flammarion, and M. I. Jordan.
Proceedings of the National Academy of Sciences, https://doi.org/10.1073/pnas.1820003116, 2019.
-
First-order methods almost always avoid strict saddle-points.
J. Lee, I. Panageas, G. Piliouras, M. Simchowitz, M. I. Jordan, and B. Recht.
Mathematical Programming, doi.org/10.1007/s10107-019-01374-3, 2019.
-
A sequential algorithm for false discovery rate control on directed acyclic graphs.
A. Ramdas, J. Chen, M. Wainwright, and M. I. Jordan.
Biometrika, 106, 69-86, 2019.
-
Transferable representation learning with deep adaptation networks.
M. Long, Z. Cao, Y. Cao, J. Wang, and M. I. Jordan.
IEEE Transactions on Pattern Analysis and Machine Intelligence, 41, 3071-3085, 2019.
-
Decoding from pooled data: Phase transitions of message passing.
A. El Alaoui, A. Ramdas, F. Krzakala, L. Zdeborova, and M. I. Jordan.
IEEE Transactions on Information Theory, 65, 572-585, 2019.
-
Sampling for Bayesian mixture models: MCMC with polynomial-time mixing.
W. Mou, Y.-A. Ma, M. Wainwright, P. Bartlett, and M. I. Jordan.
arxiv.org/abs/1912.05153, 2019.
-
Towards understanding the transferability of deep representations.
H. Liu, M. Long, J. Wang, and M. I. Jordan.
arxiv.org/abs/1909.12031, 2019.
-
How does learning rate decay help modern neural networks?
K. You, M. Long, J. Wang, and M. I. Jordan.
arxiv.org/abs/1908.01878, 2019.
-
Convergence rates for Gaussian mixtures of experts.
N. Ho, C.-Y. Yang, and M. I. Jordan.
arxiv.org/abs/1907.04377, 2019.
-
Quantitative W1 convergence of Langevin-like stochastic processes with
non-convex potential state-dependent noise.
X. Cheng, D. Yin, P. Bartlett, and M. I. Jordan.
arxiv.org/abs/1907.03215, 2019.
-
Wasserstein reinforcement learning.
A. Pacchiano, J. Parker-Holder, Y. Tang, A. Choromanska, K. Choromanski, and M. I. Jordan.
arxiv.org/abs/1906.04349, 2019.
-
On the acceleration of the Sinkhorn and Greenkhorn algorithms for optimal transport.
T. Lin, N. Ho, and M. I. Jordan.
arxiv.org/abs/1906.01437, 2019.
-
Posterior distribution for the number of clusters in Dirichlet
process mixture models.
C.-Y. Yang, N. Ho, and M. I. Jordan.
arxiv.org/abs/1905.09959, 2019.
-
A joint model of unpaired data from scRNA-seq and spatial transcriptomics for imputing
missing gene expression measurements.
R. Lopez, A. Nazaret, M. Langevin, J. Samaran, J. Regier, M. I. Jordan, and N. Yosef.
arxiv.org/abs/1905.02269, 2019.
-
Stochastic gradient descent escapes saddle points efficiently.
C. Jin, R. Ge, P. Netrapalli, S. Kakade, and M. I. Jordan.
arxiv.org/abs/1902.04811, 2019.
-
A short note on concentration inequalities for random vectors with subGaussian norm.
arxiv.org/abs/1902.03736, 2019.
-
SysML: The new frontier of machine learning systems.
A. Ratner et al.
arxiv.org/abs/1904.03257, 2019.
-
Global error bounds and linear convergence for gradient-based algorithms for
trend filtering and l1-convex clustering.
N. Ho, T. Lin, and M. I. Jordan.
arxiv.org/abs/1904.07462, 2019.
-
Acceleration via symplectic discretization of high-resolution differential equations.
B. Shi, S. Du, W. Su, and M. I. Jordan.
In H. Wallach, H. Larochelle, A. Beygelzimer, F. d'Alché-Buc, E. Fox, and R. Garnett (Eds),
Advances in Neural Information Processing Systems (NeurIPS) 32, 2019.
-
Transferable normalization: Towards improving transferability of deep neural networks.
X. Wang, Y. Jin, M. Long, J. Wang, and M. I. Jordan.
In H. Wallach, H. Larochelle, A. Beygelzimer, F. d'Alché-Buc, E. Fox, and R. Garnett (Eds),
Advances in Neural Information Processing Systems (NeurIPS) 32, 2019.
-
Quantitative central limit theorems for discrete stochastic processes.
X. Cheng, P. Bartlett, and M. I. Jordan.
arxiv.org/abs/1902.00832, 2019.
-
Challenges with EM in application to weakly identifiable mixture models.
R. Dwivedi, N. Ho, K. Khamaru, M. Wainwright, M. I. Jordan, and B. Yu.
arxiv.org/abs/1902.00194, 2019.
-
Minmax optimization: Stable limit points of gradient descent ascent are locally optimal.
C. Jin, P. Netrapalli, and M. I. Jordan.
arxiv.org/abs/1902.00618, 2019.
-
On finding local Nash equilibria (and only local Nash equilibria) in zero-sum games.
E. Mazumdar, M. I. Jordan, and S. S. Sastry.
arxiv.org/abs/1901.00838, 2019.
-
A dynamical systems perspective on Nesterov acceleration.
M. Muehlebach and M. I. Jordan.
In K. Chaudhuri and R. Salakhutdinov (Eds.),
International Conference on Machine Learning (ICML), 2019.
-
On efficient optimal transport: An analysis of greedy and accelerated
mirror descent algorithms.
T. Lin, N. Ho, and M. I. Jordan.
In K. Chaudhuri and R. Salakhutdinov (Eds.),
International Conference on Machine Learning (ICML), 2019.
-
Bridging theory and algorithm for domain adaptation.
Y. Zhang, T. Liu, M. Long, and M. I. Jordan.
In K. Chaudhuri and R. Salakhutdinov (Eds.),
International Conference on Machine Learning (ICML), 2019.
-
Theoretically principled trade-off between robustness and accuracy.
H. Zhang, Y. Yu, J. Jiao, E. Xing, L. El Ghaoui, and M. I. Jordan.
In K. Chaudhuri and R. Salakhutdinov (Eds.),
International Conference on Machine Learning (ICML), 2019.
-
Towards accurate model selection in deep unsupervised domain adaptation.
K. You, X. Wang, M. Long, and M. I. Jordan.
In K. Chaudhuri and R. Salakhutdinov (Eds.),
International Conference on Machine Learning (ICML), 2019.
-
Rao-Blackwellized stochastic gradients for discrete distributions.
R. Liu, J. Regier, N. Tripuraneni, M. I. Jordan, and J. McAuliffe.
In K. Chaudhuri and R. Salakhutdinov (Eds.),
International Conference on Machine Learning (ICML), 2019.
-
Transferable adversarial training: A general approach to adapting deep classifiers.
H. Liu, M. Long, J. Wang, and M. I. Jordan.
In K. Chaudhuri and R. Salakhutdinov (Eds.),
International Conference on Machine Learning (ICML), 2019.
-
A Swiss army infinitesimal jackknife.
R. Giordano, W. Stephenson, R. Liu, M. I. Jordan, and T. Broderick.
In K. Chaudhuri and M. Sugiyama (Eds.),
Proceedings of the Twenty-Second Conference on Artificial Intelligence and
Statistics (AISTATS), Okinawa, Japan, 2019. [Notable Paper Award].
-
Probabilistic multilevel clustering via composite transportation distance.
N. Ho, V. Huynh, D. Phung, and M. I. Jordan.
In K. Chaudhuri and M. Sugiyama (Eds.),
Proceedings of the Twenty-Second Conference on Artificial Intelligence and
Statistics (AISTATS), Okinawa, Japan, 2019.
-
L-Shapley and C-Shapley: Efficient model interpretation for structured data.
J. Chen, L. Song, M. Wainwright, and M. I. Jordan.
International Conference on Learning Representations (ICLR), New Orleans, LA, 2019.
2018
-
Dynamical, symplectic and stochastic perspectives on gradient-based optimization.
M. I. Jordan.
Proceedings of the International Congress of Mathematicians, 1, 523-550, 2018.
-
Bayesian inference for a generative model of transcriptome profiles from
single-cell RNA sequencing.
R. Lopez, J. Regier, M. Cole, M. I. Jordan, and N. Yosef.
Nature Methods, 15, 1053-1058, 2018.
-
Communication-efficient distributed statistical inference.
M. I. Jordan, J. Lee, and Y. Yang.
Journal of the American Statistical Association, 114, 668-681, 2018.
-
On kernel methods for covariates that are rankings.
H. Mania, A. Ramdas, M. Wainwright, M. I. Jordan, and B. Recht.
Electronic Journal of Statistics, 12, 2537-2577, 2018.
-
Saturating splines and feature selection.
N. Boyd, T. Hastie, S. Boyd, B. Recht, and M. I. Jordan.
Journal of Machine Learning Research, 18, 1-32, 2018.
-
Covariances, robustness, and variational Bayes.
R. Giordano, T. Broderick, and M. I. Jordan.
Journal of Machine Learning Research, 19, 1-49, 2018.
-
CoCoA: A general framework for communication-efficient distributed optimization.
V. Smith, S. Forte, C. Ma, M. Takac, M. I. Jordan, and M. Jaggi.
Journal of Machine Learning Research, 18, 1-49, 2018.
-
Posteriors, conjugacy, and exponential families for completely random measures.
T. Broderick, A. Wilson, and M. I. Jordan.
Bernoulli, 24, 3181-3221, 2018.
-
Latent marked Poisson process with applications to object segmentation.
S. Ghanta, J. Dy, D. Niu, and M. I. Jordan.
Bayesian Analysis, 13, 85-113, 2018.
-
Ray: A distributed framework for emerging AI applications.
P. Moritz, R. Nishihara, S. Wang, A. Tumanov, R. Liaw, E. Liang,
W. Paul, M. I. Jordan, and I. Stoica.
13th USENIX Symposium on Operating Systems Design and Implementation (OSDI),
Carlsbad, CA, 2018.
-
A deep generative model for semi-supervised classification with noisy labels.
M. Langevin, E. Mehlman, J. Regier, R. Lopez, M. I. Jordan, and N. Yosef.
arxiv.org/abs/1809.05957, 2018.
-
Sharp convergence rates for Langevin dynamics in the nonconvex setting.
X. Cheng, N. Chatterji, Y. Abbasi-Yadkori, P. Bartlett, and M. I. Jordan.
arxiv.org/abs/1805.01648, 2018.
-
Learning without mixing: Towards a sharp analysis of linear system identification.
M. Simchowitz, H. Mania, S. Tu, M. I. Jordan, and B. Recht.
Proceedings of the Conference on Learning Theory (COLT),
Stockholm, Sweden, 2018.
-
Underdamped Langevin MCMC: A non-asymptotic analysis.
X. Cheng, N. Chatterji, P. Bartlett, and M. I. Jordan.
Proceedings of the Conference on Learning Theory (COLT),
Stockholm, Sweden, 2018.
-
Averaging stochastic gradient descent on Riemannian manifolds.
N. Tripuraneni, N. Flammarion, F. Bach, and M. I. Jordan.
Proceedings of the Conference on Learning Theory (COLT),
Stockholm, Sweden, 2018.
-
Accelerated gradient descent escapes saddle points faster than gradient descent.
C. Jin, P. Netrapalli, and M. I. Jordan.
Proceedings of the Conference on Learning Theory (COLT),
Stockholm, Sweden, 2018.
-
Detection limits in the high-dimensional spiked rectangular model.
A. El Alaoui and M. I. Jordan.
Proceedings of the Conference on Learning Theory (COLT),
Stockholm, Sweden, 2018.
-
Partial transfer learning with selective adversarial networks.
Z. Cao, M. Long, J. Wang, and M. I. Jordan.
IEEE Conference on Computer Vision and Pattern Recognition (CVPR),
Salt Lake City, UT, 2018.
-
SAFFRON: An adaptive algorithm for online control of the false discovery rate.
A. Ramdas, T. Zrnic, M. Wainwright, and M. I. Jordan.
Proceedings of the 35th International Conference on Machine
Learning (ICML), Stockholm, Sweden, 2018.
-
Learning to explain: An information-theoretic perspective on model interpretation.
J. Chen, L. Song, M. Wainwright, and M. I. Jordan.
Proceedings of the 35th International Conference on Machine
Learning (ICML), Stockholm, Sweden, 2018.
-
Ray RLlib: A framework for distributed reinforcement learning.
E. Liang, R. Liaw, P. Moritz, R. Nishihara, R. Fox, K. Goldberg,
J. Gonzalez, M. I. Jordan, and I. Stoica.
Proceedings of the 35th International Conference on Machine
Learning (ICML), Stockholm, Sweden, 2018.
-
On the theory of variance reduction for stochastic gradient Monte Carlo.
N. S. Chatterji, N. Flammarion, Y.-A. Ma, P. L. Bartlett, and M. I. Jordan.
Proceedings of the 35th International Conference on Machine
Learning (ICML), Stockholm, Sweden, 2018.
-
Flexible primitives for distributed deep learning in Ray.
Y. Bulatov, R. Nishihara, P. Moritz, M. Elibol, M. I. Jordan, and I. Stoica.
Systems and Machine Learning Conference, Palo Alto, CA, 2018.
-
On symplectic optimization.
M. Betancourt, M. I. Jordan, and A. Wilson.
arxiv.org/abs/1802.03653, 2018.
-
Minimizing nonconvex population risk from rough empirical risk.
C. Jin, L. Liu, R. Ge, and M. I. Jordan.
arxiv.org/abs/1803.09357, 2018.
-
On nonlinear dimensionality reduction, linear smoothing and autoencoding.
D. Ting and M. I. Jordan.
arxiv.org/abs/1803.02432, 2018.
-
Model-based value estimation for efficient model-free reinforcement learning.
V. Feinberg, A. Wan, I. Stoica, M. I. Jordan, J. Gonzalez, and S. Levine.
arxiv.org/abs/1803.00101, 2018.
-
Is Q-learning provably efficient?
C. Jin, Z. Allen-Zhu, S. Bubeck, and M. I. Jordan.
In S. Vishwanathan, H. Wallach, H. Larochelle, K. Grauman, and N. Cesa-Bianchi (Eds),
Advances in Neural Information Processing Systems (NeurIPS) 31, 2018.
-
Stochastic cubic regularization for fast nonconvex optimization.
N. Tripuraneni, M. Stern, C. Jin, J. Regier, and M. I. Jordan.
Advances in Neural Information Processing Systems (NeurIPS) 31, 2018.
-
On the local minima of the empirical risk.
C. Jin, L. Liu, R. Ge, and M. I. Jordan.
In S. Vishwanathan, H. Wallach, H. Larochelle, K. Grauman, and N. Cesa-Bianchi (Eds),
Advances in Neural Information Processing Systems (NeurIPS) 31, 2018.
-
Gen-Oja: Simple and efficient algorithm for streaming generalized
eigenvector computation.
K. Bhatia, A. Pacchiano, N. Flammarion, P. Bartlett, and M. I. Jordan.
In S. Vishwanathan, H. Wallach, H. Larochelle, K. Grauman, and N. Cesa-Bianchi (Eds),
Advances in Neural Information Processing Systems (NeurIPS) 31, 2018.
-
Theoretical guarantees for EM under misspecified Gaussian mixture models.
R. Dwivedi, N. Ho, K. Khamaru, M. Wainwright, and M. I. Jordan.
In S. Vishwanathan, H. Wallach, H. Larochelle, K. Grauman, and N. Cesa-Bianchi (Eds),
Advances in Neural Information Processing Systems (NeurIPS) 31, 2018.
-
Generalized zero-shot learning with deep calibration network.
S. Liu, M. Long, J. Wang, and M. I. Jordan.
In S. Vishwanathan, H. Wallach, H. Larochelle, K. Grauman, and N. Cesa-Bianchi (Eds),
Advances in Neural Information Processing Systems (NeurIPS) 31, 2018.
-
Conditional adversarial domain adaptation.
M. Long, Z. Cao, J. Wang, and M. I. Jordan.
In S. Vishwanathan, H. Wallach, H. Larochelle, K. Grauman, and N. Cesa-Bianchi (Eds),
Advances in Neural Information Processing Systems (NeurIPS) 31, 2018.
-
Information constraints on auto-encoding variational Bayes.
R. Lopez, J. Regier, M. I. Jordan, and N. Yosef.
In S. Vishwanathan, H. Wallach, H. Larochelle, K. Grauman, and N. Cesa-Bianchi (Eds),
Advances in Neural Information Processing Systems (NeurIPS) 31, 2018.
2017
-
Minimax optimal procedures for locally private estimation.
J. Duchi, M. I. Jordan, and M. Wainwright.
Journal of the American Statistical Association, 113, 182-201, 2017.
-
Perturbed iterate analysis for asynchronous stochastic optimization.
H. Mania, X. Pan, D. Papailiopoulos, B. Recht, K. Ramchandran, and M. I. Jordan.
SIAM Journal on Optimization, 27, 2202-2229, 2017.
-
Finite size corrections and likelihood ratio fluctuations in the spiked Wigner model.
A. El Alaoui, F. Krzakala, and M. I. Jordan.
arxiv.org/abs/1710.02903, 2017.
-
Measuring cluster stability for Bayesian nonparametrics using the linear
bootstrap.
R. Giordano, R. Liu, N. Varoquaux, M. I. Jordan, and T. Broderick.
arxiv.org/abs/1712.01435, 2017.
-
A deep generative model for single-cell RNA sequencing with application
to detecting differentially expressed genes.
R. Lopez, J. Regier, M. Cole, M. I. Jordan, and N. Yosef.
arxiv.org/abs/1710.05086, 2017.
-
Partial transfer learning with selective adversarial networks.
Z. Cao, M. Long, J. Wang, and M. I. Jordan.
arxiv.org/abs/1707.07901, 2017.
-
A Berkeley view of systems challenges for AI.
I. Stoica, D. Song, R. A. Popa, D. Patterson, M. Mahoney, R. Katz,
A. Joseph, M. I. Jordan, J. M. Hellerstein, J. Gonzalez, K. Goldberg,
A. Ghodsi, D. Culler, and P. Abbeel.
arxiv.org/abs/1712.05855, 2017.
-
Conditional adversarial domain adaptation.
M. Long, Z. Cao, J. Wang, and M. I. Jordan.
arxiv.org/abs/1705.10667, 2017.
-
Mining massive amounts of genomic data: A semiparametric topic modeling approach.
E. Fang, M-D. Li, M. I. Jordan, and H. Liu.
Journal of the American Statistical Association, 112, 921-932, 2017.
-
Domain adaptation with randomized multilinear adversarial networks.
M. Long, Z. Cao, J. Wang, and M. I. Jordan.
arxiv.org/abs/1705.10667, 2017.
-
Real-time machine learning: The missing pieces.
R. Nishihara, P. Moritz, S. Wang, A. Tumanov, W. Paul, J. Schleier-Smith,
R. Liaw, M. I. Jordan and I. Stoica.
16th Workshop on Hot Topics in Operating Systems (HotOS XVI), Whistler, Canada, 2017.
-
How to escape saddle points efficiently.
C. Jin, R. Ge, P. Netrapalli, S. Kakade, and M. I. Jordan.
In D. Precup and Y. W. Teh (Eds),
Proceedings of the 34th International Conference on Machine
Learning (ICML), Sydney, Australia, 2017.
-
Breaking locality accelerates block Gauss-Seidel.
S. Tu, S. Venkataraman, A. Wilson, A. Gittens, M. I. Jordan, and B. Recht.
In D. Precup and Y. W. Teh (Eds),
Proceedings of the 34th International Conference on Machine
Learning (ICML), Sydney, Australia, 2017.
-
Deep transfer learning with joint adaptation networks.
M. Long, H. Zhu, J. Wang, and M. I. Jordan.
Proceedings of the 34th International Conference on Machine
Learning (ICML), Sydney, Australia, 2017.
-
Optimal prediction for sparse linear models? Lower bounds for
coordinate-separable M-estimators.
Y. Zhang, M. Wainwright, and M. I. Jordan.
Electronic Journal of Statistics, 11, 752-799, 2017.
-
QuTE algorithms for decentralized decision making on networks with false
discovery rate control.
A. Ramdas, J. Chen, M. Wainwright, and M. I. Jordan.
56th IEEE Conference on Decision and Control, 2017.
-
Less than a single pass: Stochastically controlled stochastic gradient.
L. Lei and M. I. Jordan.
In A. Singh and J. Zhu (Eds.),
Proceedings of the Twentieth Conference on Artificial
Intelligence and Statistics (AISTATS), 2017.
[Supplementary info]
-
On the learnability of fully-connected neural networks.
Y. Zhang, J. Lee, M. Wainwright, and M. I. Jordan.
In A. Singh and J. Zhu (Eds.),
Proceedings of the Twentieth Conference on Artificial
Intelligence and Statistics (AISTATS), 2017.
-
Gradient descent can take exponential time to escape saddle points.
S. Du, C. Jin, J. Lee, M. I. Jordan, B. Poczos, and A. Singh.
In S. Bengio, R. Fergus, S. Vishwanathan, and H. Wallach (Eds),
Advances in Neural Information Processing Systems (NIPS) 30, 2017.
-
Nonconvex finite-sum optimization via SCSG methods.
L. Lei, C. Ju, J. Chen, and M. I. Jordan.
In S. Bengio, R. Fergus, S. Vishwanathan, and H. Wallach (Eds),
Advances in Neural Information Processing Systems (NIPS) 30, 2017.
-
Online control of the false discovery rate with decaying memory.
A. Ramdas, F. Yang, M. Wainwright, and M. I. Jordan.
In S. Bengio, R. Fergus, S. Vishwanathan, and H. Wallach (Eds),
Advances in Neural Information Processing Systems (NIPS) 30, 2017.
-
Fast black-box variational inference through stochastic trust-region optimization.
J. Regier, M. I. Jordan, and J. McAuliffe.
In S. Bengio, R. Fergus, S. Vishwanathan, and H. Wallach (Eds),
Advances in Neural Information Processing Systems (NIPS) 30, 2017.
-
Kernel feature selection via conditional covariance minimization.
J. Chen, M. Stern, M. Wainwright, and M. I. Jordan.
In S. Bengio, R. Fergus, S. Vishwanathan, and H. Wallach (Eds),
Advances in Neural Information Processing Systems (NIPS) 30, 2017.
-
A marked Poisson process driven latent shape model for 3D segmentation of
reflectance confocal microscopy image stacks of human skin.
S. Ghanta, M. I. Jordan, K. Kose, D. Brooks, J. Rajadhyaksha, and J. Dy.
IEEE Transactions on Image Processing, 26, 172-184, 2017.
-
Distributed optimization with arbitrary local solvers.
C. Ma, J. Konecny, M. Jaggi, V. Smith, M. I. Jordan, P Richtarik, and M. Takac.
Optimization Methods and Software, 32, 813-848, 2017.
[Most Read Paper Award].
2016
-
A variational perspective on accelerated methods in optimization.
A. Wibisono, A. Wilson, and M. I. Jordan.
Proceedings of the National Academy of Sciences, 113, E7351-E7358, 2016.
[ArXiv version]
-
On the computational complexity of high-dimensional Bayesian variable selection.
Y. Yang, M. Wainwright, and M. I. Jordan.
Annals of Statistics, 44, 2497-2532, 2016.
-
Fast measurements of robustness to changing priors in variational Bayes.
R. Giordano, T. Broderick, and M. I. Jordan.
arXiv:1611.07649, 2016.
-
Fast robustness quantification with variational Bayes.
R. Giordano, T. Broderick, R. Meager, J. Huggins, and M. I. Jordan.
arXiv:1606.07153, 2016.
-
A constructive definition of the beta process.
J. Paisley and M. I. Jordan.
arXiv:1604.00685, 2016.
-
Universality of Mallows' and degeneracy of Kendall's kernels for rankings.
H. Mania, A. Ramdas, M. Wainwright, M. I. Jordan, and B. Recht.
arXiv:1603.04245, 2016.
-
Spectral methods meet EM: A provably optimal algorithm for crowdsourcing.
Y. Zhang, X. Chen, D. Zhou, and M. I. Jordan.
Journal of Machine Learning Research, 17, 1-44, 2016.
-
Gradient descent converges to minimizers.
J. Lee, M. Simchowitz, M. I. Jordan, and B. Recht.
Proceedings of the Conference on Learning Theory (COLT),
New York, NY, 2016.
-
Asymptotic behavior of l_p-based Laplacian regularization in
semi-supervised learning.
A. El Alaoui, X. Cheng, A. Ramdas, M. Wainwright and M. I. Jordan.
Proceedings of the Conference on Learning Theory (COLT),
New York, NY, 2016.
-
A kernelized Stein discrepancy for goodness-of-fit tests and model evaluation.
Q. Liu, J. Lee, and M. I. Jordan.
Proceedings of the 33rd International Conference on Machine
Learning (ICML), New York, NY, 2016.
-
l_1-regularized neural networks are improperly learnable in polynomial time.
Y. Zhang, J. Lee, and M. I. Jordan.
Proceedings of the 33rd International Conference on Machine
Learning (ICML), New York, NY, 2016.
-
A linearly-convergent stochastic L-BFGS algorithm.
P. Moritz, R. Nishihara, and M. I. Jordan.
Proceedings of the Eighteenth Conference on Artificial
Intelligence and Statistics (AISTATS), Cadiz, Spain, 2016.
-
High-dimensional continuous control using generalized advantage estimation.
J. Schulman, P. Moritz, S. Levine, M. I. Jordan, and P. Abbeel.
International Conference on Learning Representations (ICLR),
Puerto Rico, 2016.
-
SparkNet: Training deep networks in Spark.
P. Moritz, R. Nishihara, I. Stoica and M. I. Jordan.
International Conference on Learning Representations (ICLR),
Puerto Rico, 2016.
-
The constrained Laplacian rank algorithm for graph-based clustering.
F. Nie, X. Wang, M. I. Jordan, and H. Huang.
In Proceedings of the Thirtieth Conference on Artificial Intelligence (AAAI),
Phoenix, AZ, 2016.
-
CYCLADES: Conflict-free asynchronous machine learning.
X. Pan, M. Lam, S. Tu, D. Papailiopoulos, C. Zhang, M. I. Jordan,
K. Ramchandran, C. Re, and B. Recht.
In U. von Luxburg, I. Guyon, D. Lee, M. Sugiyama (Eds.),
Advances in Neural Information Processing Systems (NIPS) 29, 2016.
-
Local maxima in the likelihood of Gaussian mixture models:
Structural results and algorithmic consequences.
C. Jin, Y. Zhang, S. Balakrishnan, M. Wainwright, and M. I. Jordan.
In U. von Luxburg, I. Guyon, D. Lee, M. Sugiyama (Eds.),
Advances in Neural Information Processing Systems (NIPS) 29, 2016.
-
Unsupervised domain adaptation with residual transfer networks.
M. Long, H. Zhu, J. Wang, and M. I. Jordan.
In U. von Luxburg, I. Guyon, D. Lee, M. Sugiyama (Eds.),
Advances in Neural Information Processing Systems (NIPS) 29, 2016.
2015
-
Machine learning: Trends, perspectives, and prospects.
M. I. Jordan and T. Mitchell.
Science, 349, 255-260, 2015.
-
Nested hierarchical Dirichlet processes.
J. Paisley, C. Wang, D. Blei, and M. I. Jordan.
IEEE Transactions on Pattern Analysis and Machine Intelligence,
37, 256-270, 2015.
-
Combinatorial clustering and the beta negative binomial process.
T. Broderick, L. Mackey, J. Paisley and M. I. Jordan.
IEEE Transactions on Pattern Analysis and Machine Intelligence,
37, 290-306, 2015.
-
Distributed matrix completion and robust factorization.
L. Mackey, A. Talwalkar and M. I. Jordan.
Journal of Machine Learning Research, 16, 913-960, 2015.
-
Optimal rates for zero-order optimization: the power of two function evaluations.
J. Duchi, M. I. Jordan, M. Wainwright, and A. Wibisono.
IEEE Transactions on Information Theory, 61, 2788-2806, 2015.
-
Robust inference with variational Bayes.
R. Giordano, T. Broderick, and M. I. Jordan.
arXiv:1512.02578, 2015.
-
Learning halfspaces and neural networks with random initialization.
Y. Zhang, J. Lee, M. Wainwright and M. I. Jordan.
arXiv:1511.07948, 2015.
-
Asynchronous complex analytics in a distributed dataflow architecture.
J. Gonzalez, P. Bailis, M. I. Jordan, M. Franklin, J. Hellerstein, A. Ghodsi, and I. Stoica.
arXiv:1510.07092, 2015.
-
Splash: User-friendly programming interface for parallelizing stochastic algorithms.
Y. Zhang and M. I. Jordan.
arXiv:1506.07552, 2015.
-
Trust region policy optimization.
J. Schulman, P. Moritz, S. Levine, M. I. Jordan, and P. Abbeel.
In F. Bach and D. Blei (Eds.),
Proceedings of the 32nd International Conference on Machine
Learning (ICML), Lille, France, 2015.
[Long version]
-
Adding vs. averaging in distributed primal-dual optimization.
C. Ma, V. Smith, M. Jaggi, M. I. Jordan, P. Richtarik, and M. Takac.
In F. Bach and D. Blei (Eds.),
Proceedings of the 32nd International Conference on Machine
Learning (ICML), Lille, France, 2015.
[Long version]
-
A general analysis of the convergence of ADMM.
R. Nishihara, L. Lessard, B. Recht, A. Packard, and M. I. Jordan.
In F. Bach and D. Blei (Eds.),
Proceedings of the 32nd International Conference on Machine
Learning (ICML), Lille, France, 2015.
[Long version]
-
Learning transferable features with deep adaptation networks.
M. Long, Y. Cao, J. Wang, and M. I. Jordan.
In F. Bach and D. Blei (Eds.),
Proceedings of the 32nd International Conference on Machine
Learning (ICML), Lille, France, 2015.
-
Distributed estimation of generalized matrix rank: Efficient algorithms and lower bounds.
Y. Zhang, M. Wainwright, and M. I. Jordan.
In F. Bach and D. Blei (Eds.),
Proceedings of the 32nd International Conference on Machine
Learning (ICML), Lille, France, 2015.
-
Automating model search for large scale machine learning.
E. Sparks, A. Talwalkar, D. Haas, M. Franklin, M. I. Jordan, and T. Kraska.
ACM Symposium on Cloud Computing (SOCC), Kohala Coast, Hawaii, 2015.
-
TuPAQ: An efficient planner for large-scale predictive analytic queries.
E. Sparks, A. Talwalkar, M. J. Franklin, M. I. Jordan, and T. Kraska.
arXiv:1502.00068, 2015.
-
Parallel correlation clustering on big graphs.
X. Pan, D. Papailiopoulos, S. Oymak, B. Recht, K. Ramchandran, and M. I. Jordan.
In D. Lee, M. Sugiyama, C. Cortes and N. Lawrence (Eds.),
Advances in Neural Information Processing Systems (NIPS) 28, 2015.
-
On the accuracy of self-normalized log-linear models.
J. Andreas, M. Rabinovich, D. Klein, and M. I. Jordan.
In D. Lee, M. Sugiyama, C. Cortes and N. Lawrence (Eds.),
Advances in Neural Information Processing Systems (NIPS) 28, 2015.
-
Variational consensus Monte Carlo.
M. Rabinovich, E. Angelino, and M. I. Jordan.
In D. Lee, M. Sugiyama, C. Cortes and N. Lawrence (Eds.),
Advances in Neural Information Processing Systems (NIPS) 28, 2015.
-
Linear response methods for accurate covariance estimates from mean field
variational Bayes.
R. Giordano, T. Broderick, and M. I. Jordan.
In D. Lee, M. Sugiyama, C. Cortes and N. Lawrence (Eds.),
Advances in Neural Information Processing Systems (NIPS) 28, 2015.
-
Optimism-driven exploration for nonlinear systems.
T. Moldovan, S. Levine, M. I. Jordan, and P. Abbeel.
In IEEE International Conference on Robotics and Automation (ICRA),
Seattle, WA, 2015.
2014
-
Matrix concentration inequalities via the method of exchangeable pairs.
L. Mackey, M. I. Jordan, R. Y. Chen, B. Farrell and J. A. Tropp.
Annals of Probability, 42, 906-945, 2014.
-
A scalable bootstrap for massive data.
A. Kleiner, A. Talwalkar, P. Sarkar and M. I. Jordan.
Journal of the Royal Statistical Society, Series B,
76, 795-816, 2014.
-
Privacy aware learning.
J. Duchi, M. I. Jordan, and M. Wainwright.
Journal of the ACM, 61, http://dx.doi.org/10.1145/2666468, 2014.
-
Joint modeling of multiple time series via the beta process with
application to motion capture segmentation.
E. Fox, M. Hughes, E. Sudderth, and M. I. Jordan.
Annals of Applied Statistics, 8, 1281-1313, 2014.
-
Nonparametric link prediction in large scale dynamic networks.
P. Sarkar, D. Chakrabarti, and M. I. Jordan.
Electronic Journal of Statistics, 8, 2022-2065, 2014.
-
Particle Gibbs with ancestral sampling.
F. Lindsten, M. I. Jordan, and T. Schön.
Journal of Machine Learning Research,
15, 2145-2184, 2014.
-
Iterative discovery of multiple alternative clustering views.
D. Niu, J. Dy, and M. I. Jordan.
IEEE Transactions on Pattern Analysis and Machine Intelligence,
36, 1340-1353, 2014.
-
Matrix-variate Dirichlet process priors with applications.
Z. Zhang, D. Wang, G. Dai, and M. I. Jordan.
Bayesian Analysis, 9, 259-286, 2014.
-
SMASH: A benchmarking toolkit for variant calling.
A. Talwalkar, J. Liptrap, J. Newcomb, C. Hartl, J. Terhorst, K. Curtis, M. Bresler,
Y. Song, M. I. Jordan, and D. Patterson.
Bioinformatics,
DOI:10.1093/bioinformatics/btu345, 2014.
-
Optimality guarantees for distributed statistical estimation.
J. Duchi, M. I. Jordan, M. Wainwright, and Y. Zhang.
arXiv:1405.0782, 2014.
-
The missing piece in complex analytics: Low latency, scalable model
management and serving with Velox.
D. Crankshaw, P. Bailis, J. E. Gonzalez, H. Li, Z. Zhang, M. J. Franklin,
A. Ghodsi, and M. I. Jordan.
Conference on Innovative Data Systems Research (CIDR),
Asilomar, CA, 2014.
-
Lower bounds on the performance of polynomial-time algorithms
for sparse linear regression.
Y. Zhang, M. Wainwright, and M. I. Jordan.
Proceedings of the Conference on Learning Theory (COLT),
Barcelona, Spain, 2014.
-
Knowing when you're wrong: Building fast and reliable approximate
query processing systems.
S. Agarwal, H. Milner, A. Kleiner, B. Mozafari, M. I. Jordan,
S. Madden, and I. Stoica.
Proceedings of the 2014 ACM International Conference on Management
of Data (SIGMOD), Snowbird, Utah, 2014.
-
Scaling a crowd-sourced database.
B. Mozafari, P. Sarkar, M. Franklin, M. I. Jordan, and S. Madden.
Proceedings of the 41st International Conference on Very Large Data Bases (VLDB),
Hawaii, USA, 2014.
-
Changepoint analysis for efficient variant calling.
A. Bloniarz, A. Talwalkar, J. Terhorst, M. I. Jordan, D. Patterson,
B. Yu, and Y. Song.
International Conference on Research in Computational
Molecular Biology (RECOMB), Pittsburgh, PA, 2014.
-
Mixed membership models for time series.
E. Fox and M. I. Jordan.
In E. Airoldi, D. Blei, E. A. Erosheva, and S. E. Fienberg (Eds.),
Handbook of Mixed Membership Models and Their Applications,
Chapman & Hall/CRC, 2014.
-
Mixed membership matrix factorization.
L. Mackey, D. Weiss, and M. I. Jordan.
In E. Airoldi, D. Blei, E. A. Erosheva, and S. E. Fienberg (Eds.),
Handbook of Mixed Membership Models and Their Applications,
Chapman & Hall/CRC, 2014.
-
Bayesian nonnegative matrix factorization with stochastic variational inference.
J. Paisley, D. Blei, and M. I. Jordan.
In E. Airoldi, D. Blei, E. A. Erosheva, and S. E. Fienberg (Eds.),
Handbook of Mixed Membership Models and Their Applications,
Chapman & Hall/CRC, 2014.
-
Spectral methods meet EM: A provably optimal algorithm for crowdsourcing.
Y. Zhang, X. Chen, D. Zhou, and M. I. Jordan.
In Z. Ghahramani, M. Welling, C. Cortes and N. Lawrence (Eds.),
Advances in Neural Information Processing Systems (NIPS) 27, 2014.
-
On the convergence rate of decomposable submodular function minimization.
R. Nishihara, S. Jegelka, and M. I. Jordan.
In Z. Ghahramani, M. Welling, C. Cortes and N. Lawrence (Eds.),
Advances in Neural Information Processing Systems (NIPS) 27, 2014.
-
Communication-efficient distributed dual coordinate ascent.
M. Jaggi, V. Smith, M. Takac, J. Terhorst, T. Hofmann, and M. I. Jordan.
In Z. Ghahramani, M. Welling, C. Cortes and N. Lawrence (Eds.),
Advances in Neural Information Processing Systems (NIPS) 27, 2014.
-
Parallel double greedy submodular maximization.
X. Pan, S. Jegelka, J. Gonzalez, J. Bradley, and M. I. Jordan.
In Z. Ghahramani, M. Welling, C. Cortes and N. Lawrence (Eds.),
Advances in Neural Information Processing Systems (NIPS) 27, 2014.
2013
-
Learning dependency-based compositional semantics.
P. Liang, M. I. Jordan, and D. Klein.
Computational Linguistics, 39, 389-446, 2013.
-
Computational and statistical tradeoffs via convex relaxation.
V. Chandrasekaran and M. I. Jordan.
Proceedings of the National Academy of Sciences, 110, E1181-E1190, 2013.
-
Feature allocations, probability functions, and paintboxes.
T. Broderick, J. Pitman, and M. I. Jordan.
Bayesian Analysis, 8, 801-836, 2013.
-
On statistics, computation and scalability.
M. I. Jordan.
Bernoulli, 19, 1378-1390, 2013.
-
The asymptotics of ranking algorithms.
J. Duchi, L. Mackey, and M. I. Jordan.
Annals of Statistics, 41, 2292-2323, 2013.
-
Evolutionary inference via the Poisson indel process.
A. Bouchard-Côté and M. I. Jordan.
Proceedings of the National Academy of Sciences, 110, 1160-1166, 2013.
-
Clusters and features from combinatorial stochastic processes.
T. Broderick, M. I. Jordan, and J. Pitman.
Statistical Science, 28, 289-312, 2013.
-
Bayesian semiparametric Wiener system identification.
F. Lindsten, T. Schön, and M. I. Jordan.
Automatica, 49, 2053-2063, 2013.
-
Cluster forests.
D. Yan, A. Chen, and M. I. Jordan.
Computational Statistics and Data Analysis, 66, 178-192, 2013.
-
Molecular function prediction for a family exhibiting evolutionary tendencies
towards substrate specificity swapping: Recurrence of tyrosine aminotransferase
activity in the Iα subfamily.
K. Muratore, B. Engelhardt, J. Srouji, M. I. Jordan, S. Brenner, and J. Kirsch.
Proteins: Structure, Function, and Bioinformatics, DOI:10.1002/prot.24318, 2013.
-
Local privacy, data processing inequalities, and statistical minimax rates.
J. Duchi, M. I. Jordan, and M. Wainwright.
arXiv:1302.3203, 2013.
-
MLI: An API for distributed machine learning.
E. Sparks, A. Talwalkar, V. Smith, J. Kottalam, X. Pan, J. Gonzalez, M. I. Jordan,
M. Franklin, and T. Kraska.
IEEE International Conference on Data Mining (ICDM),
Dallas, TX, 2013.
-
MAD-Bayes: MAP-based asymptotic derivations from Bayes.
T. Broderick, B. Kulis, and M. I. Jordan.
In S. Dasgupta and D. McAllester (Eds.),
Proceedings of the 30th International Conference on Machine
Learning (ICML), Atlanta, GA, 2013.
[Supplementary information].
-
Efficient ranking from pairwise comparisons.
F. Wauthier, M. I. Jordan, and N. Jojic.
In S. Dasgupta and D. McAllester (Eds.),
Proceedings of the 30th International Conference on Machine
Learning (ICML), Atlanta, GA, 2013.
[Supplementary information].
-
Distributed low-rank subspace segmentation.
L. Mackey, A. Talwalkar, Y. Mu, S-F. Chang, and M. I. Jordan.
IEEE International Conference on Computer Vision (ICCV), Sydney, Australia, 2013.
-
A general bootstrap performance diagnostic.
A. Kleiner, A. Talwalkar, S. Agarwal, M. I. Jordan, and I. Stoica.
ACM Conference on Knowledge Discovery and Data Mining (SIGKDD), Chicago, IL, 2013.
-
Local privacy and minimax bounds: Sharp rates for probability estimation.
J. Duchi, M. I. Jordan, and M. Wainwright.
arXiv:1305.6000, 2013.
-
Optimistic concurrency control for distributed unsupervised learning.
X. Pan, J. Gonzalez, S. Jegelka, T. Broderick, and M. I. Jordan.
In L. Bottou, C. Burges, Z. Ghahramani, and M. Welling (Eds.),
Advances in Neural Information Processing Systems (NIPS) 26, 2013.
-
Information-theoretic lower bounds for distributed statistical estimation
with communication constraints.
Y. Zhang, J. Duchi, M. I. Jordan, and M. Wainwright.
In L. Bottou, C. Burges, Z. Ghahramani, and M. Welling (Eds.),
Advances in Neural Information Processing Systems (NIPS) 26, 2013.
-
Estimation, optimization, and parallelism when data is sparse.
J. Duchi, M. I. Jordan, and B. McMahan.
In L. Bottou, C. Burges, Z. Ghahramani, and M. Welling (Eds.),
Advances in Neural Information Processing Systems (NIPS) 26, 2013.
-
Streaming variational Bayes.
T. Broderick, N. Boyd, A. Wibisono, A. Wilson and M. I. Jordan.
In L. Bottou, C. Burges, Z. Ghahramani, and M. Welling (Eds.),
Advances in Neural Information Processing Systems (NIPS) 26, 2013.
-
Local privacy and minimax bounds: Sharp rates for probability estimation.
J. Duchi, M. I. Jordan, and M. Wainwright.
In L. Bottou, C. Burges, Z. Ghahramani, and M. Welling (Eds.),
Advances in Neural Information Processing Systems (NIPS) 26, 2013.
-
A comparative framework for preconditioned Lasso algorithms.
F. Wauthier, N. Jojic and M. I. Jordan.
In L. Bottou, C. Burges, Z. Ghahramani, and M. Welling (Eds.),
Advances in Neural Information Processing Systems (NIPS) 26, 2013.
2012
-
Phylogenetic inference via sequential Monte Carlo.
A. Bouchard-Côté, S. Sankararaman, and M. I. Jordan.
Systematic Biology, 61, 579-593, 2012.
-
Ergodic mirror descent.
J. C. Duchi, A. Agarwal, M. Johansson, and M. I. Jordan.
SIAM Journal on Optimization, 22, 1549-1578, 2012.
-
EP-GIG priors and applications in Bayesian sparse learning.
Z. Zhang, S. Wang, D. Liu, and M. I. Jordan.
Journal of Machine Learning Research, 13, 2031-2061, 2012.
-
Beta processes, stick-breaking, and power laws.
T. Broderick, M. I. Jordan and J. Pitman.
Bayesian Analysis, 7, 439-476, 2012.
-
Coherence functions with applications in large-margin classification methods.
Z. Zhang, D. Liu, G. Dai, and M. I. Jordan.
Journal of Machine Learning Research, 13, 2705-2734, 2012.
-
A million cancer genome warehouse.
D. Haussler, D. A. Patterson, M. Diekhans, A. Fox, M. I. Jordan, A. D. Joseph,
S. Ma, B. Paten, S. Shenker, T. Sittler and I. Stoica.
Technical Report UCB/EECS-2012-211, Department of EECS,
University of California, Berkeley, 2012.
-
Active learning for crowd-sourced databases.
B. Mozafari, P. Sarkar, M. J. Franklin, M. I. Jordan, and S. Madden.
arXiv:1209.3686, 2012.
-
The Big Data bootstrap.
A. Kleiner, A. Talwalkar, P. Sarkar, and M. I. Jordan.
In J. Langford and J. Pineau (Eds.),
Proceedings of the 29th International Conference on Machine
Learning (ICML), Edinburgh, UK, 2012.
-
Revisiting k-means: New algorithms via Bayesian nonparametrics.
B. Kulis and M. I. Jordan.
In J. Langford and J. Pineau (Eds.),
Proceedings of the 29th International Conference on Machine
Learning (ICML), Edinburgh, UK, 2012.
-
Variational Bayesian inference with stochastic search.
J. Paisley, D. Blei, and M. I. Jordan.
In J. Langford and J. Pineau (Eds.),
Proceedings of the 29th International Conference on Machine
Learning (ICML), Edinburgh, UK, 2012.
-
Nonparametric link prediction in dynamic networks.
P. Sarkar, D. Chakrabarti, and M. I. Jordan.
In J. Langford and J. Pineau (Eds.),
Proceedings of the 29th International Conference on Machine
Learning (ICML), Edinburgh, UK, 2012.
[Appendix].
-
Stick-breaking beta processes and the Poisson process.
J. Paisley, D. Blei, and M. I. Jordan.
In N. Lawrence and M. Girolami (Eds.),
Proceedings of the Fifteenth Conference on Artificial Intelligence
and Statistics (AISTATS), Canary Islands, Spain, 2012.
-
A semiparametric Bayesian approach to Wiener system identification.
F. Lindsten, T. Schön, and M. I. Jordan.
16th IFAC Symposium on System Identification (SYSID), Brussels, Belgium, 2012.
-
Active spectral clustering via iterative uncertainty reduction.
F. Wauthier, N. Jojic, and M. I. Jordan.
18th ACM Conference on Knowledge Discovery and Data Mining
(SIGKDD), Beijing, China, 2012.
-
Small-variance asymptotics for exponential family Dirichlet process mixture models.
K. Jiang, B. Kulis, and M. I. Jordan.
In P. Bartlett, F. Pereira, L. Bottou and C. Burges (Eds.),
Advances in Neural Information Processing Systems (NIPS) 25, 2012.
-
Ancestral sampling for particle Gibbs.
F. Lindsten, M. I. Jordan, and T. Schön.
In P. Bartlett, F. Pereira, L. Bottou and C. Burges (Eds.),
Advances in Neural Information Processing Systems (NIPS) 25, 2012.
-
Finite sample convergence rates of zero-order stochastic optimization methods.
J. Duchi, M. I. Jordan, M. Wainwright, and A. Wibisono.
In P. Bartlett, F. Pereira, L. Bottou and C. Burges (Eds.),
Advances in Neural Information Processing Systems (NIPS) 25, 2012.
-
Privacy aware learning.
J. Duchi, M. I. Jordan, and M. Wainwright.
In P. Bartlett, F. Pereira, L. Bottou and C. Burges (Eds.),
Advances in Neural Information Processing Systems (NIPS) 25, 2012.
[Long version].
2011
-
Union support recovery in high-dimensional multivariate regression.
G. Obozinski, M. J. Wainwright, and M. I. Jordan.
Annals of Statistics, 39, 1-47, 2011.
-
Bayesian inference for queueing networks and modeling of Internet services.
C. Sutton and M. I. Jordan.
Annals of Applied Statistics, 5, 254-282, 2011.
-
Genome-scale phylogenetic function annotation of large and
diverse protein families.
B. Engelhardt, M. I. Jordan, J. Srouji, and S. Brenner.
Genome Research, 21, 1969-1980, 2011.
-
A sticky HDP-HMM with application to speaker diarization.
E. Fox, E. Sudderth, M. I. Jordan, and A. Willsky.
Annals of Applied Statistics, 5, 1020-1056, 2011.
-
Learning low-dimensional signal models.
L. Carin, R. G. Baraniuk, V. Cevher, D. Dunson, M. I. Jordan, G. Sapiro,
and M. B. Wakin.
IEEE Signal Processing Magazine, 28, 39-51, 2011.
-
Bayesian generalized kernel mixed models.
Z. Zhang, G. Dai, and M. I. Jordan.
Journal of Machine Learning Research, 12, 111-139, 2011.
-
Bayesian nonparametric inference of switching linear dynamical models.
E. Fox, E. Sudderth, M. I. Jordan, and A. Willsky.
IEEE Transactions on Signal Processing, 59, 1569-1585, 2011.
-
Nonparametric combinatorial sequence models.
F. Wauthier, M. I. Jordan, and N. Jojic.
Journal of Computational Biology,
18, 1649-1660, 2011.
-
The SCADS Director: Scaling a distributed storage system under stringent
performance requirements.
B. Trushkowsky, P. Bodik, A. Fox, M. Franklin, M. I. Jordan, and D. Patterson.
In 9th USENIX Conference on File and Storage Technologies (FAST '11),
San Jose, CA, 2011.
-
Learning dependency-based compositional semantics.
P. Liang, M. I. Jordan, and D. Klein.
The 49th Annual Meeting of the Association for Computational Linguistics (ACL),
Portland, OR, 2011.
[Long version].
-
Nonparametric Bayesian co-clustering ensembles.
P. Wang, K. B. Laskey, C. Domeniconi, and M. I. Jordan.
SIAM International Conference on Data Mining (SDM), Phoenix, AZ, 2011.
-
Dimensionality reduction for spectral clustering.
D. Niu, J. Dy, and M. I. Jordan.
In G. Gordon and D. Dunson (Eds.),
Proceedings of the Fourteenth Conference on Artificial Intelligence
and Statistics (AISTATS), Ft. Lauderdale, FL, 2011.
-
Nonparametric combinatorial sequence models.
F. Wauthier, M. I. Jordan, and N. Jojic.
15th Annual International Conference on Research in Computational Molecular Biology (RECOMB),
Vancouver, BC, 2011.
-
Message from the President: Visualizing Bayesians.
M. I. Jordan.
ISBA Bulletin, 18(3), 1-2, 2011.
-
Supervised hierarchical Pitman-Yor process for natural scene segmentation.
A. Shyr, T. Darrell, M. I. Jordan, and R. Urtasun.
IEEE Conference on Computer Vision and Pattern Recognition (CVPR),
Colorado Springs, CO, 2011.
-
A unified probabilistic model for global and local unsupervised feature selection.
Y. Guan, J. Dy, and M. I. Jordan.
In L. Getoor and T. Scheffer (Eds.),
Proceedings of the 28th International Conference on Machine
Learning (ICML), Bellevue, WA, 2011.
-
Message from the President: The era of Big Data.
M. I. Jordan.
ISBA Bulletin, 18(2), 1-3, 2011.
-
Managing data transfers in computer clusters with Orchestra.
M. Chowdhury, M. Zaharia, J. Ma, M. I. Jordan, and I. Stoica.
ACM SIGCOMM, Toronto, Canada, 2011.
-
Visually relating gene expression and in vivo DNA binding data.
M.-Y. Huang, L. Mackey, S. Keranen, G. Weber, M. I. Jordan, D. Knowles,
M. Biggin, and B. Hamann.
IEEE International Conference on Bioinformatics and Biomedicine (IEEE BIBM),
Atlanta, GA, 2011.
-
Message from the President: What are the open problems in Bayesian statistics?
M. I. Jordan.
ISBA Bulletin, 18(1), 1-4, 2011.
-
Ergodic subgradient descent.
J. C. Duchi, A. Agarwal, M. Johansson, and M. I. Jordan.
Forty-Ninth Annual Allerton Conference on Communication,
Control, and Computing, Urbana-Champaign, IL, 2011.
-
Bayesian bias mitigation for crowdsourcing.
F. L. Wauthier and M. I. Jordan.
In J. Shawe-Taylor, R. Zemel, P. Bartlett and F. Pereira (Eds.),
Advances in Neural Information Processing Systems (NIPS) 24, 2011.
-
Divide-and-conquer matrix factorization.
L. Mackey, A. Talwalkar and M. I. Jordan.
In J. Shawe-Taylor, R. Zemel, P. Bartlett and F. Pereira (Eds.),
Advances in Neural Information Processing Systems (NIPS) 24, 2011.
[Long version].
2010
-
Bayesian nonparametric learning: Expressive priors for intelligent systems.
M. I. Jordan.
In R. Dechter, H. Geffner, and J. Halpern (Eds.),
Heuristics, Probability and Causality: A Tribute to Judea Pearl,
College Publications, 2010.
-
Hierarchical models, nested models and completely random measures.
M. I. Jordan.
In M.-H. Chen, D. Dey, P. Mueller, D. Sun, and K. Ye (Eds.),
Frontiers of Statistical Decision Making and Bayesian
Analysis: In Honor of James O. Berger,
New York: Springer, 2010.
-
Feature space resampling for protein conformational search.
B. Blum, M. I. Jordan, and D. Baker.
Proteins: Structure, Function, and Bioinformatics,
78, 1583-1593, 2010.
[Supplementary information].
-
Neighbor-dependent Ramachandran probability distributions of amino acids
developed from a hierarchical Dirichlet process model.
D. Ting, G. Wang, M. Shapovalov, R. Mitra, M. I. Jordan, and R. Dunbrack.
PLoS Computational Biology, 6, e1000763, 2010.
-
The nested Chinese restaurant process and Bayesian inference of topic hierarchies.
D. M. Blei, T. Griffiths, and M. I. Jordan.
Journal of the ACM, 57, 1-30, 2010.
[Software].
-
Estimating divergence functionals and the likelihood ratio by convex
risk minimization.
X. Nguyen, M. J. Wainwright and M. I. Jordan.
IEEE Transactions on Information Theory, 56, 5847-5861, 2010.
-
Joint covariate selection and joint subspace selection
for multiple classification problems.
G. Obozinski, B. Taskar, and M. I. Jordan.
Statistics and Computing, 20, 231-252, 2010.
-
Convex and semi-nonnegative matrix factorizations.
C. Ding, T. Li, and M. I. Jordan.
IEEE Transactions on Pattern Analysis and Machine Intelligence,
32, 45-55, 2010.
-
Active site prediction using evolutionary and structural information.
S. Sankararaman, F. Sha, J. Kirsch, M. I. Jordan, and K. Sjolander.
Bioinformatics, 26, 617-624, 2010.
-
Regularized discriminant analysis, ridge regression and beyond.
Z. Zhang, G. Dai, C. Xu, and M. I. Jordan.
Journal of Machine Learning Research, 11, 2141-2170, 2010.
-
Bayesian nonparametric methods for learning Markov switching processes.
E. Fox, E. Sudderth, M. I. Jordan, and A. Willsky.
IEEE Signal Processing Magazine, 27, 43-54, 2010.
-
Leo Breiman.
M. I. Jordan.
Annals of Applied Statistics, 4, 1642-1643, 2010.
-
Hierarchical Bayesian nonparametric models with applications.
Y. W. Teh and M. I. Jordan.
In N. Hjort, C. Holmes, P. Mueller, and S. Walker (Eds.),
Bayesian Nonparametrics: Principles and Practice,
Cambridge, UK: Cambridge University Press, 2010.
-
Probabilistic grammars and hierarchical Dirichlet processes.
P. Liang, M. I. Jordan, and D. Klein.
In T. O'Hagan and M. West (Eds.),
The Handbook of Applied Bayesian Analysis,
Oxford University Press, 2010.
-
Nonparametrics and graphical models: Discussion of Ickstadt et al.
M. I. Jordan.
In: J. M. Bernardo, M. Bayarri, J. O. Berger, A. P. Dawid, D. Heckerman,
A. F. M. Smith, and M. West (Eds.), Bayesian Statistics 9, 2010.
-
An analysis of the convergence of graph Laplacians.
D. Ting, L. Huang, and M. I. Jordan.
Proceedings of the 27th International Conference on Machine
Learning (ICML), Haifa, Israel, 2010.
-
Multiple non-redundant spectral clustering views.
D. Niu, J. Dy, and M. I. Jordan.
Proceedings of the 27th International Conference on Machine
Learning (ICML), Haifa, Israel, 2010.
-
On the consistency of ranking algorithms.
J. Duchi, L. Mackey, and M. I. Jordan.
Proceedings of the 27th International Conference on Machine
Learning (ICML), Haifa, Israel, 2010.
[Best Student Paper Award].
-
Mixed membership matrix factorization.
L. Mackey, D. Weiss, and M. I. Jordan.
Proceedings of the 27th International Conference on Machine
Learning (ICML), Haifa, Israel, 2010.
[Software].
-
Learning programs: A hierarchical Bayesian approach.
P. Liang, M. I. Jordan, and D. Klein.
Proceedings of the 27th International Conference on Machine
Learning (ICML), Haifa, Israel, 2010.
-
Detecting large-scale system problems by mining console logs.
W. Xu, L. Huang, A. Fox, D. Patterson, and M. I. Jordan.
Proceedings of the 27th International Conference on Machine
Learning (ICML), Haifa, Israel, 2010.
-
Modeling events with cascades of Poisson processes.
A. Simma and M. I. Jordan.
In Uncertainty in Artificial Intelligence (UAI),
Proceedings of the Twenty-Sixth Conference, Catalina Island, CA, 2010.
-
Matrix-variate Dirichlet process mixture models.
Z. Zhang, G. Dai, and M. I. Jordan.
Proceedings of the Thirteenth Conference on Artificial Intelligence
and Statistics (AISTATS), Sardinia, Italy, 2010.
-
Inference and learning in networks of queues.
C. Sutton and M. I. Jordan.
Proceedings of the Thirteenth Conference on Artificial Intelligence
and Statistics (AISTATS), Sardinia, Italy, 2010.
-
Bayesian generalized kernel models.
Z. Zhang, G. Dai, D. Wang, and M. I. Jordan.
Proceedings of the Thirteenth Conference on Artificial Intelligence
and Statistics (AISTATS), Sardinia, Italy, 2010.
-
Characterizing, modeling, and generating workload spikes for stateful services.
P. Bodik, A. Fox, M. Franklin, M. I. Jordan, and D. Patterson.
First ACM Symposium on Cloud Computing (SOCC),
Indianapolis, IN, 2010.
-
Sufficient dimension reduction for visual sequence classification.
A. Shyr, R. Urtasun, and M. I. Jordan.
IEEE Conference on Computer Vision and Pattern Recognition (CVPR),
San Francisco, CA, 2010.
-
Type-based MCMC.
P. Liang, M. I. Jordan, and D. Klein.
The 11th Annual Conference of the North American Chapter of the
Association for Computational Linguistics (NAACL-HLT),
Los Angeles, CA, 2010.
-
Variational inference over combinatorial spaces.
A. Bouchard-Côté and M. I. Jordan.
In J. Lafferty, C. Williams, J. Shawe-Taylor, R. Zemel, and A. Culotta (Eds.),
Advances in Neural Information Processing Systems (NIPS) 23, 2010.
[Supplementary information].
-
Random conic pursuit for semidefinite programming.
A. Kleiner, A. Rahimi, and M. I. Jordan.
In J. Lafferty, C. Williams, J. Shawe-Taylor, R. Zemel, and A. Culotta (Eds.),
Advances in Neural Information Processing Systems (NIPS) 23, 2010.
[Supplementary information].
-
Heavy-tailed process priors for selective shrinkage.
F. L. Wauthier and M. I. Jordan.
In J. Lafferty, C. Williams, J. Shawe-Taylor, R. Zemel, and A. Culotta (Eds.),
Advances in Neural Information Processing Systems (NIPS) 23, 2010.
-
Tree-structured stick breaking for hierarchical data.
R. Adams, Z. Ghahramani, and M. I. Jordan.
In J. Lafferty, C. Williams, J. Shawe-Taylor, R. Zemel, and A. Culotta (Eds.),
Advances in Neural Information Processing Systems (NIPS) 23, 2010.
-
Unsupervised kernel dimension reduction.
M. Wang, F. Sha, and M. I. Jordan.
In J. Lafferty, C. Williams, J. Shawe-Taylor, R. Zemel, and A. Culotta (Eds.),
Advances in Neural Information Processing Systems (NIPS) 23, 2010.
[Supplementary information].
2009
-
On surrogate loss functions and f-divergences.
X. Nguyen, M. J. Wainwright and M. I. Jordan.
Annals of Statistics, 37, 876-904, 2009.
-
Genomic privacy and the limits of individual detection in a pool.
S. Sankararaman, G. Obozinski, M. I. Jordan, and E. Halperin.
Nature Genetics, 41, 965-967, 2009.
-
Kernel dimension reduction in regression.
K. Fukumizu, F. R. Bach, and M. I. Jordan.
Annals of Statistics, 37, 1871-1905, 2009.
-
Joint estimation of gene conversion rates and mean conversion
tract lengths from population SNP data.
J. Yin, M. I. Jordan, and Y. Song.
Bioinformatics, 25, i231-i239, 2009.
-
Nonparametric Bayesian identification of jump systems with sparse dependencies.
E. Fox, E. Sudderth, M. I. Jordan, and A. Willsky.
15th IFAC Symposium on System Identification (SYSID), St. Malo, France, 2009.
-
Large-scale system problems detection by mining console logs.
W. Xu, L. Huang, A. Fox, D. Patterson, and M. I. Jordan.
22nd ACM Symposium on Operating Systems Principles (SOSP),
Big Sky, MT, 2009.
-
Learning semantic correspondences with less supervision.
P. Liang, M. I. Jordan, and D. Klein.
Proceedings of the 47th Annual Meeting of the Association for
Computational Linguistics (ACL), Singapore, 2009.
-
Optimization of structured mean field objectives.
A. Bouchard-Côté and M. I. Jordan.
In Uncertainty in Artificial Intelligence (UAI),
Proceedings of the Twenty-Fifth Conference, Montreal, Canada, 2009.
-
Learning from measurements in exponential families.
P. Liang, M. I. Jordan, and D. Klein.
Proceedings of the 26th International Conference on Machine
Learning (ICML), Montreal, Canada, 2009.
-
Fast approximate spectral clustering.
D. Yan, L. Huang, and M. I. Jordan.
15th ACM Conference on Knowledge Discovery and Data Mining
(SIGKDD), Paris, France, 2009.
[Software].
[Long version].
-
Coherence functions for multicategory margin-based classification methods.
Z. Zhang, M. I. Jordan, W-J. Li, and D-Y. Yeung.
Proceedings of the Twelfth Conference on Artificial Intelligence
and Statistics (AISTATS), Clearwater Beach, FL, 2009.
-
A flexible and efficient algorithm for regularized Fisher discriminant analysis.
Z. Zhang, G. Dai, and M. I. Jordan.
In W. Buntine, M. Grobelnik, D. Mladenic, and J. Shawe-Taylor (Eds.),
Machine Learning and Knowledge Discovery in Databases:
European Conference (ECML PKDD), Bled, Slovenia, 2009.
-
Automatic exploration of datacenter performance regimes.
P. Bodik, R. Griffith, C. Sutton, A. Fox, M. I. Jordan, and D. Patterson.
First Workshop on Automated Control for Datacenters and Clouds (ACDC),
Barcelona, Spain, 2009.
-
Latent variable models for dimensionality reduction.
Z. Zhang and M. I. Jordan.
Proceedings of the Twelfth Conference on Artificial Intelligence
and Statistics (AISTATS), Clearwater Beach, FL, 2009.
-
Online system problem detection by mining patterns of console logs.
W. Xu, L. Huang, A. Fox, D. Patterson, and M. I. Jordan.
IEEE International Conference on Data Mining (ICDM), Miami, FL, 2009.
-
Predicting multiple performance metrics for queries: Better decisions enabled
by machine learning.
A. Ganapathi, H. Kuno, U. Dayal, J. Wiener, A. Fox, M. I. Jordan, and D. Patterson.
IEEE International Conference on Data Engineering (ICDE), Shanghai, China, 2009.
[Ten-Year Influential Paper].
-
Statistical machine learning makes automatic control practical for
Internet datacenters.
P. Bodik, R. Griffith, C. Sutton, A. Fox, M. I. Jordan, and D. Patterson.
Workshop on Hot Topics in Cloud Computing (HotCloud),
San Diego, CA, 2009.
-
Sharing features among dynamical systems with beta processes.
E. Fox, E. Sudderth, M. I. Jordan, and A. S. Willsky.
In Y. Bengio, D. Schuurmans, J. Lafferty and C. Williams (Eds.),
Advances in Neural Information Processing Systems (NIPS) 22, 2009.
-
Nonparametric latent feature models for link prediction.
K. Miller, T. Griffiths, and M. I. Jordan.
In Y. Bengio, D. Schuurmans, J. Lafferty and C. Williams (Eds.),
Advances in Neural Information Processing Systems (NIPS) 22, 2009.
-
An asymptotic analysis of smooth regularizers.
P. Liang, F. Bach, G. Bouchard, and M. I. Jordan.
In Y. Bengio, D. Schuurmans, J. Lafferty and C. Williams (Eds.),
Advances in Neural Information Processing Systems (NIPS) 22, 2009.
2008
-
Graphical models, exponential families, and variational inference.
M. J. Wainwright and M. I. Jordan.
Foundations and Trends in Machine Learning, 1, 1-305, 2008.
[Substantially revised and expanded version of a 2003 technical report.]
-
On the inference of ancestries in admixed populations.
S. Sankararaman, G. Kimmel, E. Halperin, and M. I. Jordan.
Genome Research, 18, 668-675, 2008.
-
Multiway spectral clustering: A maximum margin perspective.
Z. Zhang and M. I. Jordan.
Statistical Science, 23, 383-403, 2008.
-
A dual receptor cross-talk model of G protein-coupled signal transduction.
P. Flaherty, M. A. Radhakrishnan, T. Dinh, M. I. Jordan, and A. P. Arkin.
PLoS Computational Biology, 4, e1000185, 2008.
-
Association mapping and significance estimation via the coalescent.
G. Kimmel, R. Karp, M. I. Jordan, and E. Halperin.
American Journal of Human Genetics, 83, 675-683, 2008.
-
On optimal quantization rules for some sequential decision problems.
X. Nguyen, M. J. Wainwright, and M. I. Jordan.
IEEE Transactions on Information Theory, 54, 3285-3295, 2008.
-
Consistent probabilistic outputs for protein function prediction.
G. Obozinski, C. E. Grant, G. R. G. Lanckriet, M. I. Jordan, and W. S. Noble.
Genome Biology, 9, S7, 2008.
-
Quantitative gene function assignment from genomic datasets in M. musculus.
L. Pena-Castillo, et al.
Genome Biology, 9, S2, 2008.
-
Probabilistic inference in queueing networks.
C. A. Sutton and M. I. Jordan.
Workshop on Tackling Computer Systems Problems with Machine
Learning Techniques (SYSML), 2008.
-
The phylogenetic Indian buffet process: A non-exchangeable nonparametric
prior for latent features.
K. Miller, T. Griffiths and M. I. Jordan.
In Uncertainty in Artificial Intelligence (UAI),
Proceedings of the Twenty-Fourth Conference, 2008.
-
An HDP-HMM for systems with state persistence.
E. Fox, E. Sudderth, M. I. Jordan, and A. Willsky.
Proceedings of the 25th International Conference on Machine
Learning (ICML), Helsinki, Finland, 2008.
-
An analysis of generative, discriminative, and pseudolikelihood estimators.
P. Liang and M. I. Jordan.
Proceedings of the 25th International Conference on Machine
Learning (ICML), Helsinki, Finland, 2008.
[Best Student Paper Award].
-
Nonnegative matrix factorization for combinatorial optimization:
Spectral clustering, graph matching, and clique finding.
C. Ding, T. Li, and M. I. Jordan.
IEEE International Conference on Data Mining (ICDM), 2008.
-
Mining console logs for large-scale system problem detection.
W. Xu, L. Huang, A. Fox, D. Patterson, and M. I. Jordan.
Workshop on Tackling Computer Systems Problems with Machine
Learning Techniques (SYSML), 2008.
-
Spectral clustering for speech separation.
F. R. Bach and M. I. Jordan.
In J. Keshet and S. Bengio (Eds.),
Automatic Speech and Speaker Recognition: Large Margin and
Kernel Methods. New York: John Wiley, 2008.
-
Shared segmentation of natural scenes using dependent Pitman-Yor processes.
E. Sudderth and M. I. Jordan.
In D. Koller, Y. Bengio, D. Schuurmans and L. Bottou (Eds.),
Advances in Neural Information Processing Systems (NIPS) 21, 2008.
-
Efficient inference in phylogenetic InDel trees.
A. Bouchard-Côté, M. I. Jordan, and D. Klein.
In D. Koller, Y. Bengio, D. Schuurmans and L. Bottou (Eds.),
Advances in Neural Information Processing Systems (NIPS) 21, 2008.
-
High-dimensional union support recovery in multivariate regression.
G. Obozinski, M. J. Wainwright and M. I. Jordan.
In D. Koller, Y. Bengio, D. Schuurmans and L. Bottou (Eds.),
Advances in Neural Information Processing Systems (NIPS) 21, 2008.
[Appendix].
-
Nonparametric Bayesian learning of switching linear dynamical systems.
E. B. Fox, E. Sudderth, M. I. Jordan, and A. S. Willsky.
In D. Koller, Y. Bengio, D. Schuurmans and L. Bottou (Eds.),
Advances in Neural Information Processing Systems (NIPS) 21, 2008.
-
Spectral clustering with perturbed data.
L. Huang, D. Yan, M. I. Jordan, and N. Taft.
In D. Koller, Y. Bengio, D. Schuurmans and L. Bottou (Eds.),
Advances in Neural Information Processing Systems (NIPS) 21, 2008.
[Long version].
-
DiscLDA: Discriminative learning for dimensionality reduction and classification.
S. Lacoste-Julien, F. Sha, and M. I. Jordan.
In D. Koller, Y. Bengio, D. Schuurmans and L. Bottou (Eds.),
Advances in Neural Information Processing Systems (NIPS) 21, 2008.
-
Posterior consistency of the Silverman g-prior in Bayesian model choice.
Z. Zhang and M. I. Jordan.
In D. Koller, Y. Bengio, D. Schuurmans and L. Bottou (Eds.),
Advances in Neural Information Processing Systems (NIPS) 21, 2008.
2007
-
A direct formulation for sparse PCA using semidefinite programming.
A. d'Aspremont, L. El Ghaoui, M. I. Jordan, and G. R. G. Lanckriet.
SIAM Review, 49, 434-448, 2007.
[Winner of the 2008 SIAM Activity Group on Optimization Prize].
[Software].
-
A randomization test for controlling population stratification in
whole-genome association studies.
G. Kimmel, M. I. Jordan, E. Halperin, R. Shamir, and R. Karp.
American Journal of Human Genetics, 81, 895-905, 2007.
-
Bayesian haplotype inference via the Dirichlet process.
E. P. Xing, M. I. Jordan and R. Sharan.
Journal of Computational Biology, 14, 267-284, 2007.
-
Hierarchical beta processes and the Indian buffet process.
R. Thibaux and M. I. Jordan.
Proceedings of the Eleventh Conference on Artificial Intelligence
and Statistics (AISTATS), 2007.
-
Regression on manifolds using kernel dimension reduction.
J. Nilsson, F. Sha, and M. I. Jordan.
Proceedings of the 24th International Conference on Machine
Learning (ICML), 2007.
-
The infinite PCFG using hierarchical Dirichlet processes.
P. Liang, S. Petrov, M. I. Jordan, and D. Klein.
Empirical Methods in Natural Language Processing (EMNLP), 2007.
-
A permutation-augmented sampler for DP mixture models.
P. Liang, M. I. Jordan, and B. Taskar.
Proceedings of the 24th International Conference on Machine
Learning (ICML), 2007.
-
Nonparametric estimation of the likelihood ratio and divergence functionals.
X. Nguyen, M. J. Wainwright and M. I. Jordan.
International Symposium on Information Theory (ISIT),
Nice, France, 2007.
-
Learning multiscale representations of natural scenes using
Dirichlet processes.
J. J. Kivinen, E. B. Sudderth, and M. I. Jordan.
IEEE International Conference on Computer Vision (ICCV), 2007.
-
Communication-efficient online detection of network-wide anomalies.
L. Huang, X. Nguyen, M. Garofalakis, J. M. Hellerstein, M. I. Jordan,
A. Joseph, and N. Taft.
26th Annual IEEE Conference on Computer Communications (INFOCOM'07), 2007.
-
Image denoising with nonparametric hidden Markov trees.
J. J. Kivinen, E. B. Sudderth, and M. I. Jordan.
IEEE International Conference on Image Processing (ICIP), 2007.
-
Response-time modeling for resource allocation and energy-informed SLAs.
P. Bodik, C. Sutton, A. Fox, D. Patterson, and M. I. Jordan.
Workshop on Statistical Learning Techniques for Solving Systems Problems,
Whistler, BC, 2007.
-
Solving consensus and semi-supervised clustering problems using
nonnegative matrix factorization.
T. Li, C. Ding, and M. I. Jordan.
IEEE International Conference on Data Mining (ICDM), 2007.
-
Feature selection methods for improving protein structure prediction
with Rosetta.
B. Blum, M. I. Jordan, D. Kim, R. Das, P. Bradley, and D. Baker.
In J. Platt, D. Koller, Y. Singer and A. McCallum (Eds.),
Advances in Neural Information Processing Systems (NIPS) 20, 2007.
-
Agreement-based learning.
P. Liang, D. Klein and M. I. Jordan.
In J. Platt, D. Koller, Y. Singer and A. McCallum (Eds.),
Advances in Neural Information Processing Systems (NIPS) 20, 2007.
-
Estimating divergence functionals and the likelihood ratio by
penalized convex risk minimization.
X. Nguyen, M. J. Wainwright and M. I. Jordan.
In J. Platt, D. Koller, Y. Singer and A. McCallum (Eds.),
Advances in Neural Information Processing Systems (NIPS) 20, 2007.
2006
-
Hierarchical Dirichlet processes.
Y. W. Teh, M. I. Jordan, M. J. Beal and D. M. Blei.
Journal of the American Statistical Association, 101, 1566-1581, 2006.
[Software].
-
Learning spectral clustering, with application to speech separation.
F. R. Bach and M. I. Jordan.
Journal of Machine Learning Research, 7, 1963-2001, 2006.
-
Convexity, classification, and risk bounds.
P. L. Bartlett, M. I. Jordan, and J. D. McAuliffe.
Journal of the American Statistical Association, 101, 138-156,
2006.
-
Log-determinant relaxation for approximate inference in discrete
Markov random fields.
M. J. Wainwright and M. I. Jordan.
IEEE Transactions on Signal Processing, 54, 2099-2109, 2006.
-
Nonparametric empirical Bayes for the Dirichlet process mixture model.
J. D. McAuliffe, D. M. Blei and M. I. Jordan.
Statistics and Computing, 16, 5-14, 2006.
-
Structured prediction, dual extragradient and Bregman projections.
B. Taskar, S. Lacoste-Julien and M. I. Jordan.
Journal of Machine Learning Research, 7, 1627-1653, 2006.
-
Mining the Caenorhabditis Genetic Center bibliography for genes
related to life span.
D. M. Blei, M. I. Jordan, and S. Mian.
BMC Bioinformatics, 7, 250-269, 2006.
-
Bayesian multi-population haplotype inference via a hierarchical
Dirichlet process mixture.
E. P. Xing, K.-A. Song, M. I. Jordan, and Y. W. Teh.
Proceedings of the 23rd International Conference on Machine
Learning (ICML), 2006.
-
Statistical debugging: Simultaneous identification of multiple bugs.
A. Zheng, M. I. Jordan, B. Liblit, M. Naik, and A. Aiken.
Proceedings of the 23rd International Conference on Machine
Learning (ICML), 2006.
-
A statistical graphical model for predicting protein molecular function.
B. Engelhardt, M. I. Jordan, and S. Brenner.
Proceedings of the 23rd International Conference on Machine
Learning (ICML), 2006.
-
Word alignment via quadratic assignment.
S. Lacoste-Julien, B. Taskar, D. Klein, and M. I. Jordan.
Proceedings of the North American Chapter of the Association
for Computational Linguistics Annual Meeting (HLT-NAACL), 2006.
-
Bayesian multicategory support vector machines.
Z. Zhang and M. I. Jordan.
In Uncertainty in Artificial Intelligence (UAI),
Proceedings of the Twenty-Second Conference, 2006.
-
On optimal quantization rules for sequential decision problems.
X. Nguyen, M. J. Wainwright and M. I. Jordan.
International Symposium on Information Theory (ISIT),
Seattle, WA, 2006.
[Long version].
-
Advanced tools for operators at Amazon.com.
P. Bodik, A. Fox, M. I. Jordan, D. Patterson, A. Banerjee,
R. Jagannathan, T. Su, S. Tenginakai, B. Turner, and J. Ingalls.
First Workshop on Hot Topics in Autonomic Computing (HotAC),
Dublin, Ireland, 2006.
-
Comment on 'Support vector machines with applications'.
P. L. Bartlett, M. I. Jordan, and J. D. McAuliffe.
Statistical Science, 21, 341-346, 2006.
-
In-network PCA and anomaly detection.
L. Huang, X. Nguyen, M. Garofalakis, M. I. Jordan, A. Joseph, and N. Taft.
In B. Schoelkopf, J. Platt and T. Hofmann (Eds.),
Advances in Neural Information Processing Systems (NIPS) 19, 2006.
[Long version].
2005
-
Dirichlet processes, Chinese restaurant processes and all that.
M. I. Jordan.
Tutorial presentation at the NIPS Conference, 2005.
-
Subtree power analysis and species selection for comparative genomics.
J. D. McAuliffe, M. I. Jordan, and L. Pachter.
Proceedings of the National Academy of Sciences, 102, 7900-7905, 2005.
-
Variational inference for Dirichlet process mixtures.
D. M. Blei and M. I. Jordan.
Bayesian Analysis, 1, 121-144, 2005.
-
Protein function prediction by Bayesian phylogenomics.
B. E. Engelhardt, M. I. Jordan, K. E. Muratore, and S. E. Brenner.
PLoS Computational Biology, 1, e45, 2005.
-
Nonparametric decentralized detection using kernel methods.
X. Nguyen, M. J. Wainwright, and M. I. Jordan.
IEEE Transactions on Signal Processing, 53, 4053-4066, 2005.
-
Genome-wide requirements for resistance to functionally distinct
DNA-damaging agents.
W. Lee, R. P. St. Onge, M. Proctor, P. Flaherty, M. I. Jordan,
A. P. Arkin, R. W. Davis, C. Nislow, and G. Giaever.
PLoS Genetics, 1, 235-246, 2005.
-
A kernel-based learning approach to ad hoc sensor network localization.
X. Nguyen, M. I. Jordan, and B. Sinopoli.
ACM Transactions on Sensor Networks, 1, 134-152, 2005.
-
Sulfur and nitrogen limitation in Escherichia coli K12:
specific homeostatic responses.
P. Gyaneshwar, O. Paliy, J. McAuliffe, A. Jones, M. I. Jordan, and S. Kustu.
Journal of Bacteriology, 187, 1074-1090, 2005.
-
A latent variable model for chemogenomic profiling.
P. Flaherty, G. Giaever, J. Kumm, M. I. Jordan, and A. P. Arkin.
Bioinformatics, 21, 3286-3293, 2005.
-
Predictive low-rank decomposition for kernel methods.
F. R. Bach and M. I. Jordan.
Proceedings of the 22nd International Conference on Machine
Learning (ICML), 2005.
[Matlab code].
-
The DLR hierarchy of approximate inference.
M. Rosen-Zvi, M. I. Jordan, and A. Yuille.
In Uncertainty in Artificial Intelligence (UAI),
Proceedings of the Twenty-First Conference, 2005.
-
A variational principle for graphical models.
M. J. Wainwright and M. I. Jordan.
New Directions in Statistical Signal Processing: From Systems to Brain.
Cambridge, MA: MIT Press, 2005.
-
Scalable statistical bug isolation.
B. Liblit, M. Naik, A. X. Zheng, A. Aiken, and M. I. Jordan.
ACM SIGPLAN Conference on Programming Language Design and
Implementation (PLDI), 2005.
[Software].
-
A probabilistic interpretation of canonical correlation analysis.
F. R. Bach and M. I. Jordan.
Technical Report 688, Department of Statistics,
University of California, Berkeley, 2005.
-
Extensions of the informative vector machine.
N. D. Lawrence, J. C. Platt, and M. I. Jordan.
In J. Winkler and N. D. Lawrence and M. Niranjan (Eds.),
Proceedings of the Sheffield Machine Learning Workshop,
Lecture Notes in Computer Science, New York: Springer, 2005.
-
Discriminative training of Hidden Markov models for multiple
pitch tracking.
F. R. Bach and M. I. Jordan.
Proceedings of the International Conference on Acoustics,
Speech and Signal Processing (ICASSP), 2005.
-
Multi-instrument musical transcription using a dynamic graphical model.
B. Vogel, M. I. Jordan and D. Wessel.
Proceedings of the International Conference on Acoustics,
Speech and Signal Processing (ICASSP), 2005.
-
Combining visualization and statistical analysis to improve
operator confidence and efficiency for failure detection
and localization.
P. Bodik, G. Friedman, L. Biewald, H. Levine, G. Candea,
K. Patel, G. Tolle, J. Hui, A. Fox, M. I. Jordan, and D. Patterson.
International Conference on Autonomic Computing (ICAC), 2005.
-
On information divergence measures, surrogate loss functions and
decentralized hypothesis testing.
X. Nguyen, M. J. Wainwright and M. I. Jordan.
Forty-third Annual Allerton Conference on Communication,
Control, and Computing, Urbana-Champaign, IL, 2005.
-
Gaussian processes and the null-category noise model.
N. D. Lawrence and M. I. Jordan.
In O. Chapelle, B. Schoelkopf & A. Zien (Eds),
Semi-Supervised Learning, Cambridge, MA: MIT Press, 2005.
-
Semiparametric latent factor models.
Y. W. Teh, M. Seeger, and M. I. Jordan.
Proceedings of the Tenth Conference on Artificial Intelligence
and Statistics (AISTATS), 2005.
-
Robust design of biological experiments.
P. Flaherty, M. I. Jordan and A. P. Arkin.
In Y. Weiss and B. Schoelkopf and J. Platt (Eds.),
Advances in Neural Information Processing Systems
(NIPS) 18, 2005.
-
Structured prediction via the extragradient method.
B. Taskar, S. Lacoste-Julien and M. I. Jordan.
In Y. Weiss and B. Schoelkopf and J. Platt (Eds.),
Advances in Neural Information Processing Systems
(NIPS) 18, 2005.
[Long version].
-
Divergences, surrogate loss functions and experimental design.
X. Nguyen, M. J. Wainwright and M. I. Jordan.
In Y. Weiss and B. Schoelkopf and J. Platt (Eds.),
Advances in Neural Information Processing Systems
(NIPS) 18, 2005.
[Long version].
2004
-
Graphical models.
M. I. Jordan.
Statistical Science (Special Issue on Bayesian Statistics),
19, 140-155, 2004.
-
Multiple-sequence functional annotation and the generalized hidden
Markov phylogeny.
J. D. McAuliffe, L. Pachter, and M. I. Jordan.
Bioinformatics, 20, 1850-1860, 2004.
-
Learning graphical models for stationary time series.
F. R. Bach and M. I. Jordan.
IEEE Transactions on Signal Processing, 52, 2189-2199, 2004.
-
Kalman filtering with intermittent observations.
B. Sinopoli, L. Schenato, M. Franceschetti, K. Poolla,
M. I. Jordan, and S. Sastry.
IEEE Transactions on Automatic Control, 49, 1453-1464, 2004.
-
Chemogenomic profiling: Identifying the functional interactions of
small molecules in yeast.
G. Giaever, P. Flaherty, J. Kumm, M. Proctor, D. F. Jaramillo, A. M. Chu,
M. I. Jordan, A. P. Arkin, and R. W. Davis.
Proceedings of the National Academy of Sciences, 101, 793-798, 2004.
-
A statistical framework for genomic data fusion.
G. R. G. Lanckriet, T. De Bie, N. Cristianini, M. I. Jordan, and W. S. Noble.
Bioinformatics, 20, 2626-2635, 2004.
-
Learning the kernel matrix with semidefinite programming.
G. R. G. Lanckriet, N. Cristianini, L. El Ghaoui, P. L. Bartlett, and M. I. Jordan.
Journal of Machine Learning Research, 5, 27-72, 2004.
-
Dimensionality reduction for supervised learning with reproducing kernel
Hilbert spaces.
K. Fukumizu, F. R. Bach, and M. I. Jordan.
Journal of Machine Learning Research, 5, 73-99, 2004.
-
Robust sparse hyperplane classifiers: application to uncertain
molecular profiling data.
C. Bhattacharyya, L. R. Grate, M. I. Jordan, L. El Ghaoui, and I. S. Mian.
Journal of Computational Biology, 11, 1073-1089, 2004.
[Software]
-
Discussion of boosting.
P. L. Bartlett, M. I. Jordan, and J. D. McAuliffe.
Annals of Statistics, 32, 85-91, 2004.
-
LOGOS: A modular Bayesian model for de novo motif detection.
E. P. Xing, W. Wu, M. I. Jordan, and R. M. Karp.
Journal of Bioinformatics and Computational Biology, 2,
127-154, 2004.
-
Treewidth-based conditions for exactness of the Sherali-Adams
and Lasserre relaxations.
M. J. Wainwright and M. I. Jordan.
Technical Report 671, Department of Statistics,
University of California, Berkeley, 2004.
-
Multiple kernel learning, conic duality, and the SMO algorithm.
F. R. Bach, G. R. G. Lanckriet, and M. I. Jordan.
Proceedings of the 21st International Conference on Machine
Learning (ICML), 2004.
[Long version].
[Software].
[ICML Test of Time Award].
-
Bayesian haplotype inference via the Dirichlet process.
E. P. Xing, R. Sharan, and M. I. Jordan.
Proceedings of the 21st International Conference on Machine
Learning (ICML), 2004.
-
Decentralized detection and classification using kernel methods.
X. Nguyen, M. J. Wainwright, and M. I. Jordan.
Proceedings of the 21st International Conference on Machine
Learning (ICML), 2004.
[Best Paper Award].
-
Variational methods for the Dirichlet process.
D. M. Blei and M. I. Jordan.
Proceedings of the 21st International Conference on Machine
Learning (ICML), 2004.
[Long version].
-
Sparse Gaussian process classification with multiple classes.
M. Seeger and M. I. Jordan.
Technical Report 661, Department of Statistics,
University of California, Berkeley, 2004.
-
Graph partition strategies for generalized mean field inference.
E. P. Xing, M. I. Jordan, and S. Russell.
In Uncertainty in Artificial Intelligence (UAI),
Proceedings of the Twentieth Conference, 2004.
-
Kernel-based data fusion and its application to protein function prediction in yeast.
G. R. G. Lanckriet, M. Deng, N. Cristianini, M. I. Jordan, and W. S. Noble.
Pacific Symposium on Biocomputing (PSB), 2004.
[Supplementary information].
-
Combining statistical monitoring and predictable recovery for
self-management.
A. Fox, E. Kiciman, D. A. Patterson, R. H. Katz and M. I. Jordan.
ACM SIGSOFT Proceedings of the Workshop on Self-Managed Systems
(WOSS), 2004.
-
Public deployment of cooperative bug isolation.
B. Liblit, A. Aiken, A. X. Zheng, and M. I. Jordan.
Workshop on Remote Analysis and
Measurement of Software Systems (RAMSS), 2004.
-
Failure diagnosis using decision trees.
M. Chen, A. X. Zheng, J. Lloyd, M. I. Jordan, and E. Brewer.
International Conference on Autonomic Computing (ICAC), 2004.
-
Sharing clusters among related groups: Hierarchical Dirichlet processes.
Y. W. Teh, M. I. Jordan, M. J. Beal and D. M. Blei.
In L. Saul, Y. Weiss, and L. Bottou (Eds.),
Advances in Neural Information Processing Systems (NIPS) 17, 2004.
[Long version].
[Software]
-
Blind one-microphone speech separation: A spectral learning approach.
F. R. Bach and M. I. Jordan.
In L. Saul, Y. Weiss, and L. Bottou (Eds.),
Advances in Neural Information Processing Systems (NIPS) 17, 2004.
-
A direct formulation for sparse PCA using semidefinite programming.
A. d'Aspremont, L. El Ghaoui, M. I. Jordan, and G. R. G. Lanckriet.
In L. Saul, Y. Weiss, and L. Bottou (Eds.),
Advances in Neural Information Processing Systems (NIPS) 17, 2004.
-
Semi-supervised learning via Gaussian processes.
N. D. Lawrence and M. I. Jordan.
In L. Saul, Y. Weiss, and L. Bottou (Eds.),
Advances in Neural Information Processing Systems (NIPS) 17, 2004.
-
Computing regularization paths for learning multiple kernels.
F. R. Bach, R. Thibaux, and M. I. Jordan.
In L. Saul, Y. Weiss, and L. Bottou (Eds.),
Advances in Neural Information Processing Systems (NIPS) 17, 2004.
[Matlab code]
2003
-
Latent Dirichlet allocation.
D. M. Blei, A. Y. Ng, and M. I. Jordan.
Journal of Machine Learning Research, 3, 993-1022, 2003.
[software].
-
Toward a protein profile of Escherichia coli: Comparison to its transcription
profile.
R. W. Corbin, O. Paliy, F. Yang, J. Shabanowitz, M. Platt, C. E. Lyons,
Jr., K. Root, J. D. McAuliffe, M. I. Jordan, S. Kustu, E. Soupene, and D. F. Hunt.
Proceedings of the National Academy of Sciences, 100, 9232-9237, 2003.
-
Beyond independent components: Trees and clusters.
F. R. Bach and M. I. Jordan.
Journal of Machine Learning Research, 4, 1205-1233, 2003.
[Matlab code]
-
Matching words and pictures.
K. Barnard, P. Duygulu, N. de Freitas, D. A. Forsyth, D. M. Blei, and M. I. Jordan.
Journal of Machine Learning Research, 3, 1107-1135, 2003.
-
Hierarchical Bayesian models for applications in information retrieval.
D. M. Blei, M. I. Jordan and A. Y. Ng.
In: J. M. Bernardo, M. Bayarri, J. O. Berger, A. P. Dawid, D. Heckerman,
A. F. M. Smith, and M. West (Eds.), Bayesian Statistics 7, 2003.
-
Simultaneous relevant feature identification and classification
in high-dimensional spaces: Application to molecular profiling data.
C. Bhattacharyya, L. R. Grate, A. Rizki, D. Radisky, F. J. Molina,
M. I. Jordan, M. J. Bissell, and I. S. Mian. Signal Processing,
83, 729-743, 2003.
-
An introduction to MCMC for machine learning.
C. Andrieu, N. de Freitas, A. Doucet and M. I. Jordan.
Machine Learning, 50, 5-43, 2003.
-
Modeling annotated data.
D. M. Blei and M. I. Jordan.
26th International Conference on Research and Development
in Information Retrieval (SIGIR), New York: ACM Press, 2003.
[SIGIR Test of Time Honorable Mention].
-
Bug isolation via remote program sampling.
B. Liblit, A. Aiken, A. X. Zheng, and M. I. Jordan.
ACM SIGPLAN 2003 Conference on Programming
Language Design and Implementation (PLDI), San Diego, 2003.
-
Variational inference in graphical models: The view from the marginal
polytope. M. J. Wainwright and M. I. Jordan. Forty-first Annual
Allerton Conference on Communication, Control, and Computing,
Urbana-Champaign, IL, 2003.
-
Kernel-based integration of genomic data using semidefinite programming.
G. R. G. Lanckriet, N. Cristianini, M. I. Jordan, and W. S. Noble.
In B. Schoelkopf, K. Tsuda and J-P. Vert (Eds.), Kernel Methods
in Computational Biology, Cambridge, MA: MIT Press, 2003.
-
On semidefinite relaxation for normalized k-cut and connections to spectral clustering.
E. P. Xing and M. I. Jordan.
Technical Report CSD-03-1265, Computer Science Division,
University of California, Berkeley, 2003.
-
Support vector machines for analog circuit performance representation.
F. De Bernardinis, M. I. Jordan, and A. L. Sangiovanni-Vincentelli.
Proceedings of the Design Automation Conference (DAC), 2003.
-
Semidefinite relaxations for approximate inference on graphs with cycles.
M. J. Wainwright and M. I. Jordan.
In S. Thrun, L. Saul, and B. Schoelkopf (Eds.),
Advances in Neural Information Processing Systems (NIPS) 16,
(long version), 2003.
-
Learning spectral clustering.
F. R. Bach and M. I. Jordan.
In S. Thrun, L. Saul, and B. Schoelkopf (Eds.),
Advances in Neural Information Processing Systems (NIPS) 16,
(long version), 2003.
-
Hierarchical topic models and the nested Chinese restaurant process.
D. M. Blei, T. Griffiths, M. I. Jordan, and J. Tenenbaum.
In S. Thrun, L. Saul, and B. Schoelkopf (Eds.),
Advances in Neural Information Processing Systems (NIPS) 16, 2003.
-
Kernel dimensionality reduction for supervised learning.
K. Fukumizu, F. R. Bach, and M. I. Jordan.
In S. Thrun, L. Saul, and B. Schoelkopf (Eds.),
Advances in Neural Information Processing Systems (NIPS) 16, 2003.
-
Large margin classifiers: convex loss, low noise, and convergence rates.
P. L. Bartlett, M. I. Jordan, and J. D. McAuliffe.
In S. Thrun, L. Saul, and B. Schoelkopf (Eds.),
Advances in Neural Information Processing Systems (NIPS) 16, 2003.
-
On the concentration of expectation and approximate inference in layered
Bayesian networks. X. Nguyen and M. I. Jordan.
In S. Thrun, L. Saul, and B. Schoelkopf (Eds.),
Advances in Neural Information Processing Systems (NIPS) 16,
(long version), 2003.
-
Statistical debugging of sampled programs.
A. X. Zheng, M. I. Jordan, B. Liblit, and A. Aiken.
In S. Thrun, L. Saul, and B. Schoelkopf (Eds.),
Advances in Neural Information Processing Systems (NIPS) 16, 2003.
-
Autonomous helicopter flight via reinforcement learning.
A. Y. Ng, H. J. Kim, M. I. Jordan, and S. Sastry.
In S. Thrun, L. Saul, and B. Schoelkopf (Eds.),
Advances in Neural Information Processing Systems (NIPS) 16, 2003.
-
A generalized mean field algorithm for variational inference in
exponential families.
E. P. Xing, M. I. Jordan, and S. Russell.
In C. Meek and U. Kjaerulff (Eds.),
Uncertainty in Artificial Intelligence (UAI),
Proceedings of the Nineteenth Conference, 2003.
-
Kernel independent component analysis.
F. R. Bach and M. I. Jordan. International Conference on Acoustics,
Speech, and Signal Processing (ICASSP), 2002.
[Long version].
[Matlab code]
-
Kalman filtering with intermittent observations.
B. Sinopoli, L. Schenato, M. Franceschetti, K. Poolla,
M. I. Jordan, and S. Sastry.
42nd IEEE Conference on Decision and Control (CDC), 2003.
-
Integrated analysis of transcript profiling and protein sequence data.
L. R. Grate, C. Bhattacharyya, M. I. Jordan, and I. S. Mian.
Mechanisms of Ageing and Development, 124, 109-114, 2003.
-
Finding clusters in independent component analysis.
F. R. Bach and M. I. Jordan.
Fourth International Symposium on Independent Component Analysis
and Blind Signal Separation (ICA), 2003.
-
Sampling user executions for bug isolation.
B. Liblit, A. Aiken, A. X. Zheng, and M. I. Jordan.
Workshop on Remote Analysis and
Measurement of Software Systems (RAMSS), 2003.
-
LOGOS: A modular Bayesian model for de novo motif detection.
E. P. Xing, W. Wu, M. I. Jordan, and R. M. Karp.
IEEE Computer Society Bioinformatics Conference (CSB), 2003.
2002
-
Kernel independent component analysis.
F. R. Bach and M. I. Jordan. Journal of Machine Learning Research, 3, 1-48, 2002.
[Matlab code]
-
Optimal feedback control as a theory of motor coordination.
E. Todorov and M. I. Jordan. Nature Neuroscience, 5, 1226-1235, 2002.
[Supplementary information].
[News and views].
-
A robust minimax approach to classification.
G. R. G. Lanckriet, L. El Ghaoui, C. Bhattacharyya, and M. I. Jordan.
Journal of Machine Learning Research, 3, 552-582, 2002.
[Matlab code]
-
Sensorimotor adaptation of speech I: Compensation and adaptation.
J. F. Houde and M. I. Jordan. Journal of Speech, Language,
and Hearing Research, 45, 239-262, 2002.
-
Graphical models: Probabilistic inference.
M. I. Jordan and Y. Weiss. In M. Arbib (Ed.),
The Handbook of Brain Theory and Neural Networks, 2nd edition.
Cambridge, MA: MIT Press, 2002.
-
Loopy belief propagation and Gibbs measures.
S. Tatikonda and M. I. Jordan.
In D. Koller and A. Darwiche (Eds.), Uncertainty in Artificial Intelligence (UAI),
Proceedings of the Eighteenth Conference, 2002.
-
Tree-dependent component analysis.
F. R. Bach and M. I. Jordan.
In D. Koller and A. Darwiche (Eds.), Uncertainty in Artificial Intelligence (UAI),
Proceedings of the Eighteenth Conference, 2002.
[Matlab code]
-
Random sampling of a continuous-time stochastic dynamical system.
M. Micheli and M. I. Jordan.
Proceedings of the Fifteenth International Symposium on Mathematical Theory
of Networks and Systems, 2002.
-
Learning the kernel matrix with semidefinite programming.
G. R. G. Lanckriet, P. L. Bartlett, N. Cristianini, L. El Ghaoui, and M. I. Jordan.
Machine Learning: Proceedings of the Nineteenth International Conference
(ICML),
San Mateo, CA: Morgan Kaufmann, 2002.
-
Learning graphical models with Mercer kernels.
F. R. Bach and M. I. Jordan.
In S. Becker, S. Thrun, and K. Obermayer (Eds.),
Advances in Neural Information Processing Systems (NIPS) 15, 2002.
-
Robust novelty detection with single-class MPM.
G. R. G. Lanckriet, L. El Ghaoui, and M. I. Jordan.
In S. Becker, S. Thrun, and K. Obermayer (Eds.),
Advances in Neural Information Processing Systems (NIPS) 15, 2002.
-
A minimal intervention principle for coordinated movement.
E. Todorov and M. I. Jordan.
In S. Becker, S. Thrun, and K. Obermayer (Eds.),
Advances in Neural Information Processing Systems (NIPS) 15, 2002.
-
Distance metric learning, with application to clustering with side-information.
E. P. Xing, A. Y. Ng, M. I. Jordan and S. Russell.
In S. Becker, S. Thrun, and K. Obermayer (Eds.),
Advances in Neural Information Processing Systems (NIPS) 15, 2002.
-
A hierarchical Bayesian Markovian model for motifs in biopolymer sequences.
E. P. Xing, M. I. Jordan, R. M. Karp and S. Russell.
In S. Becker, S. Thrun, and K. Obermayer (Eds.),
Advances in Neural Information Processing Systems (NIPS) 15, 2002.
-
Simultaneous relevant feature identification and classification
in high-dimensional spaces.
L. R. Grate, C. Bhattacharyya, M. I. Jordan and I. S. Mian.
Workshop on Algorithms in Bioinformatics, 2002.
[matlab code],
[perl/lp_solve code].
-
Learning in modular and hierarchical systems.
M. I. Jordan and R. A. Jacobs. In M. Arbib (Ed.),
The Handbook of Brain Theory and Neural Networks, 2nd edition.
Cambridge, MA: MIT Press, 2002.
2001
-
Stable algorithms for link analysis.
A. Y. Ng, A. X. Zheng, and M. I. Jordan. Proceedings of the
24th International Conference on Research and Development
in Information Retrieval (SIGIR), New York, NY: ACM Press, 2001.
-
Efficient stepwise selection in decomposable models.
A. Deshpande, M. N. Garofalakis, and M. I. Jordan.
In J. Breese and D. Koller (Eds.), Uncertainty in Artificial Intelligence (UAI),
Proceedings of the Seventeenth Conference, 2001.
-
Convergence rates of the Voting Gibbs classifier, with application
to Bayesian feature selection.
A. Y. Ng and M. I. Jordan. Machine Learning: Proceedings of the
Eighteenth International Conference, San Mateo, CA: Morgan Kaufmann, 2001.
-
Link analysis, eigenvectors, and stability.
A. Y. Ng, A. X. Zheng, and M. I. Jordan.
International Joint Conference on Artificial Intelligence (IJCAI), 2001.
-
Variational MCMC.
N. de Freitas, P. Højen-Sørensen, M. I. Jordan, and S. Russell.
In J. Breese and D. Koller (Eds.), Uncertainty in Artificial Intelligence (UAI),
Proceedings of the Seventeenth Conference, 2001.
-
Feature selection for high-dimensional genomic microarray data.
E. P. Xing, M. I. Jordan, and R. M. Karp. Machine Learning: Proceedings
of the Eighteenth International Conference, San Mateo, CA: Morgan Kaufmann,
2001.
-
Thin junction trees.
F. R. Bach and M. I. Jordan.
In T. Dietterich, S. Becker and Z. Ghahramani (Eds.),
Advances in Neural Information Processing Systems (NIPS) 14, 2001.
-
On spectral clustering: Analysis and an algorithm.
A. Y. Ng, M. I. Jordan, and Y. Weiss.
In T. Dietterich, S. Becker and Z. Ghahramani
(Eds.), Advances in Neural Information Processing Systems (NIPS) 14, 2001.
-
Minimax probability machine.
G. R. G. Lanckriet, L. El Ghaoui, C. Bhattacharyya, and M. I. Jordan.
In T. Dietterich, S. Becker and Z. Ghahramani
(Eds.), Advances in Neural Information Processing Systems (NIPS) 14, 2001.
-
On discriminative vs. generative classifiers: A comparison of logistic
regression and naive Bayes.
A. Y. Ng and M. I. Jordan.
In T. Dietterich, S. Becker and Z. Ghahramani
(Eds.), Advances in Neural Information Processing Systems (NIPS) 14, 2001.
-
Latent Dirichlet allocation.
D. M. Blei, A. Y. Ng and M. I. Jordan.
In T. Dietterich, S. Becker and Z. Ghahramani
(Eds.), Advances in Neural Information Processing Systems (NIPS) 14, 2001,
[Long version],
[software].
-
Discorsi sulle reti neurali e l'apprendimento.
C. Domeniconi and M. I. Jordan. Milan: Edizioni Franco Angeli, 2001.
2000
-
Learning with mixtures of trees.
M. Meila and M. I. Jordan.
Journal of Machine Learning Research, 1, 1-48, 2000.
-
Attractor dynamics for feedforward neural networks.
L. K. Saul and M. I. Jordan. Neural Computation, 12, 1313-1335, 2000.
-
Bayesian logistic regression: a variational approach.
T. S. Jaakkola and M. I. Jordan. Statistics and Computing, 10, 25-37, 2000.
-
Asymptotic convergence rate of the EM algorithm for Gaussian mixtures.
J. Ma, L. Xu, and M. I. Jordan.
Neural Computation, 12, 2881-2907, 2000.
-
PEGASUS: A policy search method for large MDPs and POMDPs.
A. Y. Ng and M. I. Jordan.
Uncertainty in Artificial Intelligence (UAI),
Proceedings of the Sixteenth Conference, 2000.
1999
-
Mixed memory Markov models: Decomposing complex stochastic processes
as mixture of simpler ones.
L. K. Saul and M. I. Jordan.
Machine Learning, 37, 75-87, 1999.
-
Variational probabilistic inference and the QMR-DT network.
T. S. Jaakkola and M. I. Jordan. Journal of Artificial Intelligence
Research, 10, 291-322, 1999.
-
Are reaching movements planned to be straight and invariant in
the extrinsic space?
M. Desmurget, C. Prablanc, M. I. Jordan, and M. Jeannerod.
Quarterly Journal of Experimental Psychology, 52, 981-1020, 1999.
-
Loopy belief-propagation for approximate inference: An empirical study.
K. Murphy, Y. Weiss, and M. I. Jordan.
In K. B. Laskey and H. Prade (Eds.), Uncertainty in Artificial Intelligence (UAI),
Proceedings of the Fifteenth Conference, San Mateo, CA: Morgan Kaufmann, 1999.
-
Approximate inference algorithms for two-layer Bayesian networks.
A. Y. Ng and M. I. Jordan. Advances in Neural Information Processing
Systems (NIPS) 12, Cambridge MA: MIT Press, 1999.
-
An introduction to variational methods for graphical models.
M. I. Jordan, Z. Ghahramani, T. S. Jaakkola, and L. K. Saul.
In M. I. Jordan (Ed.), Learning in Graphical Models,
Cambridge: MIT Press, 1999.
-
Computational motor control.
M. I. Jordan and D. M. Wolpert.
In M. Gazzaniga (Ed.), The Cognitive Neurosciences, 2nd edition,
Cambridge: MIT Press, 1999.
-
Improving the mean field approximation via the use of mixture
distributions.
T. S. Jaakkola and M. I. Jordan.
In M. I. Jordan (Ed.), Learning in Graphical Models,
Cambridge: MIT Press, 1999.
-
Learning in graphical models.
M. I. Jordan (Ed.),
Cambridge MA: MIT Press, 1999.
-
Recurrent networks.
M. I. Jordan.
In R. A. Wilson and F. C. Keil (Eds.),
The MIT Encyclopedia of the Cognitive Sciences,
Cambridge, MA: MIT Press, 1999.
-
Neural networks.
M. I. Jordan.
In R. A. Wilson and F. C. Keil (Eds.),
The MIT Encyclopedia of the Cognitive Sciences,
Cambridge, MA: MIT Press, 1999.
-
Computational intelligence.
M. I. Jordan and S. Russell.
In R. A. Wilson and F. C. Keil (Eds.),
The MIT Encyclopedia of the Cognitive Sciences,
Cambridge, MA: MIT Press, 1999.
1998
-
Adaptation in speech production.
J. Houde and M. I. Jordan.
Science, 279, 1213-1216, 1998.
-
Smoothness maximization along a predefined path accurately
predicts the speed profiles of complex arm movements.
E. Todorov and M. I. Jordan.
Journal of Neurophysiology, 80, 696-714, 1998.
-
The role of inertial sensitivity in motor planning.
P. N. Sabes, M. I. Jordan and D. M. Wolpert.
Journal of Neuroscience, 18, 5948-5959, 1998.
-
Learning from dyadic data.
T. Hofmann, J. Puzicha, and M. I. Jordan.
In Kearns, M. S., Solla, S. A., and Cohn, D. (Eds.),
Advances in Neural Information Processing Systems (NIPS) 11,
Cambridge MA: MIT Press, 1998.
-
Mixture representations for inference and learning in Boltzmann machines.
N. D. Lawrence, C. M. Bishop and M. I. Jordan.
In G. F. Cooper and S. Moral (Eds.), Uncertainty in Artificial
Intelligence (UAI), Proceedings of the Fourteenth Conference,
San Mateo, CA: Morgan Kaufman, 1998.
1997
-
Factorial hidden Markov models.
Z. Ghahramani and M. I. Jordan.
Machine Learning, 29, 245--273, 1997.
-
Obstacle avoidance and a perturbation sensitivity model for
motor planning.
P. N. Sabes and M. I. Jordan.
Journal of Neuroscience, 17, 7119-7128, 1997.
-
Probabilistic independence networks for hidden Markov probability
models.
P. Smyth, D. Heckerman, and M. I. Jordan.
Neural Computation, 9, 227-270, 1997.
-
Viewing the hand prior to movement improves accuracy of pointing performed
toward the unseen contralateral hand.
M. Desmurget, Y. Rossetti, M. I. Jordan, C. Meckler, and C. Prablanc.
Experimental Brain Research, 115, 180--186, 1997.
-
Constrained and unconstrained movements involve different control strategies.
M. Desmurget, M. I. Jordan, C. Prablanc, and M. Jeannerod.
Journal of Neurophysiology, 77, 1644--1650, 1997.
-
Approximating posterior distributions in belief networks using mixtures.
C. M. Bishop, N. D. Lawrence, T. S. Jaakkola, and M. I. Jordan.
In Jordan, M. I., Kearns, M. J. and Solla, S. A. (Eds.),
Advances in Neural Information Processing Systems (NIPS) 10,
Cambridge, MA: MIT Press, 1997.
-
Estimating dependency structure as a hidden variable.
M. Meila and M. I. Jordan.
In Jordan, M. I., Kearns, M. J. and Solla, S. A. (Eds.),
Advances in Neural Information Processing Systems (NIPS) 10,
Cambridge, MA: MIT Press, 1997.
-
Advances in Neural Information Processing Systems 10,
M. I. Jordan, M. J. Kearns, and S. A. Solla, (Eds.),
Cambridge MA: MIT Press, 1997.
-
Adaptation in speech motor control.
J. F. Houde and M. I. Jordan.
In Jordan, M. I., Kearns, M. J. and Solla, S. A. (Eds.),
Advances in Neural Information Processing Systems (NIPS) 10,
Cambridge, MA: MIT Press, 1997.
-
Neural networks.
M. I. Jordan and C. Bishop.
In Tucker, A. B. (Ed.), CRC Handbook of Computer Science,
Boca Raton, FL: CRC Press, 1997.
-
Computational models of sensorimotor organization.
Z. Ghahramani, D. M. Wolpert, and M. I. Jordan.
In P. Morasso and V. Sanguineti (Eds.),
Self-Organization Computational Maps and Motor Control,
Amsterdam: North-Holland, 1997.
-
Advances in Neural Information Processing Systems 9,
M. Mozer, M. I. Jordan, and T. Petsche, (Eds.),
Cambridge MA: MIT Press, 1997.
-
Mixture models for learning from incomplete data.
Z. Ghahramani and M. I. Jordan.
In Greiner, R., Petsche, T., and Hanson, S. J. (Eds.),
Computational Learning Theory and Natural Learning Systems,
Cambridge, MA: MIT Press, 1997.
-
Active learning with statistical models.
D. Cohn, Z. Ghahramani, and M. I. Jordan.
In Murray-Smith, R., and Johansen, T. A. (Eds.),
Multiple Model Approaches to Modelling and Control,
London: Taylor and Francis, 1997.
-
An objective function for belief net triangulation.
M. Meila and M. I. Jordan.
In D. Madigan and P. Smyth (Eds.),
Proceedings of the 1997 Conference on Artificial Intelligence and Statistics,
Ft. Lauderdale, FL, 1997.
-
Markov mixtures of experts.
M. Meila and M. I. Jordan.
In Murray-Smith, R., and Johansen, T. A. (Eds.),
Multiple Model Approaches to Modelling and Control,
London: Taylor and Francis, 1997.
-
Serial order: A parallel, distributed processing approach.
M. I. Jordan.
In J. W. Donahoe and V. P. Dorsel, (Eds.).
Neural-network Models of Cognition: Biobehavioral Foundations,
Amsterdam: Elsevier Science Press, 1997.
1996
-
Mean field theory for sigmoid belief networks.
L. K. Saul, T. Jaakkola, and M. I. Jordan.
Journal of Artificial Intelligence Research, 4, 61-76, 1996.
-
Generalization to local remappings of the visuomotor coordinate
representation.
Z. Ghahramani, D. M. Wolpert, and M. I. Jordan.
Journal of Neuroscience, 16, 7085-7096, 1996.
-
Active learning with statistical models.
D. Cohn, Z. Ghahramani, and M. I. Jordan.
Journal of Artificial Intelligence Research, 4, 129-145, 1996.
-
On convergence properties of the EM Algorithm for Gaussian mixtures.
L. Xu and M. I. Jordan. Neural Computation, 8, 129-151, 1996.
-
Local linear perceptrons for classification.
E. Alpaydin and M. I. Jordan.
IEEE Transactions on Neural Networks, 7, 788--792, 1996.
-
Computational aspects of motor control and motor learning.
M. I. Jordan.
In H. Heuer and S. Keele (Eds.), Handbook of Perception and Action:
Motor Skills, New York: Academic Press, 1996.
-
Optimal triangulation with continuous cost functions.
M. Meila and M. I. Jordan. In M. C. Mozer, M. I. Jordan,
and T. Petsche (Eds.), Advances in Neural Information
Processing Systems (NIPS) 9, Cambridge MA: MIT Press, 1996.
-
A variational principle for model-based interpolation.
L. K. Saul and M. I. Jordan.
In M. C. Mozer, M. I. Jordan, and T. Petsche
(Eds.), Advances in Neural Information Processing Systems (NIPS) 9, Cambridge MA:
MIT Press, 1996.
-
Recursive algorithms for approximating probabilities in graphical
models.
T. S. Jaakkola and M. I. Jordan.
In M. C. Mozer, M. I. Jordan, and T. Petsche
(Eds.), Advances in Neural Information Processing Systems (NIPS) 9, Cambridge MA:
MIT Press, 1996.
-
Hidden Markov decision trees.
M. I. Jordan, Z. Ghahramani,
and L. K. Saul. In M. C. Mozer, M. I. Jordan, and T. Petsche (Eds.),
Advances in Neural Information Processing Systems (NIPS) 9, Cambridge MA: MIT Press,
1996.
-
Computing upper and lower bounds on likelihoods in intractable
networks.
T. S. Jaakkola and M. I. Jordan.
In E. Horvitz (Ed.), Uncertainty in Artificial Intelligence (UAI),
Proceedings of the Twelfth Conference,
Portland, Oregon, 1996.
1995
-
An internal forward model for sensorimotor integration.
D. M. Wolpert, Z. Ghahramani, and M. I. Jordan.
Science, 269, 1880--1882, 1995.
-
Are arm trajectories planned in kinematic or dynamic coordinates?
An adaptation study.
D. M. Wolpert, Z. Ghahramani, and M. I. Jordan.
Experimental Brain Research, 103, 460-470, 1995.
-
Convergence results for the EM approach to mixtures of experts
architectures.
M. I. Jordan and L. Xu.
Neural Networks, 8, 1409-1431, 1995.
-
The organization of action sequences: Evidence from a relearning task.
M. I. Jordan.
Journal of Motor Behavior, 27, 179--192, 1995.
-
Adaptation in speech production to transformed auditory feedback.
J. Houde and M. I. Jordan.
Journal of the Acoustical Society of America, 97, 3243, 1995.
-
Fast learning by bounding likelihoods in sigmoid belief networks.
T. S. Jaakkola, L. K. Saul, and M. I. Jordan.
In D. S. Touretzky, M. C. Mozer, and M. E. Hasselmo (Eds.),
Advances in Neural Information Processing Systems (NIPS) 8,
Cambridge MA: MIT Press, 1995.
-
Reinforcement learning by probability matching.
P. N. Sabes and M. I. Jordan.
In D. S. Touretzky, M. C. Mozer, and M. E. Hasselmo (Eds.),
Advances in Neural Information Processing Systems (NIPS) 8,
Cambridge MA: MIT Press, 1995.
-
Exploiting tractable substructures in intractable networks.
L. K. Saul and M. I. Jordan.
In D. Touretzky, M. Mozer, and M. Hasselmo (Eds.), Advances in Neural
Information Processing Systems (NIPS) 8, MIT Press, 1995.
-
Markov mixtures of experts.
M. Meila and M. I. Jordan.
In D. Touretzky, M. Mozer, and M. Hasselmo (Eds.), Advances in Neural
Information Processing Systems (NIPS) 8, MIT Press, 1995.
-
Factorial hidden Markov models.
Z. Ghahramani and M. I. Jordan.
In D. Touretzky, M. Mozer, and M. Hasselmo (Eds.), Advances in Neural
Information Processing Systems (NIPS) 8, MIT Press, 1995.
-
Learning in modular and hierarchical systems.
M. I. Jordan and R. A. Jacobs. In M. Arbib (Ed.),
The Handbook of Brain Theory and Neural Networks,
Cambridge, MA: MIT Press, 1995.
-
Why the logistic function? A tutorial discussion on probabilities
and neural networks.
M. I. Jordan.
MIT Computational Cognitive Science Report 9503, August 1995.
-
The moving basin: Effective action-search in adaptive control.
W. Fun and M. I. Jordan.
Proceedings of the World Conference on Neural Networks,
Washington, DC, 1995.
-
Goal-based speech motor control: A theoretical framework
and some preliminary data.
J. S. Perkell, M. L. Matthies, M. A. Svirsky, and M. I. Jordan.
In D. A. Robin, K. M. Yorkston, and D. R. Beukelman (Eds.),
Disorders of Motor Speech: Assessment, Treatment, and Clinical Characterization,
Baltimore, MD: Brookes Publishing Co, 1993.
1994
-
Hierarchical mixtures of experts and the EM algorithm.
M. I. Jordan and R. A. Jacobs. Neural Computation, 6, 181-214, 1994.
-
Learning in Boltzmann trees.
L. K. Saul and M. I. Jordan.
Neural Computation, 6, 1173-1183, 1994.
-
Perceptual distortion contributes to the curvature of human
reaching movements.
D. M. Wolpert, Z. Ghahramani, and M. I. Jordan.
Experimental Brain Research, 98, 153-156, 1994.
-
On the convergence of stochastic iterative dynamic programming algorithms.
T. Jaakkola, M. I. Jordan and S. Singh.
Neural Computation, 6, 1183--1190, 1994.
-
A model of the learning of arm trajectories from spatial targets.
M. I. Jordan, T. Flash, and Y. Arnon.
Journal of Cognitive Neuroscience, 6, 359--376, 1994.
-
Learning without state estimation in partially observable Markovian decision
processes.
S. P. Singh, T. S. Jaakkola, and M. I. Jordan.
Machine Learning: Proceedings of the Eleventh International Conference,
San Mateo, CA: Morgan Kaufmann, 284--292, 1994.
-
A statistical approach to decision tree modeling.
M. I. Jordan. In M. Warmuth (Ed.), Proceedings of the Seventh
Annual ACM Conference on Computational Learning Theory,
New York: ACM Press, 1994.
-
Learning from incomplete data.
Z. Ghahramani and M. I. Jordan.
MIT Center for Biological and Computational Learning Technical Report 108, 1994.
-
Theoretical and experimental studies of convergence properties of
EM algorithm based on finite Gaussian mixtures.
L. Xu and M. I. Jordan.
Proceedings of the 1994 International Symposium on Artificial Neural Networks,
Tainan, Taiwan, pp. 380--385, 1994.
-
Boltzmann chains and hidden Markov models.
L. K. Saul and M. I. Jordan. In G. Tesauro, D. S. Touretzky and
T. K. Leen, (Eds.), Advances in Neural Information Processing Systems (NIPS) 7,
MIT Press, 1994.
-
Reinforcement learning algorithm for partially observable Markov
decision problems.
T. S. Jaakkola, S. P. Singh, and M. I. Jordan.
In G. Tesauro, D. S. Touretzky and T. K. Leen, (Eds.),
Advances in Neural Information Processing Systems (NIPS) 7,
Cambridge, MA: MIT Press, 1994.
-
Reinforcement learning with soft state aggregation.
S. P. Singh, T. S. Jaakkola, and M. I. Jordan.
In G. Tesauro, D. S. Touretzky and T. K. Leen, (Eds.),
Advances in Neural Information Processing Systems (NIPS) 7,
Cambridge, MA: MIT Press, 1994.
-
Computational structure of coordinate transformations: A generalization study.
Z. Ghahramani, D. M. Wolpert, and M. I. Jordan.
In G. Tesauro, D. S. Touretzky and T. K. Leen, (Eds.),
Advances in Neural Information Processing Systems (NIPS) 7,
Cambridge, MA: MIT Press, 1994.
-
Neural forward dynamic models in human motor control: Psychophysical evidence.
D. M. Wolpert, Z. Ghahramani, and M. I. Jordan.
In G. Tesauro, D. S. Touretzky and T. K. Leen, (Eds.),
Advances in Neural Information Processing Systems (NIPS) 7,
Cambridge, MA: MIT Press, 1994.
-
An alternative model for mixtures of experts.
L. Xu, M. I. Jordan, and G. E. Hinton.
In G. Tesauro, D. S. Touretzky and T. K. Leen, (Eds.),
Advances in Neural Information Processing Systems (NIPS) 7,
Cambridge, MA: MIT Press, 1994.
-
Active learning with statistical models.
D. Cohn, Z. Ghahramani, and M. I. Jordan.
In G. Tesauro, D. S. Touretzky and T. K. Leen, (Eds.),
Advances in Neural Information Processing Systems (NIPS) 7,
Cambridge, MA: MIT Press, 1994.
pre-1994
-
Forward models: Supervised learning with a distal teacher.
M. I. Jordan and D. E. Rumelhart. Cognitive Science, 16, 307-354, 1992.
-
Adaptive mixtures of local experts.
R. A. Jacobs, M. I. Jordan, S. Nowlan, and G. E. Hinton.
Neural Computation, 3, 1-12, 1991.
-
Learning piecewise control strategies in a modular neural network architecture.
R. A. Jacobs and M. I. Jordan.
IEEE Transactions on Systems, Man, and Cybernetics, 23,
337--345, 1993.
-
Trading relations between tongue-body raising and lip rounding in
production of the vowel /u/: A pilot motor equivalence study.
J. S. Perkell, M. L. Matthies, M. A. Svirsky, and M. I. Jordan.
Journal of the Acoustical Society of America, 93, 2948--2961, 1993.
-
Supervised learning and divide-and-conquer: A statistical approach.
M. I. Jordan and R. A. Jacobs.
In P. E. Utgoff, (Ed.), Machine Learning: Proceedings of
the Tenth International Workshop, San Mateo, CA: Morgan Kaufmann, 1993.
-
Supervised learning from incomplete data via the EM approach.
Z. Ghahramani and M. I. Jordan.
In J. D. Cowan, G. Tesauro, and J. Alspector, (Eds.),
Advances in Neural Information Processing Systems 6,
San Mateo, CA: Morgan Kaufmann, 1993.
-
Convergence of stochastic iterative dynamic programming algorithms.
T. Jaakkola, M. I. Jordan, and S. Singh.
In J. D. Cowan, G. Tesauro, and J. Alspector, (Eds.),
Advances in Neural Information Processing Systems 6,
San Mateo, CA: Morgan Kaufmann, 1993.
-
A dynamical model of priming and repetition blindness.
D. Bavelier and M. I. Jordan.
In S. J. Hanson, J. D. Cowan, and C. L. Giles, (Eds.),
Advances in Neural Information Processing Systems (NIPS) 5,
San Mateo, CA: Morgan Kaufmann, 1992.
-
EM learning of a generalized finite mixture model for combining
multiple classifiers.
L. Xu and M. I. Jordan.
Proceedings of the World Conference on Neural Networks,
Portland, OR, pp. 431--434, 1993.
-
The cascade neural network model and a speed-accuracy tradeoff of arm movement.
M. Hirayama, M. Kawato, and M. I. Jordan.
Journal of Motor Behavior, 25, 162--175, 1993.
-
Constrained supervised learning.
M. I. Jordan.
Journal of Mathematical Psychology, 36, 396--425, 1992.
-
Computational consequences of a bias towards short connections.
R. A. Jacobs and M. I. Jordan.
Journal of Cognitive Neuroscience, 4, 331--344, 1992.
-
Hierarchies of adaptive experts.
M. I. Jordan and R. A. Jacobs.
In J. Moody, S. Hanson, and R. Lippmann (Eds.),
Advances in Neural Information Processing Systems (NIPS) 4,
San Mateo, CA: Morgan Kaufmann, 1991.
-
Forward dynamics modeling of speech motor control using
physiological data.
M. Hirayama, E. Vatikiotis-Bateson, M. Kawato, and M. I. Jordan.
In J. Moody, S. Hanson, and R. Lippmann (Eds.),
Advances in Neural Information Processing Systems (NIPS) 4,
San Mateo, CA: Morgan Kaufmann, 1991.
-
Supervised learning and excess degrees of freedom.
M. I. Jordan.
In P. Mehra, and B. Wah, (Eds.),
Artificial Neural Networks: Concepts and Theory,
Los Alamitos, CA: IEEE Computer Society Press, 1992.
-
Optimal control: A foundation for intelligent control.
D. A. White and M. I. Jordan.
In D. A. White, and D. A. Sofge (Eds.), Handbook of Intelligent Control,
Amsterdam: Van Nostrand, 1992.
-
Constraints on underspecified target trajectories.
M. I. Jordan.
In P. Dario, G. Sandini, and P. Aebischer, (Eds.),
Robots and Biological Systems: Toward a New Bionics,
Heidelberg: Springer-Verlag, 1992.
-
A more biologically plausible learning rule for neural networks.
P. Mazzoni, R. Andersen, and M. I. Jordan.
Proceedings of the National Academy of Sciences, 88,
4433--4437, 1991.
-
Task decomposition through competition in a modular connectionist
architecture: The what and where vision tasks.
R. A. Jacobs, M. I. Jordan, and A. G. Barto.
Cognitive Science, 15, 219--250, 1991.
-
Internal world models and supervised learning.
M. I. Jordan and D. E. Rumelhart.
In L. Birnbaum and G. Collins, (Eds.),
Machine Learning: Proceedings of the Eighth International
Workshop, San Mateo, CA: Morgan Kaufmann, pp. 70--75, 1991.
-
Speech motor control model using electromyography.
M. Hirayama, E. Vatikiotis-Bateson, M. Kawato, and M. I. Jordan.
INCN Conference on Speech Communications, 39--46, 1991.
-
A modular connectionist architecture for learning piecewise control strategies.
R. A. Jacobs and M. I. Jordan.
Proceedings of the 1991 American Control Conference,
Boston, MA, pp. 343--351, 1991.
[Best Paper Award].
-
A more biologically plausible learning rule than backpropagation applied
to a network model of cortical area 7a.
P. Mazzoni, R. Andersen, and M. I. Jordan.
Cerebral Cortex, 1, 293--307, 1991.
-
Modularity, supervised learning, and unsupervised learning.
M. I. Jordan and R. A. Jacobs.
In S. Davis (Ed.), Connectionism: Theory and practice,
Oxford: Oxford University Press, 1991.
-
A non-empiricist perspective on learning in layered networks.
M. I. Jordan.
Behavioral and Brain Sciences, 13, 497--498, 1990.
-
Simulation of vocalic gestures using an
articulatory model driven by a sequential neural network.
G. Bailly, M. I. Jordan, M. Mantakas, J-L. Schwartz, M. Bach,
and O. Olesen.
Journal of the Acoustical Society of America, 87:S105, 1990.
-
A competitive modular connectionist architecture.
M. I. Jordan and R. A. Jacobs.
In R. Lippmann, J. Moody, and D. Touretzky (Eds.),
Advances in Neural Information Processing Systems (NIPS) 3,
San Mateo, CA: Morgan Kaufmann, pp. 324--331, 1990.
-
AR-P learning applied to a network model of cortical area 7a.
P. Mazzoni, R. Andersen, and M. I. Jordan.
Proceedings of the International Joint Conference On Neural Networks,
San Diego, CA, pp. 373--379, 1990.
-
Motor learning and the degrees of freedom problem.
M. I. Jordan.
Attention and Performance, XIII, 796--836, 1990.
-
Learning inverse mappings with forward models.
M. I. Jordan.
In K. S. Narendra (Ed.), Proceedings of the Sixth Yale Workshop
on Adaptive and Learning Systems, New York: Plenum Press, 1990.
-
Action.
M. I. Jordan and D. A. Rosenbaum.
In M. I. Posner (Ed.), Foundations of Cognitive Science,
Cambridge, MA: MIT Press, 1989.
-
Learning to control an unstable system with forward modeling.
M. I. Jordan and R. A. Jacobs.
In D. Touretzky (Ed.),
Advances in Neural Information Processing Systems (NIPS) 2,
San Mateo, CA: Morgan Kaufmann, pp. 324--331, 1989.
-
Gradient following without backpropagation in layered networks.
A. G. Barto and M. I. Jordan.
Proceedings of the IEEE First Annual International Conference on
Neural Networks,
New York: IEEE Publishing Services, 1987.
-
An introduction to linear algebra in parallel distributed processing.
M. I. Jordan.
In D. E. Rumelhart and J. L. McClelland, (Eds.),
Parallel Distributed Processing: Explorations in the Microstructure of Cognition,
Cambridge, MA: MIT Press, 1986.
-
Attractor dynamics and parallelism in a connectionist sequential machine.
M. I. Jordan.
Proceedings of the Eighth Annual Conference of the Cognitive Science Society,
Englewood Cliffs, NJ: Erlbaum, pp. 531--546. [Reprinted in IEEE Tutorials
Series, New York: IEEE Publishing Services, 1990], 1986.