Frederic Koehler
Title · Cited by · Year
Information theoretic properties of Markov random fields, and their algorithmic applications
L Hamilton, F Koehler, A Moitra
Advances in Neural Information Processing Systems 30, 2017
Cited by 75, 2017
Uniform Convergence of Interpolators: Gaussian Width, Norm Bounds and Benign Overfitting
F Koehler, L Zhou, D Sutherland, N Srebro
Advances in Neural Information Processing Systems 34, 20657-20668, 2021
Cited by 66, 2021
A spectral condition for spectral gap: fast mixing in high-temperature Ising models
R Eldan, F Koehler, O Zeitouni
Probability Theory and Related Fields 182 (3), 1035-1051, 2022
Cited by 61, 2022
Statistical efficiency of score matching: The view from isoperimetry
F Koehler, A Heckett, A Risteski
arXiv preprint arXiv:2210.00726, 2022
Cited by 49, 2022
Entropic independence I: Modified log-Sobolev inequalities for fractionally log-concave distributions and high-temperature Ising models
N Anari, V Jain, F Koehler, HT Pham, TD Vuong
arXiv preprint arXiv:2106.04105, 2021
Cited by 49*, 2021
Mean-field approximation, convex hierarchies, and the optimality of correlation rounding: a unified perspective
V Jain, F Koehler, A Risteski
Proceedings of the 51st Annual ACM SIGACT Symposium on Theory of Computing …, 2019
Cited by 41, 2019
Provable algorithms for inference in topic models
S Arora, R Ge, F Koehler, T Ma, A Moitra
International Conference on Machine Learning, 2859-2867, 2016
Cited by 36, 2016
Online and distribution-free robustness: Regression and contextual bandits with Huber contamination
S Chen, F Koehler, A Moitra, M Yau
2021 IEEE 62nd Annual Symposium on Foundations of Computer Science (FOCS …, 2022
Cited by 35, 2022
Learning restricted Boltzmann machines via influence maximization
G Bresler, F Koehler, A Moitra
Proceedings of the 51st Annual ACM SIGACT Symposium on Theory of Computing …, 2019
Cited by 34, 2019
Representational aspects of depth and conditioning in normalizing flows
F Koehler, V Mehta, A Risteski
International Conference on Machine Learning, 5628-5636, 2021
Cited by 33, 2021
Classification Under Misspecification: Halfspaces, Generalized Linear Models, and Connections to Evolvability
S Chen, F Koehler, A Moitra, M Yau
Advances in Neural Information Processing Systems 33, 2020
Cited by 33, 2020
Entropic independence: optimal mixing of down-up random walks
N Anari, V Jain, F Koehler, HT Pham, TD Vuong
Proceedings of the 54th Annual ACM SIGACT Symposium on Theory of Computing …, 2022
Cited by 32, 2022
The mean-field approximation: Information inequalities, algorithms, and complexity
V Jain, F Koehler, E Mossel
Conference On Learning Theory, 1326-1347, 2018
Cited by 32, 2018
Optimal batch schedules for parallel machines
F Koehler, S Khuller
Algorithms and Data Structures: 13th International Symposium, WADS 2013 …, 2013
Cited by 29, 2013
Learning some popular Gaussian graphical models without condition number bounds
J Kelner, F Koehler, R Meka, A Moitra
Advances in Neural Information Processing Systems 33, 10986-10998, 2020
Cited by 27, 2020
The comparative power of ReLU networks and polynomial kernels in the presence of sparse latent structure
F Koehler, A Risteski
International Conference on Learning Representations, 2019
Cited by 26*, 2019
Optimistic rates: A unifying theory for interpolation learning and regularization in linear regression
L Zhou, F Koehler, DJ Sutherland, N Srebro
ACM/IMS Journal of Data Science 1 (2), 1-51, 2024
Cited by 24, 2024
Entropic independence II: Optimal sampling and concentration via restricted modified log-Sobolev inequalities
N Anari, V Jain, F Koehler, HT Pham, TD Vuong
arXiv preprint arXiv:2111.03247, 2021
Cited by 22, 2021
On the power of preconditioning in sparse linear regression
JA Kelner, F Koehler, R Meka, D Rohatgi
2021 IEEE 62nd Annual Symposium on Foundations of Computer Science (FOCS …, 2022
Cited by 21, 2022
A non-asymptotic Moreau envelope theory for high-dimensional generalized linear models
L Zhou, F Koehler, P Sur, DJ Sutherland, N Srebro
Advances in Neural Information Processing Systems 35, 21286-21299, 2022
Cited by 20, 2022