Patrick Rebeschini
Verified email at stats.ox.ac.uk
Title
Cited by
Year
Can local particle filters beat the curse of dimensionality?
P Rebeschini, R van Handel
The Annals of Applied Probability 25 (5), 2809-2866, 2015
176 · 2015
Fast mixing for discrete point processes
P Rebeschini, A Karbasi
Conference on Learning Theory, 1480-1500, 2015
25 · 2015
Decentralized Cooperative Stochastic Multi-armed Bandits
D Martínez-Rubio, V Kanade, P Rebeschini
NeurIPS 2019, 2018
20* · 2018
Comparison theorems for Gibbs measures
P Rebeschini, R van Handel
Journal of Statistical Physics 157 (2), 234-281, 2014
12 · 2014
Implicit Regularization for Optimal Sparse Recovery
T Vaškevičius, V Kanade, P Rebeschini
NeurIPS 2019, 2019
11* · 2019
Phase transitions in nonlinear filtering
P Rebeschini, R van Handel
Electronic Journal of Probability 20, 2015
6 · 2015
Optimal Statistical Rates for Decentralised Non-Parametric Regression with Linear Speed-Up
D Richards, P Rebeschini
NeurIPS 2019, 2019
5 · 2019
Nonlinear filtering in high dimension
P Rebeschini
Princeton University, NJ, 2014
5 · 2014
Graph-dependent implicit regularisation for distributed stochastic subgradient descent
D Richards
Journal of Machine Learning Research 21 (2020), 2020
4 · 2020
Accelerated consensus via min-sum splitting
P Rebeschini, SC Tatikonda
Advances in Neural Information Processing Systems, 1374-1384, 2017
4 · 2017
Decay of correlation in network flow problems
P Rebeschini, S Tatikonda
2016 Annual Conference on Information Science and Systems (CISS), 169-174, 2016
3 · 2016
Hadamard Wirtinger Flow for Sparse Phase Retrieval
F Wu, P Rebeschini
arXiv preprint arXiv:2006.01065, 2020
2 · 2020
The Statistical Complexity of Early Stopped Mirror Descent
T Vaškevičius, V Kanade, P Rebeschini
arXiv preprint arXiv:2002.00189, 2020
2 · 2020
Locality in network optimization
P Rebeschini, S Tatikonda
IEEE Transactions on Control of Network Systems 6 (2), 487-500, 2018
2 · 2018
A new approach to Laplacian solvers and flow problems
P Rebeschini, S Tatikonda
arXiv preprint arXiv:1611.07138, 2016
2 · 2016
A Continuous-Time Mirror Descent Approach to Sparse Phase Retrieval
F Wu, P Rebeschini
arXiv preprint arXiv:2010.10168, 2020
2020
Decentralised Learning with Random Features and Distributed Gradient Descent
D Richards, P Rebeschini, L Rosasco
arXiv preprint arXiv:2007.00360, 2020
2020
Decentralised learning with distributed gradient descent and random features
D Richards, P Rebeschini, L Rosasco
Proceedings of Machine Learning Research, 2020
2020
Decentralised Sparse Multi-Task Regression
D Richards, SN Negahban, P Rebeschini
arXiv preprint arXiv:1912.01417, 2019
2019
Algorithmic Foundations of Learning
P Rebeschini
Statistics 1, 9, 2018
2018
Articles 1–20