From local SGD to local fixed-point methods for federated learning. G. Malinovskiy, D. Kovalev, E. Gasanov, L. Condat, P. Richtárik. International Conference on Machine Learning, pp. 6692–6701, 2020. Cited by 113.
Lower bounds and optimal algorithms for smooth and strongly convex decentralized optimization over time-varying networks. D. Kovalev, E. Gasanov, A. Gasnikov, P. Richtárik. Advances in Neural Information Processing Systems 34, pp. 22325–22335, 2021. Cited by 38.
3PC: Three point compressors for communication-efficient distributed training and a better theory for lazy aggregation. P. Richtárik, I. Sokolov, E. Gasanov, I. Fatkhullin, Z. Li, E. Gorbunov. International Conference on Machine Learning, pp. 18596–18648, 2022. Cited by 25.
Stochastic spectral and conjugate descent methods. D. Kovalev, P. Richtárik, E. Gorbunov, E. Gasanov. Advances in Neural Information Processing Systems 31, 2018. Cited by 15.
FLIX: A simple and communication-efficient alternative to local methods in federated learning. E. Gasanov, A. Khaled, S. Horváth, P. Richtárik. arXiv preprint arXiv:2111.11556, 2021. Cited by 14.
Adaptive compression for communication-efficient distributed training. M. Makarenko, E. Gasanov, R. Islamov, A. Sadiev, P. Richtárik. arXiv preprint arXiv:2211.00188, 2022. Cited by 4.
Understanding progressive training through the framework of randomized coordinate descent. R. Szlendak, E. Gasanov, P. Richtárik. International Conference on Artificial Intelligence and Statistics, pp. 2161–2169, 2024. Cited by 1.
Error Feedback Reloaded: From Quadratic to Arithmetic Mean of Smoothness Constants. P. Richtárik, E. Gasanov, K. Burlachenko. arXiv preprint arXiv:2402.10774, 2024.
Error Feedback Shines when Features are Rare. P. Richtárik, E. Gasanov, K. Burlachenko. arXiv preprint arXiv:2305.15264, 2023.
A New Randomized Method for Solving Large Linear Systems. E. Gasanov, V. Elsukov, P. Richtárik.