Elnur Gasanov
Ph.D. student in Optimization and Machine Learning, King Abdullah University of Science and Technology
Verified email at kaust.edu.sa - Homepage
Title · Cited by · Year
From local SGD to local fixed-point methods for federated learning
G Malinovskiy, D Kovalev, E Gasanov, L Condat, P Richtarik
International Conference on Machine Learning, 6692-6701, 2020
Cited by 112 · 2020
Lower bounds and optimal algorithms for smooth and strongly convex decentralized optimization over time-varying networks
D Kovalev, E Gasanov, A Gasnikov, P Richtarik
Advances in Neural Information Processing Systems 34, 22325-22335, 2021
Cited by 38 · 2021
3PC: Three point compressors for communication-efficient distributed training and a better theory for lazy aggregation
P Richtárik, I Sokolov, E Gasanov, I Fatkhullin, Z Li, E Gorbunov
International Conference on Machine Learning, 18596-18648, 2022
Cited by 25 · 2022
Stochastic spectral and conjugate descent methods
D Kovalev, P Richtarik, E Gorbunov, E Gasanov
Advances in Neural Information Processing Systems 31, 2018
Cited by 15 · 2018
FLIX: A simple and communication-efficient alternative to local methods in federated learning
E Gasanov, A Khaled, S Horváth, P Richtárik
arXiv preprint arXiv:2111.11556, 2021
Cited by 14 · 2021
Adaptive compression for communication-efficient distributed training
M Makarenko, E Gasanov, R Islamov, A Sadiev, P Richtárik
arXiv preprint arXiv:2211.00188, 2022
Cited by 3 · 2022
Understanding progressive training through the framework of randomized coordinate descent
R Szlendak, E Gasanov, P Richtárik
arXiv preprint arXiv:2306.03626, 2023
Cited by 1 · 2023
Error Feedback Reloaded: From Quadratic to Arithmetic Mean of Smoothness Constants
P Richtárik, E Gasanov, K Burlachenko
arXiv preprint arXiv:2402.10774, 2024
2024
Error Feedback Shines when Features are Rare
P Richtárik, E Gasanov, K Burlachenko
arXiv preprint arXiv:2305.15264, 2023
2023
A New Randomized Method for Solving Large Linear Systems
E Gasanov, V Elsukov, P Richtárik