Dominic Richards
Verified email at spc.ox.ac.uk - Homepage
Title | Cited by | Year
Asymptotics of ridge(less) regression under general source condition
D Richards, J Mourtada, L Rosasco
International Conference on Artificial Intelligence and Statistics, 3889-3897, 2021
Cited by: 72 | Year: 2021
Graph-Dependent Implicit Regularisation for Distributed Stochastic Subgradient Descent
D Richards, P Rebeschini
Journal of Machine Learning Research 21 (34), 1-44, 2020
Cited by: 24 | Year: 2020
Decentralised learning with distributed gradient descent and random features
D Richards, P Rebeschini, L Rosasco
Proceedings of Machine Learning Research, 2020
Cited by: 21* | Year: 2020
Optimal Statistical Rates for Decentralised Non-Parametric Regression with Linear Speed-Up
D Richards, P Rebeschini
NeurIPS 2019, 2019
Cited by: 18 | Year: 2019
Stability & Generalisation of Gradient Descent for Shallow Neural Networks without the Neural Tangent Kernel
D Richards, I Kuzborskij
Advances in Neural Information Processing Systems 34, 2021
Cited by: 16 | Year: 2021
Distributed Machine Learning with Sparse Heterogeneous Data
D Richards, S Negahban, P Rebeschini
Advances in Neural Information Processing Systems 34, 2021
Cited by: 10* | Year: 2021
Learning with Gradient Descent and Weakly Convex Losses
D Richards, M Rabbat
International Conference on Artificial Intelligence and Statistics, 1990-1998, 2021
Cited by: 9 | Year: 2021
Comparing Classes of Estimators: When does Gradient Descent Beat Ridge Regression in Linear Models?
D Richards, E Dobriban, P Rebeschini
arXiv preprint arXiv:2108.11872, 2021
Cited by: 1 | Year: 2021