Jonas Kohler
Sub-sampled cubic regularization for non-convex optimization
JM Kohler, A Lucchi
ICML 2017, 2017
Escaping Saddles with Stochastic Gradients
H Daneshmand, J Kohler, A Lucchi, T Hofmann
ICML 2018, 2018
Exponential convergence rates for Batch Normalization: The power of length-direction decoupling in non-convex optimization
J Kohler, H Daneshmand, A Lucchi, M Zhou, K Neymeyr, T Hofmann
AISTATS 2019, 2019
Batch normalization provably avoids ranks collapse for randomly initialised deep networks
H Daneshmand, J Kohler, F Bach, T Hofmann, A Lucchi
NeurIPS 2020, 2020
Learning Generative Models of Textured 3D Meshes from Real-World Images
D Pavllo, J Kohler, T Hofmann, A Lucchi
ICCV 2021, 2021
This Looks Like That... Does it? Shortcomings of Latent Space Prototype Interpretability in Deep Networks
A Hoffmann, C Fanconi, R Rade, J Kohler
ICML 2021 Workshop on Theoretic Foundation, Criticism, and Application Trend …, 2021
The Role of Memory in Stochastic Optimization
A Orvieto, J Kohler, A Lucchi
UAI 2019, 2019
Adaptive norms for deep learning with regularised Newton methods
J Kohler, L Adolphs, A Lucchi
NeurIPS 2019 Workshop: Beyond First-Order Optimization Methods in Machine …, 2019
Synthesizing Speech from Intracranial Depth Electrodes using an Encoder-Decoder Framework
J Kohler, MC Ottenhoff, S Goulis, M Angrick, AJ Colon, L Wagner, ...
Neurons, Behavior, Data analysis, and Theory (NBDT), 2021
A stochastic tensor method for non-convex optimization
A Lucchi, J Kohler
arXiv preprint arXiv:1911.10367, 2019
Safe Deep Reinforcement Learning for Multi-Agent Systems with Continuous Action Spaces
Z Sheebaelhamd, K Zisis, A Nisioti, D Gkouletsos, D Pavllo, J Kohler
ICML 2021 Workshop on Reinforcement Learning for Real Life Workshop, 2021
Vanishing Curvature and the Power of Adaptive Methods in Randomly Initialized Deep Networks
A Orvieto, J Kohler, D Pavllo, T Hofmann, A Lucchi
AISTATS 2022, 2022
Two-Level K-FAC Preconditioning for Deep Learning
N Tselepidis, J Kohler, A Orvieto
NeurIPS 2020 Workshop on Optimization for Machine Learning (OPT2020), 2020
Vanishing Curvature in Randomly Initialized Deep ReLU Networks
A Orvieto, J Kohler, D Pavllo, T Hofmann, A Lucchi
AISTATS, 7942-7975, 2022
A sub-sampled tensor method for nonconvex optimization
A Lucchi, J Kohler
IMA Journal of Numerical Analysis 43 (5), 2856-2891, 2023
Adaptive Guidance: Training-free Acceleration of Conditional Diffusion Models
A Castillo, J Kohler, JC Pérez, JP Pérez, A Pumarola, B Ghanem, ...
arXiv preprint arXiv:2312.12487, 2023
fMPI: Fast Novel View Synthesis in the Wild with Layered Scene Representations
J Kohler, NG Sanchez, L Cavalli, C Herold, A Pumarola, AG Garcia, ...
arXiv preprint arXiv:2312.16109, 2023
Cache Me if You Can: Accelerating Diffusion Models through Block Caching
F Wimbauer, B Wu, E Schoenfeld, X Dai, J Hou, Z He, A Sanakoyeu, ...
arXiv preprint arXiv:2312.03209, 2023
Insights on the interplay of network architectures and optimization algorithms in deep learning
J Kohler
ETH Zurich, 2022