Soufiane Hayou
Department of Statistics, University of Oxford
Verified email at stats.ox.ac.uk
Title · Cited by · Year
On the impact of the activation function on deep neural networks training
S Hayou, A Doucet, J Rousseau
36th International Conference on Machine Learning (ICML 2019), 2019
Cited by 111 · 2019
On the selection of initialization and activation function for deep neural networks
S Hayou, A Doucet, J Rousseau
arXiv preprint arXiv:1805.08266, 2018
Cited by 66 · 2018
Mean-field Behaviour of Neural Tangent Kernel for Deep Neural Networks
S Hayou, A Doucet, J Rousseau
arXiv preprint arXiv:1905.13654, 2019
Cited by 16 · 2019
Robust Pruning at Initialization
S Hayou, JF Ton, A Doucet, YW Teh
International Conference on Learning Representations (ICLR 2021), 2021
Cited by 12 · 2021
Stable ResNet
S Hayou, E Clerico, B He, G Deligiannidis, A Doucet, J Rousseau
24th International Conference on Artificial Intelligence and Statistics …, 2021
Cited by 11 · 2021
Pruning untrained neural networks: Principles and analysis
S Hayou, JF Ton, A Doucet, YW Teh
arXiv e-prints, arXiv: 2002.08797, 2020
Cited by 9 · 2020
Training dynamics of deep networks using stochastic gradient descent via neural tangent kernel
S Hayou, A Doucet, J Rousseau
arXiv preprint arXiv:1905.13654, 2019
Cited by 8 · 2019
On the selection of initialization and activation function for deep neural networks
S Hayou, A Doucet, J Rousseau
arXiv preprint arXiv:1805.08266, 2018
Cited by 2 · 2018
Connecting Optimization and Generalization via Gradient Flow Path Length
F Liu, H Yang, S Hayou, Q Li
arXiv preprint arXiv:2202.10670, 2022
Cited by 1 · 2022
Probabilistic fine-tuning of pruning masks and PAC-Bayes self-bounded learning
S Hayou, B He, GK Dziugaite
arXiv preprint arXiv:2110.11804, 2021
Cited by 1 · 2021
Regularization in ResNet with Stochastic Depth
S Hayou, F Ayed
NeurIPS 2021, arXiv:2106.03091, 2021
Cited by 1 · 2021
The Curse of Depth in Kernel Regime
S Hayou, A Doucet, J Rousseau
I (Still) Can't Believe It's Not Better! Workshop at NeurIPS 2021, 41-47, 2022
2022
The Equilibrium Hypothesis: Rethinking implicit regularization in Deep Neural Networks
Y Lou, C Mingard, S Hayou
arXiv preprint arXiv:2110.11749, 2021
2021
Stochastic Pruning: Fine-Tuning, and PAC-Bayes bound optimization
S Hayou, B He, GK Dziugaite
NeurIPS 2021, Bayesian Deep Learning workshop 1 (2), 2, 2021
2021
Wide deep neural networks
S Hayou
University of Oxford, 2021
2021
Exact Convergence Rates of the Neural Tangent Kernel in the Large Depth Limit
S Hayou, A Doucet, J Rousseau
arXiv preprint arXiv:1905.13654, 2019
2019
On the overestimation of the largest eigenvalue of a covariance matrix
S Hayou
arXiv preprint arXiv:1708.03551, 2017
2017
Cleaning the correlation matrix with a denoising autoencoder
S Hayou
arXiv preprint arXiv:1708.02985, 2017
2017
On the Impact of the Activation Function on Deep Neural Networks Training: Supplementary material
S Hayou, A Doucet, J Rousseau