Itay Hubara
Verified email at campus.technion.ac.il

Title · Cited by · Year
Binarized neural networks: Training deep neural networks with weights and activations constrained to +1 or -1
M Courbariaux, I Hubara, D Soudry, R El-Yaniv, Y Bengio
arXiv preprint arXiv:1602.02830, 2016
Cited by 1898 · 2016
Binarized neural networks
I Hubara, M Courbariaux, D Soudry, R El-Yaniv, Y Bengio
Proceedings of the 30th International Conference on Neural Information …, 2016
Cited by 1175 · 2016
Quantized neural networks: Training neural networks with low precision weights and activations
I Hubara, M Courbariaux, D Soudry, R El-Yaniv, Y Bengio
The Journal of Machine Learning Research 18 (1), 6869-6898, 2017
Cited by 1117 · 2017
Train longer, generalize better: closing the generalization gap in large batch training of neural networks
E Hoffer, I Hubara, D Soudry
arXiv preprint arXiv:1705.08741, 2017
Cited by 418 · 2017
Expectation backpropagation: Parameter-free training of multilayer neural networks with continuous or discrete weights.
D Soudry, I Hubara, R Meir
NIPS 1, 2, 2014
Cited by 209 · 2014
Scalable methods for 8-bit training of neural networks
R Banner, I Hubara, E Hoffer, D Soudry
arXiv preprint arXiv:1805.11046, 2018
Cited by 116 · 2018
MLPerf inference benchmark
VJ Reddi, C Cheng, D Kanter, P Mattson, G Schmuelling, CJ Wu, ...
2020 ACM/IEEE 47th Annual International Symposium on Computer Architecture …, 2020
Cited by 88 · 2020
Binarized neural networks: training deep neural networks with weights and activations constrained to +1 or -1. arXiv
M Courbariaux, I Hubara, D Soudry, R El-Yaniv, Y Bengio
arXiv preprint arXiv:1602.02830, 1-11, 2016
Cited by 50 · 2016
Fix your classifier: the marginal value of training the last weight layer
E Hoffer, I Hubara, D Soudry
arXiv preprint arXiv:1801.04540, 2018
Cited by 46 · 2018
The knowledge within: Methods for data-free model compression
M Haroush, I Hubara, E Hoffer, D Soudry
Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern …, 2020
Cited by 26 · 2020
Augment your batch: better training with larger batches
E Hoffer, T Ben-Nun, I Hubara, N Giladi, T Hoefler, D Soudry
arXiv preprint arXiv:1901.09335, 2019
Cited by 25 · 2019
Deep unsupervised learning through spatial contrasting
E Hoffer, I Hubara, N Ailon
arXiv preprint arXiv:1610.00243, 2016
Cited by 21 · 2016
Playing SNES in the retro learning environment
N Bhonker, S Rozenberg, I Hubara
arXiv preprint arXiv:1611.02205, 2016
Cited by 19 · 2016
Augment Your Batch: Improving Generalization Through Instance Repetition
E Hoffer, T Ben-Nun, I Hubara, N Giladi, T Hoefler, D Soudry
Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern …, 2020
Cited by 15 · 2020
Quantized neural network training and inference
R El-Yaniv, I Hubara, D Soudry
US Patent 10,831,444, 2020
Cited by 8 · 2020
Improving post training neural quantization: Layer-wise calibration and integer programming
I Hubara, Y Nahshan, Y Hanani, R Banner, D Soudry
arXiv preprint arXiv:2006.10518, 2020
Cited by 6 · 2020
Mix & match: training convnets with mixed image sizes for improved accuracy, speed and scale resiliency
E Hoffer, B Weinstein, I Hubara, T Ben-Nun, T Hoefler, D Soudry
arXiv preprint arXiv:1908.08986, 2019
Cited by 5 · 2019
Quantized back-propagation: Training binarized neural networks with quantized gradients
I Hubara, E Hoffer, D Soudry
Cited by 4 · 2018
Spatial contrasting for deep unsupervised learning
E Hoffer, I Hubara, N Ailon
arXiv preprint arXiv:1611.06996, 2016
Cited by 2 · 2016
Accelerated Sparse Neural Training: A Provable and Efficient Method to Find N:M Transposable Masks
I Hubara, B Chmiel, M Island, R Banner, S Naor, D Soudry
arXiv preprint arXiv:2102.08124, 2021
Cited by 1 · 2021
Articles 1–20