Samuel R. Bowman
Title · Cited by · Year
A large annotated corpus for learning natural language inference
SR Bowman, G Angeli, C Potts, CD Manning
Proceedings of EMNLP, 2015
Cited by: 2342
GLUE: A Multi-Task Benchmark and Analysis Platform for Natural Language Understanding
A Wang, A Singh, J Michael, F Hill, O Levy, SR Bowman
Proceedings of ICLR, 2019
Cited by: 2057
Generating sentences from a continuous space
SR Bowman, L Vilnis, O Vinyals, AM Dai, R Jozefowicz, S Bengio
Proceedings of CoNLL, 2016
Cited by: 1691
A Broad-Coverage Challenge Corpus for Sentence Understanding through Inference
A Williams, N Nangia, SR Bowman
Proceedings of NAACL-HLT, 2018
Cited by: 1521
SuperGLUE: A Stickier Benchmark for General-Purpose Language Understanding Systems
A Wang, Y Pruksachatkun, N Nangia, A Singh, J Michael, F Hill, O Levy, ...
Proceedings of NeurIPS, 2019
Cited by: 531
Annotation artifacts in natural language inference data
S Gururangan, S Swayamdipta, O Levy, R Schwartz, SR Bowman, ...
Proceedings of NAACL, 2018
Cited by: 524
XNLI: Evaluating Cross-lingual Sentence Representations
A Conneau, G Lample, R Rinott, A Williams, SR Bowman, H Schwenk, ...
Proceedings of EMNLP, 2018
Cited by: 455
Neural network acceptability judgments
A Warstadt, A Singh, SR Bowman
TACL 7, 625-641, 2019
Cited by: 369
What do you learn from context? Probing for sentence structure in contextualized word representations
I Tenney, P Xia, B Chen, A Wang, A Poliak, RT McCoy, N Kim, ...
Proceedings of ICLR, 2019
Cited by: 363
A Fast Unified Model for Parsing and Sentence Understanding
SR Bowman, J Gauthier, A Rastogi, R Gupta, CD Manning, C Potts
Proceedings of ACL, 2016
Cited by: 314
Universal Dependencies 2.2
J Nivre, M Abrams, Ž Agić, L Ahrenberg, L Antonsen, MJ Aranzabe, ...
Cited by: 258* · 2018
A Gold Standard Dependency Corpus for English
N Silveira, T Dozat, MC de Marneffe, SR Bowman, M Connor, J Bauer, ...
Proceedings of LREC, 2014
Cited by: 191
Sentence encoders on STILTs: Supplementary training on intermediate labeled-data tasks
J Phang, T Févry, SR Bowman
arXiv preprint 1811.01088, 2018
Cited by: 187
Recursive Neural Networks Can Learn Logical Semantics
SR Bowman, C Potts, CD Manning
Proceedings of the 3rd Workshop on Continuous Vector Space Models and their …, 2015
Cited by: 169*
On Measuring Social Biases in Sentence Encoders
C May, A Wang, S Bordia, SR Bowman, R Rudinger
Proceedings of NAACL-HLT, 2019
Cited by: 152
Do latent tree learning models identify meaningful structure in sentences?
A Williams, A Drozdov, SR Bowman
Transactions of the Association for Computational Linguistics 6, 253-267, 2018
Cited by: 119*
Language Modeling Teaches You More than Translation Does: Lessons Learned Through Auxiliary Task Analysis
K Zhang, S Bowman
Proceedings of the 2018 EMNLP Workshop BlackboxNLP: Analyzing and …, 2018
Cited by: 80*
Discourse-based objectives for fast unsupervised sentence representation learning
Y Jernite, SR Bowman, D Sontag
arXiv preprint 1705.00557, 2017
Cited by: 74
Can You Tell Me How to Get Past Sesame Street? Sentence-Level Pretraining Beyond Language Modeling
A Wang, J Hula, P Xia, R Pappagari, RT McCoy, R Patel, N Kim, I Tenney, ...
Proceedings of ACL, 4465-4476, 2019
Cited by: 72*
Intermediate-Task Transfer Learning with Pretrained Models for Natural Language Understanding: When and Why Does It Work?
Y Pruksachatkun, J Phang, H Liu, PM Htut, X Zhang, RY Pang, C Vania, ...
Proceedings of ACL, 2020
Cited by: 70