Marjan Ghazvininejad
Research Scientist, FAIR (Facebook AI Research)
Verified email at fb.com - Homepage
Title · Cited by · Year
BART: Denoising sequence-to-sequence pre-training for natural language generation, translation, and comprehension
M Lewis, Y Liu, N Goyal, M Ghazvininejad, A Mohamed, O Levy, ...
arXiv preprint arXiv:1910.13461, 2019
8806 · 2019
Multilingual denoising pre-training for neural machine translation
Y Liu, J Gu, N Goyal, X Li, S Edunov, M Ghazvininejad, M Lewis, ...
Transactions of the Association for Computational Linguistics 8, 726-742, 2020
1472 · 2020
A knowledge-grounded neural conversation model
M Ghazvininejad, C Brockett, MW Chang, B Dolan, J Gao, W Yih, ...
Proceedings of the AAAI Conference on Artificial Intelligence 32 (1), 2018
624 · 2018
Mask-predict: Parallel decoding of conditional masked language models
M Ghazvininejad, O Levy, Y Liu, L Zettlemoyer
arXiv preprint arXiv:1904.09324, 2019
511 · 2019
Generating Topical Poetry
M Ghazvininejad, X Shi, Y Choi, K Knight
Empirical Methods in Natural Language Processing (EMNLP), 2016
184 · 2016
Hafez: an Interactive Poetry Generation System
M Ghazvininejad, X Shi, J Priyadarshi, K Knight
Proceedings of the ACL Demo Track, 2017
183 · 2017
Towards controllable story generation
N Peng, M Ghazvininejad, J May, K Knight
Proceedings of the First Workshop on Storytelling, 43-49, 2018
165 · 2018
Pre-training via paraphrasing
M Lewis, M Ghazvininejad, G Ghosh, A Aghajanyan, S Wang, ...
Advances in Neural Information Processing Systems 33, 18470-18481, 2020
143 · 2020
Detecting hallucinated content in conditional neural sequence generation
C Zhou, G Neubig, J Gu, M Diab, P Guzman, L Zettlemoyer, ...
arXiv preprint arXiv:2011.02593, 2020
131 · 2020
DeLighT: Deep and light-weight transformer
S Mehta, M Ghazvininejad, S Iyer, L Zettlemoyer, H Hajishirzi
arXiv preprint arXiv:2008.00623, 2020
115 · 2020
Non-autoregressive machine translation with disentangled context transformer
J Kasai, J Cross, M Ghazvininejad, J Gu
International Conference on Machine Learning, 5144-5155, 2020
106* · 2020
Aligned cross entropy for non-autoregressive machine translation
M Ghazvininejad, V Karpukhin, L Zettlemoyer, O Levy
International Conference on Machine Learning, 3515-3523, 2020
102 · 2020
Training on synthetic noise improves robustness to natural noise in machine translation
V Karpukhin, O Levy, J Eisenstein, M Ghazvininejad
Proceedings of the 5th Workshop on Noisy User-generated Text (W-NUT 2019), 42-47, 2019
98 · 2019
A review on language models as knowledge bases
B AlKhamissi, M Li, A Celikyilmaz, M Diab, M Ghazvininejad
arXiv preprint arXiv:2204.06031, 2022
95 · 2022
Improving zero and few-shot abstractive summarization with intermediate fine-tuning and data augmentation
AR Fabbri, S Han, H Li, H Li, M Ghazvininejad, S Joty, D Radev, ...
arXiv preprint arXiv:2010.12836, 2020
90 · 2020
In-context examples selection for machine translation
S Agrawal, C Zhou, M Lewis, L Zettlemoyer, M Ghazvininejad
arXiv preprint arXiv:2212.02437, 2022
81 · 2022
Semi-autoregressive training improves mask-predict decoding
M Ghazvininejad, O Levy, L Zettlemoyer
arXiv preprint arXiv:2001.08785, 2020
60 · 2020
Natural language to code translation with execution
F Shi, D Fried, M Ghazvininejad, L Zettlemoyer, SI Wang
arXiv preprint arXiv:2204.11454, 2022
58 · 2022
Prompting contrastive explanations for commonsense reasoning tasks
B Paranjape, J Michael, M Ghazvininejad, L Zettlemoyer, H Hajishirzi
arXiv preprint arXiv:2106.06823, 2021
58 · 2021