https://arxiv.org/abs/2002.06823

Incorporating BERT into Neural Machine Translation (Jinhua Zhu, Yingce Xia, Lijun Wu, Di He, Tao Qin, Wengang Zhou, Houqiang Li, Tie-Yan Liu)

Not content with using pretraining alone, the authors go a step further and fuse BERT directly into the NMT model as a submodule.
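
A minimal sketch of that fusion idea, not the authors' code: one Transformer encoder layer that, besides its usual self-attention, also attends over the hidden states of a frozen BERT model and averages the two attention streams. Class and parameter names (`BertFusedEncoderLayer`, the 0.5 weighting, dimensions) are illustrative assumptions.

```python
import torch
import torch.nn as nn

class BertFusedEncoderLayer(nn.Module):  # hypothetical name, for illustration only
    def __init__(self, d_model=512, n_heads=8, d_ff=2048, dropout=0.1):
        super().__init__()
        self.self_attn = nn.MultiheadAttention(d_model, n_heads, dropout=dropout, batch_first=True)
        self.bert_attn = nn.MultiheadAttention(d_model, n_heads, dropout=dropout, batch_first=True)
        self.ffn = nn.Sequential(nn.Linear(d_model, d_ff), nn.ReLU(), nn.Linear(d_ff, d_model))
        self.norm1 = nn.LayerNorm(d_model)
        self.norm2 = nn.LayerNorm(d_model)
        self.dropout = nn.Dropout(dropout)

    def forward(self, x, bert_out):
        # x: (batch, src_len, d_model) NMT encoder states
        # bert_out: (batch, bert_len, d_model) frozen BERT hidden states, projected to d_model
        h_self, _ = self.self_attn(x, x, x)                 # standard self-attention
        h_bert, _ = self.bert_attn(x, bert_out, bert_out)   # extra attention over BERT outputs
        x = self.norm1(x + self.dropout(0.5 * (h_self + h_bert)))  # average the two streams (assumed weighting)
        x = self.norm2(x + self.dropout(self.ffn(x)))
        return x
```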

#language_model #bert #nmt #pretraining