https://arxiv.org/abs/2109.06466

Task-adaptive Pre-training and Self-training are Complementary for Natural Language Understanding (Shiyang Li, Semih Yavuz, Wenhu Chen, Xifeng Yan)

This is an approach that first does unsupervised pretraining on target-domain data via task-adaptive pretraining (TAPT), then runs self-training with pseudo labels. Admittedly this unsupervised/semi-supervised combination is the trendy way to go all-in, but I still think intermediate tasks are an interesting direction in their own right.
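
As a rough illustration, here is a minimal sketch of the TAPT-then-self-training pipeline the note describes, assuming Hugging Face transformers. The checkpoint names, confidence threshold, and data variables are placeholders for illustration, not details taken from the paper:

```python
import torch
from transformers import (
    AutoModelForMaskedLM,
    AutoModelForSequenceClassification,
    AutoTokenizer,
)

# Placeholder data: a small labeled set plus an unlabeled target-domain pool.
labeled_texts, labels = ["a labeled in-domain example"], [1]
unlabeled_texts = ["unlabeled target-domain text one",
                   "unlabeled target-domain text two"]

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")

# Step 1: task-adaptive pretraining -- continue masked-LM training on the
# unlabeled target-domain text (training loop elided).
mlm = AutoModelForMaskedLM.from_pretrained("bert-base-uncased")
# ... run MLM training on unlabeled_texts ...
mlm.save_pretrained("tapt-checkpoint")

# Step 2: fine-tune a classifier (the teacher) on the labeled data,
# initialized from the TAPT weights (training loop elided).
teacher = AutoModelForSequenceClassification.from_pretrained(
    "tapt-checkpoint", num_labels=2)
# ... supervised fine-tuning on (labeled_texts, labels) ...

# Step 3: self-training -- pseudo-label the unlabeled pool with the teacher
# and keep only high-confidence predictions for the student's training set.
teacher.eval()
with torch.no_grad():
    batch = tokenizer(unlabeled_texts, padding=True, truncation=True,
                      return_tensors="pt")
    probs = teacher(**batch).logits.softmax(dim=-1)
confidence, pseudo_labels = probs.max(dim=-1)
keep = confidence > 0.9  # threshold is an assumption, not from the paper
student_texts = labeled_texts + [t for t, k in zip(unlabeled_texts, keep) if k]
student_labels = labels + pseudo_labels[keep].tolist()
# ... retrain a student (again from the TAPT checkpoint) on the combined set ...
```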

#pretraining #finetuning #semi_supervised_learning #few_shot