NeurIPS 2019
Sun Dec 8th through Sat Dec 14th, 2019, at the Vancouver Convention Center
Paper ID: 3088
Title: XLNet: Generalized Autoregressive Pretraining for Language Understanding

The paper proposes XLNet, a generalized autoregressive pretraining method for language representation learning, and shows that XLNet outperforms the state-of-the-art method, BERT, on 12 tasks. The paper is of high quality in terms of clarity, technical soundness, significance, and novelty. The authors successfully addressed the issues raised by the reviewers, and the reviewers are satisfied with the response.