Fine-tuning pre-trained cross-lingual language models can transfer task-specific supervision from one language to the others. In this work, we propose to improve cross-lingual fine-tuning with consistency regularization. Specifically, we use example consistency regularization to penalize the prediction sensitivity to four types of data …

Wanxiang Che, Harbin Institute of Technology, Harbin, China.
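The abstract above mentions example consistency regularization, which penalizes how much a model's prediction changes between an original example and an augmented view of it. A minimal NumPy sketch of the general idea (not the paper's exact objective) is a cross-entropy term plus a symmetric-KL penalty between the two views; the function names and the weight `lam` here are illustrative, not from the paper:

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax over the last axis.
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def kl(p, q, eps=1e-12):
    # KL(p || q), summed over all entries of the batch.
    return float(np.sum(p * (np.log(p + eps) - np.log(q + eps))))

def consistency_loss(logits_orig, logits_aug, labels, lam=1.0):
    """Cross-entropy on the original example plus a symmetric-KL
    penalty on the prediction gap between original and augmented views."""
    p, q = softmax(logits_orig), softmax(logits_aug)
    ce = -float(np.log(p[np.arange(len(labels)), labels] + 1e-12).mean())
    reg = 0.5 * (kl(p, q) + kl(q, p)) / len(labels)
    return ce + lam * reg
```

When the two views produce identical logits the penalty vanishes and the loss reduces to plain cross-entropy; the more the augmented prediction drifts from the original, the larger the loss becomes.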
车万翔 (Wanxiang Che)
Introduction. Whole Word Masking (wwm, rendered in Chinese as 全词Mask or 整词Mask) is an upgraded version of BERT released by Google on May 31, 2019, which mainly changes how training examples are generated in the original pre-training stage. In short, the original WordPiece tokenization splits a complete word into several subwords, and when training examples are generated, these separated subwords are masked independently at random. http://ir.hit.edu.cn/~car/
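The contrast described above can be sketched in a few lines: WordPiece marks word-internal subwords with a `##` prefix, so whole-word masking first groups subwords back into words and then masks every subword of a chosen word together. This is a simplified illustration, assuming a `[MASK]` placeholder token; the function names are my own, not from any library:

```python
import random

def group_whole_words(tokens):
    """Group WordPiece tokens into whole words.
    A token with the '##' prefix continues the preceding word."""
    words = []
    for tok in tokens:
        if tok.startswith("##") and words:
            words[-1].append(tok)
        else:
            words.append([tok])
    return words

def whole_word_mask(tokens, mask_prob=0.15, rng=None):
    """Mask entire words: if a word is selected, every one of its
    subwords becomes [MASK] (unlike per-subword random masking)."""
    rng = rng or random.Random(0)
    masked = []
    for word in group_whole_words(tokens):
        if rng.random() < mask_prob:
            masked.extend(["[MASK]"] * len(word))
        else:
            masked.extend(word)
    return masked
```

For example, if `play` and `##ing` form one word, either both pieces are masked or neither is, whereas the original strategy could mask `##ing` while leaving `play` visible.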
[2009.11616v1] N-LTP: An Open-source Neural Chinese Language …
http://ir.hit.edu.cn/~car/zh/

A Preliminary Evaluation of ChatGPT for Zero-shot Dialogue Understanding. Wenbo Pan, Qiguang Chen, Xiao Xu, Wanxiang Che, Libo Qin. Zero-shot dialogue understanding aims to enable dialogue systems to track the user's needs without any training data, which has gained increasing attention. In this work, we investigate the understanding …

I obtained my Ph.D. degree in computer science from the SCIR lab (led by Prof. Ting Liu) at Harbin Institute of Technology in 2024, under the supervision of Prof. Haifeng Wang …