SARG

This repository is the official implementation of SARG: A Novel Semi Autoregressive Generator for Multi-turn Incomplete Utterance Restoration (AAAI 2021), built for a Python 3.6 environment with PyTorch 1.5.1.

Requirements

To install requirements:

pip install -r requirements.txt

Note: install the GPU build of PyTorch that matches your CUDA version.

Pretrained Models

  • First, download the pretrained models: RoBERTa-wwm-ext, Chinese for the Chinese dataset (rename its directory to chinese_roberta_wwm_ext_pytorch) and bert-base-uncased for the English dataset.
  • Second, rename bert_config.json to config.json inside chinese_roberta_wwm_ext_pytorch.
  • Finally, convert the BERT pretrained weights into initial SARG weights by running
python covert_weight_from_bert_to_sarg.py.
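The rename in the second step can be sketched as below. The directory layout is illustrative: the snippet creates a stand-in directory under a temporary path so it is safe to run anywhere; in practice, perform the rename inside your real chinese_roberta_wwm_ext_pytorch/ directory.

```shell
# Stand-in for the extracted RoBERTa-wwm-ext download (illustrative path).
dir="$(mktemp -d)/chinese_roberta_wwm_ext_pytorch"
mkdir -p "$dir"
touch "$dir/bert_config.json"                  # stand-in for the downloaded config

# Step 2: rename bert_config.json to config.json.
mv "$dir/bert_config.json" "$dir/config.json"

# Step 3: convert the BERT weights to SARG initial weights
# (run from the repository root, where the script lives):
# python covert_weight_from_bert_to_sarg.py
```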

Training

Restoration-200k

For the model with the coverage mechanism, we first train for 14,000 steps without the coverage loss, and then train until convergence with the weighted coverage loss added.

Our Restoration-200k experiments were conducted on 7 Tesla P40 GPUs. To reproduce the best performance reported in the paper, run:

sh scripts/run_train_chinese.sh

If you have fewer GPUs, set gradient_accumulation_steps to an appropriately larger value so that the effective batch size is unchanged.
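As a rule of thumb, the effective batch size is per-GPU batch size × number of GPUs × gradient accumulation steps. The arithmetic below (with an illustrative per-GPU batch size, not one taken from the paper or scripts) shows how raising the accumulation steps compensates for having fewer GPUs:

```shell
per_gpu_batch=8                # illustrative value, not from the paper
gpus=7; accum=2
echo "7 GPUs: $(( per_gpu_batch * gpus * accum ))"    # effective batch size 112
gpus=1; accum=14
echo "1 GPU:  $(( per_gpu_batch * gpus * accum ))"    # same effective batch size 112
```

Check scripts/run_train_chinese.sh for the actual per-GPU batch size and argument names before choosing a value.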

CANARD

Our CANARD experiments were conducted on a single GPU. We also found that the coverage loss does not help the overall model on this dataset. To train, run:

sh scripts/run_train_english.sh

Evaluation

To evaluate the model on Restoration-200k, run:

sh scripts/run_eval_chinese.sh

To evaluate the model on CANARD, run:

sh scripts/run_eval_english.sh

Citation

If you use this code in your research, please cite our paper:

@article{Huang_Li_Zou_Zhang_2021, 
  title={SARG: A Novel Semi Autoregressive Generator for Multi-turn Incomplete Utterance Restoration}, 
  author={Huang, Mengzuo and Li, Feng and Zou, Wuhe and Zhang, Weidong},
  journal={Proceedings of the AAAI Conference on Artificial Intelligence}, 
  volume={35}, 
  url={https://ojs.aaai.org/index.php/AAAI/article/view/17543}, 
  number={14}, 
  year={2021}, 
  month={May}, 
  pages={13055-13063} 
}

License

BSD 3-Clause
