# MovieChats

*(Figures: Training, Inference-1, Inference-2)*

## Dataset & Model Download

- MovieData
- chitchat_pretrain_model
- fine-tuned model

## Preprocessing

Each dialogue turn is represented as a single sequence that contains the previous user/system turns, the dialog act, the knowledge (None here), and the response:

```
[context][user]你觉得血色身材这部电影好看吗?[system]我觉得简直是太精彩了,[user]是吗?[system]就要讲述的故事,一环扣一环,剧情特别的惊心动魄,[user]是不是给人一种特别的紧张感?[system]简直你说的太实在了,[user]是吧,我觉得这部电影算是精品吧,[system]我觉得是精品中的精品,[user]怎么你对这部片儿的评价很高呀?[system]是啊,因为我喜欢看这一种嘛,简直是太好看了,[endofcontext][action][endofaction][knowledge][endofknowledge]
[response]行,那以后我要是有这种电影的话,我还推荐给你[endofresponse]
```

(Translation: [user] Do you think the movie 血色身材 is good? [system] I think it's simply wonderful. [user] Really? [system] The story it tells is tightly linked, and the plot is really thrilling. [user] Doesn't it give you a particularly tense feeling? [system] What you said is so true. [user] Right, I'd call this movie a fine work. [system] I'd call it the finest of the fine. [user] Why do you rate this film so highly? [system] Yes, because I like this kind of movie; it's just so good. Response: OK, if I come across movies like this in the future, I'll recommend them to you.)
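The tagged format above can be sketched in a few lines of Python. The helper `build_sequence` below is ours, not from this repo; only the special tokens (`[context]`, `[user]`, `[system]`, `[endofcontext]`, `[action]`, `[knowledge]`, `[response]`, and their closers) come from the example.

```python
# Sketch of the preprocessing format: join the context turns, dialog act,
# knowledge, and response into one tagged training sequence.
# `build_sequence` is a hypothetical helper, not part of this repo.

def build_sequence(context_turns, action=None, knowledge=None, response=""):
    """context_turns: list of (speaker, utterance) pairs, speaker in {"user", "system"}."""
    parts = ["[context]"]
    for speaker, utterance in context_turns:
        parts.append("[{}]{}".format(speaker, utterance))
    parts.append("[endofcontext]")
    parts.append("[action]{}[endofaction]".format(action or ""))
    parts.append("[knowledge]{}[endofknowledge]".format(knowledge or ""))
    parts.append("[response]{}[endofresponse]".format(response))
    return "".join(parts)

seq = build_sequence(
    [("user", "Do you think this movie is good?"),
     ("system", "I think it is excellent.")],
    response="Then I will recommend more movies like it to you.")
print(seq)
```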

## Usage

```shell
unzip ul_model_best.zip
unzip pretrain_model.zip
unzip movie_data.zip
```

## Requirements

- python 2.7+
- transformers==2.1.1

## Run

Train the model:

```shell
./train_ul_best.sh
```

or run the training script directly:

```shell
python ./train_ul_best.py --epochs 8 --batch_size 64 --pretrained_model ./pretrain_model/pytorch_model.bin
```

Hyper-parameter settings:

```json
{
  "initializer_range": 0.02,
  "layer_norm_epsilon": 1e-05,
  "n_ctx": 300,
  "n_embd": 768,
  "n_head": 12,
  "n_layer": 12,
  "n_positions": 300,
  "vocab_size": 13317
}
```
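The config above describes a 12-layer, 768-dimensional GPT-2-style model. As a rough sanity check, the standard GPT-2 parameter-count formula (our arithmetic, not from this repo) puts it at about 95M parameters:

```python
import json

# Hyper-parameters copied from the config above.
config = json.loads("""{
  "n_ctx": 300, "n_embd": 768, "n_head": 12,
  "n_layer": 12, "n_positions": 300, "vocab_size": 13317
}""")

d = config["n_embd"]       # hidden size
L = config["n_layer"]      # number of transformer blocks
V = config["vocab_size"]   # token vocabulary
P = config["n_positions"]  # learned position embeddings

# Per transformer block (standard GPT-2 layout):
#   attention: qkv projection (3*d*d + 3*d) + output projection (d*d + d)
#   MLP:       d -> 4d (4*d*d + 4*d) and 4d -> d (4*d*d + d)
#   2 LayerNorms: 4*d
per_block = 12 * d * d + 13 * d

# Token + position embeddings, all blocks, and the final LayerNorm.
total = V * d + P * d + L * per_block + 2 * d
print("approx. parameters: %.1fM" % (total / 1e6))  # ~95.5M
```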

## Reference

```bibtex
@inproceedings{su2020moviechats,
  title={MovieChats: Chat like Humans in a Closed Domain},
  author={Su, Hui and Shen, Xiaoyu and Xiao, Zhou and Zhang, Zheng and Chang, Ernie and Zhang, Cheng and Niu, Cheng and Zhou, Jie},
  booktitle={Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP)},
  pages={6605--6619},
  year={2020}
}
```

## Thanks

- [GPT2-chitchat](https://github.com/yangjianxin1/GPT2-chitchat)
- [CDial-GPT](https://github.com/thu-coai/CDial-GPT)