Contact: Guan-Lin Chao ([email protected])
An implementation of BERT-DST: Scalable End-to-End Dialogue State Tracking with Bidirectional Encoder Representations from Transformer (Interspeech 2019).
Tested with Python 3.6 and TensorFlow==1.13.0rc0.
- bert: the BERT code.
- uncased_L-12_H-768_A-12: the pretrained [BERT-Base, Uncased] model checkpoint (download link in bert).
Supported datasets: dstc2-clean, woz_2.0, sim-M, and sim-R.