
Grounded-Minimal-Edit

Code for EMNLP 2021 paper "Transferable Persona-Grounded Dialogues via Grounded Minimal Edits"

Dependencies

pip install transformers==2.3.0 torch==1.2.0 nltk matplotlib tensorboardX

Download the NLTK stopwords:

import nltk
nltk.download('stopwords')

Download the persona evaluator from here and save it as dnli-bert/dnli_model.bin.

PersonaMinEdit Dataset Format

Training data (data/personachat-ucpt/train.json):

[
    ...
    {
        "context": tuple of strs,
        "response": str,
        "persona": str or an empty list,
    },
    ...
]
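As a quick sanity check of the training format, the sketch below builds one made-up example and round-trips it through JSON (note that JSON has no tuple type, so the tuples above are stored as lists on disk; the field values here are illustrative, not from the dataset):

```python
import json

# One illustrative training example in the PersonaMinEdit training format.
# "persona" is either a string or an empty list.
train_example = {
    "context": ("hi , how are you today ?", "i am great , just got back from a run ."),
    "response": "nice ! i love running too , i do it every morning .",
    "persona": "i run every morning .",
}

# Round-trip through JSON as data/personachat-ucpt/train.json would store it.
loaded = json.loads(json.dumps([train_example]))[0]
assert isinstance(loaded["context"], list)   # tuples become JSON lists
assert isinstance(loaded["response"], str)
```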

Validation and test data (data/personachat-ucpt/{valid, test}.json):

[
    ...
    {
        "context": tuple of strs,
        "original_response": str,
        "intervening_persona": tuple of strs,
        "references": tuple of strs,
    },
    ...
]
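Each evaluation example carries multiple references, so edited responses can be scored with multi-reference metrics. Below is a minimal sketch using NLTK's sentence-level BLEU (NLTK is already in the dependency list); the example values are made up, and this is only an illustration, not the paper's evaluation script:

```python
from nltk.translate.bleu_score import sentence_bleu, SmoothingFunction

# Made-up example in the PersonaMinEdit valid/test format.
example = {
    "context": ["hi , how are you ?"],
    "original_response": "i am fine , i just got home .",
    "intervening_persona": ["i have two dogs ."],
    "references": [
        "i am fine , i just walked my two dogs .",
        "doing well , my two dogs kept me busy .",
    ],
}

# Score a hypothetical edited response against all references at once.
edited = "i am fine , i just walked my two dogs ."
refs = [r.split() for r in example["references"]]
score = sentence_bleu(refs, edited.split(),
                      smoothing_function=SmoothingFunction().method1)
assert 0.0 < score <= 1.0
```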

Grounded Minimal Editing Experiment

Use GME as the root directory.

Train

  1. Train the editor model.
python3 train.py 

Test

  1. Download the checkpoint (seed=0) from here and save it as outputs/saved_model/persona-chat-cprm-smooth_eps0.1-grad_thres3-tau3-0/best-model.ckpt. This step is unnecessary if you trained the model above.
python3 test.py 

Transfer Learning Experiment

Use GME-Zero-Shot as the root directory.

  1. Download the checkpoint (seed=0) from here and save it as outputs/saved_model/persona-chat-cprm-smooth_eps0.1-grad_thres3-tau3-0/best-model.ckpt. If you already trained the model in the Grounded Minimal Editing experiment, copy its saved checkpoint to outputs/saved_model/persona-chat-cprm-smooth_eps0.1-grad_thres3-tau3-0/best-model.ckpt.
python3 test.py 
