
PairCFR: Enhancing Model Training on Paired Counterfactually Augmented Data through Contrastive Learning

Contents of the Repository

Introduction

This repository provides the data and code for PairCFR. The data covers sentiment analysis and natural language inference tasks and includes the counterfactually augmented data used by our method as well as the back-translation augmented data used by the comparison methods. The code includes the definitions of all models and the combined contrastive and cross-entropy loss. More details can be found in our paper.
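
The exact form of the combined objective is defined in the paper and the code; as a rough, hypothetical sketch of the idea, a weighted sum of cross-entropy and a supervised-contrastive-style term over a batch of sentence embeddings might look like the following (the weight lambda_cl, the temperature, and the SupCon-style formulation are assumptions, not necessarily the exact loss used here):

import torch
import torch.nn.functional as F

def combined_loss(embeddings, logits, labels, lambda_cl=0.5, temperature=0.1):
    # Sketch only: cross-entropy on the classification head plus a
    # supervised-contrastive-style term on the sentence embeddings.
    # embeddings: (B, D) encoder representations, logits: (B, C), labels: (B,)
    ce = F.cross_entropy(logits, labels)

    # Cosine similarities between all pairs in the batch, scaled by a temperature.
    z = F.normalize(embeddings, dim=1)
    sim = z @ z.t() / temperature
    self_mask = torch.eye(len(labels), dtype=torch.bool, device=z.device)
    sim = sim.masked_fill(self_mask, float('-inf'))           # ignore self-pairs
    log_prob = sim - torch.logsumexp(sim, dim=1, keepdim=True)

    # Positives are other examples with the same label; a counterfactual with a
    # flipped label therefore acts as a hard negative for its original sentence.
    pos_mask = (labels.unsqueeze(0) == labels.unsqueeze(1)) & ~self_mask
    pos_count = pos_mask.sum(dim=1).clamp(min=1)              # rows without positives contribute 0
    supcon = -(log_prob.masked_fill(~pos_mask, 0.0)).sum(dim=1) / pos_count

    return ce + lambda_cl * supcon.mean()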

Data

We use the human-in-the-loop counterfactually augmented data (CAD) provided by Kaushik et al. (2019) [1]: counterfactually-augmented-data.

Task                          Domain               Classes   Original : counterfactual ratio
sentiment analysis            IMDb movie reviews   2         1:1
natural language inference    SNLI dataset         3         1:4
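
The exact file layout follows the released CAD; as a purely illustrative sketch of what "paired" means here, each original example can be grouped with its counterfactual revisions so that they land in the same training batch (the field names below are placeholders, not the actual data schema):

from dataclasses import dataclass
from typing import List

@dataclass
class PairedExample:
    # One original example with its counterfactual revisions:
    # 1 revision per review for IMDb (1:1), 4 revised pairs per original for SNLI (1:4).
    original_text: str
    original_label: int
    counterfactual_texts: List[str]
    counterfactual_labels: List[int]

def flatten_pair(example):
    # Keep an original and its counterfactuals in the same mini-batch so the
    # contrastive term can compare them directly.
    texts = [example.original_text] + example.counterfactual_texts
    labels = [example.original_label] + example.counterfactual_labels
    return texts, labels

# Hypothetical IMDb-style pair (labels: 1 = positive, 0 = negative).
pair = PairedExample(
    original_text="The acting was wooden, but the story kept me hooked.",
    original_label=1,
    counterfactual_texts=["The acting was wooden, and the story lost me early."],
    counterfactual_labels=[0],
)
texts, labels = flatten_pair(pair)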

Other test data sources:

[1] Kaushik D., Hovy E., Lipton Z. Learning the Difference that Makes a Difference with Counterfactually-Augmented Data. International Conference on Learning Representations, 2019.

Models

Every model is a pre-trained encoder plus a classification head. The pre-trained models used in our experiments can be fetched from the HuggingFace hub under the following model names (a minimal loading sketch follows the list):

  • bert-base-uncased
  • roberta-base
  • t5-base
  • sentence-transformers/multi-qa-distilbert-cos-v1
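
The actual model definitions live in the repository code; as a minimal sketch of the encoder-plus-head setup (assuming HuggingFace transformers, an AutoModel encoder, and a single linear head, which may differ from the exact implementation):

import torch.nn as nn
from transformers import AutoModel, AutoTokenizer

class EncoderWithHead(nn.Module):
    # Sketch: a pre-trained encoder followed by a linear classification head.
    def __init__(self, model_name="bert-base-uncased", num_labels=2):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(model_name)
        self.head = nn.Linear(self.encoder.config.hidden_size, num_labels)

    def forward(self, input_ids, attention_mask):
        out = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
        pooled = out.last_hidden_state[:, 0]   # first-token ([CLS]) representation
        return pooled, self.head(pooled)       # embeddings for the contrastive term,
                                               # logits for the cross-entropy term

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = EncoderWithHead("bert-base-uncased", num_labels=2)
batch = tokenizer(["an example review"], return_tensors="pt", padding=True)
embeddings, logits = model(batch["input_ids"], batch["attention_mask"])

Note that t5-base is an encoder-decoder checkpoint, so a sketch like this would call only its encoder (e.g. model.encoder(...)) rather than the full model.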

Running

Environment

  • Python 3.8
  • PyTorch 2.0.1

To run the code, install the appropriate PyTorch version and the remaining dependencies:

pip install torch==2.0.1
pip install -r requirements.txt

Run the fine-tuning code on the IMDb CAD:

cd runimdb
python run_bash.py

Run the fine-tuning code on the SNLI CAD:

cd runsnli
python run_bash.py
