Rank-One Model Editing (ROME)

This repository provides an implementation of Rank-One Model Editing (ROME) for auto-regressive transformers (GPU-only). We currently support OpenAI's GPT-2 XL (1.5B) and EleutherAI's GPT-J (6B). EleutherAI is expected to release a 20B GPT-like model soon, and we hope to support it as soon as possible.

Feel free to open an issue if you find any problems; we are actively developing this repository and will monitor tickets closely.

Colab ROME Demo

causal tracing GIF

Table of Contents

  1. Installation
  2. Causal Tracing
  3. Rank-One Model Editing (ROME)
  4. CounterFact Dataset
  5. Evaluation
  6. How to Cite

Installation

We recommend conda for managing Python, CUDA, and PyTorch-related dependencies, and pip for everything else. To get started, simply install conda and run:

./scripts/setup_conda.sh

Causal Tracing

notebooks/causal_trace.ipynb demonstrates Causal Tracing; the notebook can be modified to trace the processing of any factual statement.

causal tracing GIF
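For orientation, here is a minimal, self-contained sketch of the corrupt-then-restore idea behind Causal Tracing. This is illustrative code rather than the notebook's own implementation; the subject token positions, noise scale, and layer index below are arbitrary choices made for the example.

# Illustrative sketch of causal tracing (not the notebook's code): corrupt the
# subject-token embeddings with noise, restore one layer's hidden state at one
# position, and check how much probability the clean answer recovers.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

device = "cuda"
model = AutoModelForCausalLM.from_pretrained("gpt2-xl").to(device).eval()
tok = AutoTokenizer.from_pretrained("gpt2-xl")

prompt = "The Space Needle is located in the city of"
subject_positions = [0, 1, 2, 3]          # token positions covering the subject (illustrative)
restore_layer, restore_position = 17, 3   # arbitrary choices for this sketch

inp = tok(prompt, return_tensors="pt").to(device)
with torch.no_grad():
    clean = model(**inp, output_hidden_states=True)
clean_hidden = clean.hidden_states               # (embeddings, layer 1, ..., layer N)
answer_id = clean.logits[0, -1].argmax().item()  # token id of the clean top prediction

def corrupt_subject(module, inputs, output):
    # Add Gaussian noise to the subject-token embeddings.
    output = output.clone()
    output[:, subject_positions] += 0.1 * torch.randn_like(output[:, subject_positions])
    return output

def restore_hidden(module, inputs, output):
    # Patch the clean hidden state back in at (restore_layer, restore_position).
    hs = output[0] if isinstance(output, tuple) else output
    hs[:, restore_position] = clean_hidden[restore_layer + 1][:, restore_position]
    return output

h1 = model.transformer.wte.register_forward_hook(corrupt_subject)
h2 = model.transformer.h[restore_layer].register_forward_hook(restore_hidden)
with torch.no_grad():
    patched = model(**inp)
h1.remove(); h2.remove()

p_answer = torch.softmax(patched.logits[0, -1], dim=-1)[answer_id].item()
print(f"P(clean answer) with layer {restore_layer} restored: {p_answer:.4f}")

Sweeping the restored layer and token position, and plotting the recovered probability at each point, produces heatmaps like the one animated above.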

Rank-One Model Editing (ROME)

notebooks/rome.ipynb demonstrates ROME. The API is simple: specify the requested rewrite in the following form:

request = {
    "prompt": "{} plays the sport of",
    "subject": "LeBron James",
    "target_new": {
        "str": "football"
    }
}

Several similar examples are included in the notebook.
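To see where such a request fits, here is a hedged end-to-end sketch. The editing entry point referenced in the comments (demo_model_editing, imported from the notebook's helper module) and the list-of-requests format follow notebooks/rome.ipynb, but verify the exact import path and signature there before relying on them.

# A hedged sketch of driving an edit end to end; the actual editing call lives
# in notebooks/rome.ipynb and is only indicated in the comments below.
from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained("gpt2-xl").cuda()
tok = AutoTokenizer.from_pretrained("gpt2-xl")

# The rewrite request from the example above; the notebook bundles requests
# into a list so that several edits can be requested at once (verify there).
request = [
    {
        "prompt": "{} plays the sport of",
        "subject": "LeBron James",
        "target_new": {"str": "football"},
    }
]

# Prompts used to inspect the model's behavior before and after the edit.
generation_prompts = [
    "LeBron James plays the sport of",
    "LeBron James is known for",
]

# Apply the edit (see the notebook for the actual call), e.g.:
# from experiments.py.demo import demo_model_editing
# model_new, orig_weights = demo_model_editing(model, tok, request, generation_prompts)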

CounterFact

Description coming soon!

Evaluation

Paper Baselines

We compare ROME against several state-of-the-art model editors, each implemented in its own folder under baselines/. The implementations are not our own; they have been lightly adapted to plug into our evaluation system.

Running the Full Evaluation Suite

Description coming soon!

How to Cite

@article{meng2022locating,
  title={Locating and Editing Factual Knowledge in GPT},
  author={Kevin Meng and David Bau and Alex Andonian and Yonatan Belinkov},
  journal={arXiv preprint arXiv:2202.05262},
  year={2022}
}