Block-based Structured Adapter Pruning Experiments

This repository contains experiments for the block-based Structured Pruning Adapter (SPA), the Structured Pruning Low-rank PHM Adapter (SPLoPA), proposed in "Structured Pruning Adapters". The adapter is evaluated with various structured pruning methods for weight-based transfer learning of BERT on the SQuAD question-answering dataset.

Installation ⚙️

  • Clone this repository and enter it:
    git clone https://github.com/LukasHedegaard/block-spa-experiments.git
    cd block-spa-experiments
  • (Optional) Create and activate a conda environment:
    conda create --name block-spa-experiments python=3.10
    conda activate block-spa-experiments
  • Install as an editable module:
    pip install -e .[dev]

Run training + pruning 🏃‍♂️

The experiments are distributed over two branches:

  • splopa / master: Run the Structured Pruning Low-rank PHM Adapter (SPLoPA).
  • fine-pruning: Run fine-pruning baselines as found in the block-movement-pruning repository.

Check out the branch of choice (e.g. git checkout fine-pruning) before running an experiment.

For reproducibility, all hyper-parameters are specified in a detailed Excel sheet (hparams/hyperparameters.xslx).

To select and spawn a run, please follow these steps:

  1. Open hparams/hyperparameters.xslx.
  2. Note the "EXP ID" (col C) and "Effective encoder remain weights %" (col D) for the chosen row (this constitutes one run).
  3. Modify scripts/run_squad_from_excel_sheet.py accordingly. If multiple runs are specified, each is spawned in its own process.
  4. Run python scripts/run_squad_from_excel_sheet.py.
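The select-and-spawn logic of steps 2–4 can be sketched as follows. Everything here is illustrative: the run tuple, the build_command helper, and the command-line flags are hypothetical stand-ins for whatever scripts/run_squad_from_excel_sheet.py actually expects after your edits in step 3.

```python
import subprocess
import sys

# Hypothetical selection taken from hparams/hyperparameters.xslx:
# ("EXP ID" from column C, "Effective encoder remain weights %" from column D).
SELECTED_RUNS = [
    ("splopa-01", 50.0),
]

def build_command(exp_id, remain_weights_pct):
    """Build the command line for one run (flag names are illustrative)."""
    return [
        sys.executable,
        "scripts/run_squad_from_excel_sheet.py",
        "--exp-id", str(exp_id),
        "--remain-weights", str(remain_weights_pct),
    ]

def spawn_runs(runs):
    """Spawn each selected run in its own process, then wait for all of them."""
    procs = [subprocess.Popen(build_command(e, w)) for e, w in runs]
    for p in procs:
        p.wait()
```

Launching several rows at once with spawn_runs(SELECTED_RUNS) will run them concurrently, so make sure the selected runs do not contend for the same GPU.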

