Commit 159b8bb

Merge branch 'main' into dev/prompt_tuning

lenglaender committed Nov 19, 2023
2 parents d4a817c + d6d44cd
Showing 35 changed files with 4,559 additions and 3,646 deletions.
3 changes: 3 additions & 0 deletions .gitignore
@@ -172,3 +172,6 @@ scripts/git-strip-merge
 # DS_Store (MacOS)
 .DS_Store
 tests_adapters/backwards_compatibility/Ref_Out
+
+# backwards compatibility
+model_outputs
2 changes: 1 addition & 1 deletion LICENSE
@@ -1,4 +1,4 @@
-Copyright 2018-2021 The Hugging Face Team and The AdapterHub Team. All rights reserved.
+Copyright 2020-2023 The AdapterHub Team. All rights reserved.
 
 Apache License
 Version 2.0, January 2004
113 changes: 75 additions & 38 deletions README.md
@@ -14,42 +14,10 @@ See the License for the specific language governing permissions and
 limitations under the License.
 -->
 
-## `adapters` Library
-
-This branch contains the development version of `adapters`, the next-generation library for parameter-efficient and modular transfer learning.
-
-> **Note**: For the stable version of `adapter-transformers`, please switch to the [master branch of the repo](https://github.com/adapter-hub/adapter-transformers).
-
-### Changes compared to `adapter-transformers`
-
-- `adapters` is a standalone package, using `transformers` as an external dependency but not patching it directly
-- All adapter-related classes now are imported via `adapters` namespace, e.g.:
-  ```python
-  from adapters import BertAdapterModel
-  # ...
-  ```
-- Built-in HF model classes can be adapted for usage with adapters via a wrapper method, e.g.:
-  ```python
-  import adapters
-  from transformers import BertModel
-
-  model = BertModel.from_pretrained("bert-base-uncased")
-  adapters.init(model)
-  ```
-
-Features not (yet) working:
-
-- Loading model + adapter checkpoints using HF classes
-- Using Transformers pipelines with adapters
-- Using HF language modeling classes with invertible adapters
-
-## Documentation
-To read the documentation of _Adapters_, follow the steps in [docs/README.md](docs/README.md). Currently, the documentation is **not** yet available from https://docs.adapterhub.ml/.
-
----
+> **Note**: This repository holds the codebase of the _Adapters_ library, which has replaced `adapter-transformers`. For the legacy codebase, go to: https://github.com/adapter-hub/adapter-transformers-legacy.
+
 <p align="center">
-<img style="vertical-align:middle" src="https://raw.githubusercontent.com/Adapter-Hub/adapter-transformers/master/adapter_docs/logo.png" />
+<img style="vertical-align:middle" src="https://raw.githubusercontent.com/Adapter-Hub/adapters/main/docs/logo.png" />
 </p>
<h1 align="center">
<span><i>Adapters</i></span>
@@ -59,8 +27,8 @@
A Unified Library for Parameter-Efficient and Modular Transfer Learning
</h3>

-![Tests](https://github.com/Adapter-Hub/adapter-transformers/workflows/Tests/badge.svg?branch=adapters)
-[![GitHub](https://img.shields.io/github/license/adapter-hub/adapter-transformers.svg?color=blue)](https://github.com/adapter-hub/adapter-transformers/blob/adapters/LICENSE)
+![Tests](https://github.com/Adapter-Hub/adapters/workflows/Tests/badge.svg?branch=adapters)
+[![GitHub](https://img.shields.io/github/license/adapter-hub/adapters.svg?color=blue)](https://github.com/adapter-hub/adapters/blob/main/LICENSE)
[![PyPI](https://img.shields.io/pypi/v/adapters)](https://pypi.org/project/adapters/)

`adapters` is an add-on to [HuggingFace's Transformers](https://github.com/huggingface/transformers) library, integrating adapters into state-of-the-art language models by incorporating **[AdapterHub](https://adapterhub.ml)**, a central repository for pre-trained adapter modules.
@@ -77,13 +45,82 @@ pip install -U adapters
... or from source by cloning the repository:

```
-git clone https://github.com/adapter-hub/adapter-transformers.git
+git clone https://github.com/adapter-hub/adapters.git
-git checkout adapters
 cd adapters
 pip install .
```

-## Getting Started
+## Quick Tour

#### Load pre-trained adapters:

```python
from adapters import AutoAdapterModel
from transformers import AutoTokenizer

model = AutoAdapterModel.from_pretrained("roberta-base")
tokenizer = AutoTokenizer.from_pretrained("roberta-base")

model.load_adapter("AdapterHub/roberta-base-pf-imdb", source="hf", set_active=True)

print(model(**tokenizer("This works great!", return_tensors="pt")).logits)
```
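The returned logits are scores over the task's classes; a minimal follow-up sketch for turning them into a class index (assumes PyTorch, which `adapters` builds on, and the two-label IMDB head loaded above):

```python
import torch

# Hypothetical post-processing: pick the higher-scoring of the two IMDB classes
outputs = model(**tokenizer("This works great!", return_tensors="pt"))
predicted_class = torch.argmax(outputs.logits, dim=-1).item()
print(predicted_class)
```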

**[Learn More](https://docs.adapterhub.ml/loading.html)**

#### Adapt existing model setups:

```python
import adapters
from transformers import AutoModelForSequenceClassification

model = AutoModelForSequenceClassification.from_pretrained("t5-base")

adapters.init(model)

model.add_adapter("my_lora_adapter", config="lora")
model.train_adapter("my_lora_adapter")

# Your regular training loop...
```
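After training, the adapter can be saved on its own, separately from the base model; a minimal sketch, assuming the `my_lora_adapter` from above and a hypothetical output path:

```python
# Stores only the LoRA weights plus the adapter config, not the full model
model.save_adapter("./saved_lora_adapter", "my_lora_adapter")
```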

**[Learn More](https://docs.adapterhub.ml/quickstart.html)**

#### Flexibly configure adapters:

```python
from adapters import ConfigUnion, PrefixTuningConfig, ParBnConfig, AutoAdapterModel

model = AutoAdapterModel.from_pretrained("microsoft/deberta-v3-base")

adapter_config = ConfigUnion(
    PrefixTuningConfig(prefix_length=20),
    ParBnConfig(reduction_factor=4),
)
model.add_adapter("my_adapter", config=adapter_config, set_active=True)
```
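A closely related prefix-tuning-plus-parallel-bottleneck combination also ships as a built-in shorthand; a sketch, assuming the `MAMConfig` (Mix-and-Match adapters) class with its default settings:

```python
from adapters import MAMConfig

# MAMConfig is itself a ConfigUnion of prefix tuning and a parallel bottleneck adapter
model.add_adapter("my_mam_adapter", config=MAMConfig(), set_active=True)
```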

**[Learn More](https://docs.adapterhub.ml/overview.html)**

#### Easily compose adapters in a single model:

```python
from adapters import AdapterSetup, AutoAdapterModel
import adapters.composition as ac
from transformers import AutoTokenizer

model = AutoAdapterModel.from_pretrained("roberta-base")
tokenizer = AutoTokenizer.from_pretrained("roberta-base")

qc = model.load_adapter("AdapterHub/roberta-base-pf-trec")
sent = model.load_adapter("AdapterHub/roberta-base-pf-imdb")

with AdapterSetup(ac.Parallel(qc, sent)):
    print(model(**tokenizer("What is AdapterHub?", return_tensors="pt")))
```
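Other composition blocks follow the same pattern; for instance, a sketch that chains the two adapters loaded above with `Stack` and activates them persistently instead of via the context manager:

```python
# Hypothetical alternative: run inputs through qc and sent in sequence
model.set_active_adapters(ac.Stack(qc, sent))
```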

**[Learn More](https://docs.adapterhub.ml/adapter_composition.html)**

## Useful Resources

HuggingFace's great documentation on getting started with _Transformers_ can be found [here](https://huggingface.co/transformers/index.html). `adapters` is fully compatible with _Transformers_.

5 changes: 5 additions & 0 deletions docs/_static/custom.css
@@ -31,3 +31,8 @@ a {
 .wy-side-scroll {
   padding-bottom: 1em;
 }
+
+/* override table no-wrap */
+.wy-table-responsive table td, .wy-table-responsive table th {
+  white-space: normal;
+}
2 changes: 1 addition & 1 deletion docs/classes/adapter_config.rst
@@ -6,7 +6,7 @@ Classes representing the architectures of adapter modules and fusion layers.
 Single (bottleneck) adapters
 ~~~~~~~~~~~~~~~~~~~~~~~~~~~~
 
-.. autoclass:: adapters.AdapterConfigBase
+.. autoclass:: adapters.AdapterConfig
    :members:
 
 .. autoclass:: adapters.BnConfig
6 changes: 3 additions & 3 deletions docs/conf.py
@@ -19,9 +19,9 @@
 
 # -- Project information -----------------------------------------------------
 
-project = "adapters"
-copyright = "2020-2022, Adapter-Hub Team"
-author = "Adapter-Hub Team"
+project = "AdapterHub"
+copyright = "2020-2023, AdapterHub Team"
+author = "AdapterHub Team"
 
 docs_versions = [
     "adapters1.1.1",
4 changes: 2 additions & 2 deletions docs/contributing/adding_adapter_methods.md
@@ -18,14 +18,14 @@ Therefore, the described steps might not be applicable to each implementation.
 These module classes then have to be inserted into the correct locations within the Transformer model implementation.
 Thus, each adapter method implementation should provide at least two classes:
 
-- a configuration class deriving from `AdapterConfigBase` that provides attributes for all configuration options of the method
+- a configuration class deriving from `AdapterConfig` that provides attributes for all configuration options of the method
 - a module class deriving from the abstract `AdapterLayerBase` that provides the method parameters and a set of standard adapter management functions
   - modules supporting [adapter composition](https://docs.adapterhub.ml/adapter_composition.html) should instead derive from `ComposableAdapterLayerBase`

### Configuration

All configuration classes reside in `src/adapters/configuration/adapter_config.py`.
-- To add a new configuration class for a new method, create a new subclass of [`AdapterConfigBase`](adapters.AdapterConfigBase).
+- To add a new configuration class for a new method, create a new subclass of [`AdapterConfig`](adapters.AdapterConfig).
   Make sure to set the `architecture` attribute in your class.
- Finally, also make sure the config class is added to the `__init__.py` files in `src/adapters`.
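For illustration, a minimal sketch of such a configuration class (the names `MyMethodConfig` and `"my_method"` are hypothetical; the dataclass style follows the built-in configs in `adapter_config.py`):

```python
from dataclasses import dataclass

from adapters import AdapterConfig


@dataclass(eq=False)
class MyMethodConfig(AdapterConfig):
    # identifies the adapter method this config belongs to (hypothetical value)
    architecture: str = "my_method"
    # an example configuration option of the new method
    reduction_factor: int = 16
```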

18 changes: 14 additions & 4 deletions docs/index.rst
@@ -6,19 +6,29 @@
 AdapterHub Documentation
 ================================================
 
+.. note::
+   This documentation is based on the new *Adapters* library.
+
+   The documentation based on the legacy *adapter-transformers* library can be found at: `https://docs-legacy.adapterhub.ml <https://docs-legacy.adapterhub.ml>`_.
+
 *AdapterHub* is a framework simplifying the integration, training and usage of adapters and other efficient fine-tuning methods for Transformer-based language models.
 For a full list of currently implemented methods, see the `table in our repository <https://github.com/adapter-hub/adapters#implemented-methods>`_.
 
 The framework consists of two main components:
 
-- ``adapters``, an extension of Hugging Face's `Transformers <https://huggingface.co/transformers/>`_ library that adds adapter components to transformer models
-- `The Hub <https://adapterhub.ml>`_, a central repository collecting pre-trained adapter modules
+.. list-table::
+   :widths: 50 50
+   :header-rows: 1
+
+   * - `Adapters <https://github.com/adapter-hub/adapters>`_
+     - `AdapterHub.ml <https://adapterhub.ml/explore>`_
+   * - an add-on to Hugging Face's `Transformers <https://huggingface.co/transformers/>`_ library that adds adapters into transformer models
+     - a central collection of pre-trained adapter modules
Currently, we support the PyTorch versions of all models as listed on the `Model Overview <model_overview.html>`_ page.

 .. toctree::
-   :maxdepth: 2
+   :maxdepth: 1
    :caption: Getting Started
 
    installation
@@ -79,7 +89,7 @@
    classes/models/xmod
 
 .. toctree::
-   :maxdepth: 2
+   :maxdepth: 1
    :caption: Adapter-Related Classes
 
    classes/adapter_config
2 changes: 1 addition & 1 deletion docs/overview.md
@@ -64,7 +64,7 @@
 ## Configuration
 
 All supported adapter methods can be added, trained, saved and shared using the same set of model class functions (see [class documentation](adapters.ModelAdaptersMixin)).
-Each method is specified and configured using a specific configuration class, all of which derive from the common [`AdapterConfigBase`](adapters.AdapterConfigBase) class.
+Each method is specified and configured using a specific configuration class, all of which derive from the common [`AdapterConfig`](adapters.AdapterConfig) class.
E.g., adding one of the supported adapter methods to an existing model instance follows this scheme:
```python
model.add_adapter("name", config=<ADAPTER_CONFIG>)
```
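For example, with one of the concrete config classes (a sketch using `SeqBnConfig`, the renamed sequential bottleneck config; `reduction_factor` is one of its options and the adapter name is arbitrary):

```python
from adapters import SeqBnConfig

model.add_adapter("bottleneck_adapter", config=SeqBnConfig(reduction_factor=16))
```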
2 changes: 1 addition & 1 deletion docs/training.md
@@ -59,7 +59,7 @@
 # task adapter - only add if not existing
 if task_name not in model.adapters_config:
     # resolve the adapter config
-    adapter_config = AdapterConfigBase.load(adapter_args.adapter_config)
+    adapter_config = AdapterConfig.load(adapter_args.adapter_config)
     # add a new adapter
     model.add_adapter(task_name, config=adapter_config)
 # Enable adapter training
6 changes: 6 additions & 0 deletions docs/transitioning.md
@@ -50,6 +50,12 @@
For a complete list of config strings and classes see [here](https://docs.adapterhub.ml/overview.html). We strongly recommend using the new config strings, but we will continue to support the old config strings for the time being to make the transition easier.
Note that with the config strings the corresponding adapter config classes have changed, e.g. `PfeifferConfig` -> `SeqBnConfig`.

Another consequence of this change is that the `AdapterConfig` class is no longer specific to bottleneck adapters: it is now the base class of all adapter configurations (previously `AdapterConfigBase`), so the role this class serves has changed. However, you can still load adapter configs with:
```python
adapter_config = AdapterConfig.load("lora")
```
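Legacy config strings resolve through the same entry point, so existing code keeps working during the transition; a sketch, assuming `"pfeiffer"` as the legacy string for what is now `seq_bn`:

```python
# Both resolve to a sequential bottleneck adapter config
new_style = AdapterConfig.load("seq_bn")
old_style = AdapterConfig.load("pfeiffer")
```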


## Features that are not supported by `adapters`

Compared to `adapter-transformers`, there are a few features that are no longer supported by the `adapters` library:
8 changes: 3 additions & 5 deletions examples/pytorch/dependency-parsing/run_udp.py
@@ -13,7 +13,7 @@
 
 import adapters
 import adapters.composition as ac
-from adapters import AdapterArguments, AdapterConfigBase, AutoAdapterModel, setup_adapter_training
+from adapters import AdapterArguments, AdapterConfig, AutoAdapterModel, setup_adapter_training
 from preprocessing import preprocess_dataset
 from transformers import AutoConfig, AutoTokenizer, HfArgumentParser, set_seed
 from utils_udp import UD_HEAD_LABELS, DependencyParsingAdapterTrainer, DependencyParsingTrainer, UDTrainingArguments
@@ -252,7 +252,7 @@ def main():
         logger.info("Loading best model for predictions.")
 
     if adapter_args.train_adapter:
-        adapter_config = AdapterConfigBase.load(adapter_args.adapter_config, **adapter_config_kwargs)
+        adapter_config = AdapterConfig.load(adapter_args.adapter_config, **adapter_config_kwargs)
         model.load_adapter(
             os.path.join(training_args.output_dir, "best_model", task_name)
             if training_args.do_train
@@ -262,9 +262,7 @@
             **adapter_load_kwargs,
         )
         if adapter_args.load_lang_adapter:
-            lang_adapter_config = AdapterConfigBase.load(
-                adapter_args.lang_adapter_config, **adapter_config_kwargs
-            )
+            lang_adapter_config = AdapterConfig.load(adapter_args.lang_adapter_config, **adapter_config_kwargs)
             lang_adapter_name = model.load_adapter(
                 os.path.join(training_args.output_dir, "best_model", lang_adapter_name)
                 if training_args.do_train