Integrate X-LoRA #1491

Merged · 190 commits · Jul 5, 2024
Commits
9aeb69c
Initial commit of integration
EricLBuehler Feb 18, 2024
b9d3878
Pass a reference back to the peftmodel when creating
EricLBuehler Feb 20, 2024
bbd3ad4
Check base model in from_pretrained
EricLBuehler Feb 20, 2024
ea53917
Fix assert, attr
EricLBuehler Feb 20, 2024
79c4df9
Fix inheritance from peft model
EricLBuehler Feb 20, 2024
2ee88af
Update inheritance again
EricLBuehler Feb 20, 2024
19063dc
Update inheritance again and properly instantiate
EricLBuehler Feb 20, 2024
d364e5b
Update comment
EricLBuehler Feb 20, 2024
c9ab310
Export config, model
EricLBuehler Feb 21, 2024
9ed665e
Remove use of default attr
EricLBuehler Feb 21, 2024
ae32006
Remove use of default attr
EricLBuehler Feb 21, 2024
9bfe8a6
Update imports
EricLBuehler Feb 21, 2024
048958f
Update imports
EricLBuehler Feb 21, 2024
024ce15
Update imports
EricLBuehler Feb 21, 2024
3d08fc9
Work on circular import
EricLBuehler Feb 21, 2024
7e5332b
Work on circular import again
EricLBuehler Feb 21, 2024
d5193f4
Remove another circular import
EricLBuehler Feb 21, 2024
5d7ff64
Remove another circular import
EricLBuehler Feb 21, 2024
b41089d
Slightly refactor
EricLBuehler Feb 21, 2024
bfd794f
Update with EricLBuehler/xlora#20
EricLBuehler Feb 21, 2024
726f8a8
Make signature an exact copy
EricLBuehler Feb 21, 2024
8a1c68c
Refactor util fns
EricLBuehler Feb 21, 2024
1a34615
Update the typing structure
EricLBuehler Feb 21, 2024
1f0e03a
Pass super() the dict
EricLBuehler Feb 21, 2024
c326560
Add capability to disable default injection
EricLBuehler Feb 21, 2024
c9dab76
Set base model for loading adapter
EricLBuehler Feb 21, 2024
980b2c8
Set peft type
EricLBuehler Feb 21, 2024
c3f4ed9
Default adapter name
EricLBuehler Feb 21, 2024
effaca0
Ensure all nonzero length
EricLBuehler Feb 21, 2024
031fabb
Fix get nb trainable params
EricLBuehler Feb 21, 2024
9a7943e
Add some docs to tuner class
EricLBuehler Feb 22, 2024
deeb169
Update post init of xlora config
EricLBuehler Feb 22, 2024
1e73f3d
Mention method swapping
EricLBuehler Feb 22, 2024
73901e8
Remove incorrect example
EricLBuehler Feb 22, 2024
279036d
Update to use default args for compat
EricLBuehler Feb 22, 2024
644a562
Move API for nice visibility
EricLBuehler Feb 22, 2024
6e3dc3d
Remove use of __slots__
EricLBuehler Feb 22, 2024
012ed6f
Remove passing of classifier
EricLBuehler Feb 22, 2024
510247d
Remove passing of classifier
EricLBuehler Feb 22, 2024
f3e3d1f
Remove many asserts, converting to exceptions
EricLBuehler Feb 22, 2024
a24f51b
Update naming convention
EricLBuehler Feb 22, 2024
358ea19
Refactor Lora layers to reduce code repetition
EricLBuehler Feb 22, 2024
e6126fa
Remove _disable_inject hack
EricLBuehler Feb 22, 2024
de76531
Remove passing peftmodel to super
EricLBuehler Feb 22, 2024
1dace56
Use post init fn to improve separation of concerns
EricLBuehler Feb 22, 2024
e11e433
Update naming and fix call
EricLBuehler Feb 23, 2024
63debdb
Remove 'n_predictions_lifetime'
EricLBuehler Feb 23, 2024
226ab92
Move generate to XLoraModel, simplifying PeftModelWrapper
EricLBuehler Feb 23, 2024
feba4f6
Simplify save pretrained and from pretrained
EricLBuehler Feb 23, 2024
fd0ab4b
Do not save to nested dir
EricLBuehler Feb 23, 2024
c44cd5a
Use new _save_pretrained_hook to reduce code duplication
EricLBuehler Feb 23, 2024
1f36837
Add explaining comment
EricLBuehler Feb 23, 2024
d2d3b6a
Call .clear on log scalings
EricLBuehler Feb 23, 2024
9a7032b
Separate method use cases
EricLBuehler Feb 23, 2024
f4a5314
Prefix with _
EricLBuehler Feb 23, 2024
cf4fa9d
Remove get, print trainable params as they are redundant
EricLBuehler Feb 23, 2024
862b880
Fix inclusion of old kwarg
EricLBuehler Feb 23, 2024
9c5e3be
Remove circular imports
EricLBuehler Feb 28, 2024
34694d5
Override method
EricLBuehler Feb 28, 2024
18bd1b8
Set target modules to None
EricLBuehler Feb 28, 2024
306bb34
Do not set target modules to None
EricLBuehler Feb 28, 2024
b8121ac
Try to avoid checking target_modules
EricLBuehler Feb 28, 2024
44cdc07
Remove circular import
EricLBuehler Feb 28, 2024
ba905a6
Override another method
EricLBuehler Feb 28, 2024
d5446f6
Override _check_target_module_exists
EricLBuehler Feb 28, 2024
2e7f51a
Make method instance
EricLBuehler Feb 28, 2024
7c354d9
Make a nicer check for having target_modules
EricLBuehler Feb 28, 2024
76b26b5
Avoid unnecessary injection
EricLBuehler Feb 28, 2024
5828ab4
Remove xlora conf from mark only adapters as trainable
EricLBuehler Feb 28, 2024
9b56c2a
Account for property
EricLBuehler Feb 28, 2024
0ea1fa2
Call correct method
EricLBuehler Feb 28, 2024
8f67a6b
Fix for scoping
EricLBuehler Feb 28, 2024
a4823fe
Set active adapter to not 'default'
EricLBuehler Feb 28, 2024
1bafcd8
Fix recursion err
EricLBuehler Feb 28, 2024
b5f5563
Index into config to check
EricLBuehler Feb 29, 2024
e6b45c3
Get and pass scalings
EricLBuehler Mar 2, 2024
b8ba669
Remove deprecated attr
EricLBuehler Mar 5, 2024
31b121c
Remove the xloralayer for a hook, refactoring changes
EricLBuehler Mar 6, 2024
df6867d
Add copyright notice
EricLBuehler Mar 6, 2024
9025310
Remove some typing things
EricLBuehler Mar 6, 2024
506e96e
Remove some typing things
EricLBuehler Mar 6, 2024
07ff71f
Add note to docstring
EricLBuehler Mar 6, 2024
a111787
Add to config
EricLBuehler Mar 9, 2024
15b7bec
Update for new saving
EricLBuehler Mar 9, 2024
0ef9181
Fix topk impl
EricLBuehler Mar 18, 2024
3804d6a
Update based on comments
EricLBuehler Mar 29, 2024
ea3ea8f
Add the xlora layer structure back in
EricLBuehler Mar 29, 2024
1ce8391
Add the xlora layer structure back in
EricLBuehler Mar 29, 2024
b48d4af
Make some style changes
EricLBuehler Apr 5, 2024
4d48c59
Fix some bugs
EricLBuehler Apr 5, 2024
93069bb
Remove base model id as unnecessary
EricLBuehler Apr 5, 2024
3ee9350
Format with ruff
EricLBuehler Apr 5, 2024
7c1bcfa
Add xlora test
EricLBuehler Apr 5, 2024
e627525
Handle lack of target modules in lora method
EricLBuehler Apr 5, 2024
aa189c6
Handle lack of layer_replication in lora method
EricLBuehler Apr 5, 2024
2df1d5f
Handle lack of target_modules in check target modules exists
EricLBuehler Apr 5, 2024
b14c335
Handle lack of target_modules in inject_adapter
EricLBuehler Apr 5, 2024
8bec326
Fix the recursion err
EricLBuehler Apr 5, 2024
c79b37c
Formatting
EricLBuehler Apr 5, 2024
e453747
Do not include device in config
EricLBuehler Apr 5, 2024
10161b3
Cleaner method to eliminate adapters
EricLBuehler Apr 5, 2024
f223168
Cleaner method to eliminate adapters
EricLBuehler Apr 5, 2024
31078a6
Remove unnecessary methods
EricLBuehler Apr 5, 2024
d52f6ce
Error on multiple xlora adapters
EricLBuehler Apr 5, 2024
7e3c0d0
Error on use of dora
EricLBuehler Apr 5, 2024
1edeb96
Raise errors and other misc fixes
EricLBuehler Apr 9, 2024
bb5e546
Depend on use trainable adapters
EricLBuehler Apr 9, 2024
a4479b6
Implement another test and restructure the model
EricLBuehler Apr 10, 2024
81b069b
Inherit
EricLBuehler Apr 10, 2024
33c2623
Impl some abstract methods and remove the special cases
EricLBuehler Apr 10, 2024
63f3418
Fix return
EricLBuehler Apr 10, 2024
31863fa
Make some progress
EricLBuehler Apr 10, 2024
34f7594
Make some progress
EricLBuehler Apr 10, 2024
a234676
Merge branch 'main' into add_xlora
EricLBuehler Apr 10, 2024
7468114
Working version and add some tests
EricLBuehler Apr 10, 2024
e4053bc
Explain need for refreezing
EricLBuehler Apr 10, 2024
cc0cfee
Fix some ruff lints
EricLBuehler Apr 11, 2024
9debbe3
Get adapter names and handle gracefully with subfolders
EricLBuehler Apr 11, 2024
a283ea8
Fix unused
EricLBuehler Apr 11, 2024
aca6044
Rewrite the flush_log_scalings function with new functionality
EricLBuehler Apr 12, 2024
d10f4ee
Use some fixtures
EricLBuehler Apr 12, 2024
677c64e
Fix passing arg
EricLBuehler Apr 12, 2024
93db8bc
Remove unnecessary checks
EricLBuehler Apr 12, 2024
99fff5c
Fix test
EricLBuehler Apr 12, 2024
42db87c
Fix a bug
EricLBuehler Apr 12, 2024
6787b9f
More sensible defaults
EricLBuehler Apr 12, 2024
6fe86d1
Cast
EricLBuehler Apr 12, 2024
f948883
Refactor forward pass
EricLBuehler Apr 12, 2024
83ecd37
Finish refactoring of model forward
EricLBuehler Apr 12, 2024
ba4093a
Remove some dead code
EricLBuehler Apr 12, 2024
e19cbc6
Run formatting
EricLBuehler Apr 12, 2024
cf54d6a
Add test for saving and loading
EricLBuehler Apr 12, 2024
750e7ac
Fix determining adapter device for embedding
EricLBuehler Apr 15, 2024
a7ac01c
Fix the save and load
EricLBuehler Apr 17, 2024
7fd76ca
Fix the save and load
EricLBuehler Apr 17, 2024
dc8bd3b
Comment
EricLBuehler Apr 17, 2024
b888a69
Merge remote-tracking branch 'upstream/main' into add_xlora
EricLBuehler Apr 17, 2024
c379f70
Update the docstrings
EricLBuehler Apr 17, 2024
eaf5052
Remove the hacking
EricLBuehler Apr 17, 2024
76070d6
Merge
EricLBuehler Apr 22, 2024
a510560
Remove the post_init_lora
EricLBuehler Apr 22, 2024
c9339a6
Fix
EricLBuehler Apr 22, 2024
a6d92b0
More tests
EricLBuehler Apr 22, 2024
261b336
Remove custom load and save code
EricLBuehler Apr 22, 2024
763bf7f
Merge remote-tracking branch 'upstream/main' into add_xlora
EricLBuehler May 21, 2024
3c07d18
Improve tests
EricLBuehler May 21, 2024
44a3bb9
Remove InhibitorFlagPayload
EricLBuehler May 21, 2024
b9c80ef
Improve the docstrings
EricLBuehler May 21, 2024
72f9610
Fix the docstrings and clean up a bit
EricLBuehler May 21, 2024
f4b2df9
Remove some redundant config options
EricLBuehler May 21, 2024
a8fbf85
Remove redundant case
EricLBuehler May 21, 2024
3fb0cd0
Better docstrings
EricLBuehler May 21, 2024
5e62d66
Remove *et_use_trainable_adapters
EricLBuehler May 21, 2024
8faaa0f
Remove method and clean up checks
EricLBuehler May 21, 2024
6b54244
Minor fixes
EricLBuehler May 22, 2024
0d7f9d3
Remove custom loading and saving code
EricLBuehler May 22, 2024
8c58e1c
Update tests
EricLBuehler May 23, 2024
430f9e4
Remove monkey patching for scalings passing
EricLBuehler May 23, 2024
53f0342
Add tests for disabling adapters
EricLBuehler May 23, 2024
ee19910
Add test for embedding model
EricLBuehler May 23, 2024
1735deb
Use peft forward hook
EricLBuehler May 29, 2024
d661266
Fix disable adapter test
EricLBuehler May 29, 2024
8601832
Remove unnecessary var
EricLBuehler May 29, 2024
a3b83c1
Add futures annotations
EricLBuehler May 29, 2024
e3db8df
Merge remote-tracking branch 'upstream/main' into add_xlora
EricLBuehler May 29, 2024
b0f3062
Fix changes to lora prepare adapter config
EricLBuehler May 29, 2024
81e337d
Fix tests
EricLBuehler May 30, 2024
7ddd14b
Fix hacks
EricLBuehler May 30, 2024
7ea1a12
Use infer_device
EricLBuehler May 30, 2024
7058c15
Fix the tests and provide fake default values
EricLBuehler May 31, 2024
587c78a
Fix a mistake
EricLBuehler May 31, 2024
2b1193a
Merge remote-tracking branch 'upstream/main' into add_xlora
EricLBuehler Jun 24, 2024
6a809cb
Fix .weight
EricLBuehler Jun 24, 2024
164af91
Format
EricLBuehler Jun 25, 2024
096927a
Merge remote-tracking branch 'upstream/main' into add_xlora
EricLBuehler Jun 29, 2024
fadc5d9
Embedding does not support DoRA
EricLBuehler Jun 29, 2024
013a084
Remove xlora for embedding layers
EricLBuehler Jun 29, 2024
14efafa
Change order of scalings application
EricLBuehler Jun 29, 2024
624d316
Add tests
EricLBuehler Jun 29, 2024
1505f31
Handle case when inserting
EricLBuehler Jun 29, 2024
0b2dcd0
Check exact type
EricLBuehler Jun 29, 2024
abdcb50
Fix target modules
EricLBuehler Jun 29, 2024
aecd492
Somehow it didn't get formatted
EricLBuehler Jul 1, 2024
68e5f2f
Make tmp dir and tokenizer function scoped
EricLBuehler Jul 1, 2024
1628227
Scope the lora adapters too
EricLBuehler Jul 1, 2024
5a889be
Use unique temp dirs for lora adapters, all tests
EricLBuehler Jul 2, 2024
072987e
Separation of concerns for xlora and lora
EricLBuehler Jul 3, 2024
13edc61
Merge remote-tracking branch 'upstream/main' into add_xlora
EricLBuehler Jul 3, 2024
77fb6b0
Prevent inf recursion as per 1892
EricLBuehler Jul 3, 2024
2f85f91
Prevent inf recursion on lora_model
EricLBuehler Jul 4, 2024
2 changes: 2 additions & 0 deletions src/peft/__init__.py
@@ -82,6 +82,8 @@
LNTuningModel,
VeraConfig,
VeraModel,
XLoraConfig,
XLoraModel,
)
from .utils import (
TRANSFORMERS_MODELS_TO_PREFIX_TUNING_POSTPROCESS_MAPPING,
5 changes: 5 additions & 0 deletions src/peft/mapping.py
@@ -19,6 +19,8 @@

import torch

from peft.tuners.xlora.model import XLoraModel

from .config import PeftConfig
from .mixed_model import PeftMixedModel
from .peft_model import (
@@ -56,6 +58,7 @@
PromptTuningConfig,
VeraConfig,
VeraModel,
XLoraConfig,
)
from .tuners.tuners_utils import BaseTuner as _BaseTuner
from .utils import _prepare_prompt_learning_config
@@ -90,6 +93,7 @@
"POLY": PolyConfig,
"LN_TUNING": LNTuningConfig,
"VERA": VeraConfig,
"XLORA": XLoraConfig,
}

PEFT_TYPE_TO_TUNER_MAPPING: dict[str, type[_BaseTuner]] = {
@@ -103,6 +107,7 @@
"POLY": PolyModel,
"LN_TUNING": LNTuningModel,
"VERA": VeraModel,
"XLORA": XLoraModel,
}


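The two mapping entries above are what let the rest of PEFT resolve the new `"XLORA"` peft type to its config and tuner classes. A minimal sketch of how these registries are consumed (the mapping names come from this diff; everything else is illustrative):

```python
from peft.mapping import PEFT_TYPE_TO_CONFIG_MAPPING, PEFT_TYPE_TO_TUNER_MAPPING

# Resolve the newly registered peft type string to its classes.
config_cls = PEFT_TYPE_TO_CONFIG_MAPPING["XLORA"]  # -> XLoraConfig
tuner_cls = PEFT_TYPE_TO_TUNER_MAPPING["XLORA"]    # -> XLoraModel

print(config_cls.__name__, tuner_cls.__name__)
```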
3 changes: 1 addition & 2 deletions src/peft/mixed_model.py
@@ -23,8 +23,6 @@
from torch import nn
from transformers.utils import PushToHubMixin

from peft.tuners.mixed import COMPATIBLE_TUNER_TYPES

from .config import PeftConfig
from .peft_model import PeftModel
from .tuners import (
@@ -36,6 +34,7 @@
MixedModel,
OFTModel,
)
from .tuners.mixed import COMPATIBLE_TUNER_TYPES
from .utils import PeftType, _set_adapter, _set_trainable


30 changes: 29 additions & 1 deletion src/peft/peft_model.py
@@ -29,7 +29,7 @@
from accelerate import dispatch_model, infer_auto_device_map
from accelerate.hooks import AlignDevicesHook, add_hook_to_module, remove_hook_from_submodules
from accelerate.utils import get_balanced_memory, named_module_tensors
from huggingface_hub import ModelCard, ModelCardData, hf_hub_download
from huggingface_hub import HfFileSystem, ModelCard, ModelCardData, hf_hub_download
from safetensors import safe_open
from safetensors.torch import save_file as safe_save_file
from torch.nn import BCEWithLogitsLoss, CrossEntropyLoss, MSELoss
@@ -55,6 +55,8 @@
PromptEmbedding,
PromptEncoder,
VeraModel,
XLoraConfig,
XLoraModel,
)
from .tuners.tuners_utils import BaseTuner, BaseTunerLayer
from .utils import (
@@ -91,6 +93,7 @@
PeftType.POLY: PolyModel,
PeftType.LN_TUNING: LNTuningModel,
PeftType.VERA: VeraModel,
PeftType.XLORA: XLoraModel,
}


@@ -479,13 +482,38 @@ def from_pretrained(
raise ValueError("Cannot set a prompt learning adapter to trainable when loading pretrained adapter.")
else:
config.inference_mode = not is_trainable
if isinstance(getattr(model, "base_model", None), XLoraModel):
if not isinstance(config, XLoraConfig):
raise TypeError(f"Expected 'XLoraConfig', got '{type(config)}' instead.")
if "adapters" in kwargs:
config.adapters = kwargs["adapters"]
else:
# If the path is on HF hub, then we get the adapter names to create a subfolders list which tells
# `load_adapter` where the adapters are.
if not os.path.exists(model_id):
s = HfFileSystem()

# The names of the adapters which must be in folders
adapter_names = [
file["name"][len(model_id) + 1 :] for file in s.ls(model_id) if file["type"] == "directory"
]
# Prepare a dict of adapter paths, which really just point to the hf id; we will use the subfolders
adapter_paths = {}
for adapter_name in adapter_names:
adapter_paths[adapter_name] = os.path.join(model_id, model_id)
config.adapters = adapter_paths
config._subfolders = adapter_names
else:
if "adapters" not in kwargs:
raise ValueError("If model_id is a local path, then `adapters` must be passed in kwargs.")

if config.task_type not in MODEL_TYPE_TO_PEFT_MODEL_MAPPING.keys():
model = cls(model, config, adapter_name, autocast_adapter_dtype=autocast_adapter_dtype)
else:
model = MODEL_TYPE_TO_PEFT_MODEL_MAPPING[config.task_type](
model, config, adapter_name, autocast_adapter_dtype=autocast_adapter_dtype
)

model.load_adapter(
model_id, adapter_name, is_trainable=is_trainable, autocast_adapter_dtype=autocast_adapter_dtype, **kwargs
)
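For context, the net effect of this `from_pretrained` branch is that an X-LoRA checkpoint can be reloaded with the usual PEFT entry point: for a Hub repo the adapter subfolders are listed via `HfFileSystem`, and the adapter locations can also be overridden by passing an `adapters` mapping through kwargs. A rough usage sketch (model name and paths are placeholders, not part of this PR):

```python
from transformers import AutoModelForCausalLM
from peft import PeftModel

# Placeholder base model; any causal LM works in principle.
base = AutoModelForCausalLM.from_pretrained("facebook/opt-125m")

# Reload a checkpoint produced by save_pretrained() on an X-LoRA model.
# Per the hunk above, an explicit adapters={name: path} mapping can also be
# supplied through kwargs to override where the individual LoRAs are loaded from.
model = PeftModel.from_pretrained(base, "./xlora_checkpoint")
```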
1 change: 1 addition & 0 deletions src/peft/tuners/__init__.py
@@ -33,3 +33,4 @@
from .poly import PolyConfig, PolyModel
from .ln_tuning import LNTuningConfig, LNTuningModel
from .vera import VeraConfig, VeraModel
from .xlora import XLoraConfig, XLoraModel
22 changes: 19 additions & 3 deletions src/peft/tuners/tuners_utils.py
@@ -30,6 +30,8 @@
from transformers.pytorch_utils import Conv1D

from peft.utils import INCLUDE_LINEAR_LAYERS_SHORTHAND
from peft.utils.constants import DUMMY_TARGET_MODULES
from peft.utils.peft_types import PeftType

from ..config import PeftConfig
from ..utils import ModulesToSaveWrapper, _get_submodules
@@ -141,7 +143,12 @@ class BaseTuner(nn.Module, ABC):
double-check that the `config.target_modules` were specified correctly.
"""

def __init__(self, model, peft_config: Union[PeftConfig, dict[str, PeftConfig]], adapter_name: str) -> None:
def __init__(
self,
model,
peft_config: Union[PeftConfig, dict[str, PeftConfig]],
adapter_name: str,
) -> None:
super().__init__()

self.model = model
@@ -164,7 +171,8 @@ def __init__(self, model, peft_config: Union[PeftConfig, dict[str, PeftConfig]],

self.active_adapter: str | list[str] = adapter_name
self._pre_injection_hook(self.model, self.peft_config[adapter_name], adapter_name)
self.inject_adapter(self.model, adapter_name)
if peft_config != PeftType.XLORA or peft_config[adapter_name] != PeftType.XLORA:
self.inject_adapter(self.model, adapter_name)

# Copy the peft_config in the injected model.
self.model.peft_config = self.peft_config
@@ -389,6 +397,11 @@ def inject_adapter(self, model: nn.Module, adapter_name: str, autocast_adapter_d
is_target_modules_in_base_model = False
key_list = [key for key, _ in model.named_modules()]

if getattr(peft_config, "target_modules", None) == DUMMY_TARGET_MODULES:
# dummy adapter, we allow not matching any module
key_list = []
is_target_modules_in_base_model = True

# update peft_config.target_modules if required
peft_config = _maybe_include_all_linear_layers(peft_config, model)

@@ -417,7 +430,8 @@ def inject_adapter(self, model: nn.Module, adapter_name: str, autocast_adapter_d
parent, target, target_name = _get_submodules(model, key)
self._create_and_replace(peft_config, adapter_name, target, target_name, parent, current_key=key)

if not is_target_modules_in_base_model:
# Handle X-LoRA case.
if not is_target_modules_in_base_model and hasattr(peft_config, "target_modules"):
raise ValueError(
f"Target modules {peft_config.target_modules} not found in the base model. "
f"Please check the target modules and try again."
@@ -776,6 +790,8 @@ def _maybe_include_all_linear_layers(peft_config: PeftConfig, model: nn.Module)
Helper function to update `target_modules` to all linear/Conv1D layers if provided as 'all-linear'. Adapted from
the QLoRA repository: https://github.com/artidoro/qlora/blob/main/qlora.py
"""
if not hasattr(peft_config, "target_modules"):
return peft_config

# if `target_modules` is a string, convert to lower case and check if it matches "all-linear"
if not (
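The `DUMMY_TARGET_MODULES` escape hatch exists because an X-LoRA adapter steers already-injected LoRA layers rather than matching target modules of its own, so injection must be allowed to match nothing without raising. A small sketch of the behavior this enables (config and model are illustrative; in practice this path is exercised internally by X-LoRA rather than by users):

```python
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model
from peft.utils.constants import DUMMY_TARGET_MODULES

base = AutoModelForCausalLM.from_pretrained("facebook/opt-125m")  # placeholder

# With the sentinel, inject_adapter skips module matching entirely instead of
# raising "Target modules ... not found in the base model".
config = LoraConfig(target_modules=DUMMY_TARGET_MODULES)
model = get_peft_model(base, config)  # no LoRA layers are actually inserted
```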
19 changes: 19 additions & 0 deletions src/peft/tuners/xlora/__init__.py
@@ -0,0 +1,19 @@
# Copyright 2023-present the HuggingFace Inc. team.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

from .config import XLoraConfig
from .model import XLoraModel


__all__ = ["XLoraConfig", "XLoraModel"]
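With these exports in place, converting a base model into an X-LoRA model follows the usual `get_peft_model` flow. A minimal sketch, assuming two pre-trained LoRA adapters on disk (model name, paths, and hyperparameter values are placeholders):

```python
from transformers import AutoModelForCausalLM
from peft import XLoraConfig, get_peft_model

base = AutoModelForCausalLM.from_pretrained("facebook/opt-125m")  # placeholder

config = XLoraConfig(
    task_type="CAUSAL_LM",
    hidden_size=base.config.hidden_size,   # used by the scaling classifier
    xlora_depth=4,                         # depth of the classifier, illustrative
    adapters={
        "adapter_1": "./loras/adapter_1",  # placeholder LoRA checkpoints
        "adapter_2": "./loras/adapter_2",
    },
)
model = get_peft_model(base, config)
model.print_trainable_parameters()
```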