FEAT: Support quantization for VeRA using bitsandbytes (#2070) #2076

Status: Open. Wants to merge 26 commits into base: main. The diff below reflects changes from 11 commits.

Commits (26):
- 322382c: Add 8-bit quantization support to VeraConfig (ZiadHelal, Sep 18, 2024)
- e272609: Update imports and __all__ for 8-bit quantization support in VeRA (ZiadHelal, Sep 18, 2024)
- 0fe4a85: Implement 8-bit quantization support for VeRA using bitsandbytes (ZiadHelal, Sep 18, 2024)
- ec2d228: Integrate 8-bit quantization support in VeraModel (ZiadHelal, Sep 18, 2024)
- c9e2053: Merge branch 'huggingface:main' into feature/vera-quantization-support (ZiadHelal, Sep 18, 2024)
- 6b2cec5: Update imports 4-bit quantization support in VeRA (ZiadHelal, Sep 22, 2024)
- d222da9: Implement 4-bit quantization support for VeRA using bitsandbytes (sti… (ZiadHelal, Sep 22, 2024)
- f3f6026: Remove un-needed configurations (ZiadHelal, Sep 22, 2024)
- f558f48: Integrate 4-bit & 8-bit quantization support in VeraModel (ZiadHelal, Sep 22, 2024)
- 84c03de: Merge branch 'huggingface:main' into feature/vera-quantization-support (ZiadHelal, Sep 22, 2024)
- 84a3418: Add unit tests for VeRA quantization with 4-bit and 8-bit configurations (ZiadHelal, Sep 22, 2024)
- fd40773: Merge branch 'huggingface:main' into feature/vera-quantization-support (ZiadHelal, Sep 23, 2024)
- eed91d2: Refactor _find_dim to explicitly handle input and output dimensions f… (ZiadHelal, Sep 23, 2024)
- 90ad93a: Revert 4-bit quantization forward method to original code (ZiadHelal, Sep 23, 2024)
- 13e8d8a: Merge branch 'huggingface:main' into feature/vera-quantization-support (ZiadHelal, Sep 25, 2024)
- 762485a: Fix style to make the linter happy (ZiadHelal, Sep 25, 2024)
- c87e4b8: Added more bnb test examples for VeRA (ZiadHelal, Sep 25, 2024)
- 5ab1f1d: Merge branch 'huggingface:main' into feature/vera-quantization-support (ZiadHelal, Sep 25, 2024)
- 5beff4f: Fixing style using `make style` (ZiadHelal, Sep 25, 2024)
- a88d6c2: Fixing linear layers dimensions (ZiadHelal, Sep 26, 2024)
- 14d78ed: Merge branch 'huggingface:main' into feature/vera-quantization-support (ZiadHelal, Sep 26, 2024)
- e584ebd: Merge branch 'huggingface:main' into feature/vera-quantization-support (ZiadHelal, Sep 27, 2024)
- 62a91bd: Merge branch 'huggingface:main' into feature/vera-quantization-support (ZiadHelal, Oct 3, 2024)
- 0752ff1: Updating the date (ZiadHelal, Oct 3, 2024)
- 3b2c017: updating docs to reflect new changes (Oct 3, 2024)
- 640874f: Merge branch 'huggingface:main' into feature/vera-quantization-support (ZiadHelal, Oct 4, 2024)
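For context on what these commits quantize: VeRA keeps a single pair of frozen, randomly initialized matrices A and B shared across all adapted layers and trains only two small per-layer scaling vectors, so the bitsandbytes-quantized base weights stay frozen while very few parameters remain trainable in full precision. Below is a minimal pure-Python sketch of the adapter path, ΔW·x = Λ_b B Λ_d A x; shapes and names are toy-sized illustrations, not PEFT's actual implementation:

```python
import random

# Frozen shared random matrices: A is (r x d_in), B is (d_out x r).
# Per-layer trainable parts are only the vectors lambda_d (length r)
# and lambda_b (length d_out). Toy sizes for illustration.
random.seed(0)
d_in, d_out, r = 4, 3, 2

A = [[random.gauss(0, 1) for _ in range(d_in)] for _ in range(r)]   # frozen
B = [[random.gauss(0, 1) for _ in range(r)] for _ in range(d_out)]  # frozen
lambda_d = [0.1] * r      # trainable, small constant init
lambda_b = [0.0] * d_out  # trainable, zero init => adapter starts as a no-op

def vera_delta(x):
    # Lambda_d * (A @ x): project input through the frozen A, scale per row.
    ax = [lambda_d[i] * sum(A[i][j] * x[j] for j in range(d_in)) for i in range(r)]
    # Lambda_b * (B @ ax): project back up through the frozen B, scale per row.
    return [lambda_b[k] * sum(B[k][i] * ax[i] for i in range(r)) for k in range(d_out)]

print(vera_delta([1.0, 2.0, 3.0, 4.0]))  # zero update at init, since lambda_b is all zeros
```

Because A and B never receive gradient updates, they can sit on top of a quantized base layer; only the two small vectors need optimizer state.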
src/peft/tuners/vera/__init__.py (16 additions, 0 deletions)
@@ -12,9 +12,25 @@

```python
# See the License for the specific language governing permissions and
# limitations under the License.

from peft.import_utils import is_bnb_4bit_available, is_bnb_available

from .config import VeraConfig
from .layer import Linear, VeraLayer
from .model import VeraModel


__all__ = ["VeraConfig", "VeraLayer", "Linear", "VeraModel"]


def __getattr__(name):
    if (name == "Linear8bitLt") and is_bnb_available():
        from .bnb import Linear8bitLt

        return Linear8bitLt

    if (name == "Linear4bit") and is_bnb_4bit_available():
        from .bnb import Linear4bit

        return Linear4bit

    raise AttributeError(f"module {__name__} has no attribute {name}")
```
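The new module-level `__getattr__` hook relies on PEP 562: `bnb.py`, and therefore bitsandbytes, is imported only when `Linear8bitLt` or `Linear4bit` is actually requested, and only if the corresponding dependency check passes. A self-contained sketch of the same pattern, using illustrative names rather than PEFT's module layout:

```python
import sys
import types

# Stand-in for peft.import_utils.is_bnb_available(); assumed True here.
def _bnb_available():
    return True

def _module_getattr(name):
    # Called by Python (PEP 562) only when normal attribute lookup fails,
    # so the "heavy import" below is deferred until first access.
    if name == "Linear8bitLt" and _bnb_available():
        # In the real code this would be `from .bnb import Linear8bitLt`.
        class Linear8bitLt:
            pass
        return Linear8bitLt
    raise AttributeError(f"module 'vera_demo' has no attribute {name}")

# Build a throwaway module to demonstrate; a package would define
# __getattr__ at the top level of its __init__.py instead.
demo = types.ModuleType("vera_demo")
demo.__getattr__ = _module_getattr
sys.modules["vera_demo"] = demo

import vera_demo
print(vera_demo.Linear8bitLt.__name__)  # resolved lazily on first access
```

This keeps `import peft` cheap for users without bitsandbytes installed, while still surfacing a normal `AttributeError` for unknown names.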