
update the library to the latest HF Transformers version #558

Closed
bhavitvyamalik opened this issue Jun 13, 2023 · 2 comments
Labels: enhancement (New feature or request)

Comments


bhavitvyamalik commented Jun 13, 2023

How can we update the library to the latest HF Transformers version?

bhavitvyamalik added the enhancement label on Jun 13, 2023
lenglaender (Member) commented Jun 19, 2023

Hey @bhavitvyamalik, to update the library to the latest HF transformers version, you need to merge the newest release branch of HF into it.
To do this (a consolidated shell sketch follows the list):

  1. Add HF transformers as a remote repository via git remote add huggingface https://github.com/huggingface/transformers.git
  2. Fetch it: git fetch huggingface
  3. Check out a new sync branch: git checkout -b sync/v4.30.1
  4. Merge Hugging Face code into it: git merge v4.30.1
  5. Execute ./scripts/upstream-sync.sh v4.30.1 to resolve some merge conflicts
  6. Resolve remaining merge conflicts (this part could take a while)
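
Put together, the sequence is roughly the following shell session (v4.30.1 is just the example release used above; substitute whichever HF tag you want to sync to):

# Add HF transformers as a remote and fetch its branches and tags
git remote add huggingface https://github.com/huggingface/transformers.git
git fetch huggingface

# Create the sync branch and merge the target HF release into it
git checkout -b sync/v4.30.1
git merge v4.30.1

# Let the helper script resolve the bulk of the merge conflicts,
# then finish any remaining ones by hand
./scripts/upstream-sync.sh v4.30.1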

Currently, adapter-transformers is based on HF transformers v4.26.1, which has some known bugs. Maybe for your problems it would be enough to fix those instead of updating the HF version. To do this, copy the following patch into your local repository, e.g. as diff.patch:

diff --git a/src/transformers/modeling_utils.py b/src/transformers/modeling_utils.py
index 401730584..684cc00e3 100644
--- a/src/transformers/modeling_utils.py
+++ b/src/transformers/modeling_utils.py
@@ -2518,7 +2518,7 @@ class PreTrainedModel(nn.Module, ModuleUtilsMixin, GenerationMixin, PushToHubMix
                     _from_pipeline=from_pipeline,
                     **kwargs,
                 )
-            except OSError:
+            except (OSError, TypeError):
                 logger.info(
                     "Generation config file not found, using a generation config created from the model config."
                 )
@@ -2715,7 +2715,10 @@ class PreTrainedModel(nn.Module, ModuleUtilsMixin, GenerationMixin, PushToHubMix
                         del state_dict[checkpoint_key]
             return mismatched_keys
 
-        folder = os.path.sep.join(resolved_archive_file[0].split(os.path.sep)[:-1])
+        if resolved_archive_file is not None:
+            folder = os.path.sep.join(resolved_archive_file[0].split(os.path.sep)[:-1])
+        else:
+            folder = None
         if device_map is not None and is_safetensors:
             param_device_map = expand_device_map(device_map, original_loaded_keys)

and apply these changes via git apply diff.patch.
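
For example (a minimal sketch; git apply --check is an optional dry run that only verifies the patch would apply cleanly before modifying any files):

# Save the patch shown above as diff.patch in the repository root,
# check that it applies without conflicts, then apply it
git apply --check diff.patch
git apply diff.patch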

FYI: We are currently working on disentangling adapter-transformers from Hugging Face transformers, which will make updating the version much easier.

calpt (Member) commented Nov 19, 2023

Note that updating as above is not required anymore with the new Adapters codebase. See #584.

calpt closed this as completed Nov 19, 2023