What is the intended way of doing EMA with PEFT? #1557

Answered by samedii
samedii asked this question in Q&A

I ended up using a wrapper that keeps track of all the trained parameters.

from torch import nn


class TrainablesContainer(nn.ModuleDict):
    @classmethod
    def from_module(cls, module: nn.Module, parent_name=""):
        module_dict = cls()
        for name, sub_module in module.named_children():
            full_name = f"{parent_name}.{name}" if parent_name else name
            if list(sub_module.children()):  # If the submodule has children, recurse
                module_dict[name] = cls.from_module(sub_module, full_name)
            else:
                # Create a ParameterDict for the leaf module, keeping only
                # the parameters that are actually trained
                param_dict = nn.ParameterDict()
                for param_name, param in sub_module.named_parameters(recurse=False):
                    if param.requires_grad:
                        param_dict[param_name] = param
                if param_dict:
                    module_dict[name] = param_dict
        return module_dict
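
As a minimal sketch of how tracking only the trained parameters feeds into EMA (this is the standard shadow-copy scheme, not code from this thread; the toy model and `ema_update` helper are illustrative assumptions): keep a frozen deep copy of the model and, after each optimizer step, lerp only the `requires_grad` parameters toward the live weights, so the frozen base-model weights never need averaging.

```python
import copy

import torch
from torch import nn

# Hypothetical toy model standing in for a PEFT setup: the first layer is
# frozen (like the base model), only the second layer is trained.
model = nn.Sequential(nn.Linear(4, 4), nn.Linear(4, 2))
for p in model[0].parameters():
    p.requires_grad = False

# EMA shadow copy: holds the averaged weights, never trained directly.
ema_model = copy.deepcopy(model)
for p in ema_model.parameters():
    p.requires_grad = False

decay = 0.999


@torch.no_grad()
def ema_update(model, ema_model, decay):
    # Average only the parameters that are actually trained; frozen
    # weights stay identical in both copies, so skipping them is safe.
    for p, ema_p in zip(model.parameters(), ema_model.parameters()):
        if p.requires_grad:
            ema_p.lerp_(p, 1.0 - decay)


# Simulate one training step that shifts the trained layer, then update.
with torch.no_grad():
    model[1].weight.add_(1.0)
ema_update(model, ema_model, decay)
```

Calling `ema_update` after every optimizer step keeps `ema_model` a smoothed copy of the trained weights while the frozen layers remain bit-identical, which is the same saving that motivates tracking trainables in a container.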

Answer selected by samedii