
fixes #131, module 'eole.utils' has no attribute 'distributed' error when training multi-gpu #132

Merged: 7 commits into eole-nlp:main on Oct 25, 2024

Conversation

isanvicente
Contributor

fixes #131

@francoishernandez
Contributor

Might be cleaner to import only the needed function(s) (like the other eole.utils import below it), thanks! (And it will also make flake happy.)
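For illustration, a minimal sketch of the suggested change, assuming a hypothetical all_gather_list helper in eole.utils.distributed (the actual symbols touched in eole/trainer.py may differ):

```python
# Before: importing the package alone does not import its submodules, so the
# attribute lookup fails: module 'eole.utils' has no attribute 'distributed'.
import eole.utils
gathered = eole.utils.distributed.all_gather_list(local_stats)  # hypothetical call site

# After: import only the needed function, mirroring the other eole.utils
# imports in trainer.py; the call site then uses the bare name.
from eole.utils.distributed import all_gather_list
gathered = all_gather_list(local_stats)
```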

@isanvicente
Contributor Author

Should be done now; I'm not sure why flake is still complaining. The details say:

would reformat eole/trainer.py

Oh no! 💥 💔 💥
1 file would be reformatted, 150 files would be left unchanged.
Error: Process completed with exit code 1.

However, only a single line was added to the file...

@francoishernandez
Contributor

You just need to apply the black formatter (either on the full code base with `black .`, or here `black eole/trainer.py` would be sufficient).

@vince62s
Contributor

When importing the functions directly, you need to remove the full module path where they are used.
That is the reason why I don't really understand the initial error.
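For context, the behaviour behind the initial AttributeError is standard Python import semantics, shown here with a stdlib package rather than eole:

```python
# Importing a package binds only the package object; its submodules become
# attributes only once they are imported (explicitly or by the package's __init__).
import xml

try:
    xml.dom  # xml/__init__.py does not import dom
except AttributeError as e:
    print(e)  # module 'xml' has no attribute 'dom'

import xml.dom  # importing the submodule makes it reachable as an attribute
print(xml.dom)
```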

@francoishernandez francoishernandez merged commit d7959ba into eole-nlp:main Oct 25, 2024
2 checks passed
Development

Successfully merging this pull request may close these issues:

Error when training WMT with multi gpu
3 participants