How to try my own fine-tuning experiments #292
minghehe-nobug changed the title from "[FEATURE REQUEST] YOUR DESCRIPTION HERE" to "How to try my own fine-tuning experiments" on Mar 5, 2024.
I want to fine-tune the model, too. Can I ask how much GPU memory is needed? I have access to 4x A5000 GPUs; are they enough to complete the training?
Hello!
I'm new to multimodal training.
Inspired by this exciting project, I hope to try my own fine-tuning experiments on interleaved data.
Currently, I have downloaded the pre-trained model (3B) and completed the inference process, but could anyone help me write the parameters for the "torchrun" training script?
The main challenge for me is how to point the two parameters "--laion_shards" and "--mmc4_shards" at my own data.
Also, how can I modify the original code to train without "LAION-2B" (that dataset is too large for me)?
Thanks!
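Not an official answer, but a sketch that may help: `--laion_shards` and `--mmc4_shards` typically take a quoted webdataset-style brace pattern (e.g. `--mmc4_shards "/data/my_mmc4/{000000..000063}.tar"`), so using your own data usually means repackaging it as `.tar` shards and pointing those flags at them. The small helper below (function name and paths are hypothetical, for illustration only) shows how such a brace pattern expands into the individual shard files the data loader will iterate over:

```python
import re

def expand_shards(pattern):
    """Expand a webdataset-style brace range, e.g.
    "/data/mmc4/{000000..000002}.tar", into explicit shard paths."""
    m = re.search(r"\{(\d+)\.\.(\d+)\}", pattern)
    if not m:
        # No brace range: the pattern is already a single shard path.
        return [pattern]
    lo, hi = m.group(1), m.group(2)
    width = len(lo)  # preserve zero-padding width, e.g. 000000 -> 6 digits
    return [
        pattern[: m.start()] + str(i).zfill(width) + pattern[m.end():]
        for i in range(int(lo), int(hi) + 1)
    ]

print(expand_shards("/data/mmc4/{000000..000002}.tar"))
# -> ['/data/mmc4/000000.tar', '/data/mmc4/000001.tar', '/data/mmc4/000002.tar']
```

So if you shard your own interleaved data into, say, 64 tar files, you would pass the matching range (`{000000..000063}`) to the flag; the number of shards just has to match the files you actually produced.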