Has anyone noticed that the new adapters library converges more slowly and has lower samples per second compared to adapter-transformers? #728
14H034160212 started this conversation in General
Replies: 1 comment
-
Hey, thanks for bringing this up, we'll be investigating this! Do you have any experiments/results you would be able to share?
-
Hi,
Just opening a discussion: we ran some experiments after moving from adapter-transformers to adapters and found that both training and inference are slower than with adapter-transformers. Has anyone observed something similar? Does anyone have suggestions, for example upgrading the Python version or something else?
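To make the comparison concrete, here is a minimal, library-agnostic timing sketch one could use to measure samples per second in both setups; `measure_throughput`, the dummy step function, and all parameter values are hypothetical stand-ins, not part of either library's API.

```python
import time

def measure_throughput(step_fn, batch_size, num_steps=50, warmup=5):
    """Call step_fn repeatedly and return throughput in samples per second.

    step_fn is assumed to process one batch of size batch_size per call
    (e.g. one training or inference step). Warm-up calls are excluded
    from the timed region to avoid counting one-time setup costs.
    """
    for _ in range(warmup):
        step_fn()
    start = time.perf_counter()
    for _ in range(num_steps):
        step_fn()
    elapsed = time.perf_counter() - start
    return num_steps * batch_size / elapsed

if __name__ == "__main__":
    # Dummy workload standing in for a real model step.
    dummy_step = lambda: sum(i * i for i in range(10_000))
    print(f"{measure_throughput(dummy_step, batch_size=32):.1f} samples/s")
```

Running the same harness against the old and new library (with identical model, batch size, and hardware) would give directly comparable numbers to attach to this thread.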