Upgrade Transformers to v4.33.3 #586
Conversation
- Add argument to `_resize_token_embeddings()`
- Add seq. classification head to T5
- Fix test config of Llama
Move used heads retrieval to new method.
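The first bullet can be illustrated with a standalone sketch. Note the assumptions: the `pad_to_multiple_of` parameter name mirrors the upstream Transformers `resize_token_embeddings()` API, and the plain list-of-rows matrix stands in for the real `nn.Embedding`, so this is an illustration of the idea, not the actual implementation in this PR.

```python
import math

def resize_token_embeddings(embeddings, new_num_tokens, pad_to_multiple_of=None):
    """Resize a toy embedding matrix (a list of row vectors).

    `pad_to_multiple_of` is the assumed extra argument: when given, the new
    vocabulary size is rounded up to the nearest multiple, which can improve
    tensor-core utilization on GPU.
    """
    if pad_to_multiple_of is not None:
        new_num_tokens = math.ceil(new_num_tokens / pad_to_multiple_of) * pad_to_multiple_of
    dim = len(embeddings[0])
    if new_num_tokens <= len(embeddings):
        # Shrinking: drop trailing rows.
        return embeddings[:new_num_tokens]
    # Growing: the real model would freshly initialize the new rows; zeros here.
    return embeddings + [[0.0] * dim for _ in range(new_num_tokens - len(embeddings))]

emb = [[float(i)] * 4 for i in range(10)]  # 10 tokens, hidden dim 4
resized = resize_token_embeddings(emb, 12, pad_to_multiple_of=8)
print(len(resized))  # 12 rounded up to a multiple of 8 -> 16
```

The existing rows are preserved unchanged; only the tail of the matrix grows or shrinks.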
Looks good to me!
Thanks for working on this!
One question: with which Transformers version does the pipeline run the tests?
- if isinstance(model, T5AdapterModel) or isinstance(model, BertGenerationAdapterModel):
+ if isinstance(model, BertGenerationAdapterModel):
A general question: Why do we differentiate between the models in these tests? And why can we exclude T5 now?
The models handled differently here don't support sequence classification heads, which these test cases use by default. T5 now supports sequence classification heads, so it no longer needs to be excluded.
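The logic behind that exclusion can be sketched with dummy classes; `head_type_for` and the `"causal_lm"` fallback are hypothetical names for illustration, not the actual test helpers in the repository.

```python
# Hypothetical stand-ins for the real adapter model classes; the actual
# test suite does the isinstance() check against the library's classes.
class T5AdapterModel:
    pass  # after this PR, T5 gains a sequence classification head

class BertGenerationAdapterModel:
    pass  # still has no sequence classification head

def head_type_for(model):
    """Pick the prediction head a test should attach.

    The default for these test cases is a sequence classification head;
    models that don't support one fall back to a different head type.
    """
    if isinstance(model, BertGenerationAdapterModel):
        return "causal_lm"  # assumed fallback head for these tests
    return "seq_classification"

print(head_type_for(T5AdapterModel()))              # seq_classification
print(head_type_for(BertGenerationAdapterModel()))  # causal_lm
```

Before the PR, `T5AdapterModel` would have sat in the same `isinstance` branch as `BertGenerationAdapterModel`; adding the T5 classification head is what lets it move to the default path.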
Upgrade notes:
- `_resize_token_embeddings()`