
feat(pytorch): Allow TransformerLayer and MultiheadAttention to accept sequence length parameters #4560

Triggered via pull request: August 16, 2024, 20:06
Status: Success
Total duration: 11s

Workflow: license.yml (on: pull_request)

Annotations: 1 warning

Check: The following actions use a deprecated Node.js version and will be forced to run on node20: actions/checkout@v3. For more info: https://github.blog/changelog/2024-03-07-github-actions-all-actions-will-run-on-node20-instead-of-node16-by-default/
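This warning is typically resolved by bumping actions/checkout from v3 (Node 16) to v4 (Node 20) in the workflow file. The actual contents of license.yml are not shown in this run summary, so the sketch below assumes a typical single-job license check; the job name and steps are illustrative, not the repository's real configuration.

```yaml
# Hypothetical sketch of license.yml with the deprecation warning addressed.
# Only the checkout version bump (v3 -> v4) is the actual fix; everything else
# is an assumed placeholder structure.
name: license

on: pull_request

jobs:
  check:
    runs-on: ubuntu-latest
    steps:
      # actions/checkout@v3 runs on Node 16 and triggers the node20 warning;
      # v4 runs natively on Node 20.
      - uses: actions/checkout@v4
      # ... the repository's actual license-check steps would follow here
```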