feat(pytorch): Allow TransformerLayer and MultiheadAttention to accept sequence length parameters #4560
Annotations: 1 warning. This job succeeded.