
QuantMultiheadAttention: Use signed quantizer for attention weights? #772

Triggered via issue comment on February 2, 2024 at 17:33
@iksnagreb commented on #755 (commit 84f4225)
Status: Success
Total duration: 11s

Workflow: rebase.yml
Triggered on: issue_comment
Jobs:
Rebase (0s)
Always run job (0s)