
Add support for Flex Attention #9510

Triggered via pull request on January 7, 2025 10:29
Author: @ShashankMosaicML
Event: synchronize #1675
Status: Failure
Total duration: 18m 32s

Workflow: pr-gpu.yaml

on: pull_request_target
Matrix: pytest-gpu-1
Matrix: pytest-gpu-2
Matrix: pytest-gpu-4

Annotations

1 error
gpu-2.5.1-1: Process completed with exit code 1.
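For context on what "Flex Attention" refers to: the failing job name (gpu-2.5.1-1) suggests the run targets PyTorch 2.5.1, which ships the torch.nn.attention.flex_attention API. Below is a minimal, hedged sketch of that upstream API on CUDA; it is an illustration only, not code from this PR or its test suite, and the tensor shapes and the causal score_mod are assumptions.

```python
# Minimal sketch of PyTorch's FlexAttention API (torch >= 2.5, CUDA).
# Not taken from this PR; shapes and the causal mask are illustrative assumptions.
import torch
from torch.nn.attention.flex_attention import flex_attention

B, H, S, D = 2, 4, 128, 64  # batch, heads, sequence length, head dim
q = torch.randn(B, H, S, D, device="cuda", dtype=torch.float16)
k = torch.randn(B, H, S, D, device="cuda", dtype=torch.float16)
v = torch.randn(B, H, S, D, device="cuda", dtype=torch.float16)

def causal_score_mod(score, b, h, q_idx, kv_idx):
    # Keep scores where the query position can attend to the key position;
    # mask out future positions with -inf before the softmax.
    return torch.where(q_idx >= kv_idx, score, -float("inf"))

out = flex_attention(q, k, v, score_mod=causal_score_mod)  # (B, H, S, D)
```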