
Fix check_is_flash_attention #22882

Merged: 2 commits merged into jax-ml:main on Aug 28, 2024
Conversation

wenscarl (Contributor) commented Aug 5, 2024

`layout` is a literal, so the comparison has to use the value of AttentionLayout.BNTH. @kaixih @Cjkkkk
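A minimal sketch of the bug class this comment describes, assuming the usual pattern: `layout` arrives as a plain int literal, while `AttentionLayout.BNTH` is an enum member, so comparing the two directly never matches. The names here are illustrative and not JAX's actual implementation.

```python
from enum import Enum

# Illustrative stand-in for the layout enum (values assumed).
class AttentionLayout(Enum):
    BTNH = 0
    BNTH = 1

def is_bnth_buggy(layout: int) -> bool:
    # Bug: `layout` is a plain int, AttentionLayout.BNTH is an Enum member,
    # so this equality check is always False.
    return layout == AttentionLayout.BNTH

def is_bnth_fixed(layout: int) -> bool:
    # Fix: compare against the member's underlying value instead.
    return layout == AttentionLayout.BNTH.value
```

For example, `is_bnth_buggy(1)` returns False even when 1 is the BNTH layout, while `is_bnth_fixed(1)` returns True. (Python's plain `Enum` members never compare equal to their raw values; `IntEnum` would, which is why this class of bug is easy to miss.)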

zhangqiaorjc (Collaborator) commented:

how about we add a simple test to catch this?
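The kind of simple regression test suggested above could look like the sketch below. Everything here is hypothetical (the enum, the helper, and the test name are illustrative, not taken from JAX's test suite); it only shows that the check passes for the BNTH layout value and fails for others.

```python
from enum import Enum

# Illustrative stand-in for the layout enum (values assumed).
class AttentionLayout(Enum):
    BTNH = 0
    BNTH = 1

def is_bnth(layout: int) -> bool:
    # The corrected check: compare the int literal to the enum's value.
    return layout == AttentionLayout.BNTH.value

def test_layout_literal_comparison():
    # Would have caught the original bug, where the check always
    # returned False regardless of the layout passed in.
    assert is_bnth(AttentionLayout.BNTH.value)
    assert not is_bnth(AttentionLayout.BTNH.value)

test_layout_literal_comparison()
```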

@superbobry superbobry requested review from zhangqiaorjc and removed request for superbobry August 8, 2024 19:16
@google-ml-butler google-ml-butler bot added kokoro:force-run pull ready Ready for copybara import and testing labels Aug 21, 2024
zhangqiaorjc (Collaborator) commented:

i forgot to click a button, attempting to merge again

@copybara-service copybara-service bot merged commit 2785a08 into jax-ml:main Aug 28, 2024
17 checks passed
Labels: pull ready (Ready for copybara import and testing)
3 participants