[PyTorch] Merge k_channels and v_channels back to kv_channels #1094

Merged
5 commits merged into NVIDIA:main on Aug 13, 2024

Conversation

@cyanguwa (Collaborator) commented on Aug 10, 2024

Description

This PR merges k_channels and v_channels back into a single kv_channels argument in DotProductAttention.__init__() to avoid breaking backward compatibility, while still allowing users to pass a tuple of (int, int) for MLA (multi-latent attention) cases.

We will add support for MLA in MultiHeadAttention and TransformerLayer in future PRs.
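
For illustration, here is a minimal usage sketch of the merged argument; kv_channels is the parameter this PR changes, while num_attention_heads is assumed from the existing public DotProductAttention API.

```python
# Minimal usage sketch: kv_channels now accepts either an int (unchanged
# behavior) or a (k_channels, v_channels) tuple for MLA.
# num_attention_heads is assumed from the existing DotProductAttention API.
import transformer_engine.pytorch as te

# Backward-compatible path: a single int sets the head dim for both K and V.
attn = te.DotProductAttention(num_attention_heads=16, kv_channels=128)

# MLA (multi-latent attention) path: a tuple lets K and V use different head dims.
mla_attn = te.DotProductAttention(num_attention_heads=16, kv_channels=(192, 128))
```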

Type of change

  • Documentation change (change only to the documentation, either a fix or new content)
  • Bug fix (non-breaking change which fixes an issue)
  • New feature (non-breaking change which adds functionality)
  • Breaking change (fix or feature that would cause existing functionality to not work as expected)
  • Infra/Build change
  • Code refactor

Changes

Please list the changes introduced in this PR:

  • Merge k_channels and v_channels back to kv_channels in DotProductAttention (see the sketch below)
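
For reference, a minimal sketch of how an int-or-tuple kv_channels could be normalized inside __init__; the helper name split_kv_channels is hypothetical, and the actual code in this PR may differ.

```python
# Hypothetical helper sketching the int-or-tuple handling; not the PR's exact code.
from typing import Tuple, Union

def split_kv_channels(kv_channels: Union[int, Tuple[int, int]]) -> Tuple[int, int]:
    """Return (k_channels, v_channels) from either a single int or a 2-tuple."""
    if isinstance(kv_channels, tuple):
        k_channels, v_channels = kv_channels
    else:
        k_channels = v_channels = kv_channels
    return k_channels, v_channels
```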

Checklist:

  • I have read and followed the contributing guidelines
  • The functionality is complete
  • I have commented my code, particularly in hard-to-understand areas
  • I have made corresponding changes to the documentation
  • My changes generate no new warnings
  • I have added tests that prove my fix is effective or that my feature works
  • New and existing unit tests pass locally with my changes

@cyanguwa (Collaborator, Author) commented:

/te-ci pytorch

@cyanguwa requested a review from ptrendx on August 10, 2024 01:10
Signed-off-by: Charlene Yang <8636796+cyanguwa@users.noreply.github.com>
@cyanguwa (Collaborator, Author) commented:

/te-ci pytorch

Signed-off-by: Charlene Yang <8636796+cyanguwa@users.noreply.github.com>
@cyanguwa (Collaborator, Author) commented:

/te-ci pytorch

@ptrendx merged commit b8d453e into NVIDIA:main on Aug 13, 2024
25 of 26 checks passed
mgoldfarb-nvidia pushed a commit to mgoldfarb-nvidia/TransformerEngine that referenced this pull request on Aug 14, 2024:
[PyTorch] Merge k_channels and v_channels back to kv_channels (NVIDIA#1094)

* merge k_channels and v_channels back to kv_channels and accept a tuple

Signed-off-by: Charlene Yang <8636796+cyanguwa@users.noreply.github.com>

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* fix isinstance call

Signed-off-by: Charlene Yang <8636796+cyanguwa@users.noreply.github.com>

* fix MLA tests

Signed-off-by: Charlene Yang <8636796+cyanguwa@users.noreply.github.com>

---------

Signed-off-by: Charlene Yang <8636796+cyanguwa@users.noreply.github.com>
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>