Conversation

@anhminhnguyenhoang

Description

Create the ROCm version of the example notebook attention.ipynb.

Fixes # (issue)

Type of change

  • Documentation change (change only to the documentation, either a fix or new content)
  • Bug fix (non-breaking change which fixes an issue)
  • New feature (non-breaking change which adds functionality)
  • Breaking change (fix or feature that would cause existing functionality to not work as expected)
  • Infra/Build change
  • Code refactor

Changes

Please list the changes introduced in this PR:

  • Created a Jupyter notebook named attention-rocm.ipynb that adapts the original cuDNN notebook attention.ipynb to the ROCm backend
  • Added some modifications to locally import the module tests.pytorch.fused_attn.test_fused_attn
  • attention-rocm.ipynb contains a hacky fix (advice requested on how to resolve this properly) for being unable to import the function transformer_engine.pytorch.attention._flash_attn_3_is_installed on the Docker image rocm/megatron-lm:v25.3

Checklist:

  • I have read and followed the contributing guidelines
  • The functionality is complete
  • I have commented my code, particularly in hard-to-understand areas
  • I have made corresponding changes to the documentation
  • My changes generate no new warnings
  • I have added tests that prove my fix is effective or that my feature works
  • New and existing unit tests pass locally with my changes

@wenchenvincent
Collaborator

@anhminhnguyenhoang Is this PR still needed?