
Fix experimental attention variant spec #2018

Closed
yuzhongw-nvidia wants to merge 3 commits into NVIDIA-NeMo:main from yuzhongw-nvidia:yuzhongw/exp_spec


Conversation

@yuzhongw-nvidia
Contributor

What does this PR do ?

Add a one line overview of what this PR aims to accomplish.

Changelog

  • Add specific line by line info of high level changes in this PR.

GitHub Actions CI

See the CI section in the Contributing doc for how to trigger the CI. An NVIDIA developer will need to approve and trigger the CI for external contributors.

Before your PR is "Ready for review"

Pre checks:

  • Make sure you read and followed Contributor guidelines
  • Did you write any new necessary tests?
  • Did you add or update any necessary documentation?
  • Does the PR affect components that are optional to install? (Ex: Numba, Pynini, Apex etc)
    • Reviewer: Does the PR have correct import guards for all optional libraries?

If you haven't finished some of the above items, you can still open a "Draft" PR.

Additional Information

  • Related to # (issue)

@copy-pr-bot

copy-pr-bot bot commented Jan 21, 2026

This pull request requires additional validation before any workflows can run on NVIDIA's runners.

Pull request vetters can view their responsibilities here.

Contributors can view more details about this message here.

Signed-off-by: oliver könig <okoenig@nvidia.com>
Contributor

@maanug-nv maanug-nv left a comment

lgtm, but let's wait until internal CI run finishes

@ko3n1g
Contributor

ko3n1g commented Jan 22, 2026

/ok to test 98586d6

@ko3n1g
Contributor

ko3n1g commented Jan 22, 2026

Closed in favor of #2030 due to linting issues.

@ko3n1g ko3n1g closed this Jan 22, 2026
@ko3n1g ko3n1g reopened this Jan 22, 2026
config = _transformer_config_from_args(args)

if args.num_experts:
    if args.experimental_attention_variant is not None:

this should be safely accessed with getattr

Suggested change
- if args.experimental_attention_variant is not None:
+ if getattr(args, "experimental_attention_variant", None) is not None:
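
The reviewer's `getattr` suggestion guards against argument namespaces produced by older parsers that never defined the flag, where direct attribute access would raise `AttributeError`. A minimal sketch of the difference (the `Namespace` objects and attribute values below are hypothetical, not taken from the PR):

```python
from argparse import Namespace

# Hypothetical args objects: one parsed with the experimental flag defined,
# one from an older parser where the attribute is missing entirely.
new_args = Namespace(num_experts=4, experimental_attention_variant="sliding_window")
old_args = Namespace(num_experts=4)

for args in (new_args, old_args):
    # getattr with a None default never raises, even when the attribute
    # does not exist; `args.experimental_attention_variant` would raise
    # AttributeError for old_args.
    variant = getattr(args, "experimental_attention_variant", None)
    if variant is not None:
        print(f"using experimental attention variant: {variant}")
    else:
        print("no experimental attention variant configured")
```

This keeps the config-building code compatible with checkpoints or scripts whose argument sets predate the new flag, at the cost of silently ignoring it when absent.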

@ko3n1g ko3n1g closed this Jan 23, 2026