Conversation

@abdela47

This PR adds deterministic unit tests for pseudo_kl_divergence_loss.

The tests cover both documented spin shapes and verify gradient behavior.
They isolate the statistical structure of the loss using deterministic dummy Boltzmann
machines and do not rely on samplers or quantum hardware.

Closes #56.
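For context, a "deterministic dummy Boltzmann machine" of the kind described can be small enough to enumerate exactly. A hedged sketch (all names and values below are illustrative, not the repository's actual API):

```python
import itertools

import torch

# Illustrative "dummy Boltzmann machine" over two +/-1 spins: small enough
# that the exact Boltzmann distribution is computable by enumeration, so a
# test needs no sampler or quantum hardware. Biases and couplings are
# arbitrary fixed values chosen for this sketch.
h = torch.tensor([0.5, -0.25])                 # linear biases
J = torch.tensor([[0.0, 0.3],
                  [0.3, 0.0]])                 # symmetric couplings

# Enumerate all 4 spin configurations of shape (4, 2).
states = torch.tensor(list(itertools.product([-1.0, 1.0], repeat=2)))
energies = states @ h + 0.5 * ((states @ J) * states).sum(dim=1)
probs = torch.softmax(-energies, dim=0)        # exact Boltzmann weights

assert torch.isclose(probs.sum(), torch.tensor(1.0))
```

Because the distribution is exact and fixed, any statistic computed from it is fully reproducible across runs, which is what makes the loss tests deterministic.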

@kevinchern kevinchern self-requested a review December 22, 2025 01:22
Collaborator

@kevinchern kevinchern left a comment

Hi Ahmed @abdela47. Thank you for the pull request.
The tests are well-reasoned, modular, and nicely documented.
I did a quick first pass and added some minor requests.
Separately, you may find this contribution guide helpful for our conventions and best practices.

Comment on lines 88 to 91
logits = torch.zeros(batch, n_spins)

# spins: (batch_size, n_samples, n_spins)
spins = torch.ones(batch, n_samples, n_spins)
Collaborator

Could you explain your rationale for using zero-valued logits and spins in this test versus nonzero values in the 2D test?

Author

Zero logits keep the entropy term simple and stable in the 3D shape test (every probability is p = 0.5), so that test focuses purely on the documented shape support. Nonzero values are exercised in the 2D numerical-correctness test.
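A quick numerical illustration of this rationale (a standalone sketch, not the test itself): zero logits pin every Bernoulli probability to 0.5, so the entropy term is the constant ln 2 per spin.

```python
import torch
import torch.nn.functional as F

# Zero logits => sigmoid(0) = 0.5 everywhere, so the entropy term is the
# constant ln(2) and cannot inject numerical noise into a shape test.
logits = torch.zeros(3, 4)
p = torch.sigmoid(logits)
entropy = F.binary_cross_entropy_with_logits(logits, p)

assert torch.allclose(p, torch.full_like(p, 0.5))
assert torch.isclose(entropy, torch.log(torch.tensor(2.0)))
```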

Collaborator

@kevinchern kevinchern left a comment

Hi @abdela47, just a couple more changes:

  1. Can you note this motivation as a comment in the test?
  2. Can you encapsulate the tests in a unittest framework, and then rename the tests to be more concise?
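One possible shape for that refactor (a sketch only; the class and test names are placeholders, and the real tests would call the actual loss function):

```python
import unittest

import torch

class TestPseudoKLDivergenceLoss(unittest.TestCase):
    # Placeholder bodies: the real tests would call the loss itself.

    def test_3d_spin_shape(self):
        # Zero logits keep the entropy term constant (p = 0.5), so this
        # test can focus on the documented (batch, n_samples, n_spins)
        # shape support.
        logits = torch.zeros(2, 3)
        spins = torch.ones(2, 5, 3)
        self.assertEqual(spins.shape, (2, 5, 3))
        self.assertTrue((torch.sigmoid(logits) == 0.5).all())

    def test_2d_numerical_correctness(self):
        # Nonzero values exercise the actual numerics.
        logits = torch.tensor([[0.7, -1.2]])
        self.assertTrue(torch.isfinite(logits).all())
```

Grouping the tests in a `unittest.TestCase` also lets each method carry the motivation for its fixture as a comment, addressing point 1 at the same time.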

@kevinchern kevinchern assigned thisac and unassigned thisac Jan 8, 2026
@kevinchern kevinchern requested a review from thisac January 8, 2026 18:15
Contributor

@thisac thisac left a comment

Happy to approve as soon as the final comments from @kevinchern are addressed. Also, please rebase or merge main into the branch so that the tests can pass (the issues have been fixed in main).

Comment on lines +67 to +69


spins_model = torch.ones(batch_size, n_spins, dtype=torch.float32)
Contributor

Suggested change
spins_model = torch.ones(batch_size, n_spins, dtype=torch.float32)

entropy2 = F.binary_cross_entropy_with_logits(logits2, probs2)
(-entropy2).backward()

torch.testing.assert_close(logits.grad, logits2.grad)
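For readers following along, the gradient check quoted above can be reproduced standalone. The snippet below (illustrative values; a hand-written entropy serves as the reference) computes the Bernoulli entropy two ways and compares gradients with respect to the logits:

```python
import torch
import torch.nn.functional as F

# Sketch of the gradient-equivalence check: the entropy of independent
# Bernoulli spins, written two ways, must produce identical gradients
# w.r.t. the logits. Values here are arbitrary illustrative numbers.
logits = torch.tensor([[0.3, -0.8], [1.1, 0.0]], requires_grad=True)
logits2 = logits.detach().clone().requires_grad_(True)

# Route 1: entropy written out by hand, H = -(p log p + (1-p) log(1-p)).
p = torch.sigmoid(logits)
entropy = -(p * torch.log(p) + (1 - p) * torch.log1p(-p)).mean()
entropy.backward()

# Route 2: the same entropy via binary_cross_entropy_with_logits, with
# the (non-detached) probabilities as targets.
probs2 = torch.sigmoid(logits2)
entropy2 = F.binary_cross_entropy_with_logits(logits2, probs2)
entropy2.backward()

torch.testing.assert_close(logits.grad, logits2.grad)
```

Note that the targets must stay connected to the graph: detaching `probs2` would zero out the entropy gradient and the assertion would fail.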
Contributor

Don't forget to add an

if __name__ == "__main__":
    unittest.main()

at the end of this file to make sure the tests are run correctly.

Development

Successfully merging this pull request may close these issues.

Missing unit tests for pseudo_kl_divergence loss function
