
@amreis commented Oct 30, 2025

Hello!

Thank you for coding this nice implementation of KDE in Torch. I've been using it, but have run into a few issues.

  • Sometimes the log-likelihood is -inf, which causes downstream problems when computing gradients (they become NaN). I've added an option for the user to specify an eps value that is added to the density before the log is taken; a small positive value improves stability there.
  • Using this KDE as part of another Torch module used to run into issues because KernelDensity never calls super().__init__(), so it does not populate some variables (self._modules, for instance) that Torch uses internally. I've added the call. A sketch of both changes follows below.

I've taken the liberty of removing some trailing whitespace too.
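
For concreteness, here is a minimal sketch of what the two changes might look like on a toy Gaussian KDE module. The class name KernelDensity matches the package, but the constructor arguments, the eps default of 1e-12, and the kernel math are illustrative assumptions, not the package's actual API:

```python
import torch
import torch.nn as nn


class KernelDensity(nn.Module):
    """Illustrative Gaussian KDE sketch; argument names and defaults are assumptions."""

    def __init__(self, X_train: torch.Tensor, bandwidth: float = 1.0, eps: float = 1e-12):
        # Calling super().__init__() populates nn.Module internals
        # (self._modules, self._parameters, ...), so this module can be
        # composed with other PyTorch / Lightning constructs.
        super().__init__()
        self.register_buffer("X_train", X_train)
        self.bandwidth = bandwidth
        self.eps = eps  # added to the density before taking the log

    def log_prob(self, x: torch.Tensor) -> torch.Tensor:
        n, d = self.X_train.shape
        # Pairwise squared distances between query points and training points.
        diff = x.unsqueeze(1) - self.X_train.unsqueeze(0)       # (m, n, d)
        sq_dist = diff.pow(2).sum(dim=-1)                       # (m, n)
        norm = (2 * torch.pi * self.bandwidth ** 2) ** (d / 2)
        density = torch.exp(-sq_dist / (2 * self.bandwidth ** 2)).sum(dim=-1) / (n * norm)
        # Without eps, density == 0 far from the data yields log(0) = -inf,
        # which turns gradients into NaN; a small positive eps keeps both finite.
        return torch.log(density + self.eps)


# Hypothetical usage: the module now composes and registers like any nn.Module.
kde = KernelDensity(torch.randn(100, 2), bandwidth=0.5)
print(kde.log_prob(torch.randn(5, 2)))
```

Adding eps slightly biases the density estimate, but it keeps both the log-likelihood and its gradients finite far away from the training data.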

Thank you once more for the package =)

Alister Machado and others added 2 commits October 30, 2025 18:07
Also, call super().__init__(), enabling this module to be composed with other PyTorch and Lightning constructs.
@rudolfwilliam (Owner)

Thank you for the contribution. This is indeed a useful feature.

@rudolfwilliam merged commit edd19b7 into rudolfwilliam:master Nov 5, 2025
1 check passed