
Commit c2e84c9

Create individual README for PyPI
1 parent 1a30314 commit c2e84c9

File tree

2 files changed: +97 -1 lines changed


README_PyPI.md

Lines changed: 96 additions & 0 deletions
# TorchKDE

![Python Version](https://img.shields.io/badge/python-3.11.11%2B-blue.svg)
![PyTorch Version](https://img.shields.io/badge/pytorch-2.5.1-brightgreen.svg)
![Tests](https://github.com/rudolfwilliam/torch-kde/actions/workflows/ci.yml/badge.svg)
[![DOI](https://zenodo.org/badge/901331908.svg)](https://doi.org/10.5281/zenodo.14674657)

A differentiable implementation of [kernel density estimation](https://en.wikipedia.org/wiki/Kernel_density_estimation) in PyTorch by Klaus-Rudolf Kladny.

$$\hat{f}(x) = \frac{1}{|H|^{\frac{1}{2}} n} \sum_{i=1}^n K \left( H^{-\frac{1}{2}} \left( x - x_i \right) \right)$$
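
To make the formula concrete, here is a minimal sketch of what the estimator computes for a Gaussian $K$ and an isotropic bandwidth matrix $H = h^2 I$ (a hypothetical helper for illustration, not the package's internals):

```python
import math
import torch

def gaussian_kde_eval(x: torch.Tensor, data: torch.Tensor, h: float) -> torch.Tensor:
    """Evaluate f_hat at query points x, given fitted samples and H = h^2 * I."""
    n, p = data.shape
    diff = (x[:, None, :] - data[None, :, :]) / h        # H^{-1/2} (x - x_i)
    k = torch.exp(-0.5 * diff.pow(2).sum(-1)) / (2 * math.pi) ** (p / 2)
    return k.sum(dim=1) / (h ** p * n)                   # |H|^{1/2} = h^p
```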

## Installation Instructions

The torch-kde package can be installed via `pip`. Run

```bash
pip install torch-kde
```

Now you are ready to go! If you would also like to run the code from the Jupyter notebooks or contribute to this package, please also install the packages in `requirements.txt`:

```bash
pip install -r requirements.txt
```

## What's included?

### Kernel Density Estimation

The `KernelDensity` class supports the same operations as the [KernelDensity class in scikit-learn](https://scikit-learn.org/dev/modules/generated/sklearn.neighbors.KernelDensity.html), but is implemented in PyTorch and is differentiable with respect to the input data. Here is a little taste:

```python
from torchkde import KernelDensity
import torch

multivariate_normal = torch.distributions.MultivariateNormal(torch.ones(2), torch.eye(2))
X = multivariate_normal.sample((1000,))  # create data
X.requires_grad = True  # enable differentiation
kde = KernelDensity(bandwidth=1.0, kernel='gaussian')  # create KDE object with isotropic bandwidth matrix
_ = kde.fit(X)  # fit the KDE to the data

X_new = multivariate_normal.sample((100,))  # create new data
logprob = kde.score_samples(X_new)  # log-density of the new points under the fitted KDE

logprob.grad_fn  # is not None
```
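
Since `score_samples` returns a differentiable tensor, you can backpropagate an objective through the density estimate to the fitted data. A sketch, assuming the fitted KDE keeps a reference to `X` from the snippet above:

```python
loss = -logprob.mean()  # negative average log-density of the new samples
loss.backward()         # gradients flow back to the fitted data
print(X.grad.shape)     # torch.Size([1000, 2])
```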

You may also check out `demo_kde.ipynb` for a simple demo on the [Bart Simpson distribution](https://www.stat.cmu.edu/~larry/=sml/densityestimation.pdf).

### Tophat Kernel Approximation

The Tophat kernel is not differentiable at two points and has zero derivative everywhere else. Thus, we provide a differentiable approximation via a generalized Gaussian (see, e.g., [Pascal et al.](https://arxiv.org/pdf/1302.6498) for reference):

$$K^{\text{tophat}}(x; \beta) = \frac{\beta \, \Gamma \left( \frac{p}{2} \right)}{\pi^{\frac{p}{2}} \, \Gamma \left( \frac{p}{2\beta} \right) 2^{\frac{p}{2\beta}}} \exp \left( - \frac{\| x \|_2^{2\beta}}{2} \right),$$

where $p$ is the dimensionality of $x$. Based on this kernel, we can approximate the Tophat kernel for large values of $\beta$.

Note that for $\beta = 1$, this approximation corresponds to a Gaussian kernel. Also, while the approximation improves for larger values of $\beta$, its gradients with respect to the input also grow larger. This tradeoff must be balanced when using this kernel.
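
As a rough illustration, a direct PyTorch transcription of the formula above might look like this (a sketch for reference; the package's actual kernel implementation may differ):

```python
import math
import torch

def tophat_approx(x: torch.Tensor, beta: float = 8.0) -> torch.Tensor:
    """Generalized-Gaussian approximation of the Tophat kernel; x has shape (n, p)."""
    p = x.shape[-1]
    # log of the normalizer: beta * Gamma(p/2) / (pi^(p/2) * Gamma(p/(2*beta)) * 2^(p/(2*beta)))
    log_norm = (
        math.log(beta)
        + math.lgamma(p / 2)
        - (p / 2) * math.log(math.pi)
        - math.lgamma(p / (2 * beta))
        - (p / (2 * beta)) * math.log(2.0)
    )
    sq_norm = x.pow(2).sum(dim=-1)  # ||x||_2^2 per sample
    return torch.exp(log_norm - sq_norm.pow(beta) / 2)
```

Setting `beta=1.0` recovers the standard Gaussian kernel, consistent with the note above.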

## Supported Settings

The current implementation provides the following functionality:

<div align="center">

| Feature         | Supported Values                                           |
|-----------------|------------------------------------------------------------|
| Kernels         | Gaussian, Epanechnikov, Exponential, Tophat Approximation  |
| Tree Algorithms | Standard                                                   |
| Bandwidths      | Float (isotropic bandwidth matrix), Scott, Silverman       |

</div>
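
The Scott and Silverman options refer to the usual rules of thumb for picking an isotropic bandwidth from the data. As a sketch of the standard factors (whether the package applies exactly these scalings is an assumption here):

```python
def scott_bandwidth(n: int, d: int) -> float:
    """Scott's rule of thumb: n^(-1 / (d + 4)) for n samples in d dimensions."""
    return n ** (-1.0 / (d + 4))

def silverman_bandwidth(n: int, d: int) -> float:
    """Silverman's rule of thumb: (n * (d + 2) / 4)^(-1 / (d + 4))."""
    return (n * (d + 2) / 4.0) ** (-1.0 / (d + 4))
```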

## Got an Extension? Create a Pull Request!

In case you do not know how to do that, here are the necessary steps:

1. Fork the repo
2. Create your feature branch (`git checkout -b cool_tree_algorithm`)
3. Run the unit tests (`python -m tests.test_kde`) and only proceed if the script outputs "OK"
4. Commit your changes (`git commit -am 'Add cool tree algorithm'`)
5. Push to the branch (`git push origin cool_tree_algorithm`)
6. Open a Pull Request

## Issues?

If you discover a bug or do not understand something, please create an issue or let me know directly at *kkladny [at] tuebingen [dot] mpg [dot] de*! I am also happy to take requests for implementing specific functionalities.

<div align="center">

> "In God we trust. All others must bring data."
>
> — W. Edwards Deming

</div>

pyproject.toml

Lines changed: 1 addition & 1 deletion

```diff
@@ -6,7 +6,7 @@ build-backend = "setuptools.build_meta"
 name = "torch-kde"
 version = "0.1.0"
 description = "A differentiable implementation of kernel density estimation in PyTorch"
-readme = "README.md"
+readme = "README_PyPI.md"
 license = { text = "MIT" }
 authors = [
     { name = "Klaus-Rudolf Kladny", email = "[email protected]" }
```

0 commit comments
