
Commit 7675b89

Create TODO.md
1 parent 15c66bc commit 7675b89

1 file changed: +59 −0 lines changed

tests/TODO.md

# Tests to roll in from `notebooks`:

* `2_polblogs_benchmark.ipynb`:
  * Verify that ProductDT = Sklearn DT for Euclidean manifolds
  * Verify `dist_component_by_manifold` sums to 1
  * __NEW__: Check equivalence to the Euclidean case for other models as well
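A minimal NumPy sketch of the sums-to-1 check, assuming `dist_component_by_manifold` returns each factor's share of the total squared distance (the real API may differ; the slices and sizes here are illustrative):

```python
import numpy as np

# Hypothetical stand-in for dist_component_by_manifold: for a product of
# Euclidean factors, each factor's share of the total squared distance.
rng = np.random.default_rng(0)
x = rng.normal(size=6)
y = rng.normal(size=6)
factors = [slice(0, 2), slice(2, 6)]  # two Euclidean factors: R^2 x R^4

comp_d2 = np.array([np.sum((x[s] - y[s]) ** 2) for s in factors])
fractions = comp_d2 / comp_d2.sum()

# The per-manifold shares should sum to 1.
assert np.isclose(fractions.sum(), 1.0)
```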
* `3_verify_shapes.ipynb`:
  * Assert `Manifold.pdist()` is close to the square root of `Manifold.pdist2()`
  * __NEW__: Check this for all `dist2` functions
  * Assert each entry of `pdists` is the sum of per-manifold squared distances:

```python
assert torch.allclose(pdists[j, k], M.dist2(x_embed[j, dims], x_embed[k, dims]).sum(), atol=1e-6)
```
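For the Euclidean special case, both identities can be sanity-checked without the library (a NumPy sketch; `Manifold.pdist`/`pdist2` and the product decomposition are the assumed targets):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(10, 7))
dims = [slice(0, 3), slice(3, 7)]  # components of a product R^3 x R^4

# Two independent computations of pairwise distances.
diff = X[:, None, :] - X[None, :, :]
pdist = np.linalg.norm(diff, axis=-1)  # stands in for Manifold.pdist()
pdist2 = (diff ** 2).sum(axis=-1)      # stands in for Manifold.pdist2()

# pdist should be the square root of pdist2 ...
assert np.allclose(pdist, np.sqrt(pdist2), atol=1e-6)

# ... and pdist2 should decompose as the sum of per-component squared distances.
comp = sum((diff[..., s] ** 2).sum(axis=-1) for s in dims)
assert np.allclose(pdist2, comp, atol=1e-6)
```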
* `5_sampling.ipynb`:
  * Check that distances to the origin are the same for all wrapped normal distributions (except spherical at very high curvature)
  * Check that the log-likelihood differences Q(z) - P(z) are generally positive (repeat with `ProductManifold`s):

```python
import torch

# `Manifold` comes from this repo; N_SAMPLES was undefined in the
# original snippet, so an arbitrary value is used here.
N_SAMPLES = 100

for K in [-2, -1.0, -0.5, 0, 0.5, 1.0, 2.0]:
    print(K)
    m = Manifold(K, 4)

    # Pick a random point to use as the center
    mu = m.sample(m.mu0)

    # Random diagonal covariance
    Sigma = torch.diag(torch.randn(m.dim)) ** 2
    samples = m.sample(z_mean=torch.cat([mu] * N_SAMPLES, dim=0), sigma=Sigma)

    log_probs_p = m.log_likelihood(z=samples)  # Default args
    log_probs_q = m.log_likelihood(z=samples, mu=mu, sigma=Sigma)
    print(
        f"Shape: {log_probs_p.shape},\tP(z) = {log_probs_p.mean().item():.3f},"
        f"\tQ(z) = {log_probs_q.mean().item():.3f},"
        f"\tQ(z) - P(z) = {log_probs_q.mean().item() - log_probs_p.mean().item():.3f}"
    )
    print()
```

  * Check that the KL divergence equals this difference
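The KL check has a cheap Euclidean analogue: for 1D Gaussians, the Monte Carlo mean of Q(z) - P(z) under Q should match the closed-form KL divergence (a NumPy sketch; the wrapped-normal case is the actual target):

```python
import numpy as np

rng = np.random.default_rng(0)
mu, sigma = 0.7, 1.3  # Q = N(mu, sigma^2); P = N(0, 1)
z = rng.normal(mu, sigma, size=200_000)

def log_normal(z, m, s):
    # Log-density of N(m, s^2) at z.
    return -0.5 * ((z - m) / s) ** 2 - np.log(s) - 0.5 * np.log(2 * np.pi)

mc_kl = np.mean(log_normal(z, mu, sigma) - log_normal(z, 0.0, 1.0))
closed_kl = 0.5 * (sigma**2 + mu**2 - 1.0) - np.log(sigma)

assert mc_kl > 0                      # Q(z) - P(z) is positive on average
assert abs(mc_kl - closed_kl) < 0.02  # and matches the closed-form KL
```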
* `10_torchified_hyperdt.ipynb`:
  * Implement the legacy (iterative) version of the MCDT class
  * Assert info gains are the same
  * Assert splits are the same
  * __TODO__: revisit this more carefully. Probably notebook 13 supersedes all of this.
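The legacy-vs-torchified equivalence can be prototyped in NumPy before porting: compute the information gain for every split position once with a loop and once vectorized, and assert they agree (a sketch with entropy-based gain on synthetic data; the real classes are assumed):

```python
import numpy as np

def entropy(p):
    # Binary entropy, clipped to stay finite at p in {0, 1}.
    p = np.clip(p, 1e-12, 1 - 1e-12)
    return -(p * np.log2(p) + (1 - p) * np.log2(1 - p))

rng = np.random.default_rng(0)
x = rng.normal(size=50)
y = (x + 0.3 * rng.normal(size=50) > 0).astype(int)

order = np.argsort(x)
ys = y[order]
n = len(ys)
H = entropy(ys.mean())

# Iterative ("legacy") information gains, one split position at a time.
ig_loop = np.array([
    H - (i / n) * entropy(ys[:i].mean()) - ((n - i) / n) * entropy(ys[i:].mean())
    for i in range(1, n)
])

# Vectorized version via cumulative label counts.
i = np.arange(1, n)
pos = np.cumsum(ys)[:-1]
ig_vec = (
    H
    - (i / n) * entropy(pos / i)
    - ((n - i) / n) * entropy((ys.sum() - pos) / (n - i))
)

assert np.allclose(ig_loop, ig_vec, atol=1e-9)
```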
* `13_information_gain.ipynb`:
  * Verify the comparisons tensor (`verification_tensor`)
  * Verify the information gains (`ig_gains_nonan`)
  * Verify the angles (`angles`)
* `14_covariance_scaling.ipynb`:
  * Verify that dividing the variance by the dimension gives ~the same norm for the spacelike dimensions
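The covariance-scaling claim is easy to check in the Euclidean case: with per-coordinate variance sigma^2 / d, the expected squared norm is sigma^2 for every d (a NumPy sketch; the spacelike-dimension version is the actual target):

```python
import numpy as np

rng = np.random.default_rng(0)
sigma2 = 2.0
norms = {}
for d in [2, 8, 32, 128]:
    # Per-coordinate variance sigma^2 / d  =>  E[||x||^2] = sigma^2 for all d.
    x = rng.normal(scale=np.sqrt(sigma2 / d), size=(100_000, d))
    norms[d] = (x ** 2).sum(axis=1).mean()

# Mean squared norms should be ~sigma2 regardless of dimension.
assert all(abs(v - sigma2) < 0.05 for v in norms.values())
```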
* `17_verify_new_mse.ipynb`:
  * Similar to notebook 13: compare information gains in the regression setting
* `47_stereographic_tests.ipynb`:
  * Does the ReLU decorator work correctly?
  * Stereographic projections are ~invertible
  * `mu0` becomes 0 under stereographic projection
  * The projection matrix `man.projection_matrix` is the identity matrix for stereographic manifolds
  * Check that our complicated left-multiplication is equivalent to this:

```python
def left_multiply(self, A, X):
    # Row i of the output is the weighted gyromidpoint of X_i under
    # weights A_i, rescaled by the total weight via Mobius scalar
    # multiplication.
    out = torch.zeros_like(X)
    for i, (A_i, X_i) in enumerate(zip(A, X)):
        m_i = self.manifold.manifold.weighted_midpoint(xs=X_i, weights=A_i)
        out[i] = self.manifold.manifold.mobius_scalar_mul(r=A_i.sum(), x=m_i)
    return out
```

  * Verify that stereographic logits are smooth at kappa=0 (how?)
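The invertibility and `mu0` checks have a standard closed form at curvature -1: map hyperboloid points to the Poincare ball and back, and assert a round trip (a NumPy sketch with the textbook formulas; the library's own projection functions are the assumed targets):

```python
import numpy as np

def to_ball(x):
    # Stereographic projection: hyperboloid (curvature -1) -> Poincare ball.
    return x[..., 1:] / (1.0 + x[..., :1])

def to_hyperboloid(p):
    # Inverse stereographic projection.
    sq = (p ** 2).sum(-1, keepdims=True)
    x0 = (1.0 + sq) / (1.0 - sq)
    return np.concatenate([x0, 2.0 * p / (1.0 - sq)], axis=-1)

rng = np.random.default_rng(0)
x_bar = rng.normal(size=(5, 3))
# Lift to the hyperboloid: x0 = sqrt(1 + ||x_bar||^2).
x = np.concatenate(
    [np.sqrt(1.0 + (x_bar ** 2).sum(-1, keepdims=True)), x_bar], axis=-1
)

# Round trip is ~identity, and mu0 = (1, 0, ..., 0) maps to the origin.
assert np.allclose(to_hyperboloid(to_ball(x)), x, atol=1e-10)
assert np.allclose(to_ball(np.array([1.0, 0.0, 0.0, 0.0])), 0.0)
```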
