Commit edf2f12 (1 parent: dcee3c0)

minor update of readme

File tree: 1 file changed (+8, -6 lines)


README.md

Lines changed: 8 additions & 6 deletions
````diff
@@ -4,7 +4,7 @@
 [![Build Status](https://github.com/TuringLang/NormalizingFlows.jl/actions/workflows/CI.yml/badge.svg?branch=main)](https://github.com/TuringLang/NormalizingFlows.jl/actions/workflows/CI.yml?query=branch%3Amain)
 
 
-**Last updated: 2023-Aug-23**
+**Last updated: 2025-Mar-04**
 
 A normalizing flow library for Julia.
 
@@ -21,16 +21,16 @@ See the [documentation](https://turinglang.org/NormalizingFlows.jl/dev/) for mor
 To install the package, run the following command in the Julia REPL:
 ```julia
 ] # enter Pkg mode
-(@v1.9) pkg> add git@github.com:TuringLang/NormalizingFlows.jl.git
+(@v1.11) pkg> add NormalizingFlows
 ```
 Then run the following command to use the package:
 ```julia
 using NormalizingFlows
 ```
 
 ## Quick recap of normalizing flows
-Normalizing flows transform a simple reference distribution $q_0$ (sometimes known as base distribution) to
-a complex distribution $q$ using invertible functions.
+Normalizing flows transform a simple reference distribution $q_0$ (sometimes referred to as the base distribution)
+to a complex distribution $q$ using invertible functions.
 
 In more detail, given the base distribution, usually a standard Gaussian distribution, i.e., $q_0 = \mathcal{N}(0, I)$,
 we apply a series of parameterized invertible transformations (called flow layers), $T_{1, \theta_1}, \cdots, T_{N, \theta_N}$, yielding that
@@ -56,7 +56,7 @@ Given the feasibility of i.i.d. sampling and density evaluation, normalizing flo
 \text{Reverse KL:}\quad
 &\arg\min _{\theta} \mathbb{E}_{q_{\theta}}\left[\log q_{\theta}(Z)-\log p(Z)\right] \\
 &= \arg\min _{\theta} \mathbb{E}_{q_0}\left[\log \frac{q_\theta(T_N\circ \cdots \circ T_1(Z_0))}{p(T_N\circ \cdots \circ T_1(Z_0))}\right] \\
-&= \arg\max _{\theta} \mathbb{E}_{q_0}\left[ \log p\left(T_N \circ \cdots \circ T_1(Z_0)\right)-\log q_0(X)+\sum_{n=1}^N \log J_n\left(F_n \circ \cdots \circ F_1(X)\right)\right]
+&= \arg\max _{\theta} \mathbb{E}_{q_0}\left[ \log p\left(T_N \circ \cdots \circ T_1(Z_0)\right)-\log q_0(Z_0)+\sum_{n=1}^N \log J_n\left(T_n \circ \cdots \circ T_1(Z_0)\right)\right]
 \end{aligned}
 ```
 and
@@ -76,10 +76,12 @@ normalizing constant.
 In contrast, forward KL minimization is typically used for **generative modeling**,
 where one wants to learn the underlying distribution of some data.
 
-## Current status and TODOs
+## Current status and to-dos
 
 - [x] general interface development
 - [x] documentation
+- [ ] integrating [Lux.jl](https://lux.csail.mit.edu/stable/tutorials/intermediate/7_RealNVP) and [Reactant.jl](https://github.com/EnzymeAD/Reactant.jl).
+  This could potentially solve the GPU compatibility issue as well.
 - [ ] including more NF examples/Tutorials
   - WIP: [PR#11](https://github.com/TuringLang/NormalizingFlows.jl/pull/11)
 - [ ] GPU compatibility
````
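The reverse-KL objective in the README's recap can be sanity-checked on a toy one-dimensional flow. The sketch below is plain Python, not the NormalizingFlows.jl API; the single affine layer $T(z) = az + b$ and all parameter values are illustrative assumptions. It pushes $z_0 \sim \mathcal{N}(0,1)$ through the layer, uses the change of variables $\log q_\theta(T(z_0)) = \log q_0(z_0) - \log|a|$, and compares a Monte Carlo estimate of $\mathbb{E}_{q_\theta}[\log q_\theta(Z) - \log p(Z)]$ against the closed-form Gaussian KL.

```python
import math
import random

random.seed(0)

# One affine flow layer T(z) = a*z + b applied to z0 ~ N(0, 1),
# so q_theta = N(b, a^2) and log q_theta(T(z0)) = log q0(z0) - log|a|.
a, b = 2.0, 1.0        # flow parameters (illustrative values)
mu, sigma = 0.0, 1.0   # target p = N(mu, sigma^2)

def log_normal(x, m, s):
    """Log-density of N(m, s^2) at x."""
    return -0.5 * math.log(2 * math.pi * s * s) - (x - m) ** 2 / (2 * s * s)

# Monte Carlo estimate of E_{q0}[ log q_theta(T(Z0)) - log p(T(Z0)) ]
n = 200_000
total = 0.0
for _ in range(n):
    z0 = random.gauss(0.0, 1.0)
    x = a * z0 + b                                        # push forward
    log_q = log_normal(z0, 0.0, 1.0) - math.log(abs(a))   # change of variables
    total += log_q - log_normal(x, mu, sigma)
kl_mc = total / n

# Closed-form KL( N(b, a^2) || N(mu, sigma^2) ) for comparison
kl_exact = (math.log(sigma / abs(a))
            + (a * a + (b - mu) ** 2) / (2 * sigma * sigma) - 0.5)
print(f"MC reverse KL = {kl_mc:.3f}, exact = {kl_exact:.3f}")
```

With 200,000 samples the Monte Carlo estimate should agree with the closed form to a few hundredths, which is exactly the identity the $\arg\max$ line of the derivation relies on.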
