3 files changed (+4 −4 lines changed)

@@ -17,7 +17,7 @@ Differentiable and accelerated spherical transforms
=================================================================================================================

`S2FFT` is a Python package for computing Fourier transforms on the sphere
- and rotation group in JAX and PyTorch. It leverages autodiff to provide differentiable
+ and rotation group using JAX and PyTorch. It leverages autodiff to provide differentiable
transforms, which are also deployable on hardware accelerators
(e.g. GPUs and TPUs).
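Since each file in this diff describes the same differentiable, accelerator-ready transforms, a minimal sketch of what that means in practice may help. The snippet below is not part of the diff: it assumes top-level `s2fft.forward_jax(f, L)` entry points and the default MW sampling with spherical signals of shape (L, 2L − 1); consult the S2FFT documentation to confirm the exact signatures.

```python
# Minimal sketch (not part of this diff): differentiating through a spherical
# harmonic transform with the JAX frontend. Function names, sampling scheme and
# array shapes are assumptions -- check the S2FFT docs.
import jax
import jax.numpy as jnp
import s2fft

L = 16  # harmonic band-limit

# Random signal on the sphere (assumed MW sampling grid of shape (L, 2L - 1)).
f = jax.random.normal(jax.random.PRNGKey(0), (L, 2 * L - 1))

# Forward spherical harmonic transform (runs on CPU/GPU/TPU under JAX).
flm = s2fft.forward_jax(f, L)

# Because the transform is differentiable, gradients flow straight through it,
# e.g. for a simple scalar loss on the harmonic coefficients.
def loss(f):
    return jnp.sum(jnp.abs(s2fft.forward_jax(f, L)) ** 2)

grad_f = jax.grad(loss)(f)
print(flm.shape, grad_f.shape)
```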
# Differentiable and accelerated spherical transforms

`S2FFT` is a Python package for computing Fourier transforms on the sphere
- and rotation group [(Price & McEwen 2023)](https://arxiv.org/abs/2311.14670) Using
+ and rotation group [(Price & McEwen 2023)](https://arxiv.org/abs/2311.14670) using
JAX or PyTorch. It leverages autodiff to provide differentiable transforms, which are
also deployable on hardware accelerators (e.g. GPUs and TPUs).

@@ -87,7 +87,7 @@ into the active python environment by [pip](https://pypi.org) when running
pip install s2fft
```
This will install all core functionality which includes JAX support. To install `S2FFT`
- with PyTorch support run the following
+ with PyTorch support run

```bash
pip install s2fft[torch]
@@ -2,7 +2,7 @@ Differentiable and accelerated spherical transforms
===================================================

``S2FFT`` is a Python package for computing Fourier transforms on the sphere and rotation
- group `(Price & McEwen 2023) <https://arxiv.org/abs/2311.14670>`_ in JAX and PyTorch.
+ group `(Price & McEwen 2023) <https://arxiv.org/abs/2311.14670>`_ using JAX and PyTorch.
It leverages autodiff to provide differentiable transforms, which are also
deployable on modern hardware accelerators (e.g. GPUs and TPUs).
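The wording changed here also mentions transforms on the rotation group. A hedged sketch of that side of the API, assuming S2FFT exposes a `s2fft.wigner` module with `forward_jax(f, L, N)` / `inverse_jax(flmn, L, N)` and SO(3) signals of shape (2N − 1, L, 2L − 1) under MW sampling; verify against the S2FFT documentation.

```python
# Hedged sketch (not part of this diff): Wigner transforms on the rotation group SO(3).
# Module path, signatures and the assumed sampling shape should be confirmed
# against the S2FFT docs.
import jax
import s2fft

L, N = 16, 4  # harmonic band-limit and directional (azimuthal) band-limit

# Random signal on SO(3) (assumed MW sampling grid of shape (2N - 1, L, 2L - 1)).
f = jax.random.normal(jax.random.PRNGKey(0), (2 * N - 1, L, 2 * L - 1))

flmn = s2fft.wigner.forward_jax(f, L, N)      # Wigner coefficients
f_rec = s2fft.wigner.inverse_jax(flmn, L, N)  # back to the rotation group
print(flmn.shape, f_rec.shape)
```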