Commit d2ac686

Update README.md

1 parent d197a36 commit d2ac686

File tree

1 file changed: +16 −1 lines


README.md

Lines changed: 16 additions & 1 deletion
```diff
@@ -10,7 +10,7 @@ pip install mixed-precision-for-JAX
 
 ## Documentation
 For basic usage, this README should give you everything you need to know.
-For deeper insights, you can find the documentation at: https://data-science-in-mechanical-engineering.github.io/mixed_precision_for_JAX/ and our paper at: TODO
+For deeper insights, you can read the [documentation](https://data-science-in-mechanical-engineering.github.io/mixed_precision_for_JAX/) (https://data-science-in-mechanical-engineering.github.io/mixed_precision_for_JAX/) and our [paper](https://www.arxiv.org/pdf/2507.03312) (https://www.arxiv.org/pdf/2507.03312).
 
 ## Introduction
 
```
````diff
@@ -260,6 +260,21 @@ class MultiHeadAttentionBlock(eqx.Module):
         return outputs
 ```
 
+## Citation
+
+To cite this repository, please cite our [paper](https://www.arxiv.org/pdf/2507.03312):
+
+```
+@ARTICLE{2025arXiv250703312G,
+    author = {{Gr{\"a}fe}, Alexander and {Trimpe}, Sebastian},
+    title = "{MPX: Mixed Precision Training for JAX}",
+    journal = {arXiv e-prints},
+    year = 2025,
+    doi = {10.48550/arXiv.2507.03312},
+}
+```
 
 ## Acknowledgements
 We want to thank Patrick Kidger for providing Equinox and Google DeepMind for providing JMP, which was the base for this implementation.
````
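The acknowledgements credit JMP, whose central technique is loss scaling: small float32 gradients underflow to zero when cast to float16, so the loss (and hence the gradients) is multiplied by a large factor before the cast and divided back out afterwards. A minimal NumPy sketch of why this is needed; the scale factor and variable names are illustrative only, not MPX's or JMP's API:

```python
import numpy as np

# Illustrative sketch of loss scaling, the core trick behind
# mixed-precision training (MPX/JMP wrap this for JAX pytrees).

SCALE = 2.0 ** 15  # loss-scale factor (illustrative choice)

grad_f32 = np.float32(1e-8)  # a small gradient in full precision

# Naive cast: 1e-8 is below float16's smallest subnormal (~6e-8),
# so the gradient silently underflows to zero.
naive = np.float16(grad_f32)

# Scale before casting: the scaled value lands in float16's
# representable range...
scaled = np.float16(grad_f32 * SCALE)

# ...and the original magnitude is recovered by unscaling in float32.
recovered = np.float32(scaled) / np.float32(SCALE)

print(naive)      # underflows to 0.0
print(recovered)  # close to the original 1e-8
```

Dynamic variants (as in JMP) additionally grow the scale while gradients stay finite and shrink it on overflow, so no manual tuning of `SCALE` is required.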
