
Commit d86b17a

Revise README for clarity and updates
Updated README for clarity and conciseness, including corrections and additional details about MFC's capabilities and sponsors.
1 parent 7b21a8f commit d86b17a


README.md (23 additions, 20 deletions)
@@ -27,8 +27,9 @@
 
 **Welcome!**
 MFC simulates compressible multi-phase flows, [among other things](#what-else-can-this-thing-do).
-It uses metaprogramming to stay short and portable (~20K lines).
-MFC conducted the largest known, open CFD simulation at <a href="https://arxiv.org/abs/2505.07392" target="_blank">200 trillion grid points</a>, and 1 quadrillion degrees of freedom (as of September 2025), and is a 2025 Gordon Bell Prize finalist.
+It uses metaprogramming and is short (20K lines) and portable.
+MFC conducted the largest known CFD simulation at <a href="https://arxiv.org/abs/2505.07392" target="_blank">200 trillion grid points</a> and 1 quadrillion degrees of freedom (as of September 2025).
+MFC is a 2025 Gordon Bell Prize Finalist.
 
 <p align="center">
   <a href="https://doi.org/10.48550/arXiv.2503.07953" target="_blank">
@@ -76,27 +77,32 @@ This one simulates high-Mach flow over an airfoil:
   <img src="docs/res/airfoil.png" alt="Airfoil Example" width="700"/><br/>
 </p>
 
-And here is a high amplitude acoustic wave reflecting and emerging through a circular orifice:
+And here is a high-amplitude acoustic wave reflecting and emerging through a circular orifice:
 
 <p align="center">
   <img src="docs/res/orifice.png" alt="Orifice Example" width="700"/><br/>
 </p>
 
 
 ## Getting started
-For a quick start, open a GitHub Codespace to load a pre-configured Docker container to get familiar with MFC commands. Click <kbd> <> Code</kbd> (green button at top right) → <kbd>Codespaces</kbd> (right tab) → <kbd>+</kbd> (create a codespace).
 
-****Notes:**** Codespaces is a free service with a monthly quota of compute time and storage usage. It is recommended for testing commands, troubleshooting, and running simple case files without the need to install dependencies and build MFC on your device. Remember to save any important files locally before closing your codespace. To learn more, read through [how Docker & Containers work](https://mflowcode.github.io/documentation/docker.html).
+For a _very_ quick start, open a GitHub Codespace to load a pre-configured Docker container and familiarize yourself with MFC commands.
+Click <kbd> <> Code</kbd> (green button at top right) → <kbd>Codespaces</kbd> (right tab) → <kbd>+</kbd> (create a codespace).
 
-Otherwise, you can navigate [to this webpage](https://mflowcode.github.io/documentation/md_getting-started.html) to get started using MFC!
+> **Note:** Codespaces is a free service with a monthly quota of compute time and storage usage.
+> It is recommended for testing commands, troubleshooting, and running simple case files without installing dependencies or building MFC on your device.
+> Don't conduct any critical work here!
+> To learn more, please see [how Docker & Containers work](https://mflowcode.github.io/documentation/docker.html).
+
+You can navigate [to this webpage](https://mflowcode.github.io/documentation/md_getting-started.html) to get started using MFC on your local machine, cluster, or supercomputer!
 It's rather straightforward.
-We'll give a brief intro. here for MacOS.
+We'll give a brief introduction for macOS below.
 Using [brew](https://brew.sh), install MFC's dependencies:
 ```shell
 brew install coreutils python cmake fftw hdf5 gcc boost open-mpi lapack
 ```
 You're now ready to build and test MFC!
-Put it to a convenient directory via
+Put it in a local directory via
 ```shell
 git clone https://github.com/MFlowCode/MFC
 cd MFC
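
A minimal sketch of the likely next steps after `cd MFC`, assuming MFC's `./mfc.sh` driver script and its `build`, `test`, and `run` subcommands (exact flag names may differ across versions):

```shell
# Build MFC; -j sets the number of parallel build jobs
./mfc.sh build -j 8

# Run the test suite to verify the build
./mfc.sh test -j 8

# Run an example case; this path matches the 3D shock-droplet case
# whose silo_hdf5 output the README suggests visualizing in Paraview
./mfc.sh run examples/3d_shockdroplet/case.py
```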
@@ -126,17 +132,14 @@ You can visualize the output data in `examples/3d_shockdroplet/silo_hdf5` via Paraview
 ## Is this _really_ exascale?
 
 [OLCF Frontier](https://www.olcf.ornl.gov/frontier/) is the first exascale supercomputer.
-The weak scaling of MFC on this machine shows near-ideal utilization.
+The weak scaling of MFC on this machine shows near-ideal utilization.
+We also scale ideally to >98% of LLNL El Capitan.
 
 <p align="center">
   <img src="docs/res/scaling.png" alt="Scaling" width="400"/>
 </p>
 
-
-## What else can this thing do
-
-MFC has many features.
-They are organized below.
+## What else can this thing do?
 
 ### Physics
 
@@ -212,7 +215,7 @@
 
 If you use MFC, consider citing it as below.
 Ref. 1 includes all modern MFC features, including GPU acceleration and many new physics features.
-If referencing MFC's (GPU) performance, consider citing ref. 1 and 2, which describe the solver and how it was crafted.
+If referencing MFC's (GPU) performance, consider citing refs. 1 and 2, which describe the solver and its design.
 The original open-source release of MFC is ref. 3, which should be cited for provenance as appropriate.
 
 ```bibtex
@@ -252,11 +255,11 @@ MFC is under the MIT license (see [LICENSE](LICENSE) for full text).
 
 ## Acknowledgements
 
-Federal sponsors have supported MFC development, including the US Department of Defense (DOD), the National Institutes of Health (NIH), the Department of Energy (DOE), and the National Science Foundation (NSF).
+Federal sponsors have supported MFC development, including the US Department of Defense (DOD), the National Institutes of Health (NIH), the Department of Energy (DOE) and National Nuclear Security Administration (NNSA), and the National Science Foundation (NSF).
 
 MFC computations have used many supercomputing systems. A partial list is below
-* OLCF Frontier and Summit, and testbeds Wombat, Crusher, and Spock (allocation CFD154, PI Bryngelson)
-* LLNL El Capitan, Tuolumne, and Lassen; El Capitan early access system Tioga
+* OLCF Frontier and Summit, and testbeds Wombat, Crusher, and Spock (allocation CFD154, PI Bryngelson).
+* LLNL El Capitan, Tuolumne, and Lassen; El Capitan early access system Tioga.
 * NCSA Delta and DeltaAI, PSC Bridges(1/2), SDSC Comet and Expanse, Purdue Anvil, TACC Stampede(1-3), and TAMU ACES via ACCESS-CI allocations from Bryngelson, Colonius, Rodriguez, and more.
-* DOD systems Blueback, Onyx, Carpenter, Nautilus, and Narwhal via the DOD HPCMP program
-* Sandia National Labs systems Doom and Attaway and testbed systems Weaver and Vortex
+* DOD systems Blueback, Onyx, Carpenter, Nautilus, and Narwhal via the DOD HPCMP program.
+* Sandia National Labs systems Doom and Attaway, and testbed systems Weaver and Vortex.
