|
27 | 27 |
|
28 | 28 | **Welcome!** |
29 | 29 | MFC simulates compressible multi-phase flows, [among other things](#what-else-can-this-thing-do). |
30 | | -It uses metaprogramming to stay short and portable (~20K lines). |
31 | | -MFC conducted the largest known, open CFD simulation at <a href="https://arxiv.org/abs/2505.07392" target="_blank">200 trillion grid points</a>, and 1 quadrillion degrees of freedom (as of September 2025), and is a 2025 Gordon Bell Prize finalist. |
| 30 | +It uses metaprogramming to stay short (about 20K lines) and portable.
| 31 | +MFC conducted the largest known CFD simulation at <a href="https://arxiv.org/abs/2505.07392" target="_blank">200 trillion grid points</a> and 1 quadrillion degrees of freedom (as of September 2025).
| 32 | +MFC is a 2025 Gordon Bell Prize finalist.
32 | 33 |
|
33 | 34 | <p align="center"> |
34 | 35 | <a href="https://doi.org/10.48550/arXiv.2503.07953" target="_blank"> |
@@ -76,27 +77,32 @@ This one simulates high-Mach flow over an airfoil: |
76 | 77 | <img src="docs/res/airfoil.png" alt="Airfoil Example" width="700"/><br/> |
77 | 78 | </p> |
78 | 79 |
|
79 | | -And here is a high amplitude acoustic wave reflecting and emerging through a circular orifice: |
| 80 | +And here is a high-amplitude acoustic wave reflecting and emerging through a circular orifice: |
80 | 81 |
|
81 | 82 | <p align="center"> |
82 | 83 | <img src="docs/res/orifice.png" alt="Orifice Example" width="700"/><br/> |
83 | 84 | </p> |
84 | 85 |
|
85 | 86 |
|
86 | 87 | ## Getting started |
87 | | -For a quick start, open a GitHub Codespace to load a pre-configured Docker container to get familiar with MFC commands. Click <kbd> <> Code</kbd> (green button at top right) → <kbd>Codespaces</kbd> (right tab) → <kbd>+</kbd> (create a codespace). |
88 | 88 |
|
89 | | -****Notes:**** Codespaces is a free service with a monthly quota of compute time and storage usage. It is recommended for testing commands, troubleshooting, and running simple case files without the need to install dependencies and build MFC on your device. Remember to save any important files locally before closing your codespace. To learn more, read through [how Docker & Containers work](https://mflowcode.github.io/documentation/docker.html). |
| 89 | +For a _very_ quick start, open a GitHub Codespace to load a pre-configured Docker container and familiarize yourself with MFC commands. |
| 90 | +Click <kbd><> Code</kbd> (green button at top right) → <kbd>Codespaces</kbd> (right tab) → <kbd>+</kbd> (create a codespace).
90 | 91 |
|
91 | | -Otherwise, you can navigate [to this webpage](https://mflowcode.github.io/documentation/md_getting-started.html) to get started using MFC! |
| 92 | +> **Note:** Codespaces is a free service with a monthly quota on compute time and storage usage.
| 93 | +> It is recommended for testing commands, troubleshooting, and running simple case files without installing dependencies or building MFC on your device. |
| 94 | +> Don't conduct any critical work here! |
| 95 | +> To learn more, please see [how Docker & Containers work](https://mflowcode.github.io/documentation/docker.html). |
| 96 | +
|
| 97 | +You can navigate [to this webpage](https://mflowcode.github.io/documentation/md_getting-started.html) to get started using MFC on your local machine, cluster, or supercomputer!
92 | 98 | It's rather straightforward. |
93 | | -We'll give a brief intro. here for MacOS. |
| 99 | +We'll give a brief introduction for macOS below.
94 | 100 | Using [brew](https://brew.sh), install MFC's dependencies: |
95 | 101 | ```shell |
96 | 102 | brew install coreutils python cmake fftw hdf5 gcc boost open-mpi lapack |
97 | 103 | ``` |
98 | 104 | You're now ready to build and test MFC! |
99 | | -Put it to a convenient directory via |
| 105 | +Clone it into a local directory via
100 | 106 | ```shell |
101 | 107 | git clone https://github.com/MFlowCode/MFC |
102 | 108 | cd MFC |
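| | +# A minimal sketch of the usual next steps, assuming the ./mfc.sh helper
| | +# and its -j (parallel jobs) flag described in the MFC docs:
| | +./mfc.sh build -j 8
| | +./mfc.sh test -j 8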
@@ -126,17 +132,14 @@ You can visualize the output data in `examples/3d_shockdroplet/silo_hdf5` via Pa |
126 | 132 | ## Is this _really_ exascale? |
127 | 133 |
|
128 | 134 | [OLCF Frontier](https://www.olcf.ornl.gov/frontier/) is the first exascale supercomputer. |
129 | | -The weak scaling of MFC on this machine shows near-ideal utilization. |
| 135 | +The weak scaling of MFC on this machine shows near-ideal utilization. |
| 136 | +We also observe ideal scaling on more than 98% of LLNL El Capitan.
130 | 137 |
|
131 | 138 | <p align="center"> |
132 | 139 | <img src="docs/res/scaling.png" alt="Scaling" width="400"/> |
133 | 140 | </p> |
134 | 141 |
|
135 | | - |
136 | | -## What else can this thing do |
137 | | - |
138 | | -MFC has many features. |
139 | | -They are organized below. |
| 142 | +## What else can this thing do? |
140 | 143 |
|
141 | 144 | ### Physics |
142 | 145 |
|
@@ -212,7 +215,7 @@ They are organized below. |
212 | 215 |
|
213 | 216 | If you use MFC, consider citing it as below. |
214 | 217 | Ref. 1 describes all modern MFC features, including GPU acceleration and many new physics capabilities.
215 | | -If referencing MFC's (GPU) performance, consider citing ref. 1 and 2, which describe the solver and how it was crafted. |
| 218 | +If referencing MFC's (GPU) performance, consider citing refs. 1 and 2, which describe the solver and its design.
216 | 219 | The original open-source release of MFC is ref. 3, which should be cited for provenance as appropriate. |
217 | 220 |
|
218 | 221 | ```bibtex |
@@ -252,11 +255,11 @@ MFC is under the MIT license (see [LICENSE](LICENSE) for full text). |
252 | 255 |
|
253 | 256 | ## Acknowledgements |
254 | 257 |
|
255 | | -Federal sponsors have supported MFC development, including the US Department of Defense (DOD), the National Institutes of Health (NIH), the Department of Energy (DOE), and the National Science Foundation (NSF). |
| 258 | +Federal sponsors have supported MFC development, including the US Department of Defense (DOD), the National Institutes of Health (NIH), the Department of Energy (DOE) and National Nuclear Security Administration (NNSA), and the National Science Foundation (NSF). |
256 | 259 |
|
257 | 260 | MFC computations have used many supercomputing systems. A partial list is below:
258 | | - * OLCF Frontier and Summit, and testbeds Wombat, Crusher, and Spock (allocation CFD154, PI Bryngelson) |
259 | | - * LLNL El Capitan, Tuolumne, and Lassen; El Capitan early access system Tioga |
| 261 | + * OLCF Frontier and Summit, and testbeds Wombat, Crusher, and Spock (allocation CFD154, PI Bryngelson). |
| 262 | + * LLNL El Capitan, Tuolumne, and Lassen; El Capitan early access system Tioga. |
260 | 263 | * NCSA Delta and DeltaAI, PSC Bridges(1/2), SDSC Comet and Expanse, Purdue Anvil, TACC Stampede(1-3), and TAMU ACES via ACCESS-CI allocations from Bryngelson, Colonius, Rodriguez, and more. |
261 | | - * DOD systems Blueback, Onyx, Carpenter, Nautilus, and Narwhal via the DOD HPCMP program |
262 | | - * Sandia National Labs systems Doom and Attaway and testbed systems Weaver and Vortex |
| 264 | + * DOD systems Blueback, Onyx, Carpenter, Nautilus, and Narwhal via the DOD HPCMP program. |
| 265 | + * Sandia National Labs systems Doom and Attaway, and testbed systems Weaver and Vortex. |