**Welcome!**
MFC simulates compressible multi-phase flows, [among other things](#what-else-can-this-thing-do).
It uses metaprogramming to keep the code short (about 20K lines) and portable.
MFC conducted the largest known CFD simulation at <a href="https://arxiv.org/abs/2505.07392" target="_blank">200 trillion grid points</a> and 1 quadrillion degrees of freedom (as of September 2025).
## Getting started
For a _very_ quick start, open a GitHub Codespace to load a pre-configured Docker container and familiarize yourself with MFC commands.
Click <kbd> <> Code</kbd> (green button at top right) → <kbd>Codespaces</kbd> (right tab) → <kbd>+</kbd> (create a codespace).

> **Note:** Codespaces is a free service with a monthly quota of compute time and storage usage.
> It is recommended for testing commands, troubleshooting, and running simple case files without installing dependencies or building MFC on your device.
> Don't conduct any critical work here!
> To learn more, please see [how Docker & Containers work](https://mflowcode.github.io/documentation/docker.html).
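
Once the codespace loads, you can try MFC's command-line workflow. Below is a minimal sketch; the `./mfc.sh` driver invocations and the example case path are assumptions for illustration, so check the repository documentation for the exact commands:

```shell
# Build MFC and its dependencies (assumed ./mfc.sh driver).
./mfc.sh build -j 2

# Run the test suite to verify the build.
./mfc.sh test -j 2

# Run a bundled example case (path is an illustrative assumption).
./mfc.sh run examples/1D_sodshocktube/case.py
```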
You can navigate [to this webpage](https://mflowcode.github.io/documentation/md_getting-started.html) to get started using MFC on your local machine, cluster, or supercomputer!
It's rather straightforward.
We'll give a brief introduction for macOS below.
Using [brew](https://brew.sh), install MFC's dependencies:
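A representative sketch of that step, assuming a typical toolchain of a Fortran-capable compiler, CMake, Python, and an MPI library (the exact formula names below are assumptions, not the authoritative list from the documentation):

```shell
# Install a Fortran compiler, build tools, and MPI (illustrative package set).
brew install gcc cmake python mpich
```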
## Citation

Ref. 1 includes all modern MFC features, such as GPU acceleration and many new physics capabilities.
If referencing MFC's (GPU) performance, consider citing refs. 1 and 2, which describe the solver and its design.
The original open-source release of MFC is ref. 3, which should be cited for provenance as appropriate.
## License

MFC is under the MIT license (see [LICENSE](LICENSE) for full text).
## Acknowledgements
Federal sponsors have supported MFC development, including the US Department of Defense (DOD), the National Institutes of Health (NIH), the Department of Energy (DOE) and National Nuclear Security Administration (NNSA), and the National Science Foundation (NSF).
MFC computations have used many supercomputing systems. A partial list is below:
* OLCF Frontier and Summit, and testbeds Wombat, Crusher, and Spock (allocation CFD154, PI Bryngelson).
* LLNL El Capitan, Tuolumne, and Lassen; El Capitan early access system Tioga.
* NCSA Delta and DeltaAI, PSC Bridges(1/2), SDSC Comet and Expanse, Purdue Anvil, TACC Stampede(1-3), and TAMU ACES via ACCESS-CI allocations from Bryngelson, Colonius, Rodriguez, and more.
* DOD systems Blueback, Onyx, Carpenter, Nautilus, and Narwhal via the DOD HPCMP program.
* Sandia National Labs systems Doom and Attaway, and testbed systems Weaver and Vortex.