
Commit 7e7e721

Add documentation for Julia uenv and Julia usage in JupyterHub (#120)
1 parent eba4cae commit 7e7e721


4 files changed, +121 -1 lines changed


docs/services/jupyterlab.md

Lines changed: 23 additions & 1 deletion
@@ -45,7 +45,7 @@ A kernel can be created from an active Python virtual environment with the follo
python -m ipykernel install --user --name="<kernel-name>" --display-name="<kernel-name>"
```

-## Using uenvs in JupyterLab
+## Using uenvs in JupyterLab for Python

In the JupyterHub Spawner Options form mentioned above, it's possible to pass a uenv and a view.
The uenv will be mounted at `/user-environment`, and the specified view will be activated.
@@ -65,6 +65,28 @@ Then with that virtual environment activated, you can run the command to create
If the uenv is not present in the local repository, it will be automatically fetched.
As a result, JupyterLab may take slightly longer than usual to start.

## Using Julia in JupyterHub

Each time you start a JupyterHub server, you need to do the following in the JupyterHub Spawner Options form mentioned above:
!!! important "Pass a [`julia`][ref-uenv-julia] uenv and the view `jupyter`."

The first time you use Julia within Jupyter, IJulia and one or more Julia kernels need to be installed.
Type the following command in a shell within JupyterHub to install IJulia, the default Julia kernel, and, on systems with Nvidia GPUs, a Julia kernel running under Nvidia Nsight Systems:
```console
install_ijulia
```

You can install additional custom Julia kernels by typing the following in a shell:
```console
julia                 # start the Julia REPL from the shell
using IJulia
installkernel(<args>) # type `? installkernel` to learn about valid `<args>`
```
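
For instance, a custom kernel that starts Julia with several threads could be installed like this (an illustrative sketch; the kernel name and thread count are placeholders, `installkernel` is the IJulia function referred to above):

```julia
using IJulia

# register a kernel spec named "Julia (4 threads)" whose kernel process
# is started with JULIA_NUM_THREADS set to 4
installkernel("Julia (4 threads)", env=Dict("JULIA_NUM_THREADS" => "4"))
```
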
!!! warning "First time use of Julia"
    If you are using Julia for the first time at all, executing `install_ijulia` will first trigger the installation of `juliaup` and the latest `julia` version automatically (this also happens if you execute `juliaup` or `julia`).

## Ending your interactive session and logging out

The Jupyter servers can be shut down through the Hub. To end a JupyterLab session, please select `Control Panel` under the `File` menu and then `Stop My Server`. By contrast, clicking `Logout` will log you out of the server, but the server will continue to run until the Slurm job reaches its maximum wall time.

docs/software/prgenv/index.md

Lines changed: 4 additions & 0 deletions
@@ -18,6 +18,10 @@ CSCS provides "programming environments" on Alps vClusters that provide compiler

Provides compilers, MPI and Python, along with linear algebra and mesh partitioning libraries for a broad range of use cases.

- :fontawesome-solid-layer-group: [__julia__][ref-uenv-julia]

Provides a complete HPC setup for running Julia efficiently at scale, making optimal use of the supercomputer hardware.

- :fontawesome-solid-layer-group: [__Cray Programming Environment__][ref-cpe]

The Cray Programming Environment (CPE) is a suite of compilers, libraries and tools provided by HPE.

docs/software/prgenv/julia.md

Lines changed: 93 additions & 0 deletions
@@ -0,0 +1,93 @@
[](){#ref-uenv-julia}
# julia

The `julia` uenv provides a complete HPC setup for running Julia efficiently at scale, making optimal use of the supercomputer hardware.
Unlike traditional approaches, this Julia HPC setup enables you to update Julia yourself using the included, preconfigured community tool [`juliaup`](https://github.com/JuliaLang/juliaup).
It also does not preinstall any packages site-wide. Instead, for key HPC packages that benefit from using locally built libraries (`MPI.jl`, `CUDA.jl`, `AMDGPU.jl`, `HDF5.jl`, `ADIOS2.jl`, etc.), the uenv provides the libraries and presets package preferences and environment variables so that these packages are automatically installed and used optimally with the local libraries.
As a result, you only need to type, e.g., `] add CUDA` in the Julia REPL to install `CUDA.jl` optimally.
The `julia` uenv internally relies on the community scripting project [JUHPC](https://github.com/JuliaParallel/JUHPC) to achieve this.
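
For example, installing and then checking `CUDA.jl` from within Julia might look like the following (a minimal sketch; the exact output depends on the system and uenv version):

```julia
using Pkg
Pkg.add("CUDA")      # equivalent to typing `] add CUDA` in the Julia REPL
using CUDA
CUDA.versioninfo()   # expected to report the CUDA installation provided by the uenv
```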

## Versioning

The naming scheme is `julia/<version>`, where `<version>` has the `YY.M[M]` format; for example, September 2024 is `24.9` and May 2025 is `25.5`.
The release schedule is not fixed; new versions will be released when there is a compelling reason to update.

| version | node types  | system |
|---------|-------------|--------|
| 24.9    | gh200, zen2 | daint, eiger, todi |
| 25.5    | gh200, zen2 | daint, eiger, santis, clariden, bristen |

=== "25.5"

    The key updates in version `25.5:v1` relative to version `24.9` were:

    * enabling compatibility with the latest `uenv` version `8.0`
    * changing the installation directory
    * adding the `jupyter` view
    * upgrading to `[email protected]` and `[email protected]`

    !!! info "HPC key libraries included"
        * cray-mpich/8.1.30
        * cuda/12.8.0
        * hdf5/1.14.5
        * adios2/2.10.2

## How to use

Find and pull a Julia uenv image, e.g.:
```bash
uenv image find julia # list available julia images
uenv image pull julia/25.5 # copy version[:tag] from the list above
```

Start the image and activate the Julia[up] HPC setup by loading the following view(s):

=== "`juliaup`"
    !!! example ""
        ```bash
        uenv start julia/25.5:v1 --view=juliaup
        ```

=== "`juliaup` and `modules`"
    !!! example "This also activates modules for the available libraries, e.g., `cuda`."
        ```bash
        uenv start julia/25.5:v1 --view=juliaup,modules
        ```

There is also a view `jupyter` available, which is required for [using Julia in JupyterHub][using-julia-in-jupyterhub].

!!! info "Automatic installation of Juliaup and Julia"
    The installation of `juliaup` and the latest `julia` version happens automatically the first time `juliaup` is called:
    ```bash
    juliaup
    ```

Note that the `julia` uenv is built by extending the `prgenv-gnu` uenv.
As a result, it also provides all the features of `prgenv-gnu`.
Please see [the `prgenv-gnu` documentation][ref-uenv-prgenv-gnu-how-to-use] for details.
You can, for example, load the `modules` view to see the exact versions of the libraries available in the uenv.
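
As a quick sanity check of the setup, a small script along these lines could be run inside the started uenv (a minimal sketch; it assumes the `juliaup` view is loaded and that `MPI.jl` and `CUDA.jl` have been added as described above, and it would typically be launched with `srun` on GPU nodes):

```julia
# check that MPI.jl and CUDA.jl work with the libraries provided by the uenv
using MPI, CUDA

MPI.Init()
rank  = MPI.Comm_rank(MPI.COMM_WORLD)
nproc = MPI.Comm_size(MPI.COMM_WORLD)
println("rank $rank of $nproc: CUDA functional = $(CUDA.functional())")
MPI.Finalize()
```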

## Background on Julia for HPC

[Julia](https://julialang.org/) is a programming language that was designed to solve the "two-language problem": prototypes written in an interactive high-level language like MATLAB, R or Python often need to be partly or fully rewritten in lower-level languages like C, C++ or Fortran when high-performance production code is required.
Julia, which has its origins at MIT, can nevertheless reach the performance of C, C++ or Fortran despite being high-level and interactive.
This is possible thanks to Julia's just-ahead-of-time compilation: code can be executed in an interactive shell, as usual for prototyping languages, but functions and code blocks are compiled to machine code right before their first execution instead of being interpreted (note that modules are pre-compiled).

Julia is ideally suited for parallel computing, supporting, e.g., MPI (via [`MPI.jl`](https://github.com/JuliaParallel/MPI.jl)) and multithreading similar to OpenMP.
Moreover, Julia's GPU packages ([`CUDA.jl`](https://github.com/JuliaGPU/CUDA.jl), [`AMDGPU.jl`](https://github.com/JuliaGPU/AMDGPU.jl), etc.) enable writing native Julia code for GPUs [1], which can reach efficiency similar to CUDA C/C++ [2] or the equivalent for other vendors.
Julia was shown to be suitable for scientific GPU supercomputing at large scale, enabling near-optimal performance and nearly ideal scaling on thousands of GPUs on Piz Daint [2,3,4,5].
Packages like [ParallelStencil.jl](https://github.com/omlins/ParallelStencil.jl) [[4](https://doi.org/10.21105/jcon.00138)] and [ImplicitGlobalGrid.jl](https://github.com/eth-cscs/ImplicitGlobalGrid.jl) [[3](https://doi.org/10.21105/jcon.00137)] make it possible to unify prototype and high-performance production code in a single codebase.
Furthermore, Julia permits direct calling of C/C++ and Fortran libraries without glue code.
It also features interfaces to prototyping languages such as Python, R and MATLAB.
Finally, the [Julia PackageCompiler](https://github.com/JuliaLang/PackageCompiler.jl) makes it possible to compile Julia modules into shared libraries that are callable from C or other languages (a comprehensive [proof of concept](https://github.com/omlins/libdiffusion) was already available in 2018, and the PackageCompiler has matured considerably since).
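
To give a flavour of what such native Julia GPU code looks like, here is a minimal, illustrative sketch using `CUDA.jl` (the kernel and array sizes are placeholders, not part of the uenv documentation):

```julia
using CUDA

# element-wise y .+= a .* x, written as a native Julia GPU kernel
function axpy_kernel!(y, a, x)
    i = (blockIdx().x - 1) * blockDim().x + threadIdx().x
    if i <= length(y)
        @inbounds y[i] += a * x[i]
    end
    return nothing
end

x = CUDA.rand(Float32, 2^20)
y = CUDA.rand(Float32, 2^20)
threads = 256
blocks  = cld(length(y), threads)
@cuda threads=threads blocks=blocks axpy_kernel!(y, 2.0f0, x)
```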

## References

[1] Besard, T., Foket, C., & De Sutter, B. (2018). Effective Extensible Programming: Unleashing Julia on GPUs. IEEE Transactions on Parallel and Distributed Systems, 30(4), 827-841.

[2] Räss, L., Omlin, S., & Podladchikov, Y. Y. (2019). Porting a Massively Parallel Multi-GPU Application to Julia: a 3-D Nonlinear Multi-Physics Flow Solver. JuliaCon Conference, Baltimore, US.

[3] Omlin, S., Räss, L., & Utkin, I. (2024). Distributed Parallelization of xPU Stencil Computations in Julia. The Proceedings of the JuliaCon Conferences, 6(65), 137. https://doi.org/10.21105/jcon.00137

[4] Omlin, S., & Räss, L. (2024). High-performance xPU Stencil Computations in Julia. The Proceedings of the JuliaCon Conferences, 6(64), 138. https://doi.org/10.21105/jcon.00138

[5] Omlin, S., Räss, L., Kwasniewski, G., Malvoisin, B., & Podladchikov, Y. Y. (2020). Solving Nonlinear Multi-Physics on GPU Supercomputers with Julia. JuliaCon Conference, virtual.

mkdocs.yml

Lines changed: 1 addition & 0 deletions
@@ -58,6 +58,7 @@ nav:
- 'prgenv-gnu': software/prgenv/prgenv-gnu.md
- 'prgenv-nvfortran': software/prgenv/prgenv-nvfortran.md
- 'linalg': software/prgenv/linalg.md
- 'julia': software/prgenv/julia.md
- 'Cray modules (CPE)': software/prgenv/cpe.md
- 'Machine Learning':
    - software/ml/index.md
