
Commit 0651f30

update daint-gpu configuration
1 parent b10e62a commit 0651f30

1 file changed: +7 −36 lines changed

configs/cscs/daint/gpu/craype_config

Lines changed: 7 additions & 36 deletions
@@ -16,14 +16,11 @@ module list
 
 
 # Environment variables for HPC key packages that require system libraries (MPI.jl, CUDA.jl, HDF5.jl and ADIOS2.jl)
-export JUHPC_CUDA_HOME=$CUDA_HOME                            # Used for CUDA.jl runtime discovery (set as CUDA_HOME in the activate script).
-export JUHPC_CUDA_RUNTIME_VERSION=$CRAY_CUDATOOLKIT_VERSION  # Used for CUDA.jl runtime version definition (set in preferences).
-export JUHPC_ROCM_HOME=                                      # Used for AMDGPU.jl runtime discovery (set as ROCM_PATH in the activate script).
-export JUHPC_MPI_HOME=$MPICH_DIR                             # Used for MPI.jl system binary discovery (set in preferences).
-export JUHPC_MPI_VENDOR=                                     # Used for MPI.jl system binary discovery (used to set preferences).
-export JUHPC_MPI_EXEC="srun -C gpu"                          # Used for MPI.jl exec command discovery (set in preferences). Arguments are space separated, e.g. "srun -C gpu".
-export JUHPC_HDF5_HOME=$HDF5_DIR                             # Used for HDF5.jl library discovery (set in preferences).
-export JUHPC_ADIOS2_HOME=                                    # Used for ADIOS2.jl library discovery (set as JULIA_ADIOS2_PATH in the activate script).
+export JUHPC_CUDA_HOME=$CUDA_HOME
+export JUHPC_CUDA_RUNTIME_VERSION=$CRAY_CUDATOOLKIT_VERSION
+export JUHPC_MPI_HOME=$MPICH_DIR
+export JUHPC_MPI_EXEC="srun -C gpu"
+export JUHPC_HDF5_HOME=$HDF5_DIR
 
 
 # Create site-specific post-install script (currently MPIPreferences does not provide an option to set required preloads if not automatically detected; JUHPC_MPI_VENDOR fails on Piz Daint...)
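The deleted inline comments document where each JUHPC_* variable ends up. As a hypothetical sketch inferred from those removed comments (the activate script is generated by JUHPC and is not part of this commit):

# Hypothetical sketch of the mapping described in the removed comments:
export CUDA_HOME="$JUHPC_CUDA_HOME"            # CUDA.jl runtime discovery (activate script)
export ROCM_PATH="$JUHPC_ROCM_HOME"            # AMDGPU.jl runtime discovery (unset on daint-gpu)
export JULIA_ADIOS2_PATH="$JUHPC_ADIOS2_HOME"  # ADIOS2.jl library discovery (unset on daint-gpu)
# JUHPC_CUDA_RUNTIME_VERSION, JUHPC_MPI_HOME, JUHPC_MPI_EXEC and JUHPC_HDF5_HOME are
# written into Julia preferences (CUDA.jl, MPI.jl, HDF5.jl) rather than exported.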
@@ -39,32 +36,6 @@ echo 'using Preferences
 # Call JUHPC
 git clone https://github.com/omlins/JUHPC
 JUHPC=./JUHPC/src/juhpc
-JUHPC_SETUP_INSTALLDIR=$SCRATCH/../julia/${HOSTNAME%%[0-9]*}-gpu/juhpc_setup     # HPC setup installation environment variables must be expanded during installation.
-JULIAUP_INSTALLDIR="\$SCRATCH/../julia/\$USER/\${HOSTNAME%%[0-9]*}-gpu/juliaup"  # User environment variables SCRATCH and HOSTNAME must not be expanded during HPC setup installation, but during usage. Separate installation by HOSTNAME is required, because different hosts with different architectures can share the same file system (e.g., daint and eiger on ALPS).
+JUHPC_SETUP_INSTALLDIR=$SCRATCH/../julia/${HOSTNAME%%[0-9]*}-gpu/juhpc_setup
+JULIAUP_INSTALLDIR="\$SCRATCH/../julia/\$USER/\${HOSTNAME%%[0-9]*}-gpu/juliaup"
 bash -l $JUHPC $JUHPC_SETUP_INSTALLDIR $JULIAUP_INSTALLDIR $JUHPC_POST_INSTALL_JL
-
-
-# Activate the HPC setup environment variables
-. $JUHPC_SETUP_INSTALLDIR/activate
-
-# Call juliaup to install juliaup and latest julia on scratch
-juliaup
-
-# Call juliaup to see its options
-juliaup
-
-# Call julia Pkg
-julia -e 'using Pkg; Pkg.status()'
-
-# Add CUDA.jl
-julia -e 'using Pkg; Pkg.add("CUDA"); using CUDA; CUDA.versioninfo()'
-
-# Add MPI.jl
-julia -e 'using Pkg; Pkg.add("MPI"); using MPI; MPI.versioninfo()'
-
-# Add HDF5.jl
-julia -e 'using Pkg; Pkg.add("HDF5"); using HDF5; @show HDF5.has_parallel()'
-
-# Test CUDA-aware MPI
-cd ~/cudaaware
-MPICH_GPU_SUPPORT_ENABLED=1 srun -Acsstaff -C'gpu' -N2 -n2 julia cudaaware.jl
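The escaping in the removed comments is the point of the two install-dir lines: unescaped variables are expanded once, when this config runs, while escaped ones are written out literally and expand per user at usage time. A minimal sketch of the difference (illustrative file names, not from this commit):

# Unescaped: expanded now, once, for the installing user.
echo "dir=$SCRATCH/../julia" > now.txt     # now.txt contains the resolved path
# Escaped: the variable reference itself is written out, expanded later by each user.
echo "dir=\$SCRATCH/../julia" > later.txt  # later.txt contains the literal string $SCRATCH/../julia

The interactive verification steps removed by this commit can still be run by hand once the setup completes; a sketch assembled from the deleted lines (the ~/cudaaware test script and the csstaff account are site-specific):

. $JUHPC_SETUP_INSTALLDIR/activate    # activate the HPC setup environment variables
juliaup                               # first call installs juliaup and latest julia on scratch
julia -e 'using Pkg; Pkg.add("CUDA"); using CUDA; CUDA.versioninfo()'
julia -e 'using Pkg; Pkg.add("MPI"); using MPI; MPI.versioninfo()'
julia -e 'using Pkg; Pkg.add("HDF5"); using HDF5; @show HDF5.has_parallel()'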
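The CUDA-aware MPI test from the deleted lines runs across two nodes and assumes a cudaaware.jl script in ~/cudaaware:

cd ~/cudaaware
MPICH_GPU_SUPPORT_ENABLED=1 srun -Acsstaff -C'gpu' -N2 -n2 julia cudaaware.jl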
