# toolchain/README.md
… and give setup files that you can use to compile ABACUS.

- [ ] A better README and a more detailed markdown file.
- [ ] Automatic installation of [DEEPMD](https://github.com/deepmodeling/deepmd-kit).
- [ ] A better compilation method for ABACUS-DEEPMD and ABACUS-DEEPKS.
- [ ] A better `setup` and toolchain code structure.
- [ ] Modulefile generation scripts.
- [ ] Support for the AMD compiler and math library (`AOCC` and `AOCL`).
There are also well-modified scripts to run *install_abacus_toolchain.sh* for `gnu` …

```shell
> ./toolchain_intel-mpich.sh
```
It is recommended to run one of them first to get a fast installation of ABACUS under certain environments.

If you have a fresh environment and you have `sudo` permission, you can use *install_requirements.sh* to install the system libraries and dependencies needed by the toolchain.
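As a minimal sketch of that workflow on a fresh machine (the two script names are the ones shipped in this toolchain directory; whether *install_requirements.sh* must be invoked with `sudo` directly depends on your system, so treat the exact commands as an assumption):

```shell
# install the system libraries and build dependencies first (fresh machine, sudo available)
> sudo ./install_requirements.sh
# then run one of the pre-configured toolchain scripts, e.g. the Intel + MPICH one
> ./toolchain_intel-mpich.sh
```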
If you are using an Intel environment via Intel-oneAPI, please note:

1. After version 2024.0, the Intel classic compilers `icc` and `icpc` are no longer shipped, and the same holds for `ifort` after version 2025.0. The Intel MPI compiler wrappers have also been updated to `mpiicx`, `mpiicpx` and `mpiifx`.
2. The toolchain will detect `icx`, `icpx`, `ifx`, `mpiicx`, `mpiicpx` and `mpiifx` as the default compilers.
3. Users can manually specify `--with-intel-classic=yes` in `toolchain*.sh` to use the Intel classic compilers, or `--with-intelmpi-classic=yes` to use the classic Intel MPI wrappers while keeping the CC, CXX and F90 compilers at the new versions (see the sketch after this list).
4. Users can manually specify `--with-ifx=no` in `toolchain*.sh` to use `ifort` while keeping the other compilers at the new versions.
5. More information is given in the later part of this README.
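A hedged illustration of points 3 and 4 (the flags are taken from the list above; `toolchain_intel-mpich.sh` is only used as the example script here):

```shell
# use the Intel classic compilers (icc/icpc/ifort) throughout
> ./toolchain_intel-mpich.sh --with-intel-classic=yes
# keep the new icx/icpx for C/C++ but build Fortran code with ifort instead of ifx
> ./toolchain_intel-mpich.sh --with-ifx=no
```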
**Notice: Your GCC version should be no lower than 5; 7.3.0 or newer is recommended.**
**Notice: You SHOULD `source` or `module load` the related environments before using the toolchain for installation, especially for `gcc` or `intel-oneAPI`, e.g. `module load mkl mpi icc compiler`.**

Notice: You CANNOT use the `icpx` compiler for the GPU version of ABACUS for now; see the discussions in [#2906](https://github.com/deepmodeling/abacus-develop/issues/2906) and [#4976](https://github.com/deepmodeling/abacus-develop/issues/4976).
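For example (a sketch only: the module names are the ones quoted above and depend on your site, and the `setvars.sh` path is the default oneAPI install location, which this README does not prescribe):

```shell
# load the Intel-oneAPI environment via environment modules ...
> module load mkl mpi icc compiler
# ... or source it directly from a standard oneAPI installation
> source /opt/intel/oneapi/setvars.sh
```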
If you want to use ABACUS GPU-LCAO via `cusolvermp` or `elpa`, please contact the corresponding developer; the toolchain does not fully support them yet.
### Shell problem
When you encounter a problem like `GLIBCXX_3.4.29 not found`, it is sure that your GCC (and its `libstdc++`) is too old.
In my tests, you need `gcc` > 11.3.1 to enable the deepmd feature in ABACUS.
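To check whether your compiler is new enough, standard GCC/binutils commands are sufficient (these are generic checks, not something specific to this toolchain):

```shell
# show the GCC version and the GLIBCXX symbol versions its libstdc++ provides
> gcc --version
> strings "$(gcc -print-file-name=libstdc++.so.6)" | grep GLIBCXX | sort -V | tail -n 3
```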
### Intel-oneAPI problems

#### ELPA problem via the Intel-oneAPI toolchain on an AMD server
The default compilers for Intel-oneAPI are `icpx` and `icx`, which can cause problems when compiling ELPA on an AMD server. (This behaviour still needs further investigation.)
The simplest fix is to change `icpx` to `icpc` and `icx` to `icc`; users can do this in `toolchain*.sh` via `--with-intel-classic=yes`.
Notice: `icc` and `icpc` from the Intel Classic Compiler are no longer supported in Intel-oneAPI 2024.0 and newer versions, while Intel-oneAPI 2023.2.0 can still be found on the website. See the discussion in [#4976](https://github.com/deepmodeling/abacus-develop/issues/4976).
#### Link problem in early 2023 versions of oneAPI
Sometimes Intel-oneAPI has problems linking `mpirun`; this always shows up with the 2023.2.0 version of MPI in Intel-oneAPI.
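When this happens, a quick first check is to confirm which `mpirun` is actually picked up and whether the Intel MPI environment is loaded (generic diagnostics, assumed rather than taken from this README):

```shell
# verify that mpirun resolves to the Intel MPI you expect
> which mpirun
> mpirun --version
# I_MPI_ROOT should point to your oneAPI MPI installation
> echo "$I_MPI_ROOT"
```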