
Conversation

@Meinersbur
Member

@Meinersbur Meinersbur commented Apr 29, 2025

Move building the .mod files from the openmp and flang projects into the openmp and flang-rt runtimes, using a shared mechanism. Motivations to do so are:

  1. Most modules are target-dependent and need to be re-compiled for each target separately, which is something the LLVM_ENABLE_RUNTIMES system already does. A prime example is iso_c_binding.mod, which encodes the target's ABI. Most other modules contain #ifdef-enclosed code as well.

  2. CMake has support for Fortran that we should use. Among other things, it automatically determines module dependencies, so there is no need to hardcode them in CMakeLists.txt (see the CMake sketch after this list).

  3. It allows using Fortran itself to implement Flang-RT. Currently, only iso_fortran_env_impl.f90 emits object files that are needed by Fortran applications (#89403: "[flang] Linker for non-constant accesses to kind arrays (integer_kind, logical_kind, real_kind)"). The workaround of #95388 ("[flang][runtime] Build ISO_FORTRAN_ENV to export kind arrays as linkable symbols") could be reverted.
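
To illustrate point 2, here is a minimal, self-contained CMake sketch of the automatic module dependency handling (file and target names are made up for illustration, not taken from flang-rt):

```cmake
# Minimal sketch of CMake's built-in Fortran module dependency scanning.
# File and target names here are hypothetical, not taken from flang-rt.
cmake_minimum_required(VERSION 3.20)
project(mod_deps_demo Fortran)

# kinds.f90 defines a module, user.f90 contains "use kinds". CMake scans the
# sources, discovers the dependency, and compiles kinds.f90 first; nothing has
# to be hardcoded in this CMakeLists.txt.
add_library(demo_modules OBJECT kinds.f90 user.f90)

# Collect the generated .mod files in one predictable directory.
set_target_properties(demo_modules PROPERTIES
  Fortran_MODULE_DIRECTORY "${CMAKE_BINARY_DIR}/finclude")
```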

Some new dependencies come into play:

  • openmp depends on flang-rt for building omp_lib.mod and omp_lib_kinds.mod. Currently, if flang-rt is not found, the modules are not built.
  • check-flang depends on flang-rt: If flang-rt is not found, the majority of tests are disabled. When not building in a bootstrapping build, the location of the module files can be pointed to using -DFLANG_INTRINSIC_MODULES_DIR=<path>, e.g. in a flang-standalone build (see the sketch after this list). Alternatively, tests needing any of the intrinsic modules could be marked with REQUIRES: flangrt-modules.
  • check-flang depends on openmp: Not a change; tests requiring omp_lib.mod and omp_lib_kinds.mod are already marked with openmp_runtime.
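
A hedged sketch of the flang-standalone case mentioned above, written as a CMake cache fragment (the install prefix, version, and triple are illustrative; the directory layout follows the finclude scheme described below):

```cmake
# Illustrative only: point a flang standalone build (cmake -S <llvm-project>/flang)
# at intrinsic .mod files that were built elsewhere. Equivalent to passing
# -DFLANG_INTRINSIC_MODULES_DIR=<path> on the cmake command line.
set(FLANG_INTRINSIC_MODULES_DIR
    "/opt/llvm/lib/clang/22/finclude/flang/x86_64-unknown-linux-gnu"
    CACHE PATH "Prebuilt intrinsic .mod files used by check-flang")
```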

As the intrinsic modules are now specific to the target, their location is moved from include/flang to <resource-dir>/finclude/flang/<triple>. The mechanism to compute the location has been moved from flang-rt (previously used to compute the location of libflang_rt.*.a) to common locations in cmake/GetToolchainDirs.cmake and runtimes/CMakeLists.txt so it can be used by both openmp and flang-rt. Potentially the mechanism could also be shared by other libraries such as compiler-rt.

finclude was chosen because gfortran uses it as well; it also avoids misuse such as #include <flang/iso_c_binding.mod>. The search location is now determined by the ToolChain in the driver, instead of by the frontend. The driver now adds -fintrinsic-module-path for that location to the frontend call (just like gfortran does). -fintrinsic-module-path had to be fixed for this because, ironically, it was only added to searchDirectories, but not to intrinsicModuleDirectories_. Since the driver determines the location, tests invoking flang -fc1 and bbc must also be passed the location by llvm-lit. This works like llvm-lit finding the include dirs for Clang using -print-file-name=....
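
As a rough sketch only (this is not the actual content of cmake/GetToolchainDirs.cmake; the variable names are borrowed from elsewhere in this thread and from the runtimes build logs), the per-triple location amounts to something like:

```cmake
# Rough sketch of the per-triple module directory inside the Clang resource
# directory; not the actual GetToolchainDirs.cmake code.
set(FLANG_INTRINSIC_MODULES_OUTPUT_DIR
    "${LLVM_BINARY_DIR}/lib/clang/${LLVM_VERSION_MAJOR}/finclude/flang/${LLVM_RUNTIMES_TARGET}")
```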

Related PRs:

Contributor

@jhuber6 jhuber6 left a comment


What's the main limitation here? If this is just a file dependency, it should be identical to how all the OpenMP tests depend on omp.h being in the resource directory. IMHO this is trivial if we do a runtimes build, since we can just require that openmp;flang-rt are in the same toolchain, which then gives us well-defined access to openmp's CMake targets as long as it's listed before flang-rt.

@Meinersbur
Member Author

While I appreciate the review, the PR is not yet in a state that warrants one. It is still at an experimentation stage, so I have not yet cared about formatting. There are also a lot of changes in here that will eventually not be needed.

Goals are:

  1. Currently, module files are expected at $prefix/include/flang/*.mod, where prefix is the parent of the bin directory where flang is located. It should be in $prefix/lib/clang/finclude/<triple>/*.h, i.e. the resource directory, since mod-files are specific to the version of flang, and distinct for each target triple since the mod files can differ per target. This is necessary for cross-compilation. In addition to the CMake code, the flang driver code needs to change as well because it hardcodes $path/../include/flang.

  2. Use CMake to build the module files within the flang-rt/ runtime. The LLVM_ENABLE_RUNTIMES system handles which target triples to build and ensures Flang is available. CMake should take care of the build dependencies. The driver has to change again so we can tell it where to emit the module files.

  3. Use the same mechanism as above to build omp_lib.mod and omp_lib_kinds.mod, but in the openmp/ runtime. Since both live in the same CMake build directory, CMake already ensures the dependencies.

  4. This means flang-rt (and openmp) must be compiled before running the flang tests that require those module files. Flang's OpenMP tests already require openmp's modules to be compiled; it will be no different for flang-rt's builtin modules.

Sounds relatively simple, but there have been many small issues, starting with CMake's misspelling of CMAKE_Fortran_BUILDING_INSTRINSIC_MODULES.
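
A hedged sketch of what goal 2 looks like in CMake terms (names and the file list are illustrative, not the actual flang-rt code); the oddly spelled variable is the one mentioned above and really is spelled that way by CMake:

```cmake
# Illustrative sketch, not the actual flang-rt CMake code.
# Variable provided by CMake for builds that compile the Fortran intrinsic
# modules themselves (note CMake's own misspelling mentioned above).
set(CMAKE_Fortran_BUILDING_INSTRINSIC_MODULES TRUE)

# Compile the builtin module sources; CMake derives the compilation order from
# the USE statements in the sources.
add_library(flangrt_builtin_modules OBJECT
  iso_fortran_env.f90
  iso_fortran_env_impl.f90)

# Emit the .mod files into a per-triple directory (path is illustrative).
set_target_properties(flangrt_builtin_modules PROPERTIES
  Fortran_MODULE_DIRECTORY
    "${CMAKE_BINARY_DIR}/finclude/flang/${LLVM_RUNTIMES_TARGET}")
```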

@DanielCChen
Contributor

DanielCChen commented May 7, 2025

  1. It should be in $prefix/lib/clang/finclude/<triple>/*.h

Just want to make sure: Should it be $prefix/lib/clang/${LLVM_VERSION_MAJOR}/finclude/<triple>/*.mod?

@Meinersbur
Member Author

Just want to make sure: Should it be $prefix/lib/clang/${LLVM_VERSION_MAJOR}/finclude/<triple>/*.mod?

That is correct, I forgot the version number that is part of the resource directory.

@Meinersbur Meinersbur force-pushed the flang_builtin-mods branch from 907d3d5 to 839198d on July 17, 2025 13:04
@github-actions

github-actions bot commented Jul 17, 2025

✅ With the latest revision this PR passed the Python code formatter.

@Meinersbur
Member Author

What's the main limitation here? If this is just a file dependency it should be identical to how all the OpenMP tests depend on omp.h being in the resource directory.

omp.h is created by configure_file at configure time. No dependency other than runtimes-configure is needed.
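
For reference, a simplified sketch of that configure_file pattern (paths are illustrative):

```cmake
# Sketch of the configure-time generation described above: a template with
# @VAR@ placeholders is expanded into omp.h while CMake configures, so no
# extra build-time target dependency is involved. Paths are illustrative.
configure_file(
  "${CMAKE_CURRENT_SOURCE_DIR}/include/omp.h.var"
  "${CMAKE_CURRENT_BINARY_DIR}/include/omp.h"
  @ONLY)
```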

IMHO this is trivial if we do a runtimes build, since we can just require that openmp;flang-rt are in the same toolchain,

By toolchain, do you mean a bootstrapping build with LLVM_ENABLE_RUNTIMES=openmp;flang-rt? Don't forget the users of a Flang-standalone build (cmake -S <llvm-project>/flang).

which then gives us well defined access to openmp's CMake targets so long as it's listed before flang-rt.

check-flang (LLVM_ENABLE_PROJECTS=flang) needs access to omp_lib.mod (LLVM_ENABLE_RUNTIMES=openmp) as well as the flang intrinsic modules in order to work.

@Meinersbur Meinersbur requested a review from kkwli July 17, 2025 13:25
@Meinersbur Meinersbur marked this pull request as ready for review July 19, 2025 12:32
@Meinersbur Meinersbur requested a review from a team as a code owner July 19, 2025 12:32
@sscalpone sscalpone requested review from klausler and vzakhari July 20, 2025 09:25
@Meinersbur Meinersbur merged commit 86fbaef into llvm:main Nov 25, 2025
68 of 70 checks passed
@llvm-ci
Collaborator

llvm-ci commented Nov 25, 2025

LLVM Buildbot has detected a new failure on builder amdgpu-offload-ubuntu-22-cmake-build-only running on rocm-docker-ubu-22 while building clang,cmake,flang-rt,flang,llvm,openmp,runtimes at step 4 "annotate".

Full details are available at: https://lab.llvm.org/buildbot/#/builders/203/builds/30105

Here is the relevant piece of the build log for the reference
Step 4 (annotate) failure: '../llvm-zorg/zorg/buildbot/builders/annotated/amdgpu-offload-cmake.py --jobs=32' (failure)
...
  Failed to generate test project build system.
Call Stack (most recent call first):
  /usr/share/cmake-3.22/Modules/CheckFortranSourceCompiles.cmake:97 (cmake_check_source_compiles)
  CMakeLists.txt:440 (check_fortran_source_compiles)
  /home/botworker/bbot/amdgpu-offload-ubuntu-22-cmake-build-only/llvm-project/openmp/CMakeLists.txt:118 (flang_module_fortran_enable)


-- Configuring incomplete, errors occurred!
See also "/home/botworker/bbot/amdgpu-offload-ubuntu-22-cmake-build-only/build/runtimes/runtimes-amdgcn-amd-amdhsa-bins/CMakeFiles/CMakeOutput.log".
See also "/home/botworker/bbot/amdgpu-offload-ubuntu-22-cmake-build-only/build/runtimes/runtimes-amdgcn-amd-amdhsa-bins/CMakeFiles/CMakeError.log".
FAILED: runtimes/runtimes-amdgcn-amd-amdhsa-stamps/runtimes-amdgcn-amd-amdhsa-configure /home/botworker/bbot/amdgpu-offload-ubuntu-22-cmake-build-only/build/runtimes/runtimes-amdgcn-amd-amdhsa-stamps/runtimes-amdgcn-amd-amdhsa-configure 
cd /home/botworker/bbot/amdgpu-offload-ubuntu-22-cmake-build-only/build/runtimes/runtimes-amdgcn-amd-amdhsa-bins && /usr/bin/cmake --no-warn-unused-cli -DCMAKE_C_COMPILER=/home/botworker/bbot/amdgpu-offload-ubuntu-22-cmake-build-only/build/./bin/clang -DCMAKE_CXX_COMPILER=/home/botworker/bbot/amdgpu-offload-ubuntu-22-cmake-build-only/build/./bin/clang++ -DCMAKE_ASM_COMPILER=/home/botworker/bbot/amdgpu-offload-ubuntu-22-cmake-build-only/build/./bin/clang -DCMAKE_Fortran_COMPILER=/home/botworker/bbot/amdgpu-offload-ubuntu-22-cmake-build-only/build/./bin/flang -DCMAKE_LINKER=/home/botworker/bbot/amdgpu-offload-ubuntu-22-cmake-build-only/build/./bin/ld.lld -DCMAKE_AR=/home/botworker/bbot/amdgpu-offload-ubuntu-22-cmake-build-only/build/./bin/llvm-ar -DCMAKE_RANLIB=/home/botworker/bbot/amdgpu-offload-ubuntu-22-cmake-build-only/build/./bin/llvm-ranlib -DCMAKE_NM=/home/botworker/bbot/amdgpu-offload-ubuntu-22-cmake-build-only/build/./bin/llvm-nm -DCMAKE_OBJDUMP=/home/botworker/bbot/amdgpu-offload-ubuntu-22-cmake-build-only/build/./bin/llvm-objdump -DCMAKE_OBJCOPY=/home/botworker/bbot/amdgpu-offload-ubuntu-22-cmake-build-only/build/./bin/llvm-objcopy -DCMAKE_STRIP=/home/botworker/bbot/amdgpu-offload-ubuntu-22-cmake-build-only/build/./bin/llvm-strip -DCMAKE_READELF=/home/botworker/bbot/amdgpu-offload-ubuntu-22-cmake-build-only/build/./bin/llvm-readelf -DCMAKE_C_COMPILER_TARGET=amdgcn-amd-amdhsa -DCMAKE_CXX_COMPILER_TARGET=amdgcn-amd-amdhsa -DCMAKE_Fortran_COMPILER_TARGET=amdgcn-amd-amdhsa -DCMAKE_ASM_COMPILER_TARGET=amdgcn-amd-amdhsa -DCMAKE_INSTALL_PREFIX=/tmp/llvm.install.test -DLLVM_BINARY_DIR=/home/botworker/bbot/amdgpu-offload-ubuntu-22-cmake-build-only/build -DLLVM_CONFIG_PATH=/home/botworker/bbot/amdgpu-offload-ubuntu-22-cmake-build-only/build/bin/llvm-config -DLLVM_ENABLE_WERROR=OFF -DLLVM_HOST_TRIPLE=x86_64-unknown-linux-gnu -DLLVM_HAVE_LINK_VERSION_SCRIPT=1 -DLLVM_USE_RELATIVE_PATHS_IN_DEBUG_INFO=OFF -DLLVM_USE_RELATIVE_PATHS_IN_FILES=OFF "-DLLVM_LIT_ARGS=-v --show-unsupported --timeout 100 --show-xfail -j 16" -DLLVM_SOURCE_PREFIX= -DPACKAGE_VERSION=22.0.0git -DCMAKE_BUILD_TYPE=Release -DCMAKE_MAKE_PROGRAM=/usr/bin/ninja -DCMAKE_EXPORT_COMPILE_COMMANDS=1 -DCOMPILER_RT_BUILD_BUILTINS=OFF -DLLVM_INCLUDE_TESTS=ON -DLLVM_ENABLE_PROJECTS_USED=ON -DLLVM_ENABLE_PER_TARGET_RUNTIME_DIR=ON -DCMAKE_C_COMPILER_WORKS=ON -DCMAKE_CXX_COMPILER_WORKS=ON -DCMAKE_Fortran_COMPILER_WORKS=ON -DCMAKE_ASM_COMPILER_WORKS=ON -DCOMPILER_RT_DEFAULT_TARGET_ONLY=ON -DLLVM_RUNTIMES_TARGET=amdgcn-amd-amdhsa -DHAVE_LLVM_LIT=ON -DCLANG_RESOURCE_DIR= -DLLVM_DEFAULT_TARGET_TRIPLE=amdgcn-amd-amdhsa -DFORTRAN_SUPPORTS_REAL16=FALSE "-DLLVM_ENABLE_RUNTIMES=compiler-rt;openmp;offload;flang-rt" -DLLVM_USE_LINKER= -DLLVM_ENABLE_RUNTIMES=openmp -GNinja -C/home/botworker/bbot/amdgpu-offload-ubuntu-22-cmake-build-only/build/projects/runtimes-amdgcn-amd-amdhsa/tmp/runtimes-amdgcn-amd-amdhsa-cache-Release.cmake /home/botworker/bbot/amdgpu-offload-ubuntu-22-cmake-build-only/llvm-project/llvm/runtimes/../../runtimes && /usr/bin/cmake -E touch /home/botworker/bbot/amdgpu-offload-ubuntu-22-cmake-build-only/build/runtimes/runtimes-amdgcn-amd-amdhsa-stamps//runtimes-amdgcn-amd-amdhsa-configure
[8192/8203] Linking CXX executable bin/bbc
ninja: build stopped: subcommand failed.
['ninja'] exited with return code 1.
The build step threw an exception...
Traceback (most recent call last):
  File "/home/botworker/bbot/amdgpu-offload-ubuntu-22-cmake-build-only/build/../llvm-zorg/zorg/buildbot/builders/annotated/amdgpu-offload-cmake.py", line 62, in step
    yield
  File "/home/botworker/bbot/amdgpu-offload-ubuntu-22-cmake-build-only/build/../llvm-zorg/zorg/buildbot/builders/annotated/amdgpu-offload-cmake.py", line 53, in main
    run_command(["ninja"])
  File "/home/botworker/bbot/amdgpu-offload-ubuntu-22-cmake-build-only/build/../llvm-zorg/zorg/buildbot/builders/annotated/amdgpu-offload-cmake.py", line 75, in run_command
    util.report_run_cmd(cmd, cwd=directory)
  File "/home/botworker/bbot/amdgpu-offload-ubuntu-22-cmake-build-only/llvm-zorg/zorg/buildbot/builders/annotated/util.py", line 49, in report_run_cmd
    subprocess.check_call(cmd, shell=shell, *args, **kwargs)
  File "/usr/lib/python3.10/subprocess.py", line 369, in check_call
    raise CalledProcessError(retcode, cmd)
subprocess.CalledProcessError: Command '['ninja']' returned non-zero exit status 1.
@@@STEP_FAILURE@@@
Step 7 (build cmake config) failure: build cmake config (failure)
...
CMAKE_Fortran_PREPROCESS_SOURCE
CMake Error at /usr/share/cmake-3.22/Modules/Internal/CheckSourceCompiles.cmake:92 (try_compile):
  Failed to generate test project build system.
Call Stack (most recent call first):
  /usr/share/cmake-3.22/Modules/CheckFortranSourceCompiles.cmake:97 (cmake_check_source_compiles)
  CMakeLists.txt:440 (check_fortran_source_compiles)
  /home/botworker/bbot/amdgpu-offload-ubuntu-22-cmake-build-only/llvm-project/openmp/CMakeLists.txt:118 (flang_module_fortran_enable)
-- Configuring incomplete, errors occurred!
See also "/home/botworker/bbot/amdgpu-offload-ubuntu-22-cmake-build-only/build/runtimes/runtimes-amdgcn-amd-amdhsa-bins/CMakeFiles/CMakeOutput.log".
See also "/home/botworker/bbot/amdgpu-offload-ubuntu-22-cmake-build-only/build/runtimes/runtimes-amdgcn-amd-amdhsa-bins/CMakeFiles/CMakeError.log".
FAILED: runtimes/runtimes-amdgcn-amd-amdhsa-stamps/runtimes-amdgcn-amd-amdhsa-configure /home/botworker/bbot/amdgpu-offload-ubuntu-22-cmake-build-only/build/runtimes/runtimes-amdgcn-amd-amdhsa-stamps/runtimes-amdgcn-amd-amdhsa-configure 
cd /home/botworker/bbot/amdgpu-offload-ubuntu-22-cmake-build-only/build/runtimes/runtimes-amdgcn-amd-amdhsa-bins && /usr/bin/cmake --no-warn-unused-cli -DCMAKE_C_COMPILER=/home/botworker/bbot/amdgpu-offload-ubuntu-22-cmake-build-only/build/./bin/clang -DCMAKE_CXX_COMPILER=/home/botworker/bbot/amdgpu-offload-ubuntu-22-cmake-build-only/build/./bin/clang++ -DCMAKE_ASM_COMPILER=/home/botworker/bbot/amdgpu-offload-ubuntu-22-cmake-build-only/build/./bin/clang -DCMAKE_Fortran_COMPILER=/home/botworker/bbot/amdgpu-offload-ubuntu-22-cmake-build-only/build/./bin/flang -DCMAKE_LINKER=/home/botworker/bbot/amdgpu-offload-ubuntu-22-cmake-build-only/build/./bin/ld.lld -DCMAKE_AR=/home/botworker/bbot/amdgpu-offload-ubuntu-22-cmake-build-only/build/./bin/llvm-ar -DCMAKE_RANLIB=/home/botworker/bbot/amdgpu-offload-ubuntu-22-cmake-build-only/build/./bin/llvm-ranlib -DCMAKE_NM=/home/botworker/bbot/amdgpu-offload-ubuntu-22-cmake-build-only/build/./bin/llvm-nm -DCMAKE_OBJDUMP=/home/botworker/bbot/amdgpu-offload-ubuntu-22-cmake-build-only/build/./bin/llvm-objdump -DCMAKE_OBJCOPY=/home/botworker/bbot/amdgpu-offload-ubuntu-22-cmake-build-only/build/./bin/llvm-objcopy -DCMAKE_STRIP=/home/botworker/bbot/amdgpu-offload-ubuntu-22-cmake-build-only/build/./bin/llvm-strip -DCMAKE_READELF=/home/botworker/bbot/amdgpu-offload-ubuntu-22-cmake-build-only/build/./bin/llvm-readelf -DCMAKE_C_COMPILER_TARGET=amdgcn-amd-amdhsa -DCMAKE_CXX_COMPILER_TARGET=amdgcn-amd-amdhsa -DCMAKE_Fortran_COMPILER_TARGET=amdgcn-amd-amdhsa -DCMAKE_ASM_COMPILER_TARGET=amdgcn-amd-amdhsa -DCMAKE_INSTALL_PREFIX=/tmp/llvm.install.test -DLLVM_BINARY_DIR=/home/botworker/bbot/amdgpu-offload-ubuntu-22-cmake-build-only/build -DLLVM_CONFIG_PATH=/home/botworker/bbot/amdgpu-offload-ubuntu-22-cmake-build-only/build/bin/llvm-config -DLLVM_ENABLE_WERROR=OFF -DLLVM_HOST_TRIPLE=x86_64-unknown-linux-gnu -DLLVM_HAVE_LINK_VERSION_SCRIPT=1 -DLLVM_USE_RELATIVE_PATHS_IN_DEBUG_INFO=OFF -DLLVM_USE_RELATIVE_PATHS_IN_FILES=OFF "-DLLVM_LIT_ARGS=-v --show-unsupported --timeout 100 --show-xfail -j 16" -DLLVM_SOURCE_PREFIX= -DPACKAGE_VERSION=22.0.0git -DCMAKE_BUILD_TYPE=Release -DCMAKE_MAKE_PROGRAM=/usr/bin/ninja -DCMAKE_EXPORT_COMPILE_COMMANDS=1 -DCOMPILER_RT_BUILD_BUILTINS=OFF -DLLVM_INCLUDE_TESTS=ON -DLLVM_ENABLE_PROJECTS_USED=ON -DLLVM_ENABLE_PER_TARGET_RUNTIME_DIR=ON -DCMAKE_C_COMPILER_WORKS=ON -DCMAKE_CXX_COMPILER_WORKS=ON -DCMAKE_Fortran_COMPILER_WORKS=ON -DCMAKE_ASM_COMPILER_WORKS=ON -DCOMPILER_RT_DEFAULT_TARGET_ONLY=ON -DLLVM_RUNTIMES_TARGET=amdgcn-amd-amdhsa -DHAVE_LLVM_LIT=ON -DCLANG_RESOURCE_DIR= -DLLVM_DEFAULT_TARGET_TRIPLE=amdgcn-amd-amdhsa -DFORTRAN_SUPPORTS_REAL16=FALSE "-DLLVM_ENABLE_RUNTIMES=compiler-rt;openmp;offload;flang-rt" -DLLVM_USE_LINKER= -DLLVM_ENABLE_RUNTIMES=openmp -GNinja -C/home/botworker/bbot/amdgpu-offload-ubuntu-22-cmake-build-only/build/projects/runtimes-amdgcn-amd-amdhsa/tmp/runtimes-amdgcn-amd-amdhsa-cache-Release.cmake /home/botworker/bbot/amdgpu-offload-ubuntu-22-cmake-build-only/llvm-project/llvm/runtimes/../../runtimes && /usr/bin/cmake -E touch /home/botworker/bbot/amdgpu-offload-ubuntu-22-cmake-build-only/build/runtimes/runtimes-amdgcn-amd-amdhsa-stamps//runtimes-amdgcn-amd-amdhsa-configure
[8192/8203] Linking CXX executable bin/bbc
ninja: build stopped: subcommand failed.
['ninja'] exited with return code 1.
The build step threw an exception...
Traceback (most recent call last):
  File "/home/botworker/bbot/amdgpu-offload-ubuntu-22-cmake-build-only/build/../llvm-zorg/zorg/buildbot/builders/annotated/amdgpu-offload-cmake.py", line 62, in step
    yield
  File "/home/botworker/bbot/amdgpu-offload-ubuntu-22-cmake-build-only/build/../llvm-zorg/zorg/buildbot/builders/annotated/amdgpu-offload-cmake.py", line 53, in main
    run_command(["ninja"])
  File "/home/botworker/bbot/amdgpu-offload-ubuntu-22-cmake-build-only/build/../llvm-zorg/zorg/buildbot/builders/annotated/amdgpu-offload-cmake.py", line 75, in run_command
    util.report_run_cmd(cmd, cwd=directory)
  File "/home/botworker/bbot/amdgpu-offload-ubuntu-22-cmake-build-only/llvm-zorg/zorg/buildbot/builders/annotated/util.py", line 49, in report_run_cmd
    subprocess.check_call(cmd, shell=shell, *args, **kwargs)
  File "/usr/lib/python3.10/subprocess.py", line 369, in check_call
    raise CalledProcessError(retcode, cmd)
subprocess.CalledProcessError: Command '['ninja']' returned non-zero exit status 1.
program finished with exit code 0
elapsedTime=443.338842

@llvm-ci
Collaborator

llvm-ci commented Nov 25, 2025

LLVM Buildbot has detected a new failure on builder hip-third-party-libs-test running on ext_buildbot_hw_05-hip-docker while building clang,cmake,flang-rt,flang,llvm,openmp,runtimes at step 4 "annotate".

Full details are available at: https://lab.llvm.org/buildbot/#/builders/206/builds/9559

Here is the relevant piece of the build log for the reference
Step 4 (annotate) failure: '../llvm-zorg/zorg/buildbot/builders/annotated/hip-tpl.py --jobs=32' (failure)
...
  Failed to generate test project build system.
Call Stack (most recent call first):
  /usr/share/cmake-3.22/Modules/CheckFortranSourceCompiles.cmake:97 (cmake_check_source_compiles)
  CMakeLists.txt:440 (check_fortran_source_compiles)
  /home/botworker/bbot/hip-third-party-libs-test/llvm-project/openmp/CMakeLists.txt:118 (flang_module_fortran_enable)


-- Configuring incomplete, errors occurred!
See also "/home/botworker/bbot/hip-third-party-libs-test/build/runtimes/runtimes-amdgcn-amd-amdhsa-bins/CMakeFiles/CMakeOutput.log".
See also "/home/botworker/bbot/hip-third-party-libs-test/build/runtimes/runtimes-amdgcn-amd-amdhsa-bins/CMakeFiles/CMakeError.log".
FAILED: runtimes/runtimes-amdgcn-amd-amdhsa-stamps/runtimes-amdgcn-amd-amdhsa-configure /home/botworker/bbot/hip-third-party-libs-test/build/runtimes/runtimes-amdgcn-amd-amdhsa-stamps/runtimes-amdgcn-amd-amdhsa-configure 
cd /home/botworker/bbot/hip-third-party-libs-test/build/runtimes/runtimes-amdgcn-amd-amdhsa-bins && /usr/bin/cmake --no-warn-unused-cli -DCMAKE_C_COMPILER=/home/botworker/bbot/hip-third-party-libs-test/build/./bin/clang -DCMAKE_CXX_COMPILER=/home/botworker/bbot/hip-third-party-libs-test/build/./bin/clang++ -DCMAKE_ASM_COMPILER=/home/botworker/bbot/hip-third-party-libs-test/build/./bin/clang -DCMAKE_Fortran_COMPILER=/home/botworker/bbot/hip-third-party-libs-test/build/./bin/flang -DCMAKE_LINKER=/home/botworker/bbot/hip-third-party-libs-test/build/./bin/ld.lld -DCMAKE_AR=/home/botworker/bbot/hip-third-party-libs-test/build/./bin/llvm-ar -DCMAKE_RANLIB=/home/botworker/bbot/hip-third-party-libs-test/build/./bin/llvm-ranlib -DCMAKE_NM=/home/botworker/bbot/hip-third-party-libs-test/build/./bin/llvm-nm -DCMAKE_OBJDUMP=/home/botworker/bbot/hip-third-party-libs-test/build/./bin/llvm-objdump -DCMAKE_OBJCOPY=/home/botworker/bbot/hip-third-party-libs-test/build/./bin/llvm-objcopy -DCMAKE_STRIP=/home/botworker/bbot/hip-third-party-libs-test/build/./bin/llvm-strip -DCMAKE_READELF=/home/botworker/bbot/hip-third-party-libs-test/build/./bin/llvm-readelf -DCMAKE_C_COMPILER_TARGET=amdgcn-amd-amdhsa -DCMAKE_CXX_COMPILER_TARGET=amdgcn-amd-amdhsa -DCMAKE_Fortran_COMPILER_TARGET=amdgcn-amd-amdhsa -DCMAKE_ASM_COMPILER_TARGET=amdgcn-amd-amdhsa -DCMAKE_INSTALL_PREFIX=/tmp/llvm.install.test -DLLVM_BINARY_DIR=/home/botworker/bbot/hip-third-party-libs-test/build -DLLVM_CONFIG_PATH=/home/botworker/bbot/hip-third-party-libs-test/build/bin/llvm-config -DLLVM_ENABLE_WERROR=OFF -DLLVM_HOST_TRIPLE=x86_64-unknown-linux-gnu -DLLVM_HAVE_LINK_VERSION_SCRIPT=1 -DLLVM_USE_RELATIVE_PATHS_IN_DEBUG_INFO=OFF -DLLVM_USE_RELATIVE_PATHS_IN_FILES=OFF "-DLLVM_LIT_ARGS=-v --show-unsupported --timeout 100 --show-xfail -j 16" -DLLVM_SOURCE_PREFIX= -DPACKAGE_VERSION=22.0.0git -DCMAKE_BUILD_TYPE=Release -DCMAKE_MAKE_PROGRAM=/usr/bin/ninja -DCMAKE_EXPORT_COMPILE_COMMANDS=1 -DCOMPILER_RT_BUILD_BUILTINS=OFF -DLLVM_INCLUDE_TESTS=ON -DLLVM_ENABLE_PROJECTS_USED=ON -DLLVM_ENABLE_PER_TARGET_RUNTIME_DIR=ON -DCMAKE_C_COMPILER_WORKS=ON -DCMAKE_CXX_COMPILER_WORKS=ON -DCMAKE_Fortran_COMPILER_WORKS=ON -DCMAKE_ASM_COMPILER_WORKS=ON -DCOMPILER_RT_DEFAULT_TARGET_ONLY=ON -DLLVM_RUNTIMES_TARGET=amdgcn-amd-amdhsa -DHAVE_LLVM_LIT=ON -DCLANG_RESOURCE_DIR= -DLLVM_DEFAULT_TARGET_TRIPLE=amdgcn-amd-amdhsa -DFORTRAN_SUPPORTS_REAL16=FALSE "-DLLVM_ENABLE_RUNTIMES=compiler-rt;flang-rt" -DLLVM_USE_LINKER= -DLLVM_ENABLE_RUNTIMES=openmp -GNinja -C/home/botworker/bbot/hip-third-party-libs-test/build/projects/runtimes-amdgcn-amd-amdhsa/tmp/runtimes-amdgcn-amd-amdhsa-cache-Release.cmake /home/botworker/bbot/hip-third-party-libs-test/llvm-project/llvm/runtimes/../../runtimes && /usr/bin/cmake -E touch /home/botworker/bbot/hip-third-party-libs-test/build/runtimes/runtimes-amdgcn-amd-amdhsa-stamps//runtimes-amdgcn-amd-amdhsa-configure
[8192/8203] Linking CXX executable bin/bbc
ninja: build stopped: subcommand failed.
['ninja'] exited with return code 1.
The build step threw an exception...
Traceback (most recent call last):
  File "/home/botworker/bbot/hip-third-party-libs-test/build/../llvm-zorg/zorg/buildbot/builders/annotated/hip-tpl.py", line 107, in step
    yield
  File "/home/botworker/bbot/hip-third-party-libs-test/build/../llvm-zorg/zorg/buildbot/builders/annotated/hip-tpl.py", line 44, in main
    run_command(["ninja"])
  File "/home/botworker/bbot/hip-third-party-libs-test/build/../llvm-zorg/zorg/buildbot/builders/annotated/hip-tpl.py", line 120, in run_command
    util.report_run_cmd(cmd, cwd=directory)
  File "/home/botworker/bbot/hip-third-party-libs-test/llvm-zorg/zorg/buildbot/builders/annotated/util.py", line 49, in report_run_cmd
    subprocess.check_call(cmd, shell=shell, *args, **kwargs)
  File "/usr/lib/python3.10/subprocess.py", line 369, in check_call
    raise CalledProcessError(retcode, cmd)
subprocess.CalledProcessError: Command '['ninja']' returned non-zero exit status 1.
@@@STEP_FAILURE@@@
@@@BUILD_STEP update llvm-test-suite@@@
@@@HALT_ON_FAILURE@@@
Running: git reset --hard origin/main
HEAD is now at 5a39f941b [gfortran] Disable appendix-a/a.6.2.f90 (#296)
Running: git pull
Already up to date.
@@@BUILD_STEP configure test suite@@@
@@@HALT_ON_FAILURE@@@
Running: cmake -GNinja -B TS-build -S . -DTEST_SUITE_EXTERNALS_DIR=/opt/botworker/llvm/External -DAMDGPU_ARCHS=gfx90a -DTEST_SUITE_SUBDIRS=External -DEXTERNAL_HIP_TESTS_KOKKOS=ON -DCMAKE_CXX_COMPILER=/home/botworker/bbot/hip-third-party-libs-test/build/bin/clang++ -DCMAKE_C_COMPILER=/home/botworker/bbot/hip-third-party-libs-test/build/bin/clang
-- The C compiler identification is Clang 22.0.0
-- The CXX compiler identification is Clang 22.0.0
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Check for working C compiler: /home/botworker/bbot/hip-third-party-libs-test/build/bin/clang - skipped
-- Detecting C compile features
-- Detecting C compile features - done
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Check for working CXX compiler: /home/botworker/bbot/hip-third-party-libs-test/build/bin/clang++ - skipped
-- Detecting CXX compile features
Step 7 (build cmake config) failure: build cmake config (failure)
...
CMAKE_Fortran_PREPROCESS_SOURCE
CMake Error at /usr/share/cmake-3.22/Modules/Internal/CheckSourceCompiles.cmake:92 (try_compile):
  Failed to generate test project build system.
Call Stack (most recent call first):
  /usr/share/cmake-3.22/Modules/CheckFortranSourceCompiles.cmake:97 (cmake_check_source_compiles)
  CMakeLists.txt:440 (check_fortran_source_compiles)
  /home/botworker/bbot/hip-third-party-libs-test/llvm-project/openmp/CMakeLists.txt:118 (flang_module_fortran_enable)
-- Configuring incomplete, errors occurred!
See also "/home/botworker/bbot/hip-third-party-libs-test/build/runtimes/runtimes-amdgcn-amd-amdhsa-bins/CMakeFiles/CMakeOutput.log".
See also "/home/botworker/bbot/hip-third-party-libs-test/build/runtimes/runtimes-amdgcn-amd-amdhsa-bins/CMakeFiles/CMakeError.log".
FAILED: runtimes/runtimes-amdgcn-amd-amdhsa-stamps/runtimes-amdgcn-amd-amdhsa-configure /home/botworker/bbot/hip-third-party-libs-test/build/runtimes/runtimes-amdgcn-amd-amdhsa-stamps/runtimes-amdgcn-amd-amdhsa-configure 
cd /home/botworker/bbot/hip-third-party-libs-test/build/runtimes/runtimes-amdgcn-amd-amdhsa-bins && /usr/bin/cmake --no-warn-unused-cli -DCMAKE_C_COMPILER=/home/botworker/bbot/hip-third-party-libs-test/build/./bin/clang -DCMAKE_CXX_COMPILER=/home/botworker/bbot/hip-third-party-libs-test/build/./bin/clang++ -DCMAKE_ASM_COMPILER=/home/botworker/bbot/hip-third-party-libs-test/build/./bin/clang -DCMAKE_Fortran_COMPILER=/home/botworker/bbot/hip-third-party-libs-test/build/./bin/flang -DCMAKE_LINKER=/home/botworker/bbot/hip-third-party-libs-test/build/./bin/ld.lld -DCMAKE_AR=/home/botworker/bbot/hip-third-party-libs-test/build/./bin/llvm-ar -DCMAKE_RANLIB=/home/botworker/bbot/hip-third-party-libs-test/build/./bin/llvm-ranlib -DCMAKE_NM=/home/botworker/bbot/hip-third-party-libs-test/build/./bin/llvm-nm -DCMAKE_OBJDUMP=/home/botworker/bbot/hip-third-party-libs-test/build/./bin/llvm-objdump -DCMAKE_OBJCOPY=/home/botworker/bbot/hip-third-party-libs-test/build/./bin/llvm-objcopy -DCMAKE_STRIP=/home/botworker/bbot/hip-third-party-libs-test/build/./bin/llvm-strip -DCMAKE_READELF=/home/botworker/bbot/hip-third-party-libs-test/build/./bin/llvm-readelf -DCMAKE_C_COMPILER_TARGET=amdgcn-amd-amdhsa -DCMAKE_CXX_COMPILER_TARGET=amdgcn-amd-amdhsa -DCMAKE_Fortran_COMPILER_TARGET=amdgcn-amd-amdhsa -DCMAKE_ASM_COMPILER_TARGET=amdgcn-amd-amdhsa -DCMAKE_INSTALL_PREFIX=/tmp/llvm.install.test -DLLVM_BINARY_DIR=/home/botworker/bbot/hip-third-party-libs-test/build -DLLVM_CONFIG_PATH=/home/botworker/bbot/hip-third-party-libs-test/build/bin/llvm-config -DLLVM_ENABLE_WERROR=OFF -DLLVM_HOST_TRIPLE=x86_64-unknown-linux-gnu -DLLVM_HAVE_LINK_VERSION_SCRIPT=1 -DLLVM_USE_RELATIVE_PATHS_IN_DEBUG_INFO=OFF -DLLVM_USE_RELATIVE_PATHS_IN_FILES=OFF "-DLLVM_LIT_ARGS=-v --show-unsupported --timeout 100 --show-xfail -j 16" -DLLVM_SOURCE_PREFIX= -DPACKAGE_VERSION=22.0.0git -DCMAKE_BUILD_TYPE=Release -DCMAKE_MAKE_PROGRAM=/usr/bin/ninja -DCMAKE_EXPORT_COMPILE_COMMANDS=1 -DCOMPILER_RT_BUILD_BUILTINS=OFF -DLLVM_INCLUDE_TESTS=ON -DLLVM_ENABLE_PROJECTS_USED=ON -DLLVM_ENABLE_PER_TARGET_RUNTIME_DIR=ON -DCMAKE_C_COMPILER_WORKS=ON -DCMAKE_CXX_COMPILER_WORKS=ON -DCMAKE_Fortran_COMPILER_WORKS=ON -DCMAKE_ASM_COMPILER_WORKS=ON -DCOMPILER_RT_DEFAULT_TARGET_ONLY=ON -DLLVM_RUNTIMES_TARGET=amdgcn-amd-amdhsa -DHAVE_LLVM_LIT=ON -DCLANG_RESOURCE_DIR= -DLLVM_DEFAULT_TARGET_TRIPLE=amdgcn-amd-amdhsa -DFORTRAN_SUPPORTS_REAL16=FALSE "-DLLVM_ENABLE_RUNTIMES=compiler-rt;flang-rt" -DLLVM_USE_LINKER= -DLLVM_ENABLE_RUNTIMES=openmp -GNinja -C/home/botworker/bbot/hip-third-party-libs-test/build/projects/runtimes-amdgcn-amd-amdhsa/tmp/runtimes-amdgcn-amd-amdhsa-cache-Release.cmake /home/botworker/bbot/hip-third-party-libs-test/llvm-project/llvm/runtimes/../../runtimes && /usr/bin/cmake -E touch /home/botworker/bbot/hip-third-party-libs-test/build/runtimes/runtimes-amdgcn-amd-amdhsa-stamps//runtimes-amdgcn-amd-amdhsa-configure
[8192/8203] Linking CXX executable bin/bbc
ninja: build stopped: subcommand failed.
['ninja'] exited with return code 1.
The build step threw an exception...
Traceback (most recent call last):
  File "/home/botworker/bbot/hip-third-party-libs-test/build/../llvm-zorg/zorg/buildbot/builders/annotated/hip-tpl.py", line 107, in step
    yield
  File "/home/botworker/bbot/hip-third-party-libs-test/build/../llvm-zorg/zorg/buildbot/builders/annotated/hip-tpl.py", line 44, in main
    run_command(["ninja"])
  File "/home/botworker/bbot/hip-third-party-libs-test/build/../llvm-zorg/zorg/buildbot/builders/annotated/hip-tpl.py", line 120, in run_command
    util.report_run_cmd(cmd, cwd=directory)
  File "/home/botworker/bbot/hip-third-party-libs-test/llvm-zorg/zorg/buildbot/builders/annotated/util.py", line 49, in report_run_cmd
    subprocess.check_call(cmd, shell=shell, *args, **kwargs)
  File "/usr/lib/python3.10/subprocess.py", line 369, in check_call
    raise CalledProcessError(retcode, cmd)
subprocess.CalledProcessError: Command '['ninja']' returned non-zero exit status 1.

@llvm-ci
Collaborator

llvm-ci commented Nov 25, 2025

LLVM Buildbot has detected a new failure on builder ppc64le-mlir-rhel-clang running on ppc64le-mlir-rhel-test while building clang,cmake,flang-rt,flang,llvm,openmp,runtimes at step 6 "test-build-check-mlir-build-only-check-mlir".

Full details are available at: https://lab.llvm.org/buildbot/#/builders/129/builds/33821

Here is the relevant piece of the build log for the reference
Step 6 (test-build-check-mlir-build-only-check-mlir) failure: 1200 seconds without output running [b'ninja', b'check-mlir'], attempting to kill
...
PASS: MLIR-Unit :: IR/./MLIRIRTests/100/130 (3674 of 3685)
PASS: MLIR-Unit :: IR/./MLIRIRTests/101/130 (3675 of 3685)
PASS: MLIR-Unit :: IR/./MLIRIRTests/0/130 (3676 of 3685)
PASS: MLIR-Unit :: IR/./MLIRIRTests/39/130 (3677 of 3685)
PASS: MLIR-Unit :: Interfaces/./MLIRInterfacesTests/13/22 (3678 of 3685)
PASS: MLIR-Unit :: IR/./MLIRIRTests/38/130 (3679 of 3685)
PASS: MLIR-Unit :: Interfaces/./MLIRInterfacesTests/11/22 (3680 of 3685)
PASS: MLIR-Unit :: Interfaces/./MLIRInterfacesTests/12/22 (3681 of 3685)
PASS: MLIR-Unit :: Pass/./MLIRPassTests/10/13 (3682 of 3685)
PASS: MLIR :: mlir-reduce/dce-test.mlir (3683 of 3685)
command timed out: 1200 seconds without output running [b'ninja', b'check-mlir'], attempting to kill
process killed by signal 9
program finished with exit code -1
elapsedTime=3475.082337

@hakostra

Hi @Meinersbur, I don't know if this is the right way to approach this (I could open a new issue instead), but I'm trying here first. I have been waiting for this to be merged as I believe it would possibly solve my issue #146876 (I think there are at least two other related issues in the tracker as well). I have now built commit 86fbaef from the main branch, and I see the module files in <prefix>/lib/clang/22/finclude/flang/x86_64-unknown-linux-gnu. However, I do not see any module files for nvptx64 (or any other targets, for that matter). And trying to use any module file when targeting the GPU gives me the error message error: Cannot parse module file for module 'ieee_arithmetic': Source file 'ieee_arithmetic.mod' was not found (which is correct - the file is not there).

Am I missing something obvious? Do we need to set some new CMake flags/options, compared to before, for this to be built correctly?

@jplehr
Contributor

jplehr commented Nov 25, 2025

We see this breaking our buildbot in https://lab.llvm.org/staging/#/builders/105/builds/37275

I'm going to open a revert PR.

@Meinersbur
Member Author

@hakostra Modules for other targets must be built explicitly, e.g. -DLLVM_RUNTIME_TARGETS=default;nvptx64-nvidia-cuda.

Without this PR, module files are shared between all targets, i.e. the host and the offload target use the same module files. For instance, x86_64 has a real(10) type, but there is no equivalent on nvptx. Some constants also have different values on different targets.
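
For completeness, a hedged sketch of a bootstrapping configuration that builds the runtimes, and therefore the intrinsic .mod files, for both the host and an offload target (written as a CMake cache fragment; the exact option set will vary):

```cmake
# Illustrative cache fragment; pass with: cmake -S <llvm-project>/llvm -C this-file.cmake ...
# The point is LLVM_RUNTIME_TARGETS listing every triple whose .mod files are
# needed, as explained above.
set(LLVM_ENABLE_PROJECTS "clang;flang" CACHE STRING "")
set(LLVM_ENABLE_RUNTIMES "openmp;flang-rt" CACHE STRING "")
set(LLVM_RUNTIME_TARGETS "default;nvptx64-nvidia-cuda" CACHE STRING "")
```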

@eugeneepshteyn
Contributor

@Meinersbur, with a potentially disruptive change like this, it would have been helpful to send a heads-up to the flang Discourse, describing what's about to be merged, what the potential implications are, and how to handle them. It would also help to add the same notice as a comment to this PR, so people could find it and deal with the fallout.

@Meinersbur
Member Author

@eugeneepshteyn https://discourse.llvm.org/t/rfc-building-flangs-builtin-mod-files/84626

@eugeneepshteyn
Contributor

@eugeneepshteyn https://discourse.llvm.org/t/rfc-building-flangs-builtin-mod-files/84626

The last message on that thread was on Feb 13. What I'm saying is: send a fresh note to make people aware that it's going in now/soon, so they can be prepared to do steps x/y/z to deal with the fallout.

@llvm-ci
Collaborator

llvm-ci commented Nov 25, 2025

LLVM Buildbot has detected a new failure on builder clang-ppc64le-linux-test-suite running on ppc64le-clang-test-suite while building clang,cmake,flang-rt,flang,llvm,openmp,runtimes at step 6 "test-build-unified-tree-check-all".

Full details are available at: https://lab.llvm.org/buildbot/#/builders/95/builds/19222

Here is the relevant piece of the build log for the reference
Step 6 (test-build-unified-tree-check-all) failure: test (failure)
******************** TEST 'SanitizerCommon-lsan-powerpc64le-Linux :: Linux/getpwnam_r_invalid_user.cpp' FAILED ********************
Exit Code: 134

Command Output (stderr):
--
/home/buildbots/llvm-external-buildbots/workers/ppc64le-clang-test-suite/clang-ppc64le-test-suite/build/./bin/clang  --driver-mode=g++ -gline-tables-only -fsanitize=leak  -m64 -fno-function-sections -funwind-tables  -I/home/buildbots/llvm-external-buildbots/workers/ppc64le-clang-test-suite/clang-ppc64le-test-suite/llvm-project/compiler-rt/test -ldl -O0 -g /home/buildbots/llvm-external-buildbots/workers/ppc64le-clang-test-suite/clang-ppc64le-test-suite/llvm-project/compiler-rt/test/sanitizer_common/TestCases/Linux/getpwnam_r_invalid_user.cpp -o /home/buildbots/llvm-external-buildbots/workers/ppc64le-clang-test-suite/clang-ppc64le-test-suite/build/runtimes/runtimes-bins/compiler-rt/test/sanitizer_common/lsan-powerpc64le-Linux/Linux/Output/getpwnam_r_invalid_user.cpp.tmp &&  /home/buildbots/llvm-external-buildbots/workers/ppc64le-clang-test-suite/clang-ppc64le-test-suite/build/runtimes/runtimes-bins/compiler-rt/test/sanitizer_common/lsan-powerpc64le-Linux/Linux/Output/getpwnam_r_invalid_user.cpp.tmp # RUN: at line 2
+ /home/buildbots/llvm-external-buildbots/workers/ppc64le-clang-test-suite/clang-ppc64le-test-suite/build/./bin/clang --driver-mode=g++ -gline-tables-only -fsanitize=leak -m64 -fno-function-sections -funwind-tables -I/home/buildbots/llvm-external-buildbots/workers/ppc64le-clang-test-suite/clang-ppc64le-test-suite/llvm-project/compiler-rt/test -ldl -O0 -g /home/buildbots/llvm-external-buildbots/workers/ppc64le-clang-test-suite/clang-ppc64le-test-suite/llvm-project/compiler-rt/test/sanitizer_common/TestCases/Linux/getpwnam_r_invalid_user.cpp -o /home/buildbots/llvm-external-buildbots/workers/ppc64le-clang-test-suite/clang-ppc64le-test-suite/build/runtimes/runtimes-bins/compiler-rt/test/sanitizer_common/lsan-powerpc64le-Linux/Linux/Output/getpwnam_r_invalid_user.cpp.tmp
+ /home/buildbots/llvm-external-buildbots/workers/ppc64le-clang-test-suite/clang-ppc64le-test-suite/build/runtimes/runtimes-bins/compiler-rt/test/sanitizer_common/lsan-powerpc64le-Linux/Linux/Output/getpwnam_r_invalid_user.cpp.tmp
Result: 110
getpwnam_r_invalid_user.cpp.tmp: /home/buildbots/llvm-external-buildbots/workers/ppc64le-clang-test-suite/clang-ppc64le-test-suite/llvm-project/compiler-rt/test/sanitizer_common/TestCases/Linux/getpwnam_r_invalid_user.cpp:19: int main(): Assertion `res == 0 || res == ENOENT' failed.
/home/buildbots/llvm-external-buildbots/workers/ppc64le-clang-test-suite/clang-ppc64le-test-suite/build/runtimes/runtimes-bins/compiler-rt/test/sanitizer_common/lsan-powerpc64le-Linux/Linux/Output/getpwnam_r_invalid_user.cpp.script: line 1: 1590932 Aborted                 /home/buildbots/llvm-external-buildbots/workers/ppc64le-clang-test-suite/clang-ppc64le-test-suite/build/runtimes/runtimes-bins/compiler-rt/test/sanitizer_common/lsan-powerpc64le-Linux/Linux/Output/getpwnam_r_invalid_user.cpp.tmp

--

********************


@vzakhari
Contributor

Hi @Meinersbur, since this PR was reverted, I am not sure when you plan to try to merge it back - can you please let me know?

Can you please postpone merging this until next week? Many people in the US are out due to Thanksgiving, and there might be some downstream failures that will have to be addressed in a timely manner.

@Meinersbur
Member Author

@vzakhari Of course. I haven't figured out the CMAKE_Fortran_PREPROCESS_SOURCE problem yet. It seems to be a transient problem with older versions of CMake.

I created a re-apply PR: #169638. I will not merge it until I get an approval from you. So far there hasn't been any activity from NVIDIA.

@vzakhari
Contributor

Thank you, @Meinersbur! Sorry, I only reviewed/tried your patch at its early stages and I haven't been following the recent updates. I will try it again today and will report any comments in the new PR.

augusto2112 pushed a commit to augusto2112/llvm-project that referenced this pull request Dec 3, 2025
kcloudy0717 pushed a commit to kcloudy0717/llvm-project that referenced this pull request Dec 4, 2025
