Conversation

@babichjacob commented Aug 18, 2025

While trying to use the multimodality support added in #744 (thank you to everyone who contributed!), I found that the repository builds successfully locally, but not as a dependency of another crate (i.e. when fetched through crates.io).

Most relevant part of the build error: `CMake Error at CMakeLists.txt:206 (add_subdirectory): The source directory ... /llama-cpp-sys-2-0.1.118/llama.cpp/tools does not contain a CMakeLists.txt file.`
Full build log:

```text
CMAKE_TOOLCHAIN_FILE_x86_64-pc-windows-msvc = None
CMAKE_TOOLCHAIN_FILE_x86_64_pc_windows_msvc = None
HOST_CMAKE_TOOLCHAIN_FILE = None
CMAKE_TOOLCHAIN_FILE = None
CMAKE_GENERATOR_x86_64-pc-windows-msvc = None
CMAKE_GENERATOR_x86_64_pc_windows_msvc = None
HOST_CMAKE_GENERATOR = None
CMAKE_GENERATOR = None
CMAKE_PREFIX_PATH_x86_64-pc-windows-msvc = None
CMAKE_PREFIX_PATH_x86_64_pc_windows_msvc = None
HOST_CMAKE_PREFIX_PATH = None
CMAKE_PREFIX_PATH = None
CMAKE_x86_64-pc-windows-msvc = None
CMAKE_x86_64_pc_windows_msvc = None
HOST_CMAKE = None
CMAKE = None
running: "cmake" "C:\\Users\\Jacob\\scoop\\persist\\rustup\\.cargo\\registry\\src\\index.crates.io-1949cf8c6b5b557f\\llama-cpp-sys-2-0.1.118\\llama.cpp" "-G" "Visual Studio 17 2022" "-Thost=x64" "-Ax64" "-DLLAMA_BUILD_TESTS=OFF" "-DLLAMA_BUILD_EXAMPLES=OFF" "-DLLAMA_BUILD_SERVER=OFF" "-DLLAMA_BUILD_TOOLS=OFF" "-DLLAMA_CURL=OFF" "-DLLAMA_BUILD_COMMON=ON" "-DLLAMA_BUILD_TOOLS=ON" "-DCMAKE_BUILD_PARALLEL_LEVEL=12" "-DBUILD_SHARED_LIBS=OFF" "-DGGML_OPENMP=ON" "-DCMAKE_INSTALL_PREFIX=R:\\target\\debug\\build\\llama-cpp-sys-2-ec27c7a4781527a7\\out" "-DCMAKE_C_FLAGS= /O2 /DNDEBUG /Ob2 -nologo -MD -Brepro" "-DCMAKE_C_FLAGS_RELEASE= /O2 /DNDEBUG /Ob2 -nologo -MD -Brepro" "-DCMAKE_CXX_FLAGS= /O2 /DNDEBUG /Ob2 -nologo -MD -Brepro" "-DCMAKE_CXX_FLAGS_RELEASE= /O2 /DNDEBUG /Ob2 -nologo -MD -Brepro" "-DCMAKE_ASM_FLAGS= -nologo -MD -Brepro" "-DCMAKE_ASM_FLAGS_RELEASE= -nologo -MD -Brepro" "-DCMAKE_BUILD_TYPE=Release"
-- Selecting Windows SDK version 10.0.26100.0 to target Windows 10.0.22631.
-- The C compiler identification is MSVC 19.44.35214.0
-- The CXX compiler identification is MSVC 19.44.35214.0
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Check for working C compiler: C:/Program Files (x86)/Microsoft Visual Studio/2022/BuildTools/VC/Tools/MSVC/14.44.35207/bin/Hostx64/x64/cl.exe - skipped
-- Detecting C compile features
-- Detecting C compile features - done
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Check for working CXX compiler: C:/Program Files (x86)/Microsoft Visual Studio/2022/BuildTools/VC/Tools/MSVC/14.44.35207/bin/Hostx64/x64/cl.exe - skipped
-- Detecting CXX compile features
-- Detecting CXX compile features - done
-- Found Git: C:/Users/Jacob/scoop/shims/git.exe (found version "2.50.1.windows.1")
-- Performing Test CMAKE_HAVE_LIBC_PTHREAD
-- Performing Test CMAKE_HAVE_LIBC_PTHREAD - Failed
-- Looking for pthread_create in pthreads
-- Looking for pthread_create in pthreads - not found
-- Looking for pthread_create in pthread
-- Looking for pthread_create in pthread - not found
-- Found Threads: TRUE
-- ccache found, compilation results will be cached. Disable with GGML_CCACHE=OFF.
-- CMAKE_SYSTEM_PROCESSOR: AMD64
-- CMAKE_GENERATOR_PLATFORM: x64
-- GGML_SYSTEM_ARCH: x86
-- Including CPU backend
-- Found OpenMP_C: -openmp (found version "2.0")
-- Found OpenMP_CXX: -openmp (found version "2.0")
-- Found OpenMP: TRUE (found version "2.0")
-- x86 detected
-- Performing Test HAS_AVX_1
-- Performing Test HAS_AVX_1 - Success
-- Performing Test HAS_AVX2_1
-- Performing Test HAS_AVX2_1 - Success
-- Performing Test HAS_FMA_1
-- Performing Test HAS_FMA_1 - Success
-- Performing Test HAS_AVX512_1
-- Performing Test HAS_AVX512_1 - Failed
-- Performing Test HAS_AVX512_2
-- Performing Test HAS_AVX512_2 - Failed
-- Adding CPU backend variant ggml-cpu: /arch:AVX2 GGML_AVX2;GGML_FMA;GGML_F16C
-- ggml version: 0.0.0
-- ggml commit:  unknown
-- Configuring incomplete, errors occurred!
```

```text
--- stderr
fatal: not a git repository (or any of the parent directories): .git
fatal: not a git repository (or any of the parent directories): .git
CMake Warning at common/CMakeLists.txt:32 (message):
  Git repository not found; to enable automatic generation of build info,
  make sure Git is installed and the project is a Git repository.

CMake Error at CMakeLists.txt:206 (add_subdirectory):
  The source directory

    C:/Users/Jacob/scoop/persist/rustup/.cargo/registry/src/index.crates.io-1949cf8c6b5b557f/llama-cpp-sys-2-0.1.118/llama.cpp/tools

  does not contain a CMakeLists.txt file.

thread 'main' panicked at C:\Users\Jacob\scoop\persist\rustup\.cargo\registry\src\index.crates.io-1949cf8c6b5b557f\cmake-0.1.54\src\lib.rs:1119:5:

command did not execute successfully, got: exit code: 1

build script failed, must exit now
note: run with RUST_BACKTRACE=1 environment variable to display a backtrace
```

This fixes that by including llama.cpp/tools/CMakeLists.txt in the list of files published in llama-cpp-sys-2's Cargo.toml.

Will continue testing to see if other files like tools/mtmd/CMakeLists.txt also need to be included.
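For illustration, a sketch of what the published-files list in llama-cpp-sys-2's Cargo.toml could look like after such a change. The surrounding entries here are hypothetical placeholders; only the tools patterns are the ones discussed in this PR:

```toml
[package]
name = "llama-cpp-sys-2"
# `include` controls exactly which files are packaged for crates.io;
# anything omitted is absent from the published crate tarball.
include = [
    "build.rs",
    "src/**",
    # Hypothetical existing entries for the vendored llama.cpp sources...
    "llama.cpp/CMakeLists.txt",
    # Required when building with the `mtmd` feature; without these,
    # CMake fails with "does not contain a CMakeLists.txt file":
    "llama.cpp/tools/CMakeLists.txt",
    "llama.cpp/tools/mtmd/**",
]
```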

@babichjacob marked this pull request as draft August 19, 2025
@babichjacob changed the title from "Include llama.cpp/tools/CMakeLists.txt for publishing in llama-cpp-sys-2" to "Include mtmd files for publishing in llama-cpp-sys-2" on Aug 19, 2025
@babichjacob

Paused work on this after a few minutes, once errors like these showed up:

```text
CMake Error at tools/CMakeLists.txt:17 (add_subdirectory):
  add_subdirectory given source "batched-bench" which is not an existing
  directory.

CMake Error at tools/CMakeLists.txt:18 (add_subdirectory):
  add_subdirectory given source "gguf-split" which is not an existing
  directory.

CMake Error at tools/CMakeLists.txt:19 (add_subdirectory):
  add_subdirectory given source "imatrix" which is not an existing directory.

CMake Error at tools/CMakeLists.txt:20 (add_subdirectory):
  add_subdirectory given source "llama-bench" which is not an existing
  directory.

CMake Error at tools/CMakeLists.txt:21 (add_subdirectory):
  add_subdirectory given source "main" which is not an existing directory.

CMake Error at tools/CMakeLists.txt:22 (add_subdirectory):
  add_subdirectory given source "perplexity" which is not an existing
  directory.

CMake Error at tools/CMakeLists.txt:23 (add_subdirectory):
  add_subdirectory given source "quantize" which is not an existing
  directory.

CMake Error at tools/CMakeLists.txt:27 (add_subdirectory):
  add_subdirectory given source "run" which is not an existing directory.

CMake Error at tools/CMakeLists.txt:28 (add_subdirectory):
  add_subdirectory given source "tokenize" which is not an existing
  directory.

CMake Error at tools/CMakeLists.txt:29 (add_subdirectory):
  add_subdirectory given source "tts" which is not an existing directory.

CMake Error at tools/CMakeLists.txt:36 (add_subdirectory):
  add_subdirectory given source "cvector-generator" which is not an existing
  directory.

CMake Error at tools/CMakeLists.txt:37 (add_subdirectory):
  add_subdirectory given source "export-lora" which is not an existing
  directory.
```

All these directories that need to be included showed up, and I started to wonder whether the entire llama.cpp repository should be included, or whether we should continue figuring out precisely what is needed.

(I'll come back - energy allowing - later)

@MarcusDunn

> all these directories that need to be included showed up and I started to wonder if the entire llama.cpp repository should be included or if we should continue figuring out what precisely is needed.

We used to do this but ran into crates.io upload size limits.

@fidoriel

@MarcusDunn I think it is possible to request a bigger crate upload size. The default is 10 MiB, I believe, but I had a PR in another project where the maintainer got a bigger size approved for sys crates.

Until then, I think depending on the git repo is the best way to use mtmd?
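For anyone who needs mtmd before a fixed version is published, a git dependency sketch like the following should work. The crate name, repository URL, and feature name here are my assumptions; check them against the actual repository:

```toml
[dependencies]
# Hypothetical: pull the crate from git so the full llama.cpp tree
# (including llama.cpp/tools) is available, bypassing the crates.io
# packaging that omits tools/CMakeLists.txt.
llama-cpp-2 = { git = "https://github.com/utilityai/llama-cpp-rs", features = ["mtmd"] }
```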

hongkongkiwi added a commit to hongkongkiwi/llama-cpp-rs-2 that referenced this pull request Aug 23, 2025
…feature

Add missing tools/CMakeLists.txt and tools/mtmd/CMakeLists.txt to the include
patterns in Cargo.toml. These files are required when building with the 'mtmd'
feature enabled, as CMake attempts to build the tools directory but the
CMakeLists.txt files were not included in the published crate.

Fixes build error: "CMake Error: The source directory [...]/tools does not contain a CMakeLists.txt file"

Resolves issues identified in PR utilityai#806 for MTMD multimodal support.

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <[email protected]>
hongkongkiwi added a commit to hongkongkiwi/llama-cpp-rs-2 that referenced this pull request Aug 23, 2025
… building

Replace static tools CMakeLists.txt approach with dynamic generation based on
enabled features. This creates a scalable, merge-friendly solution for our
feature branches.

Key improvements:
- Dynamic CMakeLists.txt generation via generate_tools_cmake() function
- Only builds tools for enabled features (currently just MTMD)
- Designed for easy extension by other feature branches
- No static files to conflict during merges
- Clear extension points with commented examples

Architecture:
- generate_tools_cmake() creates tools/CMakeLists.txt at build time
- any_tool_features flag determines if tools directory should be built
- Each feature branch can add their tool by uncommenting their section

This solves the original PR utilityai#806 issue (avoiding building all tools) while
providing a foundation for RPC, server, quantize, and other tool features.

Future branches can easily add their tools by:
1. Adding their feature to any_tool_features check
2. Uncommenting their add_subdirectory() line in generate_tools_cmake()
3. Including their tool's CMakeLists.txt in Cargo.toml

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <[email protected]>
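As a rough illustration of the dynamic-generation idea described in that commit message: `generate_tools_cmake()` and the `any_tool_features` flag are names taken from the message, but the body below is a hypothetical sketch, not the actual build script. A real build.rs would read Cargo's `CARGO_FEATURE_*` environment variables rather than take a slice parameter:

```rust
use std::fs;
use std::io;
use std::path::Path;

/// Hypothetical sketch: write a minimal tools/CMakeLists.txt at build time,
/// containing add_subdirectory() lines only for tools whose Cargo features
/// are enabled. Returns the `any_tool_features` flag, i.e. whether the
/// tools directory should be built at all.
fn generate_tools_cmake(tools_dir: &Path, enabled_features: &[&str]) -> io::Result<bool> {
    let mut lines = vec!["# Generated by build.rs; do not edit.".to_string()];
    let mut any_tool_features = false;

    // Currently only the `mtmd` feature maps to a tool subdirectory.
    // Future feature branches would add their own mapping here.
    if enabled_features.contains(&"mtmd") {
        lines.push("add_subdirectory(mtmd)".to_string());
        any_tool_features = true;
    }

    // Only emit the file (and create the directory) when a tool is enabled,
    // so feature-less builds skip the tools directory entirely.
    if any_tool_features {
        fs::create_dir_all(tools_dir)?;
        fs::write(tools_dir.join("CMakeLists.txt"), lines.join("\n"))?;
    }
    Ok(any_tool_features)
}
```

Keeping the feature-to-tool mapping in one function is what makes this merge-friendly: each feature branch adds a single `if` block rather than editing a shared static file.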
hongkongkiwi added a commit to hongkongkiwi/llama-cpp-rs-2 that referenced this pull request Aug 23, 2025
…r split-model-loading

This commit adds the scalable dynamic tools building system to the split-model-loading branch:

- Adds generate_tools_cmake() function to dynamically create tools/CMakeLists.txt
- Only builds tools for enabled features (solving PR utilityai#806 issue)
- Split model loading doesn't require tools but maintains architecture consistency
- Includes tools/CMakeLists.txt in Cargo.toml for build system compatibility
- Uses feature-based conditional compilation for future extensibility

This creates a merge-friendly architecture where each feature branch can extend
tool building without conflicts.

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <[email protected]>
hongkongkiwi added a commit to hongkongkiwi/llama-cpp-rs-2 that referenced this pull request Aug 23, 2025
Merges the complete MTMD (multimodal) feature implementation including:
- Dynamic tools CMakeLists.txt generation system
- MTMD library and CLI tool building when feature is enabled
- Solves PR utilityai#806 issue by only building tools for enabled features
- Maintains backward compatibility and merge-friendly architecture
- Includes all necessary MTMD source files and CMakeLists.txt files

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <[email protected]>