
Conversation

@bugparty (Owner) commented Sep 1, 2025

Summary

  • Document CMake build steps and the curl dependency in AGENTS.md

Testing

  • cmake -B build
  • cmake --build build --config Release -j 8 (fails: command interrupted while building llama-server)
  • pre-commit run --files AGENTS.md (fails: dependency conflict installing flake8-no-print)

https://chatgpt.com/codex/tasks/task_e_68b4db828abc832796e948429e521d6f


Important

Adds AGENTS.md with build instructions, detailing prerequisites and build commands for source compilation.

  • Documentation:
    • Adds AGENTS.md with build instructions for compiling from source.
    • Details prerequisites: CMake, C/C++ compiler, libcurl development files.
    • Provides Debian/Ubuntu specific installation commands.
    • Includes build commands with options for parallel compilation and disabling curl.

This description was created by Ellipsis for c72d33b.

Summary by CodeRabbit

  • Documentation
    • Introduced AGENTS.md with step-by-step build-from-source instructions.
    • Outlines prerequisites (CMake, C/C++ compiler toolchain, libcurl development headers).
    • Provides Debian/Ubuntu setup commands using apt-get.
    • Details build commands: cmake -B build and cmake --build build --config Release.
    • Notes curl support is enabled by default and documents optional flags: -j for parallel builds and -DLLAMA_CURL=OFF to disable curl.
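
Pulled together, the documented workflow on Debian/Ubuntu amounts to roughly the following (a sketch assembled from the summary above; the authoritative wording lives in AGENTS.md):

```bash
# Prerequisites (Debian/Ubuntu)
sudo apt-get install build-essential cmake libcurl4-openssl-dev

# Configure and build (curl support is enabled by default)
cmake -B build
cmake --build build --config Release -j 8

# Optional: disable curl support at configure time
# cmake -B build -DLLAMA_CURL=OFF
```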


coderabbitai bot commented Sep 1, 2025

Walkthrough

Adds AGENTS.md documenting build-from-source prerequisites, Debian/Ubuntu package installs, CMake configure/build commands, default curl support, and optional flags for parallelism and disabling curl.

Changes

Cohort / File(s): Docs — Build From Source — AGENTS.md
Summary: New documentation file with prerequisites (CMake, C/C++ toolchain, libcurl dev headers), apt-get install examples, CMake configure/build steps, note on default curl support, and optional flags (-j, -DLLAMA_CURL=OFF).

Estimated code review effort

🎯 1 (Trivial) | ⏱️ ~3 minutes

Poem

I thump my paws, a builder’s cheer,
New scrolls appear, the path is clear—
cmake hops, the targets near,
Curl by default, no need to fear;
Flip a flag, disable here—
Parallel nibble, compile gear—
Ship it swift, my warren dear! 🐇🛠️

@bugparty bugparty changed the base branch from master to dev September 1, 2025 00:14

@ellipsis-dev ellipsis-dev bot left a comment


Important

Looks good to me! 👍

Reviewed everything up to c72d33b in 32 seconds.
  • Reviewed 27 lines of code in 1 file
  • Skipped 0 files when reviewing.
  • Skipped posting 3 draft comments. View those below.
1. AGENTS.md:10
  • Draft comment:
    Consider adding a note for non-Debian systems (e.g., Fedora, Arch) as the instructions are Debian/Ubuntu specific.
  • Reason this comment was not posted:
    Confidence changes required: 33% <= threshold 50% None
2. AGENTS.md:8
  • Draft comment:
    Explain why libcurl is required and what features depend on it.
  • Reason this comment was not posted:
    Confidence changes required: 33% <= threshold 50% None
3. AGENTS.md:21
  • Draft comment:
    Include a concrete example for parallel compilation (e.g., 'cmake --build build --config Release -j8').
  • Reason this comment was not posted:
    Confidence changes required: 33% <= threshold 50% None

Workflow ID: wflow_7PF3J9RFD55J5d8R



@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 1

🧹 Nitpick comments (4)
AGENTS.md (4)

5-9: Broaden prerequisites by OS and add pkg-config

Add OS-specific hints and pkg-config to reduce first-time setup friction, especially for finding libcurl.

Apply:

 - Install prerequisites:
   - CMake
   - C/C++ compiler toolchain (e.g. `build-essential`)
   - Development files for libcurl (curl support is enabled by default)
+ - Install prerequisites:
+   - CMake
+   - C/C++ compiler toolchain (e.g. `build-essential`)
+   - pkg-config
+   - Development files for libcurl (curl support is enabled by default)
+
+  On macOS (Homebrew):
+  ```bash
+  brew install cmake pkg-config curl
+  ```
+
+  On Windows:
+  - Install “Visual Studio 2022 Build Tools” (C++ workload) or full Visual Studio.
+  - Optional: install curl via vcpkg if enabling curl: `vcpkg install curl[openssl]`

10-14: Tweak Debian/Ubuntu packages

pkg-config is commonly needed by CMake find modules. Consider adding it.

-  sudo apt-get install build-essential cmake libcurl4-openssl-dev
+  sudo apt-get install build-essential cmake pkg-config libcurl4-openssl-dev

3-21: Optional: add a brief Troubleshooting section

Given the reported failures, a short section helps newcomers past common blockers (missing curl headers, wrong generator, server not enabled).

Append:

+## Troubleshooting
+ - CMake cannot find curl: ensure libcurl dev headers are installed and `pkg-config --exists libcurl` succeeds.
+ - `--config Release` has no effect: use `-DCMAKE_BUILD_TYPE=Release` on Makefiles/Ninja.
+ - Building server fails: verify the server option/target is enabled (see above) and curl is available.

21-21: Document CMake curl and server build flags

Confirmed CMake defines LLAMA_CURL (root CMakeLists.txt) and LLAMA_BUILD_SERVER (root CMakeLists.txt), and the server executable is llama-server (tools/server/CMakeLists.txt). Update AGENTS.md accordingly:

- Add `-j` to the build command for parallel compilation or pass `-DLLAMA_CURL=OFF` to disable curl.
+ Use `--parallel` for faster builds. To disable curl, pass `-DLLAMA_CURL=OFF`.  
+ To enable and build the server example, pass `-DLLAMA_BUILD_SERVER=ON` and build the `llama-server` target:
+ ```bash
+ cmake --build build --config Release --parallel
+ cmake --build build --config Release --target llama-server
+ ```
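
A full configure-plus-build sequence using the flags named above might look like this (a sketch; whether LLAMA_BUILD_SERVER is already enabled by default is not confirmed here):

```bash
# Configure with the server target enabled (flag name as confirmed above)
cmake -B build -DLLAMA_BUILD_SERVER=ON

# Build only the server binary, in Release, in parallel
cmake --build build --config Release --parallel --target llama-server
```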
📜 Review details

Configuration used: CodeRabbit UI

Review profile: CHILL

Plan: Pro


📥 Commits

Reviewing files that changed from the base of the PR and between ef47691 and c72d33b.

📒 Files selected for processing (1)
  • AGENTS.md (1 hunks)
🧰 Additional context used
🪛 LanguageTool
AGENTS.md

[grammar] ~5-~5: There might be a mistake here.
Context: ...ld from source - Install prerequisites: - CMake - C/C++ compiler toolchain (e.g....

(QB_NEW_EN)


[grammar] ~6-~6: There might be a mistake here.
Context: ...urce - Install prerequisites: - CMake - C/C++ compiler toolchain (e.g. `build-es...

(QB_NEW_EN)


[grammar] ~7-~7: There might be a mistake here.
Context: ...piler toolchain (e.g. build-essential) - Development files for libcurl (curl supp...

(QB_NEW_EN)

🔇 Additional comments (1)
AGENTS.md (1)

1-4: Good starting structure and scoping

Clear headings and scope. Nice to see curl called out up front.

Comment on lines +16 to +20
- Configure and build:
```bash
cmake -B build
cmake --build build --config Release
```

🛠️ Refactor suggestion

Use generator-correct Release config and portable parallel flag

On single-config generators (Unix Makefiles/Ninja), --config Release is ignored; set -DCMAKE_BUILD_TYPE=Release. Use --parallel (CMake ≥3.12) instead of relying on -j passthrough.

- - Configure and build:
-   ```bash
-   cmake -B build
-   cmake --build build --config Release
-   ```
+ - Configure and build:
+   - Single-config generators (Unix Makefiles/Ninja):
+     ```bash
+     cmake -B build -DCMAKE_BUILD_TYPE=Release
+     cmake --build build --parallel
+     ```
+   - Multi-config generators (Visual Studio/Xcode):
+     ```bash
+     cmake -B build
+     cmake --build build --config Release --parallel
+     ```
🤖 Prompt for AI Agents
In AGENTS.md around lines 16 to 20, the build instructions use --config Release
and assume -j passthrough which is incorrect for single-config generators;
update the docs to show two cases: for single-config generators (Unix
Makefiles/Ninja) add -DCMAKE_BUILD_TYPE=Release when invoking cmake -B build and
use cmake --build build --parallel, and for multi-config generators (Visual
Studio/Xcode) keep cmake -B build and use cmake --build build --config Release
--parallel; replace the existing two-line snippet with these two short examples
and note which generators they apply to.

bugparty pushed a commit that referenced this pull request Sep 17, 2025
…gml-org#16038)

Initializing RESERVED_NAME in is_reserved_name() is not thread-safe
and leads to corrupted memory when used from multiple threads, as can
be seen in the ASan trace below. This fixes the initialization to make
it thread-safe.

    #0 0x000100abd018 in std::__1::pair<std::__1::__hash_iterator<std::__1::__hash_node<std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char>>, void*>*>, bool> std::__1::__hash_table<std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char>>, std::__1::hash<std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char>>>, std::__1::equal_to<std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char>>>, std::__1::allocator<std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char>>>>::__emplace_unique_key_args<std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char>>, std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char>> const&>(std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char>> const&, std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char>> const&) __hash_table:1565
    #1 0x000100ab0320 in SchemaConverter::visit(nlohmann::json_abi_v3_12_0::basic_json<nlohmann::json_abi_v3_12_0::ordered_map, std::__1::vector, std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char>>, bool, long long, unsigned long long, double, std::__1::allocator, nlohmann::json_abi_v3_12_0::adl_serializer, std::__1::vector<unsigned char, std::__1::allocator<unsigned char>>, void> const&, std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char>> const&) json-schema-to-grammar.cpp:802
    ggml-org#2 0x000100aafc48 in std::__1::__function::__func<build_grammar(std::__1::function<void (common_grammar_builder const&)> const&, common_grammar_options const&)::$_2, std::__1::allocator<build_grammar(std::__1::function<void (common_grammar_builder const&)> const&, common_grammar_options const&)::$_2>, std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char>> (std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char>> const&, nlohmann::json_abi_v3_12_0::basic_json<nlohmann::json_abi_v3_12_0::ordered_map, std::__1::vector, std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char>>, bool, long long, unsigned long long, double, std::__1::allocator, nlohmann::json_abi_v3_12_0::adl_serializer, std::__1::vector<unsigned char, std::__1::allocator<unsigned char>>, void> const&)>::operator()(std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char>> const&, nlohmann::json_abi_v3_12_0::basic_json<nlohmann::json_abi_v3_12_0::ordered_map, std::__1::vector, std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char>>, bool, long long, unsigned long long, double, std::__1::allocator, nlohmann::json_abi_v3_12_0::adl_serializer, std::__1::vector<unsigned char, std::__1::allocator<unsigned char>>, void> const&) function.h:319
    ggml-org#3 0x000100a2c938 in std::__1::__function::__func<common_chat_params_init_llama_3_x(minja::chat_template const&, templates_params const&, bool)::$_0::operator()(common_grammar_builder const&) const::'lambda'(nlohmann::json_abi_v3_12_0::basic_json<nlohmann::json_abi_v3_12_0::ordered_map, std::__1::vector, std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char>>, bool, long long, unsigned long long, double, std::__1::allocator, nlohmann::json_abi_v3_12_0::adl_serializer, std::__1::vector<unsigned char, std::__1::allocator<unsigned char>>, void> const&), std::__1::allocator<common_chat_params_init_llama_3_x(minja::chat_template const&, templates_params const&, bool)::$_0::operator()(common_grammar_builder const&) const::'lambda'(nlohmann::json_abi_v3_12_0::basic_json<nlohmann::json_abi_v3_12_0::ordered_map, std::__1::vector, std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char>>, bool, long long, unsigned long long, double, std::__1::allocator, nlohmann::json_abi_v3_12_0::adl_serializer, std::__1::vector<unsigned char, std::__1::allocator<unsigned char>>, void> const&)>, void (nlohmann::json_abi_v3_12_0::basic_json<nlohmann::json_abi_v3_12_0::ordered_map, std::__1::vector, std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char>>, bool, long long, unsigned long long, double, std::__1::allocator, nlohmann::json_abi_v3_12_0::adl_serializer, std::__1::vector<unsigned char, std::__1::allocator<unsigned char>>, void> const&)>::operator()(nlohmann::json_abi_v3_12_0::basic_json<nlohmann::json_abi_v3_12_0::ordered_map, std::__1::vector, std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char>>, bool, long long, unsigned long long, double, std::__1::allocator, nlohmann::json_abi_v3_12_0::adl_serializer, std::__1::vector<unsigned char, std::__1::allocator<unsigned char>>, void> const&) function.h:319
    ggml-org#4 0x000100a139f8 in foreach_function(nlohmann::json_abi_v3_12_0::basic_json<nlohmann::json_abi_v3_12_0::ordered_map, std::__1::vector, std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char>>, bool, long long, unsigned long long, double, std::__1::allocator, nlohmann::json_abi_v3_12_0::adl_serializer, std::__1::vector<unsigned char, std::__1::allocator<unsigned char>>, void> const&, std::__1::function<void (nlohmann::json_abi_v3_12_0::basic_json<nlohmann::json_abi_v3_12_0::ordered_map, std::__1::vector, std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char>>, bool, long long, unsigned long long, double, std::__1::allocator, nlohmann::json_abi_v3_12_0::adl_serializer, std::__1::vector<unsigned char, std::__1::allocator<unsigned char>>, void> const&)> const&) chat.cpp:762
    ggml-org#5 0x000100a2a7f4 in std::__1::__function::__func<common_chat_params_init_llama_3_x(minja::chat_template const&, templates_params const&, bool)::$_0, std::__1::allocator<common_chat_params_init_llama_3_x(minja::chat_template const&, templates_params const&, bool)::$_0>, void (common_grammar_builder const&)>::operator()(common_grammar_builder const&) function.h:319
    ggml-org#6 0x000100aa98f4 in build_grammar(std::__1::function<void (common_grammar_builder const&)> const&, common_grammar_options const&) json-schema-to-grammar.cpp:982
    ggml-org#7 0x0001009c9314 in common_chat_params_init_llama_3_x(minja::chat_template const&, templates_params const&, bool) chat.cpp:1110
    ggml-org#8 0x0001009b8afc in common_chat_templates_apply_jinja(common_chat_templates const*, common_chat_templates_inputs const&) chat.cpp:1992
    ggml-org#9 0x0001009b533c in common_chat_templates_apply(common_chat_templates const*, common_chat_templates_inputs const&) chat.cpp:2074
    ggml-org#10 0x000100810120 in llamacpp_apply_chat_template+0x724 (predict_oai-98384e17fb94e863:arm64+0x100090120)
    ...

==45482==Register values:
 x[0] = 0x00006020004147f8   x[1] = 0x00006080000013c8   x[2] = 0x0000000000000000   x[3] = 0x0000604006289738
 x[4] = 0x0000000000000002   x[5] = 0x0000000000000001   x[6] = 0x04034000004b4000   x[7] = 0x0000000000000001
 x[8] = 0xbebebebebebebebe   x[9] = 0x17d7d7d7d7d7d7d7  x[10] = 0x00000c04000828ff  x[11] = 0x0000000000000001
x[12] = 0x000000002018d383  x[13] = 0x0000000000000000  x[14] = 0xfa0000000000fafa  x[15] = 0x000010700001ffff
x[16] = 0x000000019dc012c0  x[17] = 0x00000001021284f8  x[18] = 0x0000000000000000  x[19] = 0x00000001700acdc0
x[20] = 0x0000000000000002  x[21] = 0x000000002018d384  x[22] = 0x16dd16fd2e731151  x[23] = 0x0000007000020000
x[24] = 0x0000000100c69c08  x[25] = 0x0000000100c69c20  x[26] = 0x00006080000013c7  x[27] = 0x0000000100c69c00
x[28] = 0x00000001700acd60     fp = 0x00000001700aceb0     lr = 0x0000000100abce30     sp = 0x00000001700acd60
AddressSanitizer can not provide additional info.
SUMMARY: AddressSanitizer: SEGV __hash_table:1565 in std::__1::pair<std::__1::__hash_iterator<std::__1::__hash_node<std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char>>, void*>*>, bool> std::__1::__hash_table<std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char>>, std::__1::hash<std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char>>>, std::__1::equal_to<std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char>>>, std::__1::allocator<std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char>>>>::__emplace_unique_key_args<std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char>>, std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char>> const&>(std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char>> const&, std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char>> const&)
Thread T5 created by T0 here:
    #0 0x0001020b99d4 in pthread_create+0x5c (libclang_rt.asan_osx_dynamic.dylib:arm64e+0x359d4)
    #1 0x000100873910 in std::sys::pal::unix::thread::Thread::new::h77254fdd87a28e05+0x118 (predict_oai-98384e17fb94e863:arm64+0x1000f3910)
    ggml-org#2 0x0001007c7a1c in test::run_test::haeb3c2bcd5ed6cf6+0x76c (predict_oai-98384e17fb94e863:arm64+0x100047a1c)
    ggml-org#3 0x0001007aedb0 in test::console::run_tests_console::he9d142d704f3a986+0x149c (predict_oai-98384e17fb94e863:arm64+0x10002edb0)
    ggml-org#4 0x0001007c5758 in test::test_main::hf86a5e20735245b9+0x118 (predict_oai-98384e17fb94e863:arm64+0x100045758)
    ggml-org#5 0x0001007c5da0 in test::test_main_static::h61ee9c8fd30abca0+0x54 (predict_oai-98384e17fb94e863:arm64+0x100045da0)
    ...

==45482==ABORTING
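
For readers tracing the fix, the usual remedy for this bug class is to let a function-local static own the container: C++11 guarantees its initialization runs exactly once even when several threads call the function concurrently. A minimal sketch under that assumption (placeholder contents; not the exact code from the referenced commit):

```cpp
#include <string>
#include <unordered_set>

// Sketch of the thread-safe initialization pattern. The names
// is_reserved_name/RESERVED_NAME come from the commit message above;
// the set contents are placeholders.
static bool is_reserved_name(const std::string & name) {
    // Initialized exactly once, even if multiple threads reach this line
    // concurrently (C++11 "magic statics"). The racy variant populates a
    // namespace-scope set lazily behind a plain bool flag instead.
    static const std::unordered_set<std::string> RESERVED_NAME = {
        "root", "dot", "space",
    };
    return RESERVED_NAME.find(name) != RESERVED_NAME.end();
}
```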