Redirect llama.cpp logs into tracing #637
If the user requests different properties for how llama.cpp is built, trigger a rebuild.
Instead of `cp -r` / `robocopy`, build directly from the source directory. This mildly speeds up the build, although the difference is probably not noticeable on NVMe drives. The cmake crate automatically places output in the `out/` folder for us. Additionally, walk the source tree to tell cargo that a rebuild is necessary if anything in the source changes. This ensures that changes to the llama.cpp code trigger a rebuild, which makes hacking on things a bit easier. This copying logic appears to have been copied from sherpa-onnx (the comments seem to be copy-pasted), so remove those references.
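The tree walk described above can be sketched as a small pure-std function (an illustration, not the crate's actual build.rs: the function name and the in-memory `out` vector are ours, and the real code prints the directives directly from build.rs):

```rust
use std::fs;
use std::io;
use std::path::Path;

/// Recursively collect `cargo:rerun-if-changed` directives for every file
/// under `dir`, skipping hidden entries (names starting with '.') such as
/// `.git`, so VCS metadata churn does not trigger rebuilds.
fn emit_rerun_if_changed(dir: &Path, out: &mut Vec<String>) -> io::Result<()> {
    for entry in fs::read_dir(dir)? {
        let entry = entry?;
        if entry.file_name().to_string_lossy().starts_with('.') {
            continue; // hidden entry: ignore
        }
        let path = entry.path();
        if path.is_dir() {
            emit_rerun_if_changed(&path, out)?;
        } else {
            // In a real build.rs this would be println!("cargo:rerun-if-changed=...")
            out.push(format!("cargo:rerun-if-changed={}", path.display()));
        }
    }
    Ok(())
}
```

Cargo re-runs the build script whenever any file named in a `rerun-if-changed` directive has a newer mtime, which is what makes edits inside the llama.cpp submodule trigger a rebuild.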
Force-pushed from 45d1b6a to eb8542e.
MarcusDunn left a comment:
Super thoughtfully done. Thanks for the PR.
The simple example now needs a `--verbose` argument for the llama.cpp logs to be printed to the screen.
Force-pushed from eb8542e to 373f8c6.
I attempted to publish this: https://github.com/utilityai/llama-cpp-rs/actions/runs/13165562823/job/36744752959. Could you take a look, @vlovich?
Hmmmm, @MarcusDunn, I don't see any errors in the build aside from the tarball failing. Maybe try rerunning? Not sure what the issue is.
Looking over your code, I don't see what could cause this, but considering the last release was fine (and this is the only new PR in the release) I think it must be from here. Any ideas?
Could emitting this be the issue?

```rust
for entry in walkdir::WalkDir::new(&llama_src)
    .into_iter()
    .filter_entry(|e| !is_hidden(e))
{
    let entry = entry.expect("Failed to obtain entry");
    let rebuild = entry
        .file_name()
        .to_str()
        .map(|f| f.starts_with("CMake"))
        .unwrap_or_default()
        || rebuild_on_children_of
            .iter()
            .any(|src_folder| entry.path().starts_with(src_folder));
    if rebuild {
        println!("cargo:rerun-if-changed={}", entry.path().display());
    }
}
```
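The rebuild condition in that snippet can be isolated as a pure function for inspection (a sketch: the `should_rerun` name is ours, while the `CMake` prefix check and the watched-folder check mirror the snippet above):

```rust
use std::path::Path;

/// Decide whether a path should emit `cargo:rerun-if-changed`: either it is
/// a CMake file (file name starts with "CMake", e.g. CMakeLists.txt) or it
/// lives under one of the watched source folders.
fn should_rerun(path: &Path, watched: &[&Path]) -> bool {
    let is_cmake = path
        .file_name()
        .and_then(|f| f.to_str())
        .map(|f| f.starts_with("CMake"))
        .unwrap_or_default();
    is_cmake || watched.iter().any(|folder| path.starts_with(folder))
}
```

Note that `Path::starts_with` compares whole path components, so `llama.cpp/src/llama.cpp` matches the watched folder `llama.cpp/src`, but `llama.cpp/srcfoo` would not.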
Oh, that must be something to do with my cleaning up cmake to run directly from the submodule as input without copying to the output, but that's supposed to work fine. Any tips on how I can repro locally?
I don't think it's the `rerun-if-changed`.
They solve the problem like this: https://github.com/edgenai/llama_cpp-rs/blob/main/crates/llama_cpp_sys/include/build-info.h. I'm not a huge fan of this solution, but I also don't know enough to make a great alternative suggestion.
Kk. Sorry I haven't fixed this yet. Planning on taking a look at it in a couple of hours. Worst case I'll undo my build cleanup, but hopefully I can figure out how to make things work. Generating files within the src directory is bad form, even in cmake builds.
Ok I think #639 should fix it. |
After this change, if you don't pass `--verbose`, you'll see that simple.exe doesn't print anything from llama.cpp. If you do, the logs are formatted through the tracing module. This resolves issue #628.
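The forwarding described above can be sketched with a pure-std stand-in (an illustration only: the real crate registers a C log callback with llama.cpp via `llama_log_set` and emits `tracing` events; the `LogLevel` enum and `forward` function here are ours):

```rust
/// Illustrative stand-in for llama.cpp's log levels.
#[derive(Clone, Copy, Debug)]
enum LogLevel {
    Error,
    Warn,
    Info,
    Debug,
}

/// Render a llama.cpp log line the way a forwarding callback might hand it
/// to a tracing subscriber. In the real crate this would call
/// tracing::error!/warn!/info!/debug! instead of returning a String.
fn forward(level: LogLevel, text: &str) -> String {
    let level_str = match level {
        LogLevel::Error => "ERROR",
        LogLevel::Warn => "WARN",
        LogLevel::Info => "INFO",
        LogLevel::Debug => "DEBUG",
    };
    // llama.cpp log lines often carry a trailing newline; trim it so the
    // subscriber doesn't print blank lines.
    format!("{level_str} llama.cpp: {}", text.trim_end())
}
```

With this shape, whether anything reaches the screen is decided by the tracing subscriber's level filter, which is what a `--verbose` flag would toggle in the simple example.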