Closed as not planned
Labels
question: Further information is requested
Description
Search before asking
- I have searched the Ultralytics issues and discussions and found no similar questions.
Question
I have this code:

```rust
use ultralytics_inference::{YOLOModel, InferenceConfig};

fn main() {
    // Load your custom NCNN model (auto-detects format)
    let mut model = YOLOModel::load("custom_ncnn_model/").unwrap();

    // Configure exactly like Python predict params
    let mut config = InferenceConfig::default();
    config.confidence_threshold = 0.6; // Confidence threshold
    //config. = true; // Real-time display window
    //config.line_thickness = 2; // Box line thickness
    config.save = false; // No output saving
    //config.stream = true; // Process video frame-by-frame

    // Run prediction (handles video streaming + visualization automatically)
    //model.predict_config("video.mp4", config);
    model.predict("video.mp4").unwrap();
}
```

However, I get this error:
```
thread 'main' (100562) panicked at src/main.rs:6:59:
called `Result::unwrap()` on an `Err` value: ModelLoadError("Failed to load model: Load model from custom_ncnn_model/ failed:Protobuf parsing failed.")
note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace
```

Doesn't this crate support NCNN models? Do I have to use ONNX instead?
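For what it's worth, the `Protobuf parsing failed` message suggests the loader tried to parse the path as a protobuf-based format (such as ONNX) rather than NCNN's `.param`/`.bin` pair, so the directory path may simply be hitting the wrong backend. If NCNN turns out to be unsupported here, a minimal sketch of the ONNX route, using only the API calls already shown above (`"best.onnx"` is a placeholder for a model exported with the Ultralytics CLI, e.g. `yolo export model=best.pt format=onnx`; whether `YOLOModel::load` accepts a file path like this is an assumption, not confirmed from the crate's docs):

```rust
use ultralytics_inference::{YOLOModel, InferenceConfig};

fn main() {
    // Assumption: load an exported ONNX file instead of the NCNN directory.
    // `expect` gives a clearer message than `unwrap` if loading still fails.
    let mut model = YOLOModel::load("best.onnx").expect("failed to load ONNX model");

    // Same configuration as in the snippet above
    let mut config = InferenceConfig::default();
    config.confidence_threshold = 0.6; // Confidence threshold
    config.save = false;               // No output saving

    model.predict("video.mp4").expect("prediction failed");
}
```

This is only a workaround sketch; whether the crate supports NCNN directly is still the open question for the maintainers.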
Additional
No response