
Does this crate support ncnn? #64

@Raj2032

Search before asking

  • I have searched the Ultralytics issues and discussions and found no similar questions.

Question

I have this code:

use ultralytics_inference::{YOLOModel, InferenceConfig};

fn main() {
    // Load your custom NCNN model (auto-detects format)
    let mut model = YOLOModel::load("custom_ncnn_model/").unwrap();
    
    // Configure exactly like Python predict params
    let mut config = InferenceConfig::default();
    config.confidence_threshold = 0.6;           // Confidence threshold
    //config. = true;          // Real-time display window
    //config.line_thickness = 2;   // Box line thickness
    config.save = false;         // No output saving
    //config.stream = true;        // Process video frame-by-frame

    // Run prediction (handles video streaming + visualization automatically)
    //model.predict_config("video.mp4", config);
    model.predict("video.mp4").unwrap();
}

However I get this error:

thread 'main' (100562) panicked at src/main.rs:6:59:
called `Result::unwrap()` on an `Err` value: ModelLoadError("Failed to load model: Load model from custom_ncnn_model/ failed:Protobuf parsing failed.")
note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace

Doesn't this crate support ncnn models? Do I have to use ONNX instead?
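
If ONNX is the expected route, I assume the usage would look roughly like the sketch below (yolo11n.onnx is just a placeholder for a model exported with `yolo export model=yolo11n.pt format=onnx`; I'm guessing the loader wants a single .onnx file rather than a directory, since the error mentions protobuf parsing):

use ultralytics_inference::YOLOModel;

fn main() {
    // Assumption: the loader takes the path to one exported .onnx file,
    // e.g. produced by `yolo export model=yolo11n.pt format=onnx`.
    let mut model = YOLOModel::load("yolo11n.onnx").expect("failed to load ONNX model");

    // Same predict call as in the snippet above, just with the ONNX-backed model.
    model.predict("video.mp4").expect("inference failed");
}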

Additional

No response
