
Raspberry Pi 4B, C++ deployment: createSession() returns Segmentation fault before forward inference #275

@codingSellena

Description


Following the steps in the article https://zhuanlan.zhihu.com/p/672633849, I deployed yolov5lite on a Raspberry Pi 4B using the provided int8 quantized model (https://drive.google.com/drive/folders/1mSU8g94c77KKsHC-07p5V3tJOZYPQ-g6?usp=sharing). Running inference on the Pi produces a Segmentation fault:

pi@raspberrypi:~/MNN/MNN_demo/mnn/build $ ./main /home/pi/MNN/MNN_demo/mnn/v5lite-e-mnnd-i8-ppog.mnn
create Session Start
The device support i8sdot:0, support fp16:0, support i8mm: 0
The device support i8sdot:0, support fp16:0, support i8mm: 0
Segmentation fault
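(Not part of the original report, but one cheap pre-check before handing the path to MNN: confirm the .mnn file actually downloaded intact, since Google Drive sometimes saves an HTML interstitial page instead of the binary. A minimal stdlib-only sketch; the helper name `looks_like_mnn_file` is my own invention, not an MNN API.)

```cpp
#include <fstream>
#include <string>

// Hypothetical pre-check: verify the model file is readable, non-empty,
// and is not an HTML error page saved by a failed Google Drive download.
static bool looks_like_mnn_file(const std::string &path) {
    std::ifstream f(path, std::ios::binary);
    if (!f) return false;                        // missing or unreadable
    char head[6] = {0};
    f.read(head, 5);
    if (f.gcount() == 0) return false;           // empty file
    return std::string(head).rfind("<", 0) != 0; // HTML pages start with '<'
}
```

If this check fails, the segfault is explained before MNN is involved at all; if it passes, the file at least looks like a binary blob and the problem is further downstream.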

The code used, from cpp/mnn/src/main.cpp, is as follows:

    std::shared_ptr<MNN::Interpreter> net(MNN::Interpreter::createFromFile(model_name.c_str()));
    if (nullptr == net)
    {
        printf("nullptr == net\n");
        return 0;
    }

    MNN::ScheduleConfig config;
    config.numThread = 4;
    config.type = static_cast<MNNForwardType>(MNN_FORWARD_CPU);
    MNN::BackendConfig backendConfig;
    //backendConfig.precision = (MNN::BackendConfig::PrecisionMode)1;
    backendConfig.precision = MNN::BackendConfig::Precision_Low_BF16;
    config.backendConfig = &backendConfig;

    printf("create Session Start\n");
    MNN::Session *session = net->createSession(config);
    if (nullptr == session) {
        // Handle error: session creation failed
        printf("create Session Fail\n");
        return -1; // -1 indicates an error
    }
    printf("create Session Finish\n");
    std::vector<BoxInfo> bbox_collection;
    cv::Mat image;
    MatInfo mmat_objection;
    mmat_objection.inpSize = 320;
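(One experiment worth trying, my speculation rather than anything the thread confirms: the startup log reports `support fp16:0`, yet the code requests `Precision_Low_BF16`, i.e. the reduced-precision BF16 path. Re-running with `MNN::BackendConfig::Precision_Normal` would rule out a crash inside the low-precision kernels. A stdlib-only sketch of gating the choice on a capability flag; the enum and the `has_fp16` flag here are stand-ins, not the real MNN API.)

```cpp
// Stand-in for MNN::BackendConfig::PrecisionMode (illustrative only).
enum PrecisionMode { Precision_Normal = 0, Precision_High, Precision_Low, Precision_Low_BF16 };

// Heuristic: only request a reduced-precision path when the device
// actually advertises support for it; otherwise fall back to normal.
PrecisionMode pick_precision(bool has_fp16) {
    return has_fp16 ? Precision_Low_BF16 : Precision_Normal;
}
```

In the real code this amounts to replacing the `Precision_Low_BF16` assignment with `Precision_Normal` and re-running; if the segfault disappears, the problem is in the BF16 path rather than in the model file itself.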

Further testing shows that the ./GetMNNInfo tool also fails to print information for this int8 quantized model. With a model I trained myself, every MNN model along the onnx -> mnn -> i8mnn chain prints its information fine. Looking at GetMNNInfo.cpp, the failing line appears to be `std::shared_ptr<Module> module(Module::load(empty, empty, argv[1]));`.
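(Since GetMNNInfo fails on the downloaded model but succeeds on the self-converted ones, comparing the first bytes of a known-good .mnn file against the failing one can show whether the download is truncated, an HTML page, or simply from an incompatible MNN version. A small stdlib-only dump helper; this is a hypothetical diagnostic, not an MNN tool.)

```cpp
#include <cstdio>
#include <fstream>
#include <string>
#include <vector>

// Return the first n bytes of a file as a hex string, e.g. "1c 00 00 00".
// Running this on a working and a failing .mnn file and comparing the
// output quickly shows whether the failing file is a different format.
std::string hex_head(const std::string &path, size_t n = 16) {
    std::ifstream f(path, std::ios::binary);
    if (!f) return "";
    std::vector<unsigned char> buf(n);
    f.read(reinterpret_cast<char *>(buf.data()), static_cast<std::streamsize>(n));
    size_t got = static_cast<size_t>(f.gcount());
    std::string out;
    char tmp[4];
    for (size_t i = 0; i < got; ++i) {
        std::snprintf(tmp, sizeof(tmp), "%02x ", buf[i]);
        out += tmp;
    }
    if (!out.empty()) out.pop_back(); // drop trailing space
    return out;
}
```

If the two headers differ wildly (for example the failing file starts with ASCII `<`), the model file itself is the culprit rather than the createSession() code path.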

Any reply is greatly appreciated!
