-
Describe the bug: I want to convert the model.ckpt file in the lightning directory to ONNX or OpenVINO format separately and save metadata.json at the same time. Is there any convenient way to convert?
Dataset: N/A
Model: N/A
Steps to reproduce the behavior: None
OS information: —
Expected behavior: Can you give me sample code?
Screenshots: No response
Pip/GitHub: pip
What version/branch did you use?: No response
Configuration YAML: any
Logs: None
Code of Conduct
Replies: 5 comments
-
Here is my code. The model that comes out of this code cannot detect anything.
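(The code block attached here did not survive in the thread. The following is purely a hypothetical reconstruction of the kind of script being discussed, assuming the anomalib 0.x API: a model is built from the config and exported straight to ONNX without the trained weights from model.ckpt ever being loaded, which matches the problem diagnosed in the replies below. Paths and input size are placeholders.)

```python
# Hypothetical reconstruction -- NOT the original poster's code.
# Assumes the anomalib 0.x API; paths and input size are placeholders.
import torch

from anomalib.config import get_configurable_parameters
from anomalib.models import get_model

config = get_configurable_parameters(config_path="config.yaml")
model = get_model(config)
model.eval()

# The trained weights from model.ckpt are never loaded here, so the
# exported network has random weights and cannot detect anything.
dummy_input = torch.zeros((1, 3, 256, 256))
torch.onnx.export(model.model, dummy_input, "model.onnx", opset_version=11)
```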
-
Exporting is part of the training, but in this case, where you don't want to re-train, I'm not sure what exactly the solution would be. Your code looks fine, but I think the model doesn't have its weights initialized. If you look inside get_model you can see that the state dict is loaded from a relative path (relative to the project root) saved in the config. Additionally, are you using the provided OpenVINO inferencer? There are still pre-processing and post-processing steps involved in successful inference.
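A minimal sketch of the missing pieces, assuming an anomalib 0.x-style checkpoint: load the trained weights from the Lightning checkpoint, export to ONNX, convert to OpenVINO IR, and write a metadata.json from the thresholds stored in the checkpoint. The config path, checkpoint path, input size, and state-dict key names are assumptions that may differ between anomalib versions, so check them against your own setup.

```python
import json

import openvino as ov  # requires a recent openvino (>= 2023.1) for convert_model/save_model
import torch

from anomalib.config import get_configurable_parameters
from anomalib.models import get_model

config = get_configurable_parameters(config_path="config.yaml")  # placeholder path
model = get_model(config)

# Load the trained weights -- without this the export uses random weights.
checkpoint = torch.load("path/to/weights/lightning/model.ckpt", map_location="cpu")
model.load_state_dict(checkpoint["state_dict"])
model.eval()

# Export the inner torch model to ONNX.
dummy_input = torch.zeros((1, 3, 256, 256))  # match your training input size
torch.onnx.export(model.model, dummy_input, "model.onnx", opset_version=11)

# Convert the ONNX file to OpenVINO IR.
ov_model = ov.convert_model("model.onnx")
ov.save_model(ov_model, "model.xml")

# metadata.json: thresholds and normalization statistics live in the checkpoint's
# state_dict. The key names below are assumptions -- inspect
# checkpoint["state_dict"].keys() for your anomalib version.
state_dict = checkpoint["state_dict"]
metadata = {
    key: state_dict[key].item()
    for key in (
        "image_threshold.value",
        "pixel_threshold.value",
        "normalization_metrics.min",
        "normalization_metrics.max",
    )
    if key in state_dict
}
with open("metadata.json", "w") as handle:
    json.dump(metadata, handle, indent=2)
```

If your anomalib version ships an export helper under anomalib.deploy, prefer that, since it writes metadata.json for you; and use the provided OpenVINOInferencer at inference time so the matching pre- and post-processing is applied.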
-
I just found that the model.ckpt file in the
-
model.ckpt in lightning should be the one saved after training. You can load it the way I described in the comment above, or you should add the weight-loading step to your code (see the sketch below). With the current code you are not loading the weights at all.
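(The snippet referred to here did not survive in the thread. It is presumably the standard Lightning weight-loading step, roughly the following; the checkpoint path is a placeholder and `model` is the module returned by `get_model(config)`.)

```python
import torch

# Load the trained weights from the Lightning checkpoint before exporting.
checkpoint = torch.load("path/to/weights/lightning/model.ckpt", map_location="cpu")
model.load_state_dict(checkpoint["state_dict"])
model.eval()
```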
-
Ok, thank you. I just realized that I didn't load the weights.