Hi Team,
Starting with 0.10.0, TorchServe introduced open_inference_grpc.proto so that its gRPC APIs can follow the KServe Open Inference Protocol (V2). However, I am wondering why the package name used in that proto differs from the one used in KServe. With different package names, PyTorch and non-PyTorch models end up needing different proto definitions even though both follow the open inference protocol. Would it be possible to put open_inference_grpc.proto in the same package as the one defined in KServe's grpc_predict_v2.proto?
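
To make the mismatch concrete, here is a minimal sketch of the KServe side. Only the package and service declaration matter for this question; the placeholder messages stand in for the full V2 request/response schema.

```proto
// Sketch of KServe's grpc_predict_v2.proto, reduced to the parts relevant
// to package naming. The real file defines the complete V2 schema.
syntax = "proto3";

// KServe declares the package as "inference", so the fully qualified
// service name is inference.GRPCInferenceService and the gRPC method
// path on the wire is /inference.GRPCInferenceService/ModelInfer.
package inference;

service GRPCInferenceService {
  rpc ModelInfer(ModelInferRequest) returns (ModelInferResponse) {}
}

// Placeholder messages so this sketch compiles on its own.
message ModelInferRequest {}
message ModelInferResponse {}
```

Because open_inference_grpc.proto declares a different package (something under org.pytorch.serve, if I read it correctly), the fully qualified service name, and therefore the method path that gRPC clients dial, is different. As a result, a client generated from the KServe proto cannot call TorchServe's open inference endpoint without regenerating stubs from the TorchServe proto, even though the message shapes are compatible.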
Thank you.