Conversation
CoreyEWood
left a comment
Looks good to me, as long as it works as intended! Would definitely be good to keep it running for as long as possible to observe if anything pops up - I can't think of anything else that would create issues, but it's always possible to be missing something.
app/api/routes/image_queries.py (Outdated)

```diff
  background_tasks.add_task(refresh_detector_metadata_if_needed, detector_id, gl)
- confidence_threshold = confidence_threshold or detector_metadata.confidence_threshold
+ confidence_threshold = 0.9  # Set an arbitrary value since we cannot get one from the cloud.
```
Do you want to support using the passed in confidence_threshold if it exists? Or is it preferred to always use this hardcoded one?
The RPC Server never passes in a confidence threshold; it relies on the detector's own confidence threshold instead.
The `confidence_threshold` here doesn't do anything. The only reason I set it is that it is a required field for `create_iq`.
I added a comment to make this more clear.
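To make the point concrete, here is a small toy sketch of the situation (the names `ImageQuery` and `create_iq`'s signature are illustrative stand-ins, not the actual Edge Endpoint code): the field is required by the schema, so an arbitrary value is supplied even though the edge path never reads it.

```python
from dataclasses import dataclass

# Arbitrary placeholder: the edge-only path never consults this value,
# but the image-query schema requires one.
ARBITRARY_CONFIDENCE = 0.9


@dataclass
class ImageQuery:
    detector_id: str
    confidence_threshold: float  # required by the schema, unused on the edge


def create_iq(detector_id: str, confidence_threshold: float) -> ImageQuery:
    # Stand-in for the real create_iq: confidence_threshold must be
    # supplied even though edge inference relies on the detector's own
    # threshold, which we cannot fetch without the cloud.
    return ImageQuery(detector_id=detector_id, confidence_threshold=confidence_threshold)


iq = create_iq("det_abc123", ARBITRARY_CONFIDENCE)
```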
app/api/routes/image_queries.py (Outdated)

```python
# for holding edge results if and when available
results = None
```
Not important either way but this isn't needed anymore.
There was a problem hiding this comment.
Good catch. I removed it.
app/api/routes/image_queries.py (Outdated)

```python
    Submit an image query for a given detector.

    This function attempts to run inference locally on the edge, if possible,
```
What this docstring says isn't totally true anymore, not sure if that's important or not though.
Good point. I have updated it.
```diff
@@ -64,6 +60,8 @@ async def post_image_query(  # noqa: PLR0913, PLR0915, PLR0912
     app_state: AppState = Depends(get_app_state),
```
(This comment is meant to be on the line above this but it won't let me comment there)
If you want to simplify this further, I think you could just remove `gl` from the parameters here, since you don't use it (and then you wouldn't have to change the `get_groundlight_sdk_instance` function either). But if this way works then it's fine too.
I have removed the `gl` arg. I was hesitant to do that originally, because I thought some other caller that I am unaware of might be using it, but now that I have read up on `fastapi.Depends`, I realize that it is not something callers pass in themselves (the framework resolves it), so it is safe to remove.
I removed it and tested, and it works fine.
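For anyone else reading along, here is a minimal toy model of the Depends-style injection being discussed (this is not FastAPI itself; `Depends`, `resolve_and_call`, and the handler names below are illustrative). The key point is that a parameter whose default is a `Depends` marker is filled in by the framework, never by the caller, which is why an unused injected parameter like `gl` can be deleted safely.

```python
import inspect


class Depends:
    """Toy marker mimicking fastapi.Depends: wraps a dependency callable."""

    def __init__(self, dependency):
        self.dependency = dependency


def resolve_and_call(fn, **caller_kwargs):
    # Fill any parameter whose default is a Depends marker by calling its
    # dependency; callers never supply these arguments themselves.
    sig = inspect.signature(fn)
    kwargs = dict(caller_kwargs)
    for name, param in sig.parameters.items():
        if name not in kwargs and isinstance(param.default, Depends):
            kwargs[name] = param.default.dependency()
    return fn(**kwargs)


def get_app_state():
    # Hypothetical dependency provider standing in for the real get_app_state.
    return {"detectors": ["det_abc123"]}


def post_image_query(detector_id: str, app_state=Depends(get_app_state)):
    # app_state is injected by the framework; the caller only passes detector_id.
    return {"detector_id": detector_id, "known": detector_id in app_state["detectors"]}


result = resolve_and_call(post_image_query, detector_id="det_abc123")
```

Because callers only ever supply `detector_id`, removing an injected-but-unused parameter changes nothing for them.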
…dentials have expired
Changes:
* now works with Balena
* now totally independent of the cloud service (models are stored in s3 and fetched from there)

Co-authored-by: Auto-format Bot <autoformatbot@groundlight.ai>
A modified version of the Edge Endpoint that only supports edge inference. We won't actually merge these changes, but rather keep them in their own branch and deploy from here as needed.
Deploy it normally with an API token and some configured detectors.
After initial deployment, you can delete the API token and reboot and it will continue to serve edge inferences for the configured detectors.
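A rough sketch of the behavior described above, under loudly stated assumptions: all names here (`fetch_from_cloud`, `get_detector_metadata`, the cache) are hypothetical, and a simple dict stands in for the on-disk state that actually survives a reboot in the real deployment. The idea is that detector metadata is fetched while an API token is available, and later requests are served from the cached copy even after the token is deleted.

```python
class CloudUnavailable(Exception):
    pass


# Stand-in for persisted state; in the real deployment, models and metadata
# are stored on disk (fetched from s3), so they survive a reboot.
_metadata_cache = {}


def fetch_from_cloud(detector_id, api_token):
    # Hypothetical cloud call: only possible while credentials exist.
    if api_token is None:
        raise CloudUnavailable("no API token configured")
    return {"id": detector_id, "mode": "BINARY"}


def get_detector_metadata(detector_id, api_token=None):
    # Prefer the cached copy; only hit the cloud when we have credentials.
    if detector_id in _metadata_cache:
        return _metadata_cache[detector_id]
    meta = fetch_from_cloud(detector_id, api_token)
    _metadata_cache[detector_id] = meta
    return meta


# Initial deployment: token present, metadata gets cached.
get_detector_metadata("det_abc123", api_token="api_token_xyz")

# After the token is deleted, the cached entry still serves the detector.
meta = get_detector_metadata("det_abc123", api_token=None)
```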
Related PR: https://github.com/positronix-ai/rpc-server/pull/51