When I load an ultralytics YOLO model outside the framework, read the image from the client in a request-handling function, and run inference there, the inference is about 10x slower than normal.
I don't have this problem when I use FastAPI.
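To narrow down where the time goes, it may help to time the base64 decode and the model call separately inside the handler. Below is a minimal, stdlib-only timing helper; the name `timed` and the usage shown in the comments are illustrative, not part of the original code.

```python
import time
from contextlib import contextmanager

@contextmanager
def timed(label, results):
    # Record the wall-clock time of the enclosed block under `label`
    start = time.perf_counter()
    yield
    results[label] = time.perf_counter() - start

# Hypothetical usage inside the Flask handler (b64_cv/model as in the report):
#     timings = {}
#     with timed("decode", timings):
#         img = b64_cv(frame)
#     with timed("inference", timings):
#         results = model(source=img)
#     print(timings)
```

Comparing the same two timings under Flask and FastAPI would show whether the slowdown is in the model call itself or in the surrounding request handling.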
Server (Flask):

```python
from flask import Flask, request
from ultralytics import YOLO
import cv2
import numpy
import base64

model = YOLO("/workspace/yolov8s.pt")

def b64_cv(frame_b64):
    # Decode a base64 JPEG string back into an OpenCV BGR image
    return cv2.imdecode(numpy.frombuffer(base64.b64decode(frame_b64), numpy.uint8), cv2.IMREAD_COLOR)

app = Flask(__name__)

@app.route('/frame', methods=['POST'])
def read_item():
    data = request.json
    frame = data.get('frame', None)
    results = model(source=b64_cv(frame))
    return {}

if __name__ == '__main__':
    app.run(host='0.0.0.0', port=8000)
```

Client:

```python
import requests
import cv2
import base64

def cv_b64(frame_cv):
    # Encode an OpenCV frame as JPEG, then base64, returned as str
    return base64.b64encode(cv2.imencode(".jpg", frame_cv)[1]).decode("ascii")

while True:
    stream = cv2.VideoCapture("/workspace/road_30_1920x1080.mp4")
    while stream.isOpened():
        ret, frame = stream.read()
        if not ret:
            break
        data = {"frame": cv_b64(frame)}
        response = requests.post(url="http://127.0.0.1:8000/frame", json=data)
```

Environment:
- Python version: 3.10.12
- Flask version: 3.0.0
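One overhead worth ruling out, independent of the framework, is the base64 transport itself: it inflates each frame by roughly a third and adds an encode/decode pass per request. A small stdlib-only sketch of the inflation (the ~300 KB buffer is a stand-in for a JPEG frame, not real data):

```python
import base64

raw = b"\xff\xd8" + bytes(300_000)   # stand-in for a ~300 KB JPEG payload
encoded = base64.b64encode(raw)
print(len(encoded) / len(raw))        # ~1.33x payload inflation
```

An alternative, if this turns out to matter, is to POST the raw JPEG bytes directly (`requests.post(url, data=jpeg_bytes, headers={"Content-Type": "application/octet-stream"})`) and read them server-side with Flask's `request.get_data()` before `cv2.imdecode`.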