30 changes: 15 additions & 15 deletions surveillance-system/services.pkl
@@ -5,45 +5,45 @@ bindings {
     name = "cameraCapture"
     `local` = true
     scheme = "http"
-    host = "services"
-    port = 8000
-    endPoint = "/bell/on"
+    host = "localhost"
+    port = 8001
+    endPoint = "/capture"
     method = "POST"
   }
   new HttpServiceImplementationBinding {
     name = "detectPersons"
     `local` = false
     scheme = "http"
-    host = "services"
-    port = 8000
-    endPoint = "/bell/off"
+    host = "localhost"
+    port = 8002
+    endPoint = "/detect"
     method = "POST"
   }
   new HttpServiceImplementationBinding {
     name = "alarmOn"
     `local` = false
     scheme = "http"
-    host = "services"
-    port = 8000
-    endPoint = "/gate/up"
+    host = "localhost"
+    port = 8001
+    endPoint = "/alarm/on"
     method = "POST"
   }
   new HttpServiceImplementationBinding {
     name = "alarmOff"
     `local` = false
     scheme = "http"
-    host = "services"
-    port = 8000
-    endPoint = "/gate/down"
+    host = "localhost"
+    port = 8001
+    endPoint = "/alarm/off"
     method = "POST"
   }
   new HttpServiceImplementationBinding {
     name = "analyze"
     `local` = false
     scheme = "http"
-    host = "services"
-    port = 8000
-    endPoint = "/light/on"
+    host = "localhost"
+    port = 8003
+    endPoint = "/analyze"
     method = "POST"
   }
}
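
In effect, the rebinding points each service at a locally published port instead of the shared `services:8000` endpoint. As a minimal sketch (not part of the diff), the URLs the updated bindings resolve to, assuming each URL is formed as `scheme://host:port` followed by `endPoint`:

```python
# Hypothetical helper illustrating how the updated bindings resolve to service URLs.
# The scheme/host/port/endPoint values are taken from the services.pkl diff above.
bindings = {
    "cameraCapture": ("http", "localhost", 8001, "/capture"),
    "detectPersons": ("http", "localhost", 8002, "/detect"),
    "alarmOn":       ("http", "localhost", 8001, "/alarm/on"),
    "alarmOff":      ("http", "localhost", 8001, "/alarm/off"),
    "analyze":       ("http", "localhost", 8003, "/analyze"),
}

def resolve(name: str) -> str:
    scheme, host, port, end_point = bindings[name]
    return f"{scheme}://{host}:{port}{end_point}"

print(resolve("analyze"))  # http://localhost:8003/analyze
```

These host ports match the port mappings published by `compose.yaml` further down.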
28 changes: 28 additions & 0 deletions surveillance-system/services/README.md
@@ -0,0 +1,28 @@
# Services

All services required by the video surveillance use case are found in the [services](services) subfolder.
- `iotservices`<br>
Camera service which simulates image capturing. Adds random noise to each image to ensure uniqueness.
- `edgeservices`<br>
Person detection service which checks whether the provided image contains any persons.
- `cloudservices`<br>
Service which simulates face analysis. Marks detected persons as a threat (20% chance) or as no threat otherwise.

Used by both the Cirrina and SonataFlow versions of the video surveillance use case.

## Usage with Docker Compose

To run all services locally on a single device, you can use Docker Compose:

```
docker compose up
```

By default, the services use protobuf to serialize and deserialize request/response data. This can be disabled by
setting the `PROTO` environment variable before starting:

```
PROTO=false docker compose up
```

Individual Dockerfiles for separate manual builds can be found in the respective service folders.
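
As an illustration (not part of the README in this diff), a minimal smoke test against the composed stack, assuming it was started with `PROTO=false` so the analysis service accepts JSON; the host port and payload shape follow `compose.yaml` and `cloudservices/services.py` below:

```python
# Minimal smoke test for the face-analysis service (assumes `PROTO=false docker compose up`).
import base64
import json
import urllib.request

import cv2
import numpy as np

# Build a small dummy frame and JPEG-encode it, since /analyze expects a decodable image.
frame = np.zeros((64, 64, 3), dtype=np.uint8)
ok, encoded = cv2.imencode(".jpg", frame)
assert ok

payload = json.dumps(
    {"image": base64.b64encode(encoded.tobytes()).decode("utf-8")}
).encode("utf-8")

request = urllib.request.Request(
    "http://localhost:8003/analyze",  # cloudservices, published on host port 8003
    data=payload,
    headers={"Content-Type": "application/json"},
    method="POST",
)
with urllib.request.urlopen(request) as response:
    print(json.load(response))  # e.g. {"hasThreat": false}
```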
31 changes: 31 additions & 0 deletions surveillance-system/services/cloudservices/ContextVariable_pb2.py

Some generated files are not rendered by default.

13 changes: 13 additions & 0 deletions surveillance-system/services/cloudservices/Dockerfile
@@ -0,0 +1,13 @@
FROM python:3.11

COPY requirements.txt /tmp/requirements.txt
RUN pip install --no-cache-dir -r /tmp/requirements.txt

WORKDIR /app
COPY ContextVariable_pb2.py .
COPY Event_pb2.py .
COPY services.py .

ENV PROTO="true"

CMD ["uvicorn", "services:app", "--host", "0.0.0.0", "--workers", "33", "--port", "8000"]
30 changes: 30 additions & 0 deletions surveillance-system/services/cloudservices/Event_pb2.py

Some generated files are not rendered by default.

6 changes: 6 additions & 0 deletions surveillance-system/services/cloudservices/requirements.txt
@@ -0,0 +1,6 @@
numpy<2
fastapi==0.104.1
opencv-python-headless==4.8.1.78
opencv-contrib-python-headless==4.8.1.78
uvicorn
protobuf
109 changes: 109 additions & 0 deletions surveillance-system/services/cloudservices/services.py
@@ -0,0 +1,109 @@
import ContextVariable_pb2

import cv2
import uvicorn

import numpy as np

from fastapi import FastAPI, Request, HTTPException, Response

import json
import os
import random
import base64
import time
import hashlib

local_random = random.Random(os.getpid())

app = FastAPI()

# Decide whether protobuf should be used (optional environment variable)
proto = "PROTO" not in os.environ or os.environ["PROTO"].lower() in ["true", "t", "1"]


# For logging detection times
def log_hash(data: bytes):
    # Compute a consistent hash
    sha256 = hashlib.sha256()
    sha256.update(data)
    hash = sha256.hexdigest()

    # Acquire the current timestamp in milliseconds
    timestamp = time.time_ns() / 1_000_000.0
    log_entry = f"{hash},{timestamp}\n"

    # Append to log file
    with open("/tmp/log_analysis.csv", "a") as log_file:
        log_file.write(log_entry)


@app.post("/analyze")
async def detect(request: Request):
time_start = time.time_ns() / 1_000_000.0

if proto:
# Read the raw request body
body = await request.body()

# Parse the protobuf message
context_variables = ContextVariable_pb2.ContextVariables()
context_variables.ParseFromString(body)

# Extract the image from the protobuf message
image_bytes = None
for context_variable in context_variables.data:
if context_variable.name == "image":
image_bytes = context_variable.value.bytes
break
else:
# Parse the request variables
context_variables = await request.json()

# Get the image from the request json
if "image" in context_variables:
image_bytes = base64.b64decode(context_variables["image"].encode("utf-8"))

if image_bytes is None:
raise HTTPException(status_code=400, detail="No image provided in the request")

# Read the image
np_arr = np.frombuffer(image_bytes, np.uint8)
image = cv2.imdecode(np_arr, cv2.IMREAD_COLOR)

if image is None:
raise HTTPException(status_code=400, detail="Failed to decode the image")

is_threat = local_random.random() < (1.0 / 5.0)

# Prepare output data
if proto:
# Create response protobuf message
response_context_variables = ContextVariable_pb2.ContextVariables()
threats_context_variable = ContextVariable_pb2.ContextVariable(
name="hasThreat",
value=ContextVariable_pb2.Value(bool=is_threat),
)
response_context_variables.data.append(threats_context_variable)

# Serialize the response to protobuf format
response = response_context_variables.SerializeToString()
media_type = "application/x-protobuf"
else:
response = json.dumps({"hasThreat": is_threat})
media_type = "application/json"

time_end = time.time_ns() / 1_000_000.0

with open("/tmp/time_analysis.csv", "a") as log_file:
log_file.write(f"{time_end - time_start}\n")

# Log data
log_hash(image_bytes)

# Return the protobuf response
return Response(content=response, media_type=media_type)


if __name__ == "__main__":
    uvicorn.run(app, host="0.0.0.0", port=8000)
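
For the protobuf path, a hypothetical client sketch (not part of this PR) showing how a request/response pair could be built with the generated `ContextVariable_pb2` module; the message layout and the `Value` field names `bytes` and `bool` are taken from the handler above:

```python
# Sketch of a protobuf client for /analyze (assumes ContextVariable_pb2.py is importable
# and the service runs with PROTO=true).
import urllib.request

import ContextVariable_pb2

with open("frame.jpg", "rb") as f:  # hypothetical input image
    image_bytes = f.read()

# Wrap the image in a ContextVariables message, mirroring what the service parses.
request_message = ContextVariable_pb2.ContextVariables()
request_message.data.append(
    ContextVariable_pb2.ContextVariable(
        name="image",
        value=ContextVariable_pb2.Value(bytes=image_bytes),
    )
)

http_request = urllib.request.Request(
    "http://localhost:8003/analyze",
    data=request_message.SerializeToString(),
    headers={"Content-Type": "application/x-protobuf"},
    method="POST",
)
with urllib.request.urlopen(http_request) as http_response:
    response_message = ContextVariable_pb2.ContextVariables()
    response_message.ParseFromString(http_response.read())

# The service returns a single "hasThreat" boolean context variable.
for variable in response_message.data:
    if variable.name == "hasThreat":
        print("threat detected:", variable.value.bool)
```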
25 changes: 25 additions & 0 deletions surveillance-system/services/compose.yaml
@@ -0,0 +1,25 @@
services:
  iotservices:
    build:
      context: ./iotservices
      dockerfile: Dockerfile
    ports:
      - "8001:8000"
    environment:
      - "PROTO=${PROTO:-true}"
  edgeservices:
    build:
      context: ./edgeservices
      dockerfile: Dockerfile
    ports:
      - "8002:8000"
    environment:
      - "PROTO=${PROTO:-true}"
  cloudservices:
    build:
      context: ./cloudservices
      dockerfile: Dockerfile
    ports:
      - "8003:8000"
    environment:
      - "PROTO=${PROTO:-true}"
31 changes: 31 additions & 0 deletions surveillance-system/services/edgeservices/ContextVariable_pb2.py

Some generated files are not rendered by default.

13 changes: 13 additions & 0 deletions surveillance-system/services/edgeservices/Dockerfile
@@ -0,0 +1,13 @@
FROM python:3.11

COPY requirements.txt /tmp/requirements.txt
RUN pip install --no-cache-dir -r /tmp/requirements.txt

WORKDIR /app
COPY ContextVariable_pb2.py .
COPY Event_pb2.py .
COPY services.py .

ENV PROTO="true"

CMD ["uvicorn", "services:app", "--host", "0.0.0.0", "--workers", "33", "--port", "8000"]
30 changes: 30 additions & 0 deletions surveillance-system/services/edgeservices/Event_pb2.py

Some generated files are not rendered by default.
