Commit f8a4a90

Merge pull request #4 from emsec/dev
- Begin to adjust stats reader to new JSON Level List
- Fix CI/CD Pipeline not pulling our level repository (missing submodule init), update build steps
- Adjust feedback dialogue to better communicate the maximum score
- Add Bearer authentication to /metrics endpoint
- Update the Grafana examples to show an instance selector and display the client error metrics
2 parents c0a1a93 + 1993afb commit f8a4a90

19 files changed: +1,599 additions, −1,239 deletions

.dockerignore

Lines changed: 6 additions & 0 deletions
```diff
@@ -21,6 +21,7 @@ conf/
 log_merging_config.json
 dockerComposeMPI.yml
 *.csv
+reversim-conf
 
 # Debugpy logs, WinMerge Backups etc.
 *.log
@@ -29,6 +30,11 @@ dockerComposeMPI.yml
 # Ignore statistics folder
 statistics/
 
+# Hide deployment storage
+tmp/
+secrets/
+
+
 # Maybe ignore unnecessary code
 # app/statistics
 # app/tests
```

.github/workflows/deploy-image.yml

Lines changed: 32 additions & 14 deletions
```diff
@@ -1,18 +1,21 @@
 #
 name: Create and publish a Docker image
 
-# Configures this workflow to run every time a change is pushed to the branch called `main`.
+# Configures this workflow to run every time a change is pushed to the branch called
+# `main` or `dev`.
 on:
   push:
     branches: ['main', 'dev']
     tags: ['*']
 
-# Defines two custom environment variables for the workflow. These are used for the Container registry domain, and a name for the Docker image that this workflow builds.
+# Defines two custom environment variables for the workflow. These are used for the
+# Container registry domain, and a name for the Docker image that this workflow builds.
 env:
   REGISTRY: ghcr.io
   IMAGE_NAME: ${{ github.repository }}
 
-# There is a single job in this workflow. It's configured to run on the latest available version of Ubuntu.
+# There is a single job in this workflow. It's configured to run on the latest available
+# version of Ubuntu.
 jobs:
   build-and-push-image:
     runs-on: ubuntu-latest
@@ -25,21 +28,28 @@ jobs:
 
     steps:
       - name: Checkout repository
-        uses: actions/checkout@v4
+        uses: actions/checkout@v6
+        with:
+          submodules: true
 
-      # Uses the `docker/login-action` action to log in to the Container registry registry using the account and password that will publish the packages. Once published, the packages are scoped to the account defined here.
+      # Uses the `docker/login-action` action to log in to the Container registry registry
+      # using the account and password that will publish the packages. Once published, the
+      # packages are scoped to the account defined here.
       - name: Log in to the Container registry
-        uses: docker/login-action@65b78e6e13532edd9afa3aa52ac7964289d1a9c1
+        uses: docker/login-action@v3
        with:
          registry: ${{ env.REGISTRY }}
          username: ${{ github.actor }}
          password: ${{ secrets.GITHUB_TOKEN }}
 
-      # This step uses [docker/metadata-action](https://github.com/docker/metadata-action#about) to extract tags and labels that will be applied to the specified image. The `id` "meta" allows the output of this step to be referenced in a subsequent step. The `images` value provides the base name for the tags and labels.
+      # This step uses [docker/metadata-action](https://github.com/docker/metadata-action#about)
+      # to extract tags and labels that will be applied to the specified image.
+      # The `id` "meta" allows the output of this step to be referenced in a subsequent
+      # step. The `images` value provides the base name for the tags and labels.
       # It will automatically create the latest Docker tag, if a git tag is found: https://github.com/docker/metadata-action?tab=readme-ov-file#latest-tag
       - name: Extract metadata (tags, labels) for Docker
         id: meta
-        uses: docker/metadata-action@9ec57ed1fcdbf14dcef7dfbe97b2010124a938b7
+        uses: docker/metadata-action@v5
        with:
          images: ${{ env.REGISTRY }}/${{ env.IMAGE_NAME }}
 
@@ -51,12 +61,17 @@ jobs:
          calculatedSha=$(git rev-parse --short ${{ github.sha }})
          echo "COMMIT_SHORT_SHA=$calculatedSha" >> $GITHUB_ENV
 
-      # This step uses the `docker/build-push-action` action to build the image, based on your repository's `Dockerfile`. If the build succeeds, it pushes the image to GitHub Packages.
-      # It uses the `context` parameter to define the build's context as the set of files located in the specified path. For more information, see [Usage](https://github.com/docker/build-push-action#usage) in the README of the `docker/build-push-action` repository.
-      # It uses the `tags` and `labels` parameters to tag and label the image with the output from the "meta" step.
+      # This step uses the `docker/build-push-action` action to build the image, based on
+      # your repository's `Dockerfile`. If the build succeeds, it pushes the image to
+      # GitHub Packages.
+      # It uses the `context` parameter to define the build's context as the set of files
+      # located in the specified path. For more information, see [Usage](https://github.com/docker/build-push-action#usage)
+      # in the README of the `docker/build-push-action` repository.
+      # It uses the `tags` and `labels` parameters to tag and label the image with the
+      # output from the "meta" step.
       - name: Build and push Docker image
         id: push
-        uses: docker/build-push-action@f2a1d5e99d037542a71f64918e516c093c6f3fc4
+        uses: docker/build-push-action@v6
        with:
          context: .
          push: true
@@ -66,9 +81,12 @@ jobs:
            GAME_GIT_HASH=${{ github.sha }}
            GAME_GIT_HASH_SHORT=${{ env.COMMIT_SHORT_SHA }}
 
-      # This step generates an artifact attestation for the image, which is an unforgeable statement about where and how it was built. It increases supply chain security for people who consume the image. For more information, see [Using artifact attestations to establish provenance for builds](/actions/security-guides/using-artifact-attestations-to-establish-provenance-for-builds).
+      # This step generates an artifact attestation for the image, which is an unforgeable
+      # statement about where and how it was built. It increases supply chain security for
+      # people who consume the image. For more information, see [Using artifact attestations
+      # to establish provenance for builds](/actions/security-guides/using-artifact-attestations-to-establish-provenance-for-builds).
       - name: Generate artifact attestation
-        uses: actions/attest-build-provenance@v2
+        uses: actions/attest-build-provenance@v3
        with:
          subject-name: ${{ env.REGISTRY }}/${{ env.IMAGE_NAME}}
          subject-digest: ${{ steps.push.outputs.digest }}
```

.gitignore

Lines changed: 4 additions & 1 deletion
```diff
@@ -21,7 +21,6 @@ __pycache__/
 env/
 venv/
 .venv/
-tmp/
 
 # --- Debugpy logs, WinMerge Backups etc.
 *.log
@@ -44,3 +43,7 @@ reversim-conf
 
 # --- Generated level thumbnails
 doc/levels
+
+# --- Don't push instance relevant configuration
+tmp/
+secrets/
```

.vscode/settings.json

Lines changed: 1 addition & 0 deletions
```diff
@@ -143,6 +143,7 @@
   "hreserver",
   "hrestudy",
   "htmlsafe",
+  "httpauth",
   "iframe",
   "imgdata",
   "imgstring",
```

Dockerfile

Lines changed: 2 additions & 1 deletion
```diff
@@ -18,7 +18,7 @@ ARG PROMETHEUS_MULTIPROC_DIR="/tmp/prometheus_multiproc"
 MAINTAINER Max Planck Institute for Security and Privacy
 LABEL org.opencontainers.image.authors="Max Planck Institute for Security and Privacy"
 # NOTE Also change the version in config.py
-LABEL org.opencontainers.image.version="2.1.0"
+LABEL org.opencontainers.image.version="2.1.1"
 LABEL org.opencontainers.image.licenses="AGPL-3.0-only"
 LABEL org.opencontainers.image.description="Ready to deploy Docker container to use ReverSim for research. ReverSim is an open-source environment for the browser, originally developed at the Max Planck Institute for Security and Privacy (MPI-SP) to study human aspects in hardware reverse engineering."
 LABEL org.opencontainers.image.source="https://github.com/emsec/ReverSim"
@@ -62,6 +62,7 @@ ENV PROMETHEUS_MULTIPROC_DIR=${PROMETHEUS_MULTIPROC_DIR}
 # Create empty statistics folders
 WORKDIR /usr/var/reversim-instance/statistics/LogFiles
 WORKDIR /usr/var/reversim-instance/statistics/canvasPics
+WORKDIR /usr/var/reversim-instance/secrets
 WORKDIR /usr/src/hregame
 
 # Specify mount points for the statistics folder, levels, researchInfo & disclaimer
```

app/authentication.py

Lines changed: 45 additions & 0 deletions
```diff
@@ -0,0 +1,45 @@
+import logging
+import os
+import secrets
+
+from flask_httpauth import HTTPTokenAuth # type: ignore
+
+from app.config import BEARER_TOKEN_BYTES
+from app.model.ApiKey import ApiKey
+from app.storage.database import db
+from app.utilsGame import safe_join
+
+auth = HTTPTokenAuth(scheme='Bearer')
+
+USER_METRICS = 'api_metrics'
+
+@auth.verify_token # type: ignore
+def verifyToken(token: str) -> ApiKey|None:
+    """Check if this token exists. If yes return the user object, otherwise return `None`
+
+    https://flask-httpauth.readthedocs.io/en/latest/#flask_httpauth.HTTPTokenAuth.verify_token
+    """
+
+    return db.session.query(ApiKey).filter_by(token=token).first()
+
+
+def populate_data(instance_path: str):
+    if ApiKey.query.count() < 1:
+        apiKey = ApiKey(secrets.token_urlsafe(BEARER_TOKEN_BYTES), USER_METRICS)
+        db.session.add(apiKey)
+        db.session.commit()
+
+    defaultToken = db.session.query(ApiKey).where(ApiKey.user == USER_METRICS).first()
+    if defaultToken is not None:
+
+        # Try to write the bearer secret to a file so other containers can use it
+        try:
+            folder = safe_join(instance_path, 'secrets')
+            os.makedirs(folder, exist_ok=True)
+            with open(safe_join(folder, 'bearer_api.txt'), encoding='UTF-8', mode='wt') as f:
+                f.write(defaultToken.token)
+
+        except Exception as e:
+            # When the file can't be created print the bearer to stdout
+            logging.error('Could not write bearer token to file: ' + str(e))
+            logging.info('Bearer token for /metrics endpoint: ' + defaultToken.token)
```
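The new `populate_data()` helper mints the token with `secrets.token_urlsafe`. A minimal, self-contained sketch of that generate-and-verify pattern (the helper names `make_token` and `check_token` are illustrative, not from the commit; only `BEARER_TOKEN_BYTES = 32` matches the new constant in `app/config.py`, and the real code looks tokens up in the database rather than comparing strings):

```python
import secrets

BEARER_TOKEN_BYTES = 32  # same value the commit adds to app/config.py

def make_token() -> str:
    # token_urlsafe(n) draws n random bytes and base64url-encodes them,
    # so 32 bytes yield a 43-character ASCII token
    return secrets.token_urlsafe(BEARER_TOKEN_BYTES)

def check_token(presented: str, stored: str) -> bool:
    # compare_digest() runs in constant time, so the comparison does not
    # leak the token through timing differences
    return secrets.compare_digest(presented, stored)

token = make_token()
print(len(token), check_token(token, token), check_token("wrong", token))
# -> 43 True False
```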

app/config.py

Lines changed: 4 additions & 1 deletion
```diff
@@ -16,14 +16,17 @@
 
 # CONFIG Current Log File Version.
 # NOTE Also change this in the Dockerfile
-LOGFILE_VERSION = "2.1.0" # Major.Milestone.Subversion
+LOGFILE_VERSION = "2.1.1" # Major.Milestone.Subversion
 
 PSEUDONYM_LENGTH = 32
 LEVEL_ENCODING = 'UTF-8' # was Windows-1252
 TIME_DRIFT_THRESHOLD = 200 # ms
 STALE_LOGFILE_TIME = 48 * 60 * 60 # close logfiles after 48h
 MAX_ERROR_LOGS_PER_PLAYER = 25
 
+# The bearer token for the /metrics endpoint
+BEARER_TOKEN_BYTES = 32
+
 # Number of seconds, after which the player is considered disconnected. A "Back Online"
 # message will be printed to the log, if the player connects afterwards. Also used for the
 # Prometheus Online Player Count metric
```

app/model/ApiKey.py

Lines changed: 16 additions & 0 deletions
```diff
@@ -0,0 +1,16 @@
+from datetime import datetime, timezone
+from sqlalchemy import DateTime, String
+from sqlalchemy.orm import Mapped, mapped_column
+
+from app.storage.database import db
+
+
+class ApiKey(db.Model):
+    token: Mapped[str] = mapped_column(primary_key=True)
+    user: Mapped[str] = mapped_column(String(64))
+    created: Mapped[datetime] = mapped_column(DateTime)
+
+    def __init__(self, token: str, user: str) -> None:
+        self.token = token
+        self.user = user
+        self.created = datetime.now(timezone.utc)
```

app/model/LevelLoader/JsonLevelList.py

Lines changed: 5 additions & 2 deletions
```diff
@@ -54,9 +54,11 @@ def getPossibleLevels(self) -> list[Level]:
         current_list = MappingProxyType(self.levelList[list_name])
 
         for entry in current_list['levels']:
+            # If this is a list, add all levels that are contained in that list
             if isinstance(entry, list):
                 for subEntry in cast(list[dict[str, str]], entry):
-                    levels.append(Level(type=subEntry['type'], fileName=subEntry['name']))
+                    levels.append(Level(type=subEntry['type'], fileName=subEntry['name']))
             else:
+                # Else add the single level
                 levels.append(Level(type=entry['type'], fileName=entry['name']))
 
@@ -152,7 +154,8 @@ def fromFile(
         try:
             conf = load_config(fileName=fileName, instanceFolder=instanceFolder)
 
-            # TODO Run checks
+            # TODO Run checks to catch any errors directly on launch and not later when
+            # someone tries to load the first level
 
             logging.info(f'Successfully loaded {len(conf)} level lists.')
             return conf
```
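The adjusted loop in `getPossibleLevels()` now accepts both a single level entry and a nested list of entries. A standalone sketch of that flattening logic (simplified: the `flatten_entries` name is illustrative and the `Level(...)` construction from the diff is omitted):

```python
def flatten_entries(entries: list) -> list:
    """Flatten a mixed list of single entries and sub-lists into one flat list."""
    levels = []
    for entry in entries:
        if isinstance(entry, list):
            # A group: add every level contained in the sub-list
            levels.extend(entry)
        else:
            # A single level entry
            levels.append(entry)
    return levels

sample = [{"name": "a"}, [{"name": "b"}, {"name": "c"}]]
print(flatten_entries(sample))
# -> [{'name': 'a'}, {'name': 'b'}, {'name': 'c'}]
```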

app/prometheusMetrics.py

Lines changed: 9 additions & 5 deletions
```diff
@@ -1,6 +1,7 @@
 import logging
 from threading import Thread
 import time
+from typing import Any
 from flask import Flask
 
 # Prometheus Metrics
@@ -14,27 +15,29 @@
 
 class ServerMetrics:
     @staticmethod
-    def __prometheusFactory():
+    def __prometheusFactory(auth_provider: Any):
        EXCLUDED_PATHS = ["/?res\\/.*", "/?src\\/.*", "/?doc\\/.*"]
 
        # Try to use the uWSGI exporter. This will fail, if uWSGI is not installed
        try:
            metrics = UWsgiPrometheusMetrics.for_app_factory( # type: ignore
-                excluded_paths=EXCLUDED_PATHS
+                excluded_paths=EXCLUDED_PATHS,
+                metrics_decorator=auth_provider
            )
 
        # Use the regular Prometheus exporter
        except Exception as e:
            logging.error(e)
 
            metrics = PrometheusMetrics.for_app_factory( # type: ignore
-                excluded_paths=EXCLUDED_PATHS
+                excluded_paths=EXCLUDED_PATHS,
+                metrics_decorator=auth_provider
            )
 
        logging.info(f'Using {type(metrics).__name__} as the Prometheus exporter')
        return metrics
 
-    metrics = __prometheusFactory()
+    metrics: PrometheusMetrics|UWsgiPrometheusMetrics|None = None
 
    # ReverSim Prometheus Metrics
    #met_openLogs = Gauge("reversim_logfile_count", "The number of open logfiles") # type: ignore
@@ -47,8 +50,9 @@ def __prometheusFactory():
    met_clientErrors: Gauge|None = None
 
    @classmethod
-    def createPrometheus(cls, app: Flask):
+    def createPrometheus(cls, app: Flask, auth_provider: Any):
        """Init Prometheus"""
+        cls.metrics = cls.__prometheusFactory(auth_provider)
        cls.metrics.init_app(app) # type: ignore
        cls.metrics.info('app_info', 'Application info', version=gameConfig.LOGFILE_VERSION) # type: ignore
 
```
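With `auth_provider` wired into the exporter's `metrics_decorator`, scrapers must now present the bearer token. A hypothetical client-side sketch (the `metrics_request` helper, URL, and port are assumptions, not part of the commit) showing how another container could use the token that `populate_data()` writes to `secrets/bearer_api.txt`:

```python
from urllib.request import Request

def metrics_request(token: str, url: str = "http://localhost:8000/metrics") -> Request:
    # Attach the token as an RFC 6750 Bearer credential in the
    # Authorization header; the /metrics endpoint rejects requests without it
    return Request(url, headers={"Authorization": f"Bearer {token}"})

# In a real deployment the token would be read from secrets/bearer_api.txt
req = metrics_request("example-token")
print(req.get_header("Authorization"))  # -> Bearer example-token
```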
