3 changes: 1 addition & 2 deletions .github/release-drafter-config.yml
@@ -22,8 +22,7 @@ categories:
labels:
- documentation

change-template: |
- (#$NUMBER) $TITLE by @$AUTHOR
change-template: "- (#$NUMBER) $TITLE by @$AUTHOR"

no-changes-template: 'No significant changes'

4 changes: 2 additions & 2 deletions .github/workflows/cla.yaml
@@ -4,7 +4,7 @@ on:
issue_comment:
types: [created]
pull_request_target:
types: [opened, closed, synchronize]
types: [opened]

permissions:
contents: read
@@ -21,7 +21,7 @@ jobs:
steps:
- name: "CLA Assistant"
if: (github.event.comment.body == 'recheck' || github.event.comment.body == 'I have read the CLA Document and I hereby sign the CLA') || github.event_name == 'pull_request_target'
uses: contributor-assistant/github-action@v2.6.1
uses: contributor-assistant/github-action@ca4a40a7d1004f18d9960b404b97e5f30a505a08 #v2.6.1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
PERSONAL_ACCESS_TOKEN: ${{ secrets.CLA_ACCESS_TOKEN }}
23 changes: 22 additions & 1 deletion .github/workflows/link-checker.yaml
@@ -13,10 +13,31 @@ jobs:
steps:
- uses: actions/checkout@v4

- name: Restore lychee cache
uses: actions/cache@v4
id: restore-cache
with:
path: .lycheecache
key: cache-lychee-${{ github.sha }}
restore-keys: cache-lychee-

- name: Link Checker
id: lychee
uses: lycheeverse/lychee-action@v2
with:
args: --base . --verbose --no-progress './**/*.md' --accept 100..=103,200..=299,429
args: >-
'./**/*.md'
--verbose
--no-progress
--user-agent 'Mozilla/5.0 (X11; Linux x86_64) Chrome/134.0.0.0'
--retry-wait-time 60
--max-retries 8
--accept 100..=103,200..=299,429
--cookie-jar cookies.json
--exclude-all-private
--max-concurrency 4
--cache
--cache-exclude-status '429, 500..502'
--max-cache-age 1d
format: markdown
fail: true
59 changes: 59 additions & 0 deletions .github/workflows/security-scan.yml
@@ -0,0 +1,59 @@
name: Security Scan
on:
workflow_dispatch:
inputs:
target:
description: "Scan part"
required: true
default: "docker"
type: choice
options:
- docker
- source
image:
description: "Docker image (for 'docker' target). By default ghcr.io/<owner>/<repo>:latest"
required: false
default: ""
only-high-critical:
description: "Scan only HIGH + CRITICAL"
required: false
default: true
type: boolean
trivy-scan:
description: "Run Trivy scan"
required: false
default: true
type: boolean
grype-scan:
description: "Run Grype scan"
required: false
default: true
type: boolean
continue-on-error:
description: "Continue on error"
required: false
default: true
type: boolean
only-fixed:
description: "Show only fixable vulnerabilities"
required: false
default: true
type: boolean

permissions:
contents: read
security-events: write
actions: read
packages: read

jobs:
security-scan:
uses: netcracker/qubership-workflow-hub/.github/workflows/re-security-scan.yml@main
with:
target: ${{ github.event.inputs.target || 'source' }}
image: ${{ github.event.inputs.image || '' }}
only-high-critical: ${{ inputs.only-high-critical }}
trivy-scan: ${{ inputs.trivy-scan }}
grype-scan: ${{ inputs.grype-scan }}
only-fixed: ${{ inputs.only-fixed }}
continue-on-error: ${{ inputs.continue-on-error }}
4 changes: 2 additions & 2 deletions alpine/Dockerfile
@@ -5,7 +5,7 @@ ENV S3_CERT_PATH_INTERNAL=/s3CertInternal

ARG PY_APSW_VER="3.40.1.0"
ARG PIP="25.3"
ARG SETUPTOOLS="78.1.1"
ARG SETUPTOOLS="80.9.0"
ARG TMP_DIR="/tmp"

COPY requirements.txt ${BACKUP_DAEMON_HOME}/
@@ -37,4 +37,4 @@ RUN mkdir -p ${S3_CERT_PATH_INTERNAL} \

VOLUME /backup-storage

CMD ["python3", "/opt/backup/backup-daemon.py"]
CMD ["python3", "/opt/backup/backup-daemon.py"]
12 changes: 6 additions & 6 deletions requirements.txt
@@ -2,8 +2,8 @@ aniso8601==9.0.1
appdirs==1.4.4
apsw==3.40.1.0
attrs==21.4.0
boto3==1.28.62
botocore==1.31.62
boto3==1.34.63
botocore==1.34.63
CacheControl==0.12.10
certifi==2024.7.4
charset-normalizer==2.0.7
@@ -37,12 +37,12 @@ pyrsistent==0.18.1
python-dateutil==2.8.2
pytz==2021.3
PyYAML==6.0.1
requests==2.31.0
requests==2.32.4
retrying==1.3.3
s3transfer==0.7.0
s3transfer==0.10.0
six==1.16.0
toml==0.10.2
tomli==1.2.2
urllib3==2.0.6
urllib3==2.6.3
webencodings==0.5.1
Werkzeug==3.0.6
Werkzeug==3.1.5
28 changes: 17 additions & 11 deletions src/db.py
@@ -13,6 +13,7 @@
# limitations under the License.

import logging
import threading
import apsw


@@ -29,6 +30,8 @@ class DbException(Exception):


class DB:
_lock = threading.Lock()
Copilot AI commented on Jan 20, 2026:

The _lock is a class-level attribute shared across all DB instances. This means all database operations across different database files will compete for the same lock, which could cause unnecessary serialization and performance degradation. Consider using an instance-level lock (self._lock = threading.Lock() in __init__) if each DB instance operates on a separate database file.
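A minimal sketch of the instance-level alternative the comment suggests (stdlib-only; the method body is a placeholder, not the daemon's real code):

```python
import threading


class DB:
    """Sketch: one lock per DB instance instead of a shared class-level lock."""

    def __init__(self, dbfile):
        self.__dbfile = dbfile
        # Instance-level lock: operations on different database files
        # no longer contend for a single process-wide lock.
        self._lock = threading.Lock()

    def execute(self, sql, args=()):
        with self._lock:
            # Placeholder for the real cursor.execute(sql, args).
            return (sql, args)
```

With this shape, two daemons pointed at different database files serialize only against themselves, not each other.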


def __init__(self, dbfile):
"""
Create a connection to sqlite database
@@ -41,7 +44,7 @@ def __init__(self, dbfile):
self.__dbfile = dbfile
try:
log.debug("Database file: %s" % self.__dbfile)
self.__cursor = DB.__create_connection(dbfile).cursor()
self.__conn = DB.__create_connection(dbfile)
except apsw.Error as err:
log.exception("Database error during init: %s" % err)
raise DbException("Database error during init")
@@ -67,24 +70,27 @@ def __create_connection(db_file):

def __create_table(self, query):
try:
self.__cursor.execute(query)
cursor = self.__conn.cursor()
with cursor:
cursor.execute(query)
except apsw.Error as err:
log.exception("Database Error: %s" % err)
return 0

@staticmethod
def __log_and_execute(cursor, sql, args):
log.debug("SQL command: " + sql.replace('?', '%s') % args)
cursor.execute(sql, args)
with DB._lock:
log.debug("SQL command: " + sql.replace('?', '%s') % args)
cursor.execute(sql, args)
Comment on lines +82 to +84
Copilot AI commented on Jan 20, 2026:

The threading lock is being used only around the logging and execute operations in __log_and_execute, but this may not provide adequate thread safety. The lock should protect the entire database operation sequence, not just the execute call. Consider that cursor operations might need protection at a higher level, or use connection-level locks instead of a class-level lock since each DB instance has its own connection.
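The lock-scope point can be illustrated with the stdlib sqlite3 module standing in for apsw; holding the lock across both execute and fetchall keeps the pair atomic (class and method names here are illustrative, not the module's actual API):

```python
import sqlite3
import threading


class DB:
    """Sketch (sqlite3 standing in for apsw): the whole
    execute-and-fetch sequence is one critical section."""

    def __init__(self, dbfile):
        self._conn = sqlite3.connect(dbfile)
        self._lock = threading.Lock()

    def select(self, sql, args=()):
        # Holding the lock across execute *and* fetchall prevents
        # another thread from interleaving between the two calls.
        with self._lock:
            cur = self._conn.cursor()
            try:
                cur.execute(sql, args)
                return cur.fetchall()
            finally:
                cur.close()
```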


def __insert_or_delete(self, query, params, login=False):
try:
if login:
cursor = DB.__create_connection(self.__dbfile).cursor()
else:
cursor = self.__cursor

DB.__log_and_execute(cursor, query, params)
cursor = self.__conn.cursor()
with cursor:
DB.__log_and_execute(cursor, query, params)
Comment on lines 82 to +93
Copilot AI commented on Jan 20, 2026:

When login=True, a new connection is created but never explicitly closed, which could lead to resource leaks. The cursor context manager only closes the cursor, not the underlying connection. Consider storing the connection and ensuring it's properly closed, or restructure to reuse the existing connection.
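One way to avoid the leak, sketched with the stdlib sqlite3 module as a stand-in for apsw: contextlib.closing guarantees the temporary connection itself is closed, not just its cursor (the helper name is hypothetical):

```python
import sqlite3
from contextlib import closing


def select_with_fresh_connection(dbfile, query, params=()):
    # closing() releases the connection when the block exits,
    # even if execute() raises; the inner closing() does the
    # same for the cursor.
    with closing(sqlite3.connect(dbfile)) as conn:
        with closing(conn.cursor()) as cur:
            cur.execute(query, params)
            return cur.fetchall()
```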

return 1
except apsw.Error as err:
log.exception("Database Error: %s" % err)
Expand All @@ -95,10 +101,10 @@ def __select(self, query, params, login=False):
if login:
cursor = DB.__create_connection(self.__dbfile).cursor()
else:
cursor = self.__cursor

DB.__log_and_execute(cursor, query, params)
return cursor.fetchall()
cursor = self.__conn.cursor()
with cursor:
DB.__log_and_execute(cursor, query, params)
return cursor.fetchall()
Comment on lines 95 to +107
Copilot AI commented on Jan 20, 2026:

When login=True, a new connection is created but never explicitly closed, which could lead to resource leaks. The cursor context manager only closes the cursor, not the underlying connection. Consider storing the connection and ensuring it's properly closed, or restructure to reuse the existing connection.

except apsw.Error as err:
log.exception("Database Error: %s" % err)
return None