Conversation
Consolidate all Python dependencies from 7 requirements/*.txt files into pyproject.toml dependency groups. Replace pip install with uv sync in all Docker containers, using --system-site-packages for access to apt-installed packages (python3-gi) and /opt/venv to survive volume mounts in dev/test.

Key changes:
- Add dependency groups: server, celery, websocket, viewer, wifi-connect, dev, test, host, local (plus existing dev-host, docker-image-builder)
- Bump cryptography 3.3.2→44.0.3, pyOpenSSL 19.1.0→25.1.0 for Python 3.11
- Update all Dockerfile templates to use uv sync with --only-group
- Install uv via COPY --from=ghcr.io/astral-sh/uv:latest (pip fallback for pi1/pi2)
- Add arm64 support to test Dockerfile (chromium from apt instead of Chrome for Testing)
- Switch Dependabot from pip to uv ecosystem
- Update bin/install.sh to parse pyproject.toml instead of requirements.host.txt
- Delete requirements/ directory

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
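The uv-based Dockerfile pattern described above might look roughly like this sketch. The base image, group name, and paths are illustrative assumptions, not the PR's exact files:

```dockerfile
# Hypothetical sketch of the uv install pattern described above.
FROM debian:bookworm-slim

# Install uv by copying the static binaries from the official image
COPY --from=ghcr.io/astral-sh/uv:latest /uv /uvx /bin/

WORKDIR /usr/src/app
COPY pyproject.toml uv.lock ./

# Put the venv outside the bind-mounted source tree so it survives
# volume mounts in dev/test, and expose apt-installed packages
# (e.g. python3-gi) via --system-site-packages
ENV UV_PROJECT_ENVIRONMENT=/opt/venv
RUN uv venv --system-site-packages /opt/venv \
    && uv sync --frozen --only-group server
```

Keeping the environment at /opt/venv (via UV_PROJECT_ENVIRONMENT) rather than the default .venv inside the project directory is what prevents a dev-mode volume mount over /usr/src/app from shadowing the installed packages.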
Notable bumps:
- celery 5.2.2 → 5.6.2
- cryptography 44.0.3 → 46.0.5
- gunicorn 23.0.0 → 25.1.0
- pyzmq 23.2.1 → 27.1.0
- redis 7.1.0 → 7.3.0
- sh 1.8 → 2.2.2
- time-machine 2.15.0 → 3.2.0
- ruff 0.14.10 → 0.15.5
- selenium 4.36.0 → 4.41.0
- yt-dlp 2026.2.21 → 2026.3.3

Kept at current versions:
- Django 4.2.29 (LTS; 5.x/6.x are breaking upgrades)
- django-dbbackup 4.2.1 (5.x requires Django 5+)
- ansible-core 2.18.3 (used in install.sh)

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Add back WORKDIR /usr/src/app in the server, celery, and websocket Dockerfiles, which was lost during the uv migration. Without it, containers crash on startup because CMD runs from / instead of /usr/src/app. Also apply ruff formatting to utils.py and add a wait step in the OpenAPI schema CI workflow.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
- Delete legacy API versions (v1, v1.1, v1.2), keep only v2
- Replace Celery + Redis with threading-based background tasks
- Replace React/TypeScript/webpack frontend with Django templates + HTMX + Bootstrap 5
- Replace ZMQ pub/sub with Django Channels WebSocket
- Replace Nginx with WhiteNoise for static file serving
- Replace Gunicorn with Daphne (ASGI) for WebSocket support
- Simplify Docker from 6 services to 2 (server + viewer)
- Migrate test suite from unittest to pytest
- Remove ~20,000 lines of code and dozens of dependencies

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
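A minimal sketch of what replacing a Celery task queue with threading-based background tasks can look like. The helper name and usage are assumptions for illustration, not the PR's actual code:

```python
import threading

def run_in_background(func, *args, **kwargs) -> threading.Thread:
    """Fire-and-forget a callable on a daemon thread.

    A daemon thread will not block process shutdown, which roughly
    matches the fire-and-forget semantics of a Celery .delay() call
    for in-process work that needs no broker or result backend.
    """
    thread = threading.Thread(target=func, args=args, kwargs=kwargs, daemon=True)
    thread.start()
    return thread

# Usage: where the old code called cleanup_assets.delay(),
# the new code can call run_in_background(cleanup_assets)
```

The trade-off is losing Celery's retries, persistence, and cross-process distribution, which is acceptable here because the simplified deployment runs everything in a single server container.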
- Set minimum Python version to 3.13, remove Python 2 compatibility
- Remove legacy imports: future, six, builtins, __future__, configparser, importlib-metadata, pep8, mock, unittest-parametrize
- Add modern type hints throughout (PEP 604 union syntax, PEP 585 generics)
- Replace .format() with f-strings, use super() without args
- Replace pytz with datetime.timezone, distutils.strtobool with inline impl
- Use conditional import for cec hardware dependency
- Clean up uv.lock (6 packages removed)

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
```python
hashed_password = hashlib.sha256(password.encode('utf-8')).hexdigest()

def check_password(self, password: str) -> bool:
    hashed_password = hashlib.sha256(
        password.encode('utf-8')
```
Check failure: Code scanning / CodeQL
Use of a broken or weak cryptographic hashing algorithm on sensitive data (High)
Copilot Autofix
AI about 6 hours ago
In general, the fix is to replace the use of raw hashlib.sha256 for password hashing with a dedicated password hashing scheme that is slow and salted, such as PBKDF2 (via hashlib.pbkdf2_hmac), bcrypt, scrypt, or Argon2. This requires two coordinated changes: (1) when storing/updating a password, compute a strong password hash (including salt and iteration count) and store enough parameters or metadata to verify it later; and (2) when verifying a login, recompute the hash for the candidate password using the same parameters and compare in constant time.
Within the constraints of only editing the shown snippet and only adding well‑known imports, the least invasive approach that preserves existing behavior is to introduce helper functions in BasicAuth that encapsulate password hashing and verification using PBKDF2‑HMAC with SHA‑256, along with a per‑password random salt. We can continue to store the hash as a hex string in self.settings['password'], but we also need to store the salt and iteration count. Since we must not assume schema changes elsewhere, the safest change is to embed all these parameters into a single string value (e.g., pbkdf2_sha256$iterations$salt_hex$hash_hex) and keep self.settings['password'] as that string. Then:
- Add a method `hash_password(self, password: str) -> str` in `BasicAuth` that:
  - Generates a secure random salt with `os.urandom`.
  - Uses `hashlib.pbkdf2_hmac('sha256', password_bytes, salt, iterations)` to derive a key.
  - Returns a formatted string encoding algorithm, iteration count, salt (hex), and hash (hex).
- Add a method `verify_password(self, password: str, stored: str) -> bool` that:
  - Parses the stored string.
  - Recomputes PBKDF2 with the same parameters.
  - Uses `hmac.compare_digest` to compare hashes safely.
- Update `check_password` to call `verify_password` instead of doing `hashlib.sha256(...).hexdigest()`.
- Update `update_settings` so that whenever a new password is set, it calls `hash_password` to generate the stored value, instead of hashing with plain SHA-256. This covers all UPDATE and initial-SET paths.
These changes remain local to lib/auth.py, do not alter the public interface of BasicAuth, and they address all four alert variants because every path that uses or generates a password hash will stop using plain SHA‑256 and instead use a computationally expensive PBKDF2-based hash.
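The scheme the autofix describes can be sketched as below. The storage format and method behavior mirror the suggestion, but the iteration count is an illustrative assumption, and this is not the exact code applied to lib/auth.py:

```python
import hashlib
import hmac
import os

# Illustrative work factor; pick per current guidance for PBKDF2-SHA256
ITERATIONS = 600_000

def hash_password(password: str) -> str:
    """Return 'pbkdf2_sha256$iterations$salt_hex$hash_hex' for storage.

    A fresh random salt per password means identical passwords produce
    different stored values, defeating precomputed rainbow tables.
    """
    salt = os.urandom(16)
    derived = hashlib.pbkdf2_hmac(
        'sha256', password.encode('utf-8'), salt, ITERATIONS
    )
    return f'pbkdf2_sha256${ITERATIONS}${salt.hex()}${derived.hex()}'

def verify_password(password: str, stored: str) -> bool:
    """Recompute PBKDF2 with the stored parameters; compare in constant time."""
    algorithm, iterations, salt_hex, hash_hex = stored.split('$')
    if algorithm != 'pbkdf2_sha256':
        return False
    derived = hashlib.pbkdf2_hmac(
        'sha256', password.encode('utf-8'),
        bytes.fromhex(salt_hex), int(iterations),
    )
    return hmac.compare_digest(derived.hex(), hash_hex)
```

Embedding the iteration count in the stored string lets the work factor be raised later without invalidating existing hashes: old entries still verify with their recorded parameters.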
- Remove static/sass/ (React SCSS) and webpack.prod.js (no longer needed)
- Fix channels WebSocketConsumer → WebsocketConsumer (channels 4.x API)
- Add build args to docker-compose.dev.yml for GIT_*/DEVICE_TYPE

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>


This is a big refactoring project. Do not merge.