
Modernize #14

Draft

JarbasAl wants to merge 5 commits into `dev` from `modernize`

Conversation

@JarbasAl
Member

@JarbasAl JarbasAl commented Mar 18, 2026

Summary by CodeRabbit

Release Notes

  • New Features

    • Migrated from Flask to FastAPI for improved performance
    • Added multi-service translation compatibility routers (LibreTranslate, DeepL, Google Translate, Azure Translator, Amazon Translate)
    • Introduced dedicated language detection endpoints
  • Documentation

    • Added comprehensive user documentation, FAQ, and technical reports
  • Tests

    • Added compatibility router unit tests
  • Chores

    • Consolidated CI/CD workflows with centralized automation
    • Updated project packaging configuration

JarbasAl and others added 3 commits March 17, 2026 22:53
- Replace Flask with FastAPI + uvicorn[standard] in __init__.py,
  __main__.py, requirements.txt, pyproject.toml, setup.py
- Introduce TranslateEngineWrapper for clean dependency injection;
  remove global TX/DETECT state
- start_translate_server() now returns (app, engine) tuple instead of
  blocking; port/host args moved to uvicorn.run() call site
- CORSMiddleware now unconditionally sets allow_origins=["*"]
- Fix build_tests.yml: replace broken bespoke inline workflow with
  gh-automations reusable build-tests.yml@dev
- Fix license_tests.yml, publish_stable.yml, release_workflow.yml:
  migrate from TigreGotico/gh-automations@master to
  OpenVoiceOS/gh-automations@dev, add bot-safety guard, add
  workflow_dispatch trigger
- Add lint.yml and pip_audit.yml workflows using gh-automations@dev
- Add pyproject.toml (dynamic version from version.py)
- Add/update docs/index.md, QUICK_FACTS.md, FAQ.md, AUDIT.md,
  SUGGESTIONS.md, MAINTENANCE_REPORT.md

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
Adds LibreTranslate, DeepL, Google Translate v2, Azure Translator v3,
and Amazon Translate-compatible endpoints so any 3rd-party app can use
OVOS translation plugins as a drop-in replacement.

- New routers/: libretranslate, deepl, google_translate, azure_translator,
  amazon_translate — all via make_X_router(engine) factory pattern
- All schemas use Pydantic Field() constraints with Literal types
- Lang code normalisation: DeepL inbound EN-US→en-us, outbound→EN-US
- Azure Translator note: /translate, /detect, /languages paths conflict
  with LibreTranslate — must be registered in an isolated app or added
  after libretranslate router (first-registered wins)
- pyproject.toml: optional [lang-names] extra for langcodes
- test/unittests/test_compat_routers.py: 22 tests, all passing

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
Each compat router now mounts under a prefix that identifies the API vendor:
  /libretranslate/...     LibreTranslate
  /deepl/v2/...           DeepL
  /google/language/...    Google Translate v2
  /azure/...              Azure Translator v3
  /amazon/translate/...   Amazon Translate

Benefits: all routers coexist in one app with no path conflicts (previously
LibreTranslate and Azure Translator conflicted on /translate, /detect,
/languages). Swagger groups endpoints clearly by vendor.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
@coderabbitai

coderabbitai bot commented Mar 18, 2026

Important

Review skipped

Draft detected.

Please check the settings in the CodeRabbit UI or the .coderabbit.yaml file in this repository. To trigger a single review, invoke the @coderabbitai review command.

⚙️ Run configuration

Configuration used: defaults

Review profile: CHILL

Plan: Pro

Run ID: b4a7712f-f933-42c0-b23f-fd20b822755f

You can disable this status message by setting the reviews.review_status to false in the CodeRabbit configuration file.

📝 Walkthrough

The pull request performs a major architectural migration of ovos-translate-server from Flask to FastAPI, consolidates CI workflows into reusable external workflows, introduces multiple API-compatible routers for different translation providers, and adds comprehensive project documentation.

Changes

**GitHub Workflows Refactor** (`.github/workflows/build_tests.yml`, `license_tests.yml`, `lint.yml`, `pip_audit.yml`)
Replace inline CI steps with references to external OpenVoiceOS/gh-automations reusable workflows. Removes local Python setup, dependency installation, and test-execution logic; consolidates branch triggers and configuration.

**Workflow Jobs Consolidation** (`.github/workflows/publish_stable.yml`, `release_workflow.yml`)
Simplify multi-job publishing and release workflows into single-job designs. Remove the separate publish_pypi, sync_dev, notify, and propose_release jobs; fold their logic into guarded publish_alpha and publish_stable jobs with consolidated flags.

**Core Server Migration** (`ovos_translate_server/__init__.py`)
Migrate from the Flask-based server to FastAPI. Introduce TranslateEngineWrapper to handle plugin loading and manage detection/translation logic. Add a create_app factory and refactor start_translate_server to return an (app, engine) tuple instead of starting the server directly. New endpoints: /status, /detect, /classify, and /translate with variants.

**CLI Entrypoint Overhaul** (`ovos_translate_server/__main__.py`)
Rewrite main() with argparse for CLI argument handling (tx-engine, detect-engine, host, port). Invoke uvicorn.run directly instead of relying on start_translate_server to manage the server lifecycle.

**API Compatibility Routers** (`ovos_translate_server/routers/libretranslate.py`, `deepl.py`, `google_translate.py`, `azure_translator.py`, `amazon_translate.py`)
Add five new FastAPI routers providing drop-in compatibility with external translation APIs (LibreTranslate, DeepL, Google Translate v2, Azure Translator v3, Amazon Translate). Each implements endpoints for translation, detection, and language listing with provider-specific Pydantic models and request/response formats.

**Version & Build Configuration** (`ovos_translate_server/version.py`, `pyproject.toml`, `requirements.txt`, `setup.py`)
Add a __version__ export computed from version constants. Introduce pyproject.toml with FastAPI/uvicorn dependencies and a console-script entry point. Update requirements.txt to replace flask with fastapi and uvicorn[standard]. Update the setup.py description to reference FastAPI.

**Documentation Suite** (`QUICK_FACTS.md`, `FAQ.md`, `MAINTENANCE_REPORT.md`, `AUDIT.md`, `SUGGESTIONS.md`, `docs/index.md`)
Add comprehensive project documentation covering quick facts, FAQs, maintenance history, audit findings, improvement suggestions, and detailed API/architecture docs. Document the FastAPI migration, plugin loading, endpoint behaviors, and testing recommendations.

**Test Suite** (`test/unittests/test_compat_routers.py`)
Introduce comprehensive unit tests for all five compatibility routers using FakeEngine mocks. Validate translation, detection, language listing, and error handling across the LibreTranslate, DeepL, Google, Azure, and Amazon provider implementations.

Sequence Diagram

```mermaid
sequenceDiagram
    participant Client
    participant FastAPI as FastAPI App
    participant Wrapper as TranslateEngineWrapper
    participant TxPlugin as Translation Plugin
    participant DetectPlugin as Detection Plugin

    Client->>FastAPI: GET /translate/{src}/{tgt}/{text}
    FastAPI->>Wrapper: Lookup router & call translate
    Wrapper->>TxPlugin: Invoke translate(text, src, tgt)
    TxPlugin-->>Wrapper: Return translated_text
    Wrapper-->>FastAPI: Return translation result
    FastAPI-->>Client: 200 with JSON response

    Client->>FastAPI: GET /detect/{text}
    FastAPI->>Wrapper: Call detect endpoint
    alt detect_plugin available
        Wrapper->>DetectPlugin: Invoke detect(text)
        DetectPlugin-->>Wrapper: Return language
    else fallback to tx plugin
        Wrapper->>TxPlugin: Invoke tx.detect(text)
        TxPlugin-->>Wrapper: Return language
    end
    Wrapper-->>FastAPI: Return detected language
    FastAPI-->>Client: 200 with detection data
```
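Client-side, the first flow in the diagram boils down to a single GET. A hedged sketch (the host, port, and response keys are assumptions; only the path shape comes from the diagram):

```python
import json
from urllib.parse import quote
from urllib.request import urlopen


def build_translate_url(host: str, src: str, tgt: str, text: str) -> str:
    # path shape from the diagram: GET /translate/{src}/{tgt}/{text}
    return f"{host}/translate/{src}/{tgt}/{quote(text)}"


def translate(host: str, src: str, tgt: str, text: str) -> dict:
    with urlopen(build_translate_url(host, src, tgt, text)) as resp:
        return json.load(resp)


# e.g. translate("http://localhost:9686", "en", "pt", "hello world")
```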

Estimated code review effort

🎯 4 (Complex) | ⏱️ ~45 minutes

Poem

🐰 A Flask of yesteryear hops away,
FastAPI arrives bright as the day!
Five routers bloom where one stood alone—
DeepL, Google, Azure are shown.
With docs and workflows now automated,
This translate server's elevated!

🚥 Pre-merge checks | ✅ 1 | ❌ 2

❌ Failed checks (1 warning, 1 inconclusive)

| Check name | Status | Explanation | Resolution |
|---|---|---|---|
| Docstring Coverage | ⚠️ Warning | Docstring coverage is 49.15%, which is below the required threshold of 80.00%. | Write docstrings for the functions missing them to satisfy the coverage threshold. |
| Title check | ❓ Inconclusive | The title 'Modernize' is vague and generic, failing to convey the specific nature of the substantial changes in this pull request. | Use a more descriptive title that captures the main change, such as 'Migrate from Flask to FastAPI and centralize CI workflows' or 'Modernize: Replace Flask with FastAPI and adopt reusable workflows'. |

✅ Passed checks (1 passed)

| Check name | Status | Explanation |
|---|---|---|
| Description Check | ✅ Passed | Check skipped; CodeRabbit's high-level summary is enabled. |

✏️ Tip: You can configure your own custom pre-merge checks in the settings.


Thanks for using CodeRabbit! It's free for OSS, and your support helps us grow. If you like it, consider giving us a shout-out.


Comment @coderabbitai help to get the list of available commands and usage tips.

@github-actions

github-actions bot commented Mar 18, 2026

The automated pipeline has reached its destination. 🏁

I've aggregated the results of the automated checks for this PR below.

🔍 Lint

The automated results are now available for your perusal. 📂

ruff: issues found — see job log

🔒 Security (pip-audit)

Scanning for any potential SQL injection points. 💉

✅ No known vulnerabilities found (75 packages scanned).

⚖️ License Check

Auditing the legal lineage of this contribution. 📜

✅ No license violations found (56 packages).

License distribution: 13× MIT, 13× MIT License, 6× Apache Software License, 6× BSD-3-Clause, 5× Apache-2.0, 2× ISC License (ISCL), 2× PSF-2.0, 2× Python Software Foundation License, +7 more

Full breakdown — 56 packages

| Package | Version | License | URL |
|---|---|---|---|
| annotated-doc | 0.0.4 | MIT | link |
| annotated-types | 0.7.0 | MIT License | link |
| anyio | 4.12.1 | MIT | link |
| audioop-lts | 0.2.2 | PSF-2.0 | link |
| build | 1.4.0 | MIT | link |
| certifi | 2026.2.25 | Mozilla Public License 2.0 (MPL 2.0) | link |
| charset-normalizer | 3.4.6 | MIT | link |
| click | 8.3.1 | BSD-3-Clause | link |
| combo_lock | 0.3.0 | Apache Software License | link |
| fastapi | 0.135.1 | MIT | link |
| filelock | 3.25.2 | MIT | link |
| h11 | 0.16.0 | MIT License | link |
| httptools | 0.7.1 | MIT | link |
| idna | 3.11 | BSD-3-Clause | link |
| importlib_metadata | 8.7.1 | Apache-2.0 | link |
| json-database | 0.10.1 | MIT | link |
| kthread | 0.2.3 | MIT License | link |
| langcodes | 3.5.1 | MIT License | link |
| markdown-it-py | 4.0.0 | MIT License | link |
| mdurl | 0.1.2 | MIT License | link |
| memory-tempfile | 2.2.3 | MIT License | link |
| ovos-config | 2.1.1 | Apache-2.0 | link |
| ovos-plugin-manager | 2.2.0 | Apache-2.0 | link |
| ovos-translate-server | 0.0.3a2 | Apache-2.0 | link |
| ovos-utils | 0.8.5 | Apache-2.0 | link |
| ovos_bus_client | 1.5.0 | Apache Software License | link |
| packaging | 26.0 | Apache-2.0 OR BSD-2-Clause | link |
| pexpect | 4.9.0 | ISC License (ISCL) | link |
| ptyprocess | 0.7.0 | ISC License (ISCL) | link |
| pydantic | 2.12.5 | MIT | link |
| pydantic_core | 2.41.5 | MIT | link |
| pyee | 12.1.1 | MIT License | link |
| Pygments | 2.19.2 | BSD License | link |
| pyproject_hooks | 1.2.0 | MIT License | link |
| python-dateutil | 2.9.0.post0 | Apache Software License; BSD License | link |
| python-dotenv | 1.2.2 | BSD-3-Clause | link |
| PyYAML | 6.0.3 | MIT License | link |
| quebra-frases | 0.3.7 | Apache Software License | link |
| regex | 2026.2.28 | Apache-2.0 AND CNRI-Python | link |
| requests | 2.32.5 | Apache Software License | link |
| rich | 13.9.4 | MIT License | link |
| rich-click | 1.9.7 | MIT License | link |
| six | 1.17.0 | MIT License | link |
| standard-aifc | 3.13.0 | Python Software Foundation License | link |
| standard-chunk | 3.13.0 | Python Software Foundation License | link |
| starlette | 0.52.1 | BSD-3-Clause | link |
| typing-inspection | 0.4.2 | MIT | link |
| typing_extensions | 4.15.0 | PSF-2.0 | link |
| urllib3 | 2.6.3 | MIT | link |
| uvicorn | 0.42.0 | BSD-3-Clause | link |
| uvloop | 0.22.1 | Apache Software License; MIT License | |
| watchdog | 6.0.0 | Apache Software License | link |
| watchfiles | 1.1.1 | MIT License | link |
| websocket-client | 1.9.0 | Apache Software License | link |
| websockets | 16.0 | BSD-3-Clause | link |
| zipp | 3.23.0 | MIT | link |

Policy: Apache 2.0 (universal donor). StrongCopyleft / NetworkCopyleft / WeakCopyleft / Other / Error categories fail. MPL allowed.

🔨 Build Tests

Checking the blueprint against the actual construction. 📐

| Python | Build | Install |
|---|---|---|
| 3.10 | ❌ `python -m build` failed | |
| 3.11 | ❌ `python -m build` failed | |
| 3.12 | ❌ `python -m build` failed | |
| 3.13 | ❌ `python -m build` failed | |
| 3.14 | ❌ `python -m build` failed | |

Check job logs for details.


An automated high-five for your latest changes! 🖐️

@JarbasAl JarbasAl marked this pull request as draft March 18, 2026 16:35

@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 16

🧹 Nitpick comments (8)
QUICK_FACTS.md (1)

9-9: Avoid hard-coding the version in docs.

Line 9 will drift on the next release. Prefer generating this value from ovos_translate_server/version.py (or templating during docs build).

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@QUICK_FACTS.md` at line 9, The README currently hard-codes the version string
in QUICK_FACTS.md; instead, read the package version from
ovos_translate_server/version.py (e.g. the __version__ symbol) or inject it at
docs build time via a template variable; update QUICK_FACTS.md to use a
placeholder (or Jinja/Sphinx substitution) and modify the docs build step to
import ovos_translate_server.version.__version__ and render that value into the
placeholder so the docs always reflect the package version.
ovos_translate_server/routers/amazon_translate.py (1)

87-95: Move import langcodes outside the loop.

The langcodes import is executed on every iteration. Move it before the loop or to module level for better performance, especially with large language lists.

♻️ Proposed fix

```diff
     def list_languages(
             authorization: Optional[str] = Header(default=None),
     ) -> AmazonListLanguagesResponse:
         ...
         languages = []
+        try:
+            import langcodes
+        except ImportError:
+            langcodes = None
         for code in engine.langs:
-            try:
-                import langcodes
-                name = langcodes.Language.get(code).display_name()
-            except Exception:
+            if langcodes is not None:
+                try:
+                    name = langcodes.Language.get(code).display_name()
+                except Exception:
+                    name = code
+            else:
                 name = code
             languages.append(AmazonLanguage(LanguageCode=code, LanguageName=name))
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@ovos_translate_server/routers/amazon_translate.py` around lines 87 - 95, The
loop in which AmazonLanguage objects are created repeatedly imports langcodes,
causing unnecessary overhead; move the import of langcodes out of the loop
(preferably to module level or at the top of the function that builds the list)
so that engine.langs is iterated without re-importing; update the block where
languages are constructed (the code that appends
AmazonLanguage(LanguageCode=code, LanguageName=name) and returns
AmazonListLanguagesResponse) to use the single imported langcodes instance.
ovos_translate_server/routers/azure_translator.py (2)

158-169: Move import langcodes outside the loop.

Same issue as in amazon_translate.py - the import is executed on every iteration.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@ovos_translate_server/routers/azure_translator.py` around lines 158 - 169,
The loop in the Azure languages builder repeatedly imports langcodes on each
iteration which is inefficient; move the import langcodes statement out of the
for loop and perform it once before iterating over engine.langs, then use
langcodes.Language.get(code) inside the loop to populate AzureLanguageInfo
entries and return AzureLanguagesResponse(translation=translation); keep the
existing try/except around the Language.get/display_name calls to preserve
fallback behavior.

94-110: Redundant or None assignment.

Line 98 source = from_lang or None is redundant since from_lang is already Optional[str] with default=None. If from_lang is falsy (empty string), you'd want to treat it as None, but the Query already enforces min_length=1, so an empty string won't pass validation.

♻️ Simplified assignment

```diff
-            source = from_lang or None
+            source = from_lang
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@ovos_translate_server/routers/azure_translator.py` around lines 94 - 110, The
assignment source = from_lang or None is redundant; replace it by using
from_lang directly (e.g., remove the source variable or set source = from_lang)
and pass that into engine.tx.translate (translate(item.Text, target=tgt,
source=from_lang)) so you don't coerce a valid Optional[str] to None
unnecessarily; keep the existing detected language handling (_detect_lang,
AzureTranslation, AzureTranslateItem) unchanged.
test/unittests/test_compat_routers.py (1)

18-46: Mutable class attribute defaults in test fakes.

available_languages (line 19) and langs (line 41) use mutable list defaults. While harmless in this test context since they're never mutated, annotating with ClassVar or using a property would silence the Ruff RUF012 warning and prevent accidental mutation.

♻️ Optional fix using ClassVar

```diff
-from typing import Dict, List, Optional
+from typing import ClassVar, Dict, List, Optional

 class FakeTx:
-    available_languages: List[str] = ["en", "de", "fr", "es"]
+    available_languages: ClassVar[List[str]] = ["en", "de", "fr", "es"]

 ...

 class FakeEngine:
-    plugin_name: str = "fake-translate"
-    langs: List[str] = ["en", "de", "fr", "es"]
+    plugin_name: ClassVar[str] = "fake-translate"
+    langs: ClassVar[List[str]] = ["en", "de", "fr", "es"]
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@test/unittests/test_compat_routers.py` around lines 18 - 46, The test fakes
use mutable list defaults which trigger RUF012: change the class-level mutable
lists to non-mutable class-vars or instance attributes; specifically annotate
FakeTx.available_languages and FakeEngine.langs as ClassVar[List[str]] (or move
them into __init__ as self.available_languages/self.langs) so they are not
mutable class defaults and the linter warning is silenced while preserving
current behavior.
ovos_translate_server/routers/google_translate.py (1)

109-114: Redundant try-except block.

_detect_one(text) (lines 74-86) already catches all exceptions internally and returns a default GoogleDetection(language="und", confidence=0.0). The outer try-except here is unnecessary.

♻️ Simplified code

```diff
             detected_src = None
             if not request.source:
-                try:
-                    det = _detect_one(text)
-                    detected_src = det.language
-                except Exception:
-                    pass
+                det = _detect_one(text)
+                detected_src = det.language
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@ovos_translate_server/routers/google_translate.py` around lines 109 - 114,
The outer try-except around _detect_one(text) is redundant because _detect_one
already handles exceptions and returns a default GoogleDetection; remove the
try/except and directly call det = _detect_one(text) and set detected_src =
det.language when request.source is falsy (preserve the existing conditional).
Update the block containing request.source, _detect_one, and detected_src
accordingly (references: _detect_one and detected_src).
ovos_translate_server/routers/libretranslate.py (2)

100-108: Move import langcodes outside the loop.

Same issue as other routers - import inside loop is inefficient.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@ovos_translate_server/routers/libretranslate.py` around lines 100 - 108, The
import of langcodes is done inside the loop (in the block that iterates over
engine.langs) which is inefficient; move the import statement out of the loop
(e.g., at the top of the function that builds result or module top) so langcodes
is imported once, then use langcodes.Language.get(code).display_name() inside
the loop to construct LibreLanguage(code=code, name=name) for each code; update
the function that contains engine.langs and the LibreLanguage construction to
reference the pre-imported langcodes.

71-91: Missing error handling in detect endpoint.

Unlike other routers (DeepL, Azure, Google) which wrap detection in try-except with fallback, this endpoint will propagate exceptions directly to the client if detect_probs fails. Consider adding a try-except block for consistency and resilience.

♻️ Proposed fix with error handling

```diff
     @router.post("/detect", response_model=List[LibreDetectEntry])
     def detect(request: LibreDetectRequest) -> List[LibreDetectEntry]:
         ...
-        if engine.detect is not None:
-            probs = engine.detect.detect_probs(request.q)
-        else:
-            probs = engine.tx.detect_probs(request.q)
+        try:
+            if engine.detect is not None:
+                probs = engine.detect.detect_probs(request.q)
+            else:
+                probs = engine.tx.detect_probs(request.q)
+        except Exception:
+            return []

         results = [
             LibreDetectEntry(language=lang, confidence=float(conf))
             for lang, conf in probs.items()
         ]
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@ovos_translate_server/routers/libretranslate.py` around lines 71 - 91, The
detect endpoint currently calls engine.detect.detect_probs or
engine.tx.detect_probs without error handling; wrap the detection calls inside a
try-except in the detect function so exceptions from detect_probs are caught,
log or handle the exception, and attempt the fallback detection (e.g., call
engine.tx.detect_probs if the primary call fails) before returning the
LibreDetectEntry list; ensure you reference the existing symbols (detect
function, engine.detect, engine.tx, detect_probs, LibreDetectEntry) when adding
the try-except and fallback so behavior matches other routers' resilience.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In @.github/workflows/license_tests.yml:
- Around line 13-15: The reusable workflow invocation currently only passes
system_deps: 'swig' and omits the project-specific license policy; update the
license_tests.yml call to forward the same exclusions/overrides used by
test/license_tests.py by adding the reusable-workflow inputs for
exclude_packages and exclude_licenses and the custom overrides for kthread→MIT,
pyxdg→GPL-2.0, ptyprocess→ISC and psutil→BSD3 (and any strict validation flag
used in test/license_tests.py) so the external workflow enforces the project
policy rather than its defaults; make sure to reference the existing system_deps
input and the package names pyxdg, kthread, ptyprocess, psutil when adding these
inputs.

In @.github/workflows/pip_audit.yml:
- Around line 11-12: Replace the mutable reusable-workflow reference that uses
"@dev" with an immutable commit SHA: locate the line containing "uses:
OpenVoiceOS/gh-automations/.github/workflows/pip-audit.yml@dev" and update the
suffix to a specific commit SHA (e.g., "@<commit-sha>"); do the same for the
other affected workflow files that reference OpenVoiceOS/gh-automations (search
for "uses: OpenVoiceOS/gh-automations/.github/workflows/...@dev") so all
reusable workflow links are pinned to explicit commit SHAs.

In @.github/workflows/publish_stable.yml:
- Around line 10-16: The workflow is importing a reusable workflow via the
moving ref "uses:
OpenVoiceOS/gh-automations/.github/workflows/publish-stable.yml@dev", which
allows behavior to change unexpectedly; replace the branch ref (`@dev`) with an
immutable full commit SHA for that reusable workflow reference so
secrets/publishing behavior is pinned (keep the rest of the inputs—branch,
version_file, publish_pypi, sync_dev—unchanged).

In @.github/workflows/release_workflow.yml:
- Around line 12-22: The workflow currently references the reusable publishing
workflow via a mutable branch ref
("OpenVoiceOS/gh-automations/.github/workflows/publish-alpha.yml@dev"), which
risks changing behavior and exposing inherited secrets; update that "uses" value
to pin to an immutable full commit SHA of the target repo (replace the "@dev"
ref with "@<full-commit-sha>") and do the same for the equivalent reference in
publish_stable.yml (the "publish-stable.yml@dev" entry), ensuring both
publish_alpha and publish_stable workflows point to exact commit SHAs to lock
release behavior and minimize blast radius of inherited secrets.

In `@AUDIT.md`:
- Line 13: Update the AUDIT.md entry that currently claims the test/ directory
is empty: replace the `[MAJOR] tests: No unit tests found` line with a note
acknowledging the added router compatibility tests
(test/unittests/test_compat_routers.py) and reframe the finding to identify the
remaining gap—lack of plugin-loading and configuration coverage (e.g.,
mock-based unit tests for ovos_translate_server plugin loading and config
handling). Ensure the new bullet references the existing test file
(test/unittests/test_compat_routers.py) and clearly states that targeted
plugin-loading/config tests are needed.

In `@docs/index.md`:
- Around line 171-177: TranslateEngineWrapper currently instantiates plugin
classes with config={} which drops per-plugin settings; update the instantiation
in TranslateEngineWrapper (ovos_translate_server.__init__.py) to load and pass
the proper config for each plugin (e.g. cfg = Configuration().get("language",
{}); PluginClass(config=cfg.get(plugin_name, {})) or equivalent) instead of an
empty dict so credential-based backends (DeepL, Azure, Amazon) receive their API
keys/endpoints.
- Around line 96-99: The fenced code examples that currently start with the raw
lines like `GET /detect/o meu nome é Casimiro` (and the other similar
request/response examples in the same docs section) lack a language tag and
trigger MD040; update each fenced block in docs/index.md that contains those
HTTP request/response examples (e.g., the blocks around the GET /detect examples
and the other pairs noted in the comment) to include a language identifier such
as ```http or ```console (or ```text) so the markdownlint warning is resolved
and the examples render with proper syntax highlighting.
- Around line 41-49: Update the docs to reflect the real default and fallback:
change the `--detect-engine` default from `ovos-lang-detector-classics-plugin`
to `None` (or "not set") and document that when unset the server handlers call
engine.tx.detect() / engine.tx.detect_probs() as the fallback behavior;
reference the flag name `--detect-engine`, the module
`ovos_translate_server.__main__.py`, and the handler behavior
`engine.tx.detect()` / `detect_probs()` so readers know the actual runtime
behavior.

In `@FAQ.md`:
- Around line 11-22: Update the repo-root command examples in FAQ.md so they are
copy-pasteable from the project root: replace the development install example
shown in the pip snippet (currently referencing "pip install -e
ovos-translate-server/") with the repo-root form "pip install -e ." and update
the test command (currently "pytest ovos-translate-server/test/...") to "pytest
test/..." (leave any task-runner prefix like "uv" in place if intended). Locate
the two snippets in the FAQ.md file (the pip install block and the pytest block)
and change only the path arguments to the repo-root equivalents.

In `@MAINTENANCE_REPORT.md`:
- Around line 18-21: Update the AI Transparency Report entry so it accurately
reflects that unit tests were added: mention the new test file
test_compat_routers.py instead of saying "no tests added", and replace the
blanket statement with a brief note of the remaining coverage gap (e.g., plugin
mocking/integration tests still required). Edit the section under "Actions
Taken"/"Oversight" to state "Unit tests added: test_compat_routers.py;
remaining: plugin mocking/integration tests and expanded coverage" and ensure
the Oversight line requests targeted human review for those missing tests.

In `@ovos_translate_server/__init__.py`:
- Around line 90-96: The CORS setup uses app.add_middleware with CORSMiddleware
allowing allow_origins=["*"] while allow_credentials=True, which is insecure and
incompatible with credentialed responses; update the middleware configuration in
the app.add_middleware call to either disable credentials (set
allow_credentials=False) if you don't need cookies/authenticated requests, or
replace the wildcard origin with an explicit list of trusted origins (e.g.,
["http://localhost:3000", "https://your-app.com"]) while keeping
allow_credentials=True so credentialed requests only succeed for those specific
origins.

In `@ovos_translate_server/__main__.py`:
- Line 35: Change the CLI default host from "0.0.0.0" to "127.0.0.1" in the
argument parser so the server binds to localhost by default; locate the
parser.add_argument call (the "--host" argument) in
ovos_translate_server/__main__.py and update its default value to "127.0.0.1",
leaving the help text and behavior otherwise unchanged so deployments can still
pass --host 0.0.0.0 explicitly.
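A sketch of the suggested change (the port default and help text are assumptions; the flag names follow the comment):

```python
import argparse


def build_parser() -> argparse.ArgumentParser:
    parser = argparse.ArgumentParser(description="ovos-translate-server")
    parser.add_argument("--host", default="127.0.0.1",
                        help="interface to bind; pass 0.0.0.0 to expose externally")
    parser.add_argument("--port", type=int, default=9686)
    return parser


args = build_parser().parse_args([])  # defaults: localhost only
```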

In `@pyproject.toml`:
- Around line 2-10: The pyproject.toml declares build-system.requires =
["setuptools>=61.0", "wheel"] while using the simple license field license =
"Apache-2.0", which requires setuptools >= 77.0.0; fix by either raising the
setuptools floor to "setuptools>=77.0.0" in the build-system.requires entry or
keep the current setuptools version and change the license to the table form
license = { text = "Apache-2.0" } so metadata validation succeeds (edit the
build-system.requires or the license field in pyproject.toml accordingly).
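The two options translate to either of these `pyproject.toml` fragments (version floors copied from the comment; pick one, not both):

```toml
# option (a): raise the setuptools floor and keep the SPDX string form
[build-system]
requires = ["setuptools>=77.0.0", "wheel"]
build-backend = "setuptools.build_meta"

[project]
license = "Apache-2.0"

# option (b): keep setuptools>=61 and use the table form instead
# license = { text = "Apache-2.0" }
```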

In `@QUICK_FACTS.md`:
- Around line 24-25: The "### Scripts" heading in QUICK_FACTS.md causes a
heading level jump (MD001); change its level to increment by one relative to the
previous heading (e.g., replace "### Scripts" with "## Scripts" or adjust
surrounding headings so the sequence is consistent) so the heading hierarchy is
contiguous.

In `@requirements.txt`:
- Around line 1-3: The build fails because setup.py calls
required("requirements.txt") to populate install_requires (remove that call) and
rely on pyproject.toml for dependencies instead; edit setup.py to remove the
install_requires=required("requirements.txt") (or the required(...) helper
usage) so setup.py no longer reads requirements.txt, and optionally add a
MANIFEST.in with "include requirements.txt" only if you keep reading the file;
ensure pyproject.toml remains the single source of declared dependencies
(fastapi, uvicorn[standard], ovos-plugin-manager).

In `@SUGGESTIONS.md`:
- Line 23: Fix the typo in the suggestion title: replace "Add `GET /translate`
POST variant for long texts" with a correct title such as "Add `POST /translate`
endpoint for long texts" (or "Add `POST /translate`") so the heading matches the
described change; update the markdown heading text in SUGGESTIONS.md
accordingly.

---

Nitpick comments:
In `@ovos_translate_server/routers/amazon_translate.py`:
- Around line 87-95: The loop in which AmazonLanguage objects are created
repeatedly imports langcodes, causing unnecessary overhead; move the import of
langcodes out of the loop (preferably to module level or at the top of the
function that builds the list) so that engine.langs is iterated without
re-importing; update the block where languages are constructed (the code that
appends AmazonLanguage(LanguageCode=code, LanguageName=name) and returns
AmazonListLanguagesResponse) to use the single imported langcodes instance.
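The same hoisting pattern applies across the routers; a schematic sketch using a stdlib-only stand-in for `langcodes` (the code/name mapping is illustrative):

```python
# Stand-in for langcodes.Language.get(code).display_name(); the real routers
# would do this lookup via langcodes, imported once at module level.
_NAMES = {"en": "English", "pt": "Portuguese"}

def get_display_name(code: str) -> str:
    # fall back to the raw code, mirroring the routers' try/except fallback
    return _NAMES.get(code, code)

def list_languages(codes):
    # no per-iteration import: the helper above is resolved once
    return [{"LanguageCode": c, "LanguageName": get_display_name(c)} for c in codes]
```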

In `@ovos_translate_server/routers/azure_translator.py`:
- Around line 158-169: The loop in the Azure languages builder repeatedly
imports langcodes on each iteration which is inefficient; move the import
langcodes statement out of the for loop and perform it once before iterating
over engine.langs, then use langcodes.Language.get(code) inside the loop to
populate AzureLanguageInfo entries and return
AzureLanguagesResponse(translation=translation); keep the existing try/except
around the Language.get/display_name calls to preserve fallback behavior.
- Around line 94-110: The assignment source = from_lang or None is redundant;
replace it by using from_lang directly (e.g., remove the source variable or set
source = from_lang) and pass that into engine.tx.translate (translate(item.Text,
target=tgt, source=from_lang)) so you don't coerce a valid Optional[str] to None
unnecessarily; keep the existing detected language handling (_detect_lang,
AzureTranslation, AzureTranslateItem) unchanged.

In `@ovos_translate_server/routers/google_translate.py`:
- Around line 109-114: The outer try-except around _detect_one(text) is
redundant because _detect_one already handles exceptions and returns a default
GoogleDetection; remove the try/except and directly call det = _detect_one(text)
and set detected_src = det.language when request.source is falsy (preserve the
existing conditional). Update the block containing request.source, _detect_one,
and detected_src accordingly (references: _detect_one and detected_src).
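The simplification can be sketched as follows; `_detect_one` here is a fake that only demonstrates the "already handles its own errors" contract described above:

```python
# _detect_one catches its own failures and returns a default, so the caller
# needs no outer try/except. All names and values here are illustrative.

class GoogleDetection:
    def __init__(self, language):
        self.language = language

def _detect_one(text):
    try:
        raise RuntimeError("plugin unavailable")  # pretend detection failed
    except Exception:
        return GoogleDetection(language="und")  # safe default, as in the router

def resolve_source(source, text):
    if source:
        return source
    # direct call, no redundant try/except around it
    return _detect_one(text).language
```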

In `@ovos_translate_server/routers/libretranslate.py`:
- Around line 100-108: The import of langcodes is done inside the loop (in the
block that iterates over engine.langs) which is inefficient; move the import
statement out of the loop (e.g., at the top of the function that builds result
or module top) so langcodes is imported once, then use
langcodes.Language.get(code).display_name() inside the loop to construct
LibreLanguage(code=code, name=name) for each code; update the function that
contains engine.langs and the LibreLanguage construction to reference the
pre-imported langcodes.
- Around line 71-91: The detect endpoint currently calls
engine.detect.detect_probs or engine.tx.detect_probs without error handling;
wrap the detection calls inside a try-except in the detect function so
exceptions from detect_probs are caught, log or handle the exception, and
attempt the fallback detection (e.g., call engine.tx.detect_probs if the primary
call fails) before returning the LibreDetectEntry list; ensure you reference the
existing symbols (detect function, engine.detect, engine.tx, detect_probs,
LibreDetectEntry) when adding the try-except and fallback so behavior matches
other routers' resilience.

In `@QUICK_FACTS.md`:
- Line 9: The README currently hard-codes the version string in QUICK_FACTS.md;
instead, read the package version from ovos_translate_server/version.py (e.g.
the __version__ symbol) or inject it at docs build time via a template variable;
update QUICK_FACTS.md to use a placeholder (or Jinja/Sphinx substitution) and
modify the docs build step to import ovos_translate_server.version.__version__
and render that value into the placeholder so the docs always reflect the
package version.
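One stdlib-only way to do this: parse `__version__` out of `version.py` during the docs build and substitute it into the placeholder. The file content below is illustrative:

```python
import re

def read_version(source: str) -> str:
    """Extract the __version__ string from a version.py-style source text."""
    match = re.search(r'__version__\s*=\s*["\']([^"\']+)["\']', source)
    if not match:
        raise ValueError("no __version__ assignment found")
    return match.group(1)

# Illustrative content; a real docs build step would read version.py instead.
fake_version_py = '__version__ = "0.0.3a2"\n'
```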

In `@test/unittests/test_compat_routers.py`:
- Around line 18-46: The test fakes use mutable list defaults which trigger
RUF012: change the class-level mutable lists to non-mutable class-vars or
instance attributes; specifically annotate FakeTx.available_languages and
FakeEngine.langs as ClassVar[List[str]] (or move them into __init__ as
self.available_languages/self.langs) so they are not mutable class defaults and
the linter warning is silenced while preserving current behavior.
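A minimal illustration of the RUF012 fix, using simplified fakes (the method body is illustrative):

```python
from typing import ClassVar, List

class FakeTx:
    # ClassVar makes the shared class-level default explicit to the linter
    available_languages: ClassVar[List[str]] = ["en", "pt"]

    def translate(self, text: str, target: str, source=None) -> str:
        return f"{text}->{target}"

class FakeEngine:
    def __init__(self):
        # instance-attribute alternative: no mutable state shared across instances
        self.langs: List[str] = ["en", "pt"]
        self.tx = FakeTx()
```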

ℹ️ Review info
⚙️ Run configuration

Configuration used: defaults

Review profile: CHILL

Plan: Pro

Run ID: fa19630e-27ed-4880-bb63-f269f2005868

📥 Commits

Reviewing files that changed from the base of the PR and between d381195 and b366a94.

⛔ Files ignored due to path filters (8)
  • ovos_translate_server/routers/__pycache__/__init__.cpython-311.pyc is excluded by !**/*.pyc
  • ovos_translate_server/routers/__pycache__/amazon_translate.cpython-311.pyc is excluded by !**/*.pyc
  • ovos_translate_server/routers/__pycache__/azure_translator.cpython-311.pyc is excluded by !**/*.pyc
  • ovos_translate_server/routers/__pycache__/deepl.cpython-311.pyc is excluded by !**/*.pyc
  • ovos_translate_server/routers/__pycache__/google_translate.cpython-311.pyc is excluded by !**/*.pyc
  • ovos_translate_server/routers/__pycache__/libretranslate.cpython-311.pyc is excluded by !**/*.pyc
  • test/unittests/__pycache__/__init__.cpython-311.pyc is excluded by !**/*.pyc
  • test/unittests/__pycache__/test_compat_routers.cpython-311-pytest-9.0.2.pyc is excluded by !**/*.pyc
📒 Files selected for processing (26)
  • .github/workflows/build_tests.yml
  • .github/workflows/license_tests.yml
  • .github/workflows/lint.yml
  • .github/workflows/pip_audit.yml
  • .github/workflows/publish_stable.yml
  • .github/workflows/release_workflow.yml
  • AUDIT.md
  • FAQ.md
  • MAINTENANCE_REPORT.md
  • QUICK_FACTS.md
  • SUGGESTIONS.md
  • docs/index.md
  • ovos_translate_server/__init__.py
  • ovos_translate_server/__main__.py
  • ovos_translate_server/routers/__init__.py
  • ovos_translate_server/routers/amazon_translate.py
  • ovos_translate_server/routers/azure_translator.py
  • ovos_translate_server/routers/deepl.py
  • ovos_translate_server/routers/google_translate.py
  • ovos_translate_server/routers/libretranslate.py
  • ovos_translate_server/version.py
  • pyproject.toml
  • requirements.txt
  • setup.py
  • test/unittests/__init__.py
  • test/unittests/test_compat_routers.py

Comment on lines +13 to +15
uses: OpenVoiceOS/gh-automations/.github/workflows/license-check.yml@dev
with:
system_deps: 'swig'

⚠️ Potential issue | 🔴 Critical

🧩 Analysis chain

🏁 Script executed:

#!/bin/bash
set -euo pipefail

# Fetch reusable workflow content and inspect what it actually runs
gh api repos/OpenVoiceOS/gh-automations/contents/.github/workflows/license-check.yml?ref=dev --jq '.content' \
  | base64 -d > /tmp/license-check.yml

rg -n -C2 "license|licheck|pytest|test/license_tests.py|override|whitelist|allow_" /tmp/license-check.yml || true

Repository: OpenVoiceOS/ovos-translate-server

Length of output: 129


🏁 Script executed:

# First, check what files exist related to licensing in the repo
find . -type f -name "*license*" | head -20

Repository: OpenVoiceOS/ovos-translate-server

Length of output: 138


🏁 Script executed:

# Check the workflows directory structure
ls -la .github/workflows/

Repository: OpenVoiceOS/ovos-translate-server

Length of output: 643


🏁 Script executed:

# Examine the current license_tests.yml file to see full context
cat -n .github/workflows/license_tests.yml

Repository: OpenVoiceOS/ovos-translate-server

Length of output: 446


🏁 Script executed:

# Check if test/license_tests.py exists and examine it
if [ -f test/license_tests.py ]; then
  wc -l test/license_tests.py
  echo "---"
  cat -n test/license_tests.py
fi

Repository: OpenVoiceOS/ovos-translate-server

Length of output: 2064


🏁 Script executed:

# Check git history for changes to license_tests.yml
git log --oneline -10 .github/workflows/license_tests.yml 2>/dev/null || echo "Git history unavailable"

Repository: OpenVoiceOS/ovos-translate-server

Length of output: 143


🏁 Script executed:

# Try to fetch external workflow via public GitHub URL
curl -s "https://raw.githubusercontent.com/OpenVoiceOS/gh-automations/dev/.github/workflows/license-check.yml" | head -100

Repository: OpenVoiceOS/ovos-translate-server

Length of output: 4022


🏁 Script executed:

# Check if test/license_tests.py is referenced anywhere in workflows
grep -r "test/license_tests.py" .github/workflows/ || echo "Not found in workflows"

Repository: OpenVoiceOS/ovos-translate-server

Length of output: 99


🏁 Script executed:

# Search all workflow files for pytest or license test execution
grep -r "pytest.*license" .github/workflows/ || echo "Not found"

Repository: OpenVoiceOS/ovos-translate-server

Length of output: 86


🏁 Script executed:

# Check if there's any configuration or script that passes the overrides
grep -r "license_overrides\|allow_nonfree\|allow_viral" . --include="*.yml" --include="*.yaml" --include="*.py" 2>/dev/null | head -20

Repository: OpenVoiceOS/ovos-translate-server

Length of output: 474


🏁 Script executed:

# Verify that old test is truly not being called by searching broader
grep -r "license_tests\|LicenseChecker" .github/ || echo "Not found in .github"

Repository: OpenVoiceOS/ovos-translate-server

Length of output: 129


🏁 Script executed:

# Check if there's a build_tests or other workflow that might call the old test
cat -n .github/workflows/build_tests.yml

Repository: OpenVoiceOS/ovos-translate-server

Length of output: 375


Compliance regression confirmed: project-specific license policy not enforced by external workflow.

The project's custom license overrides and strict validation flags (defined in test/license_tests.py) are not passed to the external workflow. Specifically:

  • The 4 custom overrides (kthread→MIT, pyxdg→GPL-2.0, ptyprocess→ISC, psutil→BSD3) are missing
  • pyxdg (GPL-2.0) classified as StrongCopyleft will fail under the new workflow's default policy, which blocks GPL
  • The old test remains in the repository but is not executed by any workflow and represents orphaned code

The external workflow accepts exclude_packages and exclude_licenses regex parameters, but the license_tests.yml passes only system_deps: 'swig'. To restore compliance, pass the necessary exclusions or overrides to the reusable workflow.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In @.github/workflows/license_tests.yml around lines 13 - 15, The reusable
workflow invocation currently only passes system_deps: 'swig' and omits the
project-specific license policy; update the license_tests.yml call to forward
the same exclusions/overrides used by test/license_tests.py by adding the
reusable-workflow inputs for exclude_packages and exclude_licenses and the
custom overrides for kthread→MIT, pyxdg→GPL-2.0, ptyprocess→ISC and psutil→BSD3
(and any strict validation flag used in test/license_tests.py) so the external
workflow enforces the project policy rather than its defaults; make sure to
reference the existing system_deps input and the package names pyxdg, kthread,
ptyprocess, psutil when adding these inputs.

Comment on lines +11 to +12
uses: OpenVoiceOS/gh-automations/.github/workflows/pip-audit.yml@dev
secrets: inherit

⚠️ Potential issue | 🟠 Major

🧩 Analysis chain

🏁 Script executed:

#!/bin/bash
set -euo pipefail
# Resolve current SHA behind dev branch before pinning
gh api repos/OpenVoiceOS/gh-automations/commits/dev --jq '.sha'

Repository: OpenVoiceOS/ovos-translate-server

Length of output: 237


🏁 Script executed:

# First, examine the pip_audit.yml file to confirm the `@dev` usage
cat -n .github/workflows/pip_audit.yml

Repository: OpenVoiceOS/ovos-translate-server

Length of output: 365


🏁 Script executed:

# Search for other workflow files with similar patterns
find .github/workflows -name "*.yml" -o -name "*.yaml" | head -20

Repository: OpenVoiceOS/ovos-translate-server

Length of output: 323


🏁 Script executed:

# Search for other uses of `@dev` or other mutable branch references in workflow files
rg "uses:.*@(dev|main|master)" .github/workflows/ --type yaml

Repository: OpenVoiceOS/ovos-translate-server

Length of output: 729


Pin all reusable workflow references to immutable commit SHAs.

The pattern of using @dev for reusable workflows appears across 6 workflow files. This mutable reference allows upstream changes in OpenVoiceOS/gh-automations to alter CI behavior without a PR in this repo. Pin each of these to a commit SHA instead:

  • .github/workflows/pip_audit.yml (line 11)
  • .github/workflows/release_workflow.yml
  • .github/workflows/lint.yml
  • .github/workflows/publish_stable.yml
  • .github/workflows/license_tests.yml
  • .github/workflows/build_tests.yml
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In @.github/workflows/pip_audit.yml around lines 11 - 12, Replace the mutable
reusable-workflow reference that uses "@dev" with an immutable commit SHA:
locate the line containing "uses:
OpenVoiceOS/gh-automations/.github/workflows/pip-audit.yml@dev" and update the
suffix to a specific commit SHA (e.g., "@<commit-sha>"); do the same for the
other affected workflow files that reference OpenVoiceOS/gh-automations (search
for "uses: OpenVoiceOS/gh-automations/.github/workflows/...@dev") so all
reusable workflow links are pinned to explicit commit SHAs.
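Sketched, a pinned reference would look like this; the SHA is a placeholder to be replaced with the real commit, with the branch it tracked recorded in a comment for readability:

```yaml
jobs:
  pip_audit:
    # pin: gh-automations dev branch; replace <full-commit-sha> with the real SHA
    uses: OpenVoiceOS/gh-automations/.github/workflows/pip-audit.yml@<full-commit-sha>
    secrets: inherit
```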

Comment on lines +10 to +16
uses: OpenVoiceOS/gh-automations/.github/workflows/publish-stable.yml@dev
secrets: inherit
with:
branch: 'master'
version_file: 'ovos_translate_server/version.py'
setup_py: 'setup.py'
publish_pypi: true
sync_dev: true

⚠️ Potential issue | 🟠 Major

Pin this reusable release workflow to an immutable ref.

This job publishes artifacts and inherits secrets, but the implementation comes from the moving @dev branch in another repo. A later force-push there can change stable-release behavior here without any diff in this repository. Use a full commit SHA for the reusable workflow ref.

Quick check: this should return no matches once the workflow refs are pinned.

#!/bin/bash
set -euo pipefail
rg -n 'uses:\s+\S+@(dev|main|master)\b' .github/workflows
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In @.github/workflows/publish_stable.yml around lines 10 - 16, The workflow is
importing a reusable workflow via the moving ref "uses:
OpenVoiceOS/gh-automations/.github/workflows/publish-stable.yml@dev", which
allows behavior to change unexpectedly; replace the branch ref (`@dev`) with an
immutable full commit SHA for that reusable workflow reference so
secrets/publishing behavior is pinned (keep the rest of the inputs—branch,
version_file, publish_pypi, sync_dev—unchanged).

Comment on lines +12 to +22
uses: OpenVoiceOS/gh-automations/.github/workflows/publish-alpha.yml@dev
secrets: inherit
with:
branch: 'dev'
version_file: 'ovos_translate_server/version.py'
setup_py: 'setup.py'
update_changelog: true
publish_prerelease: true
propose_release: true
changelog_max_issues: 100

notify:
if: github.event.pull_request.merged == true
needs: publish_alpha
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- name: Send message to Matrix bots channel
id: matrix-chat-message
uses: fadenb/matrix-chat-message@v0.0.6
with:
homeserver: 'matrix.org'
token: ${{ secrets.MATRIX_TOKEN }}
channel: '!WjxEKjjINpyBRPFgxl:krbel.duckdns.org'
message: |
new ${{ github.event.repository.name }} PR merged! https://github.com/${{ github.repository }}/pull/${{ github.event.number }}

publish_pypi:
needs: publish_alpha
if: success() # Ensure this job only runs if the previous job succeeds
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
with:
ref: dev
fetch-depth: 0 # otherwise, there would be errors pushing refs to the destination repository.
- name: Setup Python
uses: actions/setup-python@v1
with:
python-version: "3.14"
- name: Install Build Tools
run: |
python -m pip install build wheel
- name: version
run: echo "::set-output name=version::$(python setup.py --version)"
id: version
- name: Build Distribution Packages
run: |
python setup.py sdist bdist_wheel
- name: Publish to PyPI
uses: pypa/gh-action-pypi-publish@master
with:
password: ${{secrets.PYPI_TOKEN}}


propose_release:
needs: publish_alpha
if: success() # Ensure this job only runs if the previous job succeeds
runs-on: ubuntu-latest
steps:
- name: Checkout dev branch
uses: actions/checkout@v3
with:
ref: dev

- name: Setup Python
uses: actions/setup-python@v2
with:
python-version: '3.14'

- name: Get version from setup.py
id: get_version
run: |
VERSION=$(python setup.py --version)
echo "VERSION=$VERSION" >> $GITHUB_ENV

- name: Create and push new branch
run: |
git checkout -b release-${{ env.VERSION }}
git push origin release-${{ env.VERSION }}

- name: Open Pull Request from dev to master
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
run: |
# Variables
BRANCH_NAME="release-${{ env.VERSION }}"
BASE_BRANCH="master"
HEAD_BRANCH="release-${{ env.VERSION }}"
PR_TITLE="Release ${{ env.VERSION }}"
PR_BODY="Human review requested!"

# Create a PR using GitHub API
curl -X POST \
-H "Accept: application/vnd.github+json" \
-H "Authorization: token $GITHUB_TOKEN" \
-d "{\"title\":\"$PR_TITLE\",\"body\":\"$PR_BODY\",\"head\":\"$HEAD_BRANCH\",\"base\":\"$BASE_BRANCH\"}" \
https://api.github.com/repos/${{ github.repository }}/pulls
publish_pypi: true
notify_matrix: true

⚠️ Potential issue | 🟠 Major

🧩 Analysis chain

🏁 Script executed:

#!/bin/bash
set -euo pipefail
rg -n 'uses:\s+\S+@(dev|main|master)\b' .github/workflows

Repository: OpenVoiceOS/ovos-translate-server

Length of output: 747


Pin this publishing workflow to an immutable ref.

This workflow publishes prereleases to PyPI with secrets: inherit, but sources the reusable workflow from @dev, a mutable branch. This risks uncontrolled changes to release semantics and widens the blast radius of inherited secrets. Pin the workflow reference to a full commit SHA instead.

Note: The same issue affects .github/workflows/publish_stable.yml:10, which also references publish-stable.yml@dev — stable release workflows should be pinned as well.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In @.github/workflows/release_workflow.yml around lines 12 - 22, The workflow
currently references the reusable publishing workflow via a mutable branch ref
("OpenVoiceOS/gh-automations/.github/workflows/publish-alpha.yml@dev"), which
risks changing behavior and exposing inherited secrets; update that "uses" value
to pin to an immutable full commit SHA of the target repo (replace the "@dev"
ref with "@<full-commit-sha>") and do the same for the equivalent reference in
publish_stable.yml (the "publish-stable.yml@dev" entry), ensuring both
publish_alpha and publish_stable workflows point to exact commit SHAs to lock
release behavior and minimize blast radius of inherited secrets.

AUDIT.md Outdated
- [x] docs/index.md

## Technical Debt & Issues
- `[MAJOR]` **tests**: No unit tests found — `test/` directory exists but is empty (`ovos_translate_server/__init__.py` requires live plugins to test meaningfully; mock-based unit tests needed)

⚠️ Potential issue | 🟡 Minor

Update the audit finding on tests.

This PR already adds router compatibility tests in test/unittests/test_compat_routers.py, so the report should not say the test/ directory is empty or that no unit tests exist. Reframe this as the narrower gap that still remains, e.g. plugin-loading/config coverage.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@AUDIT.md` at line 13, Update the AUDIT.md entry that currently claims the
test/ directory is empty: replace the `[MAJOR] tests: No unit tests found` line
with a note acknowledging the added router compatibility tests
(test/unittests/test_compat_routers.py) and reframe the finding to identify the
remaining gap—lack of plugin-loading and configuration coverage (e.g.,
mock-based unit tests for ovos_translate_server plugin loading and config
handling). Ensure the new bullet references the existing test file
(test/unittests/test_compat_routers.py) and clearly states that targeted
plugin-loading/config tests are needed.

default=None,
help="OPM language-detection plugin entry-point name (optional)",
)
parser.add_argument("--host", default="0.0.0.0", help="Host to bind (default: 0.0.0.0)")

⚠️ Potential issue | 🟠 Major

Default the CLI to localhost instead of 0.0.0.0.

The server exposes unauthenticated endpoints and the app enables wildcard CORS, so binding to all interfaces by default makes a fresh install remotely reachable unless the operator opts out. Use 127.0.0.1 as the default, and let container/systemd deployments pass --host 0.0.0.0 explicitly.

Safer default
-    parser.add_argument("--host", default="0.0.0.0", help="Host to bind (default: 0.0.0.0)")
+    parser.add_argument("--host", default="127.0.0.1", help="Host to bind (default: 127.0.0.1)")
📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

Suggested change
parser.add_argument("--host", default="0.0.0.0", help="Host to bind (default: 0.0.0.0)")
parser.add_argument("--host", default="127.0.0.1", help="Host to bind (default: 127.0.0.1)")
🧰 Tools
🪛 Ruff (0.15.6)

[error] 35-35: Possible binding to all interfaces

(S104)

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@ovos_translate_server/__main__.py` at line 35, Change the CLI default host
from "0.0.0.0" to "127.0.0.1" in the argument parser so the server binds to
localhost by default; locate the parser.add_argument call (the "--host"
argument) in ovos_translate_server/__main__.py and update its default value to
"127.0.0.1", leaving the help text and behavior otherwise unchanged so
deployments can still pass --host 0.0.0.0 explicitly.

Comment on lines +2 to +10
requires = ["setuptools>=61.0", "wheel"]
build-backend = "setuptools.build_meta"

[project]
name = "ovos-translate-server"
dynamic = ["version"]
description = "FastAPI server to host OpenVoiceOS translate plugins as a service"
readme = "README.md"
license = "Apache-2.0"

⚠️ Potential issue | 🟠 Major

🧩 Analysis chain

🌐 Web query:

According to the setuptools packaging documentation, which setuptools version first supports SPDX expressions like `license = "Apache-2.0"` in `pyproject.toml`, and is that syntax valid when `build-system.requires` only declares `setuptools>=61.0`?

💡 Result:

According to setuptools documentation, version 77.0.0 first supports SPDX license expressions like license = "Apache-2.0" in pyproject.toml. This syntax is not valid with build-system.requires declaring only setuptools>=61.0, as versions before 77.0.0 expect the deprecated table format license = {text = "Apache-2.0"} and may raise validation errors for string values.


Raise setuptools minimum or use table-form license.

license = "Apache-2.0" requires setuptools 77.0.0+, but build-system.requires declares only setuptools>=61.0. Builders meeting that floor will fail during metadata validation. Either bump the setuptools minimum to 77.0.0 or use the table format license = { text = "Apache-2.0" } instead.

Backward-compatible fix
-[project]
-license = "Apache-2.0"
+[project]
+license = { text = "Apache-2.0" }
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@pyproject.toml` around lines 2 - 10, The pyproject.toml declares
build-system.requires = ["setuptools>=61.0", "wheel"] while using the simple
license field license = "Apache-2.0", which requires setuptools >= 77.0.0; fix
by either raising the setuptools floor to "setuptools>=77.0.0" in the
build-system.requires entry or keep the current setuptools version and change
the license to the table form license = { text = "Apache-2.0" } so metadata
validation succeeds (edit the build-system.requires or the license field in
pyproject.toml accordingly).

Comment on lines +24 to +25
### Scripts
- `ovos-translate-server`: `ovos_translate_server.__main__:main`

⚠️ Potential issue | 🟡 Minor

Fix heading level to satisfy markdownlint MD001.

The docs lint warning indicates a heading increment problem around this section. Adjust this heading hierarchy so it only increments by one level.

📝 Possible fix
-### Scripts
+## Scripts
📝 Committable suggestion


Suggested change
### Scripts
- `ovos-translate-server`: `ovos_translate_server.__main__:main`
## Scripts
- `ovos-translate-server`: `ovos_translate_server.__main__:main`
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@QUICK_FACTS.md` around lines 24 - 25, The "### Scripts" heading in
QUICK_FACTS.md causes a heading level jump (MD001); change its level to
increment by one relative to the previous heading (e.g., replace "### Scripts"
with "## Scripts" or adjust surrounding headings so the sequence is consistent)
so the heading hierarchy is contiguous.

Comment on lines +1 to 3
fastapi
uvicorn[standard]
ovos-plugin-manager

⚠️ Potential issue | 🔴 Critical

🧩 Analysis chain

🏁 Script executed:

#!/bin/bash
set -euo pipefail

# Verify whether requirements.txt is included in source distribution inputs
fd -a 'MANIFEST.in|setup.py|pyproject.toml|requirements.txt'

# Inspect packaging config references
rg -n "requirements.txt|install_requires|dynamic|dependencies|include requirements.txt" MANIFEST.in setup.py pyproject.toml 2>/dev/null || true

Repository: OpenVoiceOS/ovos-translate-server

Length of output: 405


🏁 Script executed:

# Check if MANIFEST.in exists
cat -n MANIFEST.in 2>/dev/null || echo "MANIFEST.in does not exist"

Repository: OpenVoiceOS/ovos-translate-server

Length of output: 103


🏁 Script executed:

# Inspect setup.py to understand how requirements.txt is used
cat -n setup.py | head -70

Repository: OpenVoiceOS/ovos-translate-server

Length of output: 3204


🏁 Script executed:

# Check pyproject.toml for build configuration
cat -n pyproject.toml

Repository: OpenVoiceOS/ovos-translate-server

Length of output: 1457


🏁 Script executed:

# Check for any CI/build configuration that might show the error
fd -e 'yml|yaml' | head -20

Repository: OpenVoiceOS/ovos-translate-server

Length of output: 59


Add MANIFEST.in to include requirements.txt in source distributions, and consolidate dependency management.

The build failure occurs because setup.py (line 52) reads requirements.txt at build time, but sdist does not include it without explicit declaration. Adding include requirements.txt to MANIFEST.in will unblock the build.

However, there's a deeper issue: dependencies are now defined in both pyproject.toml (lines 22–26) and requirements.txt, with setup.py redundantly reading the latter. The modern approach is to remove line 52 from setup.py and rely entirely on pyproject.toml as the dependency source, since it already lists fastapi, uvicorn[standard], and ovos-plugin-manager. This eliminates the file-reading dependency and aligns with PEP 517 packaging standards.

Immediate fix:

+++ MANIFEST.in
+include requirements.txt

Recommended refactor:
Remove install_requires=required("requirements.txt") from setup.py line 52, as pyproject.toml already declares all dependencies correctly.

🧰 Tools
🪛 GitHub Actions: Run Build Tests

[error] 1-1: FileNotFoundError: No such file or directory: '/tmp/build-via-sdist-bnw_w1zv/ovos_translate_server-0.0.3a2/requirements.txt' during get_requires_for_build_wheel (build step: python -m build)

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@requirements.txt` around lines 1 - 3, The build fails because setup.py calls
required("requirements.txt") to populate install_requires (remove that call) and
rely on pyproject.toml for dependencies instead; edit setup.py to remove the
install_requires=required("requirements.txt") (or the required(...) helper
usage) so setup.py no longer reads requirements.txt, and optionally add a
MANIFEST.in with "include requirements.txt" only if you keep reading the file;
ensure pyproject.toml remains the single source of declared dependencies
(fastapi, uvicorn[standard], ovos-plugin-manager).

SUGGESTIONS.md Outdated

**Estimated Impact**: Low effort; removes duplication and future confusion.

### 3. Add `GET /translate` POST variant for long texts

⚠️ Potential issue | 🟡 Minor

Fix typo in suggestion title.

The title says "Add GET /translate POST variant" but describes adding a POST endpoint. Should be "Add POST /translate" or "Add POST /translate endpoint".

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@SUGGESTIONS.md` at line 23, Fix the typo in the suggestion title: replace
"Add `GET /translate` POST variant for long texts" with a correct title such as
"Add `POST /translate` endpoint for long texts" (or "Add `POST /translate`") so
the heading matches the described change; update the markdown heading text in
SUGGESTIONS.md accordingly.

Adds docs/api-compatibility.md (5 vendors, 12 endpoints, curl examples),
docs/language-codes.md (per-vendor normalisation rules with source citations),
docs/detection.md (plugin priority and compat router behaviour).
Updates FAQ.md (20 Q&As), QUICK_FACTS.md, AUDIT.md, SUGGESTIONS.md,
MAINTENANCE_REPORT.md.

AI-Generated Change:
- Model: claude-sonnet-4-6
- Intent: document all five vendor-compatible API layers exhaustively
- Impact: new docs/api-compatibility.md, language-codes.md, detection.md; FAQ expanded to 20 entries
- Verified via: uv run pytest test/ -v (29 passed)
New tests: LibreTranslate detect with api_key ignored, languages returns
all fake langs; DeepL missing target_lang rejected, uppercase outbound
normalisation; Google detect with list input; Azure explicit source lang,
multiple text items; Amazon empty Text rejected.

AI-Generated Change:
- Model: claude-sonnet-4-6
- Intent: improve edge-case coverage for all five vendor compat routers
- Impact: 21 → 29 tests
- Verified via: uv run pytest test/ -v (29 passed)