
Conversation

@fatih-acar
Contributor

@fatih-acar fatih-acar commented Oct 6, 2025

TLS contexts are now re-used where possible:
  • On task workers when using the SDK client
  • On task workers when querying the Prefect API
  • On API/task workers when executing Prefect tasks/flows locally
  • On API workers when submitting workflows

Note: this depends on PrefectHQ/prefect#19106 for the Prefect client changes to be effective.

We have seen that creating SSL/TLS contexts is resource-intensive (CPU and memory).

EDIT: the SDK client change is now handled directly by opsmill/infrahub-sdk-python#581.
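For illustration, a minimal sketch of the caching idea (hypothetical names, not the actual Infrahub adapter): the SSLContext is built once, and callers get either the cached context or False when verification is disabled.

import ssl
from functools import cached_property


class HttpSettings:
    """Hypothetical settings object; only sketches the caching behaviour."""

    def __init__(self, tls_ca_bundle: str | None = None, tls_insecure: bool = False) -> None:
        self.tls_ca_bundle = tls_ca_bundle
        self.tls_insecure = tls_insecure

    @cached_property
    def tls_context(self) -> ssl.SSLContext:
        # Building the context (and loading the CA store) is the expensive part; do it once.
        return ssl.create_default_context(cafile=self.tls_ca_bundle)

    def verify_tls(self) -> bool | ssl.SSLContext:
        # False disables verification entirely; otherwise hand out the cached context.
        return False if self.tls_insecure else self.tls_context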

Summary by CodeRabbit

  • Bug Fixes

    • TLS verification is now consistently applied during workflow deployments and async execution, reducing HTTPS-related failures.
  • Refactor

    • Centralizes HTTP/TLS configuration and uses a shared async HTTP client across tasks and flows to improve stability and reduce overhead.
  • Chores

    • Adds a configurable TLS verification interface to standardize how verification settings are provided and consumed.

@github-actions github-actions bot added the group/backend Issue related to the backend (API Server, Git Agent) label Oct 6, 2025
@coderabbitai
Contributor

coderabbitai bot commented Oct 6, 2025

Walkthrough

Adds TLS verification plumbing across HTTP adapters and execution paths. Introduces InfrahubHTTP.verify_tls(self, verify: bool | None = None) -> bool | ssl.SSLContext and a TYPE_CHECKING-time ssl import. WorkflowWorkerExecution gains a class-level _http_adapter and wraps deployment submission in a prefect.context.AsyncClientContext configured with the adapter’s verify_tls() result. InfrahubWorkerAsync runs run_flow_async inside AsyncClientContext, initializes verify from get_http().verify_tls(), and clears ctx._httpx_settings. workflows.utils now calls get_client with httpx_settings={"verify": get_http().verify_tls()}. Tests add MemoryHTTP.verify_tls() returning False.
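Condensed, the pattern described above looks roughly like this (a sketch only; the real worker wires this through its class attributes and run() method, and the import of HttpxAdapter is an assumption based on this PR):

from prefect.context import AsyncClientContext

from infrahub.services.adapters.http.httpx import HttpxAdapter

_http_adapter = HttpxAdapter()  # class-level attribute in the real code; caches the SSLContext


async def run_with_shared_tls() -> None:
    # Every Prefect client created inside this context inherits the cached verification setting.
    async with AsyncClientContext(httpx_settings={"verify": _http_adapter.verify_tls()}) as ctx:
        ctx._httpx_settings = None  # hack so child task/flow runs reuse the same client (see review notes below)
        ...  # run_flow_async(...) or run_deployment(...) happens here in the real code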

Pre-merge checks and finishing touches

❌ Failed checks (1 warning)
  • Docstring Coverage ⚠️ Warning: Docstring coverage is 0.00%, which is insufficient. The required threshold is 80.00%. You can run @coderabbitai generate docstrings to improve docstring coverage.
✅ Passed checks (2 passed)
  • Description Check ✅ Passed: Check skipped - CodeRabbit’s high-level summary is enabled.
  • Title Check ✅ Passed: The pull request title "fix(backend): re-use tls contexts where possible" directly and specifically describes the main objective of the changeset. All modified files across multiple components (HTTP adapters, workflow workers, async workers, and utilities) work together to implement TLS context caching and reuse to reduce CPU and memory overhead. The title accurately captures this unifying theme without being vague or generic, and it uses clear, conventional commit formatting that effectively communicates the change intent to someone reviewing the history.

@codspeed-hq

codspeed-hq bot commented Oct 6, 2025

CodSpeed Performance Report

Merging #7327 will not alter performance

Comparing fac-fix-ssl-contexts (f5b5c77) with stable (add9edf) [1]

Summary

✅ 10 untouched

Footnotes

  1. No successful run was found on stable (ac43c24) during the generation of this report, so add9edf was used instead as the comparison base. There might be some changes unrelated to this pull request in this report.

@fatih-acar
Contributor Author

@ogenstad is that good use of the http adapter? 😄

@ogenstad
Contributor

ogenstad commented Oct 6, 2025

@ogenstad is that good use of the http adapter? 😄

Well, good find with the performance issue :) It was perhaps not how I envisioned it being used, but I don't know if I have a strong objection to it. Possibly self._http_adapter.verify_tls() could live somewhere else, or rather if verify_tls() was more of a generic function perhaps we wouldn't have to inject the HTTP adapter to Prefect as we're not really using it aside from the TLS caching. But it's perhaps fine.

The real reason I wanted this to be cached was actually application startup:

    @model_validator(mode="after")
    def set_tls_context(self) -> Self:
        try:
            # Validate that the context can be created, we want to raise this error during application start
            # instead of running into issues later when we first try to use the tls context.
            self.get_tls_context()
        except ssl.SSLError as exc:
            raise ValueError(f"Unable load CA bundle from {self.tls_ca_bundle}: {exc}") from exc

        return self

I wanted the above to produce an error if you tried to load a custom CA bundle that didn't exist, as it would be easier to spot during startup than to have it happen some time later when you wanted to do something with HTTP.

But were you using a CA bundle or was it just any TLS call that benefitted from not having to recreate the context?

For the SDK, I also did not have that in mind when I added the ability to provide your custom transform, but I like that it works :) I'm wondering, though, if it might cause any confusion with regard to TLS settings in the SDK if users have environment variables for the SDK that would set this to a different value, i.e. the TLS options in the SDK would be ignored, which can be confusing?

@fatih-acar
Contributor Author

Possibly self._http_adapter.verify_tls() could live somewhere else, or rather if verify_tls() was more of a generic function perhaps we wouldn't have to inject the HTTP adapter to Prefect as we're not really using it aside from the TLS caching. But it's perhaps fine.

Indeed, I only need the cached SSLContext, so I tried to reuse the current HTTP adapter as much as possible. It can probably live somewhere else as you mentioned, but I don't know if it's worth the effort.

But were you using a CA bundle or was it just any TLS call that benefitted from not having to recreate the context?

The thing is that SSL contexts are ALWAYS created even if you do not use TLS at all... This is the case for both the SDK client and the Prefect client. The only way to disable creating an SSL context for the SDK client is to disable TLS verification. For the Prefect client there's no way to avoid the context creation even when TLS is not used anywhere... (hence the upstream PR I opened).
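To illustrate the cost outside of Infrahub (a hedged, standalone example, not project code): every httpx client built with verify=True constructs its own SSLContext and loads the CA store, which is the expensive part, whereas passing a prebuilt context (or verify=False) skips that per-client work.

import asyncio
import ssl

import httpx

SHARED_CTX = ssl.create_default_context()  # the expensive step, done once at import time


async def main() -> None:
    # Passing the prebuilt context avoids building a fresh SSLContext for this client.
    async with httpx.AsyncClient(verify=SHARED_CTX) as client:
        response = await client.get("https://example.com")
        print(response.status_code)


if __name__ == "__main__":
    asyncio.run(main())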

For the SDK, I also did not have that in mind when I added the ability to provide your custom transform, but I like that it works :) I'm wondering, though, if it might cause any confusion with regard to TLS settings in the SDK if users have environment variables for the SDK that would set this to a different value, i.e. the TLS options in the SDK would be ignored, which can be confusing?

Glad you mention the part I'm not convinced of... I believe this could be a breaking change. I tried to move the timeout parameter into the HTTP adapter so we can use timeouts either configured in the SDK or passed to SDK calls, but I won't be able to do this for all of the SDK client config.
The other option would be to let the SDK client accept a pre-built SSL context, which would require a change in the SDK (and would arguably be safer). But I liked wiring the HTTP adapter into the SDK through the requester; it looked elegant :)

@fatih-acar fatih-acar force-pushed the fac-fix-ssl-contexts branch 2 times, most recently from 9f2501f to ff57c6b Compare October 6, 2025 21:07
@ogenstad
Contributor

ogenstad commented Oct 7, 2025

Indeed, I only need the cached SSLContext, so I tried to reuse the current HTTP adapter as much as possible. It can probably live somewhere else as you mentioned, but I don't know if it's worth the effort.

I think an inline comment explaining why it's there would be enough, mentioning that if we do a refactor we could separate the SSLContext part out into some other shared location.

The thing is that SSL contexts are ALWAYS created even if you do not use TLS at all... This is the case for both the SDK client and the Prefect client. The only way to disable creating an SSL context for the SDK client is to disable TLS verification. For the Prefect client there's no way to avoid the context creation even when TLS is not used anywhere... (hence the upstream PR I opened).

Based on this I'd say that we should implement a fix for this directly in the SDK so that we always create an SSL context that is cached within the SDK config. That way anyone could benefit from it not just internally when used within Infrahub. I.e. when someone runs a generator outside of Infrahub or just uses the SDK we'd have this in place already. Then what we can do is to cache the first SSLContext somewhere reached by build_client() so that we don't have to recreate the SDK context each time. Or provide a way to set the SSLContext from Infrahub into the SDK. I.e. a method within config.py in the SDK.

How does that sound?

@fatih-acar
Contributor Author

Based on this I'd say that we should implement a fix for this directly in the SDK so that we always create an SSL context that is cached within the SDK config. That way anyone could benefit from it not just internally when used within Infrahub. I.e. when someone runs a generator outside of Infrahub or just uses the SDK we'd have this in place already. Then what we can do is to cache the first SSLContext somewhere reached by build_client() so that we don't have to recreate the SDK context each time. Or provide a way to set the SSLContext from Infrahub into the SDK. I.e. a method within config.py in the SDK.

How does that sound?

Sounds good, I think we can do both: create a cached SSL context within the SDK (for SDK end-users) and allow passing an external SSLContext to the SDK client (for the task workers).
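One possible shape for the SDK side, with hypothetical names (the actual change landed separately in opsmill/infrahub-sdk-python#581): cache the context inside the SDK config and let a caller such as an Infrahub task worker inject an already-built one.

import ssl
from functools import cached_property


class SDKConfig:
    """Hypothetical config sketch; illustrates caching plus an injection hook."""

    def __init__(self, tls_ca_file: str | None = None) -> None:
        self.tls_ca_file = tls_ca_file
        self._external_context: ssl.SSLContext | None = None

    def set_tls_context(self, context: ssl.SSLContext) -> None:
        # Called by Infrahub (or any embedder) to reuse an existing SSLContext.
        self._external_context = context

    @cached_property
    def _default_context(self) -> ssl.SSLContext:
        return ssl.create_default_context(cafile=self.tls_ca_file)

    @property
    def tls_context(self) -> ssl.SSLContext:
        # Prefer an injected context; otherwise fall back to the cached default.
        return self._external_context or self._default_context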

@fatih-acar fatih-acar force-pushed the fac-fix-ssl-contexts branch 2 times, most recently from c0a4b5b to 64a9e93 Compare October 15, 2025 07:56
@fatih-acar
Contributor Author

Pushed opsmill/infrahub-sdk-python#581 for the SDK change.
I removed the previous commit for the SDK client change within this PR (it's no longer required).

I also added an inline comment for the HttpAdapter usage.

@fatih-acar fatih-acar marked this pull request as ready for review October 15, 2025 07:58
@fatih-acar fatih-acar requested a review from a team as a code owner October 15, 2025 07:58
Contributor

@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 2

🧹 Nitpick comments (2)
backend/infrahub/services/adapters/http/__init__.py (1)

12-14: Make this an abstract API and document semantics

Declare the class as ABC and mark verify_tls as @abstractmethod; add a short docstring clarifying return behavior (False vs SSLContext). Improves clarity and mypy/ruff posture.

As per coding guidelines

Apply within/around this method:

+from abc import ABC, abstractmethod
@@
-class InfrahubHTTP:
+class InfrahubHTTP(ABC):
@@
-    def verify_tls(self, verify: bool | None = None) -> bool | ssl.SSLContext:
-        raise NotImplementedError()
+    @abstractmethod
+    def verify_tls(self, verify: bool | None = None) -> bool | ssl.SSLContext:
+        """Return TLS verification setting.
+
+        Args:
+            verify: If explicitly False, disable verification; otherwise use configured SSLContext.
+
+        Returns:
+            False to disable verification, or a cached SSLContext to enable verification.
+        """
+        ...

Note: Consider adding an optional verify parameter to get(...) for parity with post(...).

backend/infrahub/services/adapters/workflow/worker.py (1)

23-28: Good: class-level adapter to cache TLS context

This effectively reuses the SSLContext. Consider annotating the attribute for mypy clarity.

-    _http_adapter = HttpxAdapter()
+    _http_adapter: HttpxAdapter = HttpxAdapter()
🔇 Additional comments (4)
backend/infrahub/services/adapters/http/__init__.py (1)

6-7: Type-only import is fine

Using TYPE_CHECKING for ssl keeps runtime clean and works with postponed annotations.

backend/infrahub/workers/infrahub_async.py (2)

11-11: OK to import AsyncClientContext

Import looks correct and aligns with new usage.


31-31: Import of get_http may be problematic outside DI

You import get_http but later call it outside a DI context; see comment below with a safer approach.

backend/infrahub/services/adapters/workflow/worker.py (1)

89-91: LGTM; TLS-verified client context around deployment

Wrapping run_deployment in AsyncClientContext with cached SSLContext meets the PR goal.

Please confirm this path executes before any other Prefect client usage so that all requests for the deployment inherit the intended TLS settings.

Comment on lines +159 to 162
async with AsyncClientContext(httpx_settings={"verify": get_http().verify_tls()}) as ctx:
ctx._httpx_settings = None # Hack to make all child task/flow runs use the same client
await run_flow_async(flow=flow_func, flow_run=flow_run, parameters=params, return_type="state")

Contributor


⚠️ Potential issue | 🔴 Critical

🧩 Analysis chain

Avoid DI-dependent get_http() and hardening around private attribute

  • get_http() relies on FastAPI Depends; outside DI it may return a Depends object, causing AttributeError on .verify_tls(). Use a class-level HttpxAdapter to fetch the cached SSLContext.
  • Mutating ctx._httpx_settings touches a private attribute; guard with hasattr to avoid breakage across Prefect versions.

Apply within this block:

-        async with AsyncClientContext(httpx_settings={"verify": get_http().verify_tls()}) as ctx:
-            ctx._httpx_settings = None  # Hack to make all child task/flow runs use the same client
+        async with AsyncClientContext(httpx_settings={"verify": self._http_adapter.verify_tls()}) as ctx:
+            if hasattr(ctx, "_httpx_settings"):
+                ctx._httpx_settings = None  # Maintain shared client only if attribute exists
             await run_flow_async(flow=flow_func, flow_run=flow_run, parameters=params, return_type="state")

Add these supporting changes outside the snippet:

# imports
from infrahub.services.adapters.http.httpx import HttpxAdapter
# in class InfrahubWorkerAsync
_http_adapter = HttpxAdapter()  # reuse cached SSLContext across runs

If you prefer to keep get_http(), ensure it returns a concrete InfrahubHTTP in this runtime; otherwise this will fail.


🏁 Script executed:

#!/bin/bash
set -euo pipefail
# Show get_http definition and where it's used without DI
rg -n -C2 'def get_http\(' backend/infrahub/workers/dependencies.py || true
rg -n -C3 '\bget_http\s*\(\s*\)' -g '!**/tests/**' backend | sed -n '1,200p'

Length of output: 3418


🏁 Script executed:

#!/bin/bash
set -euo pipefail
head -n 200 backend/infrahub/workers/infrahub_async.py
rg -n 'HttpxAdapter' -n backend/infrahub/workers/infrahub_async.py || true
rg -n '_http_adapter' -n backend/infrahub/workers/infrahub_async.py || true

Length of output: 8203


🏁 Script executed:

#!/bin/bash
set -euo pipefail
head -n 100 backend/infrahub/services/adapters/workflow/worker.py
rg -n "HttpxAdapter" -n backend/infrahub/services/adapters/workflow/worker.py

Length of output: 3706


Replace DI-dependent get_http() with class-level HttpxAdapter and guard private attribute

get_http() uses FastAPI Depends and will return a Depends object outside DI, breaking .verify_tls(). Use a dedicated HttpxAdapter instance and only mutate the private attribute if it exists:

--- a/backend/infrahub/workers/infrahub_async.py
+++ b/backend/infrahub/workers/infrahub_async.py
@@ async def run(self, flow_run: FlowRun, configuration: BaseJobConfiguration, task_status: TaskStatus | None = None,) -> BaseWorkerResult:
-        async with AsyncClientContext(httpx_settings={"verify": get_http().verify_tls()}) as ctx:
-            ctx._httpx_settings = None  # Hack to make all child task/flow runs use the same client
+        async with AsyncClientContext(httpx_settings={"verify": self._http_adapter.verify_tls()}) as ctx:
+            if hasattr(ctx, "_httpx_settings"):
+                ctx._httpx_settings = None  # Maintain shared client only if supported
             await run_flow_async(flow=flow_func, flow_run=flow_run, parameters=params, return_type="state")

Add at the top of the file:

from infrahub.services.adapters.http.httpx import HttpxAdapter

And inside class InfrahubWorkerAsync:

_http_adapter = HttpxAdapter()
🤖 Prompt for AI Agents
In backend/infrahub/workers/infrahub_async.py around lines 159-162, replace the
DI-dependent get_http() call (which returns a Depends outside DI) with a
class-level HttpxAdapter instance and guard mutation of the private attribute:
add "from infrahub.services.adapters.http.httpx import HttpxAdapter" at the top
of the file, declare a class attribute _http_adapter = HttpxAdapter() inside
InfrahubWorkerAsync, use _http_adapter.verify_tls() when creating
AsyncClientContext, and only set ctx._httpx_settings = None if hasattr(ctx,
"_httpx_settings") to avoid unguarded mutation.

from infrahub.core.constants import GLOBAL_BRANCH_NAME
from infrahub.core.registry import registry
from infrahub.tasks.registry import refresh_branches
from infrahub.workers.dependencies import get_http
Contributor


⚠️ Potential issue | 🔴 Critical

🧩 Analysis chain

Avoid using FastAPI Depends-based get_http() here; also use an async context for get_client

  • get_http() returns a FastAPI Depends wrapper outside DI, so get_http().verify_tls() may raise at runtime. Use a module-level HttpxAdapter instead to obtain a cached SSLContext.
  • Wrap get_client in an async context to avoid leaking HTTP clients.

As per coding guidelines

Apply this diff here:

- from infrahub.workers.dependencies import get_http
+ from infrahub.services.adapters.http.httpx import HttpxAdapter
-    client = get_client(httpx_settings={"verify": get_http().verify_tls()}, sync_client=False)
+    async with get_client(
+        httpx_settings={"verify": _HTTP_ADAPTER.verify_tls()}, sync_client=False
+    ) as client:

And add this at module scope:

# at top-level
_HTTP_ADAPTER = HttpxAdapter()  # caches SSLContext internally

And adjust the call site:

-    await client.update_flow_run(current_flow_run_id, tags=list(new_tags))
+    await client.update_flow_run(current_flow_run_id, tags=list(new_tags))

Verification script to confirm DI usage of get_http:

Also applies to: 30-30


🏁 Script executed:

#!/bin/bash
set -euo pipefail
echo "Definition of get_http:"
rg -n -C2 'def get_http\(' backend/infrahub/workers/dependencies.py || true
echo
echo "Imports of Depends in dependencies module:"
rg -n -C2 '\bDepends\(' backend/infrahub/workers/dependencies.py || true
echo
echo "Call sites of get_http() outside FastAPI routes:"
rg -n -C3 '\bget_http\s*\(\s*\)' -g '!**/tests/**' backend | sed -n '1,200p'

Length of output: 4978


🏁 Script executed:

#!/bin/bash
set -euo pipefail
echo "Inspect HttpxAdapter definition:"
rg -n "class HttpxAdapter" -C3 backend/infrahub/services/adapters/http/httpx.py || true
echo
echo "Inspect verify_tls in HttpxAdapter:"
rg -n "def verify_tls" -C3 backend/infrahub/services/adapters/http/httpx.py || true
echo
echo "Context around imports and usage in utils.py:"
rg -n -C5 "get_http" -C5 backend/infrahub/workflows/utils.py || true

Length of output: 1387


Replace get_http with a module-level HttpxAdapter and wrap the client in an async context

  • At top of backend/infrahub/workflows/utils.py, remove:
    from infrahub.workers.dependencies import get_http
    and add:
    from infrahub.services.adapters.http.httpx import HttpxAdapter
    
    _HTTP_ADAPTER = HttpxAdapter()  # caches SSLContext internally
  • Change the client instantiation:
    - client = get_client(httpx_settings={"verify": get_http().verify_tls()}, sync_client=False)
    + async with get_client(
    +     httpx_settings={"verify": _HTTP_ADAPTER.verify_tls()}, sync_client=False
    + ) as client:
🤖 Prompt for AI Agents
In backend/infrahub/workflows/utils.py around line 12, remove the import of
get_http and instead import HttpxAdapter from
infrahub.services.adapters.http.httpx and create a module-level adapter instance
named _HTTP_ADAPTER = HttpxAdapter() (it caches SSLContext internally); then
update any client instantiation to obtain the client from the adapter and use it
within an async context manager (e.g., use the adapter's async client
acquisition so you do "async with _HTTP_ADAPTER.client() as client" where the
client is needed) to ensure proper async lifecycle and reuse of the adapter.

Contributor

@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 0

♻️ Duplicate comments (1)
backend/infrahub/workers/infrahub_async.py (1)

159-161: Do not call DI-bound get_http() here; use a class-level HttpxAdapter and guard private attr

get_http() returns a Depends outside FastAPI DI, so .verify_tls() will fail at runtime. Also, mutating ctx._httpx_settings unguarded risks breakage across Prefect versions.

Apply within this block:

-        async with AsyncClientContext(httpx_settings={"verify": get_http().verify_tls()}) as ctx:
-            ctx._httpx_settings = None  # Hack to make all child task/flow runs use the same client
+        async with AsyncClientContext(httpx_settings={"verify": self._http_adapter.verify_tls()}) as ctx:
+            if hasattr(ctx, "_httpx_settings"):
+                ctx._httpx_settings = None  # Maintain shared client only if supported
             await run_flow_async(flow=flow_func, flow_run=flow_run, parameters=params, return_type="state")

Add outside this snippet:

# imports (top of file)
from infrahub.services.adapters.http.httpx import HttpxAdapter
# inside class InfrahubWorkerAsync
_http_adapter = HttpxAdapter()  # reuse cached SSLContext across runs

Then remove the now-unused get_http import. As per coding guidelines

🧹 Nitpick comments (3)
backend/tests/adapters/http.py (2)

9-13: Redundant TYPE_CHECKING import of httpx

httpx is already imported at module scope; keep only the TYPE_CHECKING import (and drop the top-level) or drop this one to avoid duplication. Either is fine; prefer just TYPE_CHECKING given future annotations.


20-21: Let verify_tls honor the verify argument for test fidelity

Returning False unconditionally is okay, but supporting the verify argument makes tests mirror production behavior better.

Apply:

 def verify_tls(self, verify: bool | None = None) -> bool | ssl.SSLContext:
-    return False
+    return False if verify is not None else False

Or simply pass through when provided:

 def verify_tls(self, verify: bool | None = None) -> bool | ssl.SSLContext:
-    return False
+    return verify if verify is not None else False

Also consider adding a brief docstring. As per coding guidelines

backend/infrahub/services/adapters/workflow/worker.py (1)

89-90: AsyncClientContext wrapping looks correct; consider a brief docstring note

LGTM for TLS reuse. Optionally document that verify is sourced from a cached SSLContext via the adapter for future maintainers.

As per coding guidelines

🔇 Additional comments (1)
backend/infrahub/services/adapters/workflow/worker.py (1)

24-27: Good use of a class-level HttpxAdapter to cache TLS context

This avoids DI/cycles and reuses the SSLContext as intended.

This avoids overhead due to SSL context initialization within the Prefect client.

Signed-off-by: Fatih Acar <[email protected]>
@fatih-acar fatih-acar force-pushed the fac-fix-ssl-contexts branch from 820eb29 to f5b5c77 Compare October 30, 2025 12:35
Contributor

@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 0

♻️ Duplicate comments (1)
backend/infrahub/workers/infrahub_async.py (1)

159-161: Replace DI-dependent get_http() with class-level HttpxAdapter.

The previous review comment on these lines remains unresolved. get_http() uses FastAPI's Depends and will fail outside the DI context (this Prefect worker is not a FastAPI handler). Additionally, mutating the private ctx._httpx_settings attribute on line 160 is fragile across Prefect versions.

Follow the pattern used in backend/infrahub/services/adapters/workflow/worker.py (lines 24-27) which uses a class-level HttpxAdapter to avoid the DI dependency issue.

Apply the same fix suggested in the previous review:

+from infrahub.services.adapters.http.httpx import HttpxAdapter
+
 class InfrahubWorkerAsync(BaseWorker):
     type: str = "infrahubasync"
+    _http_adapter = HttpxAdapter()
     ...
     
     async def run(self, ...):
         ...
-        async with AsyncClientContext(httpx_settings={"verify": get_http().verify_tls()}) as ctx:
-            ctx._httpx_settings = None  # Hack to make all child task/flow runs use the same client
+        async with AsyncClientContext(httpx_settings={"verify": self._http_adapter.verify_tls()}) as ctx:
+            if hasattr(ctx, "_httpx_settings"):
+                ctx._httpx_settings = None  # Maintain shared client only if attribute exists
             await run_flow_async(...)
🧹 Nitpick comments (2)
backend/tests/adapters/http.py (1)

9-12: Remove redundant httpx import from TYPE_CHECKING block.

httpx is already imported at runtime on line 5, so the TYPE_CHECKING import on line 12 is redundant. The TYPE_CHECKING block should only contain imports needed exclusively for type annotations.

Apply this diff:

 if TYPE_CHECKING:
     import ssl
-
-    import httpx
backend/infrahub/services/adapters/http/__init__.py (1)

12-13: Add docstring to the abstract method.

The new verify_tls method lacks documentation. Per coding guidelines, all methods should have Google-style docstrings with a brief description, Args, and Returns sections.

As per coding guidelines.

Example docstring:

def verify_tls(self, verify: bool | None = None) -> bool | ssl.SSLContext:
    """Returns TLS verification configuration for HTTP requests.
    
    Args:
        verify: Optional override to disable TLS verification when False.
                When None or True, returns the cached SSLContext.
    
    Returns:
        False when TLS verification is disabled, otherwise an SSLContext
        instance configured with the appropriate CA bundle.
    
    Raises:
        NotImplementedError: This method must be implemented by subclasses.
    """
    raise NotImplementedError()
🔇 Additional comments (2)
backend/infrahub/services/adapters/workflow/worker.py (2)

24-27: Good approach to avoid DI dependency issues.

Using a class-level HttpxAdapter instance avoids the circular dependency and DI issues. The inline comments clearly explain the rationale and future refactoring path. This pattern should be consistently applied elsewhere (e.g., backend/infrahub/workers/infrahub_async.py lines 159-161).


89-90: Verified the actual behavior of Prefect's AsyncClientContext to determine if the review comment's concern is valid. Based on the verification results, there is now enough information to determine the validity of the review comment.

Do not merge as-is; the review comment's premise is incorrect.

The review comment asks to verify that the AsyncClientContext "properly shares across multiple run_deployment calls," but the code structure contradicts this:

  1. Single call per context: The submit_workflow method calls run_deployment only ONCE within each AsyncClientContext, not multiple times.

  2. Fresh context per invocation: A new AsyncClientContext is instantiated each time submit_workflow is called, so contexts are not shared across method invocations.

  3. HttpxAdapter caching is separate: The @cached_property on tls_context caches the SSL context at the HttpxAdapter instance level, not at the AsyncClientContext level. The TLS settings are correctly passed to the client via httpx_settings={"verify": self._http_adapter.verify_tls()}.

  4. Compare with existing pattern: The codebase already contains a known pattern in infrahub_async.py (line 159-160) that includes the hack ctx._httpx_settings = None with an explicit comment about reusing the httpx client across multiple child runs. The submit_workflow method does NOT use this pattern.

Action: The code correctly applies TLS settings per deployment submission. If the intent is to share a single httpx client across multiple submit_workflow calls, that would require a different architectural pattern (e.g., reusing the same AsyncClientContext instance globally or using the _httpx_settings hack). The current implementation is appropriate for single-call execution and doesn't require changes.

Likely an incorrect or invalid review comment.

@fatih-acar fatih-acar requested review from a team and removed request for a team October 31, 2025 10:29
Contributor

@ogenstad ogenstad left a comment


LGTM, did we also need a follow-up PR to allow the use of the Infrahub TLS context when we initialize a new SDK client?

I don't think this is supported in the Prefect version we have in stable, but I think you also said that it would just be silently ignored until we have the correct version.

    db_change: bool = False,
) -> None:
-    client = get_client(sync_client=False)
+    client = get_client(httpx_settings={"verify": get_http().verify_tls()}, sync_client=False)
Contributor


It's a bit confusing in the codebase that we use get_client() where we sometimes want an InfrahubClient and in other locations we want a Prefect client. Did you just update this call as a hotspot where you needed this the most? I'm wondering if we should have a helper function to get the Prefect client so we can update these settings in one location (see the sketch below). I think we're good to go for this PR, but we should consider the approach for all the other calls.
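For instance, such a helper might look like this (illustrative only; the function name and placement are assumptions, while HttpxAdapter and verify_tls() come from this PR):

from prefect.client.orchestration import PrefectClient, get_client

from infrahub.services.adapters.http.httpx import HttpxAdapter

_HTTP_ADAPTER = HttpxAdapter()  # caches the SSLContext internally


def get_prefect_client() -> PrefectClient:
    """Return an async Prefect client configured with the shared TLS verification settings."""
    return get_client(httpx_settings={"verify": _HTTP_ADAPTER.verify_tls()}, sync_client=False)

Call sites would then use async with get_prefect_client() as client and pick up any future TLS setting changes automatically.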

Contributor Author


Did you just update this call as a hotspot where you needed this the most?

Exactly, add_tags() is called within almost all workflows.
I agree it's confusing to have get_client for both InfrahubClient and PrefectClient...

@fatih-acar
Contributor Author

LGTM, did we also need a follow-up PR to allow the use of the Infrahub TLS context when we initialize a new SDK client?

I don't think this is supported in the Prefect version we have in stable, but I think you also said that it would just be silently ignored until we have the correct version.

Yes, the follow-up PR is here: #7396 (waiting for an SDK update).

Yes, the changes here will be ignored in stable until Prefect is upgraded (which is the case in develop/release-1.5).

@fatih-acar fatih-acar merged commit 9a4ce0d into stable Oct 31, 2025
39 checks passed
@fatih-acar fatih-acar deleted the fac-fix-ssl-contexts branch October 31, 2025 12:07