Merged
Commits
38 commits
12623ca
Initial plan
Copilot Jun 30, 2025
265c9f0
Fix vale lint command to exclude node_modules directory
Copilot Jun 30, 2025
1d78e57
Add setting typename, display_label, and kind if it exists when calli…
FragmentedPacket Jun 30, 2025
560238a
release 1.13.3 (#459)
ajtmccarty Jun 30, 2025
ce43631
fix _process_relationships for Sync client
BeArchiTek Jul 1, 2025
8c99a80
add changelog
BeArchiTek Jul 1, 2025
02b4f12
Merge pull request #460 from opsmill/bkr-fix-process_relationships
BeArchiTek Jul 2, 2025
f58c459
Fix `load_from_disk` method to support folder with multiple file exte…
BaptisteGi Jul 4, 2025
d527bff
fix signature delta
BeArchiTek Jul 14, 2025
cdd64c3
do not omit offset when offset=0
BeArchiTek Jul 14, 2025
0dd2541
fragment and rollback node change
BeArchiTek Jul 14, 2025
ec3627f
Merge pull request #468 from opsmill/bkr-fix-sync-parallel-filters
BeArchiTek Jul 14, 2025
c67fb29
fixes #469
BeArchiTek Jul 15, 2025
8d89883
Merge pull request #470 from opsmill/bkr-remove-node-process-page
BeArchiTek Jul 16, 2025
121c94d
add check for empty list of schema
BeArchiTek Jul 16, 2025
8c71f45
add changelog
BeArchiTek Jul 16, 2025
4717ffc
Infrahub repository init (#467)
minitriga Jul 16, 2025
6465b92
fix: improve cardinality many relationship fetch (#476)
fatih-acar Jul 22, 2025
19a03ae
Merge pull request #472 from opsmill/bkr-fix-load-schemas
dgarros Jul 22, 2025
396c50b
Prepare version 1.13.4
dgarros Jul 22, 2025
5246e3f
Merge pull request #477 from opsmill/dga-release-1.13.4
dgarros Jul 22, 2025
ef7fc07
Merge pull request #457 from opsmill/copilot/fix-372
dgarros Jul 22, 2025
d227402
respect ordering of files when loading
ajtmccarty Jul 22, 2025
e12cc84
Merge pull request #478 from opsmill/ajtm-07222025-respect-file-order
dgarros Jul 23, 2025
142df70
Prepare release 1.13.5
dgarros Jul 23, 2025
64a7957
Merge pull request #480 from opsmill/dga-release-1.13.5
dgarros Jul 23, 2025
f5e3b69
pass branch into count call
ajtmccarty Jul 23, 2025
de6cfbf
Create batch directly instead of using create_batch while fetching re…
dgarros Jul 24, 2025
21def0a
Merge pull request #483 from opsmill/dga-20250724-create-batch
dgarros Jul 24, 2025
7876f13
add changelog
ajtmccarty Jul 24, 2025
16da1ba
Merge pull request #482 from opsmill/ajtm-0723205-branch-in-count
dgarros Jul 24, 2025
8e357b9
Finalize typing on ctl.schema
ogenstad Jun 20, 2025
f82744b
Merge pull request #458 from opsmill/stable
ogenstad Jul 28, 2025
936522d
add support for NumberPool attributes to protocols
wvandeun Jul 28, 2025
f3bb071
Merge pull request #484 from opsmill/wvd-20250728-add-numberpool-supp…
BeArchiTek Jul 29, 2025
199532f
Merge pull request #451 from opsmill/pog-ctl-schema-typing-20250620
ogenstad Jul 30, 2025
5daba0e
Rework `get_flat_value` to fix node relationships (#487)
gmazoyer Jul 30, 2025
0cde3c6
Merge stable into develop (#486)
gmazoyer Aug 7, 2025
22 changes: 22 additions & 0 deletions CHANGELOG.md
Original file line number Diff line number Diff line change
@@ -11,6 +11,28 @@ This project uses [*towncrier*](https://towncrier.readthedocs.io/) and the chang

<!-- towncrier release notes start -->

## [1.13.5](https://github.com/opsmill/infrahub-sdk-python/tree/v1.13.5) - 2025-07-23

### Fixed

- Respect ordering when loading files from a directory

## [1.13.4](https://github.com/opsmill/infrahub-sdk-python/tree/v1.13.4) - 2025-07-22

### Fixed

- Fix processing of relationships during node retrieval with the Sync client when prefetching related nodes. ([#461](https://github.com/opsmill/infrahub-sdk-python/issues/461))
- Fix schema loading to ignore non-YAML files in folders. ([#462](https://github.com/opsmill/infrahub-sdk-python/issues/462))
- Fix ignored node variable in filters(). ([#469](https://github.com/opsmill/infrahub-sdk-python/issues/469))
- Fix use of parallel with filters for Infrahub Client Sync.
- Avoid sending an empty list to Infrahub if no valid schemas are found.

## [1.13.3](https://github.com/opsmill/infrahub-sdk-python/tree/v1.13.3) - 2025-06-30

### Fixed

- Update InfrahubNode creation to include `__typename`, `display_label`, and `kind` from a RelatedNode ([#455](https://github.com/opsmill/infrahub-sdk-python/issues/455))

## [1.13.2](https://github.com/opsmill/infrahub-sdk-python/tree/v1.13.2) - 2025-06-27

### Fixed
1 change: 1 addition & 0 deletions changelog/+add_numberpool_support_protocols.added.md
@@ -0,0 +1 @@
add support for NumberPool attributes in generated protocols
1 change: 1 addition & 0 deletions changelog/+batch.fixed.md
@@ -0,0 +1 @@
Create a new batch while fetching relationships instead of reusing the same one.
1 change: 1 addition & 0 deletions changelog/+branch-in-count.fixed.md
@@ -0,0 +1 @@
Update internal calls to `count` to include the branch parameter so that the query is performed on the correct branch
1 change: 1 addition & 0 deletions changelog/466.added.md
@@ -0,0 +1 @@
Added `infrahubctl repository init` command to allow the initialization of an Infrahub repository using [infrahub-template](https://github.com/opsmill/infrahub-template).
1 change: 1 addition & 0 deletions changelog/6882.fixed.md
@@ -0,0 +1 @@
Fix value lookup using a flat notation like `foo__bar__value` with relationships of cardinality one
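The fix above concerns flat `__`-notation lookups. A hypothetical sketch of the lookup pattern — `Attr`, `Site`, `Device`, and this simplified `get_flat_value` are illustrative stand-ins, not the SDK's actual classes (the real implementation also has to handle relationship peers):

```python
class Attr:
    """Stand-in for an SDK attribute wrapper exposing .value."""
    def __init__(self, value):
        self.value = value

class Site:
    def __init__(self, name: str):
        self.name = Attr(name)

class Device:
    def __init__(self, name: str, site: Site):
        self.name = Attr(name)
        self.site = site  # stand-in for a cardinality-one relationship

def get_flat_value(obj, key: str, separator: str = "__"):
    """Walk attributes named by a flat key like 'site__name__value'."""
    for part in key.split(separator):
        obj = getattr(obj, part)
    return obj

device = Device("rtr-01", Site("ams1"))
```

With this sketch, `get_flat_value(device, "site__name__value")` traverses the cardinality-one relationship and returns `"ams1"`.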
24 changes: 24 additions & 0 deletions docs/docs/infrahubctl/infrahubctl-repository.mdx
@@ -19,6 +19,7 @@ $ infrahubctl repository [OPTIONS] COMMAND [ARGS]...
**Commands**:

* `add`: Add a new repository.
* `init`: Initialize a new Infrahub repository.
* `list`

## `infrahubctl repository add`
@@ -47,6 +48,29 @@ $ infrahubctl repository add [OPTIONS] NAME LOCATION
* `--config-file TEXT`: [env var: INFRAHUBCTL_CONFIG; default: infrahubctl.toml]
* `--help`: Show this message and exit.

## `infrahubctl repository init`

Initialize a new Infrahub repository.

**Usage**:

```console
$ infrahubctl repository init [OPTIONS] DIRECTORY
```

**Arguments**:

* `DIRECTORY`: Directory path for the new project. [required]

**Options**:

* `--template TEXT`: Template to use for the new repository. Can be a local path or a git repository URL. [default: https://github.com/opsmill/infrahub-template.git]
* `--data PATH`: Path to YAML file containing answers to CLI prompt.
* `--vcs-ref TEXT`: VCS reference to use for the template. Defaults to HEAD. [default: HEAD]
* `--trust / --no-trust`: Trust the template repository. If set, the template will be cloned without verification. [default: no-trust]
* `--config-file TEXT`: [env var: INFRAHUBCTL_CONFIG; default: infrahubctl.toml]
* `--help`: Show this message and exit.
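For example, a hypothetical invocation (the directory name and answers file below are illustrative, not from the PR):

```console
$ infrahubctl repository init my-infrahub-repo --data answers.yml --vcs-ref HEAD
```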

## `infrahubctl repository list`

**Usage**:
15 changes: 7 additions & 8 deletions infrahub_sdk/client.py
@@ -784,7 +784,6 @@ async def filters(
if at:
at = Timestamp(at)

node = InfrahubNode(client=self, schema=schema, branch=branch)
filters = kwargs
pagination_size = self.pagination_size

@@ -825,12 +824,12 @@ async def process_batch() -> tuple[list[InfrahubNode], list[InfrahubNode]]:
nodes = []
related_nodes = []
batch_process = await self.create_batch()
count = await self.count(kind=schema.kind, partial_match=partial_match, **filters)
count = await self.count(kind=schema.kind, branch=branch, partial_match=partial_match, **filters)
total_pages = (count + pagination_size - 1) // pagination_size

for page_number in range(1, total_pages + 1):
page_offset = (page_number - 1) * pagination_size
batch_process.add(task=process_page, node=node, page_offset=page_offset, page_number=page_number)
batch_process.add(task=process_page, page_offset=page_offset, page_number=page_number)

async for _, response in batch_process.execute():
nodes.extend(response[1]["nodes"])
@@ -847,7 +846,7 @@ async def process_non_batch() -> tuple[list[InfrahubNode], list[InfrahubNode]]:

while has_remaining_items:
page_offset = (page_number - 1) * pagination_size
response, process_result = await process_page(page_offset, page_number)
response, process_result = await process_page(page_offset=page_offset, page_number=page_number)

nodes.extend(process_result["nodes"])
related_nodes.extend(process_result["related_nodes"])
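The hunks above size the page loop by ceiling division and keep `offset=0` for page 1. A standalone sketch of that offset arithmetic (illustrative, not SDK code):

```python
def page_offsets(count: int, page_size: int) -> list[tuple[int, int]]:
    """Return (page_number, offset) pairs covering `count` items.

    Ceiling division keeps a final partial page from being dropped,
    and page 1 deliberately carries offset=0 rather than omitting it.
    """
    total_pages = (count + page_size - 1) // page_size
    return [(n, (n - 1) * page_size) for n in range(1, total_pages + 1)]
```

For instance, `page_offsets(25, 10)` yields `[(1, 0), (2, 10), (3, 20)]`, and a count of zero yields no pages at all.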
@@ -1946,9 +1945,9 @@ def filters(
"""
branch = branch or self.default_branch
schema = self.schema.get(kind=kind, branch=branch)
node = InfrahubNodeSync(client=self, schema=schema, branch=branch)
if at:
at = Timestamp(at)

filters = kwargs
pagination_size = self.pagination_size

@@ -1990,12 +1989,12 @@ def process_batch() -> tuple[list[InfrahubNodeSync], list[InfrahubNodeSync]]:
related_nodes = []
batch_process = self.create_batch()

count = self.count(kind=schema.kind, partial_match=partial_match, **filters)
count = self.count(kind=schema.kind, branch=branch, partial_match=partial_match, **filters)
total_pages = (count + pagination_size - 1) // pagination_size

for page_number in range(1, total_pages + 1):
page_offset = (page_number - 1) * pagination_size
batch_process.add(task=process_page, node=node, page_offset=page_offset, page_number=page_number)
batch_process.add(task=process_page, page_offset=page_offset, page_number=page_number)

for _, response in batch_process.execute():
nodes.extend(response[1]["nodes"])
@@ -2012,7 +2011,7 @@ def process_non_batch() -> tuple[list[InfrahubNodeSync], list[InfrahubNodeSync]]

while has_remaining_items:
page_offset = (page_number - 1) * pagination_size
response, process_result = process_page(page_offset, page_number)
response, process_result = process_page(page_offset=page_offset, page_number=page_number)

nodes.extend(process_result["nodes"])
related_nodes.extend(process_result["related_nodes"])
51 changes: 51 additions & 0 deletions infrahub_sdk/ctl/repository.py
@@ -1,10 +1,12 @@
from __future__ import annotations

import asyncio
from pathlib import Path
from typing import Optional

import typer
import yaml
from copier import run_copy
from pydantic import ValidationError
from rich.console import Console
from rich.table import Table
@@ -165,3 +167,52 @@ async def list(
)

console.print(table)


@app.command()
async def init(
directory: Path = typer.Argument(help="Directory path for the new project."),
template: str = typer.Option(
default="https://github.com/opsmill/infrahub-template.git",
help="Template to use for the new repository. Can be a local path or a git repository URL.",
),
data: Optional[Path] = typer.Option(default=None, help="Path to YAML file containing answers to CLI prompt."),
vcs_ref: Optional[str] = typer.Option(
default="HEAD",
help="VCS reference to use for the template. Defaults to HEAD.",
),
trust: Optional[bool] = typer.Option(
default=False,
help="Trust the template repository. If set, the template will be cloned without verification.",
),
_: str = CONFIG_PARAM,
) -> None:
"""Initialize a new Infrahub repository."""

config_data = None
if data:
try:
with data.open(encoding="utf-8") as file:
config_data = yaml.safe_load(file)
typer.echo(f"Loaded config: {config_data}")
except Exception as exc:
typer.echo(f"Error loading YAML file: {exc}", err=True)
raise typer.Exit(code=1) from exc

# Allow template to be a local path or a URL
template_source = template or ""
if template and Path(template).exists():
template_source = str(Path(template).resolve())

try:
await asyncio.to_thread(
run_copy,
template_source,
str(directory),
data=config_data,
vcs_ref=vcs_ref,
unsafe=trust,
)
except Exception as e:
typer.echo(f"Error running copier: {e}", err=True)
raise typer.Exit(code=1) from e
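The `init` command above offloads the blocking `run_copy` call with `asyncio.to_thread` so the Typer event loop stays responsive. The same pattern, shown with a stand-in blocking function instead of copier:

```python
import asyncio
import time

def blocking_copy(src: str, dst: str) -> str:
    # Stand-in for copier.run_copy, which performs blocking filesystem/git I/O.
    time.sleep(0.01)
    return f"copied {src} -> {dst}"

async def scaffold() -> str:
    # to_thread runs the blocking call in a worker thread,
    # keeping the event loop free while the template is rendered.
    return await asyncio.to_thread(blocking_copy, "template", "project")

result = asyncio.run(scaffold())
```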
18 changes: 9 additions & 9 deletions infrahub_sdk/ctl/schema.py
@@ -36,7 +36,7 @@ def validate_schema_content_and_exit(client: InfrahubClient, schemas: list[Schem
has_error: bool = False
for schema_file in schemas:
try:
client.schema.validate(data=schema_file.content)
client.schema.validate(data=schema_file.payload)
except ValidationError as exc:
console.print(f"[red]Schema not valid, found '{len(exc.errors())}' error(s) in {schema_file.location}")
has_error = True
@@ -48,7 +48,7 @@ def validate_schema_content_and_exit(client: InfrahubClient, schemas: list[Schem
raise typer.Exit(1)


def display_schema_load_errors(response: dict[str, Any], schemas_data: list[dict]) -> None:
def display_schema_load_errors(response: dict[str, Any], schemas_data: list[SchemaFile]) -> None:
console.print("[red]Unable to load the schema:")
if "detail" not in response:
handle_non_detail_errors(response=response)
@@ -87,7 +87,7 @@ def handle_non_detail_errors(response: dict[str, Any]) -> None:
if "error" in response:
console.print(f" {response.get('error')}")
elif "errors" in response:
for error in response.get("errors"):
for error in response["errors"]:
console.print(f" {error.get('message')}")
else:
console.print(f" '{response}'")
@@ -97,9 +97,9 @@ def valid_error_path(loc_path: list[Any]) -> bool:
return len(loc_path) >= 6 and loc_path[0] == "body" and loc_path[1] == "schemas"


def get_node(schemas_data: list[dict], schema_index: int, node_index: int) -> dict | None:
if schema_index < len(schemas_data) and node_index < len(schemas_data[schema_index].content["nodes"]):
return schemas_data[schema_index].content["nodes"][node_index]
def get_node(schemas_data: list[SchemaFile], schema_index: int, node_index: int) -> dict | None:
if schema_index < len(schemas_data) and node_index < len(schemas_data[schema_index].payload["nodes"]):
return schemas_data[schema_index].payload["nodes"][node_index]
return None


@@ -122,7 +122,7 @@ async def load(
validate_schema_content_and_exit(client=client, schemas=schemas_data)

start_time = time.time()
response = await client.schema.load(schemas=[item.content for item in schemas_data], branch=branch)
response = await client.schema.load(schemas=[item.payload for item in schemas_data], branch=branch)
loading_time = time.time() - start_time

if response.errors:
@@ -170,10 +170,10 @@ async def check(
client = initialize_client()
validate_schema_content_and_exit(client=client, schemas=schemas_data)

success, response = await client.schema.check(schemas=[item.content for item in schemas_data], branch=branch)
success, response = await client.schema.check(schemas=[item.payload for item in schemas_data], branch=branch)

if not success:
display_schema_load_errors(response=response, schemas_data=schemas_data)
display_schema_load_errors(response=response or {}, schemas_data=schemas_data)
else:
for schema_file in schemas_data:
console.print(f"[green] schema '{schema_file.location}' is Valid!")
3 changes: 3 additions & 0 deletions infrahub_sdk/ctl/utils.py
@@ -187,6 +187,9 @@ def load_yamlfile_from_disk_and_exit(
has_error = False
try:
data_files = file_type.load_from_disk(paths=paths)
if not data_files:
console.print("[red]No valid files found to load.")
raise typer.Exit(1)
except FileNotValidError as exc:
console.print(f"[red]{exc.message}")
raise typer.Exit(1) from exc