
Commit fe082b2

Example docs (#10)

* feat: added db health check
* feat: added error message to health check
* test: fixed health check test
* docs: added body examples for post requests
* docs: added docs for unit_jobs
* docs: added description for tiles model
* feat: automatic loading of grid implementations
* fix: lint issues

1 parent d8ef257, commit fe082b2

File tree: 9 files changed (+186, -29 lines)


CONTRIBUTE.md (5 additions, 3 deletions)

````diff
@@ -3,7 +3,7 @@
 ## Making Contributions
 
 Contributions to the APEx Dispatch API are welcome! If you have suggestions for improvements, bug fixes, or new features, please follow these steps:
-
+
 1. **Fork the repository**: Create a personal copy of the repository on GitHub.
 2. **Create a new branch**: Use a descriptive name for your branch that reflects the changes you plan to make.
 ```bash
@@ -27,7 +27,7 @@ Contributions to the APEx Dispatch API are welcome! If you have suggestions for
 
 ## Registration of a new Platform Implementation
 
-To add a new platform implementation, you will need to create a new class that inherits from the `BasePlatform` class located at [`app/platforms/base.py`](app/platforms/base.py). In this new class, you will need to implement all the abstract methods defined in the [`BasePlatform`](app/platforms/base.py) class. This will ensure that your new platform implementation adheres to the expected interface and functionality.
+To add a new platform implementation, you will need to create a new class that inherits from the `BaseProcessingPlatform` class located at [`app/platforms/base.py`](app/platforms/base.py). In this new class, you will need to implement all the abstract methods defined in the [`BaseProcessingPlatform`](app/platforms/base.py) class. This will ensure that your new platform implementation adheres to the expected interface and functionality.
 
 To register the new implementation, it is important to add the following directive right above the class definition:
 
@@ -36,8 +36,10 @@ from app.platforms.dispatcher import register_platform
 from app.schemas.enum import ProcessTypeEnum
 
 @register_platform(ProcessTypeEnum.OGC_API_PROCESS)
+class OGCAPIProcessPlatform(BaseProcessingPlatform):
+    ...
 ```
 
-The processing type is the unique identifier for the platform implementation. It is used to distinguish between different platform implementations in the system. This value is used by the different request endpoints to determine which platform implementation to use for processing the request. To add a new platform implementation, you will need to define a new `ProcessTypeEnum` value in the [`app/schemas/enum.py`](app/schemas/enum.py) file. This value should be unique and descriptive of the platform you are implementing.
+The processing type, defined by `ProcessTypeEnum`, is the unique identifier for the platform implementation. It is used to distinguish between different platform implementations in the system. This value is used by the different request endpoints to determine which platform implementation to use for processing the request. To add a new platform implementation, you will need to define a new `ProcessTypeEnum` value in the [`app/schemas/enum.py`](app/schemas/enum.py) file. This value should be unique and descriptive of the platform you are implementing.
 
 Once you have completed the above steps, the new platform implementation will be registered automatically and made available for use in the APEx Dispatch API. You can then proceed to implement the specific functionality required for your platform.
````
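
As a hedged illustration of the registration flow described in the updated CONTRIBUTE.md, the sketch below registers a hypothetical platform. The enum member `MY_PLATFORM` and the class name `MyPlatform` are illustrative only; the methods to implement are whatever abstract methods `BaseProcessingPlatform` declares in `app/platforms/base.py`.

```python
# Hypothetical sketch only: "MY_PLATFORM" and MyPlatform are illustrative names,
# not part of the repository.
from app.platforms.base import BaseProcessingPlatform
from app.platforms.dispatcher import register_platform
from app.schemas.enum import ProcessTypeEnum  # assumes MY_PLATFORM was added to the enum


@register_platform(ProcessTypeEnum.MY_PLATFORM)
class MyPlatform(BaseProcessingPlatform):
    # Implement every abstract method declared in BaseProcessingPlatform here
    # (see app/platforms/base.py for the exact interface).
    ...
```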

app/main.py (2 additions, 0 deletions)

```diff
@@ -2,13 +2,15 @@
 
 from app.middleware.correlation_id import add_correlation_id
 from app.platforms.dispatcher import load_processing_platforms
+from app.services.tiles.base import load_grids
 from .config.logger import setup_logging
 from .config.settings import settings
 from .routers import jobs_status, unit_jobs, health, tiles
 
 setup_logging()
 
 load_processing_platforms()
+load_grids()
 
 app = FastAPI(
     title=settings.app_name,
```

app/routers/tiles.py (33 additions, 3 deletions)

```diff
@@ -1,8 +1,9 @@
-from fastapi import APIRouter, HTTPException, status
+from typing import Annotated
+from fastapi import APIRouter, HTTPException, status, Body
 from geojson_pydantic import GeometryCollection
 from loguru import logger
 
-from app.schemas.tiles import TileRequest
+from app.schemas.tiles import GridTypeEnum, TileRequest
 from app.services.tiles.base import split_polygon_by_grid
 
 
@@ -18,7 +19,36 @@
     "service’s Max AOI capacity), calculate the number of tiles to be"
     "processed by the upscaling service.",
 )
-def split_in_tiles(payload: TileRequest) -> GeometryCollection:
+def split_in_tiles(
+    payload: Annotated[
+        TileRequest,
+        Body(
+            openapi_examples={
+                "20x20 Grid": {
+                    "summary": "Request to split up area in a 20x20km grid",
+                    "description": "An example request of splitting up a given area of interest "
+                    "into a 20 by 20km grid.",
+                    "value": TileRequest(
+                        grid=GridTypeEnum.KM_20,
+                        aoi={
+                            "coordinates": [
+                                [
+                                    [5.131074140132512, 51.352892918832026],
+                                    [4.836037011633863, 51.331277680080774],
+                                    [4.789036228520871, 51.12326419975835],
+                                    [5.164855813583216, 51.11863683854557],
+                                    [5.192048230607185, 51.33847556306924],
+                                    [5.131074140132512, 51.352892918832026],
+                                ]
+                            ],
+                            "type": "Polygon",
+                        },
+                    ),
+                }
+            }
+        ),
+    ],
+) -> GeometryCollection:
     try:
         logger.debug(f"Splitting tiles in a {payload.grid} formation")
         return split_polygon_by_grid(payload.aoi, payload.grid)
```
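
For reference, a minimal sketch of posting the documented example payload with FastAPI's `TestClient`. The `/tiles` route and the serialized `"20km"` grid value are assumptions for illustration; check the router prefix and the `GridTypeEnum` definition for the actual values.

```python
# Minimal sketch, not part of the project's test suite. The "/tiles" path and
# the "20km" enum value are assumed for illustration only.
from fastapi.testclient import TestClient

from app.main import app

client = TestClient(app)
response = client.post(
    "/tiles",
    json={
        "grid": "20km",
        "aoi": {
            "type": "Polygon",
            "coordinates": [
                [
                    [5.131074140132512, 51.352892918832026],
                    [4.836037011633863, 51.331277680080774],
                    [4.789036228520871, 51.12326419975835],
                    [5.164855813583216, 51.11863683854557],
                    [5.192048230607185, 51.33847556306924],
                    [5.131074140132512, 51.352892918832026],
                ]
            ],
        },
    },
)
print(response.status_code, response.json())
```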

app/routers/unit_jobs.py (77 additions, 3 deletions)

```diff
@@ -1,9 +1,16 @@
-from fastapi import APIRouter, Depends, HTTPException, status
+from typing import Annotated
+from fastapi import Body, APIRouter, Depends, HTTPException, status
 from loguru import logger
 from sqlalchemy.orm import Session
 
 from app.database.db import get_db
-from app.schemas.unit_job import BaseJobRequest, ProcessingJob, ProcessingJobSummary
+from app.schemas.enum import ProcessTypeEnum
+from app.schemas.unit_job import (
+    BaseJobRequest,
+    ProcessingJob,
+    ProcessingJobSummary,
+    ServiceDetails,
+)
 from app.services.processing import create_processing_job, get_processing_job_by_user_id
 
 # from app.auth import get_current_user
@@ -18,7 +25,74 @@
     summary="Create a new processing job",
 )
 async def create_unit_job(
-    payload: BaseJobRequest, db: Session = Depends(get_db), user: str = "foobar"
+    payload: Annotated[
+        BaseJobRequest,
+        Body(
+            openapi_examples={
+                "openEO Example": {
+                    "summary": "Valid openEO job request",
+                    "description": "The following example demonstrates how to create a processing "
+                    "job using an openEO-based service. This example triggers the "
+                    "[`variability map`](https://github.com/ESA-APEx/apex_algorithms/blob/main/algo"
+                    "rithm_catalog/vito/variabilitymap/records/variabilitymap.json) "
+                    "process using the CDSE openEO Federation. In this case the `endpoint`"
+                    "represents the URL of the openEO backend and the `application` refers to the "
+                    "User Defined Process (UDP) that is being executed on the backend.",
+                    "value": BaseJobRequest(
+                        label=ProcessTypeEnum.OPENEO,
+                        title="Example openEO Job",
+                        service=ServiceDetails(
+                            endpoint="https://openeofed.dataspace.copernicus.eu",
+                            application="https://raw.githubusercontent.com/ESA-APEx/apex_algorithms"
+                            "/32ea3c9a6fa24fe063cb59164cd318cceb7209b0/openeo_udp/variabilitymap/"
+                            "variabilitymap.json",
+                        ),
+                        parameters={
+                            "spatial_extent": {
+                                "type": "FeatureCollection",
+                                "features": [
+                                    {
+                                        "type": "Feature",
+                                        "properties": {},
+                                        "geometry": {
+                                            "coordinates": [
+                                                [
+                                                    [
+                                                        5.170043941798298,
+                                                        51.25050990858725,
+                                                    ],
+                                                    [
+                                                        5.171035037521989,
+                                                        51.24865722468999,
+                                                    ],
+                                                    [
+                                                        5.178521828188366,
+                                                        51.24674578027137,
+                                                    ],
+                                                    [
+                                                        5.179084341977159,
+                                                        51.24984764553983,
+                                                    ],
+                                                    [
+                                                        5.170043941798298,
+                                                        51.25050990858725,
+                                                    ],
+                                                ]
+                                            ],
+                                            "type": "Polygon",
+                                        },
+                                    }
+                                ],
+                            },
+                            "temporal_extent": ["2025-05-01", "2025-05-01"],
+                        },
+                    ).model_dump(),
+                }
+            }
+        ),
+    ],
+    db: Session = Depends(get_db),
+    user: str = "foobar",
 ) -> ProcessingJobSummary:
     """Create a new processing job with the provided data."""
     try:
```
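
As a rough sketch of what the corresponding client-side request body could look like, the snippet below rebuilds the openEO example with the schema models and serializes it to JSON. The `parameters` payload is truncated for brevity and no endpoint path is assumed here.

```python
# Sketch only: mirrors the openEO example above; parameters are truncated for brevity.
from app.schemas.enum import ProcessTypeEnum
from app.schemas.unit_job import BaseJobRequest, ServiceDetails

request = BaseJobRequest(
    title="Example openEO Job",
    label=ProcessTypeEnum.OPENEO,
    service=ServiceDetails(
        endpoint="https://openeofed.dataspace.copernicus.eu",
        application="https://raw.githubusercontent.com/ESA-APEx/apex_algorithms"
        "/32ea3c9a6fa24fe063cb59164cd318cceb7209b0/openeo_udp/variabilitymap/"
        "variabilitymap.json",
    ),
    parameters={"temporal_extent": ["2025-05-01", "2025-05-01"]},
)
# The resulting JSON is what a client would POST to the unit jobs endpoint.
print(request.model_dump_json(indent=2))
```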

app/schemas/tiles.py (11 additions, 3 deletions)

```diff
@@ -1,5 +1,5 @@
 from enum import Enum
-from pydantic import BaseModel
+from pydantic import BaseModel, Field
 from geojson_pydantic import Polygon
 
 
@@ -8,8 +8,16 @@ class GridTypeEnum(str, Enum):
 
 
 class TileRequest(BaseModel):
-    aoi: Polygon
-    grid: GridTypeEnum
+    aoi: Polygon = Field(
+        ...,
+        description="Polygon representing the area of interest for which the tiling grid should "
+        "be calculated",
+    )
+    grid: GridTypeEnum = Field(
+        ...,
+        description="Identifier of the grid system that needs to be used to split up the area of "
+        "interest",
+    )
 
 
 # class TileResponse(BaseModel):
```

app/schemas/unit_job.py (47 additions, 16 deletions)

```diff
@@ -1,37 +1,68 @@
 from datetime import datetime
 from typing import Optional
 
-from pydantic import BaseModel
+from pydantic import BaseModel, Field
 
 from app.schemas.enum import ProcessingStatusEnum, ProcessTypeEnum
 
 
 class ServiceDetails(BaseModel):
-    endpoint: str
-    application: str
+    endpoint: str = Field(
+        ...,
+        description="URL to the endpoint where the service is hosted. For openEO, this is the "
+        "openEO backend. For OGC API Processes, this field should include the base URL of the "
+        "platform API",
+    )
+    application: str = Field(
+        ...,
+        description="Path to the application that needs to be executed. For openEO this is "
+        "referring to the public URL of the UDP (JSON) to execute. For OGC API Processes, this "
+        "field should include the URL path pointing to the hosted service.",
+    )
 
 
 class ProcessingJobSummary(BaseModel):
-    id: int
-    title: str
-    label: ProcessTypeEnum
-    status: ProcessingStatusEnum
+    id: int = Field(..., description="Unique identifier of the processing job")
+    title: str = Field(..., description="Title of the job")
+    label: ProcessTypeEnum = Field(
+        ...,
+        description="Label that is representing the type of the service that will be executed",
+    )
+    status: ProcessingStatusEnum = Field(
+        ..., description="Current status of the processing job"
+    )
 
 
 class ProcessingJobDetails(BaseModel):
-    service: ServiceDetails
-    parameters: dict
-    result_link: Optional[str]
-    created: datetime
-    updated: datetime
+    service: ServiceDetails = Field(
+        ..., description="Details of the service to be executed"
+    )
+    parameters: dict = Field(
+        ..., description="JSON representing the parameters for the service execution"
+    )
+    result_link: Optional[str] = Field(
+        ..., description="URL to the results of the processing job"
+    )
+    created: datetime = Field(..., description="Creation time of the processing job")
+    updated: datetime = Field(
+        ...,
+        description="Timestamp representing the last time that the job details were updated",
+    )
 
 
 class ProcessingJob(ProcessingJobSummary, ProcessingJobDetails):
     pass
 
 
 class BaseJobRequest(BaseModel):
-    title: str
-    label: ProcessTypeEnum
-    service: ServiceDetails
-    parameters: dict
+    title: str = Field(..., description="Title of the job to execute")
+    label: ProcessTypeEnum = Field(
+        ...,
+        description="Label that is representing the type of the service that will be executed",
+    )
+    service: ServiceDetails = Field(
+        ..., description="Details of the service to be executed"
+    )
+    parameters: dict = Field(
+        ..., description="JSON representing the parameters for the service execution"
+    )
```

app/services/tiles/base.py (10 additions, 0 deletions)

```diff
@@ -1,19 +1,29 @@
+import importlib
+import pkgutil
 from typing import Callable, Dict
 from geojson_pydantic import GeometryCollection, Polygon
 from loguru import logger
+import app.services.tiles.grids
 from app.schemas.tiles import GridTypeEnum
 
 GRID_REGISTRY: Dict[GridTypeEnum, Callable[[Polygon], GeometryCollection]] = {}
 
 
 def register_grid(grid_type: GridTypeEnum):
     def decorator(func: Callable[[Polygon], GeometryCollection]):
+        logger.debug(f"Registering grid {grid_type}")
         GRID_REGISTRY[grid_type] = func
        return func
 
     return decorator
 
 
+def load_grids():
+    """Dynamically load all processing platform implementations."""
+    for _, module_name, _ in pkgutil.iter_modules(app.services.tiles.grids.__path__):
+        importlib.import_module(f"app.services.tiles.grids.{module_name}")  # noqa: F821
+
+
 def split_polygon_by_grid(polygon: Polygon, grid: GridTypeEnum) -> GeometryCollection:
     """
     Split a GeoJSON Polygon into smaller polygons according to the specified grid type.
```
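
To illustrate what `load_grids()` picks up at startup, here is a hedged sketch of a new grid module placed under `app/services/tiles/grids/`. The module name, the `KM_10` enum member, and the splitting logic are hypothetical; the real implementations live in modules such as `km_grids.py`, and the decorator simply adds the function to `GRID_REGISTRY`.

```python
# Hypothetical module: app/services/tiles/grids/my_grid.py (assumed location).
# GridTypeEnum.KM_10 is an assumed new enum member; the splitting logic is a placeholder.
from geojson_pydantic import GeometryCollection, Polygon

from app.schemas.tiles import GridTypeEnum
from app.services.tiles.base import register_grid


@register_grid(GridTypeEnum.KM_10)
def split_by_10x10_km_grid(polygon: Polygon) -> GeometryCollection:
    # Placeholder body: return the input polygon as a single "tile".
    return GeometryCollection(type="GeometryCollection", geometries=[polygon])
```

Because the module sits inside the `grids` package, `pkgutil.iter_modules` finds it and `load_grids()` imports it, which runs the decorator and makes the new grid available to `split_polygon_by_grid`.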

tests/services/test_tiles_km_grids.py (1 addition, 1 deletion)

```diff
@@ -1,6 +1,6 @@
 from geojson_pydantic import GeometryCollection, Polygon
 
-from app.services.tiles.km_grids import _split_by_km_grid, split_by_20x20_km_grid
+from app.services.tiles.grids.km_grids import _split_by_km_grid, split_by_20x20_km_grid
 
 
 def test__split_by_km_grid_creates_multiple_cells():
```
