
Commit 4389407

fix: build and version endpoints
Build now uses buildx to build cross-platform images for the containers. The version is read from the Python package by running `python` inside the container. The build phase tags the release as `latest` and with the version number the Python package is set to; ensure you have updated the Python version before running builds.
1 parent 609abf3 commit 4389407

File tree

2 files changed: +90 −18 lines changed

README.md

Lines changed: 75 additions & 15 deletions
@@ -226,7 +226,13 @@ Python snake case is translated to camel case in JavaScript. So `my_var` becomes
 
 ```python
 from pydantic import BaseModel
-from humps import camelize
+
+def to_lower_camel(name: str) -> str:
+    """
+    Converts a snake_case string to lowerCamelCase
+    """
+    upper = "".join(word.capitalize() for word in name.split("_"))
+    return upper[:1].lower() + upper[1:]
 
 class User(BaseModel):
     first_name: str
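As a quick check of the helper introduced in this hunk (reproduced here so it runs standalone), it behaves like `humps.camelize` does for simple identifiers:

```python
def to_lower_camel(name: str) -> str:
    """Converts a snake_case string to lowerCamelCase."""
    upper = "".join(word.capitalize() for word in name.split("_"))
    return upper[:1].lower() + upper[1:]

print(to_lower_camel("first_name"))  # firstName
print(to_lower_camel("id"))          # id
```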
@@ -235,15 +241,15 @@ class User(BaseModel):
 
     model_config = ConfigDict(
         from_attributes=True,
-        alias_generator=camelize,
+        alias_generator=to_lower_camel,
     )
 ```
 
 > Source: [CamelCase Models with FastAPI and Pydantic](https://medium.com/analytics-vidhya/camel-case-models-with-fast-api-and-pydantic-5a8acb6c0eee) by Ahmed Nafies
 
 It is important to pay attention to such detail and to do what is right for the environment and language.
 
-To assist with this `src/labs/schema/utils.py` provides the class `AppBaseModel` which inherits from `pydantic`'s `BaseModel` and configures it to use `humps`'s `camelize` function to convert snake case to camel case. To use it simply inherit from `AppBaseModel`:
+To assist with this, `src/labs/schema/utils.py` provides the class `AppBaseModel`, which inherits from `pydantic`'s `BaseModel` and configures it to use the `to_lower_camel` function to convert snake case to camel case. If you inherit from `AppBaseModel` you automatically get this behaviour:
 
 ```python
 from .utils import AppBaseModel
@@ -254,10 +260,6 @@ class MyModel(AppBaseModel):
     age: float
 ```
 
-> **WARNING:** `humps` is published as `pyhumps` on `pypi`
-
-As per [this issue](https://github.com/anomaly/lab-python-server/issues/27) we have wrapped this
-
 FastAPI will try and generate an `operation_id` based on the path of the router endpoint, which usually ends up being a convoluted string. This was originally reported in [lab-web-client](https://github.com/anomaly/lab-web-client/issues/6). You can provide an `operation_id` in the `decorator`, e.g.:
 
 ```python
@@ -268,6 +270,24 @@ which would result in the client generating a function like `someSpecificIdYouDe
 
 For consistency the FastAPI docs show a wrapper function that [globally re-writes](https://fastapi.tiangolo.com/advanced/path-operation-advanced-configuration/?h=operation_id#using-the-path-operation-function-name-as-the-operationid) the `operation_id` to the function name. This does put the onus on the developer to name the function correctly.
 
+As of FastAPI `0.99.x` the constructor takes a `generate_unique_id_function` parameter, a callable that returns the operation id. If you name your Python functions properly you can use them as operation ids. `api.py` features this simple function to help with it:
+
+```python
+def generate_operation_id(route: APIRoute) -> str:
+    """
+    With a little help from FastAPI docs
+    https://bit.ly/3rXeAvH
+
+    Globally use the path name as the operation id, thus
+    making things a lot more readable; note that this requires
+    you to name your functions really well.
+
+    Read more about this on the FastAPI docs
+    https://shorturl.at/vwz03
+    """
+    return route.name
+```
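To make the effect concrete, here is a small self-contained sketch; the `APIRoute` below is a minimal stand-in dataclass so the snippet runs without FastAPI installed (real code receives `fastapi.routing.APIRoute`):

```python
from dataclasses import dataclass

# Minimal stand-in for fastapi.routing.APIRoute, used only so
# this sketch runs without FastAPI installed.
@dataclass
class APIRoute:
    name: str

def generate_operation_id(route: APIRoute) -> str:
    # The route's name is the decorated endpoint function's name,
    # so well-named functions yield readable operation ids.
    return route.name

print(generate_operation_id(APIRoute(name="get_current_user")))  # get_current_user
```

Generated clients then expose a function named after the endpoint (e.g. `getCurrentUser`) rather than a path-derived string.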
 
 ## TaskIQ based tasks
 
 The project uses [`TaskIQ`](https://taskiq-python.github.io) to manage task queues. TaskIQ supports `asyncio` and has FastAPI-like design ideas e.g. [dependency injection](https://taskiq-python.github.io/guide/state-and-deps.html) and can be tightly [coupled with FastAPI](https://taskiq-python.github.io/guide/taskiq-with-fastapi.html).
@@ -277,9 +297,9 @@ TaskIQ is configured as recommended for production use with [taskiq-aio-pika](ht
 `broker.py` in the root of the project configures the broker using:
 
 ```python
-broker = AioPikaBroker(
-    config.amqp_dsn,
-    result_backend=redis_result_backend
+broker = (
+    AioPikaBroker(str(settings.amqp.dsn))
+    .with_result_backend(redis_result_backend)
 )
 ```
 
@@ -323,6 +343,22 @@ async def verify_user(request: Request):
 
 There are various powerful options for queuing tasks; both scheduled and periodic tasks are supported.
 
+Towards the end of `broker.py` you will notice the following override:
+
+```python
+# For testing we use the InMemory broker, this is set
+# if an environment variable is set, please note you
+# will require pytest-env for environment vars to work
+env = os.environ.get("ENVIRONMENT")
+if env and env == "pytest":
+    from taskiq import InMemoryBroker
+    broker = InMemoryBroker()
+```
+
+which allows us to use the `InMemoryBroker` for testing. This is because `FastAPI` provides its own testing infrastructure which routes calls internally, and the RabbitMQ broker and Redis backend are not available.
+
+> Note: you will need to install `pytest-env` for this to work, and be sure to set the `ENVIRONMENT` environment variable to `pytest`. Refer to `pyproject.toml` to see how we configure it for the template.
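For reference, a minimal `pytest-env` configuration sketch for `pyproject.toml` might look like the following (the exact keys in the template may differ; this assumes the plugin's documented `env` option):

```toml
[tool.pytest.ini_options]
env = [
    "ENVIRONMENT=pytest",
]
```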
 
 ## SQLAlchemy wisdom
 
 SQLAlchemy is making a move towards its `2.0` syntax; this is available as of `v1.4`, which is what we currently target as part of our template. This also brings the use of `asyncpg`, which allows us to use `asyncio` with `SQLAlchemy`.
@@ -580,13 +616,37 @@ We run the application using `uvicorn` and pass in `--root-path=/api` for FastAP
 
 `Dockerfile` is the configuration referenced by `docker-compose.yml` for development and `Dockerfile.prod` is the configuration referenced by `docker-compose.prod.yml` for production. For Kubernetes based deployment please reference `Dockerfile.prod`.
 
-## Docker in Production
+## Containers in production
+
+The template provides a Dockerfile for production; it uses [multi staged builds](https://docs.docker.com/develop/develop-images/multistage-build/) to build a slimmer image for production.
+
+There's a fair bit of documentation available around deploying [uvicorn for production](https://www.uvicorn.org/deployment/). It does suggest using a process manager like `gunicorn`, but this can be irrelevant depending on where we are deploying. For example, if the application is deployed in a Kubernetes cluster then each `pod` would sit behind a load balancer and/or a content distribution network (CDN) and the process manager would be redundant.
+
+> The production container does have the `postgres` client installed to provide you access to `psql`; this is rather handy for initialising the database or performing any manual patches.
+
+### .pgpass
 
-Multi staged builds
-https://docs.docker.com/develop/develop-images/multistage-build/
+Often you will want an interactive shell to `postgres` to update the database. Our containers have the postgres client installed. If you have a file called `.pgpass` in `/root/` then you can use `psql` directly without having to enter a password.
+
+Remember that the container has very limited software installed, so you will need to save the contents of `.pgpass` using:
+
+```sh
+echo "kubernetescluster-aurora-cluster.cluster-cc3g.ap-southeast-2.rds.amazonaws.com:5432:harvest:dbuser:password" > ~/.pgpass
+```
 
-gunicorn vs uvicorn
-https://www.uvicorn.org/deployment/
+Once you have done that you can use `kubectl` to execute `psql` directly.
+
+```sh
+kubectl exec -it server-565497855b-md96l -n harvest -- /usr/bin/psql -U dbuser -h kubernetescluster-aurora-cluster.cluster-cc3g.ap-southeast-2.rds.amazonaws.com -d harvest
+```
+
+> Note that you have to specify the hostname and username as well as the database name. The password is read from the `.pgpass` file.
+
+Once you have this working you can pipe the contents of a SQL file from your local machine to the container.
+
+```sh
+cat backup.sql | kubectl exec -it server-565497855b-md96l -n harvest -- /usr/bin/psql -U dbuser -h kubernetescluster-aurora-cluster.cluster-cc3g.ap-southeast-2.rds.amazonaws.com -d harvest
+```
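One practical detail worth adding, drawn from `psql`'s documented behaviour rather than this commit (the hostname below is illustrative): the `.pgpass` format is `host:port:database:username:password`, and `psql` silently ignores the file unless it is readable only by its owner.

```sh
# .pgpass lines are host:port:database:username:password.
# psql silently ignores the file unless its mode is 0600.
# Writing to ./.pgpass here for illustration; psql reads ~/.pgpass.
echo "db.example.com:5432:harvest:dbuser:password" > .pgpass
chmod 0600 .pgpass
```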
 
 ## Distribution
Taskfile.yml

Lines changed: 15 additions & 3 deletions
@@ -7,7 +7,12 @@ tasks:
     desc: builds a publishable docker image
     cmds:
      - |
-       docker buildx build --platform=linux/amd64,linux/arm64 --push -t ghcr.io/anomaly/ghcr.io/{{.ORG_NAME}}/{{.PROJ_NAME}}:v{{.PROJ_VER}} -t ghcr.io/ghcr.io/{{.ORG_NAME}}/{{.PROJ_NAME}}:latest -f Dockerfile .
+       docker buildx build \
+         --platform=linux/amd64,linux/arm64 \
+         --push \
+         -t ghcr.io/{{.ORG_NAME}}/{{.PACKAGE_NAME}}:v{{.PROJ_VER}} \
+         -t ghcr.io/{{.ORG_NAME}}/{{.PACKAGE_NAME}}:latest \
+         -f Dockerfile .
    vars:
      PROJ_VER:
        sh: "task version"
@@ -35,6 +40,12 @@ tasks:
     desc: generate a random cryptographic hash
     cmds:
      - openssl rand -hex {{.CLI_ARGS}}
+  db:dump:
+    desc: dump the database to a file
+    cmds:
+     - |
+       docker compose exec db sh -c \
+         "pg_dump -U {{.POSTGRES_USER}} {{.POSTGRES_DB}}" > {{.CLI_ARGS}}
   db:init:
     desc: initialise the database schema
     cmds:
@@ -84,8 +95,9 @@ tasks:
        INSERT INTO alembic_version ( version_num ) VALUES ( '{{.HEAD_SHA}}' );\""
   version:
     cmds:
-     - cd src
-     - python -c "from labs import __version__; print(__version__)"
+     - |
+       docker compose exec api sh \
+         -c "python -c \"from {{.PROJ_NAME}} import __version__; print(__version__)\""
   # version:next:
   #   cmds:
   #     - |
