
Commit de225ff

Merge pull request #1 from ESA-APEx/sql_integration
Sql integration
2 parents: 3715f64 + e9aa5ec

19 files changed: +623 −73 lines


README.md

Lines changed: 61 additions & 7 deletions
````diff
@@ -1,22 +1,76 @@
 # APEx Dispatch API (FastAPI)
-Implementation of the APEx Upscaling Service API
 
+This repository contains the implementation of the APEx Upscaling Service API using FastAPI.
 
-## Running the API locally
+## Getting Started: Running the API Locally
+
+1. **Install dependencies:**
 
-1. Install the required dependencies:
 ```bash
 pip install -r requirements.txt
 ```
-2. Set up your environment variables in a `.env.local` file
-3. Run the FastAPI application:
+
+2. **Configure environment variables:**
+
+   Create a `.env` file and set your environment variables accordingly (e.g., `DATABASE_URL`).
+
+3. **Set up the database:**
+
+   Follow the [Database Setup](#database-setup) instructions below to prepare your local PostgreSQL instance.
+
+4. **Run the FastAPI application:**
+
 ```bash
 uvicorn app.main:app --reload
 ```
 
 ## Running Tests
 
-To run the tests, use the following command:
+Execute the test suite using:
+
 ```bash
 pytest
 ```
+
+## Database Setup
+
+1. **(Optional) Create a Docker volume to persist PostgreSQL data:**
+
+   ```bash
+   docker volume create local-postgres-data
+   ```
+
+2. **(Optional) Inspect the volume mount point:**
+
+   ```bash
+   docker volume inspect local-postgres-data
+   ```
+
+   This shows the physical location of your data on the host machine.
+
+3. **Start a PostgreSQL container linked to the volume:**
+
+   ```bash
+   docker run -d --name postgres -p 5432:5432 \
+     -e POSTGRES_USER=testuser \
+     -e POSTGRES_PASSWORD=secret \
+     -e POSTGRES_DB=testdb \
+     -v local-postgres-data:/var/lib/postgresql/data \
+     postgres:latest
+   ```
+
+4. **Set your database connection string:**
+
+   Add the following to your `.env.local` (or `.env`) file:
+
+   ```env
+   DATABASE_URL=postgresql+psycopg2://testuser:secret@localhost:5432/testdb
+   ```
+
+5. **Apply database migrations:**
+
+   Make sure your database schema is up-to-date by running:
+
+   ```bash
+   alembic upgrade head
+   ```
````
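For orientation, here is a minimal sketch of the application-side wiring this README implies: a module that reads `DATABASE_URL` and exposes the `Base` that `alembic/env.py` (further down) imports. Only `app.database.db.Base` is confirmed by this diff; the engine and session helpers shown here are illustrative assumptions, not the repository's actual code.

```python
# Hypothetical sketch of app/database/db.py. Only `Base` is confirmed by the
# diff (alembic/env.py imports it); the engine/session setup is an assumption.
import os
from typing import Iterator

from sqlalchemy import create_engine
from sqlalchemy.orm import Session, declarative_base, sessionmaker

# Declarative base that models such as ProcessingJobRecord would inherit from.
Base = declarative_base()

# Connection string configured in .env / .env.local,
# e.g. postgresql+psycopg2://testuser:secret@localhost:5432/testdb
DATABASE_URL = os.environ["DATABASE_URL"]

engine = create_engine(DATABASE_URL)
SessionLocal = sessionmaker(bind=engine, autoflush=False)


def get_session() -> Iterator[Session]:
    """Illustrative FastAPI-style dependency that yields a database session."""
    session = SessionLocal()
    try:
        yield session
    finally:
        session.close()
```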

alembic.ini

Lines changed: 147 additions & 0 deletions
```ini
# A generic, single database configuration.

[alembic]
# path to migration scripts.
# this is typically a path given in POSIX (e.g. forward slashes)
# format, relative to the token %(here)s which refers to the location of this
# ini file
script_location = %(here)s/alembic

# template used to generate migration file names; The default value is %%(rev)s_%%(slug)s
# Uncomment the line below if you want the files to be prepended with date and time
# see https://alembic.sqlalchemy.org/en/latest/tutorial.html#editing-the-ini-file
# for all available tokens
# file_template = %%(year)d_%%(month).2d_%%(day).2d_%%(hour).2d%%(minute).2d-%%(rev)s_%%(slug)s

# sys.path path, will be prepended to sys.path if present.
# defaults to the current working directory. for multiple paths, the path separator
# is defined by "path_separator" below.
prepend_sys_path = .


# timezone to use when rendering the date within the migration file
# as well as the filename.
# If specified, requires the python>=3.9 or backports.zoneinfo library and tzdata library.
# Any required deps can installed by adding `alembic[tz]` to the pip requirements
# string value is passed to ZoneInfo()
# leave blank for localtime
# timezone =

# max length of characters to apply to the "slug" field
# truncate_slug_length = 40

# set to 'true' to run the environment during
# the 'revision' command, regardless of autogenerate
# revision_environment = false

# set to 'true' to allow .pyc and .pyo files without
# a source .py file to be detected as revisions in the
# versions/ directory
# sourceless = false

# version location specification; This defaults
# to <script_location>/versions. When using multiple version
# directories, initial revisions must be specified with --version-path.
# The path separator used here should be the separator specified by "path_separator"
# below.
# version_locations = %(here)s/bar:%(here)s/bat:%(here)s/alembic/versions

# path_separator; This indicates what character is used to split lists of file
# paths, including version_locations and prepend_sys_path within configparser
# files such as alembic.ini.
# The default rendered in new alembic.ini files is "os", which uses os.pathsep
# to provide os-dependent path splitting.
#
# Note that in order to support legacy alembic.ini files, this default does NOT
# take place if path_separator is not present in alembic.ini. If this
# option is omitted entirely, fallback logic is as follows:
#
# 1. Parsing of the version_locations option falls back to using the legacy
#    "version_path_separator" key, which if absent then falls back to the legacy
#    behavior of splitting on spaces and/or commas.
# 2. Parsing of the prepend_sys_path option falls back to the legacy
#    behavior of splitting on spaces, commas, or colons.
#
# Valid values for path_separator are:
#
# path_separator = :
# path_separator = ;
# path_separator = space
# path_separator = newline
#
# Use os.pathsep. Default configuration used for new projects.
path_separator = os

# set to 'true' to search source files recursively
# in each "version_locations" directory
# new in Alembic version 1.10
# recursive_version_locations = false

# the output encoding used when revision files
# are written from script.py.mako
# output_encoding = utf-8

# database URL. This is consumed by the user-maintained env.py script only.
# other means of configuring database URLs may be customized within the env.py
# file.
sqlalchemy.url = driver://user:pass@localhost/dbname


[post_write_hooks]
# post_write_hooks defines scripts or Python functions that are run
# on newly generated revision scripts. See the documentation for further
# detail and examples

# format using "black" - use the console_scripts runner, against the "black" entrypoint
# hooks = black
# black.type = console_scripts
# black.entrypoint = black
# black.options = -l 79 REVISION_SCRIPT_FILENAME

# lint with attempts to fix using "ruff" - use the module runner, against the "ruff" module
# hooks = ruff
# ruff.type = module
# ruff.module = ruff
# ruff.options = check --fix REVISION_SCRIPT_FILENAME

# Alternatively, use the exec runner to execute a binary found on your PATH
# hooks = ruff
# ruff.type = exec
# ruff.executable = ruff
# ruff.options = check --fix REVISION_SCRIPT_FILENAME

# Logging configuration. This is also consumed by the user-maintained
# env.py script only.
[loggers]
keys = root,sqlalchemy,alembic

[handlers]
keys = console

[formatters]
keys = generic

[logger_root]
level = WARNING
handlers = console
qualname =

[logger_sqlalchemy]
level = WARNING
handlers =
qualname = sqlalchemy.engine

[logger_alembic]
level = INFO
handlers =
qualname = alembic

[handler_console]
class = StreamHandler
args = (sys.stderr,)
level = NOTSET
formatter = generic

[formatter_generic]
format = %(levelname)-5.5s [%(name)s] %(message)s
datefmt = %H:%M:%S
```
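Leaving `sqlalchemy.url` at its placeholder value is intentional: `alembic/env.py` (below) overrides it with the `DATABASE_URL` environment variable. Beyond the `alembic upgrade head` command in the README, migrations can also be driven programmatically through Alembic's command API. The snippet below is a hedged sketch of that approach (useful, for example, in test fixtures); the diff does not show whether the repository actually does this.

```python
# Sketch: running the migrations from Python instead of the CLI.
# Mirrors `alembic upgrade head`; this is an illustration, not code from the repo.
import os

from alembic import command
from alembic.config import Config

# env.py reads DATABASE_URL from the environment, so it must be set first.
# The URL below is the example value from the README's Database Setup section.
os.environ.setdefault(
    "DATABASE_URL", "postgresql+psycopg2://testuser:secret@localhost:5432/testdb"
)

alembic_cfg = Config("alembic.ini")
command.upgrade(alembic_cfg, "head")    # apply all pending migrations
# command.downgrade(alembic_cfg, "-1")  # roll back the most recent migration
```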

alembic/README

Lines changed: 1 addition & 0 deletions
Generic single-database configuration.

alembic/env.py

Lines changed: 89 additions & 0 deletions
```python
from logging.config import fileConfig
import os

from sqlalchemy import engine_from_config
from sqlalchemy import pool

from alembic import context

from app.database.db import Base  # import your Base here
from app.database.models.processing_job import ProcessingJobRecord  # import your models here


# this is the Alembic Config object, which provides
# access to the values within the .ini file in use.
config = context.config

# Interpret the config file for Python logging.
# This line sets up loggers basically.
if config.config_file_name is not None:
    fileConfig(config.config_file_name)

# add your model's MetaData object here
# for 'autogenerate' support
# from myapp import mymodel
# target_metadata = mymodel.Base.metadata
target_metadata = Base.metadata

# other values from the config, defined by the needs of env.py,
# can be acquired:
# my_important_option = config.get_main_option("my_important_option")
# ... etc.


def get_url():
    print("Fetching database URL from environment variables")
    return os.getenv("DATABASE_URL")


def run_migrations_offline() -> None:
    """Run migrations in 'offline' mode.

    This configures the context with just a URL
    and not an Engine, though an Engine is acceptable
    here as well. By skipping the Engine creation
    we don't even need a DBAPI to be available.

    Calls to context.execute() here emit the given string to the
    script output.

    """
    url = get_url()
    context.configure(
        url=url,
        target_metadata=target_metadata,
        literal_binds=True,
        dialect_opts={"paramstyle": "named"},
    )

    with context.begin_transaction():
        context.run_migrations()


def run_migrations_online() -> None:
    """Run migrations in 'online' mode.

    In this scenario we need to create an Engine
    and associate a connection with the context.

    """
    connectable = engine_from_config(
        config.get_section(config.config_ini_section, {}),
        prefix="sqlalchemy.",
        poolclass=pool.NullPool,
        url=get_url(),
    )

    with connectable.connect() as connection:
        context.configure(
            connection=connection, target_metadata=target_metadata
        )

        with context.begin_transaction():
            context.run_migrations()


if context.is_offline_mode():
    run_migrations_offline()
else:
    run_migrations_online()
```
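Note that `get_url()` reads `DATABASE_URL` straight from the process environment, so the `.env` / `.env.local` file mentioned in the README is not loaded automatically when the `alembic` CLI runs. One way to close that gap, assuming `python-dotenv` is acceptable as a dependency (the diff does not show it), is to load the file near the top of `env.py`:

```python
# Optional addition near the top of alembic/env.py (assumes python-dotenv is
# installed; this is not part of the committed file shown above).
from dotenv import load_dotenv

# Populate os.environ from .env.local (falling back to .env) so that
# get_url() finds DATABASE_URL when Alembic is run from the command line.
load_dotenv(".env.local")
load_dotenv(".env")
```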

alembic/script.py.mako

Lines changed: 28 additions & 0 deletions
```mako
"""${message}

Revision ID: ${up_revision}
Revises: ${down_revision | comma,n}
Create Date: ${create_date}

"""
from typing import Sequence, Union

from alembic import op
import sqlalchemy as sa
${imports if imports else ""}

# revision identifiers, used by Alembic.
revision: str = ${repr(up_revision)}
down_revision: Union[str, Sequence[str], None] = ${repr(down_revision)}
branch_labels: Union[str, Sequence[str], None] = ${repr(branch_labels)}
depends_on: Union[str, Sequence[str], None] = ${repr(depends_on)}


def upgrade() -> None:
    """Upgrade schema."""
    ${upgrades if upgrades else "pass"}


def downgrade() -> None:
    """Downgrade schema."""
    ${downgrades if downgrades else "pass"}
```
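Once `alembic revision --autogenerate -m "..."` fills in this template, a revision file looks roughly like the sketch below. The revision id, table name, and columns are purely hypothetical placeholders; the real processing-job schema lives elsewhere in this PR and is not shown in this view.

```python
"""create processing_jobs table (hypothetical example of a rendered revision)

Revision ID: 0123456789ab
Revises:
Create Date: 2024-01-01 00:00:00.000000

"""
from typing import Sequence, Union

from alembic import op
import sqlalchemy as sa

# revision identifiers, used by Alembic.
revision: str = "0123456789ab"
down_revision: Union[str, Sequence[str], None] = None
branch_labels: Union[str, Sequence[str], None] = None
depends_on: Union[str, Sequence[str], None] = None


def upgrade() -> None:
    """Upgrade schema (illustrative columns only)."""
    op.create_table(
        "processing_jobs",
        sa.Column("id", sa.Integer(), primary_key=True),
        sa.Column("status", sa.String(length=32), nullable=False),
        sa.Column("created_at", sa.DateTime(), nullable=False),
    )


def downgrade() -> None:
    """Downgrade schema."""
    op.drop_table("processing_jobs")
```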
