180 changes: 19 additions & 161 deletions README.md
@@ -1,4 +1,4 @@
# Full Stack FastAPI Template
# Full Stack FastAPI Project

<a href="https://github.com/fastapi/full-stack-fastapi-template/actions?query=workflow%3ATest" target="_blank"><img src="https://github.com/fastapi/full-stack-fastapi-template/workflows/Test/badge.svg" alt="Test"></a>
<a href="https://coverage-badge.samuelcolvin.workers.dev/redirect/fastapi/full-stack-fastapi-template" target="_blank"><img src="https://coverage-badge.samuelcolvin.workers.dev/fastapi/full-stack-fastapi-template.svg" alt="Coverage"></a>
@@ -52,165 +52,15 @@

[![API docs](img/docs.png)](https://github.com/fastapi/full-stack-fastapi-template)

## How To Use It
## Getting Started / Local Development Setup

You can **just fork or clone** this repository and use it as is.
Docker Compose is the recommended way to run the project locally.
Use the command `docker compose watch` to start the services.

✨ It just works. ✨
For more details on Docker Compose, including how to access services (frontend, backend, docs, etc.), please refer to `development.md`.
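
As a quick reference, a minimal local session might look like the following (a sketch; the exact service URLs and ports come from `development.md` and your `.env`, so the defaults shown in the comments may differ in your setup):

```console
# from the project root; builds the images on first run and keeps code in sync
$ docker compose watch

# typical template defaults once the stack is up (verify against development.md):
#   frontend:   http://localhost:5173
#   backend:    http://localhost:8000
#   API docs:   http://localhost:8000/docs
```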

### How to Use a Private Repository

If you want to have a private repository, GitHub won't allow you to simply fork it as it doesn't allow changing the visibility of forks.

But you can do the following:

- Create a new GitHub repo, for example `my-full-stack`.
- Clone this repository manually, setting the directory name to the project name you want to use, for example `my-full-stack`:

```bash
git clone git@github.com:fastapi/full-stack-fastapi-template.git my-full-stack
```

- Enter into the new directory:

```bash
cd my-full-stack
```

- Set the new origin to your new repository, copying the URL from the GitHub interface, for example:

```bash
git remote set-url origin git@github.com:octocat/my-full-stack.git
```

- Add this repo as another "remote" to allow you to get updates later:

```bash
git remote add upstream git@github.com:fastapi/full-stack-fastapi-template.git
```

- Push the code to your new repository:

```bash
git push -u origin master
```

### Update From the Original Template

After cloning the repository, and after making changes, you might want to get the latest changes from this original template.

- Make sure you have added the original repository as a remote; you can check it with:

```bash
git remote -v

origin git@github.com:octocat/my-full-stack.git (fetch)
origin git@github.com:octocat/my-full-stack.git (push)
upstream git@github.com:fastapi/full-stack-fastapi-template.git (fetch)
upstream git@github.com:fastapi/full-stack-fastapi-template.git (push)
```

- Pull the latest changes without merging:

```bash
git pull --no-commit upstream master
```

This will download the latest changes from this template without committing them, so you can check that everything is right before committing.

- If there are conflicts, solve them in your editor.

- Once you are done, commit the changes:

```bash
git merge --continue
```

### Configure

You can then update the values in the `.env` files to customize your configuration.

Before deploying it, make sure you change at least the values for:

- `SECRET_KEY`
- `FIRST_SUPERUSER_PASSWORD`
- `POSTGRES_PASSWORD`

You can (and should) pass these as environment variables from secrets.

Read the [deployment.md](./deployment.md) docs for more details.

### Generate Secret Keys

Some environment variables in the `.env` file have a default value of `changethis`.

You have to change them to a secret key; to generate secret keys you can run the following command:

```bash
python -c "import secrets; print(secrets.token_urlsafe(32))"
```

Copy the output and use it as the password / secret key. Run the command again to generate another secure key.

## How To Use It - Alternative With Copier

This repository also supports generating a new project using [Copier](https://copier.readthedocs.io).

It will copy all the files, ask you configuration questions, and update the `.env` files with your answers.

### Install Copier

You can install Copier with:

```bash
pip install copier
```

Or better, if you have [`pipx`](https://pipx.pypa.io/), you can run it with:

```bash
pipx install copier
```

**Note**: If you have `pipx`, installing Copier is optional; you could run it directly.

### Generate a Project With Copier

Decide on a name for your new project's directory; you will use it below. For example, `my-awesome-project`.

Go to the directory that will be the parent of your project, and run the command with your project's name:

```bash
copier copy https://github.com/fastapi/full-stack-fastapi-template my-awesome-project --trust
```

If you have `pipx` and you didn't install `copier`, you can run it directly:

```bash
pipx run copier copy https://github.com/fastapi/full-stack-fastapi-template my-awesome-project --trust
```

**Note**: the `--trust` option is necessary to execute a [post-creation script](https://github.com/fastapi/full-stack-fastapi-template/blob/master/.copier/update_dotenv.py) that updates your `.env` files.

### Input Variables

Copier will ask you for some data that you might want to have at hand before generating the project.

But don't worry, you can just update any of that in the `.env` files afterwards.

The input variables, with their default values (some auto generated) are:

- `project_name`: (default: `"FastAPI Project"`) The name of the project, shown to API users (in .env).
- `stack_name`: (default: `"fastapi-project"`) The name of the stack used for Docker Compose labels and project name (no spaces, no periods) (in .env).
- `secret_key`: (default: `"changethis"`) The secret key for the project, used for security, stored in .env, you can generate one with the method above.
- `first_superuser`: (default: `"[email protected]"`) The email of the first superuser (in .env).
- `first_superuser_password`: (default: `"changethis"`) The password of the first superuser (in .env).
- `smtp_host`: (default: "") The SMTP server host to send emails, you can set it later in .env.
- `smtp_user`: (default: "") The SMTP server user to send emails, you can set it later in .env.
- `smtp_password`: (default: "") The SMTP server password to send emails, you can set it later in .env.
- `emails_from_email`: (default: `"[email protected]"`) The email account to send emails from, you can set it later in .env.
- `postgres_password`: (default: `"changethis"`) The password for the PostgreSQL database, stored in .env, you can generate one with the method above.
- `sentry_dsn`: (default: "") The DSN for Sentry, if you are using it, you can set it later in .env.
For faster development iteration, frontend and backend services can be run directly on the host machine.
Instructions for this can be found in `frontend/README.md` and `backend/README.md`.
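
For illustration, a minimal sketch of running the services on the host, assuming the default backend (FastAPI CLI with `uv`) and frontend (Node with `npm`) setups described in those READMEs, and a database already running (for example via Docker Compose):

```console
# backend, from ./backend/
$ uv run fastapi dev app/main.py

# frontend, from ./frontend/ (in a separate terminal)
$ npm install
$ npm run dev
```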

## Backend Development

@@ -230,10 +80,18 @@ General development docs: [development.md](./development.md).

This includes using Docker Compose, custom local domains, `.env` configurations, etc.

## Release Notes
## Useful Scripts

Check the file [release-notes.md](./release-notes.md).
Here's a list of scripts available in the project to help with common development tasks; a short usage sketch follows the list:

## License
- `scripts/build.sh`: Builds the Docker images for the project.
- `scripts/test.sh`: Runs the complete test suite in a Dockerized environment. This typically includes backend tests and can be expanded to include frontend end-to-end tests.
- `scripts/test-local.sh`: Runs backend tests directly on the host. It assumes the backend services (like the database) are already running (e.g., via `docker compose watch` or a similar local setup).
- `scripts/generate-client.sh`: Generates or updates the frontend client based on the backend's OpenAPI schema. This usually involves fetching the schema and running a code generation tool.
- `backend/scripts/format.sh`: Formats the backend Python code using Ruff to ensure consistent code style.
- `backend/scripts/lint.sh`: Lints the backend Python code using MyPy for static type checking and Ruff for identifying potential errors and style issues.
- `backend/scripts/test.sh`: Runs backend tests directly on the host (similar to `scripts/test-local.sh` but often focused only on backend unit/integration tests) and generates a test coverage report.
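
As a rough usage sketch (the script names come from the list above; exact behavior, flags, and required running services depend on each script's contents):

```console
# from the project root: build images and run the dockerized test suite
$ bash scripts/build.sh
$ bash scripts/test.sh

# with the stack already running (e.g. via `docker compose watch`)
$ bash scripts/test-local.sh
$ bash scripts/generate-client.sh

# backend-only helpers, run from ./backend/
$ bash scripts/format.sh
$ bash scripts/lint.sh
$ bash scripts/test.sh
```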

The Full Stack FastAPI Template is licensed under the terms of the MIT license.
## Release Notes

Check the file [release-notes.md](./release-notes.md).
30 changes: 15 additions & 15 deletions backend/README.md
@@ -11,7 +11,7 @@ Start the local development environment with Docker Compose following the guide

## General Workflow

By default, the dependencies are managed with [uv](https://docs.astral.sh/uv/), go there and install it.
Dependencies are managed with [uv](https://docs.astral.sh/uv/). If you haven't already, please install it.

From `./backend/` you can install all the dependencies with:
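
The command block itself is collapsed in this diff view. Assuming the template's standard `uv` workflow, it is likely along these lines (a sketch, not the collapsed content verbatim):

```console
$ uv sync
$ source .venv/bin/activate
```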

@@ -31,25 +31,25 @@ Modify or add SQLModel models for data and SQL tables in `./backend/app/models.p

## VS Code

There are already configurations in place to run the backend through the VS Code debugger, so that you can use breakpoints, pause and explore variables, etc.
VS Code configurations are provided to run the backend with the debugger, allowing use of breakpoints, variable exploration, etc.

The setup is also already configured so you can run the tests through the VS Code Python tests tab.
The setup also allows running tests via the VS Code Python tests tab.

## Docker Compose Override

During development, you can change Docker Compose settings that will only affect the local development environment in the file `docker-compose.override.yml`.
Docker Compose settings specific to local development can be configured in `docker-compose.override.yml`.

The changes to that file only affect the local development environment, not the production environment. So, you can add "temporary" changes that help the development workflow.
These overrides only affect the local development environment, not production, allowing for temporary changes that aid development.

For example, the directory with the backend code is synchronized in the Docker container, copying the code you change live to the directory inside the container. That allows you to test your changes right away, without having to build the Docker image again. It should only be done during development, for production, you should build the Docker image with a recent version of the backend code. But during development, it allows you to iterate very fast.
For instance, the backend code directory is synchronized with the Docker container, reflecting live code changes inside the container. This allows for immediate testing of changes without rebuilding the Docker image. This live synchronization is intended for development; for production, Docker images should be built with the finalized code. This approach significantly speeds up the development iteration cycle.
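
A quick way to see the live synchronization in action (a sketch; `backend` is the compose service name used throughout this README):

```console
# edit any backend file on the host, e.g. update its timestamp...
$ touch backend/app/main.py

# ...then follow the backend container logs and watch the dev server reload
$ docker compose logs -f backend
```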

There is also a command override that runs `fastapi run --reload` instead of the default `fastapi run`. It starts a single server process (instead of multiple, as would be for production) and reloads the process whenever the code changes. Have in mind that if you have a syntax error and save the Python file, it will break and exit, and the container will stop. After that, you can restart the container by fixing the error and running again:
There is also a command override that runs `fastapi dev` instead of the default command. It starts a single server process (unlike multiple processes typical for production) and reloads the process whenever code changes are detected. Note that a syntax error in a saved Python file will cause the server to break and exit, stopping the container. After fixing the error, the container can be restarted by running:

```console
$ docker compose watch
```

There is also a commented out `command` override, you can uncomment it and comment the default one. It makes the backend container run a process that does "nothing", but keeps the container alive. That allows you to get inside your running container and execute commands inside, for example a Python interpreter to test installed dependencies, or start the development server that reloads when it detects changes.
A commented-out `command` override is available in `docker-compose.override.yml`. If uncommented (and the default one commented out), it makes the backend container run a minimal process that keeps it alive without starting the main application. This allows you to `exec` into the running container and execute commands manually, such as starting a Python interpreter, testing installed dependencies, or running the development server with live reload.

To get inside the container with a `bash` session you can start the stack with:

@@ -71,16 +71,16 @@ root@7f2607af31c3:/app#

that means you are in a `bash` session inside your container, as the `root` user, under the `/app` directory. This directory contains another directory called `app`, which is where your code lives inside the container: `/app/app`.

There you can use the `fastapi run --reload` command to run the debug live reloading server.
There you can use the `fastapi dev` command to run the debug live reloading server.

```console
$ fastapi run --reload app/main.py
$ fastapi dev app/main.py
```

...it will look like:

```console
root@7f2607af31c3:/app# fastapi run --reload app/main.py
root@7f2607af31c3:/app# fastapi dev app/main.py
```

and then hit enter. That runs the live reloading server that auto reloads when it detects code changes.
@@ -123,7 +123,7 @@ When the tests are run, a file `htmlcov/index.html` is generated, you can open i

## Migrations

As during local development your app directory is mounted as a volume inside the container, you can also run the migrations with `alembic` commands inside the container and the migration code will be in your app directory (instead of being only inside the container). So you can add it to your git repository.
During local development, the application directory is mounted as a volume within the container. This allows you to run Alembic migration commands inside the container, with the generated migration code appearing directly in your application directory, ready to be committed to Git.

Make sure you create a "revision" of your models and "upgrade" your database with that revision every time you change them, as this is what updates the tables in your database. Otherwise, your application will have errors.

@@ -133,7 +133,7 @@ Make sure you create a "revision" of your models and that you "upgrade" your dat
$ docker compose exec backend bash
```

* Alembic is already configured to import your SQLModel models from `./backend/app/models.py`.
* Alembic is configured to import SQLModel models from `./backend/app/models.py`.

* After changing a model (for example, adding a column), inside the container, create a revision, e.g.:

@@ -149,7 +149,7 @@ $ alembic revision --autogenerate -m "Add column last_name to User model"
$ alembic upgrade head
```

If you don't want to use migrations at all, uncomment the lines in the file at `./backend/app/core/db.py` that end in:
If migrations are not desired for this project, uncomment the lines in `./backend/app/core/db.py` that end with:

```python
SQLModel.metadata.create_all(engine)
@@ -161,7 +161,7 @@ and comment the line in the file `scripts/prestart.sh` that contains:
$ alembic upgrade head
```

If you don't want to start with the default models and want to remove them / modify them, from the beginning, without having any previous revision, you can remove the revision files (`.py` Python files) under `./backend/app/alembic/versions/`. And then create a first migration as described above.
If you need to reset or start fresh with migrations (e.g., squash existing migrations or initialize a new migration history), you can remove the existing revision files (the `.py` Python files) under `./backend/app/alembic/versions/`. After doing so, you can create a new initial migration as described above.
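
A rough sketch of that reset flow, run from the project root with the stack up (the revision message is illustrative, and you should keep any non-revision files such as an `__init__.py` if your project has one):

```console
# remove the existing revision files
$ rm backend/app/alembic/versions/*.py

# create and apply a fresh initial revision inside the backend container
$ docker compose exec backend alembic revision --autogenerate -m "Initial models"
$ docker compose exec backend alembic upgrade head
```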

## Email Templates

15 changes: 9 additions & 6 deletions backend/app/api/main.py
@@ -1,14 +1,17 @@
from fastapi import APIRouter

from app.api.routes import items, login, private, users, utils
from app.api.routes import items, login, private, users, utils, events, speeches # Added events, speeches
from app.core.config import settings

api_router = APIRouter()
api_router.include_router(login.router)
api_router.include_router(users.router)
api_router.include_router(utils.router)
api_router.include_router(items.router)
api_router.include_router(login.router) # No prefix, tags=['login'] typically
api_router.include_router(users.router, prefix="/users", tags=["users"])
api_router.include_router(utils.router, prefix="/utils", tags=["utils"]) # Assuming utils has a prefix
api_router.include_router(items.router, prefix="/items", tags=["items"])
api_router.include_router(events.router, prefix="/events", tags=["events"]) # Added
api_router.include_router(speeches.router, prefix="/speeches", tags=["speeches"]) # Added


if settings.ENVIRONMENT == "local":
api_router.include_router(private.router)
# Assuming private router also has a prefix if it's for specific resources
api_router.include_router(private.router, prefix="/private", tags=["private"])