Ismael Arce

Scalable Python backend: Building a containerized FastAPI Application with uv, Docker, and pre-commit: a step-by-step guide

In today’s world of containerized deployments, building and deploying backend applications efficiently is critical. FastAPI has emerged as one of the most popular Python frameworks for building high-performance APIs, and to manage dependencies we will lean on uv, an extremely fast Python package and project manager.

uv

I will assume you have already installed uv and Docker locally.
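If you still need uv, a minimal sketch using the standalone installer from the uv documentation (you can also install it with pipx or your system package manager):

curl -LsSf https://astral.sh/uv/install.sh | sh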

Now, we can move on to creating our app by initializing our project with: uv init simple-app

uv will create the following files:

simple-app/
├── .python-version
├── README.md
├── hello.py
└── pyproject.toml

The pyproject.toml file contains metadata about our project:

[project]
name = "simple-app"
version = "0.1.0"
description = "Add your description here"
readme = "README.md"
requires-python = ">=3.11"
dependencies = []

Next, we can start adding project dependencies. You should end up with something like the following in pyproject.toml:

dependencies = [
    "fastapi[standard]<1.0.0,>=0.114.2",
    "python-multipart<1.0.0,>=0.0.7",
    "email-validator<3.0.0,>=2.1.0",
    "pydantic>2.0",
    "SQLAlchemy>2.0",
    "alembic<2.0.0,>=1.12.1",
]

[tool.uv]
dev-dependencies = [
    "pytest<8.0.0,>=7.4.3",
    "mypy<2.0.0,>=1.8.0",
    "ruff<1.0.0,>=0.2.2",
    "pre-commit<4.1.0,>=4.0.0",
]

Note the [tool.uv] section: here we define development-only dependencies, which we will exclude when deploying the project because they are not required at that stage.
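If you prefer not to edit pyproject.toml by hand, you can let uv add the entries for you. A rough sketch (uv will pin its own version constraints, which may differ from the ones shown above, and newer uv releases may write dev dependencies to a [dependency-groups] table instead of [tool.uv]):

uv add "fastapi[standard]" python-multipart email-validator pydantic sqlalchemy alembic
uv add --dev pytest mypy ruff pre-commit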

At this point, we haven’t created a virtual environment yet. To do so, simply run uv sync. uv will do the following:

  1. Create a uv.lock file.
  2. Create a virtual environment (.venv folder) with the specified Python version (as indicated by .python-version and requires-python in pyproject.toml). If uv cannot find a local Python interpreter, it will download one.
  3. Install all dependencies.
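As a quick sanity check that the environment works, you can run the placeholder script generated by uv init (assuming you kept hello.py):

uv sync
uv run python hello.py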

FastAPI

Now, we can start creating our FastAPI application manually by adding the following folder structure:

simple-app/
├── app/
│   ├── main.py
│   ├── __init__.py
│   └── ...
├── .python-version
├── README.md
└── pyproject.toml

Inside main.py, add the following code:

from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()


class Hello(BaseModel):
    message: str


@app.get("/", response_model=Hello)
async def hello() -> Hello:
    return Hello(message="Hi, I am using FastAPI")

We can run our project by executing uv run fastapi dev app/main.py. You should see output similar to the following:

FastAPI running locally.

If you go to http://127.0.0.1:8000/, you will see the JSON response: {"message": "Hi, I am using FastAPI"}
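You can also verify the endpoint from the command line (and, as usual with FastAPI, the interactive docs are served at /docs):

curl http://127.0.0.1:8000/
# {"message":"Hi, I am using FastAPI"}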

Docker

So far, so good. However, we haven’t integrated Docker yet. We will be developing with containers (some argue this is not convenient, but it’s ultimately up to you). Also, we will use uv inside a container, which may be debatable, but it’s what I am used to.

The uv documentation provides useful guidance on using uv in Docker (https://docs.astral.sh/uv/guides/integration/docker/). We’ll start by adding a Dockerfile at the root of our application with the following configuration:

FROM python:3.11-slim

ENV PYTHONUNBUFFERED=1

COPY --from=ghcr.io/astral-sh/uv:0.5.11 /uv /uvx /bin/

ENV UV_COMPILE_BYTECODE=1

ENV UV_LINK_MODE=copy

# Change the working directory to the `app` directory
WORKDIR /app

ENV PATH="/app/.venv/bin:$PATH"

COPY ./pyproject.toml ./uv.lock ./.python-version /app/

# Install dependencies
RUN --mount=type=cache,target=/root/.cache/uv \
    --mount=type=bind,source=uv.lock,target=uv.lock \
    --mount=type=bind,source=pyproject.toml,target=pyproject.toml \
    uv sync --frozen --no-install-project --no-dev

# Copy the project into the image
COPY ./app /app/app

# Sync the project
RUN --mount=type=cache,target=/root/.cache/uv \
    uv sync --frozen --no-dev

CMD ["fastapi", "dev", "app/main.py", "--host", "0.0.0.0"]

Although you can create a multi-stage Dockerfile, we’re keeping things simpler for this tutorial.
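For reference, a minimal multi-stage sketch could look like the following. It assumes the same base image, uv version, and project layout as above; the final stage carries only the application code and its virtual environment, and uses fastapi run since there is nothing to hot-reload inside the image (swap in dev if you prefer):

# --- build stage: install dependencies with uv ---
FROM python:3.11-slim AS builder

COPY --from=ghcr.io/astral-sh/uv:0.5.11 /uv /uvx /bin/
ENV UV_COMPILE_BYTECODE=1
ENV UV_LINK_MODE=copy

WORKDIR /app
COPY ./pyproject.toml ./uv.lock ./.python-version /app/
RUN --mount=type=cache,target=/root/.cache/uv \
    uv sync --frozen --no-install-project --no-dev

COPY ./app /app/app
RUN --mount=type=cache,target=/root/.cache/uv \
    uv sync --frozen --no-dev

# --- final stage: copy the project and its virtual environment ---
FROM python:3.11-slim

ENV PYTHONUNBUFFERED=1
WORKDIR /app
COPY --from=builder /app /app
ENV PATH="/app/.venv/bin:$PATH"

CMD ["fastapi", "run", "app/main.py", "--host", "0.0.0.0"]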

We could run the container on its own; however, I find it more convenient to create a docker-compose.yaml file to manage all of our containers:

services:
  app:
    # Build configuration for the "app" service:
    # - 'context: .' tells Docker to use the current directory as the build context
    # - 'dockerfile: Dockerfile' specifies the file to use for building the image
    build:
      context: .
      dockerfile: Dockerfile

    # This sets the default working directory inside the container
    working_dir: /app

    # Mounts the local "app" directory into the container so code changes are reflected without rebuild
    volumes:
      - ./app:/app/app

    # Maps the container port 8000 to the host machine port defined by APP_PORT
    # If APP_PORT is not set, it defaults to 8000
    ports:
      - "${APP_PORT:-8000}:8000"

    # Passes the DATABASE_URL environment variable to the container
    environment:
      - DATABASE_URL=${DATABASE_URL}

    # Ensures the 'app' service won't start until 'postgres' is running
    depends_on:
      - postgres

  postgres: ## just for reference...
    # Official Postgres image version 15
    image: postgres:15

    # Set up the default database, user, and password
    environment:
      POSTGRES_DB: ${POSTGRES_DB}
      POSTGRES_USER: ${POSTGRES_USER}
      POSTGRES_PASSWORD: ${POSTGRES_PASSWORD}

    # This volume stores PostgreSQL data outside of the container filesystem,
    # preserving data between container restarts or recreations
    volumes:
      - postgres_data:/var/lib/postgresql/data

# Declare named volumes to be used for persistent storage
volumes:
  postgres_data: {}

To run all the containers, create a .env file at the project root with all the required variables.

You may wonder why we need to define our database credentials twice. Well, DATABASE_URL is for Alembic and SQLAlchemy, and the individual credentials are for the database itself. This can seem repetitive, but we only need to configure it once.
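For reference, a minimal .env sketch (every value here is a placeholder; the DATABASE_URL scheme depends on which Postgres driver you add to your dependencies, e.g. psycopg2):

APP_PORT=8000
POSTGRES_DB=app_db
POSTGRES_USER=app_user
POSTGRES_PASSWORD=change-me
# "postgres" is the compose service name, which doubles as the hostname inside the compose network
DATABASE_URL=postgresql+psycopg2://app_user:change-me@postgres:5432/app_db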

Once everything is set up, we can run our project with: docker compose up --build

[tool.uv]

The final piece we need to cover is the [tool.uv] section in pyproject.toml, where we listed dev dependencies.

  • pytest: a widely used Python testing framework that lets you write small, concise tests while providing powerful features such as fixtures and assertions.
  • mypy: a static type checker for Python that uses type hints (PEP 484) to detect potential bugs or inconsistencies before runtime.
  • ruff: a fast Python linter, written in Rust, capable of replacing multiple tools (e.g., Flake8, isort) by providing thorough style, error, and formatting checks.
  • pre-commit: a framework for managing and maintaining multi-language pre-commit hooks, ensuring consistency and quality in your codebase by running checks before a commit is finalized.

Since pytest is out of scope for this article, we will configure ruff and pre-commit. We need to create a .pre-commit-config.yaml file, whose hooks will run every time we execute a git commit. Below is a suggested configuration:

repos:
  - repo: https://github.com/pre-commit/pre-commit-hooks
    rev: v4.4.0
    hooks:
      - id: check-added-large-files
      - id: check-toml
      - id: check-yaml
        args:
          - --unsafe
      - id: end-of-file-fixer
      - id: trailing-whitespace
  - repo: https://github.com/astral-sh/ruff-pre-commit
    rev: v0.8.6
    hooks:
      - id: ruff
        args: [--fix]
      - id: ruff-format

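Install the hooks once per clone so they run on every git commit; you can also run them against the whole repository on demand:

uv run pre-commit install
uv run pre-commit run --all-files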

You can configure mypy inside pre-commit as well, but it can be somewhat tricky because it needs an isolated environment to check your code and might fail to find packages that are already part of your dependencies. This is why I prefer to run it manually by executing uv run mypy app, which runs mypy on our app folder.

Additional configuration can be added to pyproject.toml for both mypy and ruff. This is my standard configuration (some values are defaults, but I prefer to be explicit):

[tool.mypy]
strict = true
exclude = ["venv", ".venv", "alembic"]
ignore_missing_imports = true
allow_untyped_decorators = true
plugins = ["pydantic.mypy"]
follow_imports = "silent"
warn_redundant_casts = true
warn_unused_ignores = true
disallow_any_generics = true
no_implicit_reexport = true
disallow_untyped_defs = true

[tool.pydantic-mypy]
init_forbid_extra = true
init_typed = true
warn_required_dynamic_aliases = true

[tool.ruff]
target-version = "py311"
exclude = ["venv", ".venv", "alembic"]
line-length = 100
indent-width = 4

[tool.ruff.lint]
select = [
    "E",  # pycodestyle errors
    "W",  # pycodestyle warnings
    "F",  # pyflakes
    "I",  # isort
    "B",  # flake8-bugbear
    "C4",  # flake8-comprehensions
    "UP",  # pyupgrade
    "ARG001", # unused arguments in functions
]
ignore = [
    "B008",  # do not perform function calls in argument defaults
    "W191",  # indentation contains tabs
    "B904",  # Allow raising exceptions without from e, for HTTPException
]

[tool.ruff.format]
quote-style = "double"
line-ending = "auto"

[tool.ruff.lint.pyupgrade]
# Preserve types, even if a file imports `from __future__ import annotations`.
keep-runtime-typing = true

[tool.pyright]
ignore = ["alembic"]

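The same pyproject.toml configuration is picked up when you run ruff from the command line, for example:

uv run ruff check app --fix
uv run ruff format app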

Now you can install the Ruff extension directly from the VS Code Marketplace. It will automatically lint your code and highlight issues in real time, providing immediate feedback as you work, and it picks up all of our configuration from pyproject.toml.

With this configuration, your development environment will enforce consistent code style, type-checking, and pre-commit checks, enabling a smoother workflow for building containerized FastAPI applications with uv.
