Python API server starter project

Overview

Starter project demonstrating backend API-server best practices: unified REST & GraphQL endpoints, layered architecture, shared SQLModel models, and a modern tooling stack (uv, Alembic, Ruff, pytest).

Current Implementation

  • REST endpoints: /health-check, /ping, /version (and a base API router at /api)
  • GraphQL at /graphql with GraphiQL UI
  • CLI via api-server console command (Typer) and python -m api_server.main
  • Database layer using SQLModel/SQLAlchemy; migrations via Alembic
  • Taskfile-driven workflow (uv venv/install, ruff format/lint, pytest, alembic, run)

Technology Stack

Core

  • Language: Python 3.13+
  • Web Framework: FastAPI
  • GraphQL: Strawberry GraphQL (with FastAPI integration)
  • ORM: SQLModel (SQLAlchemy-based)
  • Database: PostgreSQL (via psycopg)
  • Migration: Alembic

Development Tools

  • Package Manager: uv (faster alternative to pip)
  • CLI Framework: Typer
  • Linting & Formatting: ruff
  • Testing: pytest
  • Environment: python-dotenv
  • Logging: loguru
  • Task Runner: Taskfile (Go Task)

Documentation

  • API Docs: OpenAPI (Swagger UI), ReDoc
  • GraphQL UI: GraphiQL
  • Templates: Jinja2

Project Structure

python-api-server/
├── src/
│   └── api_server/                 # Main application package
│       ├── api/                    # REST API endpoints
│       │   ├── api_router.py       # API router configuration
│       │   ├── health_check.py     # /health-check endpoint
│       │   ├── ping.py             # /ping endpoint
│       │   └── version.py          # /version endpoint
│       ├── graphql/                # GraphQL schema and router
│       │   ├── graphql_router.py   # /graphql with GraphiQL
│       │   ├── context.py          # GraphQL context and dependencies
│       │   ├── schema.py           # GraphQL schema definitions (queries, mutations)
│       │   └── types.py            # GraphQL type definitions (Strawberry models)
│       ├── models/                 # Data models (SQLModel)
│       ├── services/               # Business logic and services
│       ├── templates/              # HTML templates (index)
│       ├── app.py                  # FastAPI application setup
│       ├── database.py             # Database connection and utilities
│       ├── logging.py              # Logging configuration
│       └── main.py                 # CLI entry (Typer) and uvicorn runner
├── migrations/                     # Database migrations (Alembic)
│   ├── versions/
│   ├── env.py
│   └── script.py.mako
├── Taskfile.yml                    # Common development tasks (uv, ruff, pytest, alembic)
├── pyproject.toml                  # Project metadata, deps, tooling
├── setup.py                        # Package setup shim
├── alembic.ini                     # Alembic configuration
├── Dockerfile                      # Containerization
├── docs/                           # Additional docs
└── README.md

Development Setup

Prerequisites

  • Python 3.13 or higher
  • uv package manager (faster alternative to pip)

Installation

# Install dependencies using uv directly
uv pip install -e ".[dev]"

# Or using Task (recommended)
task install

Development Workflow with Task

The backend uses Task as a unified task runner, with tasks defined in simple YAML.

Git Hooks

Pre-commit hooks run task fct (format, check, test) to enforce style and correctness before commits.

To install the pre-commit hooks:

# Install pre-commit hooks
task pre-commit:install

This will install hooks that run automatically on git commit and git push operations. You can also run the hooks manually on all files:

# Run pre-commit hooks on all files
task pre-commit:run

Virtual Environment and Package Management

The backend uses uv for virtual environment creation and package management. The Taskfile is configured to use uv directly with the Python executable in the virtual environment, avoiding the need to activate the virtual environment manually. This approach ensures compatibility across different shell environments.
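For illustration, a task entry can call the interpreter inside the virtual environment directly (a hypothetical excerpt; the repository's actual Taskfile.yml may differ):

```yaml
# Hypothetical Taskfile.yml excerpt - task names and paths may differ
version: "3"

tasks:
  test:
    desc: Run tests with pytest
    cmds:
      # Call the venv's interpreter directly; no `source .venv/bin/activate` needed
      - .venv/bin/python -m pytest
```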

Available Tasks

This project's Taskfile provides the following commands:

  • task setup - Create a Python virtual environment using uv
  • task install - Install the package and development dependencies using uv
  • task clean - Clean build artifacts
  • task clean:all - Clean all artifacts including virtual environment
  • task format - Format code using ruff
  • task lint - Run ruff linter (auto-fix enabled)
  • task test - Run tests with pytest
  • task test:cov - Run tests with coverage report
  • task check - Run format and lint
  • task build - Build the package
  • task db:setup - Setup the database and run all migrations
  • task db:migrate - Generate a new migration
  • task db:upgrade - Run all pending migrations
  • task db:downgrade - Rollback the last migration
  • task db:reset - Reset the database (WARNING - This will delete all data)
  • task run - Run the application without auto-reload
  • task run:dev - Run the application with auto-reload for development
  • task fct - Format, check, test
  • task pre-commit:install - Install pre-commit hooks
  • task pre-commit:update - Update pre-commit hooks
  • task pre-commit:run - Run pre-commit hooks on all files

Run tasks from the project root, e.g.:

task run:dev

Development Server

For development with hot reloading (automatically restarts the server when code changes):

# Start the development server with hot reloading
task run:dev


Configuration

Runtime configuration is centralized in a Pydantic Settings model (api_server.config.Settings).

Resolution order (highest precedence last):

  1. Default values embedded in Settings
  2. .env file (if present) – loaded automatically
  3. Environment variables with prefix API_SERVER_
  4. CLI overrides (Typer arguments) passed to python -m api_server.main or the api-server entrypoint
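The layering can be pictured as successive dictionary merges in which later sources win. This is a plain-Python illustration of the precedence only, not the actual pydantic-settings implementation:

```python
# Illustration of settings precedence; the real code uses pydantic-settings.
defaults = {"host": "0.0.0.0", "port": 8080, "log_level": "INFO"}
dotenv   = {"log_level": "DEBUG"}    # from a .env file, if present
env_vars = {"port": 9000}            # e.g. API_SERVER_PORT=9000
cli_args = {"host": "127.0.0.1"}     # e.g. --host=127.0.0.1

# Later mappings override earlier ones - CLI wins, defaults lose.
effective = {**defaults, **dotenv, **env_vars, **cli_args}
print(effective)
# {'host': '127.0.0.1', 'port': 9000, 'log_level': 'DEBUG'}
```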

Access the settings instance anywhere:

from api_server.config import get_settings

settings = get_settings()
print(settings.host, settings.port)

Inside FastAPI routes you can depend on it:

from fastapi import Depends
from api_server.config import get_settings, Settings

@app.get("/info")
def info(settings: Settings = Depends(get_settings)):
    return {"host": settings.host, "port": settings.port}

When log level is DEBUG or TRACE, settings are logged once at startup (non-secret fields).

Environment Variables

Current Settings fields (extend as needed):

Field      Env Var                Default  Description
host       API_SERVER_HOST        0.0.0.0  Interface to bind
port       API_SERVER_PORT        8080     Port to listen on
log_level  API_SERVER_LOG_LEVEL   INFO     TRACE / DEBUG / INFO / WARNING / ERROR
sql_log    API_SERVER_SQL_LOG     False    Enable SQLAlchemy SQL echo logging
reload     API_SERVER_RELOAD      True     Auto-reload in dev (CLI override supported)

Add a database URL or secrets by appending fields to Settings:

from pydantic import Field
from pydantic_settings import BaseSettings  # assumes pydantic v2-style settings

class Settings(BaseSettings):
    database_url: str = Field(..., description="Primary database DSN")  # required
    # secret_token: SecretStr | None = None  # example secret

Then export API_SERVER_DATABASE_URL (and optionally add to .env). Pydantic enforces required fields.

Example .env file:

API_SERVER_HOST=0.0.0.0
API_SERVER_PORT=8080
API_SERVER_LOG_LEVEL=DEBUG
API_SERVER_SQL_LOG=true
API_SERVER_DATABASE_URL=postgresql+psycopg://postgres:password@localhost:5432/postgres

Variables can live in a .env file (see .env.example for a template).

Command-line Arguments

CLI arguments override environment-driven settings for that invocation:

Argument                   Effect
--host / --port            Override bind interface/port
--log-level                Override log_level (case-insensitive)
--reload / --no-reload     Force enable/disable auto-reload
--sql-log / --no-sql-log   Toggle SQL logging

Examples:

api-server --host=127.0.0.1 --port=9000 --log-level=DEBUG --sql-log
python -m api_server.main --no-reload --log-level=WARNING

Redacting Sensitive Fields (Optional)

If you add secrets (e.g., SecretStr), avoid dumping them verbatim:

from loguru import logger

if settings.log_level in {"DEBUG", "TRACE"}:
    safe = settings.model_dump(exclude={"secret_token"})
    logger.debug(f"Effective settings: {safe}")

Running the Server

# Run the server with default settings (port 8080)
python -m api_server.main

# Run the server on a specific port
python -m api_server.main --port=8081

# Run the server with custom host
python -m api_server.main --host=127.0.0.1 --port=8080

# Run the server without auto-reload
python -m api_server.main --no-reload

# Using the installed console script (via Typer)
api-server --host=127.0.0.1 --port=8080 --log-level=INFO

Or run via Task:

# Run with default settings
task run

# Run on a specific port
task run -- --port=8080

Database Setup and Migration

Alembic handles database migrations:

Setting Up a New Database

# Configure your database connection in .env file
API_SERVER_DATABASE_URL="postgresql+psycopg://username:password@hostname:port/database"

# Initialize the database with all migrations
task db:setup

Creating a New Migration

After making changes to your database models, create a new migration:

# Generate a migration with a descriptive message
task db:migrate -- "Add patient addresses table"

Upgrading the Database

To apply all pending migrations:

task db:upgrade

Downgrading the Database

To roll back the last migration:

task db:downgrade

Resetting the Database

Warning: This will delete all data in the database.

task db:reset

API Documentation

API documentation is available at:

Health Check Endpoint

Health-related endpoints:

  • /health-check
  • /ping

Deployment

Docker Deployment

The backend server can be containerized using Docker with the provided utility scripts:

# Build the Docker image
./scripts/docker-rebuild.sh

# Run the container
./scripts/docker-run.sh

# Stop the container
./scripts/docker-stop.sh

Advantages of the scripts over raw Docker commands:

  • Automatic environment variable resolution: Variables are automatically resolved from your environment or .env files
  • Automatic port mapping: Exposed ports from the Dockerfile are automatically mapped
  • Container name resolution: Container names are automatically extracted from the Dockerfile
  • Custom environment files: Use the --dot-env option to specify custom environment files (e.g., --dot-env .env.docker)

# Run with a custom environment file
./scripts/docker-run.sh --dot-env .env.docker

# Show the command without executing it
./scripts/docker-run.sh --no-exec

# Pass additional arguments to docker run
./scripts/docker-run.sh -- -v /local/path:/container/path --restart always

Important: When running in Docker, use host.docker.internal as the database host in your .env.docker file to connect to services running on your host machine.
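For example, a minimal .env.docker might look like this (hypothetical values; adjust credentials to your setup):

```
API_SERVER_HOST=0.0.0.0
API_SERVER_PORT=8080
API_SERVER_DATABASE_URL=postgresql+psycopg://postgres:password@host.docker.internal:5432/postgres
```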

Once running, you can access:

Dockerfile highlights:

  • Exposes port 8080 for the FastAPI application
  • Sets environment variables:
    • API_SERVER_HOST=0.0.0.0
    • API_SERVER_LOG_LEVEL=INFO
  • Required environment variables that must be set:
    • API_SERVER_DATABASE_URL (can be set in .env file)
  • Runs the application with the command: api-server --host=0.0.0.0 --port=8080 --no-reload
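Put together, a Dockerfile matching these highlights could look roughly like this (a sketch under the assumptions above, not necessarily the repository's actual Dockerfile):

```dockerfile
# Sketch only - the real Dockerfile may use different base images or steps
FROM python:3.13-slim

WORKDIR /app
COPY . .

# Install the package (the real image may use uv instead of pip)
RUN pip install --no-cache-dir .

ENV API_SERVER_HOST=0.0.0.0 \
    API_SERVER_LOG_LEVEL=INFO

# API_SERVER_DATABASE_URL must be supplied at run time (e.g. via .env.docker)
EXPOSE 8080

CMD ["api-server", "--host=0.0.0.0", "--port=8080", "--no-reload"]
```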

Adapting This Template For Your Project

This template is designed to be easily adapted for your own projects. Here's how to customize it for your needs:

Step-by-Step Guide

Let's say you want to rename the project to my_server:

  1. Rename the source directory:

    # Rename the main package directory
    mv src/api_server src/my_server
  2. Replace all occurrences of 'api_server' with 'my_server':

    • Important: This is case sensitive!
    • Use your IDE's "Replace in Files" feature (in VS Code: Ctrl+Shift+H or Cmd+Shift+H)
    • Search for api_server and replace with my_server across all files
    • Make sure to update:
      • Python imports
      • Environment variable prefixes
      • Package names in pyproject.toml
      • References in Taskfile.yml
      • Database connection strings
      • Docker configuration
  3. Update the configuration prefix (optional): If you want a different env prefix (e.g. MY_SERVER_), edit env_prefix in Settings.model_config inside config.py.

  4. Test if everything is still working:

    # Run format, check, and tests to verify the project is still functional
    task fct
    
    # Test database migrations
    task db:setup
  5. Replace example code with your implementations:

    • Replace the example models in src/my_server/models/
    • Update services in src/my_server/services/
    • Modify GraphQL schema in src/my_server/graphql/
    • Add your API endpoints to src/my_server/api/
    • Update database migrations as needed

Tips for a Smooth Transition

  • After renaming, update .env to use the new prefix if changed
  • Adjust Settings to include project-specific fields early (keeps config discoverable)
  • Update the console script name in pyproject.toml if you want to change the CLI command
  • Regenerate database migrations if you change the models
  • Add secrets as SecretStr fields to automatically prevent accidental plain-text printing
  • Update or add tests that call get_settings() to validate new required fields

License

This project is licensed under the MIT License - see the LICENSE file for details.
